Navigation method, device, system and equipment based on augmented reality technology
Technical Field
The invention relates to the technical field of navigation, in particular to a navigation method, a navigation device, a navigation system and navigation equipment based on an augmented reality technology.
Background
At present, drivers often rely on navigation technology to guide driving: information such as the vehicle's current position and the road conditions ahead is obtained through navigation. AR (augmented reality) technology can capture road condition information ahead of the vehicle through a camera mounted at the front of the vehicle and project an image of the scene ahead through a display device, so that the real environment and virtual imagery are superimposed on the same picture in real time, realizing seamless integration of the real environment and virtual information.
AR navigation is a navigation method that combines AR technology with map information, and can provide people with more intuitive, vivid, accurate and safer navigation services.
In the prior art, AR navigation technology improves navigation accuracy based on street-view recognition and on the GPS coordinate position and attitude associated with the lane line ahead and the surrounding road environment. It focuses on recognizing the road ahead and guiding the navigation direction while the vehicle is driving, but lacks integration with the vehicle's own information and with the information around the vehicle, especially the surrounding road conditions when the vehicle changes lanes or turns. When a driver changes lanes or turns at an intersection along the navigation-guided route, he or she may fail to notice in time road conditions around the vehicle that carry safety risks, such as a vehicle behind overtaking or accelerating, or a vehicle in an adjacent lane changing lanes or cutting in; continuing to drive according to the guided route in such situations can cause scraping or even collision accidents, bringing potential safety hazards to driving. Therefore, how to combine AR navigation technology with vehicle-mounted sensors and vehicle active safety technology remains a technical problem to be solved urgently.
In view of the above defects in the prior art, the present application aims to provide a navigation method, device, system and equipment based on augmented reality technology, so that navigation accuracy and driving safety can be greatly improved.
Disclosure of Invention
In view of the foregoing problems in the prior art, an object of the present invention is to provide a navigation method, apparatus, system and device based on augmented reality technology.
In order to solve the above problems, the present invention provides a navigation method based on augmented reality technology, the method comprising:
acquiring navigation demand information, and planning a navigation path according to the navigation demand information;
acquiring current position information of a vehicle and surrounding environment information of the vehicle, wherein the current position information of the vehicle comprises a name of a road where the vehicle is located;
acquiring the guiding information of the front route of the vehicle according to the navigation path and the current position information of the vehicle;
acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle and the guiding information of the route in front of the vehicle;
acquiring vehicle active safety information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
acquiring second navigation information according to the first navigation information and the vehicle active safety information;
and outputting the second navigation information.
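As a purely illustrative sketch (not part of the claimed method), the sequence of steps above can be read as a pipeline in which each stage consumes the outputs of the previous ones. All function names and data fields below are hypothetical placeholders for the claimed information items:

```python
def plan_path(nav_request):
    # Hypothetical planner: returns an ordered list of road names
    # from the departure place to the destination.
    return [nav_request["origin"], "Middle Road", nav_request["destination"]]

def route_guidance(path, position):
    # Front-route guidance: the next road after the vehicle's current road.
    i = path.index(position["road"])
    return {"next_road": path[i + 1], "turn": "left", "distance_m": 120}

def active_safety(position, surroundings):
    # Active safety info: lane departure, minimum object gap, blind-spot flag.
    return {
        "lane_departure": surroundings["lane_offset_m"] > 0.5,
        "min_gap_m": min(surroundings["object_gaps_m"]),
        "blind_spot_alert": surroundings["blind_spot_occupied"],
    }

def navigate(nav_request, position, surroundings):
    path = plan_path(nav_request)
    guidance = route_guidance(path, position)
    # First navigation information: position + surroundings + guidance.
    first_nav = {"position": position, "surroundings": surroundings,
                 "guidance": guidance}
    # Second navigation information: first_nav fused with active safety info.
    return dict(first_nav, safety=active_safety(position, surroundings))
```

The returned dictionary corresponds to the second navigation information that the output step would render as an AR overlay and voice prompt.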
Further, the vehicle active safety information comprises lane departure information, vehicle distance information and blind area detection alarm information.
Further, the vehicle active safety information comprises lane departure information, vehicle distance information, blind area detection alarm information and warning information;
the warning information is determined by the following method:
acquiring vehicle interior image information, and acquiring behavior information of drivers and passengers in the vehicle according to the vehicle interior image information;
and judging the behavior information of the drivers and passengers in the automobile, and generating corresponding warning information when the behavior of the drivers and passengers in the automobile is judged to be dangerous.
Specifically, the vehicle surrounding environment information includes vehicle surrounding lane line information, vehicle surrounding traffic identification information, vehicle surrounding other vehicle information, vehicle surrounding pedestrian information, and vehicle surrounding obstacle information.
Specifically, the obtaining of the vehicle active safety information according to the vehicle current position information and the vehicle surrounding environment information includes:
acquiring lane departure information according to the current position information of the vehicle and the surrounding lane line information;
acquiring the vehicle distance information according to the current position information of the vehicle, the information of other vehicles around the vehicle, the information of pedestrians around the vehicle and the information of obstacles around the vehicle;
and detecting the blind area of the vehicle according to the surrounding environment information of the vehicle to acquire blind area alarm information.
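A minimal sketch of how these three kinds of active safety information might be derived is given below, assuming the surrounding environment information is reduced to a lateral lane offset, ranges to nearby objects, and a list of blind-zone detections; all thresholds and names are hypothetical:

```python
def lane_departure(lateral_offset_m, lane_half_width_m=1.6):
    """Flag a departure when the vehicle centre drifts past the lane boundary."""
    return abs(lateral_offset_m) > lane_half_width_m

def distance_alerts(speed_mps, ranges_m, reaction_s=1.5):
    """Flag objects closer than the distance covered during the reaction time."""
    safe_gap = speed_mps * reaction_s
    return [r for r in ranges_m if r < safe_gap]

def blind_spot_alarm(blind_zone_objects):
    """Alarm when any detected object lies in the driver's blind zones."""
    return len(blind_zone_objects) > 0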
Specifically, the vehicle front route guidance information includes distance information of the vehicle from the front intersection, steering information of the vehicle at the front intersection, and a front road name.
Further, the vehicle front route guidance information comprises distance information of the vehicle from the front intersection, steering information of the vehicle at the front intersection, a front road name and an estimated passing time of the vehicle from the current position to the front intersection;
the expected transit time is determined by:
acquiring vehicle running information, wherein the vehicle running information comprises vehicle speed information, acceleration information and steering wheel angle information;
and acquiring the predicted passing time according to the navigation path, the current position information of the vehicle and the running information of the vehicle.
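The estimated passing time can, for example, be computed from the remaining distance along the navigation path (derived from the path and the current position) together with the vehicle speed and acceleration, using constant-acceleration kinematics. This is one possible sketch, not the claimed implementation; the steering wheel angle is ignored here:

```python
import math

def expected_transit_time(distance_m, speed_mps, accel_mps2=0.0):
    """Estimated time to reach the intersection ahead.

    Solves d = v*t + a*t^2/2 for t (constant acceleration);
    falls back to d/v when acceleration is negligible.
    """
    if abs(accel_mps2) < 1e-9:
        return distance_m / speed_mps
    # Positive root of (a/2)*t^2 + v*t - d = 0.
    disc = speed_mps ** 2 + 2.0 * accel_mps2 * distance_m
    return (-speed_mps + math.sqrt(disc)) / accel_mps2
```

For example, 100 m ahead at 10 m/s with no acceleration gives 10 s; accelerating at 2 m/s² shortens this to about 6.18 s.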
In another aspect, the present invention provides a navigation device based on augmented reality technology, including:
the first acquisition module is used for acquiring navigation demand information;
the second acquisition module is used for acquiring the navigation path according to the navigation demand information;
the third acquisition module is used for acquiring the current position information of the vehicle and the surrounding environment information of the vehicle, wherein the current position information of the vehicle comprises the name of the road where the vehicle is located;
the fourth acquisition module is used for acquiring the guiding information of the front route of the vehicle according to the navigation path and the current position information of the vehicle;
the first navigation information acquisition module is used for acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle and the guide information of the route in front of the vehicle;
the fifth acquisition module is used for acquiring vehicle active safety information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
the second navigation information acquisition module is used for acquiring second navigation information according to the first navigation information and the vehicle active safety information;
and the output module is used for outputting the second navigation information.
The invention also discloses a navigation system based on the augmented reality technology, which comprises a camera, a radar and the navigation device based on the augmented reality technology, wherein the camera and the radar are both connected with the navigation device;
the camera is used for acquiring the surrounding environment information of the vehicle and the image information in the vehicle;
the radar is used for acquiring vehicle distance information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
the navigation device can receive the vehicle distance information acquired by the radar and the vehicle surrounding environment information and the in-vehicle image information acquired by the camera.
The present invention also protects an electronic device comprising:
one or more processors;
a memory; and
one or more programs, stored in the memory and executed by the one or more processors, the programs comprising instructions for performing an augmented reality technology-based navigation method of the above-described aspects.
Due to the technical scheme, the invention has the following beneficial effects:
With the navigation method based on augmented reality technology, the surrounding environment information during driving can be visualized through the equipment installed on the vehicle; navigation information is combined with vehicle active safety technology to provide drivers and passengers with blind-zone alarms, safe-distance alarms, lane-change alarms and similar reminders, giving more accurate and safer driving guidance, avoiding potential traffic safety hazards in advance, and improving driving safety; and, by means of AR navigation technology, a more intuitive and visual navigation method is provided.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings used in the description of the embodiment or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a schematic flowchart of a navigation method based on augmented reality technology according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a navigation method based on augmented reality technology according to another embodiment of the present invention;
fig. 3 is a schematic structural diagram of a navigation device based on augmented reality technology according to an embodiment of the present invention.
In the figure: 10-a first acquisition module, 20-a second acquisition module, 30-a third acquisition module, 40-a fourth acquisition module, 50-a first navigation information acquisition module, 60-a fifth acquisition module, 70-a second navigation information acquisition module, and 80-an output module.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions.
Example 1
With reference to fig. 1, the present embodiment provides a navigation method based on augmented reality technology, including:
S110: acquiring navigation demand information, and planning a navigation path according to the navigation demand information; the navigation demand information comprises navigation request information, departure place information and destination information, and after the navigation request information is confirmed to be correct, a navigation path is planned according to the departure place information and the destination information.
S120: acquiring current position information of a vehicle and surrounding environment information of the vehicle, wherein the current position information of the vehicle comprises a name of a road where the vehicle is located; in the embodiment of the specification, the current position information of the vehicle can be acquired through a vehicle-mounted GPS; the vehicle surrounding environment information can be obtained through a radar and a look-around camera arranged on the top of the vehicle.
After the raw data of the vehicle surrounding environment information is obtained, multi-task detection and recognition is performed on the camera data through a deep learning algorithm, capturing and distinguishing lane line information, vehicle surrounding object information, and traffic identification information such as traffic lights and speed limit signs; the vehicle surrounding object information is further subdivided into other-vehicle information, pedestrian information, obstacle information and building information;
and the distance between the vehicle and surrounding objects is calculated from the radar's Doppler effect and the time difference of the received reflected waves.
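The distance from the time difference of the reflected waves, and the relative speed from the Doppler shift, follow directly from the standard radar relations; a sketch for an electromagnetic-wave radar is:

```python
C = 299_792_458.0  # speed of light in m/s (radar EM wave)

def echo_distance_m(round_trip_s):
    """Range from the two-way travel time of the reflected wave: d = c*dt/2."""
    return C * round_trip_s / 2.0

def doppler_relative_speed_mps(f_tx_hz, f_rx_hz):
    """Relative closing speed from the Doppler shift.

    v = c * (f_rx - f_tx) / (2 * f_tx); positive when the target approaches.
    """
    return C * (f_rx_hz - f_tx_hz) / (2.0 * f_tx_hz)
```

For an ultrasonic radar the same formulas apply with the speed of sound in place of `C`.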
S130: acquiring the guiding information of the front route of the vehicle according to the navigation path and the current position information of the vehicle; the vehicle front route guidance information includes distance information of the vehicle from a front intersection, turning information of the vehicle at the front intersection, and a front road name.
The turning information of the vehicle at the front intersection comprises left turning, straight going, right turning, rotary island and turning around of the vehicle at the front intersection.
S140: acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle and the guiding information of the route in front of the vehicle;
Therefore, the first navigation information includes not only the vehicle's surrounding environment information but also the front route guidance information that gives driving guidance to the driver.
S150: acquiring active safety information of the vehicle through a fusion algorithm according to the current position information of the vehicle and the surrounding environment information of the vehicle; the vehicle active safety information comprises lane departure information, vehicle distance information and blind area detection alarm information.
The lane departure information can be obtained through the current position information of the vehicle and the lane line information around the vehicle;
the distance information is distance information between the vehicle and surrounding objects and is obtained by a radar, and the distance information comprises first distance information between the vehicle and other surrounding vehicles, second distance information between the vehicle and surrounding pedestrians and third distance information between the vehicle and surrounding obstacles; the first distance information is obtained through the current position information of the vehicle and the information of other vehicles around the vehicle; the second distance information is obtained through the current position information of the vehicle and the information of pedestrians around the vehicle; the third distance information is obtained through the current position information of the vehicle and the information of obstacles around the vehicle;
and detecting the blind area of the vehicle according to the surrounding environment information of the vehicle to obtain blind area alarm information.
S160: acquiring second navigation information according to the first navigation information and the vehicle active safety information;
In this embodiment of the specification, the second navigation information includes the current position information of the vehicle, the surrounding environment information obtained by the cameras and radar, the distance between the vehicle and the intersection ahead, the steering information at the intersection ahead, the front road guidance information including the name of the road ahead, and active safety information such as lane departure information, vehicle distance information and blind zone detection alarm information. When the second navigation information is output, the driver not only knows the position of the vehicle and its surrounding environment and can drive according to the guidance information, but also obtains the safety state of the vehicle in real time. This avoids the situation where a driver following the guidance information is unaware of nearby dangers, such as a vehicle in the driver's blind zone overtaking, or a vehicle in an adjacent lane accelerating, changing lanes or cutting in; if the driver does not receive such danger information and keeps driving according to the navigation route, scraping or even collision accidents can occur, bringing potential safety hazards to driving.
S170: and outputting the second navigation information.
In this embodiment of the specification, the second navigation information includes both image information and voice information. The image information is obtained by fusing the first navigation information with the vehicle active safety information through a fusion algorithm, image rendering and layer superposition, and is visually presented in front of the driver through a display instrument or other devices installed on the vehicle; it includes the current position information of the vehicle, the surrounding environment information of the vehicle, the front road guidance information and the vehicle active safety information;
the image information is presented as follows:
on the image in front of the driver, the current position of the vehicle and the surrounding environment information are displayed and superimposed with front route guidance information (left/right turn, U-turn, going straight, lane change and the like), traffic light information, prompts such as speed-limit cameras ahead, lane departure warnings, vehicle/pedestrian detection, and collision-distance reminders based on the vehicle distance information;
1) when a lane departure is detected, the departed lane line is highlighted on the AR navigation picture to prompt the driver and passengers;
2) when it is detected that the vehicle is within an unsafe distance of a vehicle/pedestrian ahead, or that a vehicle/pedestrian within an unsafe distance is present in the driver's blind zone, the vehicle, pedestrian or other object is circled on the AR navigation picture;
3) when the driver turns on the turn signal but the vehicle distance information indicates that an immediate lane change into the adjacent lane is not allowed, the target lane is highlighted on the AR navigation picture and the driver is prompted to wait.
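The three presentation rules above amount to a small condition-to-overlay mapping; a hypothetical sketch, where `state` is a dict of active-safety flags (names illustrative only), is:

```python
def overlay_actions(state):
    """Map detected conditions to AR overlay actions per rules 1)-3)."""
    actions = []
    if state.get("lane_departure"):
        # Rule 1: highlight the departed lane line.
        actions.append("highlight_departed_lane_line")
    if state.get("unsafe_distance_objects"):
        # Rule 2: circle each object within an unsafe distance.
        actions.extend(f"circle:{obj}" for obj in state["unsafe_distance_objects"])
    if state.get("turn_signal_on") and not state.get("lane_change_clear", True):
        # Rule 3: highlight the target lane and prompt the driver to wait.
        actions.append("highlight_target_lane_and_prompt_wait")
    return actions
```

The returned action list would then drive the image rendering and layer superposition step.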
The voice information can be output by voice broadcasting; in addition, other output modes such as warning lamps and flashing lamps can be provided.
With the navigation method based on augmented reality technology provided in this embodiment of the specification, the surrounding environment information during driving can be visualized through the equipment installed on the vehicle; navigation information is combined with vehicle active safety technology to provide blind-zone alarms, safe-distance alarms, lane-change alarms and similar reminders, giving more accurate and safer driving guidance and improving driving safety; and, by means of AR navigation technology, a more intuitive and visual navigation method is provided.
Example 2
As shown in fig. 2, the present embodiment provides a navigation method based on augmented reality technology, including:
S210: acquiring navigation demand information, and planning a navigation path according to the navigation demand information; the navigation demand information comprises navigation request information, departure place information and destination information, and after the navigation request information is confirmed to be correct, a navigation path is planned according to the departure place information and the destination information.
S220: acquiring current position information of the vehicle and surrounding environment information of the vehicle, wherein the current position information of the vehicle comprises the current position of the vehicle and the name of the road where the vehicle is located. In this embodiment of the specification, the current position information of the vehicle can be acquired through a vehicle-mounted GPS. The vehicle surrounding environment information includes lane line information, vehicle surrounding object information, and traffic sign information such as traffic lights and speed limit signs; further, the vehicle surrounding object information can be divided into other-vehicle information (i.e., vehicles other than the own vehicle), pedestrian information, obstacle information and building information. The vehicle surrounding environment information is obtained through radar and through a look-around camera and a front camera installed on the roof and/or the periphery of the vehicle.
S230: acquiring the guiding information of the front route of the vehicle according to the navigation path and the current position information of the vehicle; the vehicle front route guidance information comprises distance information of the vehicle and a front intersection, steering information of the vehicle at the front intersection, a front road name and predicted passing time of the vehicle from a current position to the front intersection.
Wherein, the steering information of the vehicle at the front intersection comprises: the vehicle turns left, goes straight, turns right, rotary island and turns around at the front intersection.
The expected transit time is determined as follows:
acquiring vehicle running information, wherein the vehicle running information comprises vehicle speed information, acceleration information and steering wheel angle information; the vehicle running information also includes information such as the remaining battery charge.
And acquiring the predicted passing time according to the navigation path, the current position information of the vehicle and the vehicle running information.
S240: acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle, the driving information of the vehicle and the guiding information of the route in front of the vehicle;
S250: acquiring vehicle active safety information according to the current position information of the vehicle and the surrounding environment information of the vehicle; the vehicle active safety information comprises lane departure information, vehicle distance information, blind area detection alarm information and warning information.
The lane departure information is obtained by calculation through a big data algorithm according to the current position information of the vehicle and the lane line information around the vehicle;
the vehicle distance information comprises first distance information between the vehicle and other surrounding vehicles, second distance information between the vehicle and surrounding pedestrians, and third distance information between the vehicle and surrounding obstacles;
the blind zone alarm information is obtained by detecting the blind zones of the vehicle according to the portion of the vehicle surrounding environment information, obtained by the cameras and radar, that lies within the driver's blind field of vision;
the warning information is determined by:
acquiring vehicle interior image information, and acquiring behavior information of the drivers and passengers in the vehicle according to the vehicle interior image information; the vehicle interior image information can be obtained through a camera arranged in the cabin, and after the raw data of the interior image information is obtained, it is captured and recognized through a deep learning algorithm to obtain the behavior information of the drivers and passengers.
And judging the behavior information of the drivers and passengers in the vehicle, and generating corresponding warning information when the behavior is judged to be dangerous. For example, when dangerous behaviors are recognized, such as a driver or passenger not wearing a seat belt or reaching a hand out of the window, corresponding warning information is generated and finally output and displayed to the driver.
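The generation of warning information from recognized occupant behaviors can be sketched as a simple lookup from behavior labels (as output by a hypothetical in-cabin recognition model) to warning messages; the labels and messages below are illustrative only:

```python
# Hypothetical mapping from recognized dangerous behaviors to warnings.
DANGEROUS_BEHAVIORS = {
    "no_seat_belt": "Please fasten your seat belt",
    "hand_out_of_window": "Keep hands inside the vehicle",
}

def cabin_warnings(detected_behaviors):
    """Turn recognized occupant behavior labels into warning messages.

    Behaviors not in the dangerous-behavior table produce no warning.
    """
    return [DANGEROUS_BEHAVIORS[b] for b in detected_behaviors
            if b in DANGEROUS_BEHAVIORS]
```

Each resulting message would be merged into the vehicle active safety information and output as part of the second navigation information.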
S260: acquiring second navigation information according to the first navigation information and the vehicle active safety information;
S270: and outputting the second navigation information.
The second navigation information comprises image information and voice information. The voice information can be output to the driver by voice broadcasting; for example, when the behavior of a driver or passenger is dangerous, voice output improves the timeliness with which the information is received, thereby greatly improving driving safety;
when the image information can be visually projected to the front of a driver through equipment such as a head-up display instrument, the driver can obtain various information required by vehicle running without lowering head;
the image information is presented as follows: by AR display technology, the driver can know: the system comprises a vehicle driving information such as the current position of the vehicle, the name of the road where the vehicle is located, the information of lane lines around the vehicle, objects around the vehicle, traffic signs, the speed and the oil quantity of the vehicle, a distance between the vehicle and a front crossing, steering information of the front crossing, the name of the front road and the predicted passing time of the front crossing, lane departure information, vehicle distance information, blind area detection alarm information, warning information of the behaviors of drivers and passengers, and the like, wherein the vehicle driving information comprises the distance between the vehicle and the front crossing, the steering information of the front crossing, the;
1) when overspeed or a low battery charge is detected, the vehicle speed information or the battery icon is highlighted on the AR navigation picture to remind the driver to decelerate or charge in time;
2) when it is detected that a driver or passenger is not wearing a seat belt, a no-seat-belt icon is highlighted on the navigation image to remind them.
In the navigation method based on augmented reality technology provided in this embodiment of the specification, the second navigation information further includes active safety information; the driver can therefore follow the front route guidance information while the surrounding environment is monitored in real time, avoiding lane departure, too-short following distances, and safety risks in blind zones. For the other parts, this embodiment and Example 1 can refer to each other, and details are not repeated here.
Example 3
As shown in fig. 3, an embodiment of the present specification provides a navigation device based on augmented reality technology, including:
a first obtaining module 10, configured to obtain navigation requirement information;
the second obtaining module 20 is configured to obtain a navigation path according to the navigation requirement information;
the third obtaining module 30 is configured to obtain current position information of a vehicle and surrounding environment information of the vehicle, where the current position information of the vehicle includes a name of a road where the vehicle is currently located;
a fourth obtaining module 40, configured to obtain, according to the navigation path and the current position information of the vehicle, route guidance information in front of the vehicle;
a first navigation information obtaining module 50, configured to obtain first navigation information according to the vehicle surrounding environment information and the vehicle front route guidance information;
a fifth obtaining module 60, configured to obtain vehicle active safety information according to the vehicle current position information and the vehicle surrounding environment information;
the second navigation information obtaining module 70 is configured to obtain second navigation information according to the first navigation information and the vehicle active safety information;
and an output module 80, configured to output the second navigation information.
Example 4
The embodiment of the specification provides a navigation system based on augmented reality technology, which comprises a camera, a radar and a navigation device provided in the technical scheme, wherein the camera and the radar are both connected with the navigation device;
the camera comprises a front camera, a look-around camera and an in-vehicle camera, and is used for acquiring vehicle surrounding environment information and in-vehicle image information;
the radar comprises a laser radar and an ultrasonic radar, and is used for acquiring vehicle distance information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
the navigation device can receive the vehicle distance information acquired by the radar and the vehicle surrounding environment information and the vehicle interior image information acquired by the camera.
Example 5
An embodiment of the present specification provides an electronic device, including:
one or more processors;
a memory; and
one or more programs, stored in the memory and executed by the one or more processors, the programs comprising instructions for performing the augmented reality technology-based navigation method as provided in the above-described solution.
While the invention has been described with reference to specific embodiments, it will be appreciated by those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the invention can be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Also, in some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results.