CN108921360A - Social interaction method, device and electronic equipment based on unmanned vehicle - Google Patents
Social interaction method, device and electronic equipment based on unmanned vehicle
- Publication number
- CN108921360A CN201810846539.7A CN201810846539A
- Authority
- CN
- China
- Prior art keywords
- unmanned vehicle
- route
- road condition
- condition information
- unmanned
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
- G06Q10/047—Optimisation of routes or paths, e.g. travelling salesman problem
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/40—Business processes related to the transportation industry
Landscapes
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Engineering & Computer Science (AREA)
- Economics (AREA)
- Strategic Management (AREA)
- Tourism & Hospitality (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Marketing (AREA)
- General Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- Game Theory and Decision Science (AREA)
- Quality & Reliability (AREA)
- Operations Research (AREA)
- Entrepreneurship & Innovation (AREA)
- Development Economics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Traffic Control Systems (AREA)
Abstract
The present application provides an unmanned vehicle based social interaction method, wherein the method includes: acquiring first road condition information around the unmanned vehicle; acquiring, based on the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle, second road condition information around other unmanned vehicles on the route to be traveled by the unmanned vehicle; and determining the current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around the other unmanned vehicles on the route to be traveled by the unmanned vehicle. The embodiment of the present application adjusts the current planned route of the unmanned vehicle in real time according to the acquired route information, which not only facilitates route planning for the unmanned vehicle but also enables road condition information to be shared between vehicles.
Description
Technical Field
The application relates to the technical field of social interaction, and in particular to a social interaction method and device based on an unmanned vehicle, and to electronic equipment.
Background
With the rapid development of unmanned vehicle technology, the unmanned vehicle has become an important means of transportation and is changing the way people live. At present, network interaction platforms built around unmanned vehicles remain a blank. Although a small number of human-vehicle interaction platforms for assisted driving have appeared on the market, they realize interaction between a human and a vehicle-mounted system only to a limited extent: for example, when an unmanned vehicle runs, the vehicle owner inputs a destination to the navigation system, and the unmanned vehicle acquires surrounding road condition information through electronic equipment such as a laser radar and a camera and analyzes the information to determine the driving direction and speed.
Disclosure of Invention
In view of this, an object of the present application is to provide a social interaction method, device and electronic device based on an unmanned vehicle, so as to obtain more accurate road condition information.
In a first aspect, an embodiment of the present application provides an unmanned vehicle-based social interaction method, including:
acquiring first road condition information around the unmanned vehicle;
acquiring second road condition information around other unmanned vehicles on a to-be-driven route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle;
and determining the current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle.
In combination with the first aspect, an embodiment of the present application provides a first possible implementation manner of the first aspect, wherein,
the acquiring second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle includes:
acquiring the current position of the unmanned vehicle in first road condition information and a destination in the planned route;
and acquiring second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle according to the current position and the destination.
In combination with the first aspect, an embodiment of the present application provides a second possible implementation manner of the first aspect, wherein,
the determining a current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the to-be-traveled route of the unmanned vehicle includes:
acquiring road condition information corresponding to the current position and the destination based on first road condition information around the unmanned vehicle and second road condition information around other unmanned vehicles on a to-be-driven route of the unmanned vehicle;
judging whether to adjust the planned route of the unmanned vehicle or not according to the road condition information corresponding to the current position and the destination;
and determining the current planned route of the unmanned vehicle according to the judgment result.
In combination with the second possible implementation manner of the first aspect, the present application provides a third possible implementation manner of the first aspect, wherein,
the judging whether to adjust the planned route of the unmanned vehicle according to the road condition information corresponding to the current position and the destination comprises the following steps:
if the road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the planned route is smooth, determining the planned route as the current planned route;
and if the road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the planned route is slow or congested, generating the current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle.
In combination with the third possible implementation manner of the first aspect, an embodiment of the present application provides a fourth possible implementation manner of the first aspect, wherein,
if the road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the planned route is slow or congested, generating the current planned route of the unmanned vehicle based on first road condition information around the unmanned vehicle and second road condition information around other unmanned vehicles on the to-be-traveled route of the unmanned vehicle, including:
determining at least one route from the current location to the destination based on the current location and the destination;
and selecting a route with smooth traffic from the at least one route as the current planned route according to the road condition information of each route.
In combination with the first aspect, the present application provides a fifth possible implementation manner of the first aspect, where,
the method further comprises the following steps: and uploading the acquired first road condition information around the unmanned vehicle.
In combination with the first aspect, an embodiment of the present application provides a sixth possible implementation manner of the first aspect, wherein,
further comprising:
acquiring characteristic information of passengers in the unmanned vehicle;
determining a multimedia playing type matched with each passenger based on the acquired characteristic information of the passenger in the unmanned vehicle;
and recommending the multimedia file corresponding to the determined multimedia playing type.
In a second aspect, an embodiment of the present application further provides an unmanned vehicle based social interaction device,
the device comprises a first acquisition module, a second acquisition module and a first determination module;
the first acquisition module is used for acquiring first road condition information around the unmanned vehicle;
the second obtaining module is used for obtaining second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle according to the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle;
the first determining module is used for determining the current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle.
In combination with the second aspect, an embodiment of the present application provides a first possible implementation manner of the second aspect, wherein,
the system also comprises a third acquisition module, a second determination module and a recommendation module;
the third acquisition module is used for acquiring characteristic information of passengers in the unmanned vehicle;
the second determining module is used for determining a multimedia playing type matched with each passenger based on the acquired characteristic information of the passengers in the unmanned vehicle;
and the recommending module is used for recommending the determined multimedia file corresponding to the multimedia playing type.
In a third aspect, an embodiment of the present application further provides an electronic device, including: the device comprises a processor, a memory and a bus, wherein the memory stores machine readable instructions executable by the processor, when a network side device runs, the processor and the memory are communicated through the bus, and when the machine readable instructions are executed by the processor, the method of any one of the above items is executed.
In a fourth aspect, the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs any one of the methods described above.
According to the unmanned vehicle based social interaction method, device and electronic equipment provided by the present application, the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the route to be traveled by the unmanned vehicle are acquired, and the current planned route of the unmanned vehicle is determined according to the first road condition information and the second road condition information.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
FIG. 1 is a flow chart illustrating an unmanned vehicle based social interaction method provided by an embodiment of the present application;
FIG. 2 is a flow chart illustrating another social interaction method based on unmanned vehicles according to an embodiment of the present disclosure;
FIG. 3 is a flow chart illustrating another social interaction method based on unmanned vehicles according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram illustrating an apparatus based on an unmanned vehicle social interaction manner according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of another unmanned vehicle social interaction manner-based device provided in the embodiment of the present application;
fig. 6 shows a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
Considering that the existing unmanned vehicle does not have an interaction mode based on the internet, and real-time road condition information cannot be shared between vehicles, embodiments of the present application provide a social interaction mode, a social interaction device and electronic equipment based on the unmanned vehicle, which are described below by embodiments.
Example one
An embodiment of the present application provides an unmanned vehicle based social interaction method which, as shown in fig. 1, includes:
s101: first road condition information around the unmanned vehicle is acquired.
Here, the acquiring of the first road condition information around the unmanned vehicle may include acquiring image information, sound information and text information around the unmanned vehicle, and the first road condition information may include current time, current speed information, current position and current environment information. The current time can be acquired through a timing device on the unmanned vehicle, the current speed information can be acquired through a speed sensor, the current position can be acquired through a positioning system, and the current environment information can be acquired through a laser radar or a camera.
After the first road condition information around the unmanned vehicle is obtained, the first road condition information around the unmanned vehicle can be uploaded to a corresponding cloud platform through a wireless network, so that other unmanned vehicles can conveniently obtain the first road condition information around the unmanned vehicle through the cloud platform, and accurate road condition information can be obtained.
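By way of illustration only, the following Python sketch (not part of the original application) shows one way the first road condition information might be packaged and uploaded to a cloud platform over a wireless network; the field names, the endpoint URL and the upload function are assumptions introduced for this example.

```python
import json
import time
from dataclasses import dataclass, asdict, field
from urllib import request

@dataclass
class RoadConditionInfo:
    """First road condition information gathered by one unmanned vehicle (illustrative fields)."""
    vehicle_id: str
    timestamp: float                 # current time, from the on-board timing device
    speed_kmh: float                 # current speed, from the speed sensor
    position: tuple                  # current position (lat, lon), from the positioning system
    environment: dict = field(default_factory=dict)  # current environment, from lidar / camera

def upload_road_condition(info: RoadConditionInfo,
                          endpoint: str = "https://cloud.example.com/road-condition") -> None:
    """Upload the collected information to the cloud platform (hypothetical endpoint)."""
    payload = json.dumps(asdict(info)).encode("utf-8")
    req = request.Request(endpoint, data=payload,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req, timeout=5)

# Example: package the latest sensor readings for sharing with other unmanned vehicles.
info = RoadConditionInfo(
    vehicle_id="UV-001",
    timestamp=time.time(),
    speed_kmh=42.0,
    position=(39.9042, 116.4074),
    environment={"scene": "intersection ahead, red light", "obstacles": 3},
)
# upload_road_condition(info)  # commented out: requires a reachable cloud platform
```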
S102: and acquiring second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle.
In a specific implementation, before the unmanned vehicle travels, at least one travel route from the starting location to the destination may be planned according to the obtained starting location information and destination information, and the selected travel route may be used as the planned route for the unmanned vehicle to travel this time.
During the driving process of the unmanned vehicle, a route between the current position and the destination can be determined as a route to be driven according to the current position in the first road condition information around the unmanned vehicle and the destination of the planned route of the unmanned vehicle. The unmanned vehicle downloads second road condition information around the unmanned vehicle uploaded by other unmanned vehicles on a to-be-driven route of the unmanned vehicle from corresponding cloud platforms, and the second road condition information may include current time, current speed information, current position and current environment information of the other unmanned vehicles.
For example, the unmanned vehicle acquires, from the corresponding cloud platform, second road condition information of other unmanned vehicles at the intersection ahead: the current time of the other unmanned vehicles is consistent with the current time of the unmanned vehicle, the current position of the other unmanned vehicles is at the intersection ahead waiting for a red light, and the current environment information of the other unmanned vehicles may indicate that there are many vehicles ahead that are also waiting for the red light.
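As a minimal sketch, and assuming the downloaded records take the form shown below (the record fields and the matching radius are illustrative, not specified by the application), the second road condition information relevant to the route to be traveled could be filtered out as follows:

```python
from typing import Dict, List, Tuple

Waypoint = Tuple[float, float]

def filter_second_road_condition(route_to_travel: List[Waypoint],
                                 cloud_records: List[Dict],
                                 radius_deg: float = 0.005) -> List[Dict]:
    """Keep only the records uploaded by other unmanned vehicles whose current
    position lies near some waypoint of the route to be traveled."""
    def near(p: Waypoint, q: Waypoint) -> bool:
        return abs(p[0] - q[0]) <= radius_deg and abs(p[1] - q[1]) <= radius_deg

    return [rec for rec in cloud_records
            if any(near(rec["position"], wp) for wp in route_to_travel)]

# Example: only the vehicle waiting at the intersection ahead lies on the route.
records = [
    {"vehicle_id": "UV-002", "position": (39.9100, 116.4100), "speed_kmh": 0.0,
     "environment": "waiting for red light, many vehicles ahead"},
    {"vehicle_id": "UV-003", "position": (40.0000, 116.5000), "speed_kmh": 60.0,
     "environment": "clear road"},
]
route = [(39.9042, 116.4074), (39.9100, 116.4100), (39.9200, 116.4150)]
print(filter_second_road_condition(route, records))  # -> the UV-002 record only
```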
S103: and determining the current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle.
Here, the route to be traveled may be determined based on a current position in the first road condition information around the unmanned vehicle and a destination in the planned route of the unmanned vehicle, that is, a route not traveled from the current position to the destination, and road condition information corresponding to the current position and the destination may be acquired based on the first road condition information around the unmanned vehicle and second road condition information around other unmanned vehicles on the route to be traveled of the unmanned vehicle.
Specifically, the length of the congested road segment, the length of the speed-limiting road segment, the number of traffic lights, and the number of traffic accident sites in the road condition information corresponding to the route between the current position and the destination may be obtained.
Further, the traffic condition of the route to be traveled can be judged according to the acquired road condition information corresponding to the current position and the destination, and the current planned route of the unmanned vehicle can be determined according to the judgment result. For example, it may be specifically determined that the traffic condition of the route to be traveled is clear, slow, or congested, and if the acquired traffic condition information corresponding to the current location and the destination indicates that the traffic condition corresponding to the route to be traveled is clear, the route to be traveled is determined as the currently planned route; and if the acquired road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be traveled is slow or congested, generating a current planned route of the unmanned vehicle based on first road condition information around the unmanned vehicle and second road condition information around other unmanned vehicles on the route to be traveled of the unmanned vehicle.
For example, if at the current time the current speed of the unmanned vehicle and of the other unmanned vehicles on the route to be traveled meets the specified travel speed of the current road, there is no congested road section on the route to be traveled, and the number of traffic accident sites on the route to be traveled is zero, it may be determined that the traffic condition corresponding to the route to be traveled is smooth, and the route to be traveled is determined as the currently planned route. If the current speed of the unmanned vehicle and of the other unmanned vehicles on the route to be traveled is far less than the specified travel speed of the current road, the congested road section on the route to be traveled is longer than 1 kilometer, and there are multiple traffic accident sites on the route to be traveled at the current time, it may be determined that the route to be traveled is congested, and a current planned route of the unmanned vehicle is generated.
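The judgement described above can be expressed as a small classifier. The numeric thresholds below (treating "far less than" as below half the specified speed, and two or more accident sites as congestion) are assumptions made only for this sketch:

```python
def classify_traffic(avg_speed_kmh: float,
                     specified_speed_kmh: float,
                     congested_length_km: float,
                     accident_sites: int) -> str:
    """Classify the route to be traveled as 'smooth', 'slow' or 'congested'."""
    if (congested_length_km == 0 and accident_sites == 0
            and avg_speed_kmh >= specified_speed_kmh):
        return "smooth"
    if (congested_length_km > 1.0 or accident_sites >= 2
            or avg_speed_kmh < 0.5 * specified_speed_kmh):
        return "congested"
    return "slow"

print(classify_traffic(55, 50, 0.0, 0))   # 'smooth'    -> keep the route to be traveled
print(classify_traffic(15, 50, 1.5, 2))   # 'congested' -> generate a new planned route
```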
Further, if the acquired road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be traveled is slow or congested, determining at least one route from the current position to the destination according to the current position and the position of the destination; and selecting a route with smooth traffic road from the at least one route as the current planned route of the unmanned vehicle according to the road condition information of each route.
If the traffic condition corresponding to the route to be traveled is slow or congested, at least one route from the current position to the destination is determined. The unmanned vehicle downloads, from the corresponding cloud platform, the second road condition information uploaded by other unmanned vehicles on each route from the current position to the destination, judges the traffic condition corresponding to each route according to the road condition information of that route, and selects a route with smooth traffic from the at least one route as the current planned route of the unmanned vehicle.
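Putting the pieces together, a sketch of how the current planned route could then be chosen; the helper `classify` is assumed to return 'smooth', 'slow' or 'congested' for a route from the aggregated first and second road condition information (for instance, the classifier sketched above):

```python
def choose_current_route(route_to_travel, candidate_routes, classify):
    """Return the current planned route of the unmanned vehicle.

    `classify(route)` yields 'smooth', 'slow' or 'congested' for a route,
    based on the first and second road condition information.
    """
    if classify(route_to_travel) == "smooth":
        return route_to_travel              # keep the route to be traveled
    for route in candidate_routes:          # at least one route from the current position to the destination
        if classify(route) == "smooth":
            return route                    # a route with smooth traffic becomes the current planned route
    return route_to_travel                  # fallback when no smoother alternative is found

# Toy example: the planned route "A" is congested, alternative "C" is smooth.
conditions = {"A": "congested", "B": "slow", "C": "smooth"}
print(choose_current_route("A", ["B", "C"], conditions.get))  # -> 'C'
```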
Example two
The unmanned vehicle based social interaction method provided by the second embodiment of the present application, as shown in fig. 2, further includes:
s201: and acquiring characteristic information of passengers in the unmanned vehicle.
The characteristic information may include image information or sound information. The characteristic information of the passenger can be obtained through a face recognition device, a voice recognition device or a camera device in the unmanned vehicle.
The characteristic information may include age, gender, emotion, and character.
In a specific implementation, the age of the passenger can be judged by a face recognition device and/or a voice recognition device; the gender of the passenger can be judged by a face recognition device and/or a voice recognition device; the emotion of the passenger can be judged by the camera device; the personality of the passenger may be determined by a voice recognition device or a camera device.
For example, it may be determined that the passenger is a child, a young person, a middle-aged person or an elderly person based on the voice or facial features of the passenger in the unmanned vehicle; based on the voice or facial features of the passenger, it may be determined whether the passenger is male or female; based on the image or the sound of the passenger, it may be determined whether the passenger is sad or happy; and based on the image or the sound of the passenger, the character of the passenger may be judged to be outgoing or introverted.
S202: and determining a multimedia playing type matched with each passenger based on the acquired characteristic information of the passenger in the unmanned vehicle.
The determining of the multimedia playing type matched with each passenger based on the acquired characteristic information of the passenger in the unmanned vehicle comprises the following steps:
classifying the passengers according to the characteristic information of the passengers, for example, the passengers may be classified into children, young, middle-aged and elderly according to age; or may be divided into a quiet group and an active group according to the mood of the occupant.
And determining the multimedia playing type matched with the passenger in each category according to the categories of the passengers.
For example, when the passengers are classified by age, the music programs matching the passengers of each category may be determined; specifically, children's songs may be determined for passengers in the children category, pop songs for passengers in the young category, classic songs for passengers in the middle-aged category, and nostalgic songs for passengers in the elderly category.
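By way of illustration, the age-based matching described above could be expressed as a simple lookup; the category labels follow the text, while the function name and data layout are assumptions:

```python
AGE_TO_PLAY_TYPE = {
    "child": "children's songs",
    "young": "pop songs",
    "middle-aged": "classic songs",
    "elderly": "nostalgic songs",
}

def match_play_types(passengers):
    """Map each recognised passenger to a matching multimedia playing type.

    `passengers` is assumed to be a list of dicts such as
    {"id": "p1", "age_group": "young"}, produced by the recognition step.
    """
    return {p["id"]: AGE_TO_PLAY_TYPE.get(p["age_group"], "pop songs")  # default type is an assumption
            for p in passengers}

print(match_play_types([{"id": "p1", "age_group": "child"},
                        {"id": "p2", "age_group": "elderly"}]))
# {'p1': "children's songs", 'p2': 'nostalgic songs'}
```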
S203: and recommending the multimedia file corresponding to the determined multimedia playing type.
And downloading the determined multimedia files of the multimedia playing types matched with the passengers of each category, and recommending the multimedia files to the passengers.
And after the determined multimedia file corresponding to the multimedia playing type is recommended, receiving an operation instruction made by the passenger aiming at the multimedia playing type.
Here, the operation instruction may include rejection of play, pause of play, continuation of play, and re-recommendation.
Further, the received operation instruction made by the passenger for the multimedia playing type can be executed.
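A minimal sketch of dispatching the four operation instructions named above; the `player` and `recommender` objects and their method names are placeholders for the in-vehicle playback and recommendation components:

```python
def handle_instruction(instruction: str, player, recommender) -> None:
    """Execute a passenger's operation instruction for the recommended multimedia."""
    actions = {
        "reject": player.stop,               # refuse the recommended file
        "pause": player.pause,
        "continue": player.resume,
        "re-recommend": recommender.recommend_again,
    }
    action = actions.get(instruction)
    if action is None:
        raise ValueError(f"unknown operation instruction: {instruction}")
    action()

# Dummy components, just to show the dispatch in action.
class _Dummy:
    def __getattr__(self, name):
        return lambda: print(f"{name}() called")

handle_instruction("pause", _Dummy(), _Dummy())  # prints "pause() called"
```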
The social interaction mode based on the unmanned vehicle classifies the passengers according to the characteristic information of the passengers in the unmanned vehicle, recommends different multimedia playing types for the passengers of different types, better realizes the interaction between the passengers and the vehicle, can effectively improve the riding experience of the passengers, and further meets the riding requirements of the passengers.
The embodiment of the application also provides another unmanned vehicle based social interaction method: the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the route to be traveled by the unmanned vehicle are acquired, and the current planned route of the unmanned vehicle is determined according to the first road condition information and the second road condition information. By sharing road condition information through the internet in this way, accurate road condition information can be acquired and the current planned route of the unmanned vehicle can be adjusted in real time according to the acquired route information. For a specific implementation, refer to the unmanned vehicle based social interaction method provided in the third embodiment.
EXAMPLE III
Another social interaction method based on an unmanned vehicle provided in the third embodiment of the present application is shown in fig. 3, and includes:
s301: first road condition information around the unmanned vehicle is acquired.
Here, the acquiring of the first road condition information around the unmanned vehicle may include acquiring image information, sound information, and text information around the unmanned vehicle, and the first road condition information may include a current time, current speed information, a current location, and current environment information. The current time can be acquired through a timing device on the unmanned vehicle, the current speed information can be acquired through a speed sensor, the current position can be acquired through a positioning system, and the current environment information can be acquired through a laser radar or a camera.
After the first road condition information around the unmanned vehicle is obtained, the first road condition information around the unmanned vehicle can be uploaded to a corresponding cloud platform through a wireless network, so that other unmanned vehicles can conveniently obtain the first road condition information around the unmanned vehicle through the cloud platform, and accurate road condition information can be obtained.
S302: determining a route between the current position and a destination as a to-be-traveled route of the unmanned vehicle based on the current position in the first road condition information around the unmanned vehicle and the destination in the planned route of the unmanned vehicle.
In a specific implementation, before the unmanned vehicle travels, at least one travel route from the starting location to the destination may be planned according to the obtained starting location information and destination information, and the travel route selected by the passenger may be used as the planned route for the unmanned vehicle to travel this time. According to the current position in the first road condition information around the unmanned vehicle and the destination in the planned route of the unmanned vehicle, the route between the current position and the destination can be determined as the route to be traveled.
S303: and acquiring second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle.
The unmanned vehicle downloads second road condition information around the unmanned vehicle uploaded by other unmanned vehicles on a to-be-driven route of the unmanned vehicle from corresponding cloud platforms, and the second road condition information may include current time, current speed information, current position and current environment information of the other unmanned vehicles.
For example, the unmanned vehicle acquires second road condition information of other unmanned vehicles at the front intersection from the corresponding cloud platform, the current time of the other unmanned vehicles is consistent with the current time of the unmanned vehicle, the current positions of the other unmanned vehicles are waiting for a red light at the front intersection, and the current environment information of the other unmanned vehicles may include that more vehicles are in front and waiting for the red light at the same time.
S304: and acquiring the road condition information corresponding to the current position and the destination based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle.
Here, the route to be traveled may be determined based on a current position in the first road condition information around the unmanned vehicle and a destination in the planned route of the unmanned vehicle, that is, a route not traveled from the current position to the destination, and the road condition information corresponding to the current position and the destination may be acquired based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the route to be traveled of the unmanned vehicle.
Specifically, for example, the length of a congested road segment, the length of a speed-limit road segment, the number of traffic lights, and the number of traffic accident sites in the road condition information corresponding to the route between the current location and the destination may be acquired.
S305: and judging the traffic condition of the route to be driven based on the acquired road condition information corresponding to the current position and the destination.
And judging the traffic condition of the route to be traveled according to the acquired road condition information corresponding to the current position and the destination, for example, specifically, judging that the traffic condition of the route to be traveled is unobstructed, slow or congested.
S306: and determining the current planned route of the unmanned vehicle according to the judgment result.
Specifically, if the acquired road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be traveled is unobstructed, determining the route to be traveled as the current planned route; and if the acquired road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be traveled is slow or congested, generating a current planned route of the unmanned vehicle based on first road condition information around the unmanned vehicle and second road condition information around other unmanned vehicles on the route to be traveled of the unmanned vehicle.
S307: and if the acquired road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be driven is slow or congested, determining at least one route from the current position to the destination.
For example, if the current speed of the unmanned vehicle and of the other unmanned vehicles on the route to be traveled is far lower than the specified travel speed of the current road, the congested road segment on the route to be traveled is longer than 1 kilometer, and there are multiple traffic accident sites on the route to be traveled, it may be determined that the traffic condition of the route to be traveled is congested, and at least one route from the current position to the destination is determined.
S308: and selecting a route with smooth traffic road from the at least one route as the current planned route of the unmanned vehicle based on the road condition information of each route.
The traffic condition of each route is judged according to the road condition information of that route, all the routes are sorted according to whether the traffic condition is smooth, slow or congested, and one route with smooth traffic is selected from the at least one route as the current planned route of the unmanned vehicle.
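The sorting step can be illustrated in a few lines of Python; the ranking order (smooth before slow before congested) follows the text, and the tie-breaking between equally ranked routes is left unspecified, as in the original:

```python
TRAFFIC_RANK = {"smooth": 0, "slow": 1, "congested": 2}

def pick_smoothest_route(routes_with_condition):
    """Sort (route, traffic_condition) pairs and return the route ranked best."""
    ranked = sorted(routes_with_condition, key=lambda rc: TRAFFIC_RANK[rc[1]])
    return ranked[0][0]

print(pick_smoothest_route([("route A", "congested"),
                            ("route B", "smooth"),
                            ("route C", "slow")]))  # -> 'route B'
```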
S309: and if the acquired road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be driven is smooth, determining the route to be driven as the current planned route.
For example, if at the current time the current speed of the unmanned vehicle and of the other unmanned vehicles on the route to be traveled meets the specified travel speed of the current road, there is no congested road section on the route to be traveled, and the number of traffic accident sites on the route to be traveled is zero, it may be determined that the traffic condition corresponding to the route to be traveled is smooth, and the route to be traveled is determined as the currently planned route.
Example four
A device based on the social interaction manner of the unmanned vehicle is provided in the fourth embodiment of the present application, as shown in fig. 4, and includes a first obtaining module 401, a second obtaining module 402, and a first determining module 403; wherein,
the first obtaining module 401 is configured to obtain first road condition information around the unmanned vehicle;
here, the acquiring of the first road condition information around the unmanned vehicle may include acquiring image information, sound information, and text information around the unmanned vehicle, and the first road condition information may include a current time, current speed information, a current location, and current environment information. The current time can be acquired through a timing device on the unmanned vehicle, the current speed information can be acquired through a speed sensor, the current position can be acquired through a positioning system, and the current environment information can be acquired through a laser radar or a camera.
After the first road condition information around the unmanned vehicle is obtained, the first road condition information around the unmanned vehicle can be uploaded to a corresponding cloud platform through a wireless network.
The second obtaining module 402 is configured to obtain second road condition information around other unmanned vehicles on the to-be-traveled route of the unmanned vehicle according to the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle;
here, the second obtaining module 402 determines, according to a current location in the first road condition information around the unmanned vehicle and a destination in the planned route of the unmanned vehicle, that the route between the current location and the destination is a route to be traveled, and the unmanned vehicle downloads, from a corresponding cloud platform, second road condition information around the unmanned vehicle, which is uploaded by other unmanned vehicles on the route to be traveled of the unmanned vehicle, where the second road condition information may include current time, current speed information, a current location, and current environment information of the other unmanned vehicles.
The first determining module 403 is configured to determine a currently planned route of the unmanned vehicle based on first road condition information around the unmanned vehicle and second road condition information around other unmanned vehicles on a route to be traveled by the unmanned vehicle.
The route to be traveled is determined according to the current position in the first road condition information around the unmanned vehicle and the destination in the planned route of the unmanned vehicle, namely the route which is not traveled from the current position to the destination, and the road condition information corresponding to the current position and the destination is obtained according to the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the route to be traveled of the unmanned vehicle.
And judging the traffic condition of the route to be traveled according to the acquired road condition information corresponding to the current position and the destination, for example, specifically, judging that the traffic condition of the route to be traveled is unobstructed, slow or congested.
And determining the current planned route of the unmanned vehicle according to the judgment result.
Specifically, if the acquired road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be traveled is unobstructed, determining the route to be traveled as the current planned route; and if the acquired road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be traveled is slow or congested, generating a current planned route of the unmanned vehicle based on first road condition information around the unmanned vehicle and second road condition information around other unmanned vehicles on the route to be traveled of the unmanned vehicle.
Further, if the acquired road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be traveled is slow or congested, determining at least one route from the current position to the destination according to the current position and the position of the destination; and selecting a route with smooth traffic road from the at least one route as the current planned route of the unmanned vehicle according to the road condition information of each route.
If the traffic condition corresponding to the route to be traveled is slow or congested, at least one route from the current position to the destination is determined. The unmanned vehicle downloads, from the corresponding cloud platform, the second road condition information uploaded by other unmanned vehicles on each route from the current position to the destination, judges the traffic condition corresponding to each route according to the road condition information of that route, and selects a route with smooth traffic from the at least one route as the current planned route of the unmanned vehicle.
EXAMPLE five
The fifth embodiment of the application provides a device based on the social interaction mode of the unmanned vehicle, as shown in fig. 5, the device further includes a third obtaining module 501, a second determining module 502 and a recommending module 503; wherein,
the third obtaining module 501 is configured to obtain feature information of passengers in the unmanned vehicle;
the feature information may include image information and sound information. The characteristic information of the passenger can be obtained through a face recognition device, a voice recognition device or a camera device in the unmanned vehicle.
The characteristic information may include age, gender, emotion, and character.
Specifically, the age of the passenger can be judged by a face recognition device and/or a voice recognition device; the gender of the passenger can be judged by a face recognition device and/or a voice recognition device; the emotion of the passenger can be judged by the camera device; the personality of the passenger may be determined by a voice recognition device or a camera device.
The second determining module 502 is configured to determine a multimedia playing type matched with each passenger based on the obtained feature information of the passenger in the unmanned vehicle;
and determining a multimedia playing type matched with each passenger based on the acquired characteristic information of the passenger in the unmanned vehicle, and determining the multimedia playing type matched with each class of passenger according to the class of the passenger.
The recommending module 503 is configured to recommend the determined multimedia file corresponding to the multimedia playing type.
And downloading the determined multimedia files of the multimedia playing types matched with the passengers of each category, and recommending the multimedia files to the passengers.
And after the determined multimedia file corresponding to the multimedia playing type is recommended, receiving an operation instruction made by the passenger aiming at the multimedia playing type.
Here, the operation instruction may include rejection of play, pause of play, continuation of play, and re-recommendation.
Further, the operation instruction can be executed according to the operation instruction which is received by the passenger and is made for the multimedia playing type.
EXAMPLE six
As shown in fig. 6, which is a schematic structural diagram of an electronic device 600 according to the sixth embodiment of the present application, the electronic device includes: a processor 601, a memory 602, and a bus 603;
the memory 602 stores machine-readable instructions executable by the processor 601; when the network side device runs, the processor 601 and the memory 602 communicate with each other through the bus 603, and when the machine-readable instructions are executed by the processor 601, the following processes are performed:
acquiring first road condition information around the unmanned vehicle;
acquiring second road condition information around other unmanned vehicles on a to-be-driven route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle;
and determining the current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle.
In a specific implementation, in the processing executed by the processor 601, the obtaining second road condition information around other unmanned vehicles on the to-be-traveled route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle includes:
acquiring the current position of the unmanned vehicle in first road condition information and a destination in the planned route;
and acquiring second road condition information around other unmanned vehicles on the to-be-driven route of the unmanned vehicle according to the current position and the destination.
In a specific implementation, the determining a current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the to-be-traveled route of the unmanned vehicle includes:
acquiring road condition information corresponding to the current position and the destination based on first road condition information around the unmanned vehicle and second road condition information around other unmanned vehicles on a to-be-driven route of the unmanned vehicle;
judging whether to adjust the planned route of the unmanned vehicle or not according to the road condition information corresponding to the current position and the destination;
and determining the current planned route of the unmanned vehicle according to the judgment result.
In a specific implementation, the determining whether to adjust the planned route of the unmanned vehicle according to the road condition information corresponding to the current location and the destination includes:
if the road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be driven is smooth, determining the planned route as the current planned route;
and if the road condition information corresponding to the current position and the destination indicates that the traffic condition corresponding to the route to be traveled is slow or congested, generating a current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the route to be traveled of the unmanned vehicle.
In a specific implementation, if the traffic information corresponding to the current location and the destination indicates that the traffic condition corresponding to the route to be traveled is slow or congested, generating a current planned route of the unmanned vehicle based on first traffic information around the unmanned vehicle and second traffic information around other unmanned vehicles on the route to be traveled of the unmanned vehicle includes:
determining at least one route from the current location to the destination based on the current location and the destination;
and selecting a route with smooth traffic from the at least one route as the current planned route according to the road condition information of each route.
In a specific implementation, the method further comprises: and uploading the acquired first road condition information around the unmanned vehicle.
In a specific implementation, the processing executed by the processor 601 further includes obtaining characteristic information of the passenger in the unmanned vehicle;
determining a multimedia playing type matched with each passenger based on the acquired characteristic information of the passenger in the unmanned vehicle;
and recommending the multimedia file corresponding to the determined multimedia playing type.
EXAMPLE seven
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the unmanned vehicle based social interaction method described above are executed.
Based on the above analysis, it can be seen that according to the unmanned vehicle based social interaction method, device and electronic equipment provided by the embodiments of the application, the first road condition information around the unmanned vehicle and the second road condition information around other unmanned vehicles on the route to be traveled by the unmanned vehicle are acquired, and the current planned route of the unmanned vehicle is determined according to them, so that accurate road condition information can be acquired by the method.
The computer program product for performing the unmanned vehicle social interaction manner provided in the embodiment of the present application includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the method described in the foregoing method embodiment, and specific implementation may refer to the method embodiment, and will not be described herein again.
The device based on the social interaction mode of the unmanned vehicle can be specific hardware on equipment or software or firmware installed on the equipment. The device provided by the embodiment of the present application has the same implementation principle and technical effect as the foregoing method embodiments, and for the sake of brief description, reference may be made to the corresponding contents in the foregoing method embodiments where no part of the device embodiments is mentioned. It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the foregoing systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments provided in the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus once an item is defined in one figure, it need not be further defined and explained in subsequent figures, and moreover, the terms "first", "second", "third", etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that: the above-mentioned embodiments are only specific embodiments of the present application, used for illustrating the technical solutions of the present application rather than limiting them, and the protection scope of the present application is not limited thereto. Although the present application is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art can, within the technical scope disclosed in the present application, still modify the technical solutions described in the foregoing embodiments, or easily conceive of changes or equivalent substitutions for some technical features; such modifications, changes or substitutions do not depart from the spirit and scope of the present disclosure and are intended to be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A social interaction method based on an unmanned vehicle, characterized by comprising the following steps:
acquiring first road condition information around the unmanned vehicle;
acquiring, based on the first road condition information around the unmanned vehicle and a planned route of the unmanned vehicle, second road condition information around other unmanned vehicles on a route to be traveled by the unmanned vehicle;
and determining a current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around the other unmanned vehicles on the route to be traveled by the unmanned vehicle.
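For illustration only, the following minimal Python sketch shows one way the three steps recited in claim 1 might be organized in software. The RoadCondition structure, the helper names, and the simple "blocked segment" check are assumptions of this sketch, not part of the claimed method.

```python
# Illustrative sketch of the three steps in claim 1; all names are hypothetical.
from dataclasses import dataclass

@dataclass
class RoadCondition:
    segment: str   # identifier of a road segment
    status: str    # "smooth", "slow", or "congested"

def acquire_first_road_condition(own_observations: list[RoadCondition]) -> list[RoadCondition]:
    # Step 1: road condition information observed around this unmanned vehicle.
    return list(own_observations)

def acquire_second_road_condition(planned_route: list[str],
                                  reports_by_vehicle: dict[str, list[RoadCondition]]) -> list[RoadCondition]:
    # Step 2: road condition information reported by other unmanned vehicles,
    # restricted to segments lying on the route still to be traveled.
    route_segments = set(planned_route)
    return [cond
            for reports in reports_by_vehicle.values()
            for cond in reports
            if cond.segment in route_segments]

def determine_current_route(planned_route: list[str],
                            first_info: list[RoadCondition],
                            second_info: list[RoadCondition],
                            alternatives: list[list[str]]) -> list[str]:
    # Step 3: keep the planned route if every known segment on it is smooth,
    # otherwise switch to the first alternative avoiding all blocked segments.
    blocked = {c.segment for c in first_info + second_info if c.status != "smooth"}
    if not set(planned_route) & blocked:
        return planned_route
    for alt in alternatives:
        if not set(alt) & blocked:
            return alt
    return planned_route  # no better option known
```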
2. The method according to claim 1, wherein the obtaining of the second road condition information around the other unmanned vehicles on the route to be traveled by the unmanned vehicle based on the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle comprises:
acquiring a current position of the unmanned vehicle from the first road condition information and a destination from the planned route;
and acquiring, according to the current position and the destination, the second road condition information around the other unmanned vehicles on the route to be traveled by the unmanned vehicle.
3. The method of claim 1, wherein determining the current planned route for the unmanned vehicle based on first road condition information around the unmanned vehicle and second road condition information around other unmanned vehicles on the route to be traveled by the unmanned vehicle comprises:
acquiring road condition information corresponding to the current position and the destination based on the first road condition information around the unmanned vehicle and the second road condition information around the other unmanned vehicles on the route to be traveled by the unmanned vehicle;
judging, according to the road condition information corresponding to the current position and the destination, whether to adjust the planned route of the unmanned vehicle;
and determining the current planned route of the unmanned vehicle according to the judgment result.
4. The method according to claim 3, wherein the determining whether to adjust the planned route of the unmanned vehicle according to the road condition information corresponding to the current location and the destination comprises:
if the road condition information corresponding to the current position and the destination indicates that the traffic condition of the route to be traveled is smooth, determining the planned route as the current planned route;
and if the road condition information corresponding to the current position and the destination indicates that the traffic condition of the route to be traveled is slow or congested, generating the current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around the other unmanned vehicles on the route to be traveled by the unmanned vehicle.
5. The method according to claim 4, wherein, if the road condition information corresponding to the current position and the destination indicates that the traffic condition of the route to be traveled is slow or congested, the generating of the current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around the other unmanned vehicles on the route to be traveled by the unmanned vehicle comprises:
determining at least one route from the current location to the destination based on the current location and the destination;
and selecting, from the at least one route and according to the road condition information of each route, a route with smooth traffic as the current planned route.
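The selection step of claim 5 can be pictured, under assumptions, as scoring each candidate route by the traffic states reported for its segments and preferring a fully smooth route. The sketch below is illustrative only; the weighting scheme, the state labels, and the tie-breaking rule are all hypothetical.

```python
# Hypothetical congestion weights used only for this illustration.
CONGESTION_WEIGHT = {"smooth": 0, "slow": 1, "congested": 3}

def select_current_planned_route(candidate_routes: dict[str, list[str]]) -> str:
    """candidate_routes maps a route name to the per-segment traffic states on that route."""
    def score(states: list[str]) -> int:
        return sum(CONGESTION_WEIGHT.get(s, 1) for s in states)

    # Prefer a route whose segments are all smooth; otherwise take the lowest score.
    smooth = [name for name, states in candidate_routes.items()
              if all(s == "smooth" for s in states)]
    if smooth:
        return smooth[0]
    return min(candidate_routes, key=lambda name: score(candidate_routes[name]))

# Example:
# select_current_planned_route({
#     "route_a": ["smooth", "congested", "smooth"],
#     "route_b": ["smooth", "slow", "smooth"],
# })  # -> "route_b"
```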
6. The method of claim 1, further comprising: and uploading the acquired first road condition information around the unmanned vehicle.
7. The method of claim 1, further comprising:
acquiring characteristic information of passengers in the unmanned vehicle;
determining a multimedia playing type matched with each passenger based on the acquired characteristic information of the passenger in the unmanned vehicle;
and recommending the multimedia file corresponding to the determined multimedia playing type.
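As a hedged illustration of claim 7, the sketch below maps assumed passenger characteristic fields (age, mood, an "id" key) to a multimedia playing type and then looks up matching files in a hypothetical catalogue; the actual characteristic information, matching rules, and recommendation logic are not specified at this level of the claims.

```python
# Illustrative matching of passenger characteristics to a multimedia playing type.
# Field names, rules, and catalogue structure are assumptions of this sketch.
def match_playing_type(passenger: dict) -> str:
    age = passenger.get("age", 30)
    mood = passenger.get("mood", "neutral")
    if age < 12:
        return "children"
    if mood == "tired":
        return "relaxing_music"
    return "news" if age >= 50 else "pop_music"

def recommend_multimedia(passengers: list[dict],
                         catalogue: dict[str, list[str]]) -> dict[str, list[str]]:
    # Return, per passenger id, the files of the playing type matched to that passenger.
    recommendations = {}
    for p in passengers:
        playing_type = match_playing_type(p)
        recommendations[p["id"]] = catalogue.get(playing_type, [])
    return recommendations
```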
8. A social interaction device based on an unmanned vehicle, characterized by comprising a first acquisition module, a second acquisition module, and a first determination module;
the first acquisition module is used for acquiring first road condition information around the unmanned vehicle;
the second acquisition module is used for acquiring second road condition information around other unmanned vehicles on a route to be traveled by the unmanned vehicle according to the first road condition information around the unmanned vehicle and the planned route of the unmanned vehicle;
and the first determination module is used for determining the current planned route of the unmanned vehicle based on the first road condition information around the unmanned vehicle and the second road condition information around the other unmanned vehicles on the route to be traveled by the unmanned vehicle.
9. The device according to claim 8, further comprising a third acquisition module, a second determination module, and a recommendation module;
the third acquisition module is used for acquiring characteristic information of passengers in the unmanned vehicle;
the second determination module is used for determining a multimedia playing type matched with each passenger based on the acquired characteristic information of the passengers in the unmanned vehicle;
and the recommendation module is used for recommending the multimedia file corresponding to the determined multimedia playing type.
10. An electronic device, comprising: a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the electronic device is running, the processor and the memory communicate via the bus; and the machine-readable instructions, when executed by the processor, perform the method of any one of claims 1 to 7.
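The module structure of claims 8 and 9 could be wired together as in the following illustrative sketch, where each module is represented as a callable injected into a device object. The class name, method names, and the optional handling of the claim-9 modules are assumptions of this sketch, not the claimed device.

```python
# Illustrative wiring of the modules named in claims 8 and 9; names are hypothetical.
class SocialInteractionDevice:
    def __init__(self, first_acquisition, second_acquisition, first_determination,
                 third_acquisition=None, second_determination=None, recommendation=None):
        self.first_acquisition = first_acquisition        # claim 8: surrounding road conditions
        self.second_acquisition = second_acquisition      # claim 8: conditions from other vehicles
        self.first_determination = first_determination    # claim 8: current planned route
        self.third_acquisition = third_acquisition        # claim 9: passenger characteristics
        self.second_determination = second_determination  # claim 9: multimedia playing type
        self.recommendation = recommendation              # claim 9: multimedia recommendation

    def plan_route(self, planned_route):
        first_info = self.first_acquisition()
        second_info = self.second_acquisition(first_info, planned_route)
        return self.first_determination(first_info, second_info, planned_route)

    def recommend_media(self):
        if not (self.third_acquisition and self.second_determination and self.recommendation):
            return None  # the optional modules of claim 9 are not present
        passengers = self.third_acquisition()
        playing_types = self.second_determination(passengers)
        return self.recommendation(playing_types)
```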
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810846539.7A CN108921360A (en) | 2018-07-27 | 2018-07-27 | A kind of social interaction mode, device and electronic equipment based on unmanned vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108921360A (en) | 2018-11-30 |
Family
ID=64417459
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810846539.7A Pending CN108921360A (en) | 2018-07-27 | 2018-07-27 | A kind of social interaction mode, device and electronic equipment based on unmanned vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108921360A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014201282A1 (en) * | 2014-01-24 | 2015-07-30 | Volkswagen Aktiengesellschaft | Method for a driver assistance system |
CN106652515A (en) * | 2015-11-03 | 2017-05-10 | 中国电信股份有限公司 | Vehicle automatic control method, vehicle automatic control device, and vehicle automatic control system |
CN105739534A (en) * | 2016-04-22 | 2016-07-06 | 百度在线网络技术(北京)有限公司 | Multi-vehicle cooperative driving method and apparatus for driverless vehicles based on Internet-of-vehicles |
CN108259579A (en) * | 2017-12-29 | 2018-07-06 | 深圳云天励飞技术有限公司 | Vehicle crew's information-pushing method, equipment, readable storage medium storing program for executing and onboard system |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109785655A (en) * | 2018-12-11 | 2019-05-21 | 北京百度网讯科技有限公司 | Control method for vehicle, device, equipment, automatic driving vehicle and storage medium |
CN109557918A (en) * | 2018-12-17 | 2019-04-02 | 北京百度网讯科技有限公司 | Control method, device, equipment, vehicle and the storage medium of vehicle |
CN111461368A (en) * | 2019-01-21 | 2020-07-28 | 北京嘀嘀无限科技发展有限公司 | Abnormal order processing method, device, equipment and computer readable storage medium |
CN111461368B (en) * | 2019-01-21 | 2024-01-09 | 北京嘀嘀无限科技发展有限公司 | Abnormal order processing method, device, equipment and computer readable storage medium |
CN114964284A (en) * | 2022-04-22 | 2022-08-30 | 合众新能源汽车有限公司 | Vehicle path planning method and device |
WO2023201955A1 (en) * | 2022-04-22 | 2023-10-26 | 合众新能源汽车股份有限公司 | Vehicle path planning method and device |
CN114964284B (en) * | 2022-04-22 | 2024-06-18 | 合众新能源汽车股份有限公司 | Vehicle path planning method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10222226B2 (en) | Navigation systems and associated methods | |
US20240249363A1 (en) | Traveling-based insurance ratings | |
CN108944939B (en) | Method and system for providing driving directions | |
CN108921360A (en) | A kind of social interaction mode, device and electronic equipment based on unmanned vehicle | |
JP6027280B1 (en) | Provision system | |
US11727451B2 (en) | Implementing and optimizing safety interventions | |
DE102014203724A1 (en) | Method and system for selecting navigation routes and providing advertising on the route | |
CN111859178B (en) | Method and system for recommending get-on point | |
JP2004171060A (en) | Driving support device, driving support system and driving support program | |
US10373499B1 (en) | Cognitively filtered and recipient-actualized vehicle horn activation | |
JP2020095475A (en) | Matching method, matching server, matching system, and program | |
US20240344840A1 (en) | Sentiment-based navigation | |
CN111882112B (en) | Method and system for predicting arrival time | |
US12112245B2 (en) | Information processing apparatus and information processing method | |
CN113734187A (en) | Method and device for information interaction with vehicle user and vehicle machine | |
CN113320537A (en) | Vehicle control method and system | |
CN111238514A (en) | Generating device, control method of generating device, and storage medium | |
JP7264804B2 (en) | Recommendation system, recommendation method and program | |
CN114148342A (en) | Automatic driving judgment system, automatic driving control system and vehicle | |
JP7294205B2 (en) | In-vehicle signage system | |
CN113474827A (en) | Traffic environment recognition device and vehicle control device | |
JP2019104354A (en) | Information processing method and information processor | |
CN115631550A (en) | User feedback method and system | |
CN117521877A (en) | Travel team recommendation method and device, electronic equipment and storage medium | |
CN118781843A (en) | Bus driving route determining method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20181130 |