KR20160148394A - Autonomous vehicle - Google Patents
- Publication number
- KR20160148394A (Application No. KR1020150085407A)
- Authority
- KR
- South Korea
- Prior art keywords
- vehicle
- information
- driver
- route
- processor
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/037—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
- B60R16/0373—Voice control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
- B60W10/18—Conjoint control of vehicle sub-units of different type or different function including control of braking systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
- B60W10/20—Conjoint control of vehicle sub-units of different type or different function including control of steering systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0827—Inactivity or incapacity of driver due to sleepiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Automation & Control Theory (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
BACKGROUND OF THE INVENTION
A vehicle is a device that moves a boarding user in a desired direction. A typical example is an automobile.
On the other hand, various sensors and electronic devices are provided for the convenience of users of the vehicle. In particular, various devices for the user's driving convenience have been developed, such as providing an image photographed by a rear camera when the vehicle is backed up or parked.
An object of the present invention is to provide an autonomous vehicle capable of varying a route to a destination in a sleep state of a driver.
In order to achieve the above object, an autonomous vehicle according to the present invention includes a plurality of cameras, a radar, a communication unit, and a processor that controls the vehicle so as to perform autonomous travel along a first route toward a destination in an autonomous driving mode, varies the route to the destination based on at least one of driver sleep state information and traveling route state information when the driver is in a sleep state, and controls autonomous travel through the varied route.
An autonomous vehicle according to an embodiment of the present invention includes a plurality of cameras, a radar, a communication unit, and a processor that controls the vehicle so as to perform autonomous travel along a first route toward a destination in an autonomous driving mode. When the driver is in a sleep state, the processor varies the route to the destination based on at least one of the driver sleep state information and the traveling route state information and controls autonomous travel through the varied route. Accordingly, the route to the destination can be varied while the driver is asleep. Therefore, the convenience of use can be increased.
On the other hand, the processor calculates the expected wake-up time of the driver based on the driver's sleep state information, varies the route to the destination based on the calculated expected wake-up time, and controls autonomous travel through the varied route, so that it becomes possible to customize the route according to the driver's sleep. Therefore, the convenience of use can be increased.
On the other hand, the processor calculates the expected wake-up time of the driver based on the driver's sleep state information and varies the vehicle running speed toward the destination based on the calculated expected wake-up time, so that the speed can be adapted to the driver's sleep. Therefore, the convenience of use can be increased.
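As a rough sketch of the speed variation just described, the vehicle could pace itself so that arrival roughly coincides with the driver's expected wake-up time. The clamping limits below are illustrative assumptions, not values from the patent.

```python
def paced_speed_kmh(remaining_km, minutes_to_wake, v_min=30.0, v_max=100.0):
    """Speed at which arrival roughly coincides with the driver's
    expected wake-up time, clamped to illustrative speed limits."""
    if minutes_to_wake <= 0:
        return v_max  # driver expected to be awake already: no pacing
    v = remaining_km / (minutes_to_wake / 60.0)
    return max(v_min, min(v_max, v))

# 40 km remaining, driver expected to wake in 30 minutes: about 80 km/h.
v = paced_speed_kmh(40.0, 30.0)
```

The clamp keeps the pacing within plausible legal and comfort bounds even when the naive distance-over-time value is extreme.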
On the other hand, the destination information can be extracted based on the driver's voice or the driver's schedule information, thereby making it possible to increase the convenience of use.
On the other hand, it is possible to increase the usability by changing the route according to the destination change information or the traveling route information received from the outside and controlling the autonomous travel through the variable route.
On the other hand, it is possible to end the driver's sleep when the vehicle arrives at the destination, when a telephone call or message arrives at the driver's mobile terminal, when the vehicle travels toward a changed destination, or when the vehicle arrives at a rest area.
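The sleep-ending conditions above can be summarized as a simple predicate. The condition names are illustrative, not terms from the patent.

```python
def should_wake_driver(arrived, incoming_call, incoming_message,
                       at_rest_area, destination_changed):
    """True when any illustrative wake-up condition holds."""
    return any([arrived, incoming_call, incoming_message,
                at_rest_area, destination_changed])

# A phone call arriving at the driver's mobile terminal triggers wake-up.
wake = should_wake_driver(False, True, False, False, False)
```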
FIG. 1 is a conceptual diagram of a vehicle communication system including an autonomous driving apparatus according to an embodiment of the present invention.
FIG. 2A is a diagram showing the appearance of a vehicle having various cameras.
FIG. 2B is a view showing the appearance of a stereo camera attached to the vehicle of FIG. 2A.
FIG. 2C is a view schematically showing the positions of a plurality of cameras attached to the vehicle of FIG. 2A.
FIG. 2D illustrates an around view image based on images photographed by the plurality of cameras of FIG. 2C.
FIGS. 3A to 3B illustrate various examples of an internal block diagram of the autonomous driving apparatus of FIG. 1.
FIGS. 3C to 3D illustrate various examples of an internal block diagram of the autonomous vehicle of FIG. 1.
FIG. 3E is an internal block diagram of the vehicle display device of FIG. 1.
FIGS. 4A to 4B illustrate various examples of internal block diagrams of the processors of FIGS. 3A to 3D.
FIG. 5 is a diagram illustrating object detection in the processor of FIGS. 4A to 4B.
FIGS. 6A to 6B are views referred to in the description of the operation of the autonomous driving apparatus of FIG. 1.
FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.
FIG. 8 is a flowchart showing an operation method of an autonomous driving apparatus according to an embodiment of the present invention.
FIGS. 9A to 15 are diagrams referred to in the description of the operation method of FIG. 8.
FIG. 16 is a flowchart showing an operation method of an autonomous vehicle according to another embodiment of the present invention.
FIGS. 17A to 18 are diagrams referred to in the description of the operation method of FIG. 16.
Hereinafter, the present invention will be described in detail with reference to the drawings.
The suffixes "module" and "part" for components used in the following description are given merely for convenience of description and do not carry any special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.
The vehicle described herein may be a concept including a car and a motorcycle. Hereinafter, the vehicle will be described mainly with respect to the car.
On the other hand, the vehicle described in the present specification may be a concept including both a vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.
FIG. 1 is a conceptual diagram of a vehicle communication system including an autonomous driving apparatus according to an embodiment of the present invention.
Referring to the drawings, a
The
On the other hand, the autonomous
For example, for autonomous driving of the vehicle, when the vehicle speed is equal to or greater than a predetermined speed, autonomous travel of the vehicle is performed through the vehicle
As another example, when the vehicle
On the other hand, the
For example, when the
As another example, when the
The
On the other hand, the
On the other hand, the
Alternatively, the vehicle driving
On the other hand, the surrounding
Meanwhile, the
Meanwhile, the
On the other hand, the autonomous
FIG. 2A is a diagram showing the appearance of a vehicle having various cameras.
Referring to the drawings, the
The
On the other hand, the figure illustrates that the
The plurality of
FIG. 2B is a view showing the appearance of a stereo camera attached to the vehicle of FIG. 2A.
Referring to the drawing, the
The
The
A vehicle driving
FIG. 2C is a view schematically showing the positions of a plurality of cameras attached to the vehicle of FIG. 2A, and FIG. 2D illustrates an example of an ambient view image based on images photographed by the plurality of cameras of FIG. 2C.
First, referring to FIG. 2C, a plurality of
In particular, the
On the other hand, the
Each of the plurality of images photographed by the plurality of
FIG. 2D illustrates an example of the surrounding
FIGS. 3A to 3B illustrate various examples of an internal block diagram of the autonomous driving apparatus of FIG. 1.
3A and 3B illustrate an internal block diagram of the
The
3A, the vehicle driving
The
The
On the other hand, when the user is aboard the vehicle, the user's
The
The
On the other hand, the
Here, the sensor information includes at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.
Such sensor information may be obtained from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle internal temperature sensor, a vehicle internal humidity sensor, and the like. On the other hand, the position module may include a GPS module for receiving GPS information.
On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.
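The distinction drawn above, in which a subset of the sensor fields counts as vehicle running information, could be modeled as follows. The field names and units are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass

@dataclass
class SensorInfo:
    direction_deg: float   # vehicle direction (heading sensor)
    position: tuple        # GPS (latitude, longitude)
    angle_deg: float       # vehicle angle
    speed_kmh: float       # vehicle speed sensor
    tilt_deg: float        # vehicle body inclination sensor
    battery_pct: float     # not part of "running information"
    fuel_pct: float        # not part of "running information"

def running_information(s: SensorInfo) -> dict:
    """Subset of sensor information relating to vehicle running."""
    return {"direction": s.direction_deg, "position": s.position,
            "angle": s.angle_deg, "speed": s.speed_kmh, "tilt": s.tilt_deg}
```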
The
An audio output unit (not shown) converts an electric signal from the
An audio input unit (not shown) can receive a user's voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the
The
In particular, the
Particularly, when the object is detected, the
The
Meanwhile, the
On the other hand, the
On the other hand, the
On the other hand, the
The
The
The
The
Next, referring to FIG. 3B, the vehicle driving
The
The
The
FIGS. 3C to 3D illustrate various examples of the internal block diagram of the autonomous vehicle of FIG. 1.
FIGS. 3C to 3D illustrate an internal block diagram of the surrounding
The surround
On the other hand, the surrounding
Referring to FIG. 3C, the surrounding
The
The
On the other hand, when the user is boarding the vehicle, the user's
The
On the other hand, the
Here, the sensor information includes at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.
On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.
The
On the other hand, the
The
Particularly, the
Meanwhile, the
Particularly, when the object is detected, the
Then, the
On the other hand, the
The
Meanwhile, the
The
The plurality of
3D is similar to the surrounding
The
The
The
Meanwhile, the far
FIG. 3E is an internal block diagram of the vehicle display device of FIG. 1.
The
The
The input signal through the
The
For example, when the user is boarded in the vehicle, the user's mobile terminal and the
On the other hand, the
The space
The spatial
The spatial
For this purpose, the spatial
The
The
The
The
That is, when an object such as a user's hand approaches within a predetermined distance, an electric signal may be supplied to the electrode array or the like in the
In particular, the z-axis information can be sensed by the
The
Specifically, the
Here, the vehicle status information includes at least one of battery information, fuel information, vehicle speed information, tire information, steering information by steering wheel rotation, vehicle lamp information, vehicle internal temperature information, vehicle external temperature information, can do.
The
The
For example, the
As another example, the
The
The
The
When the user's hands approach the
On the other hand, when the user's hand approaches within a second distance closer to the
On the other hand, the
Based on the sensed signal, the
On the other hand, the
Specifically, the
That is, the
The
The
Figures 4A-4B illustrate various examples of internal block diagrams of the processors of Figures 3A-3D, and Figure 5 is a diagram illustrating object detection in the processors of Figures 4A-4B.
4A is a block diagram of the
The
The
Specifically, the
The
At this time, the stereo matching may be performed on a pixel-by-pixel basis or on a predetermined block basis. On the other hand, the disparity map may mean a map in which the binocular parallax information between the left and right images is expressed as numerical values.
The
Specifically, the
For example, an area having disparity information below a predetermined value within the disparity map can be calculated as the background, and the corresponding part can be excluded. Thereby, the foreground can be relatively separated.
As another example, an area in which the disparity information is equal to or greater than a predetermined value in the disparity map can be calculated as the foreground, and the corresponding part can be extracted. Thereby, the foreground can be separated.
Thus, by separating the foreground and the background based on the disparity information extracted from the stereo image, it becomes possible to reduce the signal processing time, the signal processing amount, and the like at the time of subsequent object detection.
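The threshold-based foreground/background separation described above can be sketched in a few lines. The threshold value and the toy disparity values below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def segment_by_disparity(disparity_map, threshold):
    """Split a disparity map into foreground (near objects, large
    disparity) and background (far scene, small disparity) masks."""
    d = np.asarray(disparity_map, dtype=float)
    foreground = d >= threshold   # nearer points have larger disparity
    background = ~foreground
    return foreground, background

# Toy 2x3 disparity map: one near object (large disparity) on a far scene.
dmap = [[2, 40, 3],
        [1, 42, 2]]
fg, bg = segment_by_disparity(dmap, threshold=10)
```

Restricting later detection to the foreground mask is what shortens the subsequent processing, as the paragraph above notes.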
Next, the
That is, the
More specifically, the
Next, the
For this purpose, the
On the other hand, the
For example, the
An
4B is another example of an internal block diagram of the processor.
Referring to FIG. 4B, the
The
Next, the
For this purpose, the
FIG. 5 is a diagram referred to for explaining the operation method of the
Referring to FIG. 5, during the first and second frame periods, the plurality of
The
The
On the other hand, when such a disparity map is displayed, it may be displayed so as to have a higher luminance as the disparity level becomes larger, and a lower luminance as the disparity level becomes smaller.
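The luminance convention just described, brighter for larger disparity, can be sketched as a simple normalization. The 0..255 output range and the rounding are illustrative choices.

```python
import numpy as np

def disparity_to_luminance(disparity_map):
    """Map disparity values to 0..255 luminance: larger disparity
    (nearer object) becomes brighter, smaller becomes darker."""
    d = np.asarray(disparity_map, dtype=float)
    span = d.max() - d.min()
    if span == 0:
        return np.zeros(d.shape, dtype=np.uint8)  # flat map: all dark
    return np.round((d - d.min()) / span * 255).astype(np.uint8)

# The nearest point (largest disparity) renders white, the farthest black.
img = disparity_to_luminance([[0, 10], [20, 40]])
```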
In the figure, first to
The
In the figure, using the
That is, the first to
On the other hand, by continuously acquiring the image, the
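Tracking based on continuously acquired images, as mentioned above, can be reduced to computing a motion vector between the positions of the same object in consecutive frames. The bounding-box format (x1, y1, x2, y2) is an assumption for illustration.

```python
def centroid(box):
    """Center (x, y) of a bounding box given as (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def motion_vector(box_prev, box_curr):
    """Movement of an object's centroid between two consecutive frames."""
    (px, py), (cx, cy) = centroid(box_prev), centroid(box_curr)
    return (cx - px, cy - py)

# The same vehicle detected in frame N and frame N+1.
v = motion_vector((100, 200, 140, 240), (106, 198, 146, 238))
```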
6A to 6B are views referred to in the description of the operation of the autonomous travel apparatus of FIG.
First, FIG. 6A is a diagram illustrating a vehicle forward situation photographed by a
Referring to the drawing, a
Next, FIG. 6B illustrates the display of the vehicle front state, which is grasped by the vehicle driving assist system, together with various information. In particular, the image as shown in FIG. 6B may be displayed on the
6B is different from FIG. 6A in that information is displayed on the basis of an image photographed by the
A
The vehicle driving assistant 100a performs signal processing on the basis of the stereo image photographed by the
On the other hand, in the drawing, it is exemplified that each of them is highlighted by a frame to indicate object identification for the
On the other hand, the vehicle driving
In the figure, calculated
On the other hand, the vehicle driving
The figure illustrates that the
On the other hand, the
The vehicle driving assistant 100a may display various information shown in FIG. 6B through the
FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.
Referring to the drawings, the
The
Meanwhile, the
The
The communication unit 720 can exchange data with the
The communication unit 720 receives from the
On the other hand, when the user aboard the vehicle, the user's
The
On the other hand, the
The
The
The
The power
For example, when a fossil fuel-based engine (not shown) is a power source, the power
As another example, when the electric motor (not shown) is a power source, the power
The
The
The air
The
The
The
On the other hand, the
The
Thereby, the
In addition, the
The
It is possible to perform a specific operation by input by the
Also, the
On the other hand, the
The
For the display of such an ambient view image or the like, the
The
The
The
The
A plurality of
The
The
The
FIG. 8 is a flowchart showing an operation method of an autonomous driving apparatus according to an embodiment of the present invention, and FIGS. 9A to 15 are diagrams referred to in the description of the operation method of FIG. 8.
Referring to the drawings, the
For example, when the autonomous operation mode button provided in the
As another example, when the driver utters the voice "autonomous operation mode", the
As another example, when the driver selects the 'autonomous operation mode' item through the
Upon entering the autonomous operation mode, the
Specifically, the
In addition, the
Next, the
The
Here, receiving destination information can be implemented in various ways.
For example, when the driver's voice including the 'destination information' is input through the
As another example, the
Then, the
Here, the first route may be any one of a route requiring the shortest time, an optimum route for high-speed travel, and a route consuming the lowest cost.
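Choosing the first route among candidates by a single criterion, as in the paragraph above, might look like the following sketch. The route attributes and names are hypothetical.

```python
def pick_first_route(routes, criterion):
    """Pick the candidate route minimizing the given attribute.

    `routes` is a list of dicts with hypothetical keys such as
    'time_min' (travel time in minutes) and 'cost' (tolls plus fuel).
    """
    return min(routes, key=lambda r: r[criterion])

candidates = [
    {"name": "A", "time_min": 50, "cost": 12.0},
    {"name": "B", "time_min": 65, "cost": 7.5},
]
fastest = pick_first_route(candidates, "time_min")  # shortest time
cheapest = pick_first_route(candidates, "cost")     # lowest cost
```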
Next, the
When the driver is in the sleep state, the
On the other hand, if the driver is not in the sleep state in step 925 (S925), the
When the destination variable information is received, the
On the other hand, unlike the figure, when the destination variable information is received while the driver is in the sleep state, the
The
9A shows an example of the interior of the vehicle. As an example of the
The
The
On the other hand, FIG. 9A illustrates that a
The
On the other hand, FIG. 9A illustrates a body
The
The
When it is determined that the driver is sleeping, the
On the other hand, FIG. 9A illustrates that the driver outputs a
Accordingly, the
On the other hand, the
On the other hand, unlike FIG. 9B, it is also possible to output the autonomous driving
The
FIG. 9D illustrates a case where the
9A, the
9E, the
The
Particularly, as shown in FIG. 9F, the
On the other hand, the processor can control at least one of lane change pattern control, acceleration/deceleration pattern control, vibration control, noise cancellation, external light blocking, internal light control, and temperature control, according to the driver's sleep state.
For example, when the driver's sleep is shallow, the lane change is performed smoothly, acceleration and deceleration are performed gently, and vibration is controlled smoothly. Further, in order to block external light, the light transmittance of the windshield, windows, and the like can be controlled to be considerably low. It is also possible to perform audio signal processing for external noise cancellation. Further, it is possible to control the luminance of the internal illumination to be significantly lowered. Further, the air conditioning unit may be controlled accordingly.
On the other hand, the
Alternatively, the
10A illustrates that the driver sleeps in a shallow sleep state.
Fig. 10B illustrates that the vehicle running speed is variable when the driver is in a shallow sleep state.
10B, the
FIG. 10C illustrates that the route to the destination is varied when the driver is in a shallow sleep state.
Alternatively, when the driver is in a shallow sleep state, the
On the other hand, the
10D, when the congestion occurs in the route of the
For example, the
11A illustrates that the driver sleeps in a deep sleep state.
FIG. 11B illustrates that the vehicle running speed is variable when the driver is in a deep sleep state.
11B, the
FIG. 11C illustrates that the route to the destination is varied when the driver is in a deep sleep state.
Alternatively, when the driver is in a deep sleep state, the
On the other hand, the
When the congestion occurs in the route of the
Fig. 12 is a view showing an example of
When the current position of the vehicle is in the Z position, the
In particular, when the destination is set to X, the
On the other hand, the
On the other hand, the
On the other hand, the wake-up mode can be entered in other cases as well. That is, the
The
Alternatively, the
On the other hand, when the destination arrives, the
The
14A illustrates that a
Alternatively, it is also possible that a destination change message is received at the driver's mobile terminal 600a and the message is transmitted to the
The
The
Fig. 15 is a diagram showing an example of a destination variable described in Figs. 14A to 14B.
When the current position of the vehicle is in the Z position, the
In particular, when the destination is set to X, the
On the other hand, when the destination variable information is received, the destination is corrected from X to X ', so that the route can be changed from
FIG. 16 is a flowchart showing an operation method of an autonomous driving apparatus according to another embodiment of the present invention, and FIGS. 17A to 18 are views referred to in the description of the operation method of FIG. 16.
Referring to the drawings, the
Next, the
Next, the
Steps S1710 through S1725 are the same as steps S910 through S925 of FIG. 8, and a description thereof will be omitted.
Next, when the driver is in the sleep state, the
Next, the
If the destination variable information is received, control is performed to enter the wake-up mode (S1750).
Alternatively, after step 1730 (S1730), the
FIGS. 17A to 17H illustrate various wake-up mode entry conditions.
For example, FIG. 17B illustrates a case in which the destination is changed, and FIG. 17H illustrates a case in which a message is received and thus the
On the other hand, the
Meanwhile, the method of operating the autonomous vehicle of the present invention can be implemented as processor-readable code on a recording medium readable by a processor included in the autonomous vehicle. The processor-readable recording medium includes all kinds of recording devices in which data that can be read by the processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.
Claims (14)
Radar;
A communication unit;
And a processor configured to control the vehicle to perform autonomous travel along a first route toward a destination in an autonomous travel mode, and, when the driver is in a sleep state, to vary the route to the destination based on at least one of driver sleep state information and running route state information and to control autonomous travel of the vehicle along the varied route.
Wherein the processor:
Controls the autonomous vehicle to perform autonomous travel along the first route, among a plurality of routes to the destination, based on the received destination information when destination information is received in the autonomous travel mode.
And an audio input unit,
Wherein the processor:
Recognizes the driver's voice when it is input through the audio input unit, and extracts the destination information based on the recognized voice.
Wherein the communication unit:
Receives the driver's schedule information from the driver's mobile terminal,
And the processor:
Extracts the destination information based on the schedule information.
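As one hypothetical realization of this claim, the location of the next schedule entry can serve as the destination candidate. The sketch and its field names are assumptions for illustration only:

```python
def destination_from_schedule(schedule, now_hour):
    """Pick the location of the earliest schedule entry at or after
    the current hour as the destination candidate; None if no entry."""
    upcoming = [e for e in schedule if e["hour"] >= now_hour]
    if not upcoming:
        return None
    return min(upcoming, key=lambda e: e["hour"])["location"]

# Example: at 10:00, the next entry is the 18:00 one, so "home" is chosen.
schedule = [{"hour": 9, "location": "office"}, {"hour": 18, "location": "home"}]
```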
Internal camera;
And a driver detection sensor for detecting the driver's body information,
Wherein the processor:
Determines whether the driver is sleeping, or the driver's sleep state information, based on the image from the internal camera and the driver's body information from the driver detection sensor, and, if the driver is determined to be in the sleep state, controls the vehicle so as to vary the route to the destination based on at least one of the driver sleep state information and the running route state information and to perform autonomous travel along the varied route.
Wherein the processor:
Based on the driver sleep state information,
Controls at least one of lane change pattern control, acceleration/deceleration pattern control, vibration control, noise canceling, external light cutoff, interior light control, temperature control, and driving state display.
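A minimal sketch of the comfort-oriented adjustments this claim names, applied only while the driver sleeps. The keys and values are illustrative assumptions, not from the patent:

```python
def sleep_comfort_controls(driver_asleep: bool) -> dict:
    """Return the control settings applied while the driver sleeps:
    gentler lane changes and acceleration, cabin quieting, blocked
    external light, dimmed interior light, mild temperature."""
    if not driver_asleep:
        return {}
    return {
        "lane_change_pattern": "gentle",
        "accel_decel_pattern": "smooth",
        "vibration_control": "damped",
        "noise_canceling": True,
        "external_light_cutoff": True,   # e.g. window shade / tint
        "interior_light": "dim",
        "temperature_c": 22,
        "driving_state_display": "on",
    }
```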
Wherein the processor:
Calculates an expected wake-up (sleep-exit) time of the driver based on the driver sleep state information, varies the route to the destination based on the calculated expected wake-up time, and controls the vehicle to perform autonomous travel along the varied route.
Wherein the processor:
Calculates an expected wake-up (sleep-exit) time of the driver based on the driver sleep state information, and varies the vehicle traveling speed to the destination based on the calculated expected wake-up time.
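Varying the traveling speed so that arrival roughly coincides with the expected wake-up time can be sketched as follows; the speed bounds and function names are illustrative assumptions:

```python
def target_speed_kmh(remaining_km: float, expected_wake_s: float,
                     v_min: float = 40.0, v_max: float = 110.0) -> float:
    """Cruise speed chosen so the vehicle arrives at about the driver's
    expected wake-up time, clamped to comfort/legal bounds."""
    if expected_wake_s <= 0:
        return v_max  # driver already awake (or about to be): no slowdown
    ideal = remaining_km / (expected_wake_s / 3600.0)  # km/h to arrive on time
    return max(v_min, min(v_max, ideal))
```

With 60 km remaining and one hour of expected sleep, the sketch yields 60 km/h; very short or very long remaining distances clamp to the bounds.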
Wherein the processor:
Compares, when destination change information is received from the outside, the priority of the destination selected by the driver with the priority of the received destination change information to determine whether to change the destination,
And, when the destination change is determined, varies the route based on the received destination change information and controls the vehicle to perform autonomous travel along the varied route.
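The priority comparison can be sketched as below; the convention that a lower number means a higher priority is an assumption for illustration:

```python
def decide_destination(current, proposed):
    """current / proposed: (name, priority) tuples. Switch to the
    proposed destination only when it outranks the driver's choice
    (lower number = higher priority, by assumption)."""
    name_cur, prio_cur = current
    name_new, prio_new = proposed
    return proposed if prio_new < prio_cur else current
```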
Wherein the communication unit:
Receives the running route state information,
And the processor:
Searches, when vehicle accident information ahead is included in the received running route state information or when the difference between the expected arrival time and the target time is equal to or greater than a predetermined value, for a bypass route based on at least one of the road type, the speed limit, the current speed, and the curvature of the road, selects one of the searched bypass routes, varies the route, and performs autonomous travel along the varied route.
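The bypass decision can be sketched as a two-step check (trigger, then selection). Here routes are scored only by expected travel time, a simplification of the factors the claim lists, and all names are illustrative:

```python
def pick_bypass(routes, accident_ahead, eta_gap_s, threshold_s=600):
    """routes: list of dicts like {"name": ..., "eta_s": ...}.
    Search for a bypass only when an accident is reported ahead or the
    ETA exceeds the target time by the threshold; otherwise keep the
    current route (returns None)."""
    if not accident_ahead and eta_gap_s < threshold_s:
        return None
    return min(routes, key=lambda r: r["eta_s"])
```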
Wherein the processor:
Controls the vehicle to enter the wake-up (sleep-exit) mode when the vehicle arrives at the destination, when the driver's mobile terminal receives a call or a message while traveling to the destination, or when the vehicle arrives at a rest area.
Wherein the processor:
Controls at least one of driver's seat vibration, lighting, a speaker, a window, a sunroof, and the interior temperature upon entry into the wake-up (sleep-exit) mode.
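An escalating actuator sequence for wake-up mode entry might look like this sketch; the ordering and the intensity scaling are assumptions, not from the patent:

```python
def wakeup_actions(intensity: int = 1) -> list:
    """Ordered actuator commands on wake-up (sleep-exit) mode entry;
    higher intensity enables more of the escalation steps."""
    steps = [
        "vibrate_driver_seat",
        "raise_interior_lighting",
        "play_speaker_chime",
        "open_window_slightly",
        "open_sunroof_slightly",
        "lower_cabin_temperature",
    ]
    n = max(1, min(intensity * 2, len(steps)))  # clamp to a sensible range
    return steps[:n]
```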
A display; And
An audio output unit,
Wherein the processor:
Outputs at least one of the driver sleep state information, the running route state information, and the varied route information through the display or the audio output unit.
A steering driver for driving the steering device;
A brake driver for driving the brake device;
And a power source driving unit for driving the power source,
Wherein the processor:
Controls, during autonomous travel of the vehicle, at least one of the steering driving section, the brake driving section, and the power source driving section based on the images from the plurality of cameras and the distance information on objects around the vehicle from the radar.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150085407A KR20160148394A (en) | 2015-06-16 | 2015-06-16 | Autonomous vehicle |
PCT/KR2016/006348 WO2016204507A1 (en) | 2015-06-16 | 2016-06-15 | Autonomous traveling vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150085407A KR20160148394A (en) | 2015-06-16 | 2015-06-16 | Autonomous vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160148394A true KR20160148394A (en) | 2016-12-26 |
Family
ID=57733986
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150085407A KR20160148394A (en) | 2015-06-16 | 2015-06-16 | Autonomous vehicle |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20160148394A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180104457A (en) * | 2017-03-13 | 2018-09-21 | 현대자동차주식회사 | Apparatus for sleeping aid in vehicle, system having the same and method thereof |
CN111192580A (en) * | 2019-12-31 | 2020-05-22 | 浙江合众新能源汽车有限公司 | Method and device for actively starting ACC function of automobile through voice |
KR20200116180A (en) * | 2019-03-11 | 2020-10-12 | 현대모비스 주식회사 | Apparatus for controlling lane change of vehicle and method thereof |
US11453414B2 (en) * | 2019-03-27 | 2022-09-27 | Volkswagen Aktiengesellschaft | Method and device for adapting a driving strategy of an at least partially automated transportation vehicle |
CN115209046A (en) * | 2021-04-14 | 2022-10-18 | 丰田自动车株式会社 | Information processing apparatus, non-transitory storage medium, and information processing method |
WO2023229055A1 (en) * | 2022-05-23 | 2023-11-30 | 엘지전자 주식회사 | Vehicle monitoring apparatus, vehicle comprising same and vehicle operating method |