
KR20160148394A - Autonomous vehicle - Google Patents


Info

Publication number
KR20160148394A
Authority
KR
South Korea
Prior art keywords
vehicle
information
driver
route
processor
Prior art date
Application number
KR1020150085407A
Other languages
Korean (ko)
Inventor
박형민
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020150085407A
Priority to PCT/KR2016/006348 (WO2016204507A1)
Publication of KR20160148394A

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14Adaptive cruise control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • B60K28/06Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/037Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
    • B60R16/0373Voice control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/18Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/20Conjoint control of vehicle sub-units of different type or different function including control of steering systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0818Inactivity or incapacity of driver
    • B60W2040/0827Inactivity or incapacity of driver due to sleepiness
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to an autonomous vehicle. According to the present invention, the autonomous vehicle comprises a plurality of cameras, a radar, a communication unit, and a processor that controls the vehicle to travel autonomously along a first route toward a destination in an autonomous driving mode, varies the route to the destination based on at least one of driver sleep state information and traveling route information when the driver falls asleep, and controls the vehicle to travel autonomously along the varied route. The route to the destination can thereby be varied while the driver sleeps.

Description

Autonomous vehicle

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to an autonomous vehicle, and more particularly, to an autonomous vehicle capable of varying a route to a destination while the driver is asleep.

A vehicle is a device that moves a boarding user in a desired direction. A typical example is an automobile.

Meanwhile, various sensors and electronic devices are provided for the convenience of users of the vehicle. In particular, various devices for the driver's convenience have been developed; for example, an image captured by a rear camera is provided when the vehicle is reversing or being parked.

An object of the present invention is to provide an autonomous vehicle capable of varying a route to a destination in a sleep state of a driver.

To achieve the above object, an autonomous vehicle according to the present invention includes a plurality of cameras, a radar, a communication unit, and a processor that controls the vehicle to travel autonomously along a first route toward a destination in an autonomous driving mode, varies the route to the destination based on at least one of driver sleep state information and traveling route state information when the driver is in a sleep state, and controls autonomous travel along the varied route.

An autonomous vehicle according to an embodiment of the present invention includes a plurality of cameras, a radar, a communication unit, and a processor that controls the vehicle to travel autonomously along a first route toward a destination in an autonomous driving mode and, when the driver is in a sleep state, varies the route to the destination based on at least one of driver sleep state information and traveling route state information and controls autonomous travel along the varied route. The route to the destination can thus be varied while the driver sleeps. Therefore, convenience of use can be increased.

Meanwhile, the processor may calculate the driver's expected wake-up time based on the driver sleep state information, vary the route to the destination based on the calculated expected wake-up time, and control autonomous travel along the varied route, so that the route can be customized to the driver's sleep. Therefore, convenience of use can be increased.

Meanwhile, the processor may calculate the driver's expected wake-up time based on the driver sleep state information and change the vehicle's traveling speed to the destination based on the calculated expected wake-up time, so that the speed can be varied according to the driver's sleep. Therefore, convenience of use can be increased.
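The route and speed variation described above can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the `Route` container, the selection policy (prefer arrival at or just after the expected wake-up time), and all names and values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    duration_min: float  # estimated travel time to the destination, in minutes

def pick_route(routes, expected_wakeup_min):
    """Pick the candidate route whose arrival best matches the driver's
    expected wake-up time: prefer routes that arrive no earlier than the
    wake-up time and, among those, the earliest; otherwise fall back to
    the longest available route."""
    late_enough = [r for r in routes if r.duration_min >= expected_wakeup_min]
    if late_enough:
        return min(late_enough, key=lambda r: r.duration_min)
    return max(routes, key=lambda r: r.duration_min)

def adjusted_speed_kmh(base_speed_kmh, distance_km, expected_wakeup_min):
    """Alternatively, keep the route and slow down so that arrival roughly
    coincides with the expected wake-up time (never exceed the base speed)."""
    target_h = expected_wakeup_min / 60.0
    return min(base_speed_kmh, distance_km / target_h)
```

For example, with candidate routes of 30, 50, and 70 minutes and a driver expected to wake after 45 minutes, the 50-minute route is chosen; alternatively, a 60 km route driven to fill 45 minutes caps the speed at 80 km/h.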

On the other hand, the destination information can be extracted based on the driver's voice or the driver's schedule information, thereby making it possible to increase the convenience of use.

On the other hand, it is possible to increase the usability by changing the route according to the destination change information or the traveling route information received from the outside and controlling the autonomous travel through the variable route.

Meanwhile, the driver's sleep can be ended, that is, the driver can be awakened, when the vehicle arrives at the destination, when a telephone call or a message arrives at the driver's mobile terminal, or when the vehicle arrives at a rest area while traveling to the destination.

FIG. 1 is a conceptual diagram of a vehicle communication system including an autonomous driving apparatus according to an embodiment of the present invention.
FIG. 2A is a diagram showing the appearance of a vehicle having various cameras.
FIG. 2B is a view showing the appearance of a stereo camera attached to the vehicle of FIG. 2A.
FIG. 2C is a view schematically showing the positions of a plurality of cameras attached to the vehicle of FIG. 2A.
FIG. 2D illustrates a surround view image based on images captured by the plurality of cameras of FIG. 2C.
FIGS. 3A to 3B illustrate various examples of an internal block diagram of the autonomous driving apparatus of FIG. 1.
FIGS. 3C to 3D illustrate various examples of an internal block diagram of the autonomous driving apparatus of FIG. 1.
FIG. 3E is an internal block diagram of the vehicle display device of FIG. 1.
FIGS. 4A to 4B illustrate various examples of internal block diagrams of the processors of FIGS. 3A to 3D.
FIG. 5 is a diagram illustrating object detection in the processor of FIGS. 4A to 4B.
FIGS. 6A to 6B are views referred to in the description of the operation of the autonomous driving apparatus of FIG. 1.
FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.
FIG. 8 is a flowchart showing an operation method of an autonomous driving apparatus according to an embodiment of the present invention.
FIGS. 9A to 15 are views referred to in the description of the operation method of FIG. 8.
FIG. 16 is a flowchart showing an operation method of an autonomous vehicle according to another embodiment of the present invention.
FIGS. 17A to 18 are views referred to in the description of the operation method of FIG. 16.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The suffixes "module" and "part" for components used in the following description are given merely for convenience of description and do not carry special significance or roles in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

The vehicle described herein may encompass a car and a motorcycle. Hereinafter, the description will focus on the car.

Meanwhile, the vehicle described in this specification may encompass a vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as power sources, and an electric vehicle having an electric motor as a power source.

1 is a conceptual diagram of a vehicle communication system including an autonomous mobile device according to an embodiment of the present invention.

Referring to the drawings, a vehicle communication system 10 may include a vehicle 200, terminals 600a and 600b, and a server 500.

The vehicle 200 may include an autonomous traveling device 100, a vehicle display device 400, and the like in the interior of the vehicle.

Meanwhile, the autonomous driving apparatus 100 may include a vehicle driving assistance device 100a, a surrounding view providing device 100b, and the like.

For example, for autonomous driving of the vehicle, when the vehicle speed is equal to or greater than a predetermined speed, autonomous travel may be performed through the vehicle driving assistance device 100a, and when the vehicle speed is less than the predetermined speed, autonomous travel may be performed through the surrounding view providing device 100b.

As another example, the vehicle driving assistance device 100a and the surrounding view providing device 100b may be operated together for autonomous driving; at or above the predetermined speed, a greater weight is given to the vehicle driving assistance device 100a so that autonomous travel is performed mainly through it, and below the predetermined speed, a greater weight is given to the surrounding view providing device 100b so that autonomous travel is performed mainly through the surrounding view providing device 100b.
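The speed-dependent weighting just described might be sketched as follows. The threshold and the 0.7/0.3 split are illustrative assumptions, not values taken from the specification.

```python
def subsystem_weights(speed_kmh, threshold_kmh=30.0):
    """Weight the two perception subsystems by vehicle speed: at or above
    the threshold, the driving-assistance device (stereo camera + radar)
    leads; below it, the surround view providing device leads."""
    if speed_kmh >= threshold_kmh:
        return {"driving_assistance": 0.7, "surround_view": 0.3}
    return {"driving_assistance": 0.3, "surround_view": 0.7}
```

At highway speed the far-range stereo/radar pipeline dominates, while at parking speed the near-range surround view cameras dominate, which matches the two operating regimes described above.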

Meanwhile, the vehicle driving assistance device 100a, the surrounding view providing device 100b, and the vehicle display device 400 can exchange data with the terminals 600a and 600b or the server 500 through their own communication units (not shown) or a communication unit provided in the vehicle 200, as shown in FIG. 1.

For example, when the mobile terminal 600a is located inside or near the vehicle, at least one of the vehicle driving assistance device 100a, the surrounding view providing device 100b, and the vehicle display device 400 can exchange data with the terminal 600a by short-range communication.

As another example, when the terminal 600b is located at a remote place outside the vehicle, at least one of the vehicle driving assistance device 100a, the surrounding view providing device 100b, and the vehicle display device 400 can exchange data with the terminal 600b or the server 500 by remote communication via the network 570.

The terminals 600a and 600b may be mobile terminals such as mobile phones, smartphones, tablet PCs, and wearable devices such as smart watches, or fixed terminals such as a TV or a monitor. Hereinafter, the terminal 600 will be described mainly as a mobile terminal such as a smartphone.

On the other hand, the server 500 may be a server provided by a vehicle manufacturer or a server operated by a provider providing a vehicle-related service. For example, it may be a server operated by a provider providing information on road traffic conditions and the like.

On the other hand, the vehicle driving assistant 100a can generate and provide vehicle-related information by signal processing the stereo image received from the stereo camera 195 based on computer vision. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver.

Alternatively, the vehicle driving assistance device 100a may generate a control signal for autonomous travel of the vehicle based on the stereo image received from the stereo camera 195 and distance information on objects around the vehicle received from the radar 797. For example, it is possible to output a control signal for controlling at least one of a steering driver, a brake driver, and a power source driver during autonomous travel.

Meanwhile, the surrounding view providing apparatus 100b transmits a plurality of images captured by the plurality of cameras 295a, 295b, 295c, and 295d to a processor (270 in FIG. 3C or FIG. 3D), and the processor (270 in FIG. 3C or FIG. 3D) can combine the plurality of images to generate and provide a surround view image.

Meanwhile, the vehicle display device 400 may be an AVN (Audio Video Navigation) device.

Meanwhile, the vehicle display apparatus 400 may include a space recognition sensor unit and a touch sensor unit, whereby a remote approach can be sensed by the space recognition sensor unit and a near touch can be sensed through the touch sensor unit. A user interface corresponding to the sensed user gesture or touch can then be provided.

On the other hand, the autonomous mobile device 100 according to the embodiment of the present invention,

2A is a diagram showing the appearance of a vehicle having various cameras.

Referring to the drawing, the vehicle 200 may include wheels 103FR, 103FL, 103RL, ... rotated by a power source, a steering wheel 250 for adjusting the traveling direction of the vehicle 200, a stereo camera 195 provided inside the vehicle 200 for the vehicle driving assistance device 100a, and a plurality of cameras 295a, 295b, 295c, and 295d mounted on the vehicle 200 for the surrounding view providing device 100b of FIG. 1. In the figure, only the left camera 295a and the front camera 295d are shown for convenience.

The stereo camera 195 may include a plurality of cameras, and the stereo image obtained by the plurality of cameras may be signal-processed in the vehicle driving assistance apparatus (100a in Fig. 3).

On the other hand, the figure illustrates that the stereo camera 195 includes two cameras.

The plurality of cameras 295a, 295b, 295c, and 295d can be activated when the vehicle speed is equal to or lower than a predetermined speed or when the vehicle is reversing, and can each acquire a captured image. The images obtained by the plurality of cameras can be signal-processed within the surrounding view providing device (100b in FIG. 3C or 3D).

FIG. 2B is a view showing the appearance of a stereo camera attached to the vehicle of FIG. 2A.

Referring to the drawing, the stereo camera module 195 may include a first camera 195a having a first lens 193a, and a second camera 195b having a second lens 193b.

The stereo camera module 195 may include a first light shield 192a and a second light shield 192b for shielding light incident on the first lens 193a and the second lens 193b, respectively.

The stereo camera module 195 in the drawing may be detachably attached to the ceiling or the windshield of the vehicle 200.

A vehicle driving assistance device 100a (FIG. 3) having such a stereo camera module 195 can obtain a stereo image of the area in front of the vehicle from the stereo camera module 195, perform disparity detection based on the stereo image, perform object detection on at least one of the stereo images based on the disparity information, and, after the object detection, continuously track the motion of an object.
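Disparity-based ranging of this kind rests on two steps: matching a patch between the left and right images to find its horizontal shift, and triangulating depth from that shift. The sketch below is a deliberately naive, pure-Python illustration, not the device's actual pipeline (real systems use dense matchers such as semi-global matching, often in hardware); the window size, search range, and camera parameters are assumed values.

```python
def disparity_1d(left_row, right_row, x, window=1, max_d=16):
    """Naive block matching along one scanline: return the horizontal
    shift d that minimizes the sum of absolute differences between a
    patch around x in the left image and the same-size patch shifted
    left by d in the right image."""
    best_d, best_cost = 0, float("inf")
    patch = left_row[x - window: x + window + 1]
    for d in range(max_d):
        xr = x - d
        if xr - window < 0:
            break  # shifted patch would fall off the image edge
        cand = right_row[xr - window: xr + window + 1]
        cost = sum(abs(a - b) for a, b in zip(patch, cand))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate distance with the pinhole stereo model: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: object at infinity or bad match")
    return focal_px * baseline_m / disparity_px
```

A nearby object thus produces a large disparity and a small depth, which is why disparity maps can drive the vehicle, pedestrian, and road-surface detection described below.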

FIG. 2C is a view schematically showing the positions of a plurality of cameras attached to the vehicle of FIG. 2A, and FIG. 2D illustrates an example of a surround view image based on images captured by the plurality of cameras of FIG. 2C.

First, referring to FIG. 2C, a plurality of cameras 295a, 295b, 295c, and 295d may be disposed on the left, rear, right, and front of the vehicle, respectively.

In particular, the left camera 295a and the right camera 295c may be disposed in a case that surrounds the left side mirror and a case that surrounds the right side mirror, respectively.

Meanwhile, the rear camera 295b and the front camera 295d can be disposed near the trunk switch and near the emblem or the radiator grille, respectively.

Each of the plurality of images captured by the plurality of cameras 295a, 295b, 295c, and 295d is transmitted to a processor (270 in FIG. 3C or 3D) in the vehicle 200, and the processor (270 in FIG. 3C or 3D) combines the plurality of images to generate a surround view image.

FIG. 2D illustrates an example of the surround view image 210. The surround view image 210 may include a first image area 295ai from the left camera 295a, a second image area 295bi from the rear camera 295b, a third image area 295ci from the right camera 295c, and a fourth image area 295di from the front camera 295d.
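The composition of the four image areas into one top-down view can be sketched as below. This is a layout-only illustration under strong assumptions (the images are already warped to a common top-down plane and are represented here as rows of characters); a real around-view monitor additionally performs lens-distortion correction, perspective warping, and seam blending.

```python
def compose_surround_view(front, left, right, rear, car="CC"):
    """Assemble a top-down mosaic from four camera regions: the front
    strip on top, the rear strip on the bottom, and the left and right
    strips flanking a vehicle placeholder in the middle rows."""
    assert len(left) == len(right), "side strips must have equal height"
    middle = [l + car + r for l, r in zip(left, right)]
    return list(front) + middle + list(rear)
```

Calling it with one-row strips, e.g. `compose_surround_view(["FFFF"], ["l"], ["r"], ["RRRR"])`, yields the three-band layout of FIG. 2D: front area, side areas around the vehicle, rear area.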

3A to 3B illustrate various examples of an internal block diagram of the autonomous navigation apparatus of FIG.

3A and 3B illustrate an internal block diagram of the autonomous navigation device 100 for the vehicle driving assistance device 100a.

The vehicle driving assistant 100a can process the stereo image received from the stereo camera 195 based on computer vision to generate vehicle related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver.

Referring to FIG. 3A, the vehicle driving assistance device 100a may include a communication unit 120, an interface unit 130, a memory 140, a processor 170, a power supply unit 190, a stereo camera 195, an audio input unit (not shown), and an audio output unit (not shown).

The communication unit 120 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 120 can exchange data with a mobile terminal of a vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 120 can receive weather information and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information, from the mobile terminal 600 or the server 500. Meanwhile, the vehicle driving assistance device 100a may transmit real-time traffic information based on the stereo image to the mobile terminal 600 or the server 500.

On the other hand, when the user is aboard the vehicle, the user's mobile terminal 600 and the vehicle driving assistant 100a can perform pairing with each other automatically or by execution of the user's application.

The interface unit 130 can receive vehicle-related data or transmit a signal processed or generated by the processor 170 to the outside. To this end, the interface unit 130 can perform data communication with the ECU 770, the AVN (Audio Video Navigation) device 400, the sensor unit 760, and the like in the vehicle by a wired or wireless communication method.

The interface unit 130 can receive map information related to vehicle driving through data communication with the vehicle display device 400.

On the other hand, the interface unit 130 can receive the sensor information from the ECU 770 or the sensor unit 760.

Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, tire information, lamp information, vehicle interior temperature information, and vehicle interior humidity information.

Such sensor information may be acquired from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, and a vehicle interior humidity sensor. Meanwhile, the position module may include a GPS module for receiving GPS information.

On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.

The memory 140 may store various data for the overall operation of the vehicle driving assistance device 100a, such as programs for the processing or control of the processor 170.

An audio output unit (not shown) converts an electric signal from the processor 170 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit (not shown) can also output sound corresponding to the operation of the input unit 110, that is, the button.

An audio input unit (not shown) can receive a user's voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 170.

The processor 170 controls the overall operation of each unit in the vehicle driving assistant 100a.

In particular, the processor 170 performs signal processing based on computer vision. Accordingly, the processor 170 can obtain a stereo image of the area in front of the vehicle from the stereo camera 195, perform a disparity calculation for the area in front of the vehicle based on the stereo image, perform object detection on at least one of the stereo images based on the calculated disparity information, and, after the object detection, continuously track the motion of an object.

In particular, when detecting an object, the processor 170 can perform lane detection, vehicle detection, pedestrian detection, traffic sign detection, road surface detection, and the like.

The processor 170 may perform a distance calculation to the detected nearby vehicle, a speed calculation of the detected nearby vehicle, a speed difference calculation with the detected nearby vehicle, and the like.
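The distance and speed calculations mentioned above follow directly from successive range measurements; a minimal sketch (function names and units are illustrative, not from the specification):

```python
def relative_speed_kmh(dist_prev_m, dist_curr_m, dt_s):
    """Relative speed of a detected vehicle from two successive range
    samples; positive means it is pulling away from our vehicle."""
    return (dist_curr_m - dist_prev_m) / dt_s * 3.6  # m/s -> km/h

def nearby_vehicle_speed_kmh(own_speed_kmh, rel_speed_kmh):
    """The detected vehicle's own speed is ours plus the relative speed."""
    return own_speed_kmh + rel_speed_kmh
```

For example, a preceding vehicle whose range grows from 20 m to 22 m over one second is pulling away at 7.2 km/h relative to us.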

Meanwhile, the processor 170 can receive weather information, traffic situation information on the road, and TPEG (Transport Protocol Expert Group) information, for example, through the communication unit 120.

On the other hand, the processor 170 can grasp, in real time, the traffic situation information on the surroundings of the vehicle based on the stereo image in the vehicle driving assistant device 100a.

Meanwhile, the processor 170 can receive map information and the like from the vehicle display device 400 through the interface unit 130.

Meanwhile, the processor 170 can receive the sensor information from the ECU 770 or the sensor unit 760 through the interface unit 130. Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, tire information, lamp information, vehicle interior temperature information, and vehicle interior humidity information.

The power supply unit 190 can supply power necessary for the operation of each component under the control of the processor 170. In particular, the power supply unit 190 can receive power from a battery or the like inside the vehicle.

The stereo camera 195 may include a plurality of cameras. Hereinafter, as described with reference to FIG. 2B and the like, it is assumed that two cameras are provided.

The stereo camera 195 may be detachably attached to the ceiling or the windshield of the vehicle 200 and may include a first camera 195a having a first lens 193a and a second camera 195b having a second lens 193b.

The stereo camera 195 may include a first light shield 192a and a second light shield 192b for shielding light incident on the first lens 193a and the second lens 193b, respectively.

Next, referring to FIG. 3B, the vehicle driving assistance device 100a of FIG. 3B further includes an input unit 110, a display 180, and an audio output unit 185 in addition to the components of the vehicle driving assistance device 100a of FIG. 3A. Hereinafter, only the input unit 110, the display 180, and the audio output unit 185 will be described.

The input unit 110 may include a plurality of buttons or a touch screen attached to the vehicle driving assistance device 100a, in particular to the stereo camera 195. It is possible to turn on and operate the vehicle driving assistance device 100a through the plurality of buttons or the touch screen. In addition, various other input operations can be performed.

The display 180 may display an image related to the operation of the vehicle driving assistance device. For such image display, the display 180 may include a cluster or a HUD (Head Up Display) at the front inside the vehicle interior. When the display 180 is a HUD, it may include a projection module that projects an image onto the windshield of the vehicle 200.

The audio output unit 185 outputs sound to the outside based on an audio signal processed by the processor 170. To this end, the audio output unit 185 may include at least one speaker.

3C to 3D illustrate various examples of the internal block diagram of the autonomous vehicle of FIG.

FIGS. 3C to 3D illustrate internal block diagrams of the around view providing apparatus 100b in the autonomous driving apparatus 100.

The around view providing apparatus 100b in FIGS. 3C to 3D can combine a plurality of images received from the plurality of cameras 295a, ..., 295d to generate an around view image.

Meanwhile, the around view providing apparatus 100b may perform detection, verification, and tracking of objects located in the vicinity of the vehicle, based on the plurality of images received from the plurality of cameras 295a, ..., 295d.

Referring to FIG. 3C, the around view providing apparatus 100b of FIG. 3C includes a communication unit 220, an interface unit 230, a memory 240, a processor 270, a display 280, a power supply unit 290, and a plurality of cameras 295a, ..., 295d.

The communication unit 220 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 220 can exchange data with the mobile terminal of the vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 220 can receive, from the mobile terminal 600 or the server 500, schedule information related to the driver's scheduled time or travel position, weather information, and road traffic condition information, for example, TPEG (Transport Protocol Expert Group) information. Meanwhile, the around view providing apparatus 100b may transmit image-based real-time traffic information to the mobile terminal 600 or the server 500.

Meanwhile, when a user boards the vehicle, the user's mobile terminal 600 and the around view providing apparatus 100b can be paired with each other automatically or upon execution of an application by the user.

The interface unit 230 may receive vehicle-related data, or may transmit signals processed or generated by the processor 270 to the outside. To this end, the interface unit 230 can perform data communication with the ECU 770, the sensor unit 760, and the like in the vehicle by a wired or wireless communication method.

On the other hand, the interface unit 230 can receive the sensor information from the ECU 770 or the sensor unit 760.

Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.

On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.

The memory 240 may store various data for the overall operation of the around view providing apparatus 100b, such as programs for processing or control by the processor 270.

On the other hand, the memory 240 may store map information related to the vehicle driving.

The processor 270 controls the overall operation of each unit in the surrounding view providing apparatus 100b.

Particularly, the processor 270 can acquire a plurality of images from the plurality of cameras 295a, ..., 295d, and combine the plurality of images to generate an around view image.
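The combination of the plurality of camera images into an around view image can be illustrated with a minimal sketch. This assumes the four images have already been warped to a common top-down (bird's-eye) perspective by camera calibration, which the specification does not detail; the function name, grid layout, and placeholder value for the vehicle are hypothetical, and overlap blending between adjacent cameras is ignored.

```python
import numpy as np

def compose_around_view(front, rear, left, right):
    """Place four top-down camera views on a 3x3 grid around the vehicle.

    All four inputs are assumed to be h x w grayscale images, already
    warped to a top-down perspective by a calibrated homography.
    """
    h, w = front.shape
    canvas = np.zeros((3 * h, 3 * w), dtype=front.dtype)
    canvas[0:h, w:2 * w] = front         # region ahead of the vehicle
    canvas[2 * h:3 * h, w:2 * w] = rear  # region behind the vehicle
    canvas[h:2 * h, 0:w] = left          # region to the left
    canvas[h:2 * h, 2 * w:3 * w] = right # region to the right
    canvas[h:2 * h, w:2 * w] = 128       # vehicle placeholder in the center
    return canvas
```

A real around view monitor would additionally blend the overlapping fields of view of adjacent wide-angle cameras; the grid placement above only conveys the compositing idea.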

Meanwhile, the processor 270 may perform signal processing based on computer vision. For example, based on the plurality of images or the generated around view image, the processor 270 may perform a disparity calculation for the surroundings of the vehicle, perform object detection in the image based on the calculated disparity information, and, after the object detection, continuously track the motion of the object.

Particularly, when detecting objects, the processor 270 can perform lane detection, vehicle detection, pedestrian detection, obstacle detection, parking area detection, road surface detection, and the like.

Then, the processor 270 can perform a distance calculation on the detected nearby vehicles or pedestrians.

Meanwhile, the processor 270 can receive sensor information from the ECU 770 or the sensor unit 760 via the interface unit 230. Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.

The display 280 may display the around view image generated by the processor 270. Meanwhile, it is also possible to provide various user interfaces when displaying the around view image, and to provide a touch sensor enabling touch input on the provided user interface.

Meanwhile, the display 280 may include a cluster or a HUD (Head Up Display) inside the vehicle. When the display 280 is a HUD, it may include a projection module that projects an image onto the windshield of the vehicle 200.

The power supply unit 290 can supply power necessary for the operation of each component under the control of the processor 270. In particular, the power supply unit 290 can receive power from a battery or the like inside the vehicle.

The plurality of cameras 295a, ..., 295d are cameras for providing the around view image, and are preferably wide-angle cameras.

The around view providing apparatus 100b of FIG. 3D is similar to the around view providing apparatus 100b of FIG. 3C, but further includes an input unit 210, an audio output unit 285, and an audio input unit 286. Hereinafter, only the input unit 210, the audio output unit 285, and the audio input unit 286 will be described.

The input unit 210 may include a plurality of buttons attached to the periphery of the display 280, or a touch screen disposed on the display 280. The around view providing apparatus 100b can be powered on and operated through the plurality of buttons or the touch screen. In addition, various other input operations can be performed.

The audio output unit 285 converts an electric signal from the processor 270 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 285 can also output sound corresponding to the operation of the input unit 210, that is, the button.

The audio input unit 286 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 270.

Meanwhile, the around view providing apparatus 100b of FIG. 3C or FIG. 3D may be an AVN (Audio Video Navigation) apparatus.

FIG. 3E is an internal block diagram of the vehicle display apparatus.

The vehicle display apparatus 400 according to an embodiment of the present invention includes an input unit 310, a communication unit 320, a spatial recognition sensor unit 321, a touch sensor unit 326, an interface unit 330, a memory 340, a processor 370, a display 480, an audio input unit 383, an audio output unit 385, and a power supply unit 390.

The input unit 310 includes a button attached to the display device 400. For example, a power button may be provided. In addition, it may further include at least one of a menu button, an up / down button, and a left / right button.

The input signal through the input unit 310 may be transmitted to the processor 370.

The communication unit 320 can exchange data with an adjacent electronic device. For example, data can be exchanged with a vehicle internal electronic device or a server (not shown) in a wireless manner. Particularly, data can be exchanged wirelessly with the mobile terminal of the vehicle driver. Various wireless data communication methods such as Bluetooth, WiFi, and APiX are available.

For example, when a user boards the vehicle, the user's mobile terminal and the display apparatus 400 can be paired with each other automatically or upon execution of an application by the user.

On the other hand, the communication unit 320 may include a GPS receiving device, and can receive GPS information, that is, position information of the vehicle.

The space recognition sensor unit 321 can detect the approach or movement of the user's hand. For this purpose, it may be disposed around the display 480.

The spatial recognition sensor unit 321 may perform spatial recognition on an optical basis or on an ultrasonic basis. Hereinafter, spatial recognition on an optical basis will mainly be described.

The spatial recognition sensor unit 321 can sense the approach or movement of the user's hand based on output light that it emits and the corresponding received light. In particular, the processor 370 can perform signal processing on the electric signals of the output light and the received light.

For this purpose, the spatial recognition sensor unit 321 may include a light output unit 322 and a light receiving unit 324.

The light output unit 322 may output, for example, infrared (IR) light to detect a user's hand located in front of the display apparatus 400.

The light receiving unit 324 receives light that is scattered or reflected by the user's hand located in front of the display apparatus 400 when the light output from the light output unit 322 strikes it. Specifically, the light receiving unit 324 may include a photodiode and convert the received light into an electric signal through the photodiode. The converted electric signal may be input to the processor 370.

The touch sensor unit 326 senses a floating touch and a direct touch. For this purpose, the touch sensor unit 326 may include an electrode array, an MCU, and the like. When the touch sensor unit is operated, an electric signal is supplied to the electrode array, and an electric field is formed on the electrode array.

The touch sensor unit 326 can operate when the intensity of light received by the spatial recognition sensor unit 321 is equal to or higher than the first level.
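The optical proximity sensing and the first-level condition above can be illustrated with a small sketch. The inverse-square intensity model and all calibration constants are illustrative assumptions, not values from the specification; only the rule that the touch sensor unit operates at or above a first intensity level comes from the text.

```python
import math

REF_INTENSITY = 1.0   # assumed intensity measured at REF_DISTANCE (calibration)
REF_DISTANCE = 10.0   # assumed reference distance, in cm

def estimate_distance(received_intensity):
    """Estimate hand distance, assuming intensity falls off as 1/distance^2."""
    if received_intensity <= 0:
        return float("inf")  # nothing reflected: hand out of range
    return REF_DISTANCE * math.sqrt(REF_INTENSITY / received_intensity)

def touch_sensor_should_operate(received_intensity, first_level=0.25):
    """Per the description above, the touch sensor unit operates when the
    received light intensity is equal to or higher than a first level."""
    return received_intensity >= first_level
```

In practice the processor would run this kind of check continuously on the photodiode signal and only power the electrode array when the hand is close enough to be sensed capacitively.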

That is, when the user's hand approaches within a predetermined distance, an electric signal may be supplied to the electrode array in the touch sensor unit 326. An electric field is formed on the electrode array by the supplied electric signal, and a change in capacitance is sensed using the electric field. Based on the sensed capacitance change, the touch sensor unit senses a floating touch and a direct touch.

In particular, the z-axis information can be sensed by the touch sensor unit 326 in addition to the x- and y-axis information according to the approach of the user's hand.

The interface unit 330 can exchange data with other electronic devices in the vehicle. For example, the interface unit 330 can perform data communication with an ECU or the like in the vehicle by a wired communication method.

Specifically, the interface unit 330 can receive the vehicle status information by data communication with an ECU or the like in the vehicle.

Here, the vehicle status information may include at least one of battery information, fuel information, vehicle speed information, tire information, steering information based on steering wheel rotation, vehicle lamp information, vehicle interior temperature information, and vehicle exterior temperature information.

The interface unit 330 may further receive GPS information from an ECU or the like in the vehicle. Alternatively, it is also possible to transmit GPS information, which is received by the display device 400, to an ECU or the like.

The memory 340 may store various data for the operation of the display apparatus 400, such as programs for processing or control by the processor 370.

For example, the memory 340 may store a map for guiding the traveling path of the vehicle.

As another example, the memory 340 may store user information and the user's mobile terminal information for pairing with the user's mobile terminal.

The audio output unit 385 converts an electric signal from the processor 370 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 385 can also output sound corresponding to the operation of the input unit 310, that is, the button.

The audio input unit 383 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 370.

The processor 370 controls the overall operation of each unit in the vehicle display apparatus 400.

When the user's hand approaches the display apparatus 400, the processor 370 can sequentially compute x-, y-, and z-axis information for the user's hand based on the light received by the light receiving unit 324. At this time, the z-axis information decreases sequentially as the hand approaches.

Meanwhile, when the user's hand approaches within a second distance, closer to the display 480 than the first distance, the processor 370 can control the touch sensor unit 326 to operate. That is, the processor 370 can control the touch sensor unit 326 to operate when the intensity of the electric signal from the spatial recognition sensor unit 321 is equal to or higher than a reference level. Thereby, an electric signal is supplied to each electrode array in the touch sensor unit 326.

On the other hand, the processor 370 can sense the floating touch based on the sensing signal sensed by the touch sensor unit 326 when the user's hand is located within the second distance. In particular, the sensing signal may be a signal indicative of a change in capacitance.

Based on the sensing signal, the processor 370 can compute the x- and y-axis information of the floating touch input, and can compute the z-axis information based on the magnitude of the capacitance change.

On the other hand, the processor 370 can change the grouping for the electrode array in the touch sensor unit 326 according to the distance of the user's hand.

Specifically, the processor 370 can change the grouping of the electrode array in the touch sensor unit 326 based on the approximate z-axis information computed from the light received by the spatial recognition sensor unit 321. The larger the distance, the larger the size of the electrode array group can be set.

That is, the processor 370 can vary the size of the touch sensing cell with respect to the electrode array in the touch sensor unit 326 based on the distance information of the user's hand, that is, the z-axis information.
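The variation of the touch sensing cell size with the z-axis information can be sketched as follows. The distance bands and group sizes are illustrative assumptions; the specification only states that the farther the hand, the larger the electrode group.

```python
def touch_cell_group_size(z_cm):
    """Return the edge length (in electrodes) of one sensing cell group,
    or None when the hand is beyond the sensing range.

    The farther the hand, the coarser the grouping: a large cell yields a
    strong, if imprecise, capacitance signal; near the display the
    electrodes are read individually for precise x/y localization.
    """
    if z_cm > 20:      # assumed sensing limit: touch sensor not operated
        return None
    if z_cm > 10:      # far: group 3x3 electrodes into one sensing cell
        return 3
    if z_cm > 5:       # mid-range: group 2x2 electrodes
        return 2
    return 1           # near: per-electrode sensing (floating/direct touch)
```

The processor would re-evaluate this grouping as the z-axis information from the spatial recognition sensor unit changes, refining the cell size as the hand approaches.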

The display 480 may separately display an image corresponding to the function set for the button. For such image display, the display 480 may be implemented as a variety of display modules such as an LCD, an OLED, and the like. On the other hand, the display 480 may be implemented as a cluster on the inside of the vehicle interior.

The power supply unit 390 can supply power necessary for the operation of each component under the control of the processor 370.

Figures 4A-4B illustrate various examples of internal block diagrams of the processors of Figures 3A-3D, and Figure 5 is a diagram illustrating object detection in the processors of Figures 4A-4B.

FIG. 4A shows an example of an internal block diagram of the processor 170 of the vehicle driving assistance apparatus 100a of FIGS. 3A to 3B, or of the processor 270 of the around view providing apparatus 100b of FIGS. 3C to 3D.

The processor 170 or 270 may include an image preprocessing unit 410, a disparity computing unit 420, an object detecting unit 434, an object tracking unit 440, and an application unit 450.

The image preprocessing unit 410 may receive the plurality of images from the plurality of cameras 295a, ..., 295d, or the generated around view image, and perform preprocessing.

Specifically, the image preprocessing unit 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera gain control, and the like on the plurality of images or the generated around view image. Thus, it is possible to acquire images sharper than the images photographed by the plurality of cameras 295a, ..., 295d or than the generated around view image.
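Two of the preprocessing steps named above, noise reduction and camera gain control, can be sketched minimally as follows. The 3x3 mean filter and the target mean level are illustrative choices; production pipelines use tuned filters and per-camera calibration data.

```python
import numpy as np

def denoise_mean3(img):
    """Noise reduction: 3x3 box (mean) filter via summed shifts.

    Edges are handled by replicating the border pixels.
    """
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    acc = np.zeros(img.shape, dtype=np.float64)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            acc += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return acc / 9.0

def gain_control(img, target_mean=128.0):
    """Camera gain control: scale intensities toward a target mean level."""
    m = img.mean()
    if m == 0:
        return img.astype(np.float64)
    return np.clip(img * (target_mean / m), 0, 255)
```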

The disparity calculating unit 420 receives the plurality of images or the generated around view image signal-processed by the image preprocessing unit 410, performs stereo matching on the received images, and obtains a disparity map according to the stereo matching. That is, disparity information about the surroundings of the vehicle can be obtained.

At this time, the stereo matching may be performed on a pixel-by-pixel basis or a predetermined block basis. On the other hand, the disparity map may mean a map in which numerical values of binocular parallax information of images, i.e., left and right images, are displayed.
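The block-basis stereo matching described above can be sketched as a minimal sum-of-absolute-differences (SAD) matcher: for each pixel of the left image, a small block is compared against horizontally shifted blocks of the right image, and the shift with the lowest cost becomes the disparity. The block size and search range are illustrative assumptions; real systems use optimized or semi-global matchers.

```python
import numpy as np

def disparity_map(left, right, block=3, max_disp=8):
    """Per-pixel disparity by SAD block matching (left as reference)."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1,
                         x - half:x + half + 1].astype(np.int32)
            # Cost of each candidate shift d: SAD against the right image
            costs = [
                np.abs(patch - right[y - half:y + half + 1,
                                     x - d - half:x - d + half + 1]
                       .astype(np.int32)).sum()
                for d in range(max_disp)
            ]
            disp[y, x] = int(np.argmin(costs))  # best-matching shift
    return disp
```

Pixels on nearby objects shift more between the two views, so they receive larger disparity values, which is exactly the depth cue the later units consume.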

The segmentation unit 432 may perform segmentation and clustering in the image based on the disparity information from the disparity calculating unit 420.

Specifically, the segmentation unit 432 can separate the background and the foreground for at least one of the images based on the disparity information.

For example, an area of the disparity map in which the disparity information is equal to or less than a predetermined value can be computed as the background, and the corresponding part can be excluded. Thereby, the foreground can be relatively separated.

As another example, an area of the disparity map in which the disparity information is equal to or greater than a predetermined value can be computed as the foreground, and the corresponding part can be extracted. Thereby, the foreground can be separated.

Thus, by separating the foreground and the background based on the disparity information extracted from the image, it is possible to shorten the signal processing time and reduce the amount of signal processing in the subsequent object detection.
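The foreground/background separation described above reduces to a threshold on the disparity map: small disparities are far away (background) and are excluded, while large disparities are near the vehicle (foreground). A minimal sketch, with the threshold value as an illustrative assumption:

```python
import numpy as np

def separate_foreground(disparity, threshold=4):
    """Split a disparity map into foreground and background masks.

    Small disparity  -> far from the vehicle -> background (excluded).
    Large disparity  -> near the vehicle     -> foreground (kept for
    subsequent object detection).
    """
    background = disparity <= threshold
    foreground = ~background
    return foreground, background
```

Object detection then runs only on pixels where the foreground mask is true, which is the source of the processing savings mentioned above.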

Next, the object detecting unit 434 can detect objects based on the image segment from the segmentation unit 432.

That is, the object detecting unit 434 can detect an object for at least one of the images based on the disparity information.

More specifically, the object detecting unit 434 can detect an object for at least one of the images. For example, an object can be detected from a foreground separated by an image segment.

Next, the object verification unit 436 classifies and verifies the separated objects.

For this purpose, the object verification unit 436 may use a neural network identification method, a Support Vector Machine (SVM) method, an AdaBoost identification method using Haar-like features, or a Histograms of Oriented Gradients (HOG) technique.

Meanwhile, the object verification unit 436 can verify the detected objects by comparing them with objects stored in the memory 240.

For example, the object verification unit 436 can verify nearby vehicles, lanes, roads, signs, hazardous areas, tunnels, and the like located around the vehicle.

The object tracking unit 440 performs tracking on the verified objects. For example, it is possible to sequentially verify the objects in the acquired images, compute the motion or motion vector of each verified object, and track the movement of the object based on the computed motion or motion vector. Accordingly, it is possible to track nearby vehicles, lanes, roads, signs, hazardous areas, and the like located in the vicinity of the vehicle.
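The HOG technique mentioned for object verification can be sketched minimally as follows: gradient magnitudes are accumulated into orientation histograms over small cells, and the concatenated histograms form a descriptor that a trained SVM or AdaBoost classifier would score. The cell size and bin count are illustrative assumptions, and no trained classifier is included here.

```python
import numpy as np

def hog_features(img, cell=8, bins=9):
    """Compute a simple HOG descriptor for a grayscale image.

    Gradients are taken with finite differences; orientations are
    unsigned (0-180 degrees) and binned per cell, weighted by magnitude.
    """
    gy, gx = np.gradient(img.astype(np.float64))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # unsigned orientation
    ch, cw = img.shape[0] // cell, img.shape[1] // cell
    feats = np.zeros((ch, cw, bins))
    for i in range(ch):
        for j in range(cw):
            m = mag[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell].ravel()
            a = ang[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell].ravel()
            idx = (a / (180.0 / bins)).astype(int) % bins
            for k in range(bins):
                feats[i, j, k] = m[idx == k].sum()
    # Flattened descriptor, to be fed to an SVM/AdaBoost classifier
    return feats.ravel()
```

A pedestrian or vehicle candidate window cut from the foreground would be resized to a fixed shape, converted to this descriptor, and scored by the classifier to verify the object class.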

FIG. 4B is another example of an internal block diagram of the processor.

Referring to FIG. 4B, the processor 170 or 270 of FIG. 4B has the same internal configuration units as the processor 170 or 270 of FIG. 4A, but differs in the signal processing order. Only the differences will be described below.

The object detecting unit 434 may receive the plurality of images or the generated around view image, and may detect objects in them. Unlike in FIG. 4A, the objects may be detected directly from the plurality of images or the generated around view image, rather than from a segmented image based on disparity information.

Next, the object verification unit 436 classifies and verifies the detected and separated objects, based on the image segment from the segmentation unit 432 and the objects detected by the object detecting unit 434.

For this purpose, the object verification unit 436 may use a neural network identification method, a Support Vector Machine (SVM) method, an AdaBoost identification method using Haar-like features, or a Histograms of Oriented Gradients (HOG) technique.

FIG. 5 is a diagram referred to for explaining the operation method of the processor 170 or 270 of FIGS. 4A to 4B, based on images obtained respectively in the first and second frame periods.

Referring to FIG. 5, during the first and second frame periods, the plurality of cameras 295a, ..., and 295d sequentially acquire images FR1a and FR1b, respectively.

The disparity calculating unit 420 in the processor 170 or 270 receives the images FR1a and FR1b signal-processed by the image preprocessing unit 410, performs stereo matching on the received images FR1a and FR1b, and obtains a disparity map 520.

The disparity map 520 expresses the parallax between the images FR1a and FR1b as levels. The higher the disparity level, the closer the distance from the vehicle can be calculated to be; the lower the disparity level, the farther the distance.

On the other hand, when such a disparity map is displayed, it may be displayed so as to have a higher luminance as the disparity level becomes larger, and a lower luminance as the disparity level becomes smaller.
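The relationship between disparity level and distance described above follows from stereo geometry: with a stereo rig, depth is inversely proportional to disparity (Z = f * B / d, with focal length f in pixels and baseline B between the two lenses), so a higher disparity level means a closer object, and can be rendered brighter. The focal length and baseline values below are illustrative assumptions, not values from the specification.

```python
FOCAL_PX = 700.0    # assumed focal length of each camera, in pixels
BASELINE_M = 0.3    # assumed baseline between the two lenses, in meters

def distance_from_disparity(d_px):
    """Depth from stereo disparity: Z = f * B / d."""
    if d_px <= 0:
        return float("inf")   # zero disparity: effectively at infinity
    return FOCAL_PX * BASELINE_M / d_px

def disparity_to_luminance(d_px, max_disp=64):
    """Display convention above: larger disparity (closer) -> brighter."""
    return int(255 * min(d_px, max_disp) / max_disp)
```

This is the computation behind the distance information overlaid on the detected vehicles and construction area in FIG. 6B.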

In the figure, the first to fourth lanes 528a, 528b, 528c, and 528d, the construction area 522, the first front vehicle 524, and the second front vehicle 526 each have corresponding disparity levels in the disparity map 520.

The segmentation unit 432, the object detecting unit 434, and the object verification unit 436 perform segmentation, object detection, and object verification on at least one of the images FR1a and FR1b, based on the disparity map 520.

In the figure, object detection and verification for the second image FR1b are performed using the disparity map 520.

That is, object detection and verification may be performed on the first to fourth lanes 538a, 538b, 538c, and 538d, the construction area 532, the first front vehicle 534, and the second front vehicle 536 in the image 530.

On the other hand, by continuously acquiring the image, the object tracking unit 440 can perform tracking on the identified object.

FIGS. 6A to 6B are views referred to in the description of the operation of the autonomous driving apparatus.

First, FIG. 6A is a diagram illustrating the situation in front of the vehicle photographed by the stereo camera 195 provided in the vehicle. In particular, the situation in front of the vehicle is indicated as a bird's-eye view.

Referring to the drawing, it can be seen that a first lane 642a, a second lane 644a, a third lane 646a, and a fourth lane 648a are located from left to right; a construction area 610a is located between the first lane 642a and the second lane 644a; a first front vehicle 620a is located between the second lane 644a and the third lane 646a; and a second front vehicle 630a is located between the third lane 646a and the fourth lane 648a.

Next, FIG. 6B illustrates the display of the situation in front of the vehicle, as identified by the vehicle driving assistance apparatus, together with various information. In particular, the image as shown in FIG. 6B may be displayed on the display 180 provided in the vehicle driving assistance apparatus or on the vehicle display apparatus 400.

FIG. 6B differs from FIG. 6A in that information is displayed on the basis of the image photographed by the stereo camera 195.

It can be seen that a first lane 642b, a second lane 644b, a third lane 646b, and a fourth lane 648b are located from left to right; a construction area 610b is located between the first lane 642b and the second lane 644b; a first front vehicle 620b is located between the second lane 644b and the third lane 646b; and a second front vehicle 630b is located between the third lane 646b and the fourth lane 648b.

The vehicle driving assistance apparatus 100a performs signal processing on the basis of the stereo image photographed by the stereo camera 195, and can verify objects for the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b. In addition, the first lane 642b, the second lane 644b, the third lane 646b, and the fourth lane 648b can be verified.

On the other hand, in the drawing, it is exemplified that each of them is highlighted by a frame to indicate object identification for the construction area 610b, the first forward vehicle 620b, and the second forward vehicle 630b.

Meanwhile, the vehicle driving assistance apparatus 100a can compute distance information to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b, based on the stereo image photographed by the stereo camera 195.

In the figure, the computed first distance information 611b, second distance information 621b, and third distance information 631b, corresponding respectively to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b, are displayed.

Meanwhile, the vehicle driving assistance apparatus 100a can receive sensor information about the vehicle from the ECU 770 or the sensor unit 760. In particular, it is possible to receive and display vehicle speed information, gear information, yaw rate information indicating the speed at which the rotation angle (yaw angle) of the vehicle changes, and angle information of the vehicle.

The figure illustrates that the vehicle speed information 672, the gear information 671, and the yaw rate information 673 are displayed in the upper portion 670 of the vehicle front image, and the angle information 682 of the vehicle is displayed in the lower portion 680 of the vehicle front image, but various examples are possible. In addition, the width information 683 of the vehicle and the curvature information 681 of the road can be displayed together with the angle information 682 of the vehicle.

Meanwhile, the vehicle driving assistance apparatus 100a can receive speed limit information and the like for the road on which the vehicle is traveling, through the communication unit 120 or the interface unit 130. In the figure, it is exemplified that the speed limit information 640b is displayed.

The vehicle driving assistance apparatus 100a may display the various pieces of information shown in FIG. 6B through the display 180 or the like, or may store them without separate display. Such information can then be utilized for various applications.

FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.

Referring to the drawings, the vehicle 200 may include an electronic control device 700 for vehicle control.

The electronic control apparatus 700 includes an input unit 710, a communication unit 720, a memory 740, a lamp driving unit 751, a steering driving unit 752, a brake driving unit 753, a power source driving unit 754, a sunroof driving unit 755, a suspension driving unit 756, an air conditioner driving unit 757, a window driving unit 758, an airbag driving unit 759, a seat driving unit 761, a sensor unit 760, an ECU 770, a display 780, an audio output unit 785, an audio input unit 786, a power supply unit 790, a stereo camera 195, a plurality of cameras 295, a radar 797, an internal camera 708, and the like.

Meanwhile, the ECU 770 may be a concept including the processor 270 described in FIG. 3C or FIG. 3D. Alternatively, in addition to the ECU 770, a separate processor for signal processing of images from the camera may be provided.

The input unit 710 may include a plurality of buttons or a touch screen disposed inside the vehicle 200. Through a plurality of buttons or a touch screen, it is possible to perform various input operations.

The communication unit 720 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 720 can exchange data with the mobile terminal of the vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 720 can receive, from the mobile terminal 600 or the server 500, schedule information related to the driver's scheduled time or travel position, weather information, and road traffic condition information, for example, TPEG (Transport Protocol Expert Group) information.

Meanwhile, when a user boards the vehicle, the user's mobile terminal 600 and the electronic control apparatus 700 can be paired with each other automatically or upon execution of an application by the user.

The memory 740 may store various data for the operation of the electronic control apparatus 700, such as programs for processing or control by the ECU 770.

On the other hand, the memory 740 may store map information related to the vehicle driving.

The lamp driving unit 751 can control the turn-on / turn-off of the lamps disposed inside and outside the vehicle. Also, the intensity, direction, etc. of the light of the lamp can be controlled. For example, it is possible to perform control on a direction indicating lamp, a brake lamp, and the like.

The steering driving unit 752 may perform electronic control of a steering apparatus (not shown) in the vehicle 200. Thus, the traveling direction of the vehicle can be changed.

The brake driving unit 753 can perform electronic control of a brake apparatus (not shown) in the vehicle 200. For example, the speed of the vehicle 200 can be reduced by controlling the operation of the brakes disposed on the wheels. As another example, the traveling direction of the vehicle 200 can be adjusted to the left or right by operating the brakes disposed on the left and right wheels differently.

The power source driving unit 754 can perform electronic control of the power source in the vehicle 200.

For example, when a fossil fuel-based engine (not shown) is a power source, the power source drive unit 754 can perform electronic control of the engine. Thus, the output torque of the engine and the like can be controlled.

As another example, when the electric motor (not shown) is a power source, the power source driving unit 754 can perform control on the motor. Thus, the rotation speed, torque, etc. of the motor can be controlled.

The sunroof driving unit 755 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 200. For example, the opening or closing of the sunroof can be controlled.

The suspension driving unit 756 may perform electronic control of a suspension apparatus (not shown) in the vehicle 200. For example, when there are bumps on the road surface, the suspension apparatus can be controlled to reduce the vibration of the vehicle 200.

The air conditioning driving unit 757 can perform electronic control of an air conditioner (not shown) in the vehicle 200. For example, when the temperature inside the vehicle is high, it can operate the air conditioner so that cool air is supplied into the vehicle.

The window driving unit 758 can perform electronic control of a window apparatus (not shown) in the vehicle 200. For example, it can control the opening or closing of the left and right windows on the sides of the vehicle.

The airbag driver 759 may perform electronic control of the airbag apparatus in the vehicle 200. For example, in a dangerous situation, the airbag can be controlled to deploy.

The seat driving unit 761 can perform position control of the seat or the backrest of the vehicle 200. For example, when the driver is seated in the driver's seat, it can adjust the seat to fit the driver, such as adjusting the front/rear spacing of the seat and the angle of the backrest.

On the other hand, the seat driving unit 761 can drive rollers disposed in the seat or the backrest, and can control them to provide pressure, such as a massage, to the driver.

The sensor unit 760 senses signals related to the running of the vehicle 200 and the like. To this end, the sensor unit 760 may include a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, and the like.

Thereby, the sensor unit 760 can acquire sensing signals for vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, and the like.

In addition, the sensor unit 760 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The ECU 770 can control the overall operation of each unit in the electronic control unit 700.

The ECU 770 can perform a specific operation in response to an input from the input unit 710, receive the sensed signal from the sensor unit 760 and transmit it to the around view providing apparatus 100b, receive map information from the memory 740, and control the operation of the respective driving units 751, 752, 753, 754, and 756.

Also, the ECU 770 can receive weather information and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information, from the communication unit 720.

On the other hand, the ECU 770 can combine a plurality of images received from the plurality of cameras 295 to generate an around view image. In particular, the around view image can be generated when the vehicle speed is below a predetermined speed or when the vehicle is moving backward.

The display 780 can display an image of the area in front of the vehicle or an around view image while the vehicle is running. In particular, it can also provide various user interfaces in addition to the around view image.

For the display of the around view image and the like, the display 780 may include a cluster or a HUD (Head Up Display) at the front inside the vehicle. When the display 780 is a HUD, it may include a projection module that projects an image onto the windshield of the vehicle 200. The display 780 may also include a touch screen capable of receiving input.

The audio output unit 785 converts the electrical signal from the ECU 770 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 785 can also output a sound corresponding to the operation of the input unit 710, that is, the button.

The audio input unit 786 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the ECU 770.

The power supply unit 790 can supply power necessary for the operation of each component under the control of the ECU 770. In particular, the power supply unit 790 can receive power from a battery (not shown) inside the vehicle.

The stereo camera 195 is used for the operation of a vehicle driving assistance apparatus, as described above.

A plurality of cameras 295 are used to provide the around view image; for this purpose, four cameras may be provided, as shown in FIG. 2C. For example, the plurality of cameras 295a, 295b, 295c, and 295d may be disposed on the left, rear, right, and front of the vehicle, respectively. The plurality of images photographed by the plurality of cameras 295 may be transmitted to the ECU 770 or a separate processor (not shown).

The internal camera 708 captures an image of the interior of the vehicle, including the driver. For example, an RGB camera, an IR camera with thermal sensing, and the like can be used.

The driver detection sensor 799 detects the driver's body information. For example, the driver's blood pressure information, sleep brain waves, and the like can be detected.

The radar 797 transmits a transmission signal and receives a reception signal reflected from an object near the vehicle. Based on the difference between the transmission signal and the reception signal, distance information is output. In addition, phase information may be output.
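The distance computation described above follows from the round-trip time of the radar signal. A minimal sketch (not the patent's implementation; the function name is an assumption for illustration):

```python
# Estimating the distance to a nearby object from the radar's
# transmit/receive time difference: distance = c * dt / 2,
# since the signal travels out to the object and back.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def radar_distance_m(round_trip_time_s: float) -> float:
    """Return the one-way distance for a measured round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A 1-microsecond round trip corresponds to roughly 150 m.
print(round(radar_distance_m(1e-6), 1))
```

In practice, automotive radars derive the time difference (or a frequency shift, for FMCW radar) in hardware; the division by two is the essential step shown here.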

FIG. 8 is a flowchart showing an operation method of an autonomous driving apparatus according to an embodiment of the present invention, and FIGS. 9A to 15 are diagrams referred to in the description of the operation method of FIG. 8.

Referring to the drawings, the vehicle 200 can enter the autonomous operation mode in accordance with the input of the driver or automatically (S910).

For example, when the autonomous operation mode button provided in the vehicle 200 is operated, the processor 170 or 770 can control to enter the autonomous operation mode.

As another example, when the driver utters the words 'autonomous operation mode', the processor 170 or 770 can perform voice recognition through a voice recognition algorithm and control to enter the autonomous operation mode.

As another example, when the driver selects the 'autonomous operation mode' item through the display device 400 in the vehicle 200, the processor 170 or 770 can control to enter the autonomous operation mode.

Upon entering the autonomous operation mode, the processor 170 or 770 can control the steering driver 752, the brake driver 753, and the power source driver 754, based on the images from the cameras 195 and 295 and the distance information from the radar 797 to objects around the vehicle.

Specifically, the processor 170 or 770 generates a disparity map for the area in front of the vehicle based on the stereo image from the stereo camera 195, and on that basis can perform object detection and verification for the area in front of the vehicle and calculate the distance to each object.

In addition, the processor 170 or 770 can acquire distance information to objects around the vehicle from the radar 797, which is capable of omnidirectional signal output.

Then, based on the object detection, verification, and distance calculation for the area in front of the vehicle from the stereo camera 195, and the distance information to objects around the vehicle from the radar 797, the processor 170 or 770 can control the power source driver 754 for driving speed control, control the brake driver 753 to maintain a certain distance from the preceding vehicle, or control the steering driver 752 for a lane change.
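The following-distance behavior above can be sketched as a simple rule: if the measured gap to the preceding vehicle is below a target, actuate the brake driver; if above it, actuate the power source driver. This is an illustrative sketch, not the patent's control law; the function name, target gap, and tolerance are assumptions.

```python
# Hypothetical longitudinal control decision: given the gap to the
# preceding vehicle (from the stereo camera or radar), choose which
# actuator to drive to hold roughly target_gap_m.
def longitudinal_command(gap_m: float, target_gap_m: float = 30.0,
                         tolerance_m: float = 2.0) -> str:
    if gap_m < target_gap_m - tolerance_m:
        return "brake"        # too close: brake driver 753
    if gap_m > target_gap_m + tolerance_m:
        return "accelerate"   # too far: power source driver 754
    return "hold"             # within tolerance: maintain speed

print(longitudinal_command(20.0))  # closer than target -> "brake"
```

A production controller would use a continuous law (e.g. time-gap-based PID or MPC) rather than a three-way rule, but the actuator mapping is the same.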

Next, the vehicle 200 can perform the autonomous running operation to the first route toward the predetermined destination (S920).

The processor 170 or 770 can receive the destination information in the autonomous running mode and control the autonomous operation to the first route among the plurality of routes to the destination based on the received destination information.

Here, receiving destination information can be implemented in various ways.

For example, when the driver's voice including the 'destination information' is input through the audio input unit 786, the processor 770 can recognize the driver's voice and extract the destination information based on the recognized voice.

As another example, when the driver's schedule information is received from the driver's mobile terminal 600 through the communication unit 730, the processor 770 can extract the destination information based on the schedule information.

Then, the processor 170 or 770 can select the first route among a plurality of routes based on the destination information, and control the autonomous operation to be performed according to the first route.

Here, the first route may be any one of a route requiring the shortest time, a route optimal for high-speed travel, and a route incurring the lowest cost.
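The route selection described above can be sketched as choosing, among candidate routes, the one that minimizes the selected criterion. All names and values below are illustrative assumptions, not data from the disclosure:

```python
# Choosing the first route among candidate routes to the destination
# by a selectable criterion (shortest time or lowest cost).
def select_first_route(routes, criterion="shortest_time"):
    key = {
        "shortest_time": lambda r: r["time_min"],
        "lowest_cost": lambda r: r["cost"],
    }[criterion]
    return min(routes, key=key)

routes = [
    {"name": "route 1", "time_min": 40, "cost": 7.0},
    {"name": "route 2", "time_min": 55, "cost": 4.0},
    {"name": "route 3", "time_min": 50, "cost": 5.5},
]
print(select_first_route(routes)["name"])                 # route 1
print(select_first_route(routes, "lowest_cost")["name"])  # route 2
```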

Next, the processor 170 or 770 determines whether the driver is in the sleep state while the vehicle travels to the destination along the first route in the autonomous driving mode (S925).

When the driver is in the sleep state, the processor 170 or 770 receives the driver sleep state information and the running route state information (S930), varies the route to the destination based on at least one of the driver sleep state information and the running route state information (S933), and controls to perform autonomous travel through the varied route (S950).

On the other hand, if the driver is not in the sleep state in step S925, the processor 170 or 770 determines whether destination variable information is received (S940).

When the destination variable information is received, the processor 170 or 770 varies the route toward the changed destination based on the destination variable information (S943). Then, it controls to perform autonomous travel through the varied route (S950).

On the other hand, unlike the figure, when the destination variable information is received while the driver is in the sleep state, the processor 170 or 770 may vary the route toward the changed destination based on the destination variable information, and control to perform autonomous travel through the varied route.

The processor 170 or 770 can determine whether the driver is sleeping, and the driver's sleep state information, based on the driver's body information from the driver detection sensor 799 and the image from the internal camera 708.

FIG. 9A shows an example of the interior of the vehicle. As examples of the internal camera 708, an RGB camera 1500 and an IR camera 1501 with thermal sensing are illustrated in the vehicle.

The processor 170 or 770 can grasp the driver's motion, eye state, and the like based on the RGB image from the RGB camera 1500.

The processor 170 or 770 can grasp the driver's thermal condition based on the IR image from the IR camera 1501.

On the other hand, FIG. 9A illustrates that a microphone 1502 for audio input is disposed on the dashboard, but it may alternatively be disposed on the steering wheel 150.

The processor 170 or 770 can process the driver's voice signal input through the microphone 1502 to perform voice recognition or to determine the driver's state.

On the other hand, FIG. 9A illustrates a body signal detection sensor 1504 as an example of the driver detection sensor 799 in the backrest 1503 of the driver's seat.

The processor 170 or 770 can determine blood pressure information, sleep brain waves, body rhythm waveforms, and the like through the body signal detection sensor 1504.

The processor 170 or 770 can then determine whether the driver is sleeping, and the driver's sleep state information, based on the RGB image from the RGB camera 1500, the IR image from the IR camera 1501, the audio signal from the microphone 1502, and the body signal from the body signal detection sensor 1504.
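The multi-sensor decision above can be sketched as fusing one cue per sensor into a sleep score. All thresholds, weights, and names below are illustrative assumptions, not values from the disclosure:

```python
# Hedged sketch: combining the RGB cue (eye closure), body-signal cue
# (heart rate), audio cue (snoring), and motion cue into a simple
# three-way sleep decision: awake / shallow_sleep / deep_sleep.
def estimate_sleep_state(eye_closure_ratio, heart_rate_bpm,
                         snoring_detected, motion_level):
    score = 0.0
    score += 0.4 if eye_closure_ratio > 0.8 else 0.0   # eyes mostly closed
    score += 0.2 if heart_rate_bpm < 60 else 0.0       # resting heart rate
    score += 0.2 if snoring_detected else 0.0          # audio cue
    score += 0.2 if motion_level < 0.1 else 0.0        # little movement
    if score >= 0.6:
        return "deep_sleep" if score >= 0.8 else "shallow_sleep"
    return "awake"

print(estimate_sleep_state(0.9, 65, False, 0.05))  # -> "shallow_sleep"
```

A real system would use trained classifiers on each signal stream rather than fixed thresholds, but the fusion-then-threshold structure is the same idea.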

When it is determined that the driver is sleeping, the processor 170 or 770 enters the sleep mode, varies the route to the destination based on at least one of the driver sleep state information and the running route state information, and can control autonomous travel through the varied route.

On the other hand, FIG. 9A illustrates that the driver outputs a voice 1010 such as 'The destination is X; from here, drive in autonomous mode.'

Accordingly, the processor 170 or 770 can recognize the driver's voice 1010, extract the destination information X and the autonomous mode information from it, and control autonomous operation toward the destination X to be performed.

On the other hand, as shown in FIG. 9B, the processor 170 or 770 can control an autonomous driving mode notification message 1020 and destination information 1022 to be output to the display 480 of the display device 400, for feedback to the driver.

On the other hand, unlike FIG. 9B, it is also possible to output the autonomous driving mode notification message 1020 and the destination information 1022 as sound.

The processor 170 or 770 calculates the route based on the destination information X in the autonomous driving mode, and can control a message 1024 indicating that the route is being calculated and the calculated route information 1026 to be output to the display 480 of the display device 400. In the figure, route 1, which is the shortest route, is exemplified as the calculated route.

FIG. 9D illustrates a case where the driver 2001 sleeps in the driver's seat during autonomous travel with the destination set to X.

As described with reference to FIG. 9A, the processor 170 or 770 can determine whether the driver is sleeping, and the driver's sleep state information, based on the image from the internal camera 708 and the driver's body information from the driver detection sensor 799.

Next, as shown in FIG. 9E, the processor 170 or 770 can control a driver sleep detection message 1030 and a sleep mode entry message 1032 to be output to the display 480 of the display device 400.

The processor 170 or 770 can then monitor the driver's sleep state based on the driver's body information from the driver detection sensor 799 and the image from the internal camera 708.

In particular, as shown in FIG. 9F, the processor 170 or 770 can control a message 1038 to be output to the display 480 of the display device 400, indicating that the driver's movement, body signals, and sound are being monitored to track the driver's sleep state.

On the other hand, based on the driver sleep state information, the processor 170 or 770 can perform at least one of lane change pattern control, acceleration/deceleration pattern control, vibration control, noise cancellation, external light blocking, internal lighting control, temperature control, and driving state display.

For example, when the driver's sleep is shallow, lane changes can be made smoothly, acceleration/deceleration can be made gentle, and vibration can be suppressed. Further, in order to block external light, the light transmittance of the windshield, windows, and the like can be controlled to be considerably low. Audio signal processing for external noise cancellation can also be performed. Further, the luminance of the internal lighting can be controlled to be significantly lowered, and the air conditioning driving unit 757 can be controlled to regulate the interior temperature. In addition, the display brightness of the display device 400 or the volume of the output sound may be adjusted.

On the other hand, the processor 170 or 770 can calculate the driver's expected wake-up time based on the driver's sleep state information, vary the route to the destination based on the calculated expected wake-up time, and control autonomous travel through the varied route. Thus, a route change customized to the driver's sleep becomes possible, increasing convenience.

Alternatively, the processor 170 or 770 may calculate the driver's expected wake-up time based on the driver's sleep state information, and vary the vehicle traveling speed toward the destination based on the calculated expected wake-up time. Accordingly, a travel speed change customized to the driver's sleep becomes possible, increasing convenience.
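The speed variation above amounts to picking a target speed so that the remaining travel time roughly matches the time until the driver's expected wake-up. A minimal sketch, with the function name and the clamping range assumed for illustration:

```python
# Varying the vehicle's target speed so that arrival roughly coincides
# with the driver's expected wake-up time, clamped to a plausible
# legal/comfort range.
def target_speed_kmh(remaining_km, minutes_until_wake,
                     min_kmh=40.0, max_kmh=110.0):
    hours = minutes_until_wake / 60.0
    ideal = remaining_km / hours  # speed that matches the wake-up time
    return max(min_kmh, min(max_kmh, ideal))

# 60 km left, driver expected to wake in 45 minutes -> 80 km/h.
print(target_speed_kmh(60.0, 45.0))
```

The clamp matters: if the driver will wake very soon, the ideal speed can exceed the speed limit, so the route variation described earlier (a longer route) is the complementary lever.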

FIG. 10A illustrates the driver sleeping in a shallow sleep state.

FIG. 10B illustrates that the vehicle running speed is varied when the driver is in a shallow sleep state.

Specifically, as shown in FIG. 10B, the processor 170 or 770 can control the driver sleep state information 1120, the driver's expected wake-up time information 1122, and the vehicle traveling speed variable information 1124 to be output to the display 480 of the display device 400.

FIG. 10C illustrates that the route to the destination is varied when the driver is in a shallow sleep state.

Alternatively, when the driver is in a shallow sleep state, the processor 170 or 770 can control the driver sleep state information 1120, the driver's expected wake-up time information 1122, and the route variable information 1126 to the destination to be output to the display 480 of the display device 400.

On the other hand, the processor 170 or 770 receives the running route state information through the communication unit 730, and, when vehicle accident information ahead is received in the route state information, or when the difference between the expected arrival time and the target arrival time is equal to or greater than a predetermined value, searches for a bypass route based on at least one of the road type, the speed limit, the current speed, the curvature of the road, intersections, the amount of traffic, and the construction status, selects one of the searched bypass routes, varies the route, and can control autonomous travel through the varied route.
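The rerouting trigger and bypass selection above can be sketched as follows. The threshold value and the ETA-only ranking of bypass routes are illustrative assumptions; the disclosure lists several additional factors (road type, speed limit, curvature, and so on):

```python
# Reroute when accident information is received ahead, or when the
# expected arrival time exceeds the target by a predetermined margin.
def should_reroute(accident_ahead: bool, eta_min: float,
                   target_min: float, margin_min: float = 10.0) -> bool:
    return accident_ahead or (eta_min - target_min) >= margin_min

def pick_bypass(bypass_routes):
    """Choose the bypass route with the smallest ETA (simplified)."""
    return min(bypass_routes, key=lambda r: r["eta_min"])

print(should_reroute(False, 72.0, 60.0))  # 12 min late -> True
print(pick_bypass([{"name": "route 2", "eta_min": 66.0},
                   {"name": "route 3", "eta_min": 70.0}])["name"])
```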

As shown in FIG. 10D, when congestion occurs on route 1 while the vehicle is traveling, the processor 170 or 770 can control the road congestion information 1130, the destination target arrival time information 1132, and the route variable information 1126 to the destination to be output to the display 480 of the display device 400.

For example, the processor 170 or 770 can vary the route information when the expected arrival time changes due to congestion on the route, and control the autonomous running operation to be performed according to the varied route information.

FIG. 11A illustrates the driver sleeping in a deep sleep state.

FIG. 11B illustrates that the vehicle running speed is varied when the driver is in a deep sleep state.

Specifically, as shown in FIG. 11B, the processor 170 or 770 can control the driver sleep state information 1140, the driver's expected wake-up time information 1142, and the vehicle traveling speed variable information 1144 to be output to the display 480 of the display device 400.

FIG. 11C illustrates that the route to the destination is varied when the driver is in a deep sleep state.

Alternatively, when the driver is in a deep sleep state, the processor 170 or 770 can control the driver sleep state information 1140, the driver's expected wake-up time information 1142, and the route variable information 1146 to the destination to be output to the display 480 of the display device 400.

On the other hand, the processor 170 or 770 receives the running route state information through the communication unit 730, and, when vehicle accident information ahead is received in the route state information, or when the difference between the expected arrival time and the target arrival time is equal to or greater than a predetermined value, searches for a bypass route based on at least one of the road type, the speed limit, the current speed, the curvature of the road, intersections, the amount of traffic, and the construction status, selects one of the searched bypass routes, varies the route, and can control autonomous travel through the varied route.

As shown in FIG. 11D, when congestion occurs on route 1 while the vehicle is traveling, the processor 170 or 770 can control the road congestion information 1150, the destination target arrival time information 1152, and the route variable information 1146 to the destination to be output to the display 480 of the display device 400.

Fig. 12 is a view showing an example of Route 1 to Route 3 described in Figs. 10A to 11D.

When the current position of the vehicle is the Z position, the vehicle 200 can exchange data with the external server 500, and can perform vehicle autonomous operation based on the data from the external server 500, the images from the plurality of cameras 195 and 295, and the distance information from the radar 797 to objects around the vehicle.

In particular, when the destination is set to X, the processor 170 or 770 can calculate route 1, the shortest route, and control vehicle autonomous operation to be performed according to route 1.

On the other hand, the processor 170 or 770 varies the route to the destination to route 2 or route 3, based on at least one of the driver sleep state information and the running route state information, and can control vehicle autonomous operation to be performed through the varied route.

On the other hand, the processor 170 or 770 can control to enter the wake-up mode upon arrival at the destination, and control at least one of the driver's seat vibration, the lighting, the speaker, the window, the sunroof, or the internal temperature, to wake the driver.

On the other hand, the wake-up mode can be entered in other cases as well. That is, the processor 170 or 770 can control to enter the wake-up mode when the driver's mobile terminal 600 receives a telephone call or a message while the vehicle is traveling to the destination, when the vehicle arrives at a rest stop, or when destination change information is received.

The processor 170 or 770 can control a destination arrival message 1420 and various control messages 1425 for the wake-up mode control to be output to the display 480 of the display device 400.

Alternatively, the processor 170 or 770 may control to enter the wake-up mode a predetermined time after arrival at the destination, in accordance with the driver's schedule information. That is, when arriving early, it is possible to enter the wake-up mode after a predetermined time, rather than immediately.

On the other hand, upon arrival at the destination, the processor 170 or 770 can control the route information (route 1 to route 3) and the respective required time information 1426, 1427, and 1428 to be output to the display 480 of the display device 400.

The processor 170 or 770 can receive destination change information from the outside, for example from the external server 500 or the driver's mobile terminal 600.

FIG. 14A illustrates that a destination change message 1525 is received by the vehicle 200 from the mobile terminal 600b of another user. Accordingly, the destination change message 1525 is transmitted to the processor 170 or 770 via the communication unit 730.

Alternatively, a destination change message may be received at the driver's mobile terminal 600a, and the message may then be transmitted to the processor 170 or 770 through the communication unit 730.

The processor 170 or 770 can compare the priority of the destination selected by the driver with the priority of the destination change information received from the outside to determine whether to change the destination, and, when the destination change is determined, can vary the route and control autonomous travel through the varied route.
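The priority comparison above can be sketched as follows. The numeric priority scale (lower number = higher priority) and all names are assumptions for illustration; the disclosure does not specify how priorities are encoded:

```python
# The destination is changed only when the externally received request
# outranks the driver's own selection.
def decide_destination(current, requested):
    """Each destination is (name, priority); lower priority value wins."""
    return requested if requested[1] < current[1] else current

driver_choice = ("X", 2)      # destination the driver selected
external_request = ("X'", 1)  # e.g. an urgent change from the server
print(decide_destination(driver_choice, external_request)[0])  # X'
```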

As shown in FIG. 14B, the processor 170 or 770 can control a route variable message 1520 and the varied route information 1522 to be output to the display 480 of the display device 400.

FIG. 15 is a diagram showing an example of the destination change described in FIGS. 14A and 14B.

When the current position of the vehicle is the Z position, the vehicle 200 can exchange data with the external server 500, and can perform vehicle autonomous operation based on the data from the external server 500, the images from the plurality of cameras 195 and 295, and the distance information from the radar 797 to objects around the vehicle.

In particular, when the destination is set to X, the processor 170 or 770 can calculate route 1, the shortest route, and control vehicle autonomous operation to be performed according to route 1.

On the other hand, when the destination variable information is received, the destination is changed from X to X', so that the route can be changed from route 1 to route 2. By controlling autonomous travel through the varied route, convenience can be increased.

FIG. 16 is a flowchart showing an operation method of an autonomous driving apparatus according to another embodiment of the present invention, and FIGS. 17A to 18 are diagrams referred to in the description of the operation method of FIG. 16.

Referring to the drawings, the vehicle 200 can enter the autonomous operation mode in accordance with the input of the driver or automatically (S1710).

Next, the vehicle 200 can perform the autonomous running operation to the first route toward the predetermined destination (S1720).

Next, the processor 170 or 770 determines whether the driver is in the sleep state while the vehicle travels to the destination along the first route in the autonomous driving mode (S1725).

Steps S1710 through S1725 are the same as steps S910 through S925 of FIG. 8, and a description thereof will be omitted.

Next, when the driver is in the sleep state, the processor 170 or 770 controls to enter the sleep mode during autonomous travel (S1730).

Next, the processor 170 or 770 determines whether destination variable information is received during autonomous travel in the sleep mode (S1735).

If the destination variable information is received, control is performed to enter the wake-up mode (S1750).

Alternatively, after step S1730, the processor 170 or 770 determines whether the vehicle has arrived at the destination (S1745). If it has arrived, control is performed to enter the wake-up mode (S1750).

FIGS. 17A to 17H illustrate various wake-up mode entry conditions.

FIGS. 17A to 17H illustrate, respectively, cases such as arrival at the destination, a destination change, arrival at a rest stop, and reception of a telephone call or message at the driver's mobile terminal; accordingly, the processor 170 or 770 can control corresponding information 1820, 1822, 1824, 1826, 1828, 1830, 1832, and 1834 to be displayed on the display 480 of the display device 400.

On the other hand, the processor 170 or 770 may control at least one of the driver's seat vibration, the lighting, the speaker, the window, the sunroof, or the internal temperature for the wake-up mode. Thus, the driver's sleep can be ended.

On the other hand, the processor 170 or 770 can control a message 1836, indicating that at least one of the driver's seat vibration, the lighting, the speaker, the window, the sunroof, or the internal temperature is being controlled for the wake-up mode, to be output to the display 480 of the display device 400.

Meanwhile, the method of operating the autonomous vehicle of the present invention can be implemented as processor-readable code on a processor-readable recording medium provided in the autonomous vehicle. The processor-readable recording medium includes all kinds of recording devices in which data that can be read by the processor is stored. Examples of the recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments, and that various changes in form and detail may be made therein by those skilled in the art without departing from the spirit and scope of the present invention.

Claims (14)

A plurality of cameras;
Radar;
A communication unit;
And a processor configured to control autonomous operation to a first route toward a destination in an autonomous driving mode and, when the driver is in a sleep state, to vary the route to the destination based on at least one of driver sleep state information and running route state information, and to control autonomous travel of the vehicle through the varied route.
The autonomous vehicle according to claim 1,
Wherein the processor:
Controls autonomous operation to the first route among a plurality of routes to the destination, based on the received destination information, when the destination information is received in the autonomous driving mode.
3. The autonomous vehicle of claim 2,
Further comprising an audio input unit,
Wherein the processor:
Recognizes the driver's voice when the driver's voice is input through the audio input unit, and extracts the destination information based on the recognized driver's voice.
4. The autonomous vehicle of claim 2,
Wherein the communication unit receives the driver's schedule information from the driver's mobile terminal,
And the processor extracts the destination information based on the schedule information.
The autonomous vehicle according to claim 1, further comprising:
An internal camera; and
A driver detection sensor for detecting the driver's body information,
Wherein the processor determines whether the driver is sleeping and the driver's sleep state information based on the image from the internal camera and the driver's body information from the driver detection sensor, and, when the driver is determined to be in the sleep state, varies the route to the destination based on at least one of the driver sleep state information and the running route state information and controls the vehicle to perform autonomous travel through the varied route.
The autonomous vehicle according to claim 1,
Wherein the processor performs at least one of lane change pattern control, acceleration/deceleration pattern control, vibration control, noise cancellation, external light blocking, internal lighting control, temperature control, and driving state display, based on the driver sleep state information.
The autonomous vehicle according to claim 1,
Wherein the processor calculates an expected wake-up time of the driver based on the driver sleep state information, varies the route to the destination based on the calculated expected wake-up time, and controls autonomous travel through the varied route.
8. The autonomous vehicle of claim 1, wherein the processor calculates an expected wake-up time of the driver based on the driver sleep state information and varies the vehicle traveling speed to the destination based on the calculated expected wake-up time.
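The two claims above time the trip to the driver's expected wake-up ("dehydration surface" in the machine translation is a literal rendering of the Korean for waking from sleep): the vehicle can either reroute or adjust speed so arrival roughly coincides with waking. A minimal sketch of the speed-adjustment variant, with an illustrative clamped speed band that is an assumption, not a value from the patent:

```python
def adjust_speed(distance_km, expected_wakeup_min, v_min=40.0, v_max=100.0):
    """Pick a travel speed so the vehicle arrives roughly when the driver
    is expected to wake, clamped to a safe speed band."""
    if expected_wakeup_min <= 0:
        return v_max  # driver is already (about to be) awake: no stalling
    # Speed in km/h needed to cover the distance in the remaining sleep time.
    required = distance_km / (expected_wakeup_min / 60.0)
    return max(v_min, min(v_max, required))
```

When the required speed falls below the band, the claim-7 alternative applies: lengthen the route instead of crawling.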
9. The autonomous vehicle of claim 1, wherein the processor, upon receiving destination change information from the outside, compares the priority of the destination selected by the driver with the priority of the received destination change information to determine whether to change the destination,
and, when the destination change is determined, varies the route based on the received destination change information and controls the vehicle to perform autonomous travel along the varied route.
10. The autonomous vehicle of claim 1, wherein the communication unit receives the travel route state information,
and the processor, when vehicle accident information ahead is contained in the received route state information or when the difference between the expected arrival time and the target time is equal to or greater than a predetermined value, searches for bypass routes based on at least one of the road type, the speed limit, the current speed, and the curvature of the road, selects one of the searched bypass routes to vary the route, and performs autonomous travel along the varied route.
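The claim above has two parts: a trigger (accident ahead, or expected arrival later than target by at least a threshold) and a selection among candidate bypasses scored on road attributes. A minimal sketch; the scoring weights and the 10-minute tolerance are illustrative assumptions, not values from the patent:

```python
def needs_bypass(accident_ahead, eta_min, target_min, tolerance_min=10):
    """Trigger a bypass search on an accident report or a large delay."""
    return accident_ahead or (eta_min - target_min) >= tolerance_min

def pick_bypass(routes):
    """Score candidates on the attributes named in the claim: speed limit,
    curvature, road type. Higher score = better bypass."""
    def score(r):
        bonus = 10 if r["road_type"] == "highway" else 0
        return r["speed_limit"] - 50.0 * r["curvature"] + bonus
    return max(routes, key=score)
```

A production planner would of course score full travel-time estimates rather than static attributes, but the structure (trigger, search, select, reroute) matches the claim.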
11. The autonomous vehicle of claim 1, wherein the processor controls entry into a wake-up mode of the vehicle when the vehicle arrives at the destination, when the driver's mobile terminal receives a call or a message while traveling to the destination, or when the vehicle arrives at a rest area.
12. The autonomous vehicle of claim 11, wherein the processor controls at least one of driver's seat vibration, lighting, a speaker, a window, a sunroof, and the internal temperature upon entry into the wake-up mode.
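The two claims above describe a wake-up mode that fires on arrival, on an incoming call or message, or at a rest area, and then drives cabin actuators (seat vibration, lighting, speaker, window, sunroof, temperature). A minimal sketch of the actuator fan-out; all command names and values are illustrative assumptions:

```python
def wakeup_mode_actions(state):
    """Return cabin actuator commands for wake-up mode. `state` carries
    current cabin readings (only `cabin_temp_c` is consulted here)."""
    actions = {
        "seat_vibration": "on",
        "interior_light": "bright",
        "speaker": "chime",
        "window": "vent",
    }
    if state.get("cabin_temp_c", 22) > 24:
        actions["hvac_target_c"] = 21  # a cooler cabin helps waking
    return actions
```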
13. The autonomous vehicle of claim 1, further comprising:
a display; and
an audio output unit,
wherein the processor outputs at least one of the driver sleep state information, the travel route state information, and the varied route information through the display or the audio output unit.
14. The autonomous vehicle of claim 11, further comprising:
a steering driver configured to drive a steering device;
a brake driver configured to drive a brake device; and
a power source driver configured to drive a power source,
wherein the processor controls at least one of the steering driver, the brake driver, and the power source driver during autonomous travel, based on images from a plurality of cameras and distance information about objects around the vehicle from a radar.
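The claim above closes the loop from perception (camera images plus radar distances) to the three actuation channels. A minimal sketch of the longitudinal half only, mapping a radar gap to either a throttle (power source driver) or brake command; the proportional gain and desired gap are illustrative assumptions, far simpler than any real following controller:

```python
def longitudinal_command(gap_m, desired_gap_m=30.0, kp=0.1):
    """Proportional following control: positive output requests throttle,
    negative output requests braking."""
    return kp * (gap_m - desired_gap_m)

def actuate(gap_m):
    """Dispatch the command to the power source driver or brake driver."""
    u = longitudinal_command(gap_m)
    if u >= 0:
        return ("power", round(u, 2))
    return ("brake", round(-u, 2))
```

Lateral control (the steering driver) would be driven analogously from lane geometry extracted from the camera images.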
KR1020150085407A 2015-06-16 2015-06-16 Autonomous vehicle KR20160148394A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020150085407A KR20160148394A (en) 2015-06-16 2015-06-16 Autonomous vehicle
PCT/KR2016/006348 WO2016204507A1 (en) 2015-06-16 2016-06-15 Autonomous traveling vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150085407A KR20160148394A (en) 2015-06-16 2015-06-16 Autonomous vehicle

Publications (1)

Publication Number Publication Date
KR20160148394A true KR20160148394A (en) 2016-12-26

Family

ID=57733986

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150085407A KR20160148394A (en) 2015-06-16 2015-06-16 Autonomous vehicle

Country Status (1)

Country Link
KR (1) KR20160148394A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180104457A * 2017-03-13 2018-09-21 현대자동차주식회사 Apparatus for sleeping aid in vehicle, system having the same and method thereof
KR20200116180A * 2019-03-11 2020-10-12 현대모비스 주식회사 Apparatus for controlling lane change of vehicle and method thereof
US11453414B2 * 2019-03-27 2022-09-27 Volkswagen Aktiengesellschaft Method and device for adapting a driving strategy of an at least partially automated transportation vehicle
CN111192580A * 2019-12-31 2020-05-22 浙江合众新能源汽车有限公司 Method and device for actively starting ACC function of automobile through voice
CN115209046A * 2021-04-14 2022-10-18 丰田自动车株式会社 Information processing apparatus, non-transitory storage medium, and information processing method
CN115209046B * 2021-04-14 2024-05-28 丰田自动车株式会社 Information processing apparatus, non-transitory storage medium, and information processing method
WO2023229055A1 * 2022-05-23 2023-11-30 엘지전자 주식회사 Vehicle monitoring apparatus, vehicle comprising same and vehicle operating method

Similar Documents

Publication Publication Date Title
KR101741433B1 (en) Driver assistance apparatus and control method for the same
KR101730321B1 (en) Driver assistance apparatus and control method for the same
KR102043060B1 (en) Autonomous drive apparatus and vehicle including the same
KR101750876B1 (en) Display apparatus for vehicle and Vehicle
KR101618551B1 (en) Driver assistance apparatus and Vehicle including the same
KR101582572B1 (en) Driver assistance apparatus and Vehicle including the same
KR101565007B1 (en) Driver assistance apparatus and Vehicle including the same
KR20170010645A (en) Autonomous vehicle and autonomous vehicle system including the same
KR20170011882A (en) Radar for vehicle, and vehicle including the same
KR20160142167A (en) Display apparatus for vhhicle and vehicle including the same
KR101632179B1 (en) Driver assistance apparatus and Vehicle including the same
KR101698781B1 (en) Driver assistance apparatus and Vehicle including the same
KR20160147559A (en) Driver assistance apparatus for vehicle and Vehicle
KR20160148394A (en) Autonomous vehicle
KR20170140284A (en) Vehicle driving aids and vehicles
KR101980547B1 (en) Driver assistance apparatus for vehicle and Vehicle
KR20150072942A (en) Driver assistance apparatus and Vehicle including the same
KR101641491B1 (en) Driver assistance apparatus and Vehicle including the same
KR20160148395A (en) Autonomous vehicle
KR20170043212A (en) Apparatus for providing around view and Vehicle
KR101822896B1 (en) Driver assistance apparatus and control method for the same
KR20160064762A (en) Display apparatus for vhhicleand vehicle including the same
KR101872477B1 (en) Vehicle
KR20160144643A (en) Apparatus for prividing around view and vehicle including the same
KR101816570B1 (en) Display apparatus for vehicle