Hereinafter, the present invention will be described in detail with reference to the drawings.
The suffix "module" and " part "for components used in the following description are given merely for convenience of description, and do not give special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.
The vehicle described herein may include a car and a motorcycle. Hereinafter, the vehicle will be described mainly with respect to a car.
Meanwhile, the vehicle described in the present specification may include a vehicle having an engine as a power source, a hybrid vehicle having both an engine and an electric motor as power sources, and an electric vehicle having an electric motor as a power source.
FIG. 1 is a conceptual diagram of a vehicle communication system including a mobile camera apparatus according to an embodiment of the present invention.
Referring to the drawing, a vehicle communication system 10 may include a vehicle 200, terminals 600a and 600b, and a server 500.
The vehicle 200 may include an autonomous traveling device 100, a vehicle display device 400, and the like in the interior of the vehicle.
Meanwhile, the autonomous traveling device 100 may include a vehicle driving assistance apparatus 100a, a surrounding view providing apparatus 100b, and the like.
For example, for autonomous driving of the vehicle, when the vehicle speed is equal to or greater than a predetermined speed, autonomous travel may be performed through the vehicle driving assistance apparatus 100a, and when the vehicle speed is less than the predetermined speed, autonomous travel may be performed through the surrounding view providing apparatus 100b.
As another example, when the vehicle driving assistance apparatus 100a and the surrounding view providing apparatus 100b operate together for autonomous driving, a greater weight may be given to the vehicle driving assistance apparatus 100a at or above the predetermined speed, so that autonomous travel is performed mainly through the vehicle driving assistance apparatus 100a, and a greater weight may be given to the surrounding view providing apparatus 100b below the predetermined speed, so that autonomous travel is performed mainly through the surrounding view providing apparatus 100b.
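For illustration only, a minimal sketch of such speed-dependent weighting follows; the threshold value, the fixed weights, the linear blend, and all function names are assumptions, not the claimed implementation.

```python
# Illustrative sketch: blend steering commands from the two modules
# according to vehicle speed. All names and values are assumptions.

SPEED_THRESHOLD_KPH = 30.0  # hypothetical "predetermined speed"

def blended_steering(speed_kph: float,
                     assist_cmd: float,       # from driving assistance apparatus 100a
                     around_view_cmd: float   # from surrounding view providing apparatus 100b
                     ) -> float:
    """Weight the high-speed module more above the threshold, and the
    low-speed (around view) module more below it."""
    if speed_kph >= SPEED_THRESHOLD_KPH:
        w_assist = 0.8  # driving assistance apparatus dominates at speed
    else:
        w_assist = 0.2  # around view apparatus dominates at low speed / parking
    return w_assist * assist_cmd + (1.0 - w_assist) * around_view_cmd

print(blended_steering(50.0, 0.1, -0.3))  # high speed: mostly assist command
print(blended_steering(10.0, 0.1, -0.3))  # low speed: mostly around view command
```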
Meanwhile, the vehicle driving assistance apparatus 100a, the surrounding view providing apparatus 100b, and the vehicle display apparatus 400 may exchange data with the terminals 600a and 600b or the server 500 by using a communication unit (not shown) provided therein or the communication unit provided in the vehicle 200.
For example, when the mobile terminal 600a is located inside or near the vehicle, at least one of the vehicle driving assistance apparatus 100a, the surrounding view providing apparatus 100b, and the vehicle display apparatus 400 can exchange data with the terminal 600a by short-range communication.
As another example, when the terminal 600b is located at a remote place outside the vehicle, at least one of the vehicle driving assistance apparatus 100a, the surrounding view providing apparatus 100b, and the vehicle display apparatus 400 can exchange data with the terminal 600b or the server 500 via the network 570 by remote communication.
The terminals 600a and 600b may be mobile terminals, such as mobile phones, smartphones, tablet PCs, and wearable devices such as smart watches, or fixed terminals, such as a TV or a monitor. Hereinafter, the terminal 600 will be described mainly as a mobile terminal such as a smartphone.
On the other hand, the server 500 may be a server provided by a vehicle manufacturer or a server operated by a provider providing a vehicle-related service. For example, it may be a server operated by a provider providing information on road traffic conditions and the like.
On the other hand, the vehicle driving assistant 100a can generate and provide vehicle-related information by signal processing the stereo image received from the stereo camera 195 based on computer vision. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver.
Alternatively, the vehicle driving assistance apparatus 100a may generate a control signal for autonomous travel of the vehicle based on the stereo image received from the stereo camera 195 and distance information on objects around the vehicle from the radar 797. For example, it is possible to output a control signal for controlling at least one of a steering driving unit, a brake driving unit, and a power source driving unit during autonomous travel.
Meanwhile, the surrounding view providing apparatus 100b may provide a plurality of images captured by the plurality of cameras 295a, 295b, 295c, and 295d to a processor (270 in FIG. 3C or 3D), and the processor (270 in FIG. 3C or 3D) can combine the plurality of images to generate and provide an around view image.
Meanwhile, the vehicle display device 400 may be an AVN (Audio Video Navigation) device.
Meanwhile, the vehicle display apparatus 400 may include a space recognition sensor unit and a touch sensor unit, whereby a remote approach can be sensed by the space recognition sensor unit and a near touch approach can be sensed through the touch sensor unit. A user interface corresponding to the sensed user gesture or touch can then be provided.
The mobile camera apparatus 300 according to the embodiment of the present invention protrudes above the vehicle 200 when the distance to an object around the vehicle 200 is within a predetermined distance, captures an image of the surroundings of the vehicle 200, and transmits the captured image to the vehicle 200 through the communication unit 330.
Accordingly, since the mobile camera apparatus 300 protrudes above the vehicle 200 immediately before or at the time of an accident, images of the entire surroundings of the vehicle 200 can be secured, the captured images can be transmitted to the vehicle 200, and the accident-related images can be stored in a memory within the vehicle 200.
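A minimal sketch of the launch-and-record logic described above, assuming hypothetical deploy/capture methods and a placeholder distance threshold; the disclosure does not prescribe such an API.

```python
# Hedged sketch: when any detected object comes within a predetermined
# distance, the mobile camera is deployed above the vehicle and its frames
# are streamed back and stored. All names and values are illustrative.

LAUNCH_DISTANCE_M = 1.0  # hypothetical "predetermined distance"

def monitor_and_deploy(get_min_object_distance, mobile_camera, vehicle_memory):
    """get_min_object_distance: callable returning nearest-object distance in meters.
    mobile_camera: object with deploy() and capture_frame() methods (assumed).
    vehicle_memory: list-like store inside the vehicle 200."""
    if get_min_object_distance() <= LAUNCH_DISTANCE_M:
        mobile_camera.deploy()               # protrude above the vehicle 200
        for _ in range(100):                 # record for a bounded interval
            frame = mobile_camera.capture_frame()
            vehicle_memory.append(frame)     # accident-related images persisted
```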
Meanwhile, the mobile camera apparatus 300 according to the embodiment of the present invention may be a passive mobile camera apparatus or an active mobile camera apparatus.
The passive mobile camera apparatus is a disposable mobile camera apparatus and can be made to protrude above the vehicle 200 by a moving part provided inside the vehicle 200, without a separate movement power source. For example, it may be launched above the vehicle 200 by the moving part, particularly a projectile device, provided inside the vehicle 200. The disposable mobile camera apparatus then descends gradually from above the vehicle 200 to the ground by means of a provided parachute, and the captured images can be transmitted to the vehicle 200 in the meantime.
On the other hand, the active mobile camera apparatus has a separate movement power source. For example, the active mobile camera apparatus may be a drone, a helicam, or the like.
Accordingly, after the active mobile camera apparatus is made to protrude above the vehicle 200 by the moving part provided inside the vehicle 200, it can fly around above the vehicle 200 using its own movement power source. The active mobile camera apparatus can transmit the captured images to the vehicle 200 while flying around above the vehicle 200.
On the other hand, the active mobile camera device 300 will be described in detail with reference to Fig. 8 and the following figures.
FIG. 2A is a diagram showing the appearance of a vehicle having various cameras.
Referring to the drawing, the vehicle 200 may include wheels 103FR, 103FL, 103RL, ... rotated by a power source, a steering wheel 250 for adjusting the traveling direction of the vehicle 200, the stereo camera 195 provided inside the vehicle 200 for the vehicle driving assistance apparatus 100a, and the plurality of cameras 295a, 295b, 295c, and 295d mounted on the vehicle 200 for the surrounding view providing apparatus 100b. In the figure, only the left camera 295a and the front camera 295d are shown for convenience.
The stereo camera 195 may include a plurality of cameras, and the stereo image obtained by the plurality of cameras may be signal-processed in the vehicle driving assistance apparatus (100a in Fig. 3).
On the other hand, the figure illustrates that the stereo camera 195 includes two cameras.
The plurality of cameras 295a, 295b, 295c, and 295d can be activated when the vehicle speed is equal to or lower than a predetermined speed or when the vehicle moves backward, and can each acquire a captured image. The images acquired by the plurality of cameras can be signal-processed within the surrounding view providing apparatus (100b in FIG. 3C or 3D).
FIG. 2B is a view showing the appearance of a stereo camera attached to the vehicle of FIG. 2A.
Referring to the drawing, the stereo camera module 195 may include a first camera 195a having a first lens 193a, and a second camera 195b having a second lens 193b.
The stereo camera module 195 may include a first light shielding portion 192a and a second light shielding portion 192b for shielding light incident on the first lens 193a and the second lens 193b, respectively.
The stereo camera module 195 in the drawing may be a structure detachable from the ceiling or the windshield of the vehicle 200.
The vehicle driving assistance apparatus (100a in FIG. 3) having such a stereo camera module 195 can obtain a stereo image of the area in front of the vehicle from the stereo camera module 195, perform disparity detection based on the stereo image, perform object detection for at least one of the stereo images based on the disparity information, and, after the object detection, continuously track the motion of the object.
FIG. 2C is a view schematically showing the positions of the plurality of cameras attached to the vehicle of FIG. 2A, and FIG. 2D illustrates an example of an around view image based on the images captured by the plurality of cameras of FIG. 2C.
First, referring to FIG. 2C, a plurality of cameras 295a, 295b, 295c, and 295d may be disposed on the left, rear, right, and front of the vehicle, respectively.
In particular, the left camera 295a and the right camera 295c may be disposed in a case that surrounds the left side mirror and a case that surrounds the right side mirror, respectively.
On the other hand, the rear camera 295b and the front camera 295d can be disposed in the vicinity of a trunk switch and in the vicinity of an emblem or a radiator grille, respectively.
Each of the plurality of images captured by the plurality of cameras 295a, 295b, 295c, and 295d is transmitted to a processor (270 in FIG. 3C or 3D) in the vehicle 200, and the processor (270 in FIG. 3C or 3D) combines the plurality of images to generate an around view image.
FIG. 2D illustrates an example of the around view image 210. The around view image 210 may include a first image area 295ai from the left camera 295a, a second image area 295bi from the rear camera 295b, a third image area 295ci from the right camera 295c, and a fourth image area 295di from the front camera 295d.
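For illustration, a sketch of composing such an around view image from the four camera inputs follows; the per-camera homographies and the naive overlap handling are assumptions, since real systems derive these from camera calibration.

```python
import cv2
import numpy as np

# Hedged sketch of composing a top-down around view image from four
# cameras. The homography matrices are placeholders for illustration;
# they would come from per-camera calibration in practice.

CANVAS = (400, 400)  # output bird's-eye canvas (width, height)

def around_view(left, rear, right, front, homographies):
    """Warp each camera image into the common ground plane and paste it
    into its region of the canvas (cf. areas 295ai..295di in FIG. 2D)."""
    canvas = np.zeros((CANVAS[1], CANVAS[0], 3), dtype=np.uint8)
    for img, H in zip((left, rear, right, front), homographies):
        warped = cv2.warpPerspective(img, H, CANVAS)
        mask = warped.sum(axis=2) > 0       # naive blending: last writer wins
        canvas[mask] = warped[mask]
    return canvas
```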
FIGS. 3A and 3B illustrate various examples of an internal block diagram of the autonomous traveling device of FIG. 1.
FIGS. 3A and 3B illustrate an internal block diagram of the autonomous traveling device 100, for the vehicle driving assistance apparatus 100a.
The vehicle driving assistant 100a can process the stereo image received from the stereo camera 195 based on computer vision to generate vehicle related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver.
Referring to FIG. 3A, the vehicle driving assistance apparatus 100a may include a communication unit 120, an interface unit 130, a memory 140, a processor 170, a power supply unit 190, and the stereo camera 195.
The communication unit 120 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 120 can exchange data with a mobile terminal of a vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.
The communication unit 120 can receive weather information and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information, from the mobile terminal 600 or the server 500. Meanwhile, the vehicle driving assistance apparatus 100a may transmit real-time traffic information, based on the stereo image, to the mobile terminal 600 or the server 500.
On the other hand, when the user is aboard the vehicle, the user's mobile terminal 600 and the vehicle driving assistant 100a can perform pairing with each other automatically or by execution of the user's application.
The interface unit 130 can receive vehicle-related data or transmit a signal processed or generated by the processor 170 to the outside. To this end, the interface unit 130 can perform data communication with the ECU 770, the AVN (Audio Video Navigation) apparatus 400, the sensor unit 760, and the like in the vehicle by a wired or wireless communication method.
The interface unit 130 can receive map information related to vehicle driving through data communication with the vehicle display apparatus 400.
On the other hand, the interface unit 130 can receive the sensor information from the ECU 770 or the sensor unit 760.
Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.
Such sensor information may be obtained from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward movement sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, and the like. Meanwhile, the position module may include a GPS module for receiving GPS information.
On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.
The memory 140 may store various data for the overall operation of the vehicle driving assistance apparatus 100a, such as a program for processing or control by the processor 170.
An audio output unit (not shown) converts an electric signal from the processor 170 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit (not shown) can also output sound corresponding to the operation of the input unit 110, that is, the button.
An audio input unit (not shown) can receive a user's voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 170.
The processor 170 controls the overall operation of each unit in the vehicle driving assistant 100a.
In particular, the processor 170 performs signal processing based on computer vision. Accordingly, the processor 170 can obtain a stereo image of the area in front of the vehicle from the stereo camera 195, perform disparity calculation for the area in front of the vehicle based on the stereo image, perform object detection for at least one of the stereo images based on the calculated disparity information, and, after the object detection, continuously track the motion of the object.
In particular, in detecting objects, the processor 170 can perform lane detection, nearby vehicle detection, pedestrian detection, traffic sign detection, road surface detection, and the like.
In addition, the processor 170 may calculate the distance to a detected nearby vehicle, the speed of the detected nearby vehicle, the speed difference with the detected nearby vehicle, and the like.
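As an illustrative sketch, such speed computations can be derived from per-frame distance estimates; the frame rate and the sign convention below are assumptions.

```python
# Illustrative sketch of the distance/speed computations mentioned above:
# the relative speed is the change in distance over the frame interval,
# and the neighbor's absolute speed follows from our own.

FRAME_DT_S = 1.0 / 30.0  # assumed 30 fps camera

def relative_speed_mps(dist_prev_m: float, dist_curr_m: float) -> float:
    return (dist_curr_m - dist_prev_m) / FRAME_DT_S  # > 0: pulling away

def neighbor_speed_mps(ego_speed_mps: float, rel_speed_mps: float) -> float:
    return ego_speed_mps + rel_speed_mps

rel = relative_speed_mps(20.0, 19.8)   # negative: closing at ~6 m/s
print(neighbor_speed_mps(25.0, rel))   # neighbor travels at ~19 m/s
```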
Meanwhile, the processor 170 can receive weather information and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information, through the communication unit 120.
On the other hand, the processor 170 can grasp, in real time, the traffic situation information on the surroundings of the vehicle based on the stereo image in the vehicle driving assistant device 100a.
Meanwhile, the processor 170 can receive map information and the like from the vehicle display apparatus 400 through the interface unit 130.
Meanwhile, the processor 170 can receive sensor information from the ECU 770 or the sensor unit 760 through the interface unit 130. Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.
The power supply unit 190 can supply power necessary for the operation of each component under the control of the processor 170. In particular, the power supply unit 190 can receive power from a battery or the like inside the vehicle.
The stereo camera 195 may include a plurality of cameras. Hereinafter, as described with reference to FIG. 2B and the like, it is assumed that two cameras are provided.
The stereo camera 195 may have a structure attachable to and detachable from the ceiling or the windshield of the vehicle 200, and may include a first camera 195a having a first lens 193a and a second camera 195b having a second lens 193b.
The stereo camera 195 may also include a first light shielding portion 192a and a second light shielding portion 192b for shielding light incident on the first lens 193a and the second lens 193b, respectively.
Next, referring to FIG. 3B, the vehicle driving assistance apparatus 100a of FIG. 3B further includes an input unit 110, a display 180, and an audio output unit 185 in addition to the components of the vehicle driving assistance apparatus 100a of FIG. 3A. Hereinafter, only the input unit 110, the display 180, and the audio output unit 185 will be described.
The input unit 110 may include a plurality of buttons or a touch screen attached to the vehicle driving assistance apparatus 100a, particularly to the stereo camera 195. It is possible to turn on and operate the vehicle driving assistance apparatus 100a through the plurality of buttons or the touch screen. In addition, various other input operations may be performed.
The display 180 may display an image related to the operation of the vehicle driving assistance apparatus. For such image display, the display 180 may include a cluster or a HUD (Head Up Display) on the front inside of the vehicle. When the display 180 is a HUD, it may include a projection module that projects an image onto the windshield of the vehicle 200.
The audio output unit 185 outputs sound to the outside based on an audio signal processed by the processor 170. To this end, the audio output unit 185 may include at least one speaker.
FIGS. 3C and 3D illustrate various examples of an internal block diagram of the autonomous traveling device of FIG. 1.
FIGS. 3C and 3D illustrate an internal block diagram of the surrounding view providing apparatus 100b in the autonomous traveling device 100.
The surrounding view providing apparatus 100b of FIGS. 3C and 3D can combine a plurality of images received from the plurality of cameras 295a, ..., 295d to generate an around view image.
Meanwhile, the surrounding view providing apparatus 100b can perform object detection, verification, and tracking for objects located in the vicinity of the vehicle based on the plurality of images received from the plurality of cameras 295a, ..., 295d.
Referring to FIG. 3C, the surrounding view providing apparatus 100b of FIG. 3C may include a communication unit 220, an interface unit 230, a memory 240, a processor 270, a display 280, a power supply unit 290, and a plurality of cameras 295a, ..., 295d.
The communication unit 220 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 220 can exchange data with the mobile terminal of the vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.
The communication unit 220 can receive, from the mobile terminal 600 or the server 500, schedule information related to the vehicle driver's scheduled times or movement positions, weather information, and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information. Meanwhile, the surrounding view providing apparatus 100b may transmit real-time traffic information, based on images, to the mobile terminal 600 or the server 500.
Meanwhile, when the user boards the vehicle, the user's mobile terminal 600 and the surrounding view providing apparatus 100b can perform pairing with each other automatically or upon execution of an application by the user.
The interface unit 230 can receive vehicle-related data or transmit a signal processed or generated by the processor 270 to the outside. To this end, the interface unit 230 can perform data communication with the ECU 770, the sensor unit 760, and the like in the vehicle by a wired or wireless communication method.
On the other hand, the interface unit 230 can receive the sensor information from the ECU 770 or the sensor unit 760.
Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.
On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.
The memory 240 may store various data for the overall operation of the surrounding view providing apparatus 100b, such as a program for processing or control by the processor 270.
On the other hand, the memory 240 may store map information related to the vehicle driving.
The processor 270 controls the overall operation of each unit in the surrounding view providing apparatus 100b.
Particularly, the processor 270 can acquire a plurality of images from the plurality of cameras 295a, ..., 295d, and combine the plurality of images to generate an around view image.
Meanwhile, the processor 270 may perform signal processing based on computer vision. For example, based on the plurality of images or the generated around view image, the processor 270 can perform disparity calculation for the surroundings of the vehicle, perform object detection in the image based on the calculated disparity information, and, after the object detection, continuously track the motion of the object.
In particular, in detecting objects, the processor 270 can perform lane detection, nearby vehicle detection, pedestrian detection, obstacle detection, parking area detection, road surface detection, and the like.
Then, the processor 270 can perform a distance calculation on the detected nearby vehicles or pedestrians.
Meanwhile, the processor 270 can receive sensor information from the ECU 770 or the sensor unit 760 through the interface unit 230. Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.
The display 280 may display the around view image generated by the processor 270. Meanwhile, it is also possible to provide various user interfaces when displaying the around view image, and to provide a touch sensor enabling touch input to the provided user interface.
Meanwhile, the display 280 may include a cluster or a HUD (Head Up Display) on the front inside of the vehicle. When the display 280 is a HUD, it may include a projection module that projects an image onto the windshield of the vehicle 200.
The power supply unit 290 can supply power necessary for the operation of each component under the control of the processor 270. In particular, the power supply unit 290 can receive power from a battery or the like inside the vehicle.
The plurality of cameras 295a, ..., 295d are cameras for providing the around view image, and are preferably wide-angle cameras.
The surrounding view providing apparatus 100b of FIG. 3D is similar to the surrounding view providing apparatus 100b of FIG. 3C, but further includes an input unit 210, an audio output unit 285, and an audio input unit 286. Hereinafter, only the input unit 210, the audio output unit 285, and the audio input unit 286 will be described.
The input unit 210 may include a plurality of buttons attached to the periphery of the display 280 or a touch screen disposed on the display 280. It is possible to turn on the power of the surrounding view providing apparatus 100b and operate it through a plurality of buttons or a touch screen. In addition, it is also possible to perform various input operations.
The audio output unit 285 converts an electric signal from the processor 270 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 285 can also output sound corresponding to the operation of the input unit 210, that is, the button.
The audio input unit 286 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 270.
Meanwhile, the surrounding view providing apparatus 100b of FIG. 3C or 3D may be an AVN (Audio Video Navigation) apparatus.
FIG. 3E is an internal block diagram of the vehicle display apparatus of FIG. 1.
The vehicle display apparatus 400 according to an embodiment of the present invention may include an input unit 310, a communication unit 320, a space recognition sensor unit 321, a touch sensor unit 326, an interface unit 330, a memory 340, a processor 370, a display 480, an audio input unit 383, an audio output unit 385, and a power supply unit 390.
The input unit 310 includes a button attached to the display device 400. For example, a power button may be provided. In addition, it may further include at least one of a menu button, an up / down button, and a left / right button.
The input signal through the input unit 310 may be transmitted to the processor 370.
The communication unit 320 can exchange data with an adjacent electronic device. For example, data can be exchanged with a vehicle internal electronic device or a server (not shown) in a wireless manner. Particularly, data can be exchanged wirelessly with the mobile terminal of the vehicle driver. Various wireless data communication methods such as Bluetooth, WiFi, and APiX are available.
For example, when the user boards the vehicle, the user's mobile terminal and the display apparatus 400 can perform pairing with each other automatically or upon execution of an application by the user.
On the other hand, the communication unit 320 may include a GPS receiving device, and can receive GPS information, that is, position information of the vehicle.
The space recognition sensor unit 321 can detect the approach or movement of the user's hand. For this purpose, it may be disposed around the display 480.
The space recognition sensor unit 321 may perform space recognition on an optical basis or on an ultrasonic basis. Hereinafter, space recognition on an optical basis will be mainly described.
The space recognition sensor unit 321 can sense the approach or movement of the user's hand based on output light that it emits and the corresponding received light. In particular, the processor 370 can perform signal processing on the electrical signals of the output light and the received light.
For this purpose, the spatial recognition sensor unit 321 may include a light output unit 322 and a light receiving unit 324.
The light output unit 322 may output, for example, infrared (IR) light for detecting a user's hand located in front of the display apparatus 400.
The light receiving unit 324 receives light that is scattered or reflected when the light output from the light output unit 322 meets the user's hand located in front of the display apparatus 400. Specifically, the light receiving unit 324 may include a photodiode and convert the received light into an electrical signal through the photodiode. The converted electrical signal may be input to the processor 370.
The touch sensor unit 326 senses a floating touch and a direct touch. For this purpose, the touch sensor unit 326 may include an electrode array, an MCU, and the like. When the touch sensor unit is operated, an electric signal is supplied to the electrode array, and an electric field is formed on the electrode array.
The touch sensor unit 326 can operate when the intensity of the light received by the space recognition sensor unit 321 is equal to or higher than a first level.
That is, when the user's hand approaches within a predetermined distance, an electrical signal may be supplied to the electrode array and the like in the touch sensor unit 326. The electrical signal supplied to the electrode array forms an electric field on the electrode array, and this electric field is used to sense a capacitance change. Based on the sensed capacitance change, the touch sensor unit senses a floating touch and a direct touch.
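A hedged sketch of this two-stage sensing scheme follows, with illustrative threshold values and a simplified capacitance model; the disclosure does not specify these numbers.

```python
# Hedged sketch: the optical (space recognition) sensor gates the
# capacitive touch sensor, which is powered only once the received-light
# intensity crosses a first level. Values are illustrative assumptions.

LIGHT_LEVEL_1 = 0.5   # received-light intensity enabling the touch sensor

class TouchController:
    def __init__(self):
        self.touch_sensor_on = False

    def on_light_sample(self, intensity: float):
        # Space recognition sensor unit 321: far-range presence detection.
        self.touch_sensor_on = intensity >= LIGHT_LEVEL_1

    def on_capacitance_sample(self, delta_c: float, threshold: float = 0.1):
        # Touch sensor unit 326: floating touch once the hand is near.
        if self.touch_sensor_on and delta_c >= threshold:
            return "floating_touch"
        return None
```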
In particular, the z-axis information can be sensed by the touch sensor unit 326 in addition to the x- and y-axis information according to the approach of the user's hand.
The interface unit 330 can exchange data with other electronic devices in the vehicle. For example, the interface unit 330 can perform data communication with an ECU or the like in the vehicle by a wired communication method.
Specifically, the interface unit 330 can receive the vehicle status information by data communication with an ECU or the like in the vehicle.
Here, the vehicle status information may include at least one of battery information, fuel information, vehicle speed information, tire information, steering information based on steering wheel rotation, vehicle lamp information, vehicle interior temperature information, vehicle exterior temperature information, and vehicle interior humidity information.
The interface unit 330 may further receive GPS information from an ECU or the like in the vehicle. Alternatively, it is also possible to transmit GPS information, which is received by the display device 400, to an ECU or the like.
The memory 340 may store various data for the operation of the display apparatus 400, such as a program for processing or control by the processor 370.
For example, the memory 340 may store map information for guiding the traveling path of the vehicle.
As another example, the memory 340 may store user information and the user's mobile terminal information for pairing with the user's mobile terminal.
The audio output unit 385 converts an electric signal from the processor 370 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 385 can also output sound corresponding to the operation of the input unit 310, that is, the button.
The audio input unit 383 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 370.
The processor 370 controls the overall operation of each unit in the vehicle display apparatus 400.
When the user's hand continuously approaches the display apparatus 400, the processor 370 can sequentially compute x-, y-, and z-axis information for the user's hand based on the light received by the light receiving unit 324. At this time, the z-axis information decreases sequentially.
Meanwhile, when the user's hand approaches within a second distance, closer to the display 480 than a first distance, the processor 370 can control the touch sensor unit 326 to operate. That is, the processor 370 can control the touch sensor unit 326 to operate when the intensity of the electrical signal from the space recognition sensor unit 321 is equal to or higher than a reference level. Thereby, an electrical signal is supplied to each electrode array in the touch sensor unit 326.
On the other hand, the processor 370 can sense the floating touch based on the sensing signal sensed by the touch sensor unit 326 when the user's hand is located within the second distance. In particular, the sensing signal may be a signal indicative of a change in capacitance.
Based on the sensed signal, the processor 370 can compute x- and y-axis information of the floating touch input, and can compute z-axis information, the distance between the display apparatus 400 and the user's hand, based on the magnitude of the capacitance change.
On the other hand, the processor 370 can change the grouping for the electrode array in the touch sensor unit 326 according to the distance of the user's hand.
Specifically, the processor 370 can change the grouping of the electrode array in the touch sensor unit 326 based on the approximate z-axis information computed from the light received by the space recognition sensor unit 321. The greater the distance, the larger the size of the electrode array group can be set.
That is, the processor 370 can vary the size of the touch sensing cell with respect to the electrode array in the touch sensor unit 326 based on the distance information of the user's hand, that is, the z-axis information.
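A minimal sketch of such distance-dependent grouping, assuming illustrative distance bands and cell sizes not given in this disclosure:

```python
# Hedged sketch: the farther the hand (larger z), the larger the group of
# electrodes treated as one touch sensing cell, trading resolution for
# sensitivity. The bands and sizes below are assumptions.

def touch_cell_group_size(z_mm: float) -> int:
    """Return the side length, in electrodes, of one touch sensing cell."""
    if z_mm > 100:      # far: coarse cells, e.g. 4x4 electrodes
        return 4
    elif z_mm > 50:     # mid-range: 2x2 electrodes
        return 2
    return 1            # direct/near touch: per-electrode resolution

for z in (150, 80, 10):
    print(z, touch_cell_group_size(z))
```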
The display 480 may separately display an image corresponding to the function set for the button. For such image display, the display 480 may be implemented as a variety of display modules such as an LCD, an OLED, and the like. On the other hand, the display 480 may be implemented as a cluster on the inside of the vehicle interior.
The power supply unit 390 can supply power necessary for the operation of each component under the control of the processor 370.
FIGS. 4A and 4B illustrate various examples of internal block diagrams of the processors of FIGS. 3A to 3D, and FIG. 5 is a diagram illustrating object detection in the processors of FIGS. 4A and 4B.
FIG. 4A is an example of an internal block diagram of the processor 170 of the vehicle driving assistance apparatus 100a of FIGS. 3A and 3B or the processor 270 of the surrounding view providing apparatus 100b of FIGS. 3C and 3D.
The processor 170 or 270 may include an image preprocessing unit 410, a disparity computing unit 420, an object detecting unit 434, an object tracking unit 440, and an application unit 450.
The image preprocessing unit 410 may receive the plurality of images from the plurality of cameras 295a, ..., 295d or the generated around view image and perform preprocessing.
Specifically, the image preprocessing unit 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera gain control, and the like on the plurality of images or the generated around view image. Thereby, an image sharper than the images captured by the plurality of cameras 295a, ..., 295d or the generated around view image can be acquired.
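For illustration, a minimal preprocessing sketch using standard OpenCV operations follows; the specific algorithms and parameter values are assumptions, as the disclosure does not fix them.

```python
import cv2

# A minimal preprocessing sketch in the spirit of the image preprocessing
# unit 410. Parameter values are illustrative only.

def preprocess(bgr_frame):
    # Noise reduction on the color frame.
    denoised = cv2.fastNlMeansDenoisingColored(bgr_frame, None, 10, 10, 7, 21)
    # Color space conversion (CSC) to separate luma from chroma.
    ycrcb = cv2.cvtColor(denoised, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)
    # Crude gain/contrast control via histogram equalization of luma.
    y = cv2.equalizeHist(y)
    return cv2.cvtColor(cv2.merge((y, cr, cb)), cv2.COLOR_YCrCb2BGR)
```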
The disparity calculating unit 420 receives the plurality of images or the generated around view image signal-processed by the image preprocessing unit 410, performs stereo matching on the received images, and obtains a disparity map according to the stereo matching. That is, disparity information for the surroundings of the vehicle can be obtained.
At this time, the stereo matching may be performed on a pixel-by-pixel basis or on a predetermined block basis. Meanwhile, the disparity map may mean a map in which binocular parallax information between the images, i.e., the left and right images, is expressed as numerical values.
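A hedged sketch of block-based stereo matching on a rectified left/right pair, using OpenCV's semi-global matcher as a stand-in for the disparity calculating unit 420; the matcher parameters are illustrative.

```python
import cv2

# Hedged sketch of disparity computation via block-based stereo matching.

def disparity_map(left_gray, right_gray):
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=64,   # must be divisible by 16
        blockSize=9,         # matching window (per-block matching)
    )
    # StereoSGBM returns fixed-point disparity scaled by 16.
    return matcher.compute(left_gray, right_gray).astype("float32") / 16.0
```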
The segmentation unit 432 may perform segmentation and clustering in the image based on the disparity information from the disparity calculating unit 420.
Specifically, the segmentation unit 432 can separate the background and the foreground for at least one of the images based on the disparity information.
For example, an area in which the disparity information is equal to or less than a predetermined value within the disparity map can be computed as the background, and the corresponding part can be excluded. Thereby, the foreground can be relatively separated.
As another example, an area in which the disparity information is equal to or greater than the predetermined value within the disparity map can be computed as the foreground, and the corresponding part can be extracted. Thereby, the foreground can be separated.
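A minimal sketch of this disparity-threshold separation, assuming an illustrative threshold value:

```python
import numpy as np

# Pixels below the cutoff are treated as background, the rest as foreground.
# The threshold value is an assumption; the disclosure leaves it open.

def split_foreground(disparity: np.ndarray, threshold: float = 8.0):
    foreground_mask = disparity >= threshold   # near objects: large disparity
    background_mask = ~foreground_mask         # far scene: small disparity
    return foreground_mask, background_mask
```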
In this way, by separating the foreground and the background based on the disparity information extracted from the images, it becomes possible to reduce signal-processing time, the amount of signal processing, and the like in the subsequent object detection.
Next, the object detecting unit 434 can detect an object based on the image segment from the segmentation unit 432.
That is, the object detecting unit 434 can detect an object for at least one of the images based on the disparity information.
More specifically, the object detecting unit 434 can detect an object for at least one of the images. For example, an object can be detected from a foreground separated by an image segment.
Next, the object verification unit 436 classifies and verifies the separated objects.
For this purpose, the object verification unit 436 may use an identification method based on a neural network, a support vector machine (SVM) method, an AdaBoost identification method using Haar-like features, a histograms of oriented gradients (HOG) technique, or the like.
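As a hedged example of one named technique, the sketch below uses HOG features with a pretrained linear SVM via OpenCV's default pedestrian detector; it stands in for, and does not reproduce, the verifier of this disclosure.

```python
import cv2

# Hedged example: HOG features classified by a linear SVM, using OpenCV's
# pretrained pedestrian model as an illustrative stand-in.

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_pedestrians(bgr_frame):
    boxes, weights = hog.detectMultiScale(bgr_frame, winStride=(8, 8))
    return [(x, y, w, h) for (x, y, w, h) in boxes]
```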
Meanwhile, the object verification unit 436 can verify objects by comparing the detected objects with objects stored in the memory 240.
For example, the object verification unit 436 can verify nearby vehicles, lanes, road surfaces, signs, hazardous areas, tunnels, and the like located around the vehicle.
The object tracking unit 440 performs tracking of the verified objects. For example, it is possible to sequentially verify the objects in the acquired images, compute the motion or motion vectors of the verified objects, and track the movement of the objects based on the computed motion or motion vectors. Thereby, nearby vehicles, lanes, road surfaces, signs, hazardous areas, and the like located around the vehicle can be tracked.
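A hedged sketch of motion-vector tracking using sparse Lucas-Kanade optical flow as a stand-in for the tracking described above; the feature selection and averaging are illustrative.

```python
import cv2
import numpy as np

# Hedged sketch: follow feature points on a verified object between
# consecutive frames; their mean displacement is the motion vector.

def track_points(prev_gray, curr_gray, points):
    """points: Nx1x2 float32 array of features inside the object's box."""
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   points, None)
    good_old = points[status.flatten() == 1]
    good_new = next_pts[status.flatten() == 1]
    motion_vector = (good_new - good_old).mean(axis=0) if len(good_new) \
        else np.zeros((1, 2), dtype=np.float32)
    return good_new, motion_vector
```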
FIG. 4B is another example of an internal block diagram of the processor.
Referring to FIG. 4B, the processor 170 or 270 of FIG. 4B has the same internal configuration unit as the processor 170 or 270 of FIG. 4A, but differs in the signal processing order. Only the difference will be described below.
The object detecting unit 434 may receive the plurality of images or the generated around view image and detect objects in the plurality of images or the generated around view image. That is, unlike FIG. 4A, the objects may be detected directly from the plurality of images or the generated around view image, rather than from a segmented image based on disparity information.
Next, the object verification unit 436 classifies and verifies the detected and separated objects based on the image segment from the segmentation unit 432 and the objects detected by the object detecting unit 434.
For this purpose, the object verification unit 436 may use an identification method based on a neural network, a support vector machine (SVM) method, an AdaBoost identification method using Haar-like features, a histograms of oriented gradients (HOG) technique, or the like.
FIG. 5 is a diagram referred to for explaining the operation method of the processor 170 or 270 of FIGS. 4A to 4B, based on images obtained respectively in the first and second frame periods.
Referring to FIG. 5, during the first and second frame periods, the plurality of cameras 295a, ..., and 295d sequentially acquire images FR1a and FR1b, respectively.
The disparity calculating unit 420 in the processor 170 or 270 receives the images FR1a and FR1b signal-processed by the image preprocessing unit 410, performs stereo matching on the received images FR1a and FR1b, and obtains a disparity map 520.
The disparity map 520 expresses the parallax between the images FR1a and FR1b as levels: the higher the disparity level, the closer the distance to the vehicle can be computed to be, and the lower the disparity level, the farther the distance.
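This inverse relation between disparity level and distance follows from the standard pinhole stereo model; a sketch under that assumption, with focal length f (in pixels) and camera baseline B (neither value is given in this disclosure):

```latex
% Assumed pinhole stereo model: Z = depth, f = focal length (pixels),
% B = baseline between the two cameras, d = disparity (pixels).
Z = \frac{f \cdot B}{d}
```

A larger disparity d thus yields a smaller depth Z, i.e., a closer object, consistent with the level-to-distance mapping above.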
On the other hand, when such a disparity map is displayed, it may be displayed so as to have a higher luminance as the disparity level becomes larger, and a lower luminance as the disparity level becomes smaller.
In the figure, the first to fourth lanes 528a, 528b, 528c, and 528d have corresponding disparity levels in the disparity map 520, and the construction area 522, the first front vehicle 524, and the second front vehicle 526 each have corresponding disparity levels.
The segmentation unit 432, the object detecting unit 434, and the object verification unit 436 perform segmentation, object detection, and object verification for at least one of the images FR1a and FR1b based on the disparity map 520.
In the figure, object detection and verification for the second image FR1b are performed using the disparity map 520.
That is, object detection and verification may be performed for the first to fourth lanes 538a, 538b, 538c, and 538d, the construction area 532, the first front vehicle 534, and the second front vehicle 536 in the image 530.
Meanwhile, as images are continuously acquired, the object tracking unit 440 can perform tracking of the verified objects.
FIGS. 6A and 6B are views referred to in the description of the operation of the autonomous traveling device of FIG. 1.
First, FIG. 6A is a diagram illustrating the situation in front of the vehicle captured by the stereo camera 195 provided inside the vehicle. In particular, the situation in front of the vehicle is shown as a bird's-eye view.
Referring to the drawing, it can be seen that a first lane 642a, a second lane 644a, a third lane 646a, and a fourth lane 648a are located from left to right, a construction area 610a is located between the first lane 642a and the second lane 644a, a first front vehicle 620a is located between the second lane 644a and the third lane 646a, and a second front vehicle 630a is located between the third lane 646a and the fourth lane 648a.
Next, FIG. 6B illustrates the situation in front of the vehicle, as grasped by the vehicle driving assistance apparatus, displayed together with various kinds of information. In particular, the image shown in FIG. 6B may be displayed on the display 180 provided in the vehicle driving assistance apparatus or on the vehicle display apparatus 400.
FIG. 6B differs from FIG. 6A in that information is displayed on the basis of the image captured by the stereo camera 195.
It can be seen that a first lane 642b, a second lane 644b, a third lane 646b, and a fourth lane 648b are located from left to right, a construction area 610b is located between the first lane 642b and the second lane 644b, a first front vehicle 620b is located between the second lane 644b and the third lane 646b, and a second front vehicle 630b is located between the third lane 646b and the fourth lane 648b.
The vehicle driving assistance apparatus 100a can perform signal processing based on the stereo image captured by the stereo camera 195 and thereby verify objects for the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b. In addition, the first lane 642b, the second lane 644b, the third lane 646b, and the fourth lane 648b can be verified.
On the other hand, in the drawing, it is exemplified that each of them is highlighted by a frame to indicate object identification for the construction area 610b, the first forward vehicle 620b, and the second forward vehicle 630b.
Meanwhile, the vehicle driving assistance apparatus 100a can compute distance information to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b based on the stereo image captured by the stereo camera 195.
In the figure, computed first distance information 611b, second distance information 621b, and third distance information 631b corresponding to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b, respectively, are displayed.
Meanwhile, the vehicle driving assistance apparatus 100a can receive sensor information about the vehicle from the ECU 770 or the sensor unit 760. In particular, it can receive and display vehicle speed information, gear information, yaw rate information indicating the speed at which the rotational angle (yaw angle) of the vehicle changes, and angle information of the vehicle.
The figure illustrates that the vehicle speed information 672, the gear information 671, and the yaw rate information 673 are displayed in an upper portion 670 of the vehicle front image, and the vehicle angle information 682 is displayed in a lower portion 680 of the vehicle front image, but various other examples are possible. In addition, vehicle width information 683 and road curvature information 681 can be displayed together with the vehicle angle information 682.
Meanwhile, the vehicle driving assistance apparatus 100a can receive speed limit information and the like for the road on which the vehicle is traveling, through the communication unit 120 or the interface unit 130. In the figure, the speed limit information 640b is displayed.
The vehicle driving assistance apparatus 100a may display the various kinds of information shown in FIG. 6B through the display 180 or the like, or may store the various kinds of information without separate display. Such information can then be utilized for various applications.
FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.
Referring to the drawings, the vehicle 200 may include an electronic control device 700 for vehicle control.
The electronic control device 700 may include an input unit 710, a communication unit 720, a memory 740, a lamp driving unit 751, a steering driving unit 752, a brake driving unit 753, a power source driving unit 754, a sunroof driving unit 755, a suspension driving unit 756, an air conditioning driving unit 757, a window driving unit 758, an airbag driving unit 759, a sensor unit 760, an ECU 770, a display 780, an audio output unit 785, an audio input unit 786, a power supply unit 790, the stereo camera 195, the plurality of cameras 295, the radar 797, an internal camera 708, a seat driving unit 761, and a driver detection sensor 799.
Meanwhile, the ECU 770 may be a concept including the processor 270 described in FIG. 3C or FIG. 3D. Alternatively, in addition to the ECU 770, a separate processor for signal processing of images from the camera may be provided.
The input unit 710 may include a plurality of buttons or a touch screen disposed inside the vehicle 200. Through a plurality of buttons or a touch screen, it is possible to perform various input operations.
The communication unit 720 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 720 can exchange data with the mobile terminal of the vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.
The communication unit 720 can receive, from the mobile terminal 600 or the server 500, schedule information related to the vehicle driver's scheduled times or movement positions, weather information, and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information.
Meanwhile, when the user boards the vehicle, the user's mobile terminal 600 and the electronic control device 700 can perform pairing with each other automatically or upon execution of an application by the user.
The memory 740 may store various data for the operation of the electronic control device 700, such as a program for processing or control by the ECU 770.
On the other hand, the memory 740 may store map information related to the vehicle driving.
The lamp driving unit 751 can control the turn-on / turn-off of the lamps disposed inside and outside the vehicle. Also, the intensity, direction, etc. of the light of the lamp can be controlled. For example, it is possible to perform control on a direction indicating lamp, a brake lamp, and the like.
The steering driving unit 752 can perform electronic control of a steering apparatus (not shown) in the vehicle 200. Thereby, the traveling direction of the vehicle can be changed.
The brake driving unit 753 can perform electronic control of a brake apparatus (not shown) in the vehicle 200. For example, the speed of the vehicle 200 can be reduced by controlling the operation of the brakes disposed on the wheels. As another example, the traveling direction of the vehicle 200 can be adjusted to the left or right by operating the brakes disposed on the left wheel and the right wheel differently.
The power source driving unit 754 can perform electronic control of the power source in the vehicle 200.
For example, when a fossil-fuel-based engine (not shown) is the power source, the power source driving unit 754 can perform electronic control of the engine. Thereby, the output torque of the engine and the like can be controlled.
As another example, when an electric motor (not shown) is the power source, the power source driving unit 754 can perform control of the motor. Thereby, the rotation speed, torque, and the like of the motor can be controlled.
The sunroof driving unit 755 can perform electronic control of a sunroof apparatus (not shown) in the vehicle 200. For example, the opening or closing of the sunroof can be controlled.
The suspension driving unit 756 can perform electronic control of a suspension apparatus (not shown) in the vehicle 200. For example, when the road surface is uneven, the suspension apparatus can be controlled so as to reduce the vibration of the vehicle 200.
The air conditioning driving unit 757 can perform electronic control on an air conditioner (not shown) in the vehicle 200. For example, when the temperature inside the vehicle is high, the air conditioner can be operated to control the cooling air to be supplied into the vehicle.
The window driving unit 758 can perform electronic control of a window apparatus (not shown) in the vehicle 200. For example, the opening or closing of the left and right windows on the sides of the vehicle can be controlled.
The airbag driving unit 759 can perform electronic control of an airbag apparatus in the vehicle 200. For example, in a dangerous situation, the airbag can be controlled to deploy.
The seat driving unit 761 can perform position control of a seat or a backrest of the vehicle 200. For example, when the driver sits in the driver's seat, the driver's seat can be adjusted to the driver by adjusting the front/rear spacing of the seat, the front/rear tilt of the backrest, and the like.
Meanwhile, the seat driving unit 761 can drive rollers disposed in the seat or the backrest and control them to provide pressure, such as a massage, to the driver.
The sensor unit 760 senses signals related to the traveling of the vehicle 200 and the like. To this end, the sensor unit 760 may include a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward movement sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, and the like.
Thereby, the sensor unit 760 can acquire sensing signals for vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, and the like.
In addition, the sensor unit 760 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
The ECU 770 can control the overall operation of each unit in the electronic control device 700.
The ECU 770 can perform a specific operation according to an input from the input unit 710, receive a signal sensed by the sensor unit 760 and transmit it to the surrounding view providing apparatus 100b, receive map information from the memory 740, and control the operation of the respective driving units 751, 752, 753, 754, 756, and the like.
In addition, the ECU 770 can receive weather information and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information, from the communication unit 720.
Meanwhile, the ECU 770 can combine the plurality of images received from the plurality of cameras 295 to generate an around view image. In particular, the around view image can be generated when the vehicle is at or below a predetermined speed or when the vehicle moves backward.
The display 780 can display an image of the area in front of the vehicle or an around view image while the vehicle is traveling. In particular, various user interfaces may also be provided in addition to the around view image.
For display of the around view image and the like, the display 780 may include a cluster or a HUD (Head Up Display) on the front inside of the vehicle. When the display 780 is a HUD, it may include a projection module that projects an image onto the windshield of the vehicle 200. Meanwhile, the display 780 may include a touch screen capable of receiving input.
The audio output unit 785 converts the electrical signal from the ECU 770 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 785 can also output a sound corresponding to the operation of the input unit 710, that is, the button.
The audio input unit 786 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the ECU 770.
The power supply unit 790 can supply power necessary for the operation of each component under the control of the ECU 770. In particular, the power supply unit 790 can receive power from a battery (not shown) inside the vehicle.
The stereo camera 195 is used for the operation of the vehicle driving assistance apparatus, as described above.
A plurality of cameras 295 are used to provide the surround view image, and for this purpose, as shown in FIG. 2C, four cameras may be provided. For example, the plurality of cameras 295a, 295b, 295c, and 295d may be disposed on the left, rear, right, and front of the vehicle, respectively. The plurality of images photographed by the plurality of cameras 295 may be transmitted to the ECU 770 or a separate processor (not shown).
The internal camera 708 captures an image of the interior of the vehicle, including the driver. For example, an RGB camera, an IR camera capable of thermal sensing, and the like can be used.
The driver detection sensor 799 detects the driver's body information. For example, the driver's blood pressure information, sleep-related signals, and the like can be detected.
The radar 797 transmits a transmission signal and receives a reception signal reflected from an object near the vehicle. Then, based on the difference between the transmission signal and the reception signal, it outputs distance information, and it can further output additional information about the object.
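The distance output follows the standard time-of-flight relation, distance = (speed of light x round-trip delay) / 2, as the minimal sketch below illustrates:

SPEED_OF_LIGHT_M_S = 299_792_458.0

def radar_distance_m(round_trip_delay_s: float) -> float:
    # Distance from the delay between the transmission and reception signals
    return SPEED_OF_LIGHT_M_S * round_trip_delay_s / 2.0

# Example: a 0.2 microsecond round trip corresponds to roughly 30 m
print(radar_distance_m(0.2e-6))  # ~29.98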
FIG. 8 is an example of an internal block diagram of the mobile camera apparatus of FIG. 1, and FIGS. 9A to 10E are drawings referred to in the description of the method of operating the mobile camera apparatus of FIG. 8.
Referring to FIG. 8, the mobile camera apparatus 300, as an active mobile camera apparatus, may include a communication unit 330, a memory 340, a moving unit 360, a processor 370, and a camera 395.
The communication unit 330 can exchange data with the vehicle 200. In particular, an image or a moving image photographed by the camera 395 can be transmitted to the communication unit 730 in the vehicle 200.
The memory 340 may store images or moving images photographed by the camera 395.
The moving unit 360 can operate to move the position of the camera 395 when the mobile camera apparatus 300 protrudes out of the vehicle.
For example, the moving unit 360 can be operated so that the lens direction of the camera 395 is directed toward the road.
On the other hand, after the camera 395 has photographed for a predetermined time, the moving unit 360 may be operated so that the camera 395 returns to the inside of the vehicle 200.
To this end, the moving unit 360 may be provided with a separate power source. In other words, it may include a motor driven by a battery, and the camera 395 can be moved in the x-, y-, and z-axis directions by the rotation of the motor.
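As an illustrative model only (the motor interface is an assumption; the disclosure states only that a battery-driven motor moves the camera along the x-, y-, and z-axes), the moving unit 360 could be represented as follows:

class MovingUnit:
    # Hypothetical three-axis carrier standing in for the moving unit 360
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]  # x, y, z in metres; all zero = stowed

    def move_to(self, x: float, y: float, z: float) -> None:
        # In hardware this would drive the battery-powered motor;
        # here we simply record the commanded position.
        self.position = [x, y, z]

    def stow(self) -> None:
        # Return the camera to the inside of the vehicle
        self.move_to(0.0, 0.0, 0.0)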
When the distance to an object around the vehicle 200 is within a predetermined distance, the processor 370 can control the camera 395 to protrude to the upper portion of the vehicle 200.
The processor 370 can control the moving unit 360 so that an image of the surroundings of the vehicle 200 is acquired through the camera 395, and can control the communication unit 330 so that the acquired image is transmitted to the vehicle 200.
The processor 370 may control the moving unit 360 to return the camera 395 to the inside of the vehicle 200 after the camera 395 has photographed for a predetermined time.
On the other hand, the processor 370 can control the moving unit 360 such that the lens direction of the camera 395 faces the road.
Accordingly, the mobile camera apparatus 300 positioned above the vehicle can capture an image or a moving image including the vehicle 200, surrounding vehicles, and the surrounding situation.
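The sequence described for the processor 370 can be summarized in control-flow form; the sketch below is a simplification under assumed interfaces (the threshold, capture duration, and the capture and transmit calls are hypothetical placeholders, not disclosed values):

import time

PREDETERMINED_DISTANCE_M = 2.0  # assumed threshold
CAPTURE_DURATION_S = 10.0       # assumed "predetermined time"

def run_mobile_camera(moving_unit, camera, comm_unit, nearest_object_distance_m):
    # Deploy, record, transmit, and stow, mirroring the described sequence
    if nearest_object_distance_m > PREDETERMINED_DISTANCE_M:
        return  # no collision risk detected; remain stowed
    moving_unit.move_to(0.0, 0.0, 1.0)  # protrude above the vehicle
    end_time = time.monotonic() + CAPTURE_DURATION_S
    while time.monotonic() < end_time:
        frame = camera.capture()          # hypothetical capture call
        comm_unit.send_to_vehicle(frame)  # hypothetical transmit call
    moving_unit.stow()  # return to the inside of the vehicle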
On the other hand, FIG. 9A illustrates a case where the vehicle 200 travels on a curved road.
As shown in FIG. 9B, in order to determine whether there is a possibility of a collision with the other vehicle 200b located on the right rear side of the vehicle 200, the processor 770 of the vehicle 200 determines whether the distance to an object near the vehicle 200 is within a predetermined distance, based on the images obtained from the stereo camera 195 and the cameras 295b, 295c, and 295d of the surrounding view providing apparatus 100b.
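One standard way an image-based distance check of this kind can be realized with the stereo camera 195 is through disparity: for a rectified stereo pair, depth = focal length x baseline / disparity. The sketch below shows this relation; the focal length and baseline values are placeholder assumptions, not calibration data from the disclosure:

def stereo_depth_m(disparity_px: float,
                   focal_length_px: float = 700.0,   # assumed calibration
                   baseline_m: float = 0.3) -> float:  # assumed camera spacing
    # Depth of a point from its disparity in a rectified stereo pair
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

def within_predetermined_distance(depth_m: float,
                                  threshold_m: float = 2.0) -> bool:
    # True when the detected object is within the predetermined distance
    return depth_m <= threshold_m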
The processor 770 of the vehicle 200 determines that there is a high possibility of an accident when the distance to the object around the vehicle 200 is within the predetermined distance, and can control a part of the roof 789a of the vehicle to open.
Then, the processor 770 of the vehicle 200 can control the mobile camera apparatus 300 to protrude above the vehicle through the opened part of the roof 789a of the vehicle, as shown in FIG. 9C.
As shown in FIG. 9D, the mobile camera apparatus 300 can fly in the right rear direction (DDR) of the vehicle and obtain an image or a moving image of the region 920 including the expected contact point near the right rear side of the vehicle 200.
Then, the mobile camera apparatus 300 transmits the photographed image or moving image to the vehicle 200, as shown in FIG. 9E. Accordingly, an image or a moving image taken immediately before or at the time of an accident can be stored in the memory 740 in the vehicle 200 or the like. Therefore, the convenience of the driver can be increased.
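The pre-accident storage described here behaves like an event recorder. A common realization, sketched below on assumed interfaces (the 'storage' object stands in for the memory 740), is a fixed-length ring buffer that is persisted when a collision-risk event fires:

from collections import deque

class EventRecorder:
    # Keep only the most recent frames; persist them when an event is flagged
    def __init__(self, max_frames: int = 300):  # e.g. about 10 s at 30 fps
        self.buffer = deque(maxlen=max_frames)

    def add_frame(self, frame) -> None:
        self.buffer.append(frame)

    def on_event(self, storage) -> None:
        # 'storage' is a hypothetical interface to the vehicle memory 740
        storage.save(list(self.buffer))  # footage from just before the event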
On the other hand, the mobile camera apparatus 300 can return to the inside of the vehicle 200 after photographing for a predetermined time.
On the other hand, FIG. 10A illustrates a case where a preceding vehicle moves backward while the vehicle 200 is traveling at a constant speed.
As shown in FIG. 10B, in order to determine whether there is a possibility of a collision with the other vehicle 200c located on the left front of the vehicle 200, the processor 770 of the vehicle 200 determines whether the distance to an object near the vehicle 200 is within a predetermined distance, based on the images obtained from the stereo camera 195 and the cameras 295b, 295c, and 295d of the surrounding view providing apparatus 100b.
The processor 770 of the vehicle 200 determines that there is a high possibility of an accident when the distance to the object around the vehicle 200 is within the predetermined distance, and can control a part of the roof 789a of the vehicle to open.
Then, the processor 770 of the vehicle 200 can control the mobile camera apparatus 300 to protrude above the vehicle through the opened part of the roof 789a, as shown in FIG. 10C.
Accordingly, the mobile camera apparatus 300 can fly in the left front direction (DFL) of the vehicle and, as shown in FIG. 10D, obtain an image or a moving image of a region including the expected contact point near the left front of the vehicle 200.
Then, the mobile camera apparatus 300 transmits the photographed image or moving image to the vehicle 200, as shown in FIG. 10E. Accordingly, an image or a moving image taken immediately before or at the time of an accident can be stored in the memory 740 in the vehicle 200 or the like. Therefore, the convenience of the driver can be increased.
On the other hand, the mobile camera apparatus 300 can return to the inside of the vehicle 200 after photographing for a predetermined time.
Meanwhile, the method of operating the mobile camera apparatus or the vehicle of the present invention can be implemented as processor-readable code on a recording medium readable by a processor provided in the mobile camera apparatus or the vehicle, respectively. The processor-readable recording medium includes all kinds of recording devices in which data readable by a processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments, and it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.