
WO2020194686A1 - Driving assistance system for saddle-ride type vehicles - Google Patents


Info

Publication number
WO2020194686A1
WO2020194686A1 (PCT/JP2019/013690)
Authority
WO
WIPO (PCT)
Prior art keywords
driver
line
sight
unit
saddle
Prior art date
Application number
PCT/JP2019/013690
Other languages
French (fr)
Japanese (ja)
Inventor
弘至 巽
拡 前田
Original Assignee
本田技研工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority to JP2021508616A (JP7232320B2)
Priority to PCT/JP2019/013690 (WO2020194686A1)
Publication of WO2020194686A1

Classifications

    • A: HUMAN NECESSITIES
    • A42: HEADWEAR
    • A42B: HATS; HEAD COVERINGS
    • A42B3/00: Helmets; Helmet covers; Other protective head coverings
    • A42B3/04: Parts, details or accessories of helmets
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00: Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04: Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62J: CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J99/00: Subject matter not provided for in other groups of this subclass
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • The present invention relates to a driving support system for a saddle-riding vehicle.
  • Patent Document 1 discloses a direction information notification device that conveys at least direction information to a helmet wearer by localizing a sound image, the device including elements arranged at a plurality of places on the helmet, each of which responds to an external signal.
  • In a saddle-riding vehicle, however, the driver is not in an environment surrounded by doors and a roof as in a four-wheeled vehicle, and is therefore easily affected by noise.
  • The driver is also subject to vibration from the road surface while the vehicle is traveling. Sound and vibration alone, as in the prior art, may therefore be insufficient to inform the driver of a direction that should be recognized.
  • The present invention therefore provides a driving support system for a saddle-riding vehicle that can make the driver aware of a direction that should be recognized.
  • The driving support system for a saddle-riding vehicle includes a line-of-sight direction recognition unit (150) that recognizes the line-of-sight direction (De) of the driver (J), an object direction recognition unit (400) that recognizes the direction (Do), with respect to the driver (J), of an object (O) around the own vehicle, and a light source (540) that emits light when the line-of-sight direction (De) and the direction (Do) of the object (O) differ.
  • With this configuration, compared with a configuration that relies on sound and vibration as in the prior art, the light emission can effectively make the driver recognize that an object lies in a direction different from the driver's line of sight. The driver can thus be informed of the direction to be recognized.
  • The light source (540) may be made to emit light when the line-of-sight direction (De) is directed to a direction other than that of the object (O).
  • With this configuration, the light source emits light when the driver's line of sight is directed away from objects around the own vehicle, so the driver can be made to recognize that an object lies in a direction different from the line-of-sight direction.
  • The light source (540) may also be made to emit light when the line-of-sight direction (De) is directed downward and differs from the direction (Do) of the object (O).
  • When the driver's line of sight is directed to the meter device, for example, it points further downward than when it is directed to objects around the own vehicle. With the above configuration, the light source therefore emits light when the driver's line of sight is directed to the meter device, and the driver can be made to recognize that an object lies in a direction different from the line-of-sight direction.
  • When the angle (θ1, θ2) formed between the line-of-sight direction (De) and the direction (Do) of the object (O) with respect to the driver (J) is equal to or greater than a predetermined angle, the light source (540) may emit light.
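The angle test above can be sketched in Python. This is an illustrative sketch only, using 2-D direction vectors and an assumed 30-degree threshold; it is not the patent's implementation.

```python
import math

def gaze_object_angle(gaze_dir, object_dir):
    """Angle in degrees between the driver's line-of-sight vector (De)
    and the direction vector toward the object (Do), both 2-D (x, y)."""
    dot = gaze_dir[0] * object_dir[0] + gaze_dir[1] * object_dir[1]
    norm = math.hypot(*gaze_dir) * math.hypot(*object_dir)
    cos_a = max(-1.0, min(1.0, dot / norm))  # clamp for float safety
    return math.degrees(math.acos(cos_a))

def should_emit(gaze_dir, object_dir, threshold_deg=30.0):
    """Light the source only when the two directions differ by at least
    the predetermined angle (threshold_deg is an assumed value)."""
    return gaze_object_angle(gaze_dir, object_dir) >= threshold_deg
```

A gaze straight ahead with an object 90 degrees to the side would trigger the light; a nearly aligned object would not.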
  • The light source (540) may change at least one of its emission intensity, emission color, and emission period according to the distance between the own vehicle and the object (O).
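One possible mapping from distance to emission parameters might look like the following sketch; all distance thresholds, colors, and periods here are illustrative assumptions, not values from the patent.

```python
def emission_params(distance_m):
    """Choose light-emission parameters from the vehicle-object
    distance. All thresholds and values are illustrative assumptions."""
    if distance_m < 5.0:       # very close: bright red, fast blinking
        return {"intensity": 1.0, "color": "red", "period_s": 0.2}
    if distance_m < 15.0:      # mid range: dimmer amber, slower blinking
        return {"intensity": 0.6, "color": "amber", "period_s": 0.5}
    return {"intensity": 0.3, "color": "green", "period_s": 1.0}
```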
  • The light source (540) may be provided on the helmet (40); when the line-of-sight direction (De) and the direction (Do) of the object (O) with respect to the driver (J) differ, light may be emitted on the same side as the object (O) with respect to the line of sight of the driver (J).
  • The driver can then shift the line of sight toward the object by directing it to the emitting light source. This makes it possible to guide the driver's line of sight and make the driver recognize the position of the object.
  • The light source (540) may instead be provided in the meter device (30). When the driver (J) directs the line of sight to the meter device (30) and the line-of-sight direction (De) differs from the direction (Do) of the object (O) with respect to the driver (J), light may be emitted on the same side as the object (O) with respect to the line of sight of the driver (J).
  • Then, when the driver's line of sight is on the meter device, the driver can shift the line of sight toward the object by directing it to the emitting light source. The driver's line of sight can thus be guided so that the position of the object is recognized more reliably.
  • A vibrating unit (560) that vibrates in conjunction with the light emission of the light source (540) and transmits vibration to the driver (J) may be further provided.
  • The vibrating unit (560) may vibrate at a position corresponding to the direction of the object (O) with respect to the driver (J).
  • The driver can then shift the line of sight toward the object by directing it to the vibrating part. This makes it possible to guide the driver's line of sight and make the driver recognize the position of the object.
  • The vibrating unit (560) may be provided in at least one of the helmet (40), the steering handle (16), the fuel tank (24), and the steps (23).
  • Because the vibrating unit is arranged in a portion that contacts the driver, its vibration can be transmitted to the driver effectively.
  • A warning sound generating unit (550) that emits a warning sound when the distance between the own vehicle and the object (O) is smaller than a predetermined distance may be further provided.
  • The warning sound generating unit (550) may emit the warning sound from a position corresponding to the direction (Do) of the object (O) with respect to the driver (J).
  • The driver can then shift the line of sight toward the object by directing it to the sound-source side of the warning sound. This makes it possible to guide the driver's line of sight and make the driver recognize the position of the object.
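Emitting the warning sound "from a position corresponding to the object direction" could be approximated with two helmet speakers and a constant-power pan. The azimuth convention (negative = left, positive = right) and the panning law below are assumptions for illustration.

```python
import math

def stereo_gains(object_azimuth_deg):
    """Constant-power pan between left/right speakers. -90 deg places
    the sound fully left, +90 deg fully right (assumed convention)."""
    az = max(-90.0, min(90.0, object_azimuth_deg))
    pan = (az + 90.0) / 180.0              # 0.0 (left) .. 1.0 (right)
    left = math.cos(pan * math.pi / 2.0)
    right = math.sin(pan * math.pi / 2.0)
    return left, right
```

An object dead ahead splits the sound equally; an object at +90 degrees plays almost entirely in the right speaker.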
  • The warning sound generating unit (550) may be a device retrofitted to the helmet (40).
  • In that case, a warning sound generating unit can be provided on an existing helmet, so a driving support system for a saddle-riding vehicle that exhibits the above effects can be introduced easily.
  • Alternatively, the warning sound generating unit (550) may be provided on the helmet (40) itself.
  • In that case, the driver can perceive the warning sound effectively.
  • Hereinafter, a driving support system for a saddle-riding vehicle according to the present embodiment will be described with reference to the drawings.
  • Automatic driving is a type of driving support in which the vehicle travels in a state that, in principle, requires no operation by the driver.
  • The degrees of driving support include a first degree, in which driving is assisted by operating a driving support device such as ACC (Adaptive Cruise Control System) or LKAS (Lane Keeping Assistance System), and a second degree, in which the degree of control is higher: at least one of acceleration/deceleration and steering is controlled automatically to perform automatic driving without the driver operating the driving operator, although a certain degree of responsibility remains with the driver.
  • The second degree and the third degree of driving support correspond to automatic driving.
  • FIG. 1 is a configuration diagram of a driving support system according to the first embodiment.
  • the vehicle equipped with the driving support system 1 shown in FIG. 1 is a saddle-riding vehicle such as a two-wheeled vehicle or a three-wheeled vehicle.
  • the prime mover of a vehicle is an internal combustion engine such as a gasoline engine, an electric motor, or a combination of an internal combustion engine and an electric motor.
  • the electric motor operates by using the electric power generated by the generator connected to the internal combustion engine or the electric power generated by the secondary battery or the fuel cell.
  • The driving support system 1 includes a camera 51, a radar device 52, a finder 53, an object recognition device 54 (object direction recognition unit), a communication device 55, an HMI (Human Machine Interface) 56, a vehicle sensor 57, a navigation device 60, an MPU (Map Positioning Unit) 70, a driving operator 80, a driver monitoring camera 90, a control device 100, a traveling driving force output device 500, a braking device 510, a steering device 520, and a line-of-sight guidance unit 530. These devices and instruments are connected to each other by multiple communication lines such as CAN (Controller Area Network) communication lines, serial communication lines, wireless communication networks, and the like.
  • the camera 51 is a digital camera that uses a solid-state image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
  • the camera 51 is attached to an arbitrary position of the vehicle (hereinafter, own vehicle M) on which the driving support system 1 is mounted.
  • the camera 51 periodically and repeatedly images the periphery of the own vehicle M, for example.
  • the camera 51 may be a stereo camera.
  • the radar device 52 radiates radio waves such as millimeter waves around the own vehicle M, and also detects radio waves (reflected waves) reflected by the object to detect at least the position (distance and direction) of the object.
  • the radar device 52 is attached to an arbitrary position of the own vehicle M.
  • the radar device 52 may detect the position and speed of the object by the FM-CW (Frequency Modulated Continuous Wave) method.
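The FM-CW relations behind such a radar can be sketched as follows. These are the textbook formulas for a linear sweep (R = c·f_b·T / 2B; v = c·f_d / 2f_c), offered as an illustrative sketch; the sweep and carrier parameters in the example are arbitrary assumptions.

```python
C = 299_792_458.0  # speed of light [m/s]

def fmcw_range(beat_hz, bandwidth_hz, sweep_time_s):
    """Range from the beat frequency of a linear FM-CW sweep, ignoring
    Doppler: R = c * f_b * T / (2 * B)."""
    return C * beat_hz * sweep_time_s / (2.0 * bandwidth_hz)

def fmcw_range_and_speed(f_up_hz, f_down_hz, bandwidth_hz, sweep_time_s, carrier_hz):
    """Triangular sweep: range from the mean of the up/down beat
    frequencies, radial speed from their half-difference (Doppler)."""
    f_range = (f_up_hz + f_down_hz) / 2.0
    f_doppler = (f_down_hz - f_up_hz) / 2.0
    rng = C * f_range * sweep_time_s / (2.0 * bandwidth_hz)
    speed = C * f_doppler / (2.0 * carrier_hz)
    return rng, speed
```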
  • The finder 53 is a LIDAR (Light Detection and Ranging) sensor.
  • the finder 53 irradiates the periphery of the own vehicle M with light and measures the scattered light.
  • the finder 53 detects the distance to the target based on the time from light emission to light reception.
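The time-of-flight distance measurement just described reduces to one line; halving accounts for the light traveling out and back.

```python
C = 299_792_458.0  # speed of light [m/s]

def tof_distance(round_trip_s):
    """Distance to the target from the time between light emission and
    reception of the scattered return; halved for the round trip."""
    return C * round_trip_s / 2.0
```

A 1 microsecond round trip corresponds to a target roughly 150 m away.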
  • the light to be irradiated is, for example, a pulsed laser beam.
  • the finder 53 is attached to an arbitrary position of the own vehicle M.
  • The object recognition device 54 performs sensor fusion processing on the detection results of some or all of the camera 51, the radar device 52, and the finder 53 to recognize the position, type, speed, and the like of objects around the own vehicle M.
  • the object recognition device 54 outputs the recognition result to the control device 100.
  • the object recognition device 54 may output the detection results of the camera 51, the radar device 52, and the finder 53 to the control device 100 as they are.
  • The communication device 55 communicates with other vehicles in the vicinity of the own vehicle M using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), or DSRC (Dedicated Short Range Communication), or communicates with various server devices via a wireless base station.
  • the HMI 56 presents various information to the driver of the own vehicle M and accepts input operations by the driver.
  • the HMI 56 includes various display devices, speakers, buzzers, touch panels, switches, keys and the like.
  • the vehicle sensor 57 includes a vehicle speed sensor that detects the speed of the own vehicle M, an acceleration sensor that detects the acceleration, a yaw rate sensor that detects the angular velocity around the vertical axis, an orientation sensor that detects the direction of the own vehicle M, and the like.
  • the navigation device 60 includes, for example, a GNSS (Global Navigation Satellite System) receiver 61, a navigation HMI 62, and a route determination unit 63.
  • the navigation device 60 holds the first map information 64 in a storage device such as an HDD (Hard Disk Drive) or a flash memory.
  • the GNSS receiver 61 identifies the position of the own vehicle M based on the signal received from the GNSS satellite. The position of the own vehicle M may be specified or complemented by an INS (Inertial Navigation System) using the output of the vehicle sensor 57.
  • the navigation HMI 62 includes a display device, a speaker, a touch panel, keys, and the like. The navigation HMI 62 may be partially or wholly shared with the above-mentioned HMI 56.
  • The route determination unit 63 determines a route (hereinafter, the route on the map) from the position of the own vehicle M specified by the GNSS receiver 61 (or an arbitrary input position) to the destination input by the occupant using the navigation HMI 62, with reference to the first map information 64.
  • The first map information 64 is information in which the road shape is expressed by, for example, links indicating roads and nodes connected by the links.
  • the first map information 64 may include road curvature, POI (Point Of Interest) information, and the like.
  • the route on the map is output to the MPU 70.
  • the navigation device 60 may provide route guidance using the navigation HMI 62 based on the route on the map.
  • the navigation device 60 may be realized by, for example, the function of a terminal device such as a smartphone or a tablet terminal owned by an occupant.
  • the navigation device 60 may transmit the current position and the destination to the navigation server via the communication device 55, and may acquire a route equivalent to the route on the map from the navigation server.
  • the MPU 70 includes, for example, a recommended lane determination unit 71.
  • the MPU 70 holds the second map information 72 in a storage device such as an HDD or a flash memory.
  • the recommended lane determination unit 71 divides the route on the map provided by the navigation device 60 into a plurality of blocks (for example, divides the route every 100 [m] with respect to the vehicle traveling direction), and refers to the second map information 72. Determine the recommended lane for each block.
  • For example, the recommended lane determination unit 71 determines in which lane from the left to travel. When there is a branch point on the route on the map, the recommended lane determination unit 71 determines the recommended lane so that the own vehicle M can travel on a reasonable route to proceed to the branch destination.
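The block division described above (for example, every 100 m along the traveling direction) can be sketched as follows; this is an illustrative sketch of the splitting step only, not the lane-selection logic itself.

```python
def split_into_blocks(route_length_m, block_m=100.0):
    """Divide a route into consecutive (start, end) blocks of block_m
    metres along the traveling direction; the last block may be short.
    A recommended lane would then be chosen per block."""
    blocks, start = [], 0.0
    while start < route_length_m:
        end = min(start + block_m, route_length_m)
        blocks.append((start, end))
        start = end
    return blocks
```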
  • the second map information 72 is more accurate map information than the first map information 64.
  • the second map information 72 includes, for example, information on the center of the lane, information on the boundary of the lane, and the like. Further, the second map information 72 may include road information, traffic regulation information, address information (address / zip code), facility information, telephone number information, and the like.
  • the second map information 72 may be updated at any time by the communication device 55 communicating with another device.
  • The driving operator 80 includes operators such as an accelerator grip, a brake pedal, a brake lever, a shift pedal, and a steering handle.
  • A sensor that detects the amount of operation or the presence or absence of operation is attached to the driving operator 80, and the detection result is output to some or all of the control device 100, the traveling driving force output device 500, the brake device 510, and the steering device 520.
  • the driver monitoring camera 90 is arranged at a position where the driver sitting on the seat can be imaged.
  • For example, the driver monitoring camera 90 is attached to the front portion of the own vehicle M.
  • the driver monitoring camera 90 takes an image of the face of the driver sitting on the seat.
  • The driver monitoring camera 90 is a digital camera that uses a solid-state image sensor such as a CCD or CMOS.
  • the driver monitoring camera 90 periodically images the driver, for example.
  • the captured image of the driver monitoring camera 90 is output to the control device 100.
  • the control device 100 includes a master control unit 110, a driving support control unit 200, an automatic driving control unit 300, and a line-of-sight guidance control unit 400.
  • The master control unit 110 may be integrated into either the driving support control unit 200 or the automatic driving control unit 300.
  • the master control unit 110 switches the degree of driving support and controls the HMI 56.
  • the master control unit 110 includes a switching control unit 120, an HMI control unit 130, an operator state determination unit 140, and an occupant condition monitoring unit 150 (line-of-sight direction recognition unit).
  • the switching control unit 120, the HMI control unit 130, the operator state determination unit 140, and the occupant condition monitoring unit 150 are each realized by executing a program by a processor such as a CPU (Central Processing Unit).
  • Some or all of these functional units may be realized by hardware such as an LSI (Large Scale Integration) circuit, an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array), or by the cooperation of software and hardware.
  • The switching control unit 120 switches the degree of driving support based on, for example, an operation signal input from a predetermined switch included in the HMI 56. The switching control unit 120 may also cancel the driving support and switch to manual driving based on an operation instructing acceleration, deceleration, or steering on a driving operator 80 such as the accelerator grip, the brake pedal, the brake lever, or the steering handle.
  • the switching control unit 120 may switch the degree of driving support based on the action plan generated by the action plan generation unit 330 described later. For example, the switching control unit 120 may end the driving support at the scheduled end point of the automatic driving defined by the action plan.
  • The HMI control unit 130 causes the HMI 56 to output a notification or the like related to switching the degree of driving support. The HMI control unit 130 also switches the content output to the HMI 56 when a predetermined event occurs for the own vehicle M. Further, the HMI control unit 130 may output information on the determination results of one or both of the operator state determination unit 140 and the occupant condition monitoring unit 150 to the HMI 56, and may output information received by the HMI 56 to one or both of the driving support control unit 200 and the automatic driving control unit 300.
  • The operator state determination unit 140 determines, for example, whether the steering handle included in the driving operator 80 is in an operated state (specifically, whether an intentional operation is actually being performed, whether the handle can be operated immediately, or whether it is being gripped).
  • the occupant condition monitoring unit 150 monitors the driver's condition based on the image captured by the driver monitoring camera 90.
  • the occupant condition monitoring unit 150 monitors that the driver is continuously monitoring the traffic conditions in the surrounding area.
  • the occupant condition monitoring unit 150 acquires the driver's face image from the image captured by the driver monitoring camera 90, and recognizes the driver's line-of-sight direction from the acquired face image.
  • the occupant condition monitoring unit 150 may recognize the line-of-sight direction of the occupant from the image captured by the driver monitoring camera 90 by deep learning using a neural network or the like.
  • the driving support control unit 200 executes the first degree of driving support.
  • the driving support control unit 200 executes, for example, ACC, LKAS, or other driving support control.
  • In ACC, for example, the driving support control unit 200 controls the traveling driving force output device 500 and the braking device 510 so that the own vehicle M travels while keeping the distance to the preceding vehicle constant. That is, the driving support control unit 200 performs acceleration/deceleration control (speed control) based on the inter-vehicle distance to the vehicle in front.
  • In LKAS, the driving support control unit 200 controls the steering device 520 so that the own vehicle M travels while maintaining (keeping) the lane in which it is currently traveling. That is, the driving support control unit 200 performs steering control for lane keeping.
  • The first degree of driving support may include various controls, other than automatic driving (the second and third degrees), that do not require an operation on the driving operator 80.
  • the automatic driving control unit 300 executes the driving support of the second degree and the third degree.
  • The automatic driving control unit 300 includes, for example, a first control unit 310 and a second control unit 350.
  • Each of the first control unit 310 and the second control unit 350 is realized by, for example, a hardware processor such as a CPU executing a program (software).
  • some or all of these components may be realized by hardware such as LSI, ASIC, FPGA, GPU, or may be realized by collaboration between software and hardware.
  • the first control unit 310 includes, for example, a recognition unit 320 and an action plan generation unit 330.
  • The first control unit 310 realizes, for example, a function based on AI (Artificial Intelligence) and a function based on a model given in advance, in parallel.
  • For example, the "intersection recognition" function may be realized by executing, in parallel, recognition of an intersection by deep learning or the like and recognition based on predetermined conditions (signals, road markings, and the like that allow pattern matching), scoring both, and evaluating them comprehensively. This ensures the reliability of automatic driving.
  • the recognition unit 320 recognizes states such as the position, speed, and acceleration of surrounding vehicles based on the information input from the camera 51, the radar device 52, and the finder 53 via the object recognition device 54.
  • the positions of peripheral vehicles are recognized as, for example, positions on absolute coordinates with the representative point (center of gravity, center of drive shaft, etc.) of the own vehicle M as the origin, and are used for control.
  • The position of a peripheral vehicle may be represented by a representative point such as the center of gravity or a corner of that vehicle, or by a region.
  • the "state" of the surrounding vehicle may include the acceleration or jerk of the object, or the "behavioral state” (eg, whether or not the vehicle is changing lanes or is about to change lanes).
  • the recognition unit 320 recognizes, for example, the lane (traveling lane) in which the own vehicle M is traveling.
  • For example, the recognition unit 320 recognizes the traveling lane by comparing the pattern of road marking lines obtained from the second map information 72 (for example, an arrangement of solid and broken lines) with the pattern of road marking lines around the own vehicle M recognized from the image captured by the camera 51.
  • The recognition unit 320 may recognize the traveling lane by recognizing not only road marking lines but also running road boundaries (road boundaries) including road marking lines, shoulders, curbs, median strips, guardrails, and the like. In this recognition, the position of the own vehicle M acquired from the navigation device 60 or the processing result of the INS may be taken into account.
  • the recognition unit 320 recognizes a stop line, an obstacle, a red light, a tollgate, other road events, and the like.
  • When recognizing the traveling lane, the recognition unit 320 recognizes the position and posture of the own vehicle M with respect to the traveling lane.
  • FIG. 2 is a diagram showing an example of how the recognition unit recognizes the relative position and posture of the own vehicle with respect to the traveling lane.
  • The recognition unit 320 may recognize, for example, the deviation OS of the reference point (for example, the center of gravity) of the own vehicle M from the lane center CL, and the angle θ formed between the traveling direction of the own vehicle M and a line along the lane center CL, as the relative position and posture of the own vehicle M with respect to the traveling lane L1.
  • Alternatively, the recognition unit 320 may recognize the position of the reference point of the own vehicle M with respect to either side end of the traveling lane L1 (a road marking line or road boundary) as the relative position of the own vehicle M with respect to the traveling lane.
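The deviation OS and angle θ can be computed from vehicle and lane poses as in this sketch; the sign convention (positive OS toward the lane's left) is an assumption for illustration.

```python
import math

def relative_pose(vehicle_xy, vehicle_heading_rad, lane_pt_xy, lane_heading_rad):
    """Deviation OS of the vehicle reference point from the lane centre
    CL (signed; positive to the lane's left, an assumed convention) and
    angle theta between the vehicle's traveling direction and the lane
    direction, wrapped to [-pi, pi)."""
    dx = vehicle_xy[0] - lane_pt_xy[0]
    dy = vehicle_xy[1] - lane_pt_xy[1]
    # signed lateral offset: projection onto the lane's left normal
    os_dev = -dx * math.sin(lane_heading_rad) + dy * math.cos(lane_heading_rad)
    theta = (vehicle_heading_rad - lane_heading_rad + math.pi) % (2.0 * math.pi) - math.pi
    return os_dev, theta
```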
  • the action plan generation unit 330 generates an action plan for driving the own vehicle M by automatic driving.
  • Basically, the action plan generation unit 330 generates a target trajectory in which the own vehicle M travels in the recommended lane determined by the recommended lane determination unit 71 and travels automatically (without depending on the driver's operation) so as to respond to the surrounding conditions of the own vehicle M.
  • the target trajectory includes, for example, a position element that determines the position of the own vehicle M in the future and a speed element that determines the speed and acceleration of the own vehicle M in the future.
  • the action plan generation unit 330 determines a plurality of points (track points) that the own vehicle M should reach in order as position elements of the target track.
  • For example, a track point is a point that the own vehicle M should reach at every predetermined travel distance (for example, every few meters).
  • The predetermined travel distance may be calculated as, for example, the distance along the route.
  • The action plan generation unit 330 determines the target speed and target acceleration for each predetermined sampling time (for example, a few tenths of a second) as the speed elements of the target trajectory.
  • Alternatively, a track point may be the position that the own vehicle M should reach at each predetermined sampling time. In this case, the target speed and target acceleration are determined by the sampling time and the interval between the track points.
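When track points are laid out per sampling time, the speed they imply follows directly from their spacing, as this illustrative sketch shows.

```python
import math

def speeds_from_track_points(points_xy, dt_s):
    """Implied target speed between consecutive track points that are
    placed at every sampling time dt_s: spacing divided by dt_s."""
    return [math.hypot(x1 - x0, y1 - y0) / dt_s
            for (x0, y0), (x1, y1) in zip(points_xy, points_xy[1:])]
```

Points 1 m apart at a 0.1 s sampling time imply 10 m/s; doubling the spacing doubles the target speed.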
  • the action plan generation unit 330 may set an event for automatic driving when generating a target trajectory.
  • Examples of the automatic driving event include a constant speed driving event in which the vehicle travels in the same lane at a constant speed, a following driving event in which the vehicle follows the vehicle in front, and a lane change event in which the vehicle M changes the traveling lane.
  • the action plan generation unit 330 generates a target trajectory according to the activated event.
  • FIG. 3 is a diagram showing how a target trajectory is generated based on the recommended lane.
  • the recommended lane is set so as to be convenient for traveling along the route to the destination.
  • The action plan generation unit 330 activates a lane change event, a branch event, a merging event, and the like, as necessary. If it becomes necessary to avoid an obstacle during the execution of an event, an avoidance trajectory is generated as shown in the figure.
  • The second control unit 350 controls the traveling driving force output device 500, the brake device 510, and the steering device 520 so that the own vehicle M passes along the target trajectory generated by the action plan generation unit 330 at the scheduled times.
  • the second control unit 350 includes, for example, an acquisition unit 352, a speed control unit 354, and a steering control unit 356.
  • The acquisition unit 352 acquires the information of the target trajectory (track points) generated by the action plan generation unit 330 and stores it in a memory (not shown).
  • the speed control unit 354 controls the traveling driving force output device 500 or the brake device 510 based on the speed element associated with the target trajectory stored in the memory.
  • the steering control unit 356 controls the steering device 520 according to the degree of bending of the target trajectory stored in the memory.
  • the processing of the speed control unit 354 and the steering control unit 356 is realized by, for example, a combination of feedforward control and feedback control.
  • the steering control unit 356 executes a combination of feedforward control according to the curvature of the road in front of the own vehicle M and feedback control based on the deviation from the target trajectory.
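The feedforward-plus-feedback combination could be sketched as follows, using the kinematic relation delta = atan(L·kappa) for the curvature feedforward and a proportional correction for the trajectory deviation; the gains and the feedback form are illustrative assumptions, not the patent's control law.

```python
import math

def steering_command(curvature_1pm, wheelbase_m, lateral_err_m, heading_err_rad,
                     k_e=0.5, k_th=1.0):
    """Feedforward steering angle from the road curvature ahead
    (kinematic bicycle relation delta = atan(L * kappa)) plus a
    proportional feedback on the deviation from the target trajectory.
    The gains k_e and k_th are illustrative assumptions."""
    feedforward = math.atan(wheelbase_m * curvature_1pm)
    feedback = -k_e * lateral_err_m - k_th * heading_err_rad
    return feedforward + feedback
```

On a straight road with no error the command is zero; a positive lateral error steers back toward the trajectory.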
  • the line-of-sight guidance control unit 400 recognizes the direction of an object around the own vehicle M with respect to the position of the driver based on the information recognized by the object recognition device 54.
  • Hereinafter, the direction of an object around the own vehicle M with respect to the position of the driver is referred to as the object direction.
  • the position of the driver is recognized by, for example, an image captured by the driver monitoring camera 90.
  • the position of the driver is represented by, for example, the center of gravity of the driver's head.
  • the position of the driver may be fixedly set with respect to the vehicle in advance.
  • the line-of-sight guidance control unit 400 recognizes the distance between the own vehicle M and an object around the own vehicle M.
  • the line-of-sight guidance control unit 400 compares the direction of the object with the direction of the driver's line of sight recognized by the occupant condition monitoring unit 150.
  • the line-of-sight guidance control unit 400 determines the operation of the line-of-sight guidance unit 530, which will be described later, based on the result of comparison between the object direction and the line-of-sight direction of the driver and the distance between the own vehicle M and the object.
  • the line-of-sight guidance control unit 400 outputs the determined operation command of the line-of-sight guidance unit 530 to the helmet control unit 42 of the helmet 40, which will be described later.
  • a part of the line-of-sight guidance control unit 400 may be integrated with the recognition unit 320 of the automatic driving control unit 300. The details of the function of the line-of-sight guidance control unit 400 will be described later.
  • the traveling driving force output device 500 outputs a traveling driving force (torque) for the own vehicle M to travel to the drive wheels.
  • the traveling driving force output device 500 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an ECU (Electronic Control Unit) that controls them.
  • the ECU controls the above configuration according to the information input from the second control unit 350 or the information input from the operation operator 80.
  • the brake device 510 includes, for example, a brake caliper, a cylinder that transmits hydraulic pressure to the brake caliper, an electric motor that generates hydraulic pressure in the cylinder, and a brake ECU.
  • the brake ECU controls the electric motor according to the information input from the second control unit 350 or the information input from the operation operator 80 so that the brake torque corresponding to the braking operation is output to each wheel.
  • the brake device 510 may include, as a backup, a mechanism for transmitting the hydraulic pressure generated by the operation of the brake lever or the brake pedal included in the operation operator 80 to the cylinder via the master cylinder.
  • the brake device 510 is not limited to the configuration described above, and may be an electronically controlled hydraulic brake device that controls an actuator according to the information input from the second control unit 350 and transmits the hydraulic pressure of the master cylinder to the cylinder.
  • the steering device 520 includes, for example, a steering ECU and an electric motor.
  • the electric motor changes the direction of the steering wheels (front wheels), for example.
  • the steering ECU drives the electric motor according to the information input from the second control unit 350 or the information input from the operation operator 80 to change the direction of the steering wheels.
  • the line-of-sight guidance unit 530 guides the driver's line of sight in a predetermined direction determined by the line-of-sight guidance control unit 400.
  • the line-of-sight guidance unit 530 includes a light emitting unit 540 (light source), a warning sound generating unit 550, and a vibrating unit 560.
  • the light emitting unit 540 and the warning sound generating unit 550 are provided on a helmet worn by the driver.
  • the vibrating unit 560 is provided at a position in the vehicle that comes into contact with the driver. Details of the light emitting unit 540, the warning sound generating unit 550, and the vibrating unit 560 will be described later.
  • FIG. 4 is a left side view showing the motorcycle of the first embodiment.
  • the motorcycle 10 is a saddle-riding vehicle equipped with the driving support system 1 of the embodiment.
  • the motorcycle 10 mainly includes a front wheel 11 which is a steering wheel, a rear wheel 12 which is a driving wheel, and a vehicle body frame 20 which supports a prime mover 13 (an engine in the illustrated example).
  • the front wheel 11 is steerably supported by the vehicle body frame 20 via a steering mechanism.
  • the steering mechanism includes a front fork 14 that supports the front wheels 11 and a steering stem 15 that supports the front fork 14.
  • a steering handle 16 held by the driver J is attached to the upper part of the steering stem 15.
  • the front wheels 11 are braked by the braking device 510.
  • the rear wheel 12 is supported by the rear end of the swing arm 17 extending in the front-rear direction at the rear of the vehicle.
  • the front end portion of the swing arm 17 is supported by the vehicle body frame 20 so as to be able to swing up and down.
  • the rear wheel 12 is braked by the braking device 510.
  • the vehicle body frame 20 rotatably supports the steering stem 15 by a head pipe 21 provided at the front end portion.
  • the vehicle body frame 20 supports the seat 22 on which the driver J sits, the left and right steps 23 on which the driver J rests his or her feet, the fuel tank 24 arranged in front of the seat 22, and the like.
  • the fuel tank 24 is provided so that the driver J can knee grip it.
  • a front cowl 25 supported by the vehicle body frame 20 is mounted on the front portion of the vehicle.
  • a meter device 30 is arranged inside the front cowl 25.
  • a vibrating portion 560 (see FIG. 1) is provided on at least one of the steering handle 16, the fuel tank 24, and the left and right steps 23.
  • the vibrating unit 560 is controlled by the line-of-sight guidance control unit 400.
  • the vibrating unit 560 transmits the vibration to the driver J to make the driver J perceive the vibration.
  • FIG. 5 is a perspective view of the helmet of the first embodiment.
  • the helmet 40 is a full-face type helmet equipped with the driving support system 1 of the embodiment.
  • the helmet 40 includes a helmet body 41 that covers the head of the driver J, a light emitting unit 540 and a warning sound generating unit 550 provided on the helmet body 41, a helmet control unit 42 that controls the light emitting unit 540 and the warning sound generating unit 550, and a power supply (not shown) that supplies power to the helmet control unit 42.
  • the light emitting unit 540 and the warning sound generating unit 550 may be preliminarily incorporated in the helmet 40, or may be a retrofit device to the helmet 40.
  • the power source is a secondary battery, which is embedded in the helmet body 41.
  • the helmet body 41 includes a cap body 43 as an outer shell member, an interior material (not shown) arranged inside the cap body 43, and a shield 44 that covers the front opening portion 45 of the cap body 43.
  • the front opening portion 45 is provided at the front portion of the cap body 43 in order to secure the field of view of the wearer (driver J).
  • the light emitting unit 540 is a light emitting element such as an LED (Light Emitting Diode).
  • the light emitting unit 540 is arranged at a position where the wearer of the helmet 40 can visually check whether or not the light is emitted.
  • the light emitting portion 540 is arranged at the opening edge of the front opening portion 45 of the cap body 43.
  • the light emitting unit 540 is arranged so as to surround the line of sight of the wearer of the helmet 40 in a state of facing directly in front of the wearer.
  • the light emitting unit 540 includes an upper light emitting unit 541 arranged above the front opening portion 45, a right light emitting unit 542 arranged at the right portion of the front opening portion 45, and a left light emitting unit 543 arranged at the left portion of the front opening portion 45.
  • the upper light emitting unit 541, the right light emitting unit 542, and the left light emitting unit 543 may be provided as separate members from each other, or may be integrally provided and formed so as to emit light independently of each other.
  • the warning sound generator 550 is a directional speaker.
  • the warning sound generation unit 550 emits a warning sound.
  • the warning sound generation unit 550 is provided so that the wearer of the helmet 40 can perceive at least two directions of left and right as a source of the warning sound.
  • the warning sound generating unit 550 can make the wearer of the helmet 40 perceive the source of the warning sound while distinguishing three directions: the right side, the left side, and the front side.
  • the helmet control unit 42 operates by the electric power of a power source (not shown).
  • the helmet control unit 42 is provided so as to be able to communicate with the line-of-sight guidance control unit 400.
  • the helmet control unit 42 acquires an operation command of the line-of-sight guidance unit 530 from the line-of-sight guidance control unit 400.
  • the helmet control unit 42 applies a voltage to the light emitting unit 540 based on the acquired operation command of the light emitting unit 540 to cause the light emitting unit 540 to emit light.
  • the helmet control unit 42 emits a warning sound from the warning sound generating unit 550 based on the acquired operation command of the warning sound generating unit 550.
  • FIG. 6 is a flowchart showing a processing flow by the line-of-sight guidance control unit.
  • FIGS. 7 and 8 are diagrams showing scenes in which the line-of-sight direction and the object direction differ.
  • FIG. 9 is a diagram illustrating a light emitting state of the light emitting unit, and is a front view of the helmet of the first embodiment. In FIG. 9, the shield 44 is not shown.
  • the line-of-sight guidance control unit 400 first determines whether or not the line-of-sight direction De and the object direction Do are different when viewed from the vertical direction (step S10).
  • the line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do are different when the angle θ1 formed by the line-of-sight direction De and the object direction Do, viewed from the vertical direction, is larger than a predetermined first angle (see FIG. 7).
  • when the line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do are different (S10: YES), the process proceeds to step S20.
  • when the line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do do not differ when viewed from the vertical direction (S10: NO), it outputs a command to the helmet control unit 42 to turn off the right light emitting unit 542 and the left light emitting unit 543 (step S50), and the process proceeds to step S60.
  • in step S20, the line-of-sight guidance control unit 400 determines whether or not the line-of-sight direction De is facing to the right with respect to the object direction Do.
  • when the line-of-sight direction De is facing the right side with respect to the object direction Do (S20: YES), the line-of-sight guidance control unit 400 proceeds to the process of step S30.
  • when the line-of-sight direction De is not facing the right side with respect to the object direction Do (S20: NO), that is, when the line-of-sight direction De is facing the left side with respect to the object direction Do, the line-of-sight guidance control unit 400 proceeds to the process of step S40.
  • in step S30, the line-of-sight guidance control unit 400 outputs a command to the helmet control unit 42 so that the left light emitting unit 543 emits light, causing the left light emitting unit 543 to emit light (see FIG. 9).
  • the light emitting unit 540 emits light on the same side as the object O with respect to the line of sight of the driver J.
  • the line-of-sight guidance control unit 400 lights, blinks, or pulses the left light emitting unit 543.
  • "blinking" here means repeating a lighting state and an extinguishing state at a constant emission intensity (luminance).
  • "pulsing" means repeating the lighting state and the extinguishing state while gradually changing the emission intensity.
  • thereby, the driver J shifts his or her attention to the left side and turns his or her gaze toward the object O.
  • the line-of-sight guidance control unit 400 proceeds to the process of step S60.
  • in step S40, the line-of-sight guidance control unit 400 outputs a command to the helmet control unit 42 so that the right light emitting unit 542 emits light, causing the right light emitting unit 542 to emit light.
  • the light emitting unit 540 emits light on the same side as the object O with respect to the line of sight of the driver J.
  • the line-of-sight guidance control unit 400 lights, blinks, or pulses the right light emitting unit 542.
  • thereby, the driver J shifts his or her attention to the right side and turns his or her gaze toward the object O.
  • the line-of-sight guidance control unit 400 proceeds to the process of step S60.
  • in step S60, the line-of-sight guidance control unit 400 determines whether or not the line-of-sight direction De and the object direction Do are different when viewed from the vehicle width direction.
  • the line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do are different when the angle θ2 formed by the line-of-sight direction De and the object direction Do, viewed from the vehicle width direction, is larger than a predetermined second angle (see FIG. 8).
  • in this case, the driver J may be looking at the meter device 30, for example.
  • when the line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do are different when viewed from the vehicle width direction (S60: YES), the process proceeds to step S70.
  • when the line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do do not differ when viewed from the vehicle width direction (S60: NO), it outputs a command to the helmet control unit 42 to turn off the upper light emitting unit 541 (step S80), and the series of processes is completed.
  • in step S70, the line-of-sight guidance control unit 400 outputs a command to the helmet control unit 42 so that the upper light emitting unit 541 emits light, causing the upper light emitting unit 541 to emit light.
  • the light emitting unit 540 emits light on the same side as the object O with respect to the line of sight of the driver J.
  • the line-of-sight guidance control unit 400 lights, blinks, or pulses the upper light emitting unit 541.
  • thereby, the driver J shifts his or her attention upward and, if looking at the meter device 30, turns his or her line of sight toward the object O. The line-of-sight guidance control unit 400 then ends the process.
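The decision flow of FIG. 6 (steps S10 to S80) can be summarized in code. This is a hedged sketch: the function name, the angle representation (separate horizontal and vertical components in degrees, increasing to the right and upward), and the returned dictionary are illustrative assumptions, not the actual interface of the line-of-sight guidance control unit 400.

```python
def select_light_commands(gaze_h_deg, obj_h_deg, gaze_v_deg, obj_v_deg,
                          first_angle_deg=15.0, second_angle_deg=10.0):
    """Decide which light emitting units to drive, following steps S10-S80.

    gaze_*_deg / obj_*_deg: line-of-sight direction De and object direction
    Do, split into a horizontal component (viewed from the vertical
    direction) and a vertical component (viewed from the vehicle width
    direction). Returns the commanded state of each light emitting unit.
    """
    commands = {"left": "off", "right": "off", "upper": "off"}

    # S10: do the directions differ when viewed from the vertical direction?
    if abs(gaze_h_deg - obj_h_deg) > first_angle_deg:
        # S20: is the line of sight to the right of the object direction?
        if gaze_h_deg > obj_h_deg:
            commands["left"] = "on"    # S30: the object lies to the left
        else:
            commands["right"] = "on"   # S40: the object lies to the right
    # S50: otherwise both side units stay off.

    # S60: do the directions differ when viewed from the vehicle width direction?
    if abs(gaze_v_deg - obj_v_deg) > second_angle_deg:
        commands["upper"] = "on"       # S70: e.g. gaze is down at the meter
    # S80: otherwise the upper unit stays off.

    return commands
```

Note that the unit on the same side as the object O lights up, so a gaze turned to the right of the object direction drives the left light emitting unit, drawing the gaze back toward the object.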
  • the line-of-sight guidance control unit 400 may change at least one of the emission intensity, emission color, and emission cycle of the light emitting unit 540 according to the distance between the own vehicle M and the object O.
  • the line-of-sight guidance control unit 400 outputs a command to the helmet control unit 42 so that the light emission intensity increases as the distance between the own vehicle M and the object O decreases.
  • the line-of-sight guidance control unit 400 outputs a command to the helmet control unit 42 so that the emission cycle of the blinking or pulsing light emitting unit 540 shortens as the distance between the own vehicle M and the object O decreases.
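One possible mapping from distance to emission intensity and blink period is sketched below. The linear mapping and all constants are assumptions for illustration; the document only states that intensity increases and the cycle shortens as the object gets closer.

```python
def emission_profile(distance_m, max_distance_m=50.0,
                     max_intensity=1.0, min_period_s=0.2, max_period_s=1.0):
    """Map vehicle-to-object distance to (emission intensity, blink period).

    As the distance between the own vehicle M and the object O decreases,
    the intensity rises and the blink period shortens (faster blinking).
    """
    # Clamp the distance into [0, max_distance_m] and normalize to [0, 1].
    ratio = min(max(distance_m, 0.0), max_distance_m) / max_distance_m
    intensity = max_intensity * (1.0 - ratio)                    # nearer -> brighter
    period_s = min_period_s + (max_period_s - min_period_s) * ratio  # nearer -> faster
    return intensity, period_s
```

An object 5 m away thus produces a brighter, faster-blinking indication than one 40 m away.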
  • the line-of-sight guidance control unit 400 outputs a signal to the helmet control unit 42 so as to emit a warning sound from the warning sound generation unit 550 in conjunction with the light emission of the light emitting unit 540.
  • the line-of-sight guidance control unit 400 causes the helmet control unit 42 to control the warning sound generation unit 550 so as to emit a warning sound from a position corresponding to the object direction Do with respect to the driver J.
  • the line-of-sight guidance control unit 400 causes the helmet control unit 42 to control the warning sound generation unit 550 so that the wearer of the helmet 40 perceives the warning sound as coming from the direction corresponding to the light emitting unit 540 that emits light.
  • for example, while the right light emitting unit 542 emits light, the line-of-sight guidance control unit 400 emits a warning sound from the warning sound generating unit 550 so that the wearer of the helmet 40 perceives the warning sound as sounding from the right side.
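The pairing between the lit emitter and the perceived sound direction can be expressed as a simple lookup. The direction labels are assumptions; the warning sound generating unit 550 is described only as a directional speaker distinguishing right, left, and front.

```python
# Map each light emitting unit to the direction from which the warning
# sound should appear to come, so light and sound cues agree (e.g. the
# right light emitting unit 542 pairs with a sound perceived on the right).
SOUND_DIRECTION_FOR_EMITTER = {
    "right": "right",   # right light emitting unit 542
    "left": "left",     # left light emitting unit 543
    "upper": "front",   # upper light emitting unit 541
}


def warning_sound_directions(lit_emitters):
    """Return the perceived sound directions for the emitters that are lit."""
    return [SOUND_DIRECTION_FOR_EMITTER[e] for e in lit_emitters]
```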
  • the line-of-sight guidance control unit 400 vibrates the vibrating unit 560 provided at the position where the driver J contacts in conjunction with the light emission of the light emitting unit 540.
  • the line-of-sight guidance control unit 400 vibrates the vibrating unit 560 at a position corresponding to the object direction Do with respect to the driver J.
  • the line-of-sight guidance control unit 400 vibrates the vibrating unit 560 that contacts the left or right half of the driver J's body on the side corresponding to the light emitting unit 540 that emits light, causing the driver J to perceive the vibration.
  • for example, while the right light emitting unit 542 emits light, the line-of-sight guidance control unit 400 vibrates the vibrating unit 560 that comes into contact with the right half of the driver J's body (for example, the right step 23).
  • alternatively, the line-of-sight guidance control unit 400 may cause the driver J to perceive vibration on both sides, regardless of left or right, while the light emitting unit 540 is emitting light. Further, the line-of-sight guidance control unit 400 may vibrate the vibrating unit 560 only when the distance between the own vehicle M and the object O is smaller than a predetermined distance.
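The vibration selection described above can be sketched as follows. The unit names (`left_step`, `right_step`) and the threshold value are illustrative assumptions; the document leaves the concrete distance unspecified.

```python
def select_vibrators(object_side, distance_m, vibrate_threshold_m=20.0,
                     side_specific=True):
    """Choose which vibrating units 560 to drive.

    object_side: "left" or "right", the side of the object direction Do
        relative to the driver J.
    distance_m: distance between the own vehicle M and the object O;
        vibration is applied only below vibrate_threshold_m.
    side_specific: if False, vibrate both sides regardless of direction,
        as in the variation where the driver perceives vibration on both.
    """
    if distance_m >= vibrate_threshold_m:
        return []                                  # too far: no vibration
    if not side_specific:
        return ["left_step", "right_step"]
    # Vibrate the unit touching the half of the body on the object's side,
    # e.g. the right step 23 when the right light emitting unit 542 is lit.
    return ["right_step"] if object_side == "right" else ["left_step"]
```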
  • as described above, the driving support system 1 includes the occupant state monitoring unit 150 that recognizes the line-of-sight direction De of the driver J, the line-of-sight guidance control unit 400 that recognizes the direction (object direction Do) of the object O around the own vehicle M with respect to the driver J, and the light emitting unit 540 that emits light when the line-of-sight direction De and the object direction Do differ.
  • according to this configuration, since light is not affected by noise or road-surface vibration, causing the light emitting unit 540 to emit light makes the driver J recognize that the object O is in a direction different from the driver J's line-of-sight direction De more effectively than the conventional configuration in which sound and vibration are made perceptible to the driver. Therefore, it is possible to inform the driver J of the direction to be recognized.
  • the driving support system 1 causes the light emitting unit 540 to emit light when the line-of-sight direction De and the object direction Do are different when viewed from the vertical direction.
  • according to this configuration, the light emitting unit 540 emits light when the line of sight of the driver J is directed away from the object O as viewed from above, so the driver J can be made to recognize that the object O is in a direction different from the line-of-sight direction De.
  • the driving support system 1 causes the light emitting unit 540 to emit light when the line-of-sight direction De and the object direction Do are different from each other when viewed from the vehicle width direction.
  • when the driver J looks at the meter device 30, the line of sight is directed downward compared with when the line of sight is directed to the object O around the own vehicle M. With the above configuration, therefore, when the line of sight of the driver J is directed to the meter device 30, the light emitting unit 540 emits light, so the driver J can be made to recognize that the object O is in a direction different from the line-of-sight direction De.
  • the driving support system 1 causes the light emitting unit 540 to emit light when the angles θ1 and θ2 formed by the line-of-sight direction De and the object direction Do are equal to or greater than the predetermined angles. According to this configuration, it is possible to prevent the light emitting unit 540 from emitting light frequently in a situation where the line of sight of the driver J is only slightly deviated from the object O and the driver J already recognizes the object O.
  • the light emitting unit 540 changes at least one of the light emission intensity, the light emission color, and the light emission cycle according to the distance between the own vehicle M and the object O. According to this configuration, the light emitting form of the light emitting unit 540 can be changed according to the degree to which the driver J's line of sight needs to be guided. Therefore, it is possible to more reliably inform the driver J of the direction to be recognized.
  • the light emitting unit 540 is provided on the helmet 40.
  • the driver J can shift the line of sight to the object O side by directing the line of sight to the light emitting portion of the light emitting unit 540.
  • the line of sight of the driver J can be guided so that the driver J can recognize the position of the object O.
  • the line-of-sight guidance unit 530 includes a vibration unit 560 that vibrates in conjunction with the light emission of the light emitting unit 540 and transmits the vibration to the driver J. According to this configuration, it is possible to make the driver J more reliably recognize that the object O is in a direction different from the line-of-sight direction De of the driver J.
  • the vibrating unit 560 vibrates at a position corresponding to the object direction Do with respect to the driver J.
  • the driver J can shift the line of sight to the object O side by directing the line of sight to the vibrating unit 560 side.
  • the line of sight of the driver J can be guided so that the driver J can recognize the position of the object O.
  • the vibrating portion 560 is provided in at least one of the steering handle 16, the fuel tank 24, and the step 23. According to this configuration, since the vibrating portion 560 is arranged at the portion in contact with the driver J, the vibration of the vibrating portion 560 can be effectively transmitted to the driver J.
  • the line-of-sight guidance unit 530 includes a warning sound generating unit 550 that emits a warning sound when the distance between the own vehicle M and the object O is smaller than a predetermined distance.
  • a warning sound is emitted when the degree to which the driver J needs to guide the line of sight is higher than a predetermined reference. Therefore, it is possible to make the driver J more reliably recognize that the object O is in a direction different from the line-of-sight direction De of the driver J.
  • the warning sound generation unit 550 emits a warning sound from a position corresponding to the object direction Do with respect to the driver J.
  • the driver J can shift the line of sight to the object O side by directing the line of sight to the sound source side of the warning sound.
  • the line of sight of the driver J can be guided so that the driver J can recognize the position of the object O.
  • the warning sound generation unit 550 is provided on the helmet 40. According to this configuration, the driver J can effectively perceive the warning sound.
  • the warning sound generating unit 550 can be provided on the existing helmet.
  • in the first embodiment, the light emitting unit 540 of the line-of-sight guidance unit 530 is provided on the helmet 40, but the arrangement is not limited to this.
  • the light emitting unit 540 of the line-of-sight guidance unit 530 may be provided in the meter device 30.
  • the configuration in which the light emitting unit 540 of the line-of-sight guidance unit 530 is provided in the meter device 30 will be described.
  • the configuration other than that described below is the same as that of the above embodiment.
  • FIG. 10 is a front view of the meter device of the second embodiment.
  • the meter device 30 of the second embodiment includes a display unit 31 on which a vehicle speed meter 32, a tachometer 33, and the like are arranged, and a frame portion 38 that surrounds the display unit 31 when viewed from the rear.
  • the display unit 31 includes a fuel gauge 34, a water temperature gauge 35, an indicator lamp 36, an information display unit 37, and the like.
  • the vehicle speed meter 32, the tachometer 33, the fuel gauge 34, and the water temperature gauge 35 are analog types in which a rotating pointer indicates a value on a scale, but they may be digital.
  • the frame portion 38 constitutes the edge of the meter device 30.
  • the frame portion 38 is provided with a light emitting portion 540.
  • the light emitting unit 540 is arranged so as to surround the display unit 31.
  • the light emitting unit 540 includes an upper light emitting unit 541 provided on the upper side of the display unit 31, a right light emitting unit 542 arranged on the right side of the display unit 31, and a left light emitting unit 543 arranged on the left side of the display unit 31.
  • the upper light emitting unit 541, the right light emitting unit 542, and the left light emitting unit 543 may be provided as separate members from each other, or may be integrally provided and formed so as to emit light independently of each other.
  • the light emitting unit 540 is controlled by the line-of-sight guidance control unit 400.
  • the line-of-sight guidance control unit 400 determines whether or not the driver's line of sight is directed to the meter device 30 based on the information of the driver's line-of-sight direction recognized by the occupant condition monitoring unit 150. When the line-of-sight guidance control unit 400 determines that the driver's line of sight is directed to the meter device 30, the line-of-sight guidance control unit 400 performs the processes of steps S10 to S90 in the above-described first embodiment.
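In code, the gating described for the second embodiment might be expressed as below; `gaze_on_meter` and the callback are assumed names, and `run_guidance_steps` stands in for the steps S10 to S90 of the first embodiment.

```python
def update_meter_guidance(gaze_on_meter, run_guidance_steps):
    """Run the first embodiment's guidance flow only while the driver's
    line of sight is on the meter device 30.

    gaze_on_meter: bool derived from the line-of-sight direction recognized
        by the occupant state monitoring unit 150.
    run_guidance_steps: callable implementing steps S10-S90.
    Returns True if the guidance flow was executed.
    """
    if gaze_on_meter:
        run_guidance_steps()
        return True
    return False
```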
  • according to this configuration, when the line of sight of the driver J is directed to the meter device 30, the driver J can shift the line of sight to the object O side by directing the line of sight to the lit portion of the light emitting unit 540. Therefore, the line of sight of the driver J can be guided so that the driver J can more reliably recognize the position of the object O.
  • the present invention is not limited to the above-described embodiment described with reference to the drawings, and various modifications can be considered within the technical scope thereof.
  • the saddle-riding vehicle to which the driving support system 1 is applied includes any vehicle on which a driver wearing a helmet rides astride the vehicle body; it is not limited to motorcycles and also includes three-wheeled vehicles (those with one front wheel and two rear wheels, as well as those with two front wheels and one rear wheel).
  • the driving support system 1 of the above embodiment can execute so-called automatic driving, but is not limited to this. That is, the driving support system of the present invention may be applied to a vehicle that always requires operation by the driver when traveling.
  • the full-face type helmet 40 has been described as an example of the helmet provided with the light emitting unit 540 of the line-of-sight guidance unit 530, but the present invention is not limited to this.
  • the light emitting unit 540 of the line-of-sight guidance unit 530 can be provided on various types of helmets such as a jet type, a flip-up type, and an off-road type.
  • the driver surveillance camera 90 is arranged at the front of the vehicle, but the present invention is not limited to this.
  • the driver surveillance camera may include a camera arranged at the front of the vehicle to detect the direction of the driver's head and a camera arranged on the helmet to detect the direction of the driver's line of sight.
  • the warning sound generating unit 550 is provided on the helmet 40, but the present invention is not limited to this.
  • the warning sound generating unit may be provided on the motorcycle 10.
  • the line-of-sight guidance unit 530 may include at least a light emitting unit 540, and may not include a warning sound generating unit 550 and a vibrating unit 560.
  • the vibrating unit 560 is provided in the motorcycle 10, but the present invention is not limited to this.
  • the vibrating portion 560 may be provided at a position where it comes into contact with the driver J, and may be provided at the helmet 40.

Abstract

A driving assistance system for saddle-ride type vehicles that comprises: a line-of-sight direction recognition unit (150) that recognizes the line-of-sight direction (De) of a driver (J); an object direction recognition unit (400) that recognizes the direction (Do) of an object (O) in the vicinity of the host vehicle with respect to the driver (J); and a light source (540) that emits light when the line-of-sight direction (De) and the direction (Do) of the object (O) with respect to the driver (J) are different from each other.

Description

Driving support system for saddle-riding vehicles
The present invention relates to a driving support system for a saddle-riding vehicle.
In a saddle-riding vehicle, the driver drives while wearing a helmet. The field of view is therefore narrow, and depending on the direction of the driver's line of sight, an obstacle in front of the vehicle may not enter the field of view. Techniques therefore exist for making the wearer of a helmet perceive a direction to be recognized (see, for example, Patent Document 1).
Patent Document 1 discloses a direction information notification device that gives a helmet wearer at least directional information by localizing a sound image. The device includes a plurality of stimulus applying means that are arranged at a plurality of locations on the helmet and that each apply a local mechanical stimulus to the wearer's head in response to an external signal, and a driving means that applies the external signal to at least one of the stimulus applying means arranged at a position corresponding to a predetermined sound image localization position.
Patent Document 1: Japanese Patent Application Laid-Open No. 2007-31875
However, in a saddle-riding vehicle, the driver is not in an environment enclosed by doors and a roof as in a four-wheeled vehicle, and is therefore easily affected by noise. The driver is also subjected to vibration from the road surface while the vehicle travels. Consequently, sound and vibration, as used in the prior art, may not sufficiently inform the driver of the direction that the driver should recognize.
The present invention provides a driving support system for a saddle-riding vehicle that can inform the driver of a direction to be recognized.
(1) A driving support system for a saddle-riding vehicle according to one aspect of the present invention includes: a line-of-sight direction recognition unit (150) that recognizes a line-of-sight direction (De) of a driver (J); an object direction recognition unit (400) that recognizes a direction (Do), relative to the driver (J), of an object (O) around the own vehicle; and a light source (540) that emits light when the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ.
 According to this aspect, light is unaffected by noise and road-surface vibration. Compared with the prior-art configuration in which the driver is made to perceive sound and vibration, causing the light source to emit light can therefore effectively make the driver aware that an object lies in a direction different from the driver's line of sight. The driver can thus be informed of the direction to be recognized.
(2) In the driving support system for a saddle-riding vehicle according to aspect (1), the light source (540) may be made to emit light when, viewed from a direction orthogonal to the vehicle front-rear direction and the vehicle width direction, the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ.
 With this configuration, the light source emits light when the driver's line of sight is directed toward something other than the object around the own vehicle. The driver can therefore be made aware that an object lies in a direction different from the driver's line of sight.
(3) In the driving support system for a saddle-riding vehicle according to aspect (1) or (2), the light source (540) may be made to emit light when, viewed from the vehicle width direction, the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ.
 For example, because the meter device is closer to the driver than objects around the own vehicle, the driver's line of sight points further downward when directed at the meter device than when directed at such objects. With this configuration, the light source therefore emits light when, for example, the driver's line of sight is directed toward the meter device. The driver can thus be made aware that an object lies in a direction different from the driver's line of sight.
(4) In the driving support system for a saddle-riding vehicle according to any one of aspects (1) to (3), the light source (540) may be made to emit light when an angle (θ1, θ2) formed between the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) is equal to or greater than a predetermined angle.
 With this configuration, the light source can be prevented from emitting light frequently in situations where the driver's line of sight deviates only slightly from the object and the driver already recognizes the object.
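The angular gating of aspect (4) can be sketched as a comparison between two direction vectors. The following is a minimal illustration under assumed conventions (2D unit-plane vectors, a 15° example threshold), not the claimed implementation:

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 2D direction vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point overshoot.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def should_emit(gaze_dir, object_dir, threshold_deg=15.0):
    """Emit light only when the gaze and object directions differ by at
    least the predetermined angle, so small deviations do not trigger it."""
    return angle_between(gaze_dir, object_dir) >= threshold_deg

# Gaze straight ahead, object 30 degrees off-axis: emit.
print(should_emit((1.0, 0.0), (math.cos(math.radians(30)), math.sin(math.radians(30)))))
```

A hysteresis band around the threshold could further suppress flicker when the angle hovers near the limit.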
(5) In the driving support system for a saddle-riding vehicle according to any one of aspects (1) to (4), the light source (540) may change at least one of its emission intensity, emission color, and emission period according to the distance between the vehicle and the object (O).
 With this configuration, the light-emission form of the light source can be changed according to how strongly the driver's line of sight needs to be guided. The driver can therefore be informed more reliably of the direction to be recognized.
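One way to realize aspect (5) is to step the emission parameters by distance band. The bands, colors, and blink periods below are illustrative assumptions of this sketch, not values from the disclosure:

```python
def emission_form(distance_m):
    """Map vehicle-to-object distance to (intensity 0-1, color, blink period [s]).
    Closer objects -> brighter, redder, faster-blinking light."""
    if distance_m < 10.0:
        return (1.0, "red", 0.2)
    elif distance_m < 30.0:
        return (0.6, "amber", 0.5)
    else:
        return (0.3, "green", 1.0)

print(emission_form(5.0))  # (1.0, 'red', 0.2)
```

A continuous mapping (for example, intensity inversely proportional to distance) would serve equally well; discrete bands simply make the change easier for the driver to notice.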
(6) In the driving support system for a saddle-riding vehicle according to any one of aspects (1) to (5), the light source (540) may be provided on a helmet (40) and, when the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ, may emit light on the same side as the object (O) relative to the driver's (J) line of sight.
 With this configuration, when the driver turns the line of sight toward the emitting light source, the line of sight moves toward the object. The driver's line of sight can thus be guided so that the driver recognizes the position of the object.
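Selecting which side to light, as in aspect (6), reduces to determining on which side of the gaze direction the object lies. A 2D cross product gives the sign; the coordinate convention (x forward, y to the left) is an assumption of this sketch:

```python
def side_of_gaze(gaze_dir, object_dir):
    """Return 'left' or 'right': the side of the driver's gaze direction on
    which the object lies, from the sign of the 2D cross product."""
    cross = gaze_dir[0] * object_dir[1] - gaze_dir[1] * object_dir[0]
    # cross == 0 (object dead ahead or behind) falls through to 'right' here;
    # a real device would treat that case separately.
    return "left" if cross > 0 else "right"

# With x forward and y to the left, an object at positive y is on the left.
print(side_of_gaze((1.0, 0.0), (1.0, 0.5)))  # left
```

The same selection logic applies to the meter-device light sources of aspect (7) and the vibrating positions of aspect (9).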
(7) In the driving support system for a saddle-riding vehicle according to any one of aspects (1) to (5), the light source (540) may be provided in a meter device (30) and, when the driver's (J) line of sight is directed toward the meter device (30) and the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ, may emit light on the same side as the object (O) relative to the driver's (J) line of sight.
 With this configuration, when the driver's line of sight is directed toward the meter device, the driver can move the line of sight toward the object by turning it toward the emitting light source. The driver's line of sight can therefore be guided so that the driver recognizes the position of the object more reliably.
(8) The driving support system for a saddle-riding vehicle according to any one of aspects (1) to (7) may further include a vibrating unit (560) that vibrates in conjunction with the light emission of the light source (540) and transmits the vibration to the driver (J).
 With this configuration, the driver can be made aware more reliably that an object lies in a direction different from the driver's line of sight.
(9) In the driving support system for a saddle-riding vehicle according to aspect (8), the vibrating unit (560) may vibrate at a position corresponding to the direction of the object (O) relative to the driver (J).
 With this configuration, the driver can move the line of sight toward the object by turning it toward the vibrating side. The driver's line of sight can thus be guided so that the driver recognizes the position of the object.
(10) In the driving support system for a saddle-riding vehicle according to aspect (8) or (9), the vibrating unit (560) may be provided on at least one of a helmet (40), a steering handle (16), a fuel tank (24), and a step (23).
 With this configuration, the vibrating unit is arranged at a portion that contacts the driver, so the vibration of the vibrating unit can be transmitted to the driver effectively.
(11) The driving support system for a saddle-riding vehicle according to any one of aspects (1) to (10) may further include a warning sound generating unit (550) that emits a warning sound when the distance between the own vehicle and the object (O) is smaller than a predetermined distance.
 With this configuration, a warning sound is emitted when the degree to which the driver's line of sight needs to be guided exceeds a predetermined standard. The driver can therefore be made aware more reliably that an object lies in a direction different from the driver's line of sight.
(12) In the driving support system for a saddle-riding vehicle according to aspect (11), the warning sound generating unit (550) may emit the warning sound from a position corresponding to the direction (Do) of the object (O) relative to the driver (J).
 With this configuration, the driver can move the line of sight toward the object by turning it toward the sound source of the warning sound. The driver's line of sight can thus be guided so that the driver recognizes the position of the object.
(13) In the driving support system for a saddle-riding vehicle according to aspect (11) or (12), the warning sound generating unit (550) may be a device retrofitted to a helmet (40).
 With this configuration, the warning sound generating unit can be provided on an existing helmet. A driving support system for a saddle-riding vehicle that provides the above effects can therefore be introduced easily.
(14) In the driving support system for a saddle-riding vehicle according to aspect (11) or (12), the warning sound generating unit (550) may be provided on a helmet (40).
 With this configuration, the driver can be made to perceive the warning sound effectively.
 According to the above driving support system for a saddle-riding vehicle, the driver can be informed of a direction to be recognized.
FIG. 1 is a configuration diagram of the driving support system according to a first embodiment.
FIG. 2 is a diagram showing how the own-vehicle position recognition unit recognizes the relative position and posture of the own vehicle with respect to the traveling lane.
FIG. 3 is a diagram showing how a target trajectory is generated based on a recommended lane.
FIG. 4 is a left side view of the motorcycle of the first embodiment.
FIG. 5 is a perspective view of the helmet of the first embodiment.
FIG. 6 is a flowchart showing the flow of processing by the line-of-sight guidance control unit.
FIG. 7 is a diagram showing a scene in which the line-of-sight direction and the object direction differ.
FIG. 8 is a diagram showing another scene in which the line-of-sight direction and the object direction differ.
FIG. 9 is a diagram explaining the light-emitting state of the light-emitting unit.
FIG. 10 is a front view of the meter device of a second embodiment.
 Hereinafter, an example of a driving support system for a saddle-riding vehicle according to the present embodiment will be described with reference to the drawings. In the embodiment, the driving support system is applied to an automated driving vehicle. Automated driving, in which the vehicle travels in a state that in principle requires no operation by the driver, is one form of driving support. Driving support has degrees. For example, the degrees of driving support include: a first degree, in which driving support is performed by operating a driving support device such as ACC (Adaptive Cruise Control System) or LKAS (Lane Keeping Assistance System); a second degree, with a higher degree of control than the first degree, in which at least one of acceleration/deceleration and steering of the vehicle is controlled automatically to perform automated driving without the driver operating the driving operators of the vehicle, while still imposing a certain duty of surroundings monitoring on the driver; and a third degree, with a higher degree of control than the second degree, in which no duty of surroundings monitoring is imposed on the driver (or a lower duty than in the second degree is imposed). In the present embodiment, the second and third degrees of driving support correspond to automated driving.
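The three degrees and their monitoring duties can be summarized in code form. The enum names and boolean flag below are labels invented for this summary, not terms from the disclosure:

```python
from enum import Enum

class SupportDegree(Enum):
    """Degrees of driving support described above: as the degree of
    control rises, the driver's surroundings-monitoring duty falls."""
    FIRST = (1, True)    # ACC/LKAS-style assistance; driver fully monitors
    SECOND = (2, True)   # automated driving with a monitoring duty
    THIRD = (3, False)   # automated driving without (or with a reduced) duty

    def __init__(self, level, monitoring_duty):
        self.level = level
        self.monitoring_duty = monitoring_duty

print(SupportDegree.THIRD.monitoring_duty)  # False
```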
(First Embodiment)
<Overall configuration>
 FIG. 1 is a configuration diagram of the driving support system according to the first embodiment.
 The vehicle on which the driving support system 1 shown in FIG. 1 is mounted is a saddle-riding vehicle such as a two-wheeled or three-wheeled vehicle. The prime mover of the vehicle is an internal combustion engine such as a gasoline engine, an electric motor, or a combination of an internal combustion engine and an electric motor. The electric motor operates using electric power generated by a generator coupled to the internal combustion engine, or the discharge power of a secondary battery or a fuel cell.
 For example, the driving support system 1 includes a camera 51, a radar device 52, a finder 53, an object recognition device 54 (object direction recognition unit), a communication device 55, an HMI (Human Machine Interface) 56, vehicle sensors 57, a navigation device 60, an MPU (Map Positioning Unit) 70, driving operators 80, a driver monitoring camera 90, a control device 100, a traveling driving force output device 500, a brake device 510, a steering device 520, and a line-of-sight guidance unit 530. These devices and pieces of equipment are connected to one another by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in FIG. 1 is merely an example; part of the configuration may be omitted, and further configurations may be added.
 The camera 51 is, for example, a digital camera using a solid-state image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor). The camera 51 is attached at an arbitrary position on the vehicle on which the driving support system 1 is mounted (hereinafter, own vehicle M). The camera 51, for example, periodically and repeatedly images the surroundings of the own vehicle M. The camera 51 may be a stereo camera.
 The radar device 52 radiates radio waves such as millimeter waves around the own vehicle M, detects the radio waves (reflected waves) reflected by an object, and thereby detects at least the position (distance and azimuth) of the object. The radar device 52 is attached at an arbitrary position on the own vehicle M. The radar device 52 may detect the position and speed of an object by an FM-CW (Frequency Modulated Continuous Wave) method.
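In an FM-CW radar, range and relative speed follow from the beat frequencies measured on the up- and down-ramps of a triangular frequency sweep. The sketch below shows the standard textbook relations, not the internals of the radar device 52; the sign convention (down-ramp beat higher for an approaching target) and all parameter values are assumptions:

```python
C = 299_792_458.0  # speed of light [m/s]

def fmcw_range_and_velocity(f_beat_up, f_beat_down, sweep_bw, ramp_time, f_carrier):
    """Standard FM-CW relations for a triangular sweep.

    f_beat_up, f_beat_down: beat frequencies on the up- and down-ramp [Hz]
    sweep_bw:  frequency sweep bandwidth [Hz]
    ramp_time: duration of one ramp [s]
    f_carrier: carrier frequency [Hz]
    """
    f_range = (f_beat_up + f_beat_down) / 2.0    # range-induced component
    f_doppler = (f_beat_down - f_beat_up) / 2.0  # Doppler-induced component
    distance = C * ramp_time * f_range / (2.0 * sweep_bw)
    velocity = C * f_doppler / (2.0 * f_carrier)  # positive: approaching
    return distance, velocity
```

A round trip checks the algebra: synthesizing the two beat frequencies for a target at 50 m approaching at 10 m/s (150 MHz bandwidth, 1 ms ramp, 77 GHz carrier) and feeding them back in recovers 50 m and 10 m/s.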
 The finder 53 is a LIDAR (Light Detection and Ranging) device. The finder 53 irradiates the surroundings of the own vehicle M with light and measures the scattered light. The finder 53 detects the distance to a target based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. The finder 53 is attached at an arbitrary position on the own vehicle M.
 The object recognition device 54 performs sensor fusion processing on the detection results of some or all of the camera 51, the radar device 52, and the finder 53 to recognize the position, type, speed, and the like of objects around the own vehicle M. The object recognition device 54 outputs the recognition results to the control device 100. The object recognition device 54 may output the detection results of the camera 51, the radar device 52, and the finder 53 to the control device 100 as they are.
 The communication device 55 communicates with other vehicles present around the own vehicle M using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communication), or the like, or communicates with various server devices via a wireless base station.
 The HMI 56 presents various information to the driver of the own vehicle M and accepts input operations by the driver. The HMI 56 includes various display devices, speakers, buzzers, touch panels, switches, keys, and the like.
 The vehicle sensors 57 include a vehicle speed sensor that detects the speed of the own vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects the angular velocity about the vertical axis, an orientation sensor that detects the orientation of the own vehicle M, and the like.
 The navigation device 60 includes, for example, a GNSS (Global Navigation Satellite System) receiver 61, a navigation HMI 62, and a route determination unit 63. The navigation device 60 holds first map information 64 in a storage device such as an HDD (Hard Disk Drive) or flash memory. The GNSS receiver 61 identifies the position of the own vehicle M based on signals received from GNSS satellites. The position of the own vehicle M may be identified or supplemented by an INS (Inertial Navigation System) using the output of the vehicle sensors 57. The navigation HMI 62 includes a display device, a speaker, a touch panel, keys, and the like. The navigation HMI 62 may be partially or wholly shared with the HMI 56 described above. The route determination unit 63 determines, for example, a route (hereinafter, on-map route) from the position of the own vehicle M identified by the GNSS receiver 61 (or an arbitrary input position) to a destination input by the occupant using the navigation HMI 62, with reference to the first map information 64.
 The first map information 64 is, for example, information in which road shapes are expressed by links indicating roads and nodes connected by the links. The first map information 64 may include road curvature, POI (Point Of Interest) information, and the like. The on-map route is output to the MPU 70. The navigation device 60 may perform route guidance using the navigation HMI 62 based on the on-map route. The navigation device 60 may be realized by, for example, the functions of a terminal device such as a smartphone or tablet terminal owned by the occupant. The navigation device 60 may transmit the current position and the destination to a navigation server via the communication device 55 and acquire a route equivalent to the on-map route from the navigation server.
 The MPU 70 includes, for example, a recommended lane determination unit 71. The MPU 70 holds second map information 72 in a storage device such as an HDD or flash memory. The recommended lane determination unit 71 divides the on-map route provided by the navigation device 60 into a plurality of blocks (for example, every 100 [m] in the vehicle traveling direction) and determines a recommended lane for each block with reference to the second map information 72. The recommended lane determination unit 71 makes determinations such as which lane from the left to travel in. When a branch point exists on the on-map route, the recommended lane determination unit 71 determines the recommended lane so that the own vehicle M can travel on a reasonable route for proceeding to the branch destination.
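The division of the on-map route into fixed-length blocks can be sketched as follows. The 100 m block length comes from the text above; representing the route as a total length along the traveling direction is an assumption of this sketch:

```python
def split_into_blocks(route_length_m, block_len_m=100.0):
    """Return (start, end) distances along the route for each block,
    with a shorter final block if the length is not a multiple."""
    blocks = []
    start = 0.0
    while start < route_length_m:
        end = min(start + block_len_m, route_length_m)
        blocks.append((start, end))
        start = end
    return blocks

print(split_into_blocks(250.0))  # [(0.0, 100.0), (100.0, 200.0), (200.0, 250.0)]
```

A recommended lane would then be chosen per block, e.g. by looking up the lane geometry of the second map information 72 for each (start, end) span.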
 The second map information 72 is map information with higher precision than the first map information 64. The second map information 72 includes, for example, information on the centers of lanes, information on lane boundaries, and the like. The second map information 72 may also include road information, traffic regulation information, address information (addresses and postal codes), facility information, telephone number information, and the like. The second map information 72 may be updated at any time by the communication device 55 communicating with other devices.
 The driving operators 80 include, for example, an accelerator grip, a brake pedal, a brake lever, a shift pedal, a steering handle, and other operators. A sensor that detects the amount of operation or the presence or absence of operation is attached to each driving operator 80. The detection results of the sensors are output to the control device 100, or to some or all of the traveling driving force output device 500, the brake device 510, and the steering device 520.
 The driver monitoring camera 90 is arranged at a position from which it can image the driver seated on the seat. For example, the driver monitoring camera 90 is attached to the front portion of the own vehicle M. The driver monitoring camera 90 images, for example, mainly the face of the driver seated on the seat. The driver monitoring camera 90 is a digital camera using a solid-state image sensor such as a CCD or CMOS. The driver monitoring camera 90 images the driver, for example, periodically. The images captured by the driver monitoring camera 90 are output to the control device 100.
 The control device 100 includes a master control unit 110, a driving support control unit 200, an automated driving control unit 300, and a line-of-sight guidance control unit 400. The master control unit 110 may be integrated into either the driving support control unit 200 or the automated driving control unit 300.
 The master control unit 110 switches the degree of driving support and controls the HMI 56. For example, the master control unit 110 includes a switching control unit 120, an HMI control unit 130, an operator state determination unit 140, and an occupant condition monitoring unit 150 (line-of-sight direction recognition unit). The switching control unit 120, the HMI control unit 130, the operator state determination unit 140, and the occupant condition monitoring unit 150 are each realized by a processor such as a CPU (Central Processing Unit) executing a program. Some or all of these functional units may be realized by hardware such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array), or by cooperation of software and hardware.
 The switching control unit 120 switches the degree of driving support based on, for example, an operation signal input from a predetermined switch included in the HMI 56. The switching control unit 120 may also cancel the driving support and switch to manual driving based on, for example, an operation instructing acceleration, deceleration, or steering performed on a driving operator 80 such as the accelerator grip, the brake pedal, the brake lever, or the steering handle.
 The switching control unit 120 may also switch the degree of driving support based on an action plan generated by the action plan generation unit 330 described later. For example, the switching control unit 120 may end the driving support at the scheduled end point of automated driving defined by the action plan.
 The HMI control unit 130 causes the HMI 56 to output notifications and the like related to switching of the degree of driving support. The HMI control unit 130 also switches the content output to the HMI 56 when a predetermined event occurs for the own vehicle M. The HMI control unit 130 may cause the HMI 56 to output information on the determination results of one or both of the operator state determination unit 140 and the occupant condition monitoring unit 150. The HMI control unit 130 may also output information received by the HMI 56 to one or both of the driving support control unit 200 and the automated driving control unit 300.
 The operator state determination unit 140 determines, for example, whether the steering handle included in the driving operators 80 is in an operated state (specifically, a state in which an intentional operation is actually being performed, a state in which operation is immediately possible, or a gripped state).
 The occupant condition monitoring unit 150 monitors the driver's condition based on the images captured by the driver monitoring camera 90. The occupant condition monitoring unit 150 monitors that the driver is continuously monitoring the surrounding traffic conditions. The occupant condition monitoring unit 150 acquires a face image of the driver from the images captured by the driver monitoring camera 90 and recognizes the driver's line-of-sight direction from the acquired face image. For example, the occupant condition monitoring unit 150 may recognize the occupant's line-of-sight direction from the images captured by the driver monitoring camera 90 by deep learning using a neural network or the like.
 The driving support control unit 200 performs the first degree of driving support. The driving support control unit 200 executes, for example, ACC, LKAS, and other driving support control. For example, when executing ACC, the driving support control unit 200 controls the traveling driving force output device 500 and the brake device 510 so that the own vehicle M travels while keeping the distance to the preceding vehicle constant, based on information input from the camera 51, the radar device 52, and the finder 53 via the object recognition device 54. That is, the driving support control unit 200 performs acceleration/deceleration control (speed control) based on the distance to the preceding vehicle. When executing LKAS, the driving support control unit 200 controls the steering device 520 so that the own vehicle M travels while maintaining (lane keeping) the traveling lane in which it is currently traveling. That is, the driving support control unit 200 performs steering control for lane keeping. The first degree of driving support may include various types of control other than automated driving (the second and third degrees), which does not require operation of the driving operators 80.
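The gap-keeping behavior of ACC can be illustrated with a simple proportional-derivative law on the inter-vehicle distance error. This is a didactic sketch, not the control law of the driving support control unit 200; the gains and the 30 m target gap are assumed values:

```python
def acc_command(gap_m, gap_rate_mps, target_gap_m=30.0, kp=0.4, kd=0.8):
    """PD command on the gap error: positive output requests acceleration,
    negative output requests braking."""
    error = gap_m - target_gap_m           # positive when the gap is too large
    return kp * error + kd * gap_rate_mps  # closing fast -> brake earlier

# Too close (20 m) and closing at 2 m/s: a braking request results.
print(acc_command(gap_m=20.0, gap_rate_mps=-2.0))  # -5.6
```

In practice the command would be saturated to comfortable acceleration limits and blended with a speed cap before being sent to the traveling driving force output device 500 and the brake device 510.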
 The automated driving control unit 300 executes driving support of a second degree and a third degree. The automated driving control unit 300 includes, for example, a first control unit 310 and a second control unit 350. Each of the first control unit 310 and the second control unit 350 is realized by, for example, a hardware processor such as a CPU executing a program (software). Some or all of these components may be realized by hardware such as an LSI, an ASIC, an FPGA, or a GPU, or may be realized by cooperation of software and hardware.
 The first control unit 310 includes, for example, a recognition unit 320 and an action plan generation unit 330. The first control unit 310 realizes, for example, a function based on AI (Artificial Intelligence) and a function based on a model given in advance, in parallel. For example, an "intersection recognition" function may be realized by executing, in parallel, recognition of an intersection by deep learning or the like and recognition based on conditions given in advance (signals, road markings, and the like that allow pattern matching), scoring both results, and evaluating them comprehensively. This secures the reliability of automated driving.
 The recognition unit 320 recognizes states such as the positions, speeds, and accelerations of surrounding vehicles based on information input from the camera 51, the radar device 52, and the finder 53 via the object recognition device 54. The position of a surrounding vehicle is recognized, for example, as a position in an absolute coordinate system whose origin is a representative point of the own vehicle M (the center of gravity, the center of the drive axle, or the like), and is used for control. The position of a surrounding vehicle may be represented by a representative point such as its center of gravity or a corner, or by a represented region. The "state" of a surrounding vehicle may include its acceleration or jerk, or its "behavioral state" (for example, whether it is changing lanes or about to change lanes).
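The coordinate convention above, with a representative point of the own vehicle M as the origin, can be illustrated by a minimal sketch. The function name, the tuple interface, and the choice of "+x = vehicle heading" are assumptions for illustration; the document does not fix an API.

```python
import math

def to_ego_frame(obj_xy, ego_xy, ego_yaw):
    """Express a detected object's world position in a frame whose
    origin is the own vehicle's representative point (e.g. its center
    of gravity) and whose +x axis points along the vehicle heading."""
    # Translate into the ego origin, then rotate by -ego_yaw.
    dx = obj_xy[0] - ego_xy[0]
    dy = obj_xy[1] - ego_xy[1]
    c, s = math.cos(-ego_yaw), math.sin(-ego_yaw)
    return (c * dx - s * dy, s * dx + c * dy)
```

With the ego vehicle at the world origin facing +y (yaw = π/2), an object at world (1, 1) is one unit ahead and one unit to the right, i.e. (1, -1) in this convention.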
 The recognition unit 320 also recognizes, for example, the lane in which the own vehicle M is traveling (traveling lane). For example, the recognition unit 320 recognizes the traveling lane by comparing a pattern of road marking lines (for example, an arrangement of solid and broken lines) obtained from the second map information 72 with the pattern of road marking lines around the own vehicle M recognized from the image captured by the camera 51. The recognition unit 320 may recognize the traveling lane by recognizing not only road marking lines but also a running road boundary (road boundary) including road marking lines, road shoulders, curbs, median strips, guardrails, and the like. In this recognition, the position of the own vehicle M acquired from the navigation device 60 or a processing result of the INS may be taken into account. The recognition unit 320 also recognizes stop lines, obstacles, red lights, tollgates, and other road events.
 When recognizing the traveling lane, the recognition unit 320 recognizes the position and attitude of the own vehicle M with respect to the traveling lane.
 FIG. 2 is a diagram showing an example of how the recognition unit recognizes the relative position and attitude of the own vehicle with respect to the traveling lane.
 As shown in FIG. 2, the recognition unit 320 may recognize, for example, the deviation OS of a reference point of the own vehicle M (for example, the center of gravity) from the traveling lane center CL, and the angle θ formed by the traveling direction of the own vehicle M with respect to a line connecting points along the traveling lane center CL, as the relative position and attitude of the own vehicle M with respect to the traveling lane L1. Alternatively, the recognition unit 320 may recognize the position of the reference point of the own vehicle M with respect to either side end of the traveling lane L1 (a road marking line or a road boundary), or the like, as the relative position of the own vehicle M with respect to the traveling lane.
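The deviation OS and angle θ described above can be computed from a lane-center segment and the vehicle pose, for example as follows. This is an illustrative sketch, not the recognition unit's actual algorithm; the sign convention (left of CL positive) and the two-point representation of CL are assumptions.

```python
import math

def lane_relative_pose(ref_point, heading, cl_p0, cl_p1):
    """Lateral deviation OS and yaw angle theta of the vehicle relative
    to a lane-center segment cl_p0 -> cl_p1 (2-D points, heading in
    radians)."""
    # Unit vector along the lane center line CL.
    dx, dy = cl_p1[0] - cl_p0[0], cl_p1[1] - cl_p0[1]
    seg_len = math.hypot(dx, dy)
    ux, uy = dx / seg_len, dy / seg_len
    # Signed lateral offset of the reference point (e.g. center of gravity).
    rx, ry = ref_point[0] - cl_p0[0], ref_point[1] - cl_p0[1]
    os_dev = rx * -uy + ry * ux              # left of CL is positive
    # Angle between vehicle heading and the CL direction, wrapped to (-pi, pi].
    theta = (heading - math.atan2(dy, dx) + math.pi) % (2 * math.pi) - math.pi
    return os_dev, theta
```

For a straight lane along the x-axis and a vehicle one unit to the left of CL pointing along the lane, this yields OS = 1 and θ = 0.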
 As shown in FIG. 1, the action plan generation unit 330 generates an action plan for causing the own vehicle M to travel by automated driving. In principle, the action plan generation unit 330 generates a target trajectory along which the own vehicle M travels in the recommended lane determined by the recommended lane determination unit 71 and, furthermore, along which the own vehicle M will travel in the future automatically (without depending on the driver's operation) so as to cope with the surrounding conditions of the own vehicle M. The target trajectory includes, for example, position elements that determine the future positions of the own vehicle M and speed elements that determine the future speed, acceleration, and the like of the own vehicle M. For example, the action plan generation unit 330 determines a plurality of points (track points) that the own vehicle M should reach in order, as the position elements of the target trajectory. A track point is a point that the own vehicle M should reach at every predetermined traveling distance (for example, on the order of several meters). The predetermined traveling distance may be calculated, for example, as the distance along the route. The action plan generation unit 330 also determines a target speed and a target acceleration at every predetermined sampling time (for example, on the order of a few tenths of a second) as the speed elements of the target trajectory.
 A track point may also be a position that the own vehicle M should reach at each sampling time, for every predetermined sampling time. In this case, the target speed and the target acceleration are determined by the sampling time and the interval between the track points.
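In the latter representation, the speed element follows directly from the spacing of the track points and the sampling time. A minimal sketch of that relationship (the function and data layout are illustrative assumptions):

```python
import math

def speeds_from_track_points(points, dt):
    """Target speed between consecutive 2-D track points that the
    vehicle should reach every dt seconds: speed = spacing / dt."""
    speeds = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    return speeds
```

For example, track points spaced 1 m then 2 m apart at a 0.5 s sampling time imply target speeds of 2 m/s and 4 m/s.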
 The action plan generation unit 330 may set automated-driving events when generating the target trajectory. Automated-driving events include, for example, a constant-speed traveling event of traveling in the same traveling lane at a constant speed, a following traveling event of traveling so as to follow a preceding vehicle, a lane change event of changing the traveling lane of the own vehicle M, a branching event of causing the own vehicle M to travel in a desired direction at a branch point of a road, a merging event of causing the own vehicle M to merge at a merging point, an overtaking event of overtaking a preceding vehicle, and the like. The action plan generation unit 330 generates a target trajectory according to the activated event.
 FIG. 3 is a diagram showing how a target trajectory is generated based on the recommended lane.
 As shown in FIG. 3, the recommended lane is set so as to be convenient for traveling along the route to the destination. When the own vehicle M comes to a predetermined distance (which may be determined according to the type of event) before the point where the recommended lane switches, the action plan generation unit 330 activates a lane change event, a branching event, a merging event, or the like. If it becomes necessary to avoid an obstacle during the execution of an event, an avoidance trajectory is generated as shown in the figure.
 Returning to FIG. 1, the second control unit 350 controls the traveling driving force output device 500, the brake device 510, and the steering device 520 so that the own vehicle M passes along the target trajectory generated by the action plan generation unit 330 at the scheduled times.
 The second control unit 350 includes, for example, an acquisition unit 352, a speed control unit 354, and a steering control unit 356. The acquisition unit 352 acquires information on the target trajectory (track points) generated by the action plan generation unit 330 and stores it in a memory (not shown). The speed control unit 354 controls the traveling driving force output device 500 or the brake device 510 based on the speed elements associated with the target trajectory stored in the memory. The steering control unit 356 controls the steering device 520 according to the curvature of the target trajectory stored in the memory. The processing of the speed control unit 354 and the steering control unit 356 is realized, for example, by a combination of feedforward control and feedback control. As an example, the steering control unit 356 executes a combination of feedforward control according to the curvature of the road ahead of the own vehicle M and feedback control based on the deviation from the target trajectory.
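The feedforward-plus-feedback combination for steering can be sketched as follows. The kinematic bicycle-model feedforward term and the proportional feedback gain are assumptions for illustration only; the document does not specify the control law or gains used by the steering control unit 356.

```python
import math

def steering_command(road_curvature, lateral_error, wheelbase, k_fb):
    """Steering angle = feedforward from road curvature ahead
    + feedback on the lateral deviation from the target trajectory."""
    ff = math.atan(wheelbase * road_curvature)  # curvature feedforward
    fb = -k_fb * lateral_error                  # proportional feedback
    return ff + fb
```

On a straight road with zero tracking error the command is zero; a positive (leftward) deviation produces a corrective steer back toward the trajectory.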
 The line-of-sight guidance control unit 400 recognizes the direction of an object around the own vehicle M with respect to the position of the driver, based on the information recognized by the object recognition device 54. Hereinafter, the direction of an object around the own vehicle M with respect to the position of the driver is simply referred to as the "object direction". The position of the driver is recognized, for example, from the image captured by the driver monitoring camera 90. The position of the driver is represented, for example, by the center of gravity of the driver's head. The position of the driver may be fixedly set in advance with respect to the vehicle. The line-of-sight guidance control unit 400 also recognizes the distance between the own vehicle M and objects around the own vehicle M. The line-of-sight guidance control unit 400 compares the object direction with the driver's line-of-sight direction recognized by the occupant condition monitoring unit 150. The line-of-sight guidance control unit 400 determines the operation of a line-of-sight guidance unit 530, described later, based on the result of comparing the object direction with the driver's line-of-sight direction and on the distance between the own vehicle M and the object.
 The line-of-sight guidance control unit 400 outputs the determined operation command for the line-of-sight guidance unit 530 to a helmet control unit 42 of a helmet 40, described later. Part of the line-of-sight guidance control unit 400 may be integrated into the recognition unit 320 of the automated driving control unit 300. The functions of the line-of-sight guidance control unit 400 are described in detail later.
 The traveling driving force output device 500 outputs, to the drive wheel, a traveling driving force (torque) for the own vehicle M to travel. The traveling driving force output device 500 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an ECU (Electronic Control Unit) that controls them. The ECU controls this configuration according to information input from the second control unit 350 or information input from the driving operator 80.
 The brake device 510 includes, for example, a brake caliper, a cylinder that transmits hydraulic pressure to the brake caliper, an electric motor that generates hydraulic pressure in the cylinder, and a brake ECU. The brake ECU controls the electric motor according to information input from the second control unit 350 or information input from the driving operator 80 so that a brake torque corresponding to the braking operation is output to each wheel. The brake device 510 may include, as a backup, a mechanism that transmits hydraulic pressure generated by operation of a brake lever or a brake pedal included in the driving operator 80 to the cylinder via a master cylinder. The brake device 510 is not limited to the configuration described above, and may be an electronically controlled hydraulic brake device that controls an actuator according to information input from the second control unit 350 to transmit the hydraulic pressure of the master cylinder to the cylinder.
 The steering device 520 includes, for example, a steering ECU and an electric motor. The electric motor changes the direction of the steered wheel (front wheel), for example. The steering ECU drives the electric motor according to information input from the second control unit 350 or information input from the driving operator 80 to change the direction of the steered wheel.
 The line-of-sight guidance unit 530 guides the driver's line of sight in a predetermined direction determined by the line-of-sight guidance control unit 400. The line-of-sight guidance unit 530 includes a light emitting unit 540 (light source), a warning sound generating unit 550, and a vibrating unit 560. The light emitting unit 540 and the warning sound generating unit 550 are provided on a helmet worn by the driver. The vibrating unit 560 is provided at a location on the vehicle that contacts the driver. Details of the light emitting unit 540, the warning sound generating unit 550, and the vibrating unit 560 are described later.
<Whole vehicle>
 Next, the arrangement of some of the components of the driving support system 1 of the present embodiment will be described together with the structure of the saddle-ride type vehicle and of a helmet worn by the driver of the saddle-ride type vehicle. Unless otherwise specified, directions such as front, rear, left, and right in the following description are the same as the directions in the vehicle described below. In the figures used in the following description, an arrow FR indicating the vehicle front, an arrow LH indicating the vehicle left, and an arrow UP indicating the vehicle upper side are shown where appropriate.
 FIG. 4 is a left side view showing the motorcycle of the first embodiment.
 As shown in FIG. 4, the motorcycle 10 is a saddle-ride type vehicle on which the driving support system 1 of the embodiment is mounted. The motorcycle 10 mainly includes a front wheel 11 that is a steered wheel, a rear wheel 12 that is a drive wheel, and a vehicle body frame 20 that supports a prime mover 13 (an engine in the illustrated example).
 The front wheel 11 is steerably supported by the vehicle body frame 20 via a steering mechanism. The steering mechanism includes a front fork 14 that supports the front wheel 11 and a steering stem 15 that supports the front fork 14. A steering handle 16 gripped by the driver J is attached to an upper portion of the steering stem 15. The front wheel 11 is braked by the brake device 510.
 The rear wheel 12 is supported at the rear end of a swing arm 17 that extends in the front-rear direction at the rear of the vehicle. The front end of the swing arm 17 is supported by the vehicle body frame 20 so as to be swingable up and down. The rear wheel 12 is braked by the brake device 510.
 The vehicle body frame 20 rotatably supports the steering stem 15 by a head pipe 21 provided at its front end. In addition to the prime mover 13 described above, the vehicle body frame 20 supports a seat 22 on which the driver J sits, left and right steps 23 on which the driver J rests his or her feet, a fuel tank 24 arranged in front of the seat 22, and the like. The fuel tank 24 is provided so that the driver J can grip it with the knees. A front cowl 25 supported by the vehicle body frame 20 is mounted on the front portion of the vehicle. A meter device 30 is arranged inside the front cowl 25.
 A vibrating unit 560 (see FIG. 1) is provided on at least one of the steering handle 16, the fuel tank 24, and the left and right steps 23. The vibrating unit 560 is controlled by the line-of-sight guidance control unit 400. The vibrating unit 560 transmits vibration to the driver J so that the driver J perceives the vibration.
<Helmet>
 FIG. 5 is a perspective view of the helmet of the first embodiment.
 As shown in FIG. 5, the helmet 40 is a full-face type helmet on which the driving support system 1 of the embodiment is mounted. The helmet 40 includes a helmet body 41 that covers the head of the driver J, the light emitting unit 540 and the warning sound generating unit 550 provided on the helmet body 41, the helmet control unit 42 that controls the light emitting unit 540 and the warning sound generating unit 550, and a power supply (not shown) that supplies electric power to the helmet control unit 42. The light emitting unit 540 and the warning sound generating unit 550 may each be built into the helmet 40 in advance, or may be devices retrofitted to the helmet 40. For example, the power supply is a secondary battery embedded in the helmet body 41.
 The helmet body 41 includes a cap body 43 as an outer shell member, an interior material (not shown) arranged inside the cap body 43, and a shield 44 that covers a front opening 45 of the cap body 43. The front opening 45 is provided in the front portion of the cap body 43 to secure the field of view of the wearer (driver J).
 The light emitting unit 540 is a light emitting element such as an LED (Light Emitting Diode). The light emitting unit 540 is arranged at a position where the wearer of the helmet 40 can visually recognize whether or not it is emitting light. In the present embodiment, the light emitting unit 540 is arranged at the opening edge of the front opening 45 of the cap body 43. The light emitting unit 540 is arranged so as to surround the line of sight of the wearer of the helmet 40 when that line of sight is directed straight ahead. The light emitting unit 540 includes an upper light emitting unit 541 arranged at the upper portion of the front opening 45, a right light emitting unit 542 arranged at the right portion of the front opening 45, and a left light emitting unit 543 arranged at the left portion of the front opening 45. The upper light emitting unit 541, the right light emitting unit 542, and the left light emitting unit 543 may be provided as members separate from one another, or may be provided integrally and formed so as to emit light independently of one another.
 The warning sound generating unit 550 is a directional speaker. The warning sound generating unit 550 emits a warning sound. The warning sound generating unit 550 is provided so that the wearer of the helmet 40 can perceive at least two directions, left and right, as distinct sources of the warning sound. In the present embodiment, the warning sound generating unit 550 can cause the wearer of the helmet 40 to perceive three distinct directions, right, left, and front, as the source of the warning sound.
 The helmet control unit 42 operates on electric power from the power supply (not shown). The helmet control unit 42 is provided so as to be able to communicate with the line-of-sight guidance control unit 400. The helmet control unit 42 acquires the operation command for the line-of-sight guidance unit 530 from the line-of-sight guidance control unit 400. Based on the acquired operation command for the light emitting unit 540, the helmet control unit 42 applies a voltage to the light emitting unit 540 to cause the light emitting unit 540 to emit light. Based on the acquired operation command for the warning sound generating unit 550, the helmet control unit 42 causes the warning sound generating unit 550 to emit a warning sound.
<Function of line-of-sight guidance control unit>
 The functions of the line-of-sight guidance control unit 400 according to the present embodiment will be described below with reference to FIGS. 6 to 10. This processing flow is repeatedly executed in a state in which the driver has an obligation to monitor the surroundings, that is, in a state in which no driving support is being executed (manual driving state) or in a state in which driving support of the first degree or the second degree is being executed.
 FIG. 6 is a flowchart showing the flow of processing by the line-of-sight guidance control unit. FIGS. 7 and 8 are diagrams showing scenes in which the line-of-sight direction and the object direction differ. FIG. 9 is a diagram for explaining the light emitting state of the light emitting unit, and is a front view of the helmet of the first embodiment. In FIG. 9, the shield 44 is not shown.
 As shown in FIG. 6, in step S10 the line-of-sight guidance control unit 400 determines whether the line-of-sight direction De and the object direction Do differ as viewed from the vertical direction. The line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do differ when the angle θ1 formed between the line-of-sight direction De and the object direction Do, as viewed from the vertical direction, is larger than a predetermined first angle (see FIG. 7). When the line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do differ as viewed from the vertical direction (S10: YES), it proceeds to step S20. When the line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do do not differ as viewed from the vertical direction (S10: NO), it outputs a command to the helmet control unit 42 to turn off the right light emitting unit 542 and the left light emitting unit 543 (step S50), and proceeds to step S60.
 In step S20, the line-of-sight guidance control unit 400 determines whether the line-of-sight direction De is directed to the right of the object direction Do. When the line-of-sight direction De is directed to the right of the object direction Do (S20: YES), the line-of-sight guidance control unit 400 proceeds to step S30. When the line-of-sight direction De is not directed to the right of the object direction Do (S20: NO), that is, when the line-of-sight direction De is directed to the left of the object direction Do, the line-of-sight guidance control unit 400 proceeds to step S40.
 In step S30, the line-of-sight guidance control unit 400 outputs a command to the helmet control unit 42 to cause the left light emitting unit 543 to emit light (see FIG. 9). The light emitting unit 540 thereby emits light on the same side as the object O with respect to the driver J's line of sight. At this time, the line-of-sight guidance control unit 400 causes the left light emitting unit 543 to light up, flash, or blink. "Flashing" means repeating a lit state of constant emission intensity (luminance) and an unlit state. "Blinking" means repeating lit and unlit states while varying the emission intensity. The driver J's attention is thereby drawn to the left side, and the driver J turns his or her line of sight toward the object O. The line-of-sight guidance control unit 400 then proceeds to step S60.
 In step S40, the line-of-sight guidance control unit 400 outputs a command to the helmet control unit 42 to cause the right light emitting unit 542 to emit light. The light emitting unit 540 thereby emits light on the same side as the object O with respect to the driver J's line of sight. At this time, the line-of-sight guidance control unit 400 causes the right light emitting unit 542 to light up, flash, or blink. The driver J's attention is thereby drawn to the right side, and the driver J turns his or her line of sight toward the object O. The line-of-sight guidance control unit 400 then proceeds to step S60.
 In step S60, the line-of-sight guidance control unit 400 determines whether the line-of-sight direction De and the object direction Do differ as viewed from the vehicle width direction. The line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do differ when the angle θ2 formed between the line-of-sight direction De and the object direction Do, as viewed from the vehicle width direction, is larger than a predetermined second angle (see FIG. 8). In particular, when the line-of-sight direction De is directed below the object direction Do as viewed from the vehicle width direction, the driver J may be looking at the meter device 30. When the line-of-sight guidance control unit 400 determines that the line-of-sight direction De and the object direction Do differ as viewed from the vehicle width direction (S60: YES), it proceeds to step S70. When the line-of-sight guidance control unit 400 determines that they do not differ as viewed from the vehicle width direction (S60: NO), it outputs a command to the helmet control unit 42 to turn off the upper light emitting unit 541 (step S80), and ends the series of processes.
 In step S70, the line-of-sight guidance control unit 400 outputs a command to the helmet control unit 42 to cause the upper light emitting unit 541 to emit light. The light emitting unit 540 thus emits light on the same side as the object O relative to the line of sight of the driver J. At this time, the line-of-sight guidance control unit 400 lights the upper light emitting unit 541 continuously, blinks it, or flickers it. As a result, the driver J's attention shifts upward, and if the driver J had been looking at the meter device 30, he or she turns his or her gaze toward the object O. The line-of-sight guidance control unit 400 then ends the process.
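The lateral and vertical checks of steps S30 through S80 can be summarized as a small decision routine. The sketch below is illustrative only: the function name, emitter labels, and the idea of returning a set are assumptions made for clarity and do not appear in the publication.

```python
def select_emitters(theta1, theta2, object_is_right,
                    first_angle, second_angle):
    """Choose which helmet emitters to light (illustrative sketch).

    theta1: plan-view angle between gaze and object direction (rad)
    theta2: side-view angle between gaze and object direction (rad)
    object_is_right: True if the object lies to the right of the gaze
    """
    lit = set()
    # Lateral check (steps S30-S50): light the emitter on the object's side.
    if theta1 > first_angle:
        lit.add("right" if object_is_right else "left")
    # Vertical check (steps S60-S80): light the upper emitter, e.g. when
    # the driver is looking down at the meter device.
    if theta2 > second_angle:
        lit.add("upper")
    return lit
```

An empty returned set corresponds to keeping the emitters off, as in steps S50 and S80.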
 Throughout the above series of processes, the line-of-sight guidance control unit 400 may change at least one of the emission intensity, emission color, and emission cycle of the light emitting unit 540 according to the distance between the own vehicle M and the object O. For example, the line-of-sight guidance control unit 400 outputs a command to the helmet control unit 42 so that the emission intensity increases as the distance between the own vehicle M and the object O decreases. Likewise, it may output a command to the helmet control unit 42 so that the blink cycle of the blinking or flickering light emitting unit 540 shortens as that distance decreases.
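The distance-dependent emission described above can be modeled as a simple mapping from distance to intensity and blink period. The linear ramp, the 30 m range, and the numeric bounds below are illustrative assumptions, not values from the publication.

```python
def emission_profile(distance_m, max_distance_m=30.0):
    """Map vehicle-object distance to an emission intensity (0..1) and a
    blink period in seconds: closer objects give brighter, faster blinking.
    All constants are illustrative assumptions."""
    closeness = max(0.0, min(1.0, 1.0 - distance_m / max_distance_m))
    intensity = 0.2 + 0.8 * closeness   # never fully dark while active
    period_s = 1.0 - 0.8 * closeness    # 1.0 s when far -> 0.2 s when close
    return intensity, period_s
```

With this mapping, a nearby object yields a bright, rapidly blinking cue, while a distant object yields a dim, slowly blinking one.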
 The line-of-sight guidance control unit 400 also outputs a signal to the helmet control unit 42 to emit a warning sound from the warning sound generation unit 550 in conjunction with the light emission of the light emitting unit 540. The line-of-sight guidance control unit 400 causes the helmet control unit 42 to control the warning sound generation unit 550 so that the warning sound is emitted from the position corresponding to the object direction Do relative to the driver J. In other words, the warning sound is emitted from the direction corresponding to the lit light emitting unit 540, as perceived by the wearer of the helmet 40. For example, under the condition that the right light emitting unit 542 is lit, the line-of-sight guidance control unit 400 causes the warning sound generation unit 550 to emit a warning sound such that the wearer of the helmet 40 perceives the sound as coming from the right side.
 The line-of-sight guidance control unit 400 also vibrates the vibrating unit 560, provided at a location the driver J contacts, in conjunction with the light emission of the light emitting unit 540. The vibrating unit 560 is vibrated at the position corresponding to the object direction Do relative to the driver J. In other words, the line-of-sight guidance control unit 400 vibrates the vibrating unit 560 that contacts whichever half of the driver J's body, right or left, corresponds to the lit light emitting unit 540, so that the driver J perceives the vibration. For example, under the condition that the right light emitting unit 542 is lit, the line-of-sight guidance control unit 400 vibrates the vibrating unit 560 that contacts the right half of the driver J's body (for example, the right step 23). Alternatively, the line-of-sight guidance control unit 400 may cause the driver J to perceive vibration on both sides while the light emitting unit 540 is lit, or may vibrate the vibrating unit 560 only when the distance between the own vehicle M and the object O is smaller than a predetermined distance.
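The coordinated sound and vibration cues described in the two paragraphs above can be sketched as one routine that derives both from the lit emitters and the object distance. The pan convention, the 10 m vibration threshold, and the actuator names below are assumptions for illustration only.

```python
def haptic_audio_cues(lit_emitters, distance_m, vibration_distance_m=10.0):
    """Derive a warning-sound pan and a list of vibration actuators from
    the set of lit emitters, as an illustrative sketch of the coordinated
    sound/vibration control. Pan is -1.0 (full left) .. +1.0 (full right)."""
    pan = 0.0
    if "right" in lit_emitters:
        pan = 1.0
    elif "left" in lit_emitters:
        pan = -1.0
    vibrate = []
    # Vibrate only when the object is close, per the optional behavior above.
    if distance_m < vibration_distance_m:
        if "right" in lit_emitters:
            vibrate.append("right_step")
        if "left" in lit_emitters:
            vibrate.append("left_step")
    return pan, vibrate
```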
 As described above, the driving support system 1 of the present embodiment includes the occupant state monitoring unit 150, which recognizes the line-of-sight direction De of the driver J; the line-of-sight guidance control unit 400, which recognizes the direction (object direction Do) of an object O around the own vehicle M relative to the driver J; and the light emitting unit 540, which emits light when the line-of-sight direction De and the object direction Do differ.
 With this configuration, because light is unaffected by noise or road-surface vibration, causing the light emitting unit 540 to emit light lets the driver J recognize more effectively that an object O lies in a direction different from the line-of-sight direction De, compared with conventional configurations that rely on sound and vibration. The driver J can therefore be informed of the direction to which attention should be directed.
 The driving support system 1 also causes the light emitting unit 540 to emit light when the line-of-sight direction De and the object direction Do differ as viewed in the vertical direction.
 With this configuration, the light emitting unit 540 emits light when the driver J's line of sight is directed somewhere other than the object O around the own vehicle M. The driver J can therefore be made aware that the object O lies in a direction different from the line-of-sight direction De.
 The driving support system 1 also causes the light emitting unit 540 to emit light when the line-of-sight direction De and the object direction Do differ as viewed in the vehicle width direction.
 For example, because the meter device 30 is closer to the driver J than objects O around the own vehicle M, the driver's line of sight points farther downward when looking at the meter device 30 than when looking at such an object O. With the above configuration, the light emitting unit 540 therefore emits light when, for example, the driver J's line of sight is directed at the meter device 30, making the driver J aware that the object O lies in a direction different from the line-of-sight direction De.
 The driving support system 1 also causes the light emitting unit 540 to emit light when the angle θ1 or θ2 between the line-of-sight direction De and the object direction Do is equal to or greater than a predetermined angle.
 This configuration suppresses frequent light emission in situations where the driver J's line of sight deviates only slightly from the object O and the driver J already recognizes the object O.
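The angles θ1 and θ2 can be computed from a gaze direction vector and an object direction vector as yaw and pitch differences. A minimal sketch follows, assuming a coordinate frame with x forward, y left, and z up; the function names and threshold values are illustrative assumptions, not from the publication.

```python
import math

def wrap_angle(a: float) -> float:
    """Wrap an angle difference to [0, pi]."""
    a = abs(a) % (2 * math.pi)
    return 2 * math.pi - a if a > math.pi else a

def gaze_object_angles(gaze, obj):
    """Return (theta1, theta2): the plan-view (yaw) and side-view (pitch)
    angles between a gaze vector and an object-direction vector.
    Vectors are (x, y, z) with x forward, y left, z up (assumed frame)."""
    yaw_g = math.atan2(gaze[1], gaze[0])
    yaw_o = math.atan2(obj[1], obj[0])
    theta1 = wrap_angle(yaw_g - yaw_o)
    pitch_g = math.atan2(gaze[2], math.hypot(gaze[0], gaze[1]))
    pitch_o = math.atan2(obj[2], math.hypot(obj[0], obj[1]))
    theta2 = abs(pitch_g - pitch_o)
    return theta1, theta2

def needs_guidance(gaze, obj,
                   first_angle=math.radians(15),
                   second_angle=math.radians(10)):
    """True when either angle exceeds its (illustrative) threshold."""
    theta1, theta2 = gaze_object_angles(gaze, obj)
    return theta1 > first_angle or theta2 > second_angle
```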
 The light emitting unit 540 also changes at least one of its emission intensity, emission color, and emission cycle according to the distance between the own vehicle M and the object O.
 With this configuration, the light emission form of the light emitting unit 540 can be varied according to how strongly the driver J's line of sight needs to be guided, so the driver J can be informed of the direction to be recognized more reliably.
 The light emitting unit 540 is also provided on the helmet 40 and, when the line-of-sight direction De and the object direction Do differ, emits light on the same side as the object relative to the line of sight of the driver J.
 With this configuration, by directing the line of sight toward the lit portion of the light emitting unit 540, the driver J shifts the line of sight toward the object O. The driver J's line of sight is thereby guided so that the driver J recognizes the position of the object O.
 The line-of-sight guidance unit 530 also includes the vibrating unit 560, which vibrates in conjunction with the light emission of the light emitting unit 540 and transmits the vibration to the driver J.
 This configuration makes the driver J recognize more reliably that the object O lies in a direction different from the line-of-sight direction De.
 The vibrating unit 560 also vibrates at the position corresponding to the object direction Do relative to the driver J.
 With this configuration, by directing the line of sight toward the vibrating side, the driver J shifts the line of sight toward the object O. The driver J's line of sight is thereby guided so that the driver J recognizes the position of the object O.
 The vibrating unit 560 is also provided on at least one of the steering handle 16, the fuel tank 24, and the step 23.
 Because the vibrating unit 560 is thus arranged at a portion in contact with the driver J, its vibration can be transmitted to the driver J effectively.
 The line-of-sight guidance unit 530 also includes the warning sound generation unit 550, which emits a warning sound when the distance between the own vehicle M and the object O is smaller than a predetermined distance.
 With this configuration, the warning sound is emitted when the need to guide the driver J's line of sight exceeds a predetermined level, so the driver J can be made aware more reliably that the object O lies in a direction different from the line-of-sight direction De.
 The warning sound generation unit 550 also emits the warning sound from the position corresponding to the object direction Do relative to the driver J.
 With this configuration, by directing the line of sight toward the sound source, the driver J shifts the line of sight toward the object O. The driver J's line of sight is thereby guided so that the driver J recognizes the position of the object O.
 The warning sound generation unit 550 is also provided on the helmet 40.
 This configuration lets the driver J perceive the warning sound effectively.
 By making the warning sound generation unit 550 a retrofit device for the helmet 40, it can be added to an existing helmet; the same applies to the light emitting unit 540. The driving support system 1, which provides the effects described above, can therefore be introduced easily.
(Second Embodiment)
 In the embodiment above, the light emitting unit 540 of the line-of-sight guidance unit 530 is provided on the helmet 40, but the invention is not limited to this arrangement: the light emitting unit 540 of the line-of-sight guidance unit 530 may instead be provided on the meter device 30. A configuration in which the light emitting unit 540 is provided on the meter device 30 is described below. Except as described below, the configuration is the same as in the embodiment above.
 FIG. 10 is a front view of the meter device of the second embodiment.
 As shown in FIG. 10, the meter device 30 of the second embodiment includes a display unit 31, on which a vehicle speed meter 32, a tachometer 33, and the like are arranged, and a frame portion 38 that surrounds the display unit 31 in a front view from the rear. In addition to the vehicle speed meter 32 and the tachometer 33, the display unit 31 carries a fuel gauge 34, a water temperature gauge 35, indicator lamps 36, an information display section 37, and the like. In the illustrated example, the vehicle speed meter 32, the tachometer 33, the fuel gauge 34, and the water temperature gauge 35 are analog instruments whose pointers rotate to indicate a scale, but they may be digital.
 The frame portion 38 forms the rim of the meter device 30 and is provided with the light emitting unit 540, which is arranged so as to surround the display unit 31. The light emitting unit 540 includes an upper light emitting unit 541 above the display unit 31, a right light emitting unit 542 on the right of the display unit 31, and a left light emitting unit 543 on the left of the display unit 31. The upper light emitting unit 541, the right light emitting unit 542, and the left light emitting unit 543 may be provided as separate members, or may be formed integrally so as to emit light independently of one another. The light emitting unit 540 is controlled by the line-of-sight guidance control unit 400.
 In the present embodiment, the line-of-sight guidance control unit 400 determines whether the driver's line of sight is directed at the meter device 30 based on the line-of-sight direction information recognized by the occupant state monitoring unit 150. When it determines that the driver's line of sight is directed at the meter device 30, the line-of-sight guidance control unit 400 performs the processing of steps S10 to S90 of the first embodiment described above.
 With this configuration, when the driver J's line of sight is directed at the meter device 30, the driver J shifts the line of sight toward the object O by directing it toward the lit portion of the light emitting unit 540. The driver J's line of sight can therefore be guided so that the driver J recognizes the position of the object O more reliably.
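The gating applied in the second embodiment, where guidance runs only while the driver's gaze rests on the meter device, can be sketched as follows. The function name, emitter labels, and thresholds are illustrative assumptions, not from the publication.

```python
def meter_mounted_guidance(gaze_on_meter, theta1, theta2,
                           object_is_right, first_angle, second_angle):
    """Second-embodiment sketch: run the emitter selection only when the
    driver's gaze is judged to be on the meter device."""
    if not gaze_on_meter:
        return set()
    lit = set()
    # Lateral and vertical checks, as in steps S10 to S90 above.
    if theta1 > first_angle:
        lit.add("right" if object_is_right else "left")
    if theta2 > second_angle:
        lit.add("upper")
    return lit
```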
 The present invention is not limited to the embodiments described above with reference to the drawings, and various modifications are conceivable within its technical scope.
 For example, although the embodiments above describe the application of the driving support system 1 to a motorcycle, the invention is not limited to this. The saddle-ride type vehicles to which the driving support system 1 can be applied include all vehicles on which a driver wearing a helmet rides astride the vehicle body: not only motorcycles but also three-wheeled vehicles, both those with one front wheel and two rear wheels and those with two front wheels and one rear wheel.
 The driving support system 1 of the embodiments above can execute so-called automated driving, but the invention is not limited to this; the driving support system of the present invention may also be applied to a vehicle that always requires operation by the driver while traveling.
 Further, although a full-face helmet 40 is described above as the helmet on which the light emitting unit 540 of the line-of-sight guidance unit 530 is provided, the invention is not limited to this; the light emitting unit 540 may be provided on various types of helmet, such as jet, flip-up, and off-road types.
 In the embodiments above, the driver monitoring camera 90 is arranged at the front of the vehicle, but the invention is not limited to this. The driver monitoring camera may comprise a camera arranged at the front of the vehicle to detect the orientation of the driver's head and a camera arranged on the helmet to detect the direction of the driver's line of sight.
 In the embodiments above, the warning sound generation unit 550 is provided on the helmet 40, but the invention is not limited to this; the warning sound generation unit may be provided on the motorcycle 10. Further, the line-of-sight guidance unit 530 need only include at least the light emitting unit 540 and may omit the warning sound generation unit 550 and the vibrating unit 560.
 In the embodiments above, the vibrating unit 560 is provided on the motorcycle 10, but the invention is not limited to this; the vibrating unit 560 need only be provided at a location that contacts the driver J and may instead be provided on the helmet 40.
 In addition, the components of the embodiments described above may be replaced with well-known components as appropriate without departing from the spirit of the present invention.
 16 Steering handle
 23 Step
 24 Fuel tank
 30 Meter device
 40 Helmet
 150 Occupant state monitoring unit (line-of-sight direction recognition unit)
 400 Line-of-sight guidance control unit (object direction recognition unit)
 540 Light emitting unit (light source)
 550 Warning sound generation unit
 560 Vibrating unit
 De Line-of-sight direction
 Do Object direction
 J Driver
 θ1, θ2 Angles

Claims (14)

  1.  A driving assistance system for a saddle-ride type vehicle, comprising:
     a line-of-sight direction recognition unit (150) that recognizes a line-of-sight direction (De) of a driver (J);
     an object direction recognition unit (400) that recognizes a direction (Do), relative to the driver (J), of an object (O) around the own vehicle; and
     a light source (540) that emits light when the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ.
  2.  The driving assistance system for a saddle-ride type vehicle according to claim 1, wherein the light source (540) is caused to emit light when the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ as viewed from a direction orthogonal to both the vehicle front-rear direction and the vehicle width direction.
  3.  The driving assistance system for a saddle-ride type vehicle according to claim 1 or 2, wherein the light source (540) is caused to emit light when the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ as viewed in the vehicle width direction.
  4.  The driving assistance system for a saddle-ride type vehicle according to any one of claims 1 to 3, wherein the light source (540) is caused to emit light when an angle (θ1, θ2) between the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) is equal to or greater than a predetermined angle.
  5.  The driving assistance system for a saddle-ride type vehicle according to any one of claims 1 to 4, wherein the light source (540) changes at least one of an emission intensity, an emission color, and an emission cycle according to a distance between the vehicle and the object (O).
  6.  The driving assistance system for a saddle-ride type vehicle according to any one of claims 1 to 5, wherein the light source (540) is provided on a helmet (40) and, when the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ, emits light on the same side as the object (O) relative to the line of sight of the driver (J).
  7.  The driving assistance system for a saddle-ride type vehicle according to any one of claims 1 to 5, wherein the light source (540) is provided on a meter device (30) and, when the line of sight of the driver (J) is directed at the meter device (30) and the line-of-sight direction (De) and the direction (Do) of the object (O) relative to the driver (J) differ, emits light on the same side as the object (O) relative to the line of sight of the driver (J).
  8.  The driving assistance system for a saddle-ride type vehicle according to any one of claims 1 to 7, further comprising a vibrating unit (560) that vibrates in conjunction with light emission of the light source (540) and transmits the vibration to the driver (J).
  9.  The driving assistance system for a saddle-ride type vehicle according to claim 8, wherein the vibrating unit (560) vibrates at a position corresponding to the direction of the object (O) relative to the driver (J).
  10.  The driving assistance system for a saddle-ride type vehicle according to claim 8 or 9, wherein the vibrating unit (560) is provided on at least one of a helmet (40), a steering handle (16), a fuel tank (24), and a step (23).
  11.  The driving assistance system for a saddle-ride type vehicle according to any one of claims 1 to 10, further comprising a warning sound generation unit (550) that emits a warning sound when a distance between the own vehicle and the object (O) is smaller than a predetermined distance.
  12.  The driving assistance system for a saddle-ride type vehicle according to claim 11, wherein the warning sound generation unit (550) emits the warning sound from a position corresponding to the direction (Do) of the object (O) relative to the driver (J).
  13.  The driving assistance system for a saddle-ride type vehicle according to claim 11 or 12, wherein the warning sound generation unit (550) is a retrofit device for a helmet (40).
  14.  The driving assistance system for a saddle-ride type vehicle according to claim 11 or 12, wherein the warning sound generation unit (550) is provided on a helmet (40).
PCT/JP2019/013690 2019-03-28 2019-03-28 Driving assistance system for saddle-ride type vehicles WO2020194686A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021508616A JP7232320B2 (en) 2019-03-28 2019-03-28 Driving support system for saddle type vehicle
PCT/JP2019/013690 WO2020194686A1 (en) 2019-03-28 2019-03-28 Driving assistance system for saddle-ride type vehicles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/013690 WO2020194686A1 (en) 2019-03-28 2019-03-28 Driving assistance system for saddle-ride type vehicles

Publications (1)

Publication Number Publication Date
WO2020194686A1 true WO2020194686A1 (en) 2020-10-01

Family

ID=72611256

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/013690 WO2020194686A1 (en) 2019-03-28 2019-03-28 Driving assistance system for saddle-ride type vehicles

Country Status (2)

Country Link
JP (1) JP7232320B2 (en)
WO (1) WO2020194686A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007031875A (en) * 2005-07-26 2007-02-08 Yamaha Corp Direction information communicator
JP2008186281A (en) * 2007-01-30 2008-08-14 Toyota Motor Corp Alarm display for vehicle
JP2009042896A (en) * 2007-08-07 2009-02-26 Yamaha Motor Co Ltd Attention information presentation system and motorcycle
JP2010083205A (en) * 2008-09-29 2010-04-15 Denso Corp Device for supporting recognition of collision warning vehicle

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012221487A (en) * 2012-02-02 2012-11-12 Pioneer Electronic Corp Ambient condition detection system of mobile body
JP5999032B2 (en) * 2013-06-14 2016-09-28 株式会社デンソー In-vehicle display device and program


Also Published As

Publication number Publication date
JP7232320B2 (en) 2023-03-02
JPWO2020194686A1 (en) 2021-12-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19920661

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021508616

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19920661

Country of ref document: EP

Kind code of ref document: A1