
WO2022227766A1 - Method and device for traffic anomaly detection - Google Patents

Method and device for traffic anomaly detection

Info

Publication number
WO2022227766A1
WO2022227766A1 · PCT/CN2022/075071
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
target vehicle
traffic
abnormal
speed
Prior art date
Application number
PCT/CN2022/075071
Other languages
English (en)
French (fr)
Inventor
吴文灏 (Wu Wenhao)
赵禹翔 (Zhao Yuxiang)
Original Assignee
北京百度网讯科技有限公司 (Beijing Baidu Netcom Science and Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京百度网讯科技有限公司 (Beijing Baidu Netcom Science and Technology Co., Ltd.)
Priority to JP2022561125A priority Critical patent/JP2023527265A/ja
Priority to US17/963,058 priority patent/US20230036864A1/en
Publication of WO2022227766A1 publication Critical patent/WO2022227766A1/zh

Classifications

    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/54: Surveillance or monitoring of traffic, e.g. cars on the road, trains or boats
    • G06F 18/2433: Single-class perspective, e.g. one-against-all classification; novelty detection; outlier detection
    • G06V 10/762: Image or video recognition using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G06V 10/763: Non-hierarchical clustering techniques, e.g. based on statistics of modelling distributions
    • G06V 20/40: Scenes; scene-specific elements in video content
    • G08G 1/0116: Measuring and analyzing of parameters relative to traffic conditions based on data from roadside infrastructure, e.g. beacons
    • G08G 1/0133: Traffic data processing for classifying traffic situation
    • G08G 1/0175: Detecting movement of traffic, identifying vehicles by photographing them, e.g. when violating traffic rules
    • G08G 1/04: Detecting movement of traffic using optical or ultrasonic detectors
    • G08G 1/052: Detecting movement of traffic with provision for determining speed or overspeed
    • G08G 1/056: Detecting movement of traffic with provision for distinguishing direction of travel
    • G08G 1/164: Anti-collision systems, centralised, e.g. external to vehicles
    • G06V 2201/08: Indexing scheme: detecting or categorising vehicles

Definitions

  • the present disclosure relates to the field of artificial intelligence, in particular to computer vision and deep learning technologies, and can be applied to video analysis scenarios, in particular to a method and device for traffic anomaly detection.
  • Anomaly detection in traffic scenarios aims to identify vehicles behaving abnormally on the main road, for example crashing, stalling, leaving the main road, or driving in the opposite direction.
  • Traditional video traffic monitoring relies on manual judgment by observers, which consumes a lot of manpower and is inefficient.
  • Visual anomaly detection systems instead use computer vision to detect abnormal vehicles.
  • the present disclosure provides a method, apparatus, device, storage medium, and computer program product for traffic anomaly detection.
  • a method for detecting traffic anomalies, comprising: acquiring at least two frames of continuous traffic images; respectively identifying the position of a target vehicle from the at least two frames of continuous traffic images to obtain a position-information set; determining the traveling direction and speed of the target vehicle according to the position-information set; and comparing the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
  • an apparatus for traffic abnormality detection, comprising: an acquisition unit configured to acquire at least two consecutive frames of traffic images; an identification unit configured to respectively identify the position of the target vehicle from the at least two consecutive frames of traffic images to obtain a set of position information; a determining unit configured to determine the traveling direction and speed of the target vehicle according to the set of position information; and a detection unit configured to compare the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
  • an electronic device, comprising: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method of any one of the first aspects.
  • a non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are used to cause the computer to perform the method according to any one of the first aspects.
  • a computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of the first aspects.
  • a vehicle vector field is obtained through vehicle tracking, and an abnormal vehicle is judged according to the vehicle vector field.
  • the analysis of wrong-way driving, collision, stall and breakdown situations can be realized. This avoids the massive computing demand of applying deep learning networks globally, and improves current abnormality analysis for vehicle monitoring.
  • FIG. 1 is an exemplary system architecture diagram to which an embodiment of the present disclosure may be applied;
  • FIG. 2 is a flowchart of one embodiment of a method for traffic anomaly detection according to the present disclosure
  • FIG. 3 is a schematic diagram of an application scenario of the method for traffic anomaly detection according to the present disclosure.
  • FIG. 4 is a flowchart of yet another embodiment of a method for traffic anomaly detection according to the present disclosure
  • FIG. 5 is a schematic diagram of yet another application scenario of the method for traffic anomaly detection according to the present disclosure.
  • FIG. 6 is a schematic structural diagram of an embodiment of an apparatus for detecting traffic anomalies according to the present disclosure
  • FIG. 7 is a block diagram of an electronic device used to implement the method of traffic abnormality detection according to an embodiment of the present disclosure.
  • FIG. 1 shows an exemplary system architecture 100 to which embodiments of the method or apparatus for traffic anomaly detection of the present application may be applied.
  • the system architecture 100 may include cameras 101 , 102 , 103 , a network 104 and a server 105 .
  • the network 104 is the medium used to provide the communication link between the cameras 101 , 102 , 103 and the server 105 .
  • the network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
  • the user can use the cameras 101, 102, 103 to interact with the server 105 through the network 104 to receive or send messages and the like.
  • the cameras 101, 102, and 103 generally refer to cameras used for vehicle monitoring that can identify vehicle information (e.g., license plate number, model). A camera may be an electronic-police camera that captures vehicles committing violations at an intersection (for example, crossing a solid lane line, driving in the opposite direction, occupying a non-motorized-vehicle lane, not following a guide sign, or running a red light). It may also be a checkpoint camera installed on key sections of expressways, provincial roads and national roads to capture speeding. The cameras 101, 102 and 103 may also be illegal-parking capture cameras, traffic monitoring cameras, Skynet monitoring cameras, mobile capture cameras, and the like.
  • the server 105 may be a server that provides various services, for example, a background analysis server that provides analysis on the vehicle data collected on the cameras 101 , 102 , and 103 .
  • the background analysis server can analyze and process the received vehicle data, and output the processing results (for example, abnormal situations such as wrong-way driving, collision, stall, and breakdown).
  • the server may be hardware or software.
  • the server can be implemented as a distributed server cluster composed of multiple servers, or as a single server.
  • the server can be implemented as multiple software or software modules (for example, multiple software or software modules for providing distributed services), or can be implemented as a single software or software module.
  • the server can also be a server of a distributed system, or a server combined with a blockchain.
  • the server can also be a cloud server, or an intelligent cloud computing server or intelligent cloud host with artificial intelligence technology.
  • the traffic abnormality detection method provided in the embodiment of the present application is generally executed by the server 105 , and accordingly, the traffic abnormality detection device is generally set in the server 105 .
  • the numbers of cameras, networks and servers in FIG. 1 are merely illustrative; there can be any number of cameras, networks and servers depending on implementation needs.
  • the method for detecting traffic anomalies includes the following steps:
  • Step 201 Acquire at least two consecutive frames of traffic images.
  • the execution body of the traffic abnormality detection method may receive at least two consecutive frames of traffic images from the camera through a wired connection or a wireless connection.
  • the acquired traffic images are captured by the same camera at the same location and at the same angle. Because the driving direction and speed of the vehicle need to be judged through images, at least two frames of images are required to determine the position change of the vehicle.
  • a video can be obtained, which includes multiple frames of images.
  • the frame rate of the camera is generally 25 to 30 frames per second. The time interval between different traffic images can be determined according to the frame rate.
  • Step 202 Identify the position of the target vehicle from at least two consecutive frames of traffic images to obtain a set of position information.
  • a multi-target tracking algorithm in the prior art can be used to identify vehicles in a traffic image.
  • the target vehicle here refers to the vehicle that can be tracked and identified. Usually there will be multiple vehicles in a traffic image, so there will be multiple target vehicles.
  • the target vehicle can be detected in each traffic image, framed by a detection frame, and the position of the center point of the detection frame can be used as the position of the target vehicle. These center points are connected to form the trajectory of the vehicle, which can be represented by a set of position information. Each position information is the coordinate position of the center point of the detection frame in the image.
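The center-point trajectory described above can be sketched in Python. The function names and the (x1, y1, x2, y2) pixel-box format are illustrative assumptions; the multi-target tracker that produces the detection boxes is assumed to exist elsewhere:

```python
def box_center(box):
    """Center point of an axis-aligned detection box given as (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def trajectory(boxes_per_frame):
    """Position-information set: one detection-box center per frame for one tracked vehicle."""
    return [box_center(b) for b in boxes_per_frame]
```

Connecting the returned points in frame order yields the vehicle trajectory used in the following steps.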
  • Step 203 Determine the traveling direction and speed of the target vehicle according to the location information set.
  • the driving route of the target vehicle can be determined according to the position change of the center point of the detection frame.
  • the mapping relationship between the pixel distance and the actual distance is determined according to the known camera parameters of the camera.
  • the actual moving distance of the target vehicle can be calculated according to the moving distance of the center point of the detection frame and the mapping relationship.
  • the speed of the target vehicle can be calculated by calculating the travel time of the target vehicle according to the frame rate.
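The speed calculation in the steps above can be sketched as follows. A constant meters-per-pixel factor stands in for the real pixel-to-ground mapping (which in practice comes from camera calibration and varies across the image), and all names are illustrative:

```python
import math

def estimate_speed(positions, meters_per_pixel, fps):
    """Estimate average speed (m/s) from consecutive detection-box centers.

    positions: [(x, y), ...] pixel coordinates, one per frame.
    meters_per_pixel: simplified pixel-to-ground mapping (assumption; a real
    system derives this from camera parameters).
    fps: camera frame rate, used to convert the frame count to travel time.
    """
    if len(positions) < 2:
        return 0.0
    pixel_dist = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        pixel_dist += math.hypot(x1 - x0, y1 - y0)
    travel_time = (len(positions) - 1) / fps  # seconds spanned by the frames
    return pixel_dist * meters_per_pixel / travel_time
```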
  • Step 204 Compare the traveling direction and speed of the target vehicle with the pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
  • the vehicle vector field can be automatically generated according to historical data or manually calibrated.
  • the vehicle vector field includes the direction and speed of the vehicle's travel.
  • the direction of the vehicle vector field is shown in Figure 3(a).
  • the speed may be the average speed of a large number of vehicles that have passed the segment of the road captured by the camera. Compare the current detected travel direction of the target vehicle with the direction of the vehicle vector field. If it is the opposite, it means that the target vehicle is traveling in the wrong direction.
  • the currently detected speed of the target vehicle can also be compared with the speed of the vehicle vector field. If it exceeds a certain range, the target vehicle is abnormal.
  • For example, the speed of the target vehicle is abnormal when it is less than 0.5 times or greater than 1.5 times the speed of the vehicle vector field. If the vector-field speed is 60 km/h, the target vehicle's speed is abnormal when it is less than 30 km/h (abnormally low) or greater than 90 km/h (abnormally high). If the number of vehicles with an abnormally low speed reaches a predetermined vehicle threshold, the road section is congested, and a prompt message can be output to monitoring personnel to further check whether there is a traffic accident.
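The 0.5x/1.5x comparison can be expressed as a small helper; the factors are the example values from the text and the function name is an assumption:

```python
def speed_anomaly(speed, field_speed, low=0.5, high=1.5):
    """Compare a vehicle's speed against the vector-field speed for its road.

    Returns "low" for an abnormally low speed, "high" for an abnormally high
    speed, or None when the speed is within the normal band.
    """
    if speed < low * field_speed:
        return "low"
    if speed > high * field_speed:
        return "high"
    return None
```

With a 60 km/h vector-field speed this flags 25 km/h as abnormally low and 95 km/h as abnormally high, matching the example above.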
  • vehicle owner information can be obtained through license plate recognition. In addition to identity information, it can also include the person's contact information and that of their family, medical history (such as asthma or heart disease), and other related information. The traffic control department can also be notified for processing: it can find the vehicle according to the location and arrange a tow truck if the vehicle is faulty. If the owner has fallen ill, first aid can be given based on the pre-identified medical history and family members can be contacted.
  • abnormal vehicles can be quickly and accurately determined, and relevant information can be provided for subsequent processing, so that abnormal vehicles can be resolved quickly.
  • comparing the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal includes: comparing the traveling direction of the target vehicle with the direction of the vehicle vector field, and if the included angle exceeds a predetermined threshold, determining that the target vehicle's trajectory is abnormal. An angle above the threshold usually means the vehicle has changed lanes. To filter out normal lane-change overtaking, multiple frames can be monitored continuously; if the resulting trajectory is not a normal lane-change trajectory, the vehicle is abnormal. When the included angle is about 180 degrees, the vehicle is driving the wrong way. Anomaly detection can thus be performed without manually marking the driving direction, which improves the flexibility of detection; the method applies to any unlabeled road segment.
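The included-angle comparison can be sketched as follows; the 45-degree default is an illustrative assumption, since the disclosure only specifies a "predetermined threshold":

```python
import math

def angle_between(v1, v2):
    """Unsigned angle in degrees between two 2-D direction vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp against rounding
    return math.degrees(math.acos(cos_a))

def trajectory_abnormal(vehicle_dir, field_dir, threshold_deg=45.0):
    """Abnormal trajectory if the vehicle deviates from the vector field by
    more than the threshold; an angle near 180 degrees indicates wrong-way driving."""
    return angle_between(vehicle_dir, field_dir) > threshold_deg
```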
  • if the number of vehicles with abnormal trajectories at the same location exceeds two, it is determined that a vehicle collision has occurred.
  • the number of vehicles with abnormal trajectories at the same position in the same set of images exceeds 2, indicating that at least 2 vehicles collided, as shown in (c) in Figure 3.
  • It can quickly and accurately detect traffic accidents, identify the license plate of the accident vehicle, and notify the relevant insurance company for processing.
  • the relevant video is directly retained as evidence for reporting the case, so the traffic police do not need to review the video from beginning to end to find the time of the accident.
  • the technical solution of the present disclosure can directly locate the time of the accident, which makes it convenient for the traffic control department to quickly investigate the cause of the accident and apportion responsibility, improving the efficiency of handling traffic accidents.
  • FIG. 3 is a schematic diagram of an application scenario of the method for traffic abnormality detection according to this embodiment.
  • the vehicle vector field under normal conditions is shown in Figure (a).
  • the white area in the figure is a road, and it can be seen that the driving directions of the vehicles on the two roads are the same.
  • Figure (b) shows a vehicle that has driven off the road. Whether the vehicle has left the road can be determined according to whether the position of the vehicle's detection frame is within the road area. A vehicle that drives off the road is considered to have had an accident.
  • Figure (c) shows the vehicle trajectories when two vehicles collide; the travel directions of the two vehicles deviate greatly from the vector field.
  • Figure (d) shows a vehicle that stalls or breaks down. A stall can be determined when the number of frames in which the detection frame overlaps itself exceeds a predetermined frame-number threshold (i.e., the vehicle is not moving) and the vehicle is within the road area; the road-area check avoids false detection of vehicles in a parking lot.
  • the flow 400 of the method for detecting traffic anomalies includes the following steps:
  • Step 401 Acquire traffic videos of a predetermined time length.
  • the execution body of the method for traffic abnormality detection may receive a traffic video with a predetermined length of time from a camera through a wired connection or a wireless connection.
  • the length of time is related to traffic volume at collection time: during peak traffic periods a short collection suffices, while during off-peak periods a longer collection is needed.
  • Step 402 Create a matrix with the same size as a frame of the traffic video.
  • each element in the matrix represents a pixel on the screen, and the initial value of each element is zero. For example, if the video resolution is 900*1024, an all-zero matrix of 900*1024 is created.
  • Step 403 Track and detect the vehicles in the traffic video, and set the elements corresponding to the pixels in each vehicle's detection frame to a non-zero value.
  • the pixels in the detection frame are marked as roads, and the elements in the corresponding positions of the matrix can be set to non-zero values, for example, 1.
  • the initial matrix is therefore an all-zero matrix.
  • Step 404 Determine the pixel point corresponding to the non-zero value in the matrix as a road to obtain a road area.
  • the non-zero value pixel points determined in step 403 are connected to form a road area.
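Steps 402-404 can be sketched as follows. Plain nested lists stand in for the pixel matrix (a real implementation would use an image array), and the function name is illustrative:

```python
def build_road_mask(frame_shape, detection_boxes):
    """Build the road-area matrix from tracked detection boxes (steps 402-404).

    frame_shape: (height, width) of the video frame.
    detection_boxes: iterable of (x1, y1, x2, y2) pixel boxes collected over
    the whole video. Every element starts at zero; pixels covered by any
    detection box are set to 1 and taken to be road.
    """
    h, w = frame_shape
    mask = [[0] * w for _ in range(h)]  # step 402: all-zero matrix
    for x1, y1, x2, y2 in detection_boxes:
        for y in range(max(0, y1), min(h, y2)):  # step 403: mark box pixels
            for x in range(max(0, x1), min(w, x2)):
                mask[y][x] = 1
    return mask  # step 404: non-zero elements form the road area
```

Because only pixels actually covered by passing vehicles become road, areas such as parking lots never enter the mask.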
  • the road area may be re-determined according to steps 401-404 when the abnormality detection rate rises to a predetermined threshold. The camera angle may shift slightly due to strong wind or other causes, making abnormality judgments based on the originally determined road area inaccurate and leading to the sudden detection of a large number of abnormal vehicles (for example, a large number of vehicles apparently leaving the road); in that case the road area needs to be corrected. Only the corrected road area allows accurate detection of vehicles in the currently captured images.
  • the road is determined by actually passing vehicles instead of by a road-recognition model, because such a model may mistakenly detect a parking lot as a road.
  • the method can relabel roads at any time as needed. Compared with manual annotation, the road area can be updated in time according to the current situation to avoid false detection.
  • FIG. 5 is a schematic diagram of an application scenario of the method for traffic abnormality detection according to this embodiment.
  • Figure (a) shows the traffic image captured by the camera.
  • Figure (b) shows that the vehicle is detected by the tracking algorithm, and the vehicle is framed by the detection frame.
  • Figure (c) shows that the pixels in the area where the detection frame is located are filled with white.
  • Figure (d) shows that the white area is set as the road area and all other areas are set as the non-road area, shown uniformly in black.
  • the method further includes: obtaining a vehicle trajectory set according to a position change of the detection frame of the vehicle.
  • the vehicle trajectory set is clustered and analyzed to generate a vehicle vector field.
  • Cluster analysis can be performed with common clustering algorithms such as k-means.
  • the final clustering result groups trajectories by lane, and each trajectory includes a direction. Based on the trajectories of each lane, the average speed of vehicles passing through that lane is calculated. Speed and direction make up the vector field.
  • the vector field obtained in this way is more convenient and faster than manual labeling, and it can be updated in real time according to the current situation. For example, for reasons such as congestion, the direction of some lanes is temporarily reversed (so-called tidal lanes). The method of the present disclosure can update the vector field in time without misjudgment.
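The clustering step can be sketched with a tiny standard-library k-means over mean velocity vectors. The naive initialization and the helper names are illustrative assumptions; the text only specifies "common clustering algorithms such as k-means":

```python
def mean_velocity(traj, fps):
    """Average displacement per second (pixels/s) of one trajectory [(x, y), ...]."""
    (x0, y0), (xn, yn) = traj[0], traj[-1]
    t = (len(traj) - 1) / fps
    return ((xn - x0) / t, (yn - y0) / t)

def kmeans_2d(points, k, iters=20):
    """Minimal k-means on 2-D points; returns (centroids, labels).

    Naive initialization (first k points) for illustration only; production
    code would use a library implementation with better seeding.
    """
    cents = points[:k]
    labels = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):  # assignment step
            labels[i] = min(range(k),
                            key=lambda j: (p[0] - cents[j][0]) ** 2
                                        + (p[1] - cents[j][1]) ** 2)
        for j in range(k):  # update step
            members = [p for i, p in enumerate(points) if labels[i] == j]
            if members:
                cents[j] = (sum(x for x, _ in members) / len(members),
                            sum(y for _, y in members) / len(members))
    return cents, labels
```

Clustering the mean velocity vectors of all trajectories groups them by lane; each cluster centroid then gives that lane's direction and average speed, which together form the vector field.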
  • comparing the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal includes: in response to determining that the target vehicle is located in the road area and its speed has been zero for longer than a predetermined duration threshold, determining that the target vehicle has stopped abnormally. If consecutive detection frames overlap, the vehicle is not moving and its speed is 0.
  • the parking time can be determined from the number of overlapping detection frames. Exceeding the duration threshold alone is not enough to conclude abnormal parking; it must further be determined whether the vehicle is in the road area. Parking counts as abnormal only within the road area; a vehicle parked off the road does not interfere with other vehicles. This method can quickly and accurately detect abnormal parking so that it can be dealt with as soon as possible, for example by notifying other drivers through traffic broadcasts so they can avoid a collision or change lanes earlier.
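The abnormal-stop rule (zero speed inside the road area for longer than a duration threshold) can be sketched as follows; the movement tolerance `eps`, the stop duration, and all names are illustrative assumptions:

```python
import math

def abnormal_stop(centers, road_mask, fps, stop_seconds=30.0, eps=1.0):
    """Detect an abnormal stop: the vehicle sits still (displacement below eps
    pixels per frame, i.e. its detection frames overlap) inside the road area
    for longer than stop_seconds.

    centers: per-frame detection-box centers of one tracked vehicle.
    road_mask: 2-D list where 1 marks road, as built from detection boxes.
    """
    still = 0
    for (x0, y0), (x1, y1) in zip(centers, centers[1:]):
        moved = math.hypot(x1 - x0, y1 - y0)
        on_road = road_mask[int(y1)][int(x1)] == 1
        if moved < eps and on_road:
            still += 1
            if still / fps > stop_seconds:
                return True
        else:
            still = 0  # vehicle moved or is off-road: reset the stop counter
    return False
```

A stop outside the road area (e.g. a parking lot) never increments the counter, so it is not reported.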
  • comparing the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal includes: in response to detecting that the target vehicle travels from the road area into the non-road area, determining that the target vehicle has left the road. Driving off the road is also an anomaly, possibly caused by a punctured tire or a similar fault.
  • the method can quickly and accurately detect the vehicle leaving the road, and can quickly notify the traffic control department of the location of the accident, so as to carry out rescue in time and avoid secondary injuries.
  • the present disclosure provides an embodiment of an apparatus for detecting traffic anomalies.
  • the apparatus embodiment corresponds to the method embodiment shown in FIG. 2 .
  • Specifically, the apparatus can be applied to various electronic devices.
  • the apparatus 600 for detecting traffic anomalies in this embodiment includes: an acquiring unit 601 , an identifying unit 602 , a determining unit 603 and a detecting unit 604 .
  • the acquiring unit 601 is configured to acquire at least two consecutive frames of traffic images.
  • the identification unit 602 is configured to respectively identify the position of the target vehicle from at least two consecutive frames of traffic images to obtain a set of position information.
  • the determining unit 603 is configured to determine the traveling direction and speed of the target vehicle according to the location information set.
  • the detection unit 604 is configured to compare the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
  • the apparatus 600 further includes an extraction unit 605, which is configured to: acquire a traffic video of a predetermined time length. Create a matrix of the same size as the picture of the traffic video, where each element in the matrix represents a pixel on the picture, and the initial value of each element is zero. Track and detect vehicles in the traffic video, and set the elements corresponding to the pixels in the detection frame of the vehicle to non-zero values. The pixels corresponding to the non-zero values in the matrix are determined as roads to obtain the road area.
  • the extraction unit 605 is further configured to: obtain a vehicle trajectory set according to a position change of the detection frame of the vehicle.
  • the vehicle trajectory set is clustered and analyzed to generate a vehicle vector field.
  • the detection unit 604 is further configured to: in response to determining that the position of the target vehicle is in the road area and the time when the speed is zero exceeds a predetermined duration threshold, determine that the target vehicle is parked abnormally.
  • the detection unit 604 is further configured to: in response to detecting that the target vehicle travels from the road area to the non-road area, determine that the target vehicle is off the road.
  • the detection unit 604 is further configured to: compare the traveling direction of the target vehicle with the direction of the vehicle vector field, and if the included angle exceeds a predetermined threshold, determine that the trajectory of the target vehicle is abnormal .
  • the detection unit 604 is further configured to: in response to determining that the number of vehicles with abnormal trajectories at the same location exceeds two, determine that a vehicle collision occurs.
  • the present disclosure also provides an electronic device, a readable storage medium, and a computer program product.
  • An electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method described in process 200 or process 400.
  • a non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are used to cause the computer to perform the method described in the process 200 or the process 400.
  • a computer program product includes a computer program that, when executed by a processor, implements the method described in process 200 or process 400 .
  • FIG. 7 shows a schematic block diagram of an example electronic device 700 that may be used to implement embodiments of the present disclosure.
  • Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers.
  • Electronic devices may also represent various forms of mobile devices, such as personal digital processors, cellular phones, smart phones, wearable devices, and other similar computing devices.
  • the components shown herein, their connections and relationships, and their functions are by way of example only, and are not intended to limit implementations of the disclosure described and/or claimed herein.
  • the device 700 includes a computing unit 701, which can perform various appropriate actions and processing according to a computer program stored in a read-only memory (ROM) 702 or a computer program loaded from a storage unit 708 into a random access memory (RAM) 703. Various programs and data required for the operation of the device 700 can also be stored in the RAM 703.
  • the computing unit 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704.
  • An input/output (I/O) interface 705 is also connected to the bus 704.
  • Various components in the device 700 are connected to the I/O interface 705, including: an input unit 706, such as a keyboard, a mouse, etc.; an output unit 707, such as various types of displays, speakers, etc.; a storage unit 708, such as a magnetic disk, an optical disc, etc.; and a communication unit 709, such as a network card, a modem, a wireless communication transceiver, and the like.
  • the communication unit 709 allows the device 700 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
  • Computing unit 701 may be various general-purpose and/or special-purpose processing components with processing and computing capabilities. Some examples of computing unit 701 include, but are not limited to, central processing units (CPU), graphics processing units (GPU), various specialized artificial intelligence (AI) computing chips, various computing units that run machine learning model algorithms, digital signal processors (DSP), and any suitable processors, controllers, microcontrollers, etc.
  • the computing unit 701 executes the various methods and processes described above, such as the method of traffic anomaly detection.
  • the method of traffic anomaly detection may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 708 .
  • part or all of the computer program may be loaded and/or installed on device 700 via ROM 702 and/or communication unit 709 .
  • When a computer program is loaded into RAM 703 and executed by computing unit 701, one or more steps of the method of traffic anomaly detection described above may be performed.
  • the computing unit 701 may be configured to perform the method of traffic anomaly detection by any other suitable means (eg, by means of firmware).
  • Various implementations of the systems and techniques described herein above may be implemented in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), application-specific standard products (ASSP), systems on chip (SOC), complex programmable logic devices (CPLD), computer hardware, firmware, software, and/or combinations thereof.
  • These various embodiments may include: being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from a storage system, at least one input device, and at least one output device, and transmit data and instructions to the storage system, the at least one input device, and the at least one output device.
  • Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, such that, when executed by the processor or controller, the program code causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on the machine, partly on the machine, partly on the machine and partly on a remote machine as a stand-alone software package or entirely on the remote machine or server.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with the instruction execution system, apparatus or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatuses, or devices, or any suitable combination of the foregoing.
  • machine-readable storage media would include one or more wire-based electrical connections, portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), fiber optics, compact disk read only memory (CD-ROM), optical storage, magnetic storage, or any suitable combination of the foregoing.
  • the systems and techniques described herein may be implemented on a computer having: a display device (eg, a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user; and a keyboard and pointing device (eg, a mouse or trackball) through which a user can provide input to the computer.
  • Other kinds of devices can also be used to provide interaction with the user; for example, the feedback provided to the user can be any form of sensory feedback (eg, visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form (including acoustic input, voice input, or tactile input).
  • the systems and techniques described herein may be implemented in a computing system that includes back-end components (eg, as a data server), or a computing system that includes middleware components (eg, an application server), or a computing system that includes front-end components (eg, a user computer having a graphical user interface or web browser through which a user may interact with implementations of the systems and techniques described herein), or a computing system that includes any combination of such back-end components, middleware components, or front-end components.
  • the components of the system may be interconnected by any form or medium of digital data communication (eg, a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
  • a computer system can include clients and servers. Clients and servers are generally remote from each other and usually interact through a communication network. The relationship of client and server arises by computer programs running on the respective computers and having a client-server relationship to each other.
  • the server can be a distributed system server, or a server combined with a blockchain.
  • the server can also be a cloud server, or an intelligent cloud computing server or intelligent cloud host with artificial intelligence technology.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Probability & Statistics with Applications (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure provides a method and apparatus for traffic anomaly detection, relating to the field of artificial intelligence, specifically to computer vision and deep learning technologies, applicable to video analysis scenarios. A specific implementation includes: acquiring at least two consecutive frames of traffic images; identifying the position of a target vehicle in each of the at least two consecutive frames of traffic images to obtain a set of position information; determining the traveling direction and speed of the target vehicle from the set of position information; and comparing the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal. This embodiment can improve the speed and accuracy of traffic anomaly detection.

Description

Method and Apparatus for Traffic Anomaly Detection
This patent application claims priority to Chinese Patent Application No. 202110466631.2, filed on April 28, 2021 and entitled "Method and Apparatus for Traffic Anomaly Detection", the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of artificial intelligence, specifically to computer vision and deep learning technologies, applicable to video analysis scenarios, and in particular to a method and apparatus for traffic anomaly detection.
Background
Anomaly detection in traffic scenarios targets vehicles behaving abnormally on main roads, such as collisions, stalling, leaving the main road, and driving in the wrong direction. Traditional video traffic monitoring relies on the manual judgment of observers, which consumes a great deal of manpower and is inefficient. The visual anomaly detection systems that have emerged as a result use computer vision to detect abnormal vehicles.
Although existing computer-based methods for detecting abnormal traffic events are fairly robust across different scenes, their recognition accuracy and the range of anomalies they can recognize cannot cope with more complex traffic scenarios.
Summary
The present disclosure provides a method, apparatus, device, storage medium, and computer program product for traffic anomaly detection.
According to a first aspect of the present disclosure, a method for traffic anomaly detection is provided, including: acquiring at least two consecutive frames of traffic images; identifying the position of a target vehicle in each of the at least two consecutive frames of traffic images to obtain a set of position information; determining the traveling direction and speed of the target vehicle from the set of position information; and comparing the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
According to a second aspect of the present disclosure, an apparatus for traffic anomaly detection is provided, including: an acquisition unit configured to acquire at least two consecutive frames of traffic images; an identification unit configured to identify the position of a target vehicle in each of the at least two consecutive frames of traffic images to obtain a set of position information; a determination unit configured to determine the traveling direction and speed of the target vehicle from the set of position information; and a detection unit configured to compare the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
According to a third aspect of the present disclosure, an electronic device is provided, including: at least one processor; and a memory communicatively connected to the at least one processor; where the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method according to any implementation of the first aspect.
According to a fourth aspect of the present disclosure, a non-transitory computer-readable storage medium storing computer instructions is provided, where the computer instructions are used to cause the computer to perform the method according to any implementation of the first aspect.
According to a fifth aspect of the present disclosure, a computer program product is provided, including a computer program that, when executed by a processor, implements the method according to any implementation of the first aspect.
The method and apparatus for traffic anomaly detection provided by the embodiments of the present disclosure obtain a vehicle vector field through vehicle tracking, and judge abnormal vehicles against the vehicle vector field, thereby enabling analysis of situations such as wrong-way driving, collisions, stalling, and breakdowns. This effectively avoids the large computational demand produced by applying a deep learning network globally, and improves current vehicle-monitoring anomaly analysis.
It should be understood that the content described in this section is not intended to identify key or important features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become easy to understand from the following description.
Brief Description of the Drawings
The drawings are used to better understand the solution and do not constitute a limitation of the present disclosure. In the drawings:
FIG. 1 is an exemplary system architecture diagram to which an embodiment of the present disclosure may be applied;
FIG. 2 is a flowchart of an embodiment of the method for traffic anomaly detection according to the present disclosure;
FIG. 3 is a schematic diagram of an application scenario of the method for traffic anomaly detection according to the present disclosure;
FIG. 4 is a flowchart of another embodiment of the method for traffic anomaly detection according to the present disclosure;
FIG. 5 is a schematic diagram of another application scenario of the method for traffic anomaly detection according to the present disclosure;
FIG. 6 is a schematic structural diagram of an embodiment of the apparatus for traffic anomaly detection according to the present disclosure;
FIG. 7 is a block diagram of an electronic device used to implement the method for traffic anomaly detection of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the drawings, including various details of the embodiments to facilitate understanding; they should be regarded as merely exemplary. Therefore, those of ordinary skill in the art should recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. Likewise, for clarity and conciseness, descriptions of well-known functions and structures are omitted from the following description.
FIG. 1 shows an exemplary system architecture 100 to which embodiments of the method or apparatus for traffic anomaly detection of the present application may be applied.
As shown in FIG. 1, the system architecture 100 may include cameras 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium providing communication links between the cameras 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired or wireless communication links, or fiber-optic cables.
A user may use the cameras 101, 102, 103 to interact with the server 105 through the network 104, to receive or send messages and the like.
The cameras 101, 102, 103 generally refer to cameras used for vehicle monitoring that can recognize vehicle information (for example, license plate numbers and vehicle models). They may be electronic-police cameras that photograph vehicles committing violations at intersections (for example, crossing solid lane lines, driving in the wrong direction, occupying non-motorized lanes, not following guidance signs, or running red lights). They may also be checkpoint cameras installed at key sections of expressways, provincial roads, and national roads to photograph speeding violations. The cameras 101, 102, 103 may also be illegal-parking capture cameras, traffic-flow monitoring cameras, Skynet surveillance cameras, mobile capture cameras, and the like.
The server 105 may be a server providing various services, for example a backend analysis server that analyzes the vehicle data collected by the cameras 101, 102, 103. The backend analysis server may analyze and otherwise process the received vehicle data, and output the processing results (for example, abnormal situations such as wrong-way driving, collisions, stalling, and breakdowns).
It should be noted that the server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster composed of multiple servers, or as a single server. When the server is software, it may be implemented as multiple pieces of software or software modules (for example, to provide distributed services), or as a single piece of software or software module. No specific limitation is made here. The server may also be a distributed-system server, or a server combined with a blockchain. The server may also be a cloud server, or an intelligent cloud computing server or intelligent cloud host with artificial intelligence technology.
It should be noted that the method for traffic anomaly detection provided by the embodiments of the present application is generally executed by the server 105; accordingly, the apparatus for traffic anomaly detection is generally provided in the server 105.
It should be understood that the numbers of cameras, networks, and servers in FIG. 1 are merely illustrative. There may be any number of cameras, networks, and servers according to implementation needs.
With continued reference to FIG. 2, a process 200 of an embodiment of the method for traffic anomaly detection according to the present application is shown. The method for traffic anomaly detection includes the following steps:
Step 201: acquire at least two consecutive frames of traffic images.
In this embodiment, the executing body of the method for traffic anomaly detection (for example, the server shown in FIG. 1) may receive at least two consecutive frames of traffic images from a camera through a wired or wireless connection. The acquired traffic images are captured by the same camera at the same position and at the same angle. Because the traveling direction and speed of a vehicle need to be judged from the images, at least two frames are required to determine the change in the vehicle's position. In practice, a video segment containing multiple frames may be acquired. The frame rate of a camera is generally 25 to 30 frames per second, and the time interval between different traffic images can be determined from the frame rate.
Step 202: identify the position of the target vehicle in each of the at least two consecutive frames of traffic images to obtain a set of position information.
In this embodiment, an existing multi-object tracking algorithm (for example, the deepsort algorithm) may be used to recognize the vehicles in the traffic images. A target vehicle here refers to a vehicle that can be tracked and recognized. A traffic image usually contains multiple vehicles, so there will be multiple target vehicles. In each traffic image, a target vehicle can be detected and enclosed in a detection frame, and the position of the center point of the detection frame can be taken as the position of the target vehicle. Connecting these center points forms the trajectory of the vehicle, which can be represented by a set of position information, where each piece of position information is the coordinate position of the center point of the detection frame in the image.
Step 203: determine the traveling direction and speed of the target vehicle from the set of position information.
In this embodiment, the traveling route of the target vehicle, i.e. its direction, can be determined from the change in position of the center point of the detection frame. A mapping between pixel distance and actual distance is determined from the known camera parameters. After the target vehicle moves, the actual distance it has traveled can be calculated from the distance moved by the center point of the detection frame and the mapping. The travel time is then calculated from the frame rate, and the speed of the target vehicle can be computed.
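As an illustrative sketch of the computation in step 203 (not part of the disclosure itself), the direction and speed can be derived from consecutive detection-frame center points as follows; the pixel-to-meter scale factor and the frame rate are assumed inputs obtained from the camera parameters, and all names and values are hypothetical:

```python
import math

def direction_and_speed(centers, meters_per_pixel, fps):
    """Estimate a unit direction vector and a speed in km/h from
    detection-frame center points (x, y), one per consecutive frame."""
    (x0, y0), (x1, y1) = centers[0], centers[-1]
    dx, dy = x1 - x0, y1 - y0
    pixel_dist = math.hypot(dx, dy)
    if pixel_dist == 0:
        return (0.0, 0.0), 0.0           # the vehicle did not move
    direction = (dx / pixel_dist, dy / pixel_dist)
    meters = pixel_dist * meters_per_pixel        # pixel-to-meter mapping
    seconds = (len(centers) - 1) / fps            # elapsed time from frame rate
    return direction, meters / seconds * 3.6      # m/s -> km/h
```

For example, with an assumed scale of 0.2 m per pixel, a center point that moves 2 pixels between consecutive frames at 25 fps corresponds to 36 km/h.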
Step 204: compare the traveling direction and speed of the target vehicle with the pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
In this embodiment, the vehicle vector field may be generated automatically from historical data or calibrated manually. The vehicle vector field includes the direction and speed of vehicle travel; the direction of the vehicle vector field is shown in (a) of FIG. 3. The speed may be the average speed of a large number of vehicles on the road section already captured by the camera. The traveling direction of the currently detected target vehicle is compared with the direction of the vehicle vector field; if they are opposite, the target vehicle is driving in the wrong direction. The speed of the currently detected target vehicle may also be compared with the speed of the vehicle vector field; if the difference exceeds a certain range, the target vehicle is abnormal. For example, the target vehicle's speed is less than 0.5 times, or greater than 1.5 times, the speed of the vehicle vector field. If the speed of the vehicle vector field is 60 km/h, the target vehicle's speed is abnormal when it is below 30 km/h (abnormally low) or above 90 km/h (abnormally high). If the number of vehicles with abnormally low speed reaches a predetermined vehicle threshold, the road section is congested, and prompt information can be output to monitoring personnel suggesting a further check for a traffic accident.
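A minimal sketch of the speed comparison described above, using the 0.5x/1.5x band and the congestion count from the worked example in the text; the function names and the exact rule boundaries are illustrative assumptions:

```python
def speed_status(vehicle_speed, field_speed, low=0.5, high=1.5):
    """Classify a vehicle's speed against the vector-field speed."""
    if vehicle_speed < low * field_speed:
        return "abnormally low"
    if vehicle_speed > high * field_speed:
        return "abnormally high"
    return "normal"

def is_congested(speeds, field_speed, vehicle_threshold):
    """Flag congestion when enough vehicles are abnormally slow."""
    slow = sum(1 for s in speeds
               if speed_status(s, field_speed) == "abnormally low")
    return slow >= vehicle_threshold
```

With a field speed of 60 km/h this reproduces the example: 25 km/h is abnormally low, 95 km/h is abnormally high, and three slow vehicles trip a congestion threshold of three.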
The position of the abnormal vehicle can also be located directly, which facilitates rescue. Once an abnormal vehicle is detected, information on the owner can be obtained through license plate recognition; in addition to identity information, this may include the contact information of the owner and family members, medical history (for example, asthma or heart disease), and other related information. The traffic management department can also be notified for handling; it can locate the vehicle from the position and arrange a tow truck if the vehicle has broken down, and if the owner has fallen ill, first aid can be given according to the pre-identified medical history and the family can be contacted.
The method provided by the above embodiment of the present disclosure compares the direction and speed of a vehicle, obtained through tracking and detection, with the known vehicle vector field, so abnormal vehicles can be determined quickly and accurately, relevant information is provided for subsequent handling, and abnormal vehicles can be dealt with quickly.
In some optional implementations of this embodiment, comparing the traveling direction and speed of the target vehicle with the pre-generated vehicle vector field to determine whether the target vehicle is abnormal includes: comparing the traveling direction of the target vehicle with the direction of the vehicle vector field, and if the included angle exceeds a predetermined threshold, determining that the trajectory of the target vehicle is abnormal. An included angle exceeding the predetermined threshold indicates a lane change; to filter out normal lane changes for overtaking, multiple consecutive frames may be monitored, and if the resulting trajectory is not a normal lane-change trajectory, the vehicle is abnormal. When the included angle is about 180 degrees, the vehicle is driving in the wrong direction. In this way, anomaly detection can be performed without manually annotating the direction of travel, which improves the flexibility of detection; the method can be applied to any unannotated road section.
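The included-angle comparison can be sketched with plain vector arithmetic; the 45-degree threshold below is a made-up example value, since the disclosure only requires "a predetermined threshold":

```python
import math

def included_angle_deg(v1, v2):
    """Angle in degrees between two 2-D direction vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp for float safety
    return math.degrees(math.acos(cos_a))

def trajectory_abnormal(vehicle_dir, field_dir, threshold_deg=45.0):
    """Trajectory is abnormal when the included angle exceeds the threshold;
    an angle near 180 degrees means wrong-way driving."""
    return included_angle_deg(vehicle_dir, field_dir) > threshold_deg
```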
In some optional implementations of this embodiment, in response to determining that the number of vehicles with abnormal trajectories at the same location exceeds two, it is determined that a vehicle collision has occurred. When more than two vehicles with abnormal trajectories appear at the same location in the same group of images, at least two vehicles have collided, as shown in (c) of FIG. 3. Traffic accidents can thus be detected quickly and accurately, the license plates of the vehicles involved can be detected quickly, and the relevant insurance company can be notified. The relevant video can be retained directly as evidence for reporting the case; traffic police do not need to replay the video from beginning to end to find the moment of the accident, since the technical solution of the present disclosure can locate the time of the accident directly. This helps the traffic management department quickly investigate the cause of the accident and assign responsibility, improving the efficiency of handling traffic accidents.
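Counting trajectory-abnormal vehicles at "the same location" can be sketched as below; the pixel radius that operationalizes "same location" is an assumption, and the more-than-two condition mirrors the wording of the disclosure:

```python
def collision_detected(abnormal_positions, radius=20.0):
    """Return True when more than two trajectory-abnormal vehicles
    lie within `radius` pixels of a common point."""
    for x0, y0 in abnormal_positions:
        near = sum(
            1
            for x1, y1 in abnormal_positions
            if (x1 - x0) ** 2 + (y1 - y0) ** 2 <= radius ** 2
        )
        if near > 2:
            return True
    return False
```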
With continued reference to FIG. 3, FIG. 3 is a schematic diagram of an application scenario of the method for traffic anomaly detection according to this embodiment. In the application scenario of FIG. 3, the vehicle vector field under normal conditions is shown in (a). The white areas in the figure are roads, and it can be seen that the driving directions of the two roads are consistent. (b) shows a vehicle driving off the road; whether a vehicle has driven off the road can be judged by whether the position of its detection frame is within the road area, and driving off the road indicates a traffic accident. (c) shows the vehicle trajectories when two vehicles collide; the directions of both vehicles differ considerably from the vector field. (d) shows a vehicle stalling or breaking down: it can be determined that the vehicle has not moved when the number of frames in which its detection frames overlap exceeds a predetermined frame threshold, and that the vehicle is within the road area, so as to avoid falsely detecting vehicles in a parking lot.
With further reference to FIG. 4, a process 400 of another embodiment of the method for traffic anomaly detection is shown. The process 400 of the method for traffic anomaly detection includes the following steps:
Step 401: acquire a traffic video of a predetermined time length.
In this embodiment, the executing body of the method for traffic anomaly detection (for example, the server shown in FIG. 1) may receive a traffic video of a predetermined time length from a camera through a wired or wireless connection. The time length is related to the collection time: during peak traffic hours a short time is sufficient, while during off-peak hours a longer collection time is needed. The video is collected with the same camera, at the same position and at the same angle, as in step 201, which ensures that the road area segmented from this video can be used in the process 200.
Step 402: create a matrix of the same size as the picture of the traffic video.
In this embodiment, each element in the matrix represents a pixel on the picture, and the initial value of each element is zero. For example, if the video resolution is 900*1024, a 900*1024 all-zero matrix is created.
Step 403: track and detect the vehicles in the traffic video, and set the elements corresponding to the pixels in the detection frames of the vehicles to non-zero values.
In this embodiment, each time a vehicle is detected, the pixels in its detection frame are marked as road, and the elements at the corresponding positions of the matrix are set to a non-zero value, for example 1. Suppose the initial matrix is
    [0 0 0]
    [0 0 0]
    [0 0 0]
If vehicles are detected at positions (0,1), (1,1), and (2,1) in the image, the matrix is modified to
    [0 1 0]
    [0 1 0]
    [0 1 0]
Step 404: determine the pixels corresponding to the non-zero values in the matrix as road, to obtain the road area.
In this embodiment, connecting the non-zero pixels determined in step 403 forms the road area.
Optionally, the road area may be re-determined according to steps 401-404 when the anomaly detection rate rises to a predetermined threshold. The angle of the camera may change slightly due to strong wind or other causes, and anomaly judgments based on the originally determined road area then become inaccurate, which can lead to a sudden burst of detected vehicle anomalies (for example, a large number of vehicles detected as driving off the road); the road area then needs to be corrected. Only with the corrected road area can the vehicles in the currently captured images be detected accurately.
The method provided by the above embodiment of the present disclosure determines the road from the vehicles that actually pass, rather than using a road recognition model, because a road recognition model would falsely detect a parking lot as road. In addition, the method can re-annotate the road at any time as needed. Compared with manual annotation, the road area can be updated promptly according to the current situation, avoiding false detections.
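Steps 401-404 can be sketched without any computer-vision library by operating on a nested-list matrix; the (row0, col0, row1, col1) inclusive box convention is an illustrative assumption:

```python
def build_road_mask(height, width, detection_boxes):
    """Step 402: all-zero matrix; step 403: mark detection-frame pixels
    non-zero; step 404: the non-zero cells constitute the road area."""
    mask = [[0] * width for _ in range(height)]
    for r0, c0, r1, c1 in detection_boxes:
        for r in range(r0, r1 + 1):
            for c in range(c0, c1 + 1):
                mask[r][c] = 1  # 1 == road
    return mask
```

With a 3x3 picture and a detection frame covering column 1, the result matches the worked matrix example in the text.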
With continued reference to FIG. 5, FIG. 5 is a schematic diagram of another application scenario of the method for traffic anomaly detection according to this embodiment. In the application scenario of FIG. 5, (a) shows a traffic image captured by the camera. (b) shows the vehicles detected by the tracking algorithm and enclosed in detection frames. (c) shows the pixels in the regions of the detection frames filled with white. (d) shows the picture divided into the road area and non-road areas, with the non-road areas rendered uniformly in black.
In some optional implementations of this embodiment, the method further includes: obtaining a vehicle trajectory set from the changes in position of the detection frames of the vehicles, and performing cluster analysis on the vehicle trajectory set to generate the vehicle vector field. The cluster analysis can be performed with a common clustering algorithm such as kmeans. The final clustering result is a set of trajectories classified by lane, each trajectory including a direction, and the average speed of the vehicles passing through each lane is computed from the trajectories of that lane. The speeds and directions constitute the vector field. A vector field obtained in this way is more convenient and faster than manual annotation, and it can be updated in real time according to the current situation. For example, when some lanes temporarily reverse direction because of congestion or other reasons (see reversible lanes), the method of the present disclosure can update the vector field in time without misjudgment.
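The clustering step can be sketched as follows. For a dependency-free illustration, coarse heading bins stand in for the k-means clustering named in the text, and representing each trajectory as a (heading in degrees, speed in km/h) pair is an assumption:

```python
from collections import defaultdict

def build_vector_field(trajectories, bin_deg=30):
    """Group trajectories by coarse heading (a stand-in for clustering
    by lane) and return each group's mean heading and average speed.
    Note: the naive heading mean ignores 359/1-degree wraparound."""
    groups = defaultdict(list)
    for heading, speed in trajectories:
        groups[round(heading / bin_deg) % (360 // bin_deg)].append((heading, speed))
    return {
        b: (
            sum(h for h, _ in members) / len(members),  # cluster direction
            sum(s for _, s in members) / len(members),  # average lane speed
        )
        for b, members in groups.items()
    }
```

In a real deployment the binning would be replaced by a proper clustering routine over full trajectories, but the output shape (direction plus average speed per lane cluster) is the same.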
In some optional implementations of this embodiment, comparing the traveling direction and speed of the target vehicle with the pre-generated vehicle vector field to determine whether the target vehicle is abnormal includes: in response to determining that the position of the target vehicle is in the road area and the time during which its speed is zero exceeds a predetermined duration threshold, determining that the target vehicle is parked abnormally. If the detection frames overlap, the vehicle has not moved and its speed is 0. The parking time can be determined from the number of frames in which the detection frames overlap; even if the parking time exceeds the predetermined duration threshold, abnormal parking cannot yet be confirmed, and it must further be judged whether the vehicle is within the road area at that time. Parking counts as abnormal only within the road area; a vehicle stopped outside the road does not obstruct other vehicles. This method can detect abnormal parking quickly and accurately so that it can be handled as soon as possible, for example by notifying other drivers via traffic radio so that they watch out for a collision or change lanes in advance.
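The abnormal-parking check combines the overlap-frame count with the road mask from process 400; below is a minimal sketch under assumed conventions (boxes as (row0, col0, row1, col1), mask value 1 for road):

```python
def abnormal_parking(box_history, road_mask, fps, duration_threshold_s):
    """Abnormal parking: the detection frames overlap (speed is zero)
    for longer than the threshold AND the box center lies on the road."""
    if len(box_history) < 2:
        return False
    if any(b != box_history[0] for b in box_history):
        return False                       # the vehicle moved at some point
    parked_seconds = (len(box_history) - 1) / fps
    if parked_seconds <= duration_threshold_s:
        return False
    r0, c0, r1, c1 = box_history[0]
    return road_mask[(r0 + r1) // 2][(c0 + c1) // 2] == 1
```

The final mask lookup is what filters out parking-lot vehicles: a stationary box whose center falls outside the road area is never flagged.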
In some optional implementations of this embodiment, comparing the traveling direction and speed of the target vehicle with the pre-generated vehicle vector field to determine whether the target vehicle is abnormal includes: in response to detecting that the target vehicle travels from the road area to a non-road area, determining that the target vehicle has driven off the road. Driving off the road is also an anomaly, which may be caused by a tire blowout or similar. This method detects a vehicle driving off the road quickly and accurately, so the traffic management department can be promptly notified of the location of the accident and rescue can be carried out in time, avoiding secondary injury.
With further reference to FIG. 6, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an apparatus for traffic anomaly detection. The apparatus embodiment corresponds to the method embodiment shown in FIG. 2, and the apparatus can be applied to various electronic devices.
As shown in FIG. 6, the apparatus 600 for traffic anomaly detection of this embodiment includes: an acquisition unit 601, an identification unit 602, a determination unit 603, and a detection unit 604. The acquisition unit 601 is configured to acquire at least two consecutive frames of traffic images. The identification unit 602 is configured to identify the position of a target vehicle in each of the at least two consecutive frames of traffic images to obtain a set of position information. The determination unit 603 is configured to determine the traveling direction and speed of the target vehicle from the set of position information. The detection unit 604 is configured to compare the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
In this embodiment, for the specific processing of the acquisition unit 601, identification unit 602, determination unit 603, and detection unit 604 of the apparatus 600 for traffic anomaly detection, reference may be made to step 201, step 202, step 203, and step 204 in the corresponding embodiment of FIG. 2.
In some optional implementations of this embodiment, the apparatus 600 further includes an extraction unit 605 configured to: acquire a traffic video of a predetermined time length; create a matrix of the same size as the picture of the traffic video, where each element in the matrix represents a pixel on the picture and the initial value of each element is zero; track and detect the vehicles in the traffic video, and set the elements corresponding to the pixels in the detection frames of the vehicles to non-zero values; and determine the pixels corresponding to the non-zero values in the matrix as road, to obtain the road area.
In some optional implementations of this embodiment, the extraction unit 605 is further configured to: obtain a vehicle trajectory set from the changes in position of the detection frames of the vehicles, and perform cluster analysis on the vehicle trajectory set to generate the vehicle vector field.
In some optional implementations of this embodiment, the detection unit 604 is further configured to: in response to determining that the position of the target vehicle is in the road area and the time during which its speed is zero exceeds a predetermined duration threshold, determine that the target vehicle is parked abnormally.
In some optional implementations of this embodiment, the detection unit 604 is further configured to: in response to detecting that the target vehicle travels from the road area to a non-road area, determine that the target vehicle has driven off the road.
In some optional implementations of this embodiment, the detection unit 604 is further configured to: compare the traveling direction of the target vehicle with the direction of the vehicle vector field, and if the included angle exceeds a predetermined threshold, determine that the trajectory of the target vehicle is abnormal.
In some optional implementations of this embodiment, the detection unit 604 is further configured to: in response to determining that the number of vehicles with abnormal trajectories at the same location exceeds two, determine that a vehicle collision has occurred.
According to the embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium, and a computer program product.
An electronic device, including: at least one processor; and a memory communicatively connected to the at least one processor; where the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method described in process 200 or process 400.
A non-transitory computer-readable storage medium storing computer instructions, where the computer instructions are used to cause the computer to perform the method described in process 200 or process 400.
A computer program product, including a computer program that, when executed by a processor, implements the method described in process 200 or process 400.
FIG. 7 shows a schematic block diagram of an example electronic device 700 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are merely examples, and are not intended to limit the implementations of the present disclosure described and/or claimed herein.
As shown in FIG. 7, the device 700 includes a computing unit 701, which can perform various appropriate actions and processing according to a computer program stored in a read-only memory (ROM) 702 or a computer program loaded from a storage unit 708 into a random access memory (RAM) 703. Various programs and data required for the operation of the device 700 can also be stored in the RAM 703. The computing unit 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
Multiple components in the device 700 are connected to the I/O interface 705, including: an input unit 706, such as a keyboard or mouse; an output unit 707, such as various types of displays and speakers; a storage unit 708, such as a magnetic disk or optical disc; and a communication unit 709, such as a network card, modem, or wireless communication transceiver. The communication unit 709 allows the device 700 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
The computing unit 701 may be various general-purpose and/or special-purpose processing components with processing and computing capabilities. Some examples of the computing unit 701 include, but are not limited to, central processing units (CPU), graphics processing units (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, digital signal processors (DSP), and any appropriate processors, controllers, microcontrollers, and so on. The computing unit 701 executes the various methods and processes described above, such as the method for traffic anomaly detection. For example, in some embodiments, the method for traffic anomaly detection may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the computing unit 701, one or more steps of the method for traffic anomaly detection described above may be performed. Alternatively, in other embodiments, the computing unit 701 may be configured to perform the method for traffic anomaly detection by any other appropriate means (for example, by means of firmware).
Various implementations of the systems and techniques described herein above may be implemented in digital electronic circuit systems, integrated circuit systems, field-programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), application-specific standard products (ASSP), systems on chip (SOC), complex programmable logic devices (CPLD), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include: being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from a storage system, at least one input device, and at least one output device, and transmit data and instructions to the storage system, the at least one input device, and the at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, so that, when executed by the processor or controller, the program code causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on a machine, partly on a machine, partly on a machine and partly on a remote machine as a stand-alone software package, or entirely on a remote machine or server.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatuses, or devices, or any suitable combination of the foregoing. More specific examples of machine-readable storage media would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the foregoing.
To provide interaction with a user, the systems and techniques described herein may be implemented on a computer having: a display device (for example, a CRT (cathode-ray tube) or LCD (liquid-crystal display) monitor) for displaying information to the user; and a keyboard and pointing device (for example, a mouse or trackball) through which the user can provide input to the computer. Other kinds of devices may also be used to provide interaction with a user; for example, the feedback provided to the user may be any form of sensory feedback (for example, visual feedback, auditory feedback, or tactile feedback), and input from the user may be received in any form (including acoustic input, voice input, or tactile input).
The systems and techniques described herein may be implemented in a computing system including a back-end component (for example, as a data server), or a computing system including a middleware component (for example, an application server), or a computing system including a front-end component (for example, a user computer with a graphical user interface or web browser through which the user can interact with implementations of the systems and techniques described herein), or a computing system including any combination of such back-end, middleware, or front-end components. The components of the system may be interconnected by digital data communication (for example, a communication network) in any form or medium. Examples of communication networks include: local area networks (LAN), wide area networks (WAN), and the Internet.
A computer system may include clients and servers. A client and a server are generally remote from each other and usually interact through a communication network. The relationship of client and server arises by computer programs running on the respective computers and having a client-server relationship with each other. The server may be a distributed-system server, or a server combined with a blockchain. The server may also be a cloud server, or an intelligent cloud computing server or intelligent cloud host with artificial intelligence technology.
It should be understood that the various forms of processes shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in a different order, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved; no limitation is made herein.
The above specific implementations do not constitute a limitation on the protection scope of the present disclosure. Those skilled in the art should understand that various modifications, combinations, sub-combinations, and substitutions may be made according to design requirements and other factors. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present disclosure shall be included within the protection scope of the present disclosure.

Claims (17)

  1. A method for traffic anomaly detection, comprising:
    acquiring at least two consecutive frames of traffic images;
    identifying the position of a target vehicle in each of the at least two consecutive frames of traffic images to obtain a set of position information;
    determining the traveling direction and speed of the target vehicle from the set of position information; and
    comparing the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
  2. The method according to claim 1, wherein the method further comprises:
    acquiring a traffic video of a predetermined time length;
    creating a matrix of the same size as the picture of the traffic video, wherein each element in the matrix represents a pixel on the picture, and the initial value of each element is zero;
    tracking and detecting the vehicles in the traffic video, and setting the elements corresponding to the pixels in the detection frames of the vehicles to non-zero values; and
    determining the pixels corresponding to the non-zero values in the matrix as road, to obtain a road area.
  3. The method according to claim 2, wherein the method further comprises:
    obtaining a vehicle trajectory set from changes in position of the detection frames of the vehicles; and
    performing cluster analysis on the vehicle trajectory set to generate the vehicle vector field.
  4. The method according to claim 2, wherein the comparing the traveling direction and speed of the target vehicle with the pre-generated vehicle vector field to determine whether the target vehicle is abnormal comprises:
    in response to determining that the position of the target vehicle is in the road area and the time during which its speed is zero exceeds a predetermined duration threshold, determining that the target vehicle is parked abnormally.
  5. The method according to claim 2, wherein the comparing the traveling direction and speed of the target vehicle with the pre-generated vehicle vector field to determine whether the target vehicle is abnormal comprises:
    in response to detecting that the target vehicle travels from the road area to a non-road area, determining that the target vehicle has driven off the road.
  6. The method according to claim 1, wherein the comparing the traveling direction and speed of the target vehicle with the pre-generated vehicle vector field to determine whether the target vehicle is abnormal comprises:
    comparing the traveling direction of the target vehicle with the direction of the vehicle vector field, and in response to determining that the included angle exceeds a predetermined threshold, determining that the trajectory of the target vehicle is abnormal.
  7. The method according to claim 6, wherein the method further comprises:
    in response to determining that the number of vehicles with abnormal trajectories at the same location exceeds two, determining that a vehicle collision has occurred.
  8. An apparatus for traffic anomaly detection, comprising:
    an acquisition unit configured to acquire at least two consecutive frames of traffic images;
    an identification unit configured to identify the position of a target vehicle in each of the at least two consecutive frames of traffic images to obtain a set of position information;
    a determination unit configured to determine the traveling direction and speed of the target vehicle from the set of position information; and
    a detection unit configured to compare the traveling direction and speed of the target vehicle with a pre-generated vehicle vector field to determine whether the target vehicle is abnormal.
  9. The apparatus according to claim 8, wherein the apparatus further comprises an extraction unit configured to:
    acquire a traffic video of a predetermined time length;
    create a matrix of the same size as the picture of the traffic video, wherein each element in the matrix represents a pixel on the picture, and the initial value of each element is zero;
    track and detect the vehicles in the traffic video, and set the elements corresponding to the pixels in the detection frames of the vehicles to non-zero values; and
    determine the pixels corresponding to the non-zero values in the matrix as road, to obtain a road area.
  10. The apparatus according to claim 9, wherein the extraction unit is further configured to:
    obtain a vehicle trajectory set from changes in position of the detection frames of the vehicles; and
    perform cluster analysis on the vehicle trajectory set to generate the vehicle vector field.
  11. The apparatus according to claim 9, wherein the detection unit is further configured to:
    in response to determining that the position of the target vehicle is in the road area and the time during which its speed is zero exceeds a predetermined duration threshold, determine that the target vehicle is parked abnormally.
  12. The apparatus according to claim 9, wherein the detection unit is further configured to:
    in response to detecting that the target vehicle travels from the road area to a non-road area, determine that the target vehicle has driven off the road.
  13. The apparatus according to claim 8, wherein the detection unit is further configured to:
    compare the traveling direction of the target vehicle with the direction of the vehicle vector field, and in response to determining that the included angle exceeds a predetermined threshold, determine that the trajectory of the target vehicle is abnormal.
  14. The apparatus according to claim 13, wherein the detection unit is further configured to:
    in response to the number of vehicles with abnormal trajectories at the same location exceeding two, determine that a vehicle collision has occurred.
  15. An electronic device, comprising:
    at least one processor; and
    a memory communicatively connected to the at least one processor; wherein
    the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method according to any one of claims 1-7.
  16. A non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are used to cause the computer to perform the method according to any one of claims 1-7.
  17. A computer program product, comprising a computer program that, when executed by a processor, implements the method according to any one of claims 1-7.
PCT/CN2022/075071 2021-04-28 2022-01-29 交通异常检测的方法和装置 WO2022227766A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022561125A JP2023527265A (ja) 2021-04-28 2022-01-29 交通異常を検出する方法および装置、電子機器、記憶媒体並びにコンピュータプログラム
US17/963,058 US20230036864A1 (en) 2021-04-28 2022-10-10 Method and apparatus for detecting traffic anomaly

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110466631.2 2021-04-28
CN202110466631.2A CN113139482B (zh) 2021-04-28 2021-04-28 交通异常检测的方法和装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/963,058 Continuation US20230036864A1 (en) 2021-04-28 2022-10-10 Method and apparatus for detecting traffic anomaly

Publications (1)

Publication Number Publication Date
WO2022227766A1 true WO2022227766A1 (zh) 2022-11-03

Family

ID=76816313

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/075071 WO2022227766A1 (zh) 2021-04-28 2022-01-29 交通异常检测的方法和装置

Country Status (4)

Country Link
US (1) US20230036864A1 (zh)
JP (1) JP2023527265A (zh)
CN (1) CN113139482B (zh)
WO (1) WO2022227766A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116777703A (zh) * 2023-04-24 2023-09-19 深圳市普拉图科技发展有限公司 一种基于大数据的智慧城市管理方法和系统
CN117523858A (zh) * 2023-11-24 2024-02-06 邯郸市鼎舜科技开发有限公司 道路电子卡口检测方法和装置
CN117636270A (zh) * 2024-01-23 2024-03-01 南京理工大学 基于单目摄像头的车辆抢道事件识别方法及设备

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139482B (zh) * 2021-04-28 2024-11-01 北京百度网讯科技有限公司 交通异常检测的方法和装置
CN113742589A (zh) * 2021-09-07 2021-12-03 北京百度网讯科技有限公司 一种行车提示方法、装置、电子设备及存储介质
CN114758322B (zh) * 2022-05-13 2022-11-22 安徽省路通公路工程检测有限公司 基于机器识别的道路质量检测系统
CN115331457B (zh) * 2022-05-17 2024-03-29 重庆交通大学 一种车速管理方法及系统
CN115410370A (zh) * 2022-08-31 2022-11-29 南京慧尔视智能科技有限公司 一种异常停车检测方法、装置、电子设备及存储介质
CN116798237B (zh) * 2023-03-24 2024-04-30 浪潮智慧科技有限公司 一种交通流量监测方法及设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103150901A (zh) * 2013-02-05 2013-06-12 长安大学 一种基于车辆运动矢量场分析的异常交通状况检测方法
CN111079675A (zh) * 2019-12-23 2020-04-28 武汉唯理科技有限公司 基于目标检测与目标跟踪的行驶行为分析方法
CN111402612A (zh) * 2019-01-03 2020-07-10 北京嘀嘀无限科技发展有限公司 一种交通事件通知方法及装置
CN112507993A (zh) * 2021-02-05 2021-03-16 上海闪马智能科技有限公司 一种异常驶离的检测方法、系统、电子设备及存储介质
WO2021066784A1 (en) * 2019-09-30 2021-04-08 Siemens Mobility, Inc. System and method for detecting speed anomalies in a connected vehicle infrastructure environment
CN113139482A (zh) * 2021-04-28 2021-07-20 北京百度网讯科技有限公司 交通异常检测的方法和装置

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09305891A (ja) * 1996-05-17 1997-11-28 Sumitomo Electric Ind Ltd 車両監視装置
JPH1040490A (ja) * 1996-07-22 1998-02-13 Toyota Motor Corp 道路監視装置
JP4600929B2 (ja) * 2005-07-20 2010-12-22 パナソニック株式会社 停止低速車両検出装置
JP5702544B2 (ja) * 2010-03-15 2015-04-15 株式会社Kddi研究所 車両交通監視装置およびプログラム
US8970701B2 (en) * 2011-10-21 2015-03-03 Mesa Engineering, Inc. System and method for predicting vehicle location
CN113538911B (zh) * 2020-02-11 2022-08-02 北京百度网讯科技有限公司 路口距离的检测方法、装置、电子设备和存储介质
KR102167522B1 (ko) * 2020-04-02 2020-10-19 트라웍스(주) 교통 정보 제공 시스템 및 그 방법

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103150901A (zh) * 2013-02-05 2013-06-12 长安大学 一种基于车辆运动矢量场分析的异常交通状况检测方法
CN111402612A (zh) * 2019-01-03 2020-07-10 北京嘀嘀无限科技发展有限公司 一种交通事件通知方法及装置
WO2021066784A1 (en) * 2019-09-30 2021-04-08 Siemens Mobility, Inc. System and method for detecting speed anomalies in a connected vehicle infrastructure environment
CN111079675A (zh) * 2019-12-23 2020-04-28 武汉唯理科技有限公司 基于目标检测与目标跟踪的行驶行为分析方法
CN112507993A (zh) * 2021-02-05 2021-03-16 上海闪马智能科技有限公司 一种异常驶离的检测方法、系统、电子设备及存储介质
CN113139482A (zh) * 2021-04-28 2021-07-20 北京百度网讯科技有限公司 交通异常检测的方法和装置

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116777703A (zh) * 2023-04-24 2023-09-19 深圳市普拉图科技发展有限公司 一种基于大数据的智慧城市管理方法和系统
CN116777703B (zh) * 2023-04-24 2024-02-02 深圳市普拉图科技发展有限公司 一种基于大数据的智慧城市管理方法和系统
CN117523858A (zh) * 2023-11-24 2024-02-06 邯郸市鼎舜科技开发有限公司 道路电子卡口检测方法和装置
CN117523858B (zh) * 2023-11-24 2024-05-14 邯郸市鼎舜科技开发有限公司 道路电子卡口检测方法和装置
CN117636270A (zh) * 2024-01-23 2024-03-01 南京理工大学 基于单目摄像头的车辆抢道事件识别方法及设备
CN117636270B (zh) * 2024-01-23 2024-04-09 南京理工大学 基于单目摄像头的车辆抢道事件识别方法及设备

Also Published As

Publication number Publication date
US20230036864A1 (en) 2023-02-02
CN113139482A (zh) 2021-07-20
CN113139482B (zh) 2024-11-01
JP2023527265A (ja) 2023-06-28

Similar Documents

Publication Publication Date Title
WO2022227766A1 (zh) 交通异常检测的方法和装置
US11840239B2 (en) Multiple exposure event determination
US11380105B2 (en) Identification and classification of traffic conflicts
EP4016130B1 (en) Method for outputting early warning information, device, storage medium and program product
KR20220047732A (ko) 차량 감시 방법 및 장치, 전자 기기, 저장 매체 및 컴퓨터 프로그램, 클라우드 제어 플랫폼 및 차량 도로 협조 시스템
US9704201B2 (en) Method and system for detecting uninsured motor vehicles
KR20210080459A (ko) 차선 검출방법, 장치, 전자장치 및 가독 저장 매체
CN111898491B (zh) 一种车辆逆向行驶的识别方法、装置及电子设备
CN113469115B (zh) 用于输出信息的方法和装置
CN112101223B (zh) 检测方法、装置、设备和计算机存储介质
CN114648748A (zh) 一种基于深度学习的机动车违停智能识别方法及系统
KR20220146670A (ko) 교통 비정상 탐지 방법, 장치, 기기, 저장 매체 및 프로그램
CN111524350A (zh) 车路协同异常行驶状况检测方法、系统、终端设备及介质
TWI774034B (zh) 基於車聯網的行車示警方法、系統及設備
CN116311071A (zh) 一种融合帧差和ca的变电站周界异物识别方法及系统
Bhandari et al. Fullstop: A camera-assisted system for characterizing unsafe bus stopping
CN111427063B (zh) 一种移动装置通行控制方法、装置、设备、系统及介质
WO2024098992A1 (zh) 倒车检测方法及装置
CN113538968B (zh) 用于输出信息的方法和装置
TW202121332A (zh) 影像偵測區域取得方法及空間使用情況的判定方法
CN114596706B (zh) 路侧感知系统的检测方法及装置、电子设备和路侧设备
CN112906428A (zh) 影像侦测区域取得方法及空间使用情况的判定方法
CN114708498A (zh) 图像处理方法、装置、电子设备以及存储介质
CN115775371A (zh) 监视未礼让行人的车辆的系统和方法
CN112861701A (zh) 违章停车识别方法、装置、电子设备以及计算机可读介质

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2022561125

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22794247

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22794247

Country of ref document: EP

Kind code of ref document: A1