CN113408479A - Flame detection method and device, computer equipment and storage medium - Google Patents
Flame detection method and device, computer equipment and storage medium
- Publication number
- CN113408479A (application CN202110785919.6A)
- Authority
- CN
- China
- Prior art keywords
- flame
- score
- frame
- detection
- detection image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
Landscapes
- Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Fire-Detection Mechanisms (AREA)
- Image Analysis (AREA)
Abstract
The invention belongs to the technical field of image processing and provides a flame detection method, a flame detection device, computer equipment and a storage medium, aiming at solving the poor generalization and susceptibility to interference of purely visual flame detection that relies solely on color-space rules. To this end, the method of the invention comprises: detecting a flame frame in an image, and obtaining a flame metric index from the flame confidence, the flame color score and the flame motion score of the flame frame; and comparing the flame metric index with a flame threshold to determine whether a flame is present in the image. By combining the visual and motion characteristics of flames in fire detection, the method improves the anti-interference capability of flame detection, reduces the false alarm rate, and enables rapid detection and localization in the initial stage of a fire. Moreover, because it uses existing video monitoring equipment, low-cost fire detection can be realized, giving the method good application prospects.
Description
Technical Field
The invention belongs to the technical field of image processing, and particularly provides a flame detection method, a flame detection device, computer equipment and a storage medium.
Background
A fire is a disaster that can cause heavy casualties and property losses. According to statistics from the Fire and Rescue Bureau of the Ministry of Emergency Management, 17.6 thousand fires were reported nationwide in the first quarter of 2021, with 433 deaths, 249 injuries and direct economic losses of 13.9 million yuan. Although a fire can bring great loss to people's lives and property, it is, compared with other disasters, controllable to a certain extent; achieving this controllability requires discovering the fire in time and taking remedial measures in its early stage, which in turn requires that flames be detected in time.
Traditional flame detection is mainly based on smoke sensors, optical sensors, infrared sensors, heat-sensitive sensors and the like. However, such sensor-based flame detection has considerable limitations in detection time, detection range and so on, and generally cannot provide detailed information about a fire, such as flame size or degree of combustion. Thermal imaging cameras can also be used to detect fire, but they are far more expensive than ordinary cameras and therefore difficult to popularize.
With the deployment of surveillance cameras in public areas, automatically detecting flames from captured surveillance video images has become a technical trend. However, computer flame detection is usually based on rules learned in a color space, and such rule-based methods generalize poorly, are easily affected by factors such as illumination and brightness, and have difficulty distinguishing objects whose colors resemble those of flames. How to detect flames rapidly and accurately from surveillance video images has therefore become a problem to be solved in the field.
Accordingly, there is a need in the art for a new solution to the above-mentioned problems.
Disclosure of Invention
The invention aims to solve the above technical problem, namely the poor generalization and susceptibility to interference of flame detection methods based on color-space rule learning when they are applied to flame detection in surveillance video images.
In a first aspect, the present invention provides a method of flame detection, the method comprising:
acquiring a first detection image;
detecting the first detection image to obtain a flame frame in the first detection image, wherein the information of the flame frame comprises flame confidence;
calculating a flame color score for the flame frame;
calculating a flame motion score for the flame frame;
calculating a flame metric index of the flame frame according to the flame confidence, the flame color score and the flame motion score;
comparing the flame metric indicator to a flame threshold, determining that a flame is present in the first detected image when the flame metric indicator is greater than or equal to the flame threshold, and/or determining that a flame is not present in the first detected image when the flame metric indicator is less than the flame threshold.
In one embodiment of the flame detection method, the flame metric index is calculated by:
Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)
wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.
In an embodiment of the above flame detection method, the step of "calculating the flame color score of the flame frame" specifically includes:
carrying out YUV color space transformation on the flame frame;
counting the total number of pixels meeting the flame pixel constraint rule in the flame frame, and recording it as N_r;
calculating the flame color score, wherein the calculation formula of the flame color score is:
YUV_Score = N_r / N_All
wherein N_All is the total number of pixels in the flame frame;
the flame pixel constraint rules include:
Rule r1: Y(x, y) > U(x, y)
Rule r2: V(x, y) > U(x, y)
Rule r3:
Rule r4: |V(x, y) - U(x, y)| > τ, with τ = 40
where Y(x, y), U(x, y) and V(x, y) respectively denote the values of the Y, U and V components of the pixel with coordinates (x, y) in the YUV color space, N is the number of pixels of the flame frame in the x direction, and M is the number of pixels of the flame frame in the y direction.
In an embodiment of the above flame detection method, the step of "calculating a flame movement score of the flame frame" specifically includes:
acquiring an image of a frame adjacent to the first detection image and recording it as a second detection image;
respectively extracting flame key points in the first detection image and the second detection image;
matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;
obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;
and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.
In one embodiment of the above flame detection method, the method further comprises: determining the fire level according to the area ratio of the flame frame in the first detection image when it is judged that a flame exists in the first detection image.
In one embodiment of the above flame detection method, the information of the flame frame further includes a flame position;
the method further comprises the following steps: outputting to a user at least one of whether a fire is occurring, the fire level, and the flame location.
In a second aspect, the present invention provides a flame detection apparatus, the apparatus comprising:
an image acquisition module configured to acquire a first detection image and a second detection image;
a flame detection module configured to detect the first detection image to obtain a flame frame in the first detection image, the information of the flame frame including a flame confidence;
a flame color metric module configured to calculate a flame color score for the flame frame;
a flame motion metric module configured to calculate a flame motion score for the flame frame;
a flame metrology module configured to:
calculating a flame metric index of the flame frame according to the flame confidence, the flame color score and the flame motion score;
comparing the flame metric indicator to a flame threshold, determining that a flame is present in the first detected image when the flame metric indicator is greater than or equal to the flame threshold, and/or determining that a flame is not present in the first detected image when the flame metric indicator is less than the flame threshold.
In an embodiment of the flame detection apparatus, the flame metric index is calculated by:
Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)
wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.
In one embodiment of the above flame detection device, the flame colorimetry module performs the following operations:
carrying out YUV color space transformation on the flame frame;
counting the total number of pixels meeting the flame pixel constraint rule in the flame frame, and recording it as N_r;
calculating the flame color score, wherein the calculation formula of the flame color score is:
YUV_Score = N_r / N_All
wherein N_All is the total number of pixels in the flame frame;
the flame pixel constraint rules include:
Rule r1: Y(x, y) > U(x, y)
Rule r2: V(x, y) > U(x, y)
Rule r3:
Rule r4: |V(x, y) - U(x, y)| > τ, with τ = 40
where Y(x, y), U(x, y) and V(x, y) respectively denote the values of the Y, U and V components of the pixel with coordinates (x, y) in the YUV color space, N is the number of pixels of the flame frame in the x direction, and M is the number of pixels of the flame frame in the y direction.
In one embodiment of the above flame detection apparatus, the flame motion metric module performs the following operations:
acquiring an image of a frame adjacent to the first detection image and recording it as a second detection image;
respectively extracting flame key points in the first detection image and the second detection image;
matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;
obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;
and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.
In one embodiment of the flame detection apparatus, the apparatus further includes a fire level determination module configured to determine a fire level based on an area ratio of the flame frame in the first detection image when it is determined that the flame exists in the first detection image.
In a third aspect, the invention proposes a computer device comprising a processor and a storage means adapted to store a plurality of program codes adapted to be loaded and run by the processor to perform a flame detection method according to any of the above aspects.
In a fourth aspect, the present invention proposes a storage medium adapted to store a plurality of program codes adapted to be loaded and run by a processor to perform a flame detection method according to any of the above aspects.
With the above technical scheme, the invention detects flame frames in a video image and combines the flame confidence, the flame color score and the flame motion score of each flame frame into a flame metric index, from which it judges whether a flame is present in the image. The invention can effectively improve the generalization and anti-interference capability of purely visual flame detection, realize rapid detection and localization in the initial stage of a fire, and provide data support for a quick response, thereby reducing losses of life and property. Meanwhile, the method can use existing video monitoring equipment, so its implementation cost is low and it has great popularization value.
Drawings
Preferred embodiments of the present invention are described below with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of the main steps of a flame detection method of an embodiment of the invention.
Fig. 2 is a flowchart of a specific implementation of step S104 in fig. 1.
FIG. 3 is a schematic illustration of the varying angles of the flame keypoint matching pairs of the present invention.
FIG. 4 is a schematic view of the constitution of the flame detection device of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
Turning first to FIG. 1, FIG. 1 is a flow chart of the main steps of a flame detection method of an embodiment of the invention. As shown in fig. 1, the flame detection method of the present invention includes:
step S101: acquiring a first detection image;
step S102: detecting the first detection image to obtain a flame frame in the first detection image, wherein the information of the flame frame comprises a flame confidence coefficient;
step S103: calculating the flame color score of the flame frame;
step S104: calculating the flame motion score of the flame frame;
step S105: calculating a flame measurement index of the flame frame according to the flame confidence, the flame color score and the flame motion score;
step S106: comparing the flame metric index with a flame threshold to judge whether a flame exists in the first detection image.
In step S101, the first detection image usually comes from a frame captured from a surveillance video.
Moreover, depending on the chosen flame frame detection method, the requirements on the format, size, etc. of the first detection image differ. Therefore, the captured video frame generally needs one or a combination of scaling, padding, storage-format conversion and normalization according to the requirements of the target detection algorithm, so as to obtain a first detection image that meets the input-format requirements of the flame frame detection method.
In the present invention, the method for acquiring the second detection image is the same as the method for acquiring the first detection image, and the image formats of the first detection image and the second detection image are also the same.
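As a concrete illustration of this preprocessing step, the following minimal sketch (written in Python with OpenCV, an assumption, since the description does not prescribe a library) letterbox-resizes a captured frame and normalizes it; the 640-pixel target size and the grey padding value are illustrative choices, not values fixed by the invention.

```python
import cv2
import numpy as np

def preprocess_frame(frame_bgr, target_size=640):
    """Letterbox-resize a captured video frame and normalize it to [0, 1].

    target_size and the grey padding value 114 are illustrative assumptions;
    they must match whatever the chosen flame-frame detector expects.
    """
    h, w = frame_bgr.shape[:2]
    scale = target_size / max(h, w)
    resized = cv2.resize(frame_bgr, (int(round(w * scale)), int(round(h * scale))))
    canvas = np.full((target_size, target_size, 3), 114, dtype=np.uint8)  # padding
    canvas[:resized.shape[0], :resized.shape[1]] = resized
    rgb = cv2.cvtColor(canvas, cv2.COLOR_BGR2RGB)   # storage-format conversion
    return rgb.astype(np.float32) / 255.0           # normalization
```

The same routine would be applied to the second detection image so that both images share the same format.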
In step S102, the method for detecting the flame frame in the first detection image is not limited by the present invention; by way of example, a deep-learning-based target detection algorithm such as YOLOv5 or SSD may be selected, and those skilled in the art may choose a suitable method according to the actual situation.
The flame frame detection may yield one or more flame frames, whose information typically includes the flame confidence of the flame frame and the position of the flame frame in the first detection image. As an example, a pixel coordinate system may be established in the first detection image, with its origin usually chosen at the upper left corner of the image. The flame frame is usually rectangular and can be represented as Q_n(x, y, w, h, Conf), where x and y are the coordinates of the lower left corner of the rectangular box in the pixel coordinate system, w is the width of the rectangular box, h is its height, and Conf is the flame confidence of the region. The flame confidence indicates the likelihood that a flame is present in the region; generally, the larger the flame confidence value, the greater that likelihood.
When several flame frames in the first detection image contain one another or largely overlap, they can be screened by non-maximum suppression to remove redundant flame frames and keep the flame frame with the highest confidence. If needed, the flame frames can be further filtered by setting a flame confidence threshold, so as to obtain the one or more flame frames required by the invention.
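The sketch below illustrates this confidence filtering and non-maximum suppression, assuming candidate boxes of the form (x, y, w, h, Conf) coming from any detector; the threshold values and the use of OpenCV's cv2.dnn.NMSBoxes are assumptions made for illustration only.

```python
import cv2
import numpy as np

def filter_flame_boxes(boxes, conf_threshold=0.25, nms_threshold=0.45):
    """Keep the highest-confidence flame boxes from a detector's raw output.

    `boxes` is assumed to be a list of (x, y, w, h, conf) tuples produced by a
    YOLOv5- or SSD-style detector; both thresholds are illustrative values.
    """
    if not boxes:
        return []
    rects = [[int(x), int(y), int(w), int(h)] for x, y, w, h, _ in boxes]
    scores = [float(c) for *_, c in boxes]
    keep = cv2.dnn.NMSBoxes(rects, scores, conf_threshold, nms_threshold)
    keep = np.array(keep).flatten() if len(keep) else []
    return [boxes[i] for i in keep]
```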
In step S103, YUV color space transformation is first performed on the flame frame, and the number N_r of pixels satisfying the flame pixel constraint rule is counted according to that rule; the flame color score of the flame frame is then calculated as
YUV_Score = N_r / N_All
where N_All is the total number of pixels in the flame frame.
In step S103, the flame pixel constraint rule includes:
rule r 1: y (x, Y) > U (x, Y)
Rule r 2: v (x, y) > U (x, y)
Rule r 3:
rule r 4: l V (x, y) -U (x, y) | > τ, τ ═ 40
Y (x, Y), U (x, Y) and V (x, Y) respectively represent values of Y components, U components and V components of pixel points with coordinate values of (x, Y) in a YUV color space, N is the number of pixels in the x direction of the flame frame, and M is the number of pixels in the Y direction of the flame frame.
As an example, suppose the resolution of a flame frame image is 200 × 300, i.e., in the pixel coordinate system the number of pixels of the flame frame in the x direction is N = 200 and in the y direction is M = 300, so the total number of pixels in the flame frame is N_All = 200 × 300 = 60000. If the number of pixels satisfying the flame pixel constraint rule is N_r = 48000, the flame color score is YUV_Score = 48000 / 60000 = 0.8.
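A minimal sketch of this flame color score calculation is given below, assuming a BGR crop of the flame frame and OpenCV for the YUV transformation; rule r3 is not reproduced in this copy of the description, so the sketch applies only rules r1, r2 and r4.

```python
import cv2
import numpy as np

def yuv_color_score(flame_crop_bgr, tau=40):
    """Flame color score of one flame frame: fraction of pixels obeying the YUV rules.

    Only rules r1, r2 and r4 from the text above are applied here (an assumption,
    since rule r3 is not reproduced in this copy of the description).
    """
    yuv = cv2.cvtColor(flame_crop_bgr, cv2.COLOR_BGR2YUV).astype(np.int32)
    Y, U, V = yuv[..., 0], yuv[..., 1], yuv[..., 2]
    mask = (Y > U) & (V > U) & (np.abs(V - U) > tau)   # rules r1, r2, r4
    n_r = int(mask.sum())                              # pixels satisfying the rules
    n_all = Y.size                                     # total pixels in the flame frame
    return n_r / n_all

# With the worked example above (48000 of 60000 qualifying pixels) the score is 0.8.
```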
Continuing with fig. 2, fig. 2 is a specific implementation method of step S104, which includes:
step S1041: acquiring an image of a frame adjacent to the first detection image and recording it as a second detection image;
step S1042: respectively extracting flame key points in the first detection image and the second detection image;
step S1043: matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;
step S1044: obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;
step S1045: and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.
In step S1041, preferably, the image of the frame immediately following the first detection image in the surveillance video is captured as the second detection image, so that whether a flame exists in the first detection image can be further judged from the motion attributes of the flame.
In step S1042, the method for determining the flame key points of the first detection image and the second detection image is not limited in the present invention, and for example, the flame key points may be extracted by an ORB algorithm based on OpenCV, and a person skilled in the art may select an appropriate method according to actual situations.
In step S1043, the method for matching the flame key points in the first detection image and the second detection image is not limited in the present invention, and for example, the euclidean distance between the key points of the two images may be calculated by an SIFT algorithm based on OpenCV to perform matching, and those skilled in the art may select an appropriate method according to actual situations.
The change angle of a flame key point matching pair in step S1044 is explained with reference to FIG. 3. As shown in FIG. 3, points A1 and B1 are flame key points in the first detection image, and points A2 and B2 are flame key points in the second detection image; after the matching in step S1043, the flame key point matching pairs (A1, A2) and (B1, B2) are obtained. Connecting each matching pair with a straight line gives the vectors A1A2 and B1B2. As an example, if the change angle of a matching pair is defined as the angle between its vector and the positive x-axis direction, then the change angle of the matching pair (A1, A2) is the angle α between the vector A1A2 and the positive x-axis, and the change angle of the matching pair (B1, B2) is the angle β between the vector B1B2 and the positive x-axis.
In step S1045, the flame motion score of the flame frame is determined by analyzing the distribution of the change angles of all flame key point matching pairs in the flame frame. Because flame motion is disordered, the change angles of the matching pairs of a real flame are distributed relatively uniformly. That is, the more uniform the distribution of change angles, the higher the flame motion score of the flame frame; conversely, the more concentrated the distribution, the lower the flame motion score of the corresponding flame frame.
In step S1045, the method for calculating the flame motion score is not limited in the present invention; for example, the flame motion score of the flame frame may be determined from the mean and variance of the change angles of the flame key point matching pairs in the flame frame, and those skilled in the art may select an appropriate method according to the actual situation.
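As one possible realisation of steps S1042 to S1045 (an assumption, since the description leaves the keypoint extractor, the matcher and the scoring function open), the sketch below uses ORB keypoints, brute-force Hamming matching and the normalized entropy of the change-angle histogram as a uniformity measure.

```python
import cv2
import numpy as np

def flame_motion_score(crop_t_bgr, crop_t1_bgr, n_bins=8):
    """Motion score of a flame frame from the spread of keypoint-match angles.

    ORB + brute-force matching and a normalized histogram entropy are only one
    possible realisation; the description leaves these choices open.
    """
    g1 = cv2.cvtColor(crop_t_bgr, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(crop_t1_bgr, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(g1, None)
    kp2, des2 = orb.detectAndCompute(g2, None)
    if des1 is None or des2 is None:
        return 0.0
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    if not matches:
        return 0.0
    # Change angle of each matching pair: angle of the vector P1 -> P2 w.r.t. the +x axis.
    angles = [np.arctan2(kp2[m.trainIdx].pt[1] - kp1[m.queryIdx].pt[1],
                         kp2[m.trainIdx].pt[0] - kp1[m.queryIdx].pt[0])
              for m in matches]
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log(p))
    return float(entropy / np.log(n_bins))  # 1.0 means a perfectly uniform angle spread
```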
After obtaining the flame confidence, the flame color score and the flame motion score of the flame frame, the flame metric index of the flame frame may be obtained through step S105. Preferably, the flame metric is calculated as follows:
Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)
wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.
In practical applications, w1 and w2 can be set according to the conditions of the monitoring site: to find flames as early as possible, the weight w1 of the flame color score can be increased; to better capture flame variation, the weight w2 of the flame motion score can be increased. Those skilled in the art can determine appropriate values of w1 and w2 for different scenes through training.
In step S106, the flame threshold may be set to 0.5, and the flame metric indexes of all flame frames in the first detection image are compared with the flame threshold one by one: when the flame metric index of any flame frame in the first detection image is greater than or equal to the flame threshold, it is determined that a flame is present in the first detection image; when the flame metric indexes are all less than the flame threshold, it is determined that no flame is present in the first detection image.
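Putting steps S105 and S106 together, a minimal sketch of the flame metric index and the threshold decision could look as follows; the equal weights w1 = w2 = 0.5 are an illustrative assumption, not values prescribed by the method.

```python
def fire_metric(conf, yuv_score, motion_score, w1=0.5, w2=0.5):
    """Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score), with w1 + w2 = 1.

    The equal weights are an illustrative assumption; the description suggests
    tuning w1 and w2 per monitoring scene.
    """
    assert abs(w1 + w2 - 1.0) < 1e-9
    return conf * (w1 * yuv_score + w2 * motion_score)


def flame_present(fire_scores, flame_threshold=0.5):
    """A flame is reported as soon as any flame frame's Fire_Score reaches the threshold."""
    return any(score >= flame_threshold for score in fire_scores)
```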
It should be noted that those skilled in the art can also design other forms of flame metric index from the flame confidence, the flame color score and the flame motion score of the flame frame, for example
Fire_Score = w1 * YUV_Score + w2 * Motion_Score + w3 * Conf
where w3 is the weight of the flame confidence; flame detection can be realized in this way as well. Such modifications and substitutions are intended to fall within the protection scope of the present invention as long as they do not depart from its principle.
When it is determined that a flame is present in the first detection image, the area ratio of all flame frames containing flames to the first detection image can be further calculated. For example, if this area ratio reaches 70% or more, the fire can be judged to be serious.
In addition, the actual geographical position of the flame can be determined from the position of the flame frame in the first detection image, contained in the flame frame information, together with the geographical position at which the camera is installed.
After the flame detection result is obtained, information such as whether a fire has occurred, the fire level and the flame position can be output to the user, according to the settings, through on-screen display, voice broadcast and the like.
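As a sketch of the fire level judgment and output described above (with the single 70% cut-off taken from the example, and the simplifying assumption that flame frames do not overlap):

```python
def fire_level(flame_boxes, image_w, image_h, serious_ratio=0.7):
    """Grade the fire from the area ratio of the flame frames in the first image.

    The single 70% cut-off mirrors the example above; a deployment would likely
    use several graded thresholds. Overlapping boxes are double-counted here.
    """
    flame_area = sum(w * h for _, _, w, h, _ in flame_boxes)
    ratio = flame_area / float(image_w * image_h)
    level = "serious" if ratio >= serious_ratio else "minor"
    return level, ratio


def report(fire_detected, level, positions):
    """Output the detection result to the user, e.g. for on-screen display."""
    if fire_detected:
        print(f"Fire detected ({level}); flame frames at {positions}")
    else:
        print("No flame detected")
```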
Further, the invention also provides a flame detection device. As shown in fig. 4, the flame detection device 4 of the present invention mainly includes: an image acquisition module 41, a flame detection module 42, a flame color metric module 43, a flame motion metric module 44, a flame metric module 45, and a fire class discrimination module 46.
The image acquisition module 41 is configured to acquire a first detection image and a second detection image. The flame detection module 42 is configured to perform the operation in step S102. The flame colorimetry module 43 is configured to perform the operation in step S103. The flame motion metric module 44 is configured to perform the operations in step S104 and steps S1041 to S1045 in fig. 2. The flame metrology module 45 is configured to perform the operations in step S105 and step S106. The fire level discrimination module 46 is configured to determine a fire level based on an area ratio of a flame frame in the first detection image in a case where it is determined that there is a flame in the first detection image; and outputs to the user: whether a fire occurs, the fire level, the flame position, etc.
Further, the present invention also provides a computer device comprising a processor and a storage apparatus. The storage apparatus may be configured to store a program for executing the flame detection method of the above method embodiment, and the processor may be configured to execute the program in the storage apparatus, including but not limited to the program for the flame detection method of the above method embodiment. For convenience of explanation, only the parts related to the embodiments of the present invention are shown, and specific technical details are not disclosed. The computer device may be a control device formed of various electronic devices.
Further, the present invention also provides a storage medium, which may be configured to store a program for executing the flame detection method of the above-described method embodiment, which may be loaded and executed by a processor to implement the method of the above-described flame detection method. For convenience of explanation, only the parts related to the embodiments of the present invention are shown, and details of the specific techniques are not disclosed. The storage medium may be a storage device formed of various electronic apparatuses, and optionally, the storage medium is a non-transitory computer-readable storage medium in an embodiment of the present invention.
Those of skill in the art will appreciate that the method steps of the examples described in connection with the embodiments disclosed herein may be embodied in electronic hardware, computer software, or combinations of both, and that the components and steps of the examples have been described above generally in terms of their functionality in order to clearly illustrate the interchangeability of electronic hardware and software. Whether such functionality is implemented as electronic hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and in the claims, and in the drawings, are used for distinguishing between similar elements and not necessarily for describing or implying any particular order or sequence. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein.
It should be noted that in the description of the present application, the term "a and/or B" indicates all possible combinations of a and B, such as a alone, B alone, or a and B.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.
Claims (13)
1. A method of flame detection, the method comprising:
acquiring a first detection image;
detecting the first detection image to obtain a flame frame in the first detection image, wherein the information of the flame frame comprises flame confidence;
calculating a flame color score for the flame frame;
calculating a flame motion score for the flame frame;
calculating a flame metric index of the flame frame according to the flame confidence, the flame color score and the flame motion score;
comparing the flame metric indicator to a flame threshold, determining that a flame is present in the first detected image when the flame metric indicator is greater than or equal to the flame threshold, and/or determining that a flame is not present in the first detected image when the flame metric indicator is less than the flame threshold.
2. The flame detection method of claim 1, wherein the flame metric is calculated by:
Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)
wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.
3. The flame detection method according to claim 1, wherein the step of calculating the flame color score of the flame frame specifically comprises:
carrying out YUV color space transformation on the flame frame;
counting the total number of pixels meeting the flame pixel constraint rule in the flame frame, and recording it as N_r;
calculating the flame color score, wherein the calculation formula of the flame color score is:
YUV_Score = N_r / N_All
wherein N_All is the total number of pixels in the flame frame;
the flame pixel constraint rules include:
Rule r1: Y(x, y) > U(x, y)
Rule r2: V(x, y) > U(x, y)
Rule r3:
Rule r4: |V(x, y) - U(x, y)| > τ, with τ = 40
where Y(x, y), U(x, y) and V(x, y) respectively denote the values of the Y, U and V components of the pixel with coordinates (x, y) in the YUV color space, N is the number of pixels of the flame frame in the x direction, and M is the number of pixels of the flame frame in the y direction.
4. The flame detection method according to claim 1, wherein the step of calculating the flame motion score of the flame frame specifically comprises:
acquiring adjacent frame images of the first detection image and recording the adjacent frame images as a second detection image;
respectively extracting flame key points in the first detection image and the second detection image;
matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;
obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;
and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.
5. The flame detection method of claim 1, further comprising: and determining the fire level according to the area ratio of the flame frame in the first detection image when the flame is judged to exist in the first detection image.
6. The flame detection method of claim 5, wherein the information of the flame frame further includes a flame position;
the method further comprises the following steps: outputting to a user at least one of whether a fire is occurring, the fire level, and the flame location.
7. A flame detection device, the device comprising:
an image acquisition module configured to acquire a first detection image and a second detection image;
a flame detection module configured to detect the first detection image to obtain a flame frame in the first detection image, the information of the flame frame including a flame confidence;
a flame color metric module configured to calculate a flame color score for the flame frame;
a flame motion metric module configured to calculate a flame motion score for the flame frame;
a flame metrology module configured to:
calculating a flame metric index of the flame frame according to the flame confidence, the flame color score and the flame motion score;
comparing the flame metric indicator to a flame threshold, determining that a flame is present in the first detected image when the flame metric indicator is greater than or equal to the flame threshold, and/or determining that a flame is not present in the first detected image when the flame metric indicator is less than the flame threshold.
8. The flame detection device of claim 7, wherein the flame metric is calculated by:
Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)
wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.
9. The flame detection device of claim 7, wherein the flame colorimetry module performs in particular the following:
carrying out YUV color space transformation on the flame frame;
counting the total number of pixels meeting the flame pixel constraint rule in the flame frame, and recording it as N_r;
calculating the flame color score, wherein the calculation formula of the flame color score is:
YUV_Score = N_r / N_All
wherein N_All is the total number of pixels in the flame frame;
the flame pixel constraint rules include:
Rule r1: Y(x, y) > U(x, y)
Rule r2: V(x, y) > U(x, y)
Rule r3:
Rule r4: |V(x, y) - U(x, y)| > τ, with τ = 40
where Y(x, y), U(x, y) and V(x, y) respectively denote the values of the Y, U and V components of the pixel with coordinates (x, y) in the YUV color space, N is the number of pixels of the flame frame in the x direction, and M is the number of pixels of the flame frame in the y direction.
10. The flame detection device of claim 7, wherein the flame motion metric module performs in particular the following:
acquiring adjacent frame images of the first detection image and recording the adjacent frame images as a second detection image;
respectively extracting flame key points in the first detection image and the second detection image;
matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;
obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;
and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.
11. The flame detection device according to claim 7, further comprising a fire level discrimination module configured to determine a fire level based on an area ratio of the flame frame in the first detection image when it is determined that there is a flame in the first detection image.
12. A computer device comprising a processor and a storage means adapted to store a plurality of program codes, characterized in that said program codes are adapted to be loaded and run by said processor to perform the flame detection method according to any of claims 1 to 6.
13. A storage medium adapted to store a plurality of program codes, wherein the program codes are adapted to be loaded and executed by a processor to perform the flame detection method of any of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110785919.6A CN113408479A (en) | 2021-07-12 | 2021-07-12 | Flame detection method and device, computer equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110785919.6A CN113408479A (en) | 2021-07-12 | 2021-07-12 | Flame detection method and device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113408479A true CN113408479A (en) | 2021-09-17 |
Family
ID=77686033
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110785919.6A Pending CN113408479A (en) | 2021-07-12 | 2021-07-12 | Flame detection method and device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113408479A (en) |
- 2021-07-12: application CN202110785919.6A filed in China; published as CN113408479A (status: Pending)
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110051993A1 (en) * | 2008-05-08 | 2011-03-03 | UTF Fire & Security | System and method for ensuring the performance of a video-based fire detection system |
US20120262583A1 (en) * | 2011-04-18 | 2012-10-18 | Xerox Corporation | Automated method and system for detecting the presence of a lit cigarette |
US20180341813A1 (en) * | 2017-05-25 | 2018-11-29 | Qualcomm Incorporated | Methods and systems for appearance based false positive removal in video analytics |
CN109815863A (en) * | 2019-01-11 | 2019-05-28 | 北京邮电大学 | Firework detecting method and system based on deep learning and image recognition |
CN110263654A (en) * | 2019-05-23 | 2019-09-20 | 深圳市中电数通智慧安全科技股份有限公司 | A kind of flame detecting method, device and embedded device |
CN111860323A (en) * | 2020-07-20 | 2020-10-30 | 北京华正明天信息技术股份有限公司 | Method for identifying initial fire in monitoring picture based on yolov3 algorithm |
CN113033553A (en) * | 2021-03-22 | 2021-06-25 | 深圳市安软科技股份有限公司 | Fire detection method and device based on multi-mode fusion, related equipment and storage medium |
CN113012383A (en) * | 2021-03-26 | 2021-06-22 | 深圳市安软科技股份有限公司 | Fire detection alarm method, related system, related equipment and storage medium |
Non-Patent Citations (2)
Title |
---|
WANG M W et al.: "Fire recognition based on multi-channel convolutional neural network", Fire Technology *
PENG Yumin et al.: "Fire detection algorithm combining an attention mechanism", Computer Measurement & Control *
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230188671A1 (en) * | 2021-12-09 | 2023-06-15 | Anhui University | Fire source detection method and device under condition of small sample size and storage medium |
US11818493B2 (en) * | 2021-12-09 | 2023-11-14 | Anhui University | Fire source detection method and device under condition of small sample size and storage medium |
CN114425133A (en) * | 2022-02-09 | 2022-05-03 | 吕德生 | Indoor flame autonomous inspection and fire extinguishing method |
CN114425133B (en) * | 2022-02-09 | 2023-10-17 | 吕德生 | Indoor flame autonomous inspection and fire extinguishing method |
CN117593588A (en) * | 2023-12-14 | 2024-02-23 | 小黄蜂智能科技(广东)有限公司 | Intelligent identification method and device for flame image |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113408479A (en) | Flame detection method and device, computer equipment and storage medium | |
KR101245057B1 (en) | Method and apparatus for sensing a fire | |
CN111739250B (en) | Fire detection method and system combining image processing technology and infrared sensor | |
KR101822924B1 (en) | Image based system, method, and program for detecting fire | |
CN112489371B (en) | Swimming pool drowning prevention early warning system based on computer vision | |
CN104168478B (en) | Based on the video image color cast detection method of Lab space and relevance function | |
JP4653207B2 (en) | Smoke detector | |
CN103514430B (en) | The method and apparatus of detection flame | |
KR102592231B1 (en) | Method for diagnosing fault of camera | |
JP4999794B2 (en) | Still region detection method and apparatus, program and recording medium | |
JP3486229B2 (en) | Image change detection device | |
CN104883539B (en) | A kind of monitoring method and system anti-tamper for region-of-interest | |
CN109360370B (en) | Robot-based smoke and fire detection method | |
CN110096945A (en) | Indoor Video key frame of video real time extracting method based on machine learning | |
CN115937508A (en) | Method and device for detecting fireworks | |
US20110205360A1 (en) | Supervising system for image | |
JP5015838B2 (en) | Smoke detector | |
JP2024008990A (en) | Monitoring device, monitoring system, and monitoring method | |
CN117392495A (en) | Video flame detection method and system based on feature fusion | |
JP2018018500A (en) | Face identification method | |
JP3888528B2 (en) | Liquid level recognition processing apparatus and liquid level monitoring system | |
CN116168345A (en) | Fire detection method and related equipment | |
US11625837B2 (en) | Earthquake monitoring system and earthquake monitoring method | |
CN111091024B (en) | Small target filtering method and system based on video recognition result | |
JP5015839B2 (en) | Smoke detector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20210917 |