CN117478891B - Intelligent management system for building construction - Google Patents
- Publication number
- CN117478891B (application CN202311824815.7A)
- Authority
- CN
- China
- Prior art keywords
- frame
- difference
- value
- image
- gray
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/157—Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
- H04N19/159—Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/117—Filters, e.g. for pre-processing or post-processing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/136—Incoming video signal characteristics or properties
- H04N19/137—Motion inside a coding unit, e.g. average field, frame or block difference
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/176—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/90—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using coding techniques not provided for in groups H04N19/10-H04N19/85, e.g. fractals
- H04N19/93—Run-length coding
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
The invention relates to the technical field of video signal compression, and in particular to an intelligent management system for building construction. Each frame image is first divided into pixel blocks; an inter-frame difference map is obtained from the gray differences of the corresponding pixel blocks of adjacent frames, and the difference deviation trend of the corresponding pixel blocks is obtained from the gray features of the inter-frame difference map. The correlation coefficient of adjacent frames is obtained from the gray difference features within the adjacent-frame pixel blocks. The homology degree of the pixel blocks and the approximation degree of the images are then obtained from the correlation coefficient of adjacent frames, the difference deviation trend and the gray features of the pixel blocks. Finally, the invention decides whether to retain or discard each image according to the approximation degree and adaptively sets a low-pass filter to obtain the filtered image, which is then compressed, so that compression efficiency and information integrity are improved, and the efficiency and accuracy of construction management monitoring are further improved.
Description
Technical Field
The invention relates to the technical field of video signal compression, in particular to an intelligent management system for building construction.
Background
Because the traditional construction industry suffers from high energy consumption, high consumption of building materials and adverse ecological impact, prefabricated (assembled) building - in which components traditionally produced by on-site work are manufactured centrally and assembled on site in combination with BIM technology - has become an important direction of green development in recent years. BIM (Building Information Modeling) centers on establishing a virtual three-dimensional building engineering model, using digital technology to provide the model with a complete building engineering information base consistent with actual conditions, and comprehensively monitoring on-site construction operations.
When BIM analyzes and manages the dynamic model against the actual construction site, it needs to acquire site monitoring footage in real time, and multi-angle monitoring of the construction site places a heavy load on the transmission and storage of the BIM system. In the prior art, video compression divides the video into frame images and compresses them with a run-length encoding algorithm; the compression efficiency of run-length encoding depends on the amount of high-frequency information in the compressed image, and the less such content, the better the compression. However, surveillance video contains many consecutive frames with only minimal change, so redundant images are numerous and video compression efficiency is low. Existing image redundancy elimination methods only analyze the overall gray change of the image, so their redundancy elimination accuracy is low and it is difficult to guarantee the integrity of the video signal after compression, which in turn affects the accuracy and efficiency of construction management monitoring.
Disclosure of Invention
In order to solve the technical problems that existing image redundancy elimination methods only analyze the overall gray change of the image, have low redundancy elimination accuracy, and make it difficult to guarantee the integrity of the video signal after compression, thereby affecting the accuracy of construction management monitoring, the invention provides an intelligent management system for building construction. The adopted technical scheme is as follows:
and a data acquisition module: the method comprises the steps of acquiring an image of each frame in a building construction monitoring video; dividing the image of each frame into at least two blocks of pixels;
the difference analysis module: the method comprises the steps of obtaining an inter-frame difference graph according to gray level differences of corresponding pixel points in pixel blocks of adjacent frames, and obtaining difference deviation trend of the pixel blocks of the corresponding adjacent frames according to gray level size characteristics of the inter-frame difference graph; obtaining the correlation coefficient of the adjacent frame of the pixel block according to the gray difference characteristic in the pixel block corresponding to the adjacent frame; obtaining the homology degree of the pixel blocks corresponding to the adjacent frames according to the adjacent frame correlation coefficient, the difference deviation trend and the gray scale characteristics of the pixel blocks corresponding to the adjacent frames;
the construction management module is used for obtaining the approximation degree of the image corresponding to the adjacent frame according to the homology degree; determining whether to retain or discard the image according to the approximation degree and adaptively setting a low-pass filter to obtain a filtered image; and obtaining a compressed data stream through image compression according to the filtered image and performing construction management monitoring.
Further, the step of obtaining an inter-frame difference map according to the gray scale difference of the corresponding pixel point in the pixel block of the adjacent frame includes:
and calculating the sum of the gray value difference value and 256 of the pixel points at the same position in the pixel block corresponding to any frame and the adjacent previous frame to obtain an adjacent gray difference value, calculating the ratio of the adjacent gray difference value to a constant 2 to obtain the gray value of the pixel point at the corresponding position in the inter-frame difference map, and obtaining the gray value of each pixel point in the inter-frame difference map to obtain the inter-frame difference map.
Further, the step of obtaining a differential bias trend for the pixel blocks of corresponding adjacent frames includes:
for any pixel point in the inter-frame difference graph, calculating the difference value between the gray value of the pixel point and 128 to obtain a gray change characterization value of the pixel point, and calculating the sum of the standard deviation of the gray value of the inter-frame difference graph and a preset minimum positive number to serve as the standard deviation characterization value; calculating the third power of the ratio of the gray level change characterization value to the standard deviation characterization value to obtain a difference degree characterization value of the pixel point; and calculating and normalizing the sum of the difference degree representation values of all the pixel points to obtain the difference deviation trend.
Further, the step of obtaining the correlation coefficient of the adjacent frame of the pixel block according to the gray difference characteristic in the pixel block corresponding to the adjacent frame includes:
calculating the difference value between the gray value of any one pixel point in the pixel block of any one frame and the gray average value of the corresponding pixel block in the pixel block of any one frame in the pixel block corresponding to the adjacent previous frame to obtain the gray deviation characterization value of the current frame; calculating the difference value between the gray value of the pixel point at the corresponding position in the pixel block of the adjacent previous frame and the gray average value of the corresponding pixel block to obtain the gray deviation characterization value of the adjacent frame; calculating covariance between the gray scale deviation characterization value of the current frame and the gray scale deviation characterization value of the adjacent frame; calculating the product of the standard deviation of the gray value of the pixel block of any frame and the standard deviation of the gray value of the pixel block of the adjacent previous frame to obtain a standard deviation product representation value;
and calculating the ratio of the covariance of the gray scale deviation characterization value of the current frame to the gray scale deviation characterization value of the adjacent frame to the standard deviation product characterization value to obtain the correlation coefficient of the adjacent frame.
Further, the step of obtaining the degree of homology of the pixel blocks corresponding to the neighboring frames includes:
calculating the difference absolute value of the gray average value of the pixel block corresponding to any frame and the adjacent previous frame to obtain an adjacent gray difference representation value; calculating the product and negative correlation mapping of the adjacent gray level difference characterization value and the difference deviation trend absolute value to obtain an overall difference characterization value; calculating the sum value of the adjacent frame correlation coefficient and a constant 1, and calculating the ratio of the sum value of the adjacent frame correlation coefficient and the constant 1 to a constant 2 to obtain a correlation coefficient characterization value; and calculating the product of the correlation coefficient characterization value and the overall difference characterization value to obtain the homology degree of the pixel blocks of adjacent frames.
Further, the step of obtaining the approximation degree of the image corresponding to the adjacent frame according to the homology degree includes:
and taking the image of any frame and the image of the adjacent previous frame as the image of the adjacent frame, and calculating the average value of the homology degree of the pixel blocks of the adjacent frame to obtain the approximation degree.
Further, the step of determining the choice of the image according to the approximation degree and adaptively setting a low-pass filter to obtain a filtered image includes:
for the images of any frame and the adjacent previous frame, discarding the image of any frame when the approximation degree is greater than a preset approximation threshold;
when the approximation degree is smaller than a preset difference threshold value, taking the image of any frame as the filtering image;
when the approximation degree is not greater than a preset approximation threshold value and not less than a preset difference threshold value, carrying out Fourier transform on the image of any frame to obtain a frequency domain image; calculating the difference between a constant 1 and the approximation degree to obtain a filter matrix coefficient, respectively calculating the products of the filter matrix coefficient with the length and the width of the frequency domain image to obtain a filter matrix length and a filter matrix width, setting a low-pass filter according to the filter matrix length and the filter matrix width to filter the frequency domain image and obtain a filtered frequency domain image, and performing inverse Fourier transform on the filtered frequency domain image to obtain the filtered image.
Further, the step of obtaining a compressed data stream by image compression according to the filtered image and performing construction management monitoring includes:
and compressing the filtered image through run-length coding to obtain a compressed data stream, uploading the compressed data stream to a storage device, and performing construction management monitoring according to the compressed data stream.
The invention has the following beneficial effects:
in the embodiment of the invention, the image is divided into the pixel blocks, so that the calculation efficiency can be improved, and the contrast of detail features of adjacent frames is facilitated; the gray scale change condition of each pixel point in the pixel block corresponding to the adjacent frame can be intuitively reflected by the obtained inter-frame difference map; the difference deviation trend can reflect the gray level change degree of the pixel blocks of the adjacent frames, and is convenient for the analysis of the final redundant image. The correlation coefficient of the adjacent frames can reflect the gray level change correlation degree of the pixel blocks corresponding to the adjacent frames, and meanwhile, the special condition reflected by the difference deviation trend is avoided, so that the analysis accuracy of the redundant images is improved. The homology degree can reflect the gray level difference degree of the pixel blocks corresponding to the adjacent frames, and the approximation degree can reflect the gray level difference degree of the images of the adjacent frames; whether the image is redundant or not can be intuitively and accurately analyzed according to the approximation degree, and a low-pass filter can be adaptively arranged, so that high-frequency information is reduced; finally, the redundant image is removed and the high-frequency information is removed through the approximation degree, so that the removing accuracy of the redundant image and the compression efficiency of video signals are improved, and the efficiency and accuracy of construction management monitoring are further improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a block diagram of a building construction intelligent management system according to an embodiment of the present invention.
Detailed Description
In order to further describe the technical means adopted by the invention to achieve the intended purpose and their effects, an intelligent management system for building construction according to the invention - its specific implementation, structure, features and effects - is described in detail below with reference to the accompanying drawings and the preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the intelligent management system for building construction provided by the invention with reference to the accompanying drawings.
Referring to fig. 1, a block diagram of a building construction intelligent management system according to an embodiment of the present invention is shown, where the system includes the following modules:
the data acquisition module S1 is used for acquiring an image of each frame in the building construction monitoring video; the image of each frame is divided into at least two pixel blocks.
In the embodiment of the invention, the implementation scenario is compression of the surveillance video signal of a construction site. BIM, as a three-dimensional building model, mainly recombines the temporal and spatial data of multiple subsystems into a three-dimensional model, which includes monitoring and managing construction through visual monitoring data during the construction process; monitoring image data therefore need to be collected at the construction site. The collection device is the video monitoring module of the construction site; the Internet-of-Things-based video monitoring module integrates and reconstructs the collected multi-angle video data through a collaborative processing subsystem. Because the monitoring required by BIM mainly serves modeling analysis, for which color information is not useful, the monitoring images are grayed to generate a monitoring video data stream. Redundant images with little change in the surveillance video reduce the efficiency of video signal compression and affect construction management and monitoring. To remove redundant images from the monitoring video data stream, the grayed image of each frame of the surveillance video is acquired, and whether an image is redundant is then analyzed frame by frame. Some existing redundant-image analysis methods judge whether an image has changed from the gray change of the whole image; however, the overall gray level may remain unchanged while the gray levels at many details change, so such images are easily removed as redundant and effective information is lost during compression. To improve the accuracy of redundant-image analysis and the efficiency of the computation, the image of each frame is divided into at least two pixel blocks, and the pixel blocks at corresponding positions of adjacent frames are analyzed; in the embodiment of the invention, each frame image is divided into 36 pixel blocks of equal size, and an implementer can determine this according to the implementation scenario.
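To make the block division above concrete, the following is a minimal Python/NumPy sketch. The 6×6 grid (36 equal-sized blocks) follows this embodiment, while the function name split_into_blocks and the assumption that the frame dimensions are divisible by the grid size are illustrative choices, not part of the patent.

```python
import numpy as np

def split_into_blocks(frame: np.ndarray, rows: int = 6, cols: int = 6):
    """Divide a grayscale frame into rows x cols equally sized pixel blocks.

    Assumes the frame height and width are divisible by rows and cols;
    a real implementation would pad or crop the frame otherwise.
    """
    h, w = frame.shape
    bh, bw = h // rows, w // cols
    blocks = [
        frame[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
        for r in range(rows)
        for c in range(cols)
    ]
    return blocks  # 36 blocks for the 6x6 grid used in this embodiment
```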
After the image of each frame of the building construction monitoring video is obtained and divided into different pixel blocks, the subsequent steps can analyze redundancy conditions according to the gray scale characteristics of the pixel blocks at the corresponding positions of the adjacent frames.
The difference analysis module S2: the method comprises the steps of obtaining an inter-frame difference graph according to gray level differences of corresponding pixel points in pixel blocks of adjacent frames, and obtaining a difference deviation trend of the pixel blocks of the corresponding adjacent frames according to gray level size characteristics of the inter-frame difference graph; obtaining the correlation coefficient of the adjacent frame of the pixel block according to the gray difference characteristic in the pixel block corresponding to the adjacent frame; and obtaining the homology degree of the pixel blocks corresponding to the adjacent frames according to the correlation coefficient of the adjacent frames, the deviation tendency of the difference and the gray scale characteristics of the pixel blocks corresponding to the adjacent frames.
Firstly, because changes in the gray values of pixels intuitively reflect whether image details change, the image change between adjacent frames is characterized by the gray differences of corresponding pixels, so an inter-frame difference map can be obtained from the gray differences of corresponding pixels in the pixel blocks of adjacent frames, specifically: calculate the sum of the gray value difference of the pixels at corresponding positions in the pixel blocks of any frame and its adjacent previous frame and 256 to obtain an adjacent gray difference value, calculate the ratio of the adjacent gray difference value to the constant 2 to obtain the gray value of the pixel at the corresponding position in the inter-frame difference map, and compute the gray value of every pixel to obtain the inter-frame difference map. The gray value of each pixel in the inter-frame difference map represents the gray change of the adjacent frames at the corresponding position: because the gray difference of corresponding pixels of adjacent frames lies between -256 and 256, dividing the adjacent gray difference value by 2 maps it into 0 to 256. When the gray value of a pixel in the inter-frame difference map is 128, the gray difference of the corresponding pixels of the adjacent frames is 0 and the gray value at that position is unchanged; if the gray difference at all positions in the pixel block is 0, i.e. every pixel of the inter-frame difference map has gray value 128, the gray values of the pixels in the image are unchanged.
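A small sketch of the mapping just described - difference-map gray value = (current gray − previous gray + 256) / 2, so 128 marks an unchanged pixel - might look as follows in Python/NumPy; the function name is a hypothetical label.

```python
import numpy as np

def inter_frame_difference_map(curr_block: np.ndarray, prev_block: np.ndarray) -> np.ndarray:
    """Inter-frame difference map: (curr - prev + 256) / 2, so 128 means 'unchanged'."""
    diff = curr_block.astype(np.float64) - prev_block.astype(np.float64)
    return (diff + 256.0) / 2.0
```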
When the gray value of a pixel in the inter-frame difference map is smaller than 128, the gray value of the corresponding pixel has decreased between the adjacent frames; when it is greater than 128, the gray value has increased. Therefore, in order to intuitively characterize the image change features of adjacent frames, the difference deviation trend of the corresponding pixel blocks of adjacent frames can be obtained from the gray magnitude features of the inter-frame difference map, specifically: for any pixel in the inter-frame difference map, calculate the difference between its gray value and 128 to obtain the gray change characterization value, and calculate the sum of the standard deviation of the gray values of the inter-frame difference map and a preset minimal positive number as the standard deviation characterization value; calculate the cube (third power) of the ratio of the gray change characterization value to the standard deviation characterization value to obtain the difference degree characterization value; sum the difference degree characterization values of all pixels and normalize the sum to obtain the difference deviation trend. The formula for the difference deviation trend is specifically:
in the method, in the process of the invention,bias the trend towards differences for pixel blocks of neighboring frames, < >>Representing the number of pixels in the inter-disparity map,/-, for example>Representing a preset minimum positive number, in order to prevent the denominator from being zero, in the present embodiment 0.01, ++>Standard deviation of gray value of pixel point in the representation frame difference diagram, < >>Representing +.>Gray values of the individual pixels; />The values can be normalized to +.>Is a kind of medium. />For standard deviation characterization value, +.>Values are characterized for the degree of difference.
For the acquisition of the difference deviation trend, the standard deviation characterization value reflects the overall change of the adjacent-frame pixel blocks as a whole; the smaller the standard deviation, the more similar the overall change. When the gray values of the pixels in the inter-frame difference map are close to 128, the gray values at those positions are unchanged between the adjacent frames and the final difference deviation trend is close to 0; when they are close to 0, the gray values have decreased and the normalized difference deviation trend is close to -1; when they are close to 256, the gray values have increased and the normalized difference deviation trend is close to 1. Thus the difference deviation trend computed from the gray features of the inter-frame difference map intuitively reflects the change of the pixel block between adjacent frames: the closer the difference deviation trend is to 0, the smaller the overall change of the pixel block; the closer it is to -1 or 1, the larger the overall change.
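The difference deviation trend can be sketched as below. The sum of cubed, standard-deviation-scaled deviations from 128 follows the formula above, but the exact normalization into [-1, 1] is not fully specified in the text, so the squashing used here (tanh of the per-pixel mean) is an assumption.

```python
import numpy as np

def difference_deviation_trend(diff_map: np.ndarray, eps: float = 0.01) -> float:
    """Difference deviation trend of a block pair from its inter-frame difference map."""
    sigma = diff_map.std() + eps                    # standard deviation characterization value
    z = (diff_map.astype(np.float64) - 128.0) / sigma
    raw = np.sum(z ** 3)                            # sum of the difference degree characterization values
    # Normalization into (-1, 1); the concrete normalization used here is an assumption.
    return float(np.tanh(raw / diff_map.size))
```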
Further, when the difference deviation trend is calculated, a special case can arise in which the pixels whose gray values increase and the pixels whose gray values decrease are equal in number and degree of change; the difference deviation trend is then zero even though the gray values of the pixel block have changed greatly. To avoid the influence of this special case on the subsequent redundancy elimination step, the gray change features of the pixel blocks need to be analyzed from several aspects, so the adjacent-frame correlation coefficient of a pixel block is obtained from the gray difference features within the corresponding pixel blocks of adjacent frames, specifically: for the corresponding pixel blocks of any frame and its adjacent previous frame, calculate the difference between the gray value of any pixel in the block of the current frame and the block's gray mean to obtain the gray deviation characterization value of the current frame; calculate the difference between the gray value of the corresponding pixel in the block of the adjacent previous frame and that block's gray mean to obtain the gray deviation characterization value of the adjacent frame; calculate the covariance between the two gray deviation characterization values; calculate the product of the standard deviation of the gray values of the block in the current frame and the standard deviation of the gray values of the block in the adjacent previous frame to obtain the standard deviation product characterization value; and calculate the ratio of the covariance to the standard deviation product characterization value to obtain the correlation coefficient of the adjacent frames. The formula for the correlation coefficient of the adjacent frames is specifically:
in the method, in the process of the invention,representing the correlation coefficients of adjacent frames,/>The difference value between the gray value of any pixel point in the pixel block of any frame and the gray average value is the gray deviation characterization value of the current frame; />Representing the difference value between the gray value of the corresponding pixel point in the pixel block of the adjacent previous frame and the gray average value, and taking the difference value as the gray deviation characterization value of the adjacent frame; />Representing covariance of the gray scale deviation characterization value of the current frame and the gray scale deviation characterization value of the adjacent frame; />Standard deviation of gray value representing pixel block of any frame,/for the pixel block of any frame>Standard deviation of gray values representing pixel blocks of adjacent previous frame +.>Representing the standard deviation product characterization value.
Regarding the acquisition of the correlation coefficient of the adjacent frames, the above formula is the Pearson correlation coefficient; the Pearson correlation coefficient is prior art used to measure the correlation of two variables, and its detailed calculation steps are not repeated. The correlation coefficient of the adjacent frames takes values in $[-1,1]$. The closer it is to 1, the more similar the deviations of the corresponding pixels of the pixel block from their block means are in the two adjacent frames, and the smaller the gray change of the corresponding pixel blocks; the farther it is from 1, the less similar these deviations are and the larger the gray change of the corresponding pixel blocks. Thus whether the gray values of the corresponding pixel blocks of adjacent frames change greatly can be intuitively reflected by the magnitude of the correlation coefficient of the adjacent frames.
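Since the adjacent-frame correlation coefficient is the Pearson correlation of co-located blocks, a sketch is straightforward; the small epsilon guarding against a zero denominator for perfectly flat blocks is an added safeguard, not part of the patent formula.

```python
import numpy as np

def adjacent_frame_correlation(curr_block: np.ndarray, prev_block: np.ndarray, eps: float = 1e-8) -> float:
    """Pearson correlation coefficient of co-located pixel blocks of adjacent frames."""
    x = curr_block.astype(np.float64).ravel()
    y = prev_block.astype(np.float64).ravel()
    cov = np.mean((x - x.mean()) * (y - y.mean()))   # covariance of the gray deviation characterization values
    denom = x.std() * y.std() + eps                  # standard deviation product characterization value (+ guard)
    return float(cov / denom)
```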
After the correlation coefficient and the difference deviation trend of the adjacent frames are obtained, the gray level change degree of the pixel blocks of the adjacent frames can be analyzed, so that the homology degree of the pixel blocks of the corresponding adjacent frames is obtained according to the correlation coefficient and the difference deviation trend of the adjacent frames and the gray level characteristics of the pixel blocks corresponding to the adjacent frames, and the method specifically comprises the following steps: calculating the difference absolute value of the gray average value of the pixel block corresponding to any frame and the adjacent previous frame to obtain an adjacent gray difference representation value; calculating the product of the adjacent gray level difference characterization value and the absolute value of the difference deviation trend and carrying out negative correlation mapping to obtain an overall difference characterization value; calculating the sum value of the correlation coefficient of the adjacent frame and 1, and calculating the ratio of the sum value of the correlation coefficient of the adjacent frame and constant 1 to constant 2 to obtain a correlation coefficient characterization value; calculating the product of the correlation coefficient characterization value and the overall difference characterization value to obtain the homology degree; the acquisition formula of the homology degree specifically comprises:
in the method, in the process of the invention,representing the homology degree of the corresponding pixel blocks of the adjacent frames; />The absolute value of the difference value of the gray average value of the pixel block corresponding to any frame and the adjacent previous frame is represented as the adjacent gray difference representation value; />Representing the product of the adjacent gray scale difference characterization value and the absolute value of the difference deviation trend; />Represents an exponential function with a base of a natural constant,representing the product of the adjacent gray level difference characterization value and the absolute value of the difference deviation trend and carrying out negative correlation mapping to obtain an overall difference characterization value; />Characterizing the value for the correlation coefficient.
For the homology degree, the value range is $[0,1]$. The correlation coefficient characterization value ranges from 0 to 1, and the closer it is to 1, the smaller the gray change of the corresponding pixel blocks of adjacent frames; the closer the adjacent gray difference characterization value is to 0, the smaller the difference of the gray means of the adjacent-frame pixel blocks; and the closer the difference deviation trend is to 0, the smaller the gray difference of the adjacent-frame pixel blocks may be. Therefore, when the homology degree is finally close to 1, the gray change of the corresponding pixel blocks of adjacent frames is small and the blocks can be regarded as homologous images; when the homology degree is closer to 0, the gray change of the corresponding pixel blocks is larger and the blocks can be regarded as homologous images with larger differences.
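Combining the three quantities into the homology degree H = ((P + 1) / 2) · exp(−|Δμ| · |T|) can be sketched as follows; the function and argument names are illustrative.

```python
import numpy as np

def homology_degree(curr_block: np.ndarray, prev_block: np.ndarray,
                    correlation: float, deviation_trend: float) -> float:
    """Homology degree of co-located pixel blocks of two adjacent frames."""
    mean_gap = abs(float(curr_block.mean()) - float(prev_block.mean()))  # adjacent gray difference characterization value
    overall_diff = float(np.exp(-mean_gap * abs(deviation_trend)))       # overall difference characterization value
    corr_term = (correlation + 1.0) / 2.0                                # correlation coefficient characterization value
    return corr_term * overall_diff
```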
The gray level change degree of the pixel blocks corresponding to the adjacent frames can be judged according to the homology degree, and the redundant images can be analyzed according to the homology degree values of all the pixel blocks of the adjacent frames.
A construction management module S3, configured to obtain the approximation degree of the images of adjacent frames according to the homology degree; determine whether to retain or discard an image according to the approximation degree and adaptively set a low-pass filter to obtain a filtered image; and obtain a compressed data stream through image compression of the filtered image and perform construction management monitoring.
After obtaining the homology degree of every pixel block of the adjacent frames, the overall change of the images of the adjacent frames needs to be analyzed, so the approximation degree of the images of the adjacent frames is obtained from the homology degree, specifically: for any frame and its adjacent previous frame, calculate the mean of the homology degrees of the adjacent-frame pixel blocks to obtain the approximation degree. The greater the approximation degree, the more similar the two images of the adjacent frames; the smaller the approximation degree, the less similar they are.
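Tying the block-level quantities together, the approximation degree of two adjacent frames is simply the mean homology degree over all co-located block pairs; the sketch below reuses the hypothetical helpers defined earlier and assumes equally sized grayscale frames.

```python
import numpy as np

def frame_approximation_degree(curr_frame: np.ndarray, prev_frame: np.ndarray) -> float:
    """Mean block-wise homology degree of two adjacent grayscale frames."""
    curr_blocks = split_into_blocks(curr_frame)
    prev_blocks = split_into_blocks(prev_frame)
    degrees = []
    for cb, pb in zip(curr_blocks, prev_blocks):
        diff_map = inter_frame_difference_map(cb, pb)
        trend = difference_deviation_trend(diff_map)
        corr = adjacent_frame_correlation(cb, pb)
        degrees.append(homology_degree(cb, pb, corr, trend))
    return float(np.mean(degrees))
```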
Further, when the approximation degree of two images of adjacent frames is too high, the two images are similar, i.e. the latter is regarded as a redundant image whose information contributes little and which reduces the compression efficiency of the subsequent video signal; therefore, whether to retain or discard an image can be determined from the approximation degree, and a low-pass filter can be adaptively set to obtain the filtered image, specifically: for any frame and the image of its adjacent previous frame, discard the image of that frame when the approximation degree is greater than a preset approximation threshold; when the approximation degree is smaller than a preset difference threshold, take the image of that frame as the filtered image; when the approximation degree is not greater than the preset approximation threshold and not smaller than the preset difference threshold, apply the Fourier transform to the image of that frame to obtain a frequency-domain image, calculate the difference between the constant 1 and the approximation degree to obtain a filter matrix coefficient, multiply the filter matrix coefficient by the length and by the width of the frequency-domain image to obtain the filter matrix length and the filter matrix width, set a low-pass filter according to the filter matrix length and width to filter the frequency-domain image and obtain a filtered frequency-domain image, and apply the inverse Fourier transform to the filtered frequency-domain image to obtain the filtered image. It should be noted that the Fourier transform and low-pass filtering belong to the prior art, and their specific steps are not repeated; in the embodiment of the invention, the preset approximation threshold is 0.9 and the preset difference threshold is 0.35, and an implementer can determine these according to the implementation scenario.
When the approximation degree exceeds the preset approximation threshold, the two images of the adjacent frames can be considered similar and redundancy occurs, so the later frame needs to be removed to improve compression efficiency. The purpose of filtering the image is to eliminate high-frequency information, reduce the information content of the image and improve compression efficiency. When the approximation degree is smaller than the preset difference threshold, the difference between the two images of the adjacent frames is large, so the image is not filtered, which avoids eliminating too much information and affecting compression accuracy. When the approximation degree is not greater than the preset approximation threshold and not smaller than the preset difference threshold, the larger the approximation degree, the smaller the filter matrix, the more high-frequency information is eliminated and the lower the quality of the image. Adaptively setting the low-pass filter through the approximation degree eliminates high-frequency information adaptively on the basis of frequency-domain filtering, reduces the information content of similar images and improves the compression efficiency of the video signal, while also preserving the integrity of the information.
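The keep/discard/filter decision and the adaptively sized low-pass filter might be sketched as follows. The thresholds 0.9 and 0.35 follow this embodiment, while the centered rectangular (ideal) pass-band is one plausible reading of "setting a low-pass filter according to the filter matrix length and width", not a detail confirmed by the patent.

```python
import numpy as np
from typing import Optional

def adaptive_low_pass(frame: np.ndarray, approximation: float,
                      keep_threshold: float = 0.9, diff_threshold: float = 0.35) -> Optional[np.ndarray]:
    """Discard, keep, or adaptively low-pass filter a frame based on its approximation degree.

    Returns None when the frame is treated as redundant and discarded.
    """
    if approximation > keep_threshold:
        return None                                  # redundant frame: discard
    if approximation < diff_threshold:
        return frame                                 # large change: keep unfiltered

    spectrum = np.fft.fftshift(np.fft.fft2(frame.astype(np.float64)))
    h, w = frame.shape
    coeff = 1.0 - approximation                      # filter matrix coefficient
    fh, fw = max(1, int(coeff * h)), max(1, int(coeff * w))  # filter matrix length / width
    mask = np.zeros((h, w))
    r0, c0 = (h - fh) // 2, (w - fw) // 2
    mask[r0:r0 + fh, c0:c0 + fw] = 1.0               # keep a centered low-frequency window
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask))
    return np.real(filtered)
```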
After the filtered image of the monitoring video data stream is obtained, the compressed data stream can be obtained through image compression of the filtered image and construction management monitoring can be carried out, specifically: compress the filtered image through run-length encoding to obtain a compressed data stream, upload the compressed data stream to a storage device, and perform construction management monitoring according to the compressed data stream. It should be noted that the run-length encoding compression algorithm belongs to the prior art, and its specific steps are not repeated. The compression efficiency of run-length encoding depends on the amount of high-frequency information in the image: the less high-frequency information, the higher the compression efficiency. BIM reads and analyzes the compressed data stream in the storage device, which improves the accuracy and efficiency of construction site management and monitoring.
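A minimal run-length encoder over the filtered grayscale image, as used for the compressed data stream, could look like this; the (value, run-length) pair representation and the row-major scan order are assumptions for illustration.

```python
import numpy as np

def run_length_encode(image: np.ndarray) -> list[tuple[int, int]]:
    """Row-major run-length encoding of a (non-empty) grayscale image as (value, run) pairs."""
    flat = np.rint(image).astype(np.int32).ravel()
    runs = []
    value, count = int(flat[0]), 1
    for pixel in flat[1:]:
        if int(pixel) == value:
            count += 1
        else:
            runs.append((value, count))
            value, count = int(pixel), 1
    runs.append((value, count))
    return runs
```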
In summary, the embodiment of the invention provides an intelligent management system for building construction. Firstly, the image of each frame is divided into pixel blocks; an inter-frame difference map is obtained from the gray differences of the corresponding pixel blocks of adjacent frames, and the difference deviation trend of the corresponding pixel blocks is obtained from the gray features of the inter-frame difference map. The correlation coefficient of adjacent frames is obtained from the gray difference features within the adjacent-frame pixel blocks, and the homology degree of the pixel blocks and the approximation degree of the images are obtained from the correlation coefficient of adjacent frames, the difference deviation trend and the gray features of the pixel blocks. Finally, the invention determines whether to retain or discard each image according to the approximation degree and adaptively sets a low-pass filter to obtain the filtered image, which is then compressed, so that compression efficiency and information integrity are improved, and the efficiency and accuracy of construction management monitoring are further improved.
It should be noted that: the sequence of the embodiments of the present invention is only for description, and does not represent the advantages and disadvantages of the embodiments. The processes depicted in the accompanying drawings do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments.
Claims (5)
1. An intelligent management system for building construction is characterized by comprising the following modules:
and a data acquisition module: the method comprises the steps of acquiring an image of each frame in a building construction monitoring video; dividing the image of each frame into at least two blocks of pixels;
the difference analysis module: the method for obtaining an inter-frame difference map according to the gray scale difference of the corresponding pixel point in the pixel block of the adjacent frame, and obtaining the difference deviation trend of the pixel block of the corresponding adjacent frame according to the gray scale size characteristic of the inter-frame difference map comprises the following steps: for any pixel point in the inter-frame difference graph, calculating the difference value between the gray value of the pixel point and 128 to obtain a gray change characterization value of the pixel point, and calculating the sum of the standard deviation of the gray value of the inter-frame difference graph and a preset minimum positive number to serve as the standard deviation characterization value; calculating the third power of the ratio of the gray level change characterization value to the standard deviation characterization value to obtain a difference degree characterization value of the pixel point; calculating and normalizing the sum of the difference degree representation values of all pixel points to obtain the difference deviation trend;
obtaining the correlation coefficient of the adjacent frame of the pixel block according to the gray difference characteristic in the pixel block corresponding to the adjacent frame, comprising:
calculating the difference value between the gray value of any one pixel point in the pixel block of any one frame and the gray average value of the corresponding pixel block in the pixel block of any one frame in the pixel block corresponding to the adjacent previous frame to obtain the gray deviation characterization value of the current frame; calculating the difference value between the gray value of the pixel point at the corresponding position in the pixel block of the adjacent previous frame and the gray average value of the corresponding pixel block to obtain the gray deviation characterization value of the adjacent frame; calculating covariance between the gray scale deviation characterization value of the current frame and the gray scale deviation characterization value of the adjacent frame; calculating the product of the standard deviation of the gray value of the pixel block of any frame and the standard deviation of the gray value of the pixel block of the adjacent previous frame to obtain a standard deviation product representation value;
calculating the ratio of the covariance of the gray scale deviation characterization value of the current frame to the gray scale deviation characterization value of the adjacent frame to the standard deviation product characterization value to obtain the correlation coefficient of the adjacent frame;
obtaining the homology degree of the pixel blocks corresponding to the adjacent frames according to the adjacent frame correlation coefficient, the difference deviation trend and the gray scale characteristics of the pixel blocks corresponding to the adjacent frames, wherein the method comprises the following steps:
calculating the difference absolute value of the gray average value of the pixel block corresponding to any frame and the adjacent previous frame to obtain an adjacent gray difference representation value; calculating the product of the adjacent gray level difference characterization value and the difference deviation trend absolute value and carrying out negative correlation mapping to obtain an overall difference characterization value; obtaining the correlation coefficient characterization value as (P+1)/2, wherein P is the correlation coefficient of the adjacent frames; calculating the product of the correlation coefficient characterization value and the overall difference characterization value to obtain the homology degree of the pixel blocks of adjacent frames;
the construction management module is used for obtaining the approximation degree of the image corresponding to the adjacent frame according to the homology degree; determining whether to retain or discard the image according to the approximation degree and adaptively setting a low-pass filter to obtain a filtered image; and obtaining a compressed data stream through image compression according to the filtered image and performing construction management monitoring.
2. The intelligent management system for building construction according to claim 1, wherein the step of obtaining an inter-frame difference map according to gray differences of corresponding pixels in the pixel blocks of adjacent frames comprises:
and calculating the sum of the gray value difference value and 256 of the pixel points at the same position in the pixel block corresponding to any frame and the adjacent previous frame to obtain an adjacent gray difference value, calculating the ratio of the adjacent gray difference value to a constant 2 to obtain the gray value of the pixel point at the corresponding position in the inter-frame difference map, and obtaining the gray value of each pixel point in the inter-frame difference map to obtain the inter-frame difference map.
3. The intelligent management system for construction according to claim 1, wherein the step of obtaining the approximation degree of the image corresponding to the adjacent frame according to the homology degree comprises:
and taking the image of any frame and the image of the adjacent previous frame as the image of the adjacent frame, and calculating the average value of the homology degree of the pixel blocks of the adjacent frame to obtain the approximation degree.
4. The intelligent management system for construction according to claim 1, wherein the steps of determining the choice of the image according to the approximation degree and adaptively setting a low-pass filter to obtain a filtered image comprise:
for the images of any frame and the adjacent previous frame, discarding the image of any frame when the approximation degree is greater than a preset approximation threshold;
when the approximation degree is smaller than a preset difference threshold value, taking the image of any frame as the filtering image;
when the approximation degree is not greater than a preset approximation threshold value and not less than a preset difference threshold value, carrying out Fourier transform on the image of any frame to obtain a frequency domain image; calculating the difference between a constant 1 and the approximation degree to obtain a filter matrix coefficient, respectively calculating the products of the filter matrix coefficient with the length and the width of the frequency domain image to obtain a filter matrix length and a filter matrix width, setting a low-pass filter according to the filter matrix length and the filter matrix width to filter the frequency domain image and obtain a filtered frequency domain image, and performing inverse Fourier transform on the filtered frequency domain image to obtain the filtered image.
5. The intelligent management system for construction according to claim 1, wherein the step of obtaining a compressed data stream by image compression based on the filtered image and performing construction management monitoring comprises:
and compressing the filtered image through run-length coding to obtain a compressed data stream, uploading the compressed data stream to a storage device, and performing construction management monitoring according to the compressed data stream.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311824815.7A CN117478891B (en) | 2023-12-28 | 2023-12-28 | Intelligent management system for building construction |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117478891A (en) | 2024-01-30 |
CN117478891B (en) | 2024-03-15 |
Family
ID=89635158
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311824815.7A Active CN117478891B (en) | 2023-12-28 | 2023-12-28 | Intelligent management system for building construction |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117478891B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117714691B (en) * | 2024-02-05 | 2024-04-12 | 佳木斯大学 | AR augmented reality piano teaching is with self-adaptation transmission system |
CN118158428B (en) * | 2024-05-11 | 2024-07-05 | 国网黑龙江省电力有限公司伊春供电公司 | Physical information system data compression method based on bitmap compression |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102648198B1 * | 2019-01-14 | 2024-03-19 | Samsung Display Co., Ltd. | Afterimage compensator and display device having the same |
- 2023-12-28: CN application CN202311824815.7A filed (patent CN117478891B, status Active)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012103742A (en) * | 2010-11-05 | 2012-05-31 | Nippon Telegr & Teleph Corp <Ntt> | Image processing apparatus, image processing method, and recording medium with recorded image processing program |
JP2015226326A (en) * | 2014-05-26 | 2015-12-14 | 富士通株式会社 | Video analysis method and video analysis device |
WO2021208275A1 (en) * | 2020-04-12 | 2021-10-21 | 南京理工大学 | Traffic video background modelling method and system |
WO2022252222A1 (en) * | 2021-06-04 | 2022-12-08 | 深圳市大疆创新科技有限公司 | Encoding method and encoding device |
CN114782419A (en) * | 2022-06-17 | 2022-07-22 | 山东水利建设集团有限公司 | Water conservancy construction gradient detection method |
CN115297289A (en) * | 2022-10-08 | 2022-11-04 | 南通第二世界网络科技有限公司 | Efficient storage method for monitoring video |
WO2023134791A2 (en) * | 2022-12-16 | 2023-07-20 | 苏州迈创信息技术有限公司 | Environmental security engineering monitoring data management method and system |
CN115914649A (en) * | 2023-03-01 | 2023-04-04 | 广州高通影像技术有限公司 | Data transmission method and system for medical video |
CN116309565A (en) * | 2023-05-17 | 2023-06-23 | 山东晨光胶带有限公司 | High-strength conveyor belt deviation detection method based on computer vision |
CN117115713A (en) * | 2023-08-31 | 2023-11-24 | 广州商研网络科技有限公司 | Dynamic image generation method, device, equipment and medium thereof |
Non-Patent Citations (2)
Title |
---|
Research on detection technology for foreign object intrusion on high-speed railways based on MATLAB image processing; Li Tuo; Technology and Market; 2018-06-15 (Issue 06); full text *
Research on a core image compression model based on deep learning; Chang Zipeng; Song Wenguang; Gu Gong; Computer Knowledge and Technology; 2018-08-25 (Issue 24); full text *
Also Published As
Publication number | Publication date |
---|---|
CN117478891A (en) | 2024-01-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||