CN113840160B - Event data transmission method, system, electronic device and readable storage medium - Google Patents
- Publication number
- CN113840160B CN113840160B CN202111151533.6A CN202111151533A CN113840160B CN 113840160 B CN113840160 B CN 113840160B CN 202111151533 A CN202111151533 A CN 202111151533A CN 113840160 B CN113840160 B CN 113840160B
- Authority
- CN
- China
- Prior art keywords
- key frame
- event
- analysis result
- real
- video stream
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/23418—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/80—Responding to QoS
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
- H04N21/2385—Channel allocation; Bandwidth allocation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/64—Addressing
Abstract
The application relates to the technical field of security and protection systems and discloses an event data transmission method, system, electronic device, and readable storage medium. The real-time video stream is used to carry first event acquisition information containing the identification result and the data storage address; compared with transmitting event data containing the event image and the image analysis data over a separate event stream, this reduces the amount of data transmitted and the transmission bandwidth, thereby saving network bandwidth and avoiding network congestion.
Description
Technical Field
The present application relates to the field of data storage technologies, and in particular, to a method, a system, an electronic device, and a readable storage medium for transmitting event data.
Background
At present, video devices serve as front-end equipment that collects security data through video, snapshots, audio recording, and other means; security deployment and management are then carried out based on the collected data, making video devices an important component of a security system. Handling the intelligent events triggered by front-end devices is a core function of security management. Front-end devices and the back-end platform are typically connected using an open protocol such as ONVIF (Open Network Video Interface Forum) or GB28181, or a vendor-specific proprietary protocol. The front-end device obtains event data containing event images and image information through intelligent analysis functions (such as face detection, vehicle snapshot, and motion detection), and the back-end platform implements the related security services by receiving the event data sent by the front-end device.
The front-end device typically transmits event data to the back-end platform via a separate event stream while transmitting real-time video via an additional real-time video stream. If the front-end device detects many events or the event data is large, this can occupy considerable bandwidth and even cause network congestion.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, and is intended to neither identify key/critical elements nor delineate the scope of such embodiments, but is intended as a prelude to the more detailed description that follows.
In view of the above-mentioned shortcomings of the prior art, the present application discloses an event data transmission method, system, electronic device and readable storage medium, so as to reduce the bandwidth consumption for transmitting event data.
The application discloses an event data transmission method, which includes the following steps: the front-end device acquires a key frame in the real-time video stream and identifies the key frame to be analyzed to obtain an identification result; encodes the key frame according to the data storage address of the key frame analysis result and the identification result to generate first event acquisition information; inserts the first event acquisition information into the real-time video stream and transmits it to a back-end platform; and the back-end platform obtains the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted.
Optionally, the back-end platform obtains the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted, including: the back-end platform acquires the first event acquisition information in the real-time video stream; decodes the first event acquisition information to obtain the key frame and the data storage address; acquires the key frame analysis result from the data storage address; and establishes a correspondence between the key frame and the key frame analysis result to obtain event data.
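For illustration, the back-end extraction flow above can be sketched in Python. All data structures here (the message dicts, the address-to-result lookup) are hypothetical stand-ins for the real stream parsing and network fetch, not part of the disclosed protocol:

```python
def extract_event_data(stream_messages, result_store):
    """Sketch of the back-end flow: find first event acquisition information
    in the stream, decode it, fetch the analysis result from the data
    storage address, and associate it with the key frame."""
    events = []
    for msg in stream_messages:
        if msg.get("type") != "first_event_acquisition":
            continue  # ordinary video frames pass through untouched
        # Decode the event acquisition information: key frame + storage address
        key_frame = msg["key_frame"]
        address = msg["data_storage_address"]
        # Fetch the key frame analysis result from the data storage address
        analysis = result_store.get(address)
        # Establish the key frame <-> analysis result correspondence
        events.append({"key_frame": key_frame, "analysis": analysis})
    return events

stream = [
    {"type": "video_frame"},
    {"type": "first_event_acquisition",
     "key_frame": "frame-7",
     "data_storage_address": "/alarm_data/7"},
]
result_store = {"/alarm_data/7": {"event_type": "face", "age": 30}}
events = extract_event_data(stream, result_store)
```

Note that ordinary frames are skipped, so a receiver that does not understand the event messages still plays the video unchanged.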
Optionally, encoding the key frame according to the data storage address of the key frame analysis result and the identification result, and before generating the first event acquisition information, further including: the front-end equipment numbers each key frame in the real-time video stream to obtain a code corresponding to each key frame; if the key frame to be analyzed is transmitted to the back-end platform through the real-time video stream, taking a code corresponding to the key frame to be analyzed as a key frame ID; encoding any key frame in the real-time video stream according to the data storage address of the key frame analysis result, the identification result and the key frame ID to generate second event acquisition information; and inserting the second event acquisition information into the real-time video stream and transmitting the second event acquisition information to a back-end platform.
Optionally, the event data transmission method further includes: the back-end platform acquires second event acquisition information in the real-time video stream; decoding the second event acquisition information to obtain the key frame ID and the data storage address; matching in the received real-time video stream according to the key frame ID, obtaining a key frame corresponding to the key frame ID, and obtaining the key frame analysis result from the data storage address; and establishing a corresponding relation between the key frame and the key frame analysis result to obtain event data.
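The second-event-acquisition flow above (decode the key frame ID and storage address, match the ID against frames already received, fetch the analysis result) can be sketched as follows. The dict-based frame buffer and result store are illustrative stand-ins:

```python
def assemble_event_from_second_info(second_info, received_frames, result_store):
    """Decode second event acquisition information, match the key frame ID
    against key frames already received in the real-time stream, fetch the
    analysis result, and associate the two into event data."""
    frame_id = second_info["key_frame_id"]
    address = second_info["data_storage_address"]
    key_frame = received_frames.get(frame_id)  # match by key frame ID
    analysis = result_store.get(address)       # fetch from storage address
    if key_frame is None or analysis is None:
        return None  # frame not yet received, or result not yet stored
    return {"key_frame": key_frame, "analysis": analysis}

frames = {3: "I-frame-3"}
store = {"/alarm_data/3": {"event_type": "vehicle"}}
info = {"key_frame_id": 3, "data_storage_address": "/alarm_data/3"}
event = assemble_event_from_second_info(info, frames, store)
```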
Optionally, acquiring the key frame analysis result from the data storage address includes: if acquiring the key frame analysis result fails, acquiring it again from the data storage address after a preset period; if acquiring the key frame analysis result succeeds, establishing a correspondence between the key frame and the key frame analysis result to obtain event data.
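A minimal retry sketch of this behaviour, assuming a caller-supplied `fetch` function that returns the result or `None` on failure (the delay and try count are illustrative, not from the disclosure):

```python
import time

def fetch_with_retry(fetch, address, retry_delay=1.0, max_tries=3):
    """If fetching the key frame analysis result fails (e.g. analysis has
    not yet been stored at the address), wait a preset period and retry."""
    for attempt in range(max_tries):
        result = fetch(address)
        if result is not None:
            return result
        if attempt < max_tries - 1:
            time.sleep(retry_delay)  # wait the preset period, then retry
    return None
```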
Optionally, the key frame analysis result is obtained by: and carrying out image analysis on the key frames to be analyzed to obtain key frame analysis results corresponding to the key frames.
Optionally, performing image analysis on the key frame to be analyzed to obtain a key frame analysis result corresponding to the key frame includes: if the image analysis fails, taking preset data as the key frame analysis result corresponding to the key frame; and if the image analysis succeeds, obtaining the key frame analysis result corresponding to the key frame.
The application discloses an event data transmission system, including: a front-end device configured to acquire a key frame in the real-time video stream, identify the key frame to be analyzed to obtain an identification result, encode the key frame according to a data storage address of the key frame analysis result and the identification result to generate first event acquisition information, and insert the first event acquisition information into the real-time video stream for transmission to the back-end platform; and a back-end platform connected with the front-end device and configured to obtain the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted.
The application discloses an electronic device, comprising: a processor and a memory; the memory is used for storing a computer program, and the processor is used for executing the computer program stored in the memory so as to enable the electronic equipment to execute the method.
The present application discloses a computer-readable storage medium having stored thereon a computer program: the computer program, when executed by a processor, implements the above-described method.
The beneficial effects of the application are as follows: the front-end device acquires a key frame in the real-time video stream, identifies the key frame to be analyzed to obtain an identification result, encodes the key frame according to the data storage address of the key frame analysis result and the identification result to generate first event acquisition information, and inserts the first event acquisition information into the real-time video stream for transmission to the back-end platform; the back-end platform obtains the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted. In this way, the real-time video stream carries the first event acquisition information containing the identification result and the data storage address; compared with transmitting event data containing the event image and the image analysis data over a separate event stream, this reduces the amount of data transmitted and the transmission bandwidth, thereby saving network bandwidth and avoiding network congestion.
Drawings
FIG. 1 is a flow chart of a method for transmitting event data according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a first event acquisition information/second event acquisition information structure according to an embodiment of the present application;
FIG. 3 is a timing diagram of an event data transmission method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an event data transmission system according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application are described below through specific examples, and those skilled in the art can easily understand other advantages and effects of the present application from the disclosure in this specification. The application may also be implemented or applied through other different embodiments, and the details in this specification may be modified or changed based on different viewpoints and applications without departing from the spirit and scope of the present application. It should be noted that the following embodiments and the features in the embodiments may be combined with each other without conflict.
It should be noted that the illustrations provided in the following embodiments only schematically illustrate the basic concept of the present application. The drawings show only the components related to the present application rather than the number, shape, and size of the components in an actual implementation; in practice, the form, quantity, and proportion of the components may change arbitrarily, and the component layout may also be more complicated.
In the following description, numerous details are set forth to provide a more thorough explanation of the embodiments of the present application. However, it will be apparent to those skilled in the art that the embodiments of the present application may be practiced without these specific details. In other embodiments, well-known structures and devices are shown in block diagram form rather than in detail, to avoid obscuring the embodiments of the present application.
The application discloses an event data transmission method, which is shown in fig. 1, and comprises the following steps:
step S101, front-end equipment acquires key frames in a real-time video stream, and identifies the key frames to be analyzed to obtain an identification result;
step S102, encoding a key frame according to a data storage address of a key frame analysis result and an identification result to generate first event acquisition information;
step S103, inserting the first event acquisition information into a real-time video stream and transmitting the first event acquisition information to a back-end platform;
step S104, the back-end platform acquires a key frame analysis result by extracting information from the real-time video stream inserted with the first event acquisition information.
By adopting the event data transmission method disclosed in the application, the front-end device acquires a key frame in the real-time video stream, identifies the key frame to be analyzed to obtain an identification result, encodes the key frame according to the data storage address of the key frame analysis result and the identification result to generate first event acquisition information, and inserts the first event acquisition information into the real-time video stream for transmission to the back-end platform; the back-end platform obtains the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted. In this way, the real-time video stream carries the first event acquisition information containing the identification result and the data storage address; compared with transmitting event data containing the event image and the image analysis data over a separate event stream, this reduces the amount of data transmitted and the transmission bandwidth, thereby saving network bandwidth and avoiding network congestion.
Optionally, there are one or more front-end devices. Optionally, each front-end device is a monitoring device such as an IPC (IP camera) or an NVR (Network Video Recorder).
Optionally, there are one or more back-end platforms. Optionally, each back-end platform is a server, an NVR, a dedicated security platform, or a similar device.
Optionally, the front-end device acquires a key frame in the real-time video stream, including: detecting a preset event on the real-time video stream; if the preset event is detected, acquiring a key frame corresponding to the preset event.
Optionally, the preset event includes one or more of face recognition, vehicle recognition, animal recognition, activity behavior detection, and the like.
Optionally, the key frame to be analyzed is determined from the key frames through preset determination rules, where the preset determination rules include intelligent analysis rules, arming time periods, and the like.
Optionally, the back-end platform obtains the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted, including: the back-end platform acquires the first event acquisition information in the real-time video stream; decodes the first event acquisition information to obtain the key frame and the data storage address; acquires the key frame analysis result from the data storage address; and establishes a correspondence between the key frame and the key frame analysis result to obtain event data. In this way, the back-end platform obtains the key frame and the key frame analysis result through the first event acquisition information in the real-time video stream and associates them to obtain event data; compared with receiving event data containing event images and image analysis data over a separate event stream, this reduces the amount of data transmitted and the transmission bandwidth, thereby saving network bandwidth and avoiding network congestion.
Optionally, establishing a correspondence between the key frames and the key frame analysis result, and after obtaining the event data, further including: the event data is stored and/or presented.
Optionally, the backend platform determines whether to acquire the key frame analysis result from the data storage address according to a preset acquisition rule. Therefore, key frame analysis results are obtained according to actual needs, and the flexibility of the method is improved.
Optionally, encoding the key frame according to the data storage address of the key frame analysis result and the identification result, and before generating the first event acquisition information, further including: the front-end equipment numbers each key frame in the real-time video stream to obtain a code corresponding to each key frame; if the key frame to be analyzed is transmitted to the back-end platform through the real-time video stream, taking the code corresponding to the key frame to be analyzed as a key frame ID; encoding any key frame in the real-time video stream according to the data storage address of the key frame analysis result, the identification result and the key frame ID to generate second event acquisition information; and inserting the second event acquisition information into the real-time video stream and transmitting the second event acquisition information to the back-end platform. Therefore, each key frame in the real-time video stream is numbered, and the key frames are marked through the numbers, so that the condition that the key frames are transmitted to a back-end platform and cannot be encoded is avoided.
As shown in connection with FIG. 2, the first event acquisition information/second event acquisition information includes an image frame and an SEI (Supplemental Enhancement Information) unit. The SEI unit includes SEI header data and SEI payload data. The SEI header data includes a forbidden_zero_bit (forbidden bit), a nal_ref_idc (NAL (Network Abstraction Layer) reference level), and a nal_unit_type (NAL unit type), where the value of the NAL unit type is 6; the SEI payload data carries some of the data storage address, the identification result, and the key frame ID.
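The structure above can be sketched by building an H.264-style SEI NAL unit in Python. This is a simplified illustration (user-data-unregistered payload, payloadType 5, with a zeroed UUID); emulation-prevention byte insertion and the exact payload layout used by the disclosure are omitted, and the payload content is hypothetical:

```python
def build_sei_nal(payload: bytes, nal_ref_idc: int = 0) -> bytes:
    """Build a minimal H.264 SEI NAL unit carrying `payload` as
    user-data-unregistered SEI. Emulation prevention (00 00 03) omitted."""
    # NAL header byte: forbidden_zero_bit(1) | nal_ref_idc(2) | nal_unit_type(5)
    header = bytes([((nal_ref_idc & 0x3) << 5) | 6])  # nal_unit_type = 6 (SEI)
    uuid = b"\x00" * 16                 # 16-byte UUID identifying the payload
    body = uuid + payload
    size = len(body)
    # payloadSize uses 0xFF continuation bytes, each adding 255 to the size
    size_bytes = b"\xff" * (size // 255) + bytes([size % 255])
    # payloadType 5 = user data unregistered; 0x80 = rbsp trailing bits
    return header + bytes([5]) + size_bytes + body + b"\x80"

sei = build_sei_nal(b'{"addr":"/alarm_data/7"}')
print(sei[0] & 0x1F)  # low 5 bits of the NAL header: 6 = SEI
```

A receiver that does not recognize the payload simply skips the SEI unit, which is what makes this channel backward compatible.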
The SEI unit is a type of NAL unit (NALU) in the H.264/H.265 video compression formats and is generally used to carry user-defined description information; after receiving an SEI unit, the receiving end may choose to parse it or ignore it. Sending the identification result and the data storage address in an SEI unit reduces the amount of data transmitted compared with transmitting event data containing event images and image analysis data, thereby saving network bandwidth and avoiding network congestion. In addition, since the intelligent event functions supported by protocols such as ONVIF and GB28181 are limited, transmitting the first event acquisition information or the second event acquisition information through an SEI unit to obtain the key frame analysis result extends these protocols and enriches the intelligent event functions of a security system. For a private SDK (Software Development Kit), the first or second event acquisition information can likewise be transmitted by inserting an SEI-like information unit to obtain the key frame analysis result, so the approach has a wide range of application.
Optionally, the first event acquisition information and/or the second event acquisition information is in a key-value data format, where the key is the data storage address or the identification result.
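A sketch of such a key-value payload, serialized for carriage in the SEI unit. The field names and the address are illustrative assumptions, not the disclosure's actual wire format:

```python
import json

# Key-value form: the data storage address serves as the key, and the
# value carries the identification result (names are hypothetical).
first_event_info = {
    "http://storage.example.invalid/alarm_data/7": {  # key: storage address
        "identification_result": "frame-7",
    }
}
payload = json.dumps(first_event_info).encode("utf-8")  # bytes for the SEI unit
decoded = json.loads(payload)
address, fields = next(iter(decoded.items()))
```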
Optionally, the event data transmission method further includes: the back-end platform acquires second event acquisition information in the real-time video stream; decoding the second event acquisition information to obtain a key frame ID and a data storage address; matching in the received real-time video stream according to the key frame ID, obtaining a key frame corresponding to the key frame ID, and obtaining a key frame analysis result from a data storage address; and establishing a corresponding relation between the key frames and key frame analysis results to obtain event data. In this way, when the key frame is transmitted to the back-end platform through the real-time video stream, the real-time video stream is matched according to the key frame ID, the key frame is obtained through the key frame number, and the situation that the key frame is transmitted to the back-end platform and cannot be encoded is avoided.
Optionally, acquiring the key frame analysis result from the data storage address includes: if acquiring the key frame analysis result fails, acquiring it again from the data storage address after a preset period; if it succeeds, establishing a correspondence between the key frame and the key frame analysis result to obtain event data. In this way, the method tolerates the case where the key frame analysis result has not yet been stored at the data storage address (for example, because the analysis has not finished).
Optionally, the key frame analysis result is obtained by: and carrying out image analysis on the key frames to be analyzed to obtain key frame analysis results corresponding to the key frames.
Optionally, image analysis is performed on the key frames to be analyzed through the front-end equipment, so that key frame analysis results corresponding to the key frames are obtained.
Optionally, the key frame analysis result includes one or more of event type, occurrence time, face frame coordinates, gender, age, jacket color, vehicle coordinates, vehicle information, and the like. Optionally, the key frame analysis result is in a data format such as JSON (JavaScript Object Notation) or XML (Extensible Markup Language).
Optionally, performing image analysis on the key frame to be analyzed to obtain a key frame analysis result corresponding to the key frame includes: if the image analysis fails, taking preset data as the key frame analysis result corresponding to the key frame; and if the image analysis succeeds, obtaining the key frame analysis result corresponding to the key frame. In this way, when image analysis fails, the preset data serves as the key frame analysis result, so the back-end platform is not left waiting indefinitely for a result it can never obtain.
In some embodiments, the preset data is preset empty set data.
In some embodiments, the content of the preset data is "analysis result is null".
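The fallback behaviour of these embodiments can be sketched as follows; `analyzer` is a caller-supplied function that raises on failure, and the preset-data value is an illustrative stand-in for the empty-set data mentioned above:

```python
PRESET_DATA = {"analysis_result": None}  # stand-in for the preset empty-set data

def analyze_key_frame(frame, analyzer):
    """Run image analysis on a key frame; on failure, substitute the preset
    data so a key frame analysis result is always produced and stored."""
    try:
        return analyzer(frame)
    except Exception:
        return PRESET_DATA  # "analysis result is null" fallback
```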
Referring to fig. 3, the application discloses an event data transmission method, which comprises the following steps:
step S301, a front-end device acquires a key frame in a real-time video stream;
step S302, the front-end equipment identifies the key frames to be analyzed to obtain an identification result;
step S303, the front-end equipment performs image analysis on the key frames to be analyzed to obtain key frame analysis results;
step S304, the front-end equipment stores the key frame analysis result to obtain a data storage address;
wherein the data storage address includes an identification result;
step S305, the front-end equipment encodes the key frame according to the data storage address and the identification result to generate first event acquisition information;
step S306, the front-end equipment inserts the first event acquisition information into the real-time video stream;
step S307, the front-end equipment transmits the real-time video stream to the back-end platform;
step S308, a back-end platform acquires first event acquisition information in a real-time video stream;
step S309, the back-end platform decodes the first event acquisition information to obtain a key frame and a data storage address;
step S310, the back-end platform acquires a key frame analysis result from the data storage address;
in step S311, the back-end platform establishes a correspondence between the key frames and the key frame analysis results, and obtains event data.
By adopting the event data transmission method disclosed in the application, the front-end device acquires a key frame in the real-time video stream, identifies the key frame to be analyzed to obtain an identification result, encodes the key frame according to the data storage address of the key frame analysis result and the identification result to generate first event acquisition information, and inserts the first event acquisition information into the real-time video stream for transmission to the back-end platform; the back-end platform obtains the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted. In this way, the real-time video stream carries the first event acquisition information containing the identification result and the data storage address; compared with transmitting event data containing the event image and the image analysis data over a separate event stream, this reduces the amount of data transmitted and the transmission bandwidth, thereby saving network bandwidth and avoiding network congestion.
In some embodiments, the front-end device obtains a key frame in the real-time video stream; identifies the key frame to be analyzed to obtain an identification result frame_id; performs image analysis on the key frame to be analyzed to obtain a key frame analysis result frame_data, where frame_data includes an event type, an event timestamp, face frame coordinates, gender, age, coat color, and the like; stores the key frame analysis result frame_data to obtain a data storage address http://XX.XXX.12:8880/alarm_data/frame_id; encodes the key frame according to the data storage address and the identification result to generate first event acquisition information; and inserts the first event acquisition information into the real-time video stream for transmission to the back-end platform. The back-end platform acquires the first event acquisition information in the real-time video stream and decodes it to obtain the key frame and the data storage address http://XX.XXX.12:8880/alarm_data/frame_id; acquires the key frame analysis result from the data storage address; and establishes a correspondence between the key frame and the key frame analysis result to obtain event data.
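This embodiment's store-then-fetch round trip can be sketched end to end. The base URL is a hypothetical stand-in for the partially elided address in the text, and a plain dict stands in for the HTTP storage service:

```python
# Front end stores frame_data under an address derived from frame_id;
# the back end later fetches it from that address.
BASE_URL = "http://storage.example.invalid:8880/alarm_data/"  # illustrative only

_storage = {}  # stand-in for the HTTP-reachable storage service

def store_analysis(frame_id, frame_data):
    """Front end: store the key frame analysis result, return its address."""
    address = BASE_URL + str(frame_id)
    _storage[address] = frame_data
    return address

def fetch_analysis(address):
    """Back end: fetch the key frame analysis result from the address."""
    return _storage.get(address)

addr = store_analysis("frame-42", {"event_type": "face", "age": 30})
result = fetch_analysis(addr)
```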
As shown in fig. 4, the present application discloses an event data transmission system comprising a plurality of front-end devices 401 and a plurality of back-end platforms 402. The front-end device 401 is configured to obtain a key frame in the real-time video stream, identify the key frame to be analyzed to obtain an identification result, encode the key frame according to the data storage address of the key frame analysis result and the identification result to generate first event acquisition information, insert the first event acquisition information into the real-time video stream, and transmit it to the back-end platform. The back-end platform 402 is connected to the front-end device 401 and is configured to obtain the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted.
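On the back-end platform side, claim 4 describes retrying the acquisition of the key frame analysis result from the data storage address after a preset time period when the first attempt fails. A minimal sketch of that retry loop follows, with the fetch operation injected as a callable so it can stand in for an HTTP request against the data storage address; the function name, retry interval, and attempt limit are assumptions, not values fixed by the description.

```python
import time

def fetch_analysis_result(fetch, retry_after_s: float = 2.0, max_attempts: int = 3):
    # Back end: try to obtain the key frame analysis result; on failure,
    # wait a preset time period and try again (the behaviour of claim 4).
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()  # e.g. an HTTP GET against the data storage address
        except OSError:
            if attempt == max_attempts:
                raise
            time.sleep(retry_after_s)

# Usage: a stand-in fetch that fails until the analysis result is stored.
attempts = []
def flaky_fetch():
    attempts.append(1)
    if len(attempts) < 3:
        raise OSError("analysis result not stored yet")
    return b"frame_data"
```

Injecting the fetch callable keeps the retry policy independent of the transport, which also makes the preset-time-period behaviour easy to exercise without a network.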
With the event data transmission system disclosed in this application, the front-end device acquires the key frames in the real-time video stream, identifies the key frame to be analyzed to obtain an identification result, encodes the key frame according to the data storage address of the key frame analysis result and the identification result to generate first event acquisition information, inserts the first event acquisition information into the real-time video stream, and transmits it to the back-end platform; the back-end platform obtains the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted. Because the real-time video stream carries the first event acquisition information, which contains only the identification result and the data storage address, the amount of transmitted data and the required transmission bandwidth are reduced compared with transmitting event data containing the event image and the image analysis data over a separate event transmission stream, thereby saving network bandwidth and avoiding network congestion.
The above embodiments merely illustrate the principles and effects of the present application and are not intended to limit it. Those skilled in the art may modify or vary the above embodiments without departing from the spirit and scope of the application. Accordingly, all equivalent modifications and variations made by those skilled in the art without departing from the spirit and technical ideas disclosed herein shall be covered by the appended claims.
As shown in fig. 5, this embodiment discloses an electronic device, including a processor (processor) 500 and a memory (memory) 501. The memory is used to store a computer program, and the processor is used to execute the computer program stored in the memory, so that the electronic device performs any of the methods in the above embodiments. Optionally, the electronic device may further include a communication interface (Communication Interface) 502 and a bus 503. The processor 500, the communication interface 502, and the memory 501 may communicate with one another via the bus 503. The communication interface 502 may be used for information transfer. The processor 500 may call logic instructions in the memory 501 to perform the methods of the embodiments described above.
Further, the logic instructions in the memory 501 may be implemented in the form of software functional units and, when sold or used as a stand-alone product, may be stored in a computer-readable storage medium.
The memory 501 is a computer readable storage medium that may be used to store a software program, a computer executable program, and program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 500 performs functional applications as well as data processing, i.e. implements the methods of the embodiments described above, by running program instructions/modules stored in the memory 501.
Memory 501 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for functionality; the storage data area may store data created according to the use of the terminal device, etc. Further, the memory 501 may include a high-speed random access memory, and may also include a nonvolatile memory.
With the electronic device disclosed in this application, the front-end device acquires the key frames in the real-time video stream, identifies the key frame to be analyzed to obtain an identification result, encodes the key frame according to the data storage address of the key frame analysis result and the identification result to generate first event acquisition information, inserts the first event acquisition information into the real-time video stream, and transmits it to the back-end platform; the back-end platform obtains the key frame analysis result by extracting information from the real-time video stream into which the first event acquisition information has been inserted. Because the real-time video stream carries the first event acquisition information, which contains only the identification result and the data storage address, the amount of transmitted data and the required transmission bandwidth are reduced compared with transmitting event data containing the event image and the image analysis data over a separate event transmission stream, thereby saving network bandwidth and avoiding network congestion.
The present embodiment also discloses a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements any of the methods of the present embodiments.
As will be appreciated by those of ordinary skill in the art, all or part of the steps for implementing the above method embodiments may be performed by hardware under the control of a computer program. The computer program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The storage medium includes various media capable of storing program code, such as ROM, RAM, and magnetic or optical disks.
The electronic device disclosed in this embodiment includes a processor, a memory, a transceiver, and a communication interface. The memory and the communication interface are connected to the processor and the transceiver and communicate with one another; the memory is used to store a computer program, the communication interface is used for communication, and the processor and the transceiver run the computer program so that the electronic device performs the steps of the above method.
In this embodiment, the memory may include a random access memory (Random Access Memory, abbreviated as RAM), and may further include a non-volatile memory (non-volatile memory), such as at least one magnetic disk memory.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU for short), a network processor (Network Processor, NP for short), etc.; but also digital signal processors (Digital Signal Processing, DSP for short), application specific integrated circuits (Application Specific Integrated Circuit, ASIC for short), field-programmable gate arrays (Field-Programmable Gate Array, FPGA for short) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components.
The above description and the drawings illustrate embodiments of the disclosure sufficiently to enable those skilled in the art to practice them. Other embodiments may involve structural, logical, electrical, process, and other changes; the embodiments represent only possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in, or substituted for, those of others. Moreover, the terminology used in the present application is for the purpose of describing embodiments only and is not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this disclosure encompasses any and all possible combinations of one or more of the associated listed items. Furthermore, the terms "comprises," "comprising," and variations thereof indicate that the recited features, integers, steps, operations, elements, and/or components are present, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, or apparatus comprising that element. In this context, each embodiment may be described with emphasis on its differences from the other embodiments, and for the same or similar parts the various embodiments may be referred to one another. For the methods, products, and the like disclosed in the embodiments, where they correspond to the method sections disclosed in the embodiments, reference may be made to the description of the method sections.
Those of skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. The skilled artisan may use different methods for each particular application to achieve the described functionality, but such implementation should not be considered to be beyond the scope of the embodiments of the present disclosure. It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the embodiments disclosed herein, the disclosed methods and articles of manufacture (including but not limited to devices and apparatuses) may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division of the units may be merely a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interface, device, or unit, and may be electrical, mechanical, or in another form. The units described as separate parts may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across a plurality of network units. Some or all of the units may be selected according to actual needs to implement the present embodiment. In addition, the functional units in the embodiments of the present disclosure may be integrated in one processing unit, may exist alone physically, or two or more units may be integrated in one unit.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than that disclosed in the description, and sometimes no specific order exists between different operations or steps. For example, two consecutive operations or steps may actually be performed substantially in parallel, they may sometimes be performed in reverse order, which may be dependent on the functions involved. Each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Claims (9)
1. A method of event data transmission, comprising:
the front-end equipment acquires a key frame in the real-time video stream, and identifies the key frame to be analyzed to obtain an identification result;
encoding the key frame according to the data storage address of the key frame analysis result and the identification result, and generating first event acquisition information;
inserting the first event acquisition information into the real-time video stream and transmitting the first event acquisition information to a back-end platform;
the back-end platform acquires the key frame analysis result by extracting information from the real-time video stream inserted with the first event acquisition information;
wherein before encoding the key frame according to the data storage address of the key frame analysis result and the identification result and generating the first event acquisition information, the method further comprises: the front-end device numbering each key frame in the real-time video stream to obtain a code corresponding to each key frame; if the key frame to be analyzed is transmitted to the back-end platform through the real-time video stream, taking the code corresponding to the key frame to be analyzed as a key frame ID; encoding any key frame in the real-time video stream according to the data storage address of the key frame analysis result, the identification result, and the key frame ID to generate second event acquisition information; and inserting the second event acquisition information into the real-time video stream and transmitting the second event acquisition information to the back-end platform.
2. The event data transmission method according to claim 1, wherein the back-end platform obtains the key frame analysis result by extracting information from a real-time video stream into which the first event acquisition information is inserted, comprising:
the method comprises the steps that a back-end platform obtains first event obtaining information in a real-time video stream;
decoding the first event acquisition information to obtain the key frame and the data storage address;
acquiring the key frame analysis result from the data storage address;
and establishing a corresponding relation between the key frame and the key frame analysis result to obtain event data.
3. The event data transmission method according to claim 1, characterized in that the event data transmission method further comprises:
the back-end platform acquires second event acquisition information in the real-time video stream;
decoding the second event acquisition information to obtain the key frame ID and the data storage address;
matching in the received real-time video stream according to the key frame ID, obtaining a key frame corresponding to the key frame ID, and obtaining the key frame analysis result from the data storage address;
and establishing a corresponding relation between the key frame and the key frame analysis result to obtain event data.
4. A method of event data transmission according to claim 2 or 3, wherein retrieving the key frame analysis result from the data storage address comprises:
if acquisition of the key frame analysis result fails, acquiring the key frame analysis result from the data storage address again after a preset time period;
if acquisition of the key frame analysis result succeeds, establishing a correspondence between the key frame and the key frame analysis result to obtain the event data.
5. A method of event data transmission according to any one of claims 1 to 3, wherein the key frame analysis results are obtained by:
and carrying out image analysis on the key frames to be analyzed to obtain key frame analysis results corresponding to the key frames.
6. The event data transmission method according to claim 5, wherein performing image analysis on the key frame to be analyzed to obtain a key frame analysis result corresponding to the key frame, comprises:
if the image analysis result is failure, taking preset data as a key frame analysis result corresponding to the key frame;
and if the image analysis result is successful, obtaining a key frame analysis result corresponding to the key frame.
7. An event data transmission system, comprising:
the front-end equipment is used for acquiring key frames in the real-time video stream, identifying the key frames to be analyzed, obtaining an identification result, encoding the key frames according to a data storage address of the key frame analysis result and the identification result, generating first event acquisition information, inserting the first event acquisition information into the real-time video stream, and transmitting the first event acquisition information to the back-end platform;
the back-end platform is connected with the front-end equipment and is used for acquiring the key frame analysis result by extracting information from the real-time video stream inserted with the first event acquisition information;
wherein before the key frame is encoded according to the data storage address of the key frame analysis result and the identification result and the first event acquisition information is generated, the front-end device is further configured to number each key frame in the real-time video stream to obtain a code corresponding to each key frame; if the key frame to be analyzed is transmitted to the back-end platform through the real-time video stream, take the code corresponding to the key frame to be analyzed as a key frame ID; encode any key frame in the real-time video stream according to the data storage address of the key frame analysis result, the identification result, and the key frame ID to generate second event acquisition information; and insert the second event acquisition information into the real-time video stream and transmit the second event acquisition information to the back-end platform.
8. An electronic device, comprising: a processor and a memory;
the memory is configured to store a computer program, and the processor is configured to execute the computer program stored in the memory, to cause the electronic device to perform the method according to any one of claims 1 to 6.
9. A computer-readable storage medium having stored thereon a computer program, characterized by: the computer program, when executed by a processor, implements the method of any of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111151533.6A CN113840160B (en) | 2021-09-29 | 2021-09-29 | Event data transmission method, system, electronic device and readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111151533.6A CN113840160B (en) | 2021-09-29 | 2021-09-29 | Event data transmission method, system, electronic device and readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113840160A CN113840160A (en) | 2021-12-24 |
CN113840160B true CN113840160B (en) | 2023-09-29 |
Family
ID=78967362
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111151533.6A Active CN113840160B (en) | 2021-09-29 | 2021-09-29 | Event data transmission method, system, electronic device and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113840160B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114900662B (en) * | 2022-05-11 | 2024-08-06 | 重庆紫光华山智安科技有限公司 | Video stream transmission quality information determining method, system, equipment and medium |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102665064A (en) * | 2012-03-01 | 2012-09-12 | 浙江大学 | A traffic video monitoring system based on standard labeling and quick search |
CN103731631A (en) * | 2012-10-16 | 2014-04-16 | 华为软件技术有限公司 | Method, device and system for transmitting video image |
CN104333732A (en) * | 2014-10-15 | 2015-02-04 | 广东中星电子有限公司 | Distributed video analysis method and system thereof |
CN105357495A (en) * | 2015-12-08 | 2016-02-24 | 浙江宇视科技有限公司 | Front-rear end cooperating intelligent analysis method and device |
CN106254782A (en) * | 2016-09-28 | 2016-12-21 | 北京旷视科技有限公司 | Image processing method and device and camera |
CN110139128A (en) * | 2019-03-25 | 2019-08-16 | 北京奇艺世纪科技有限公司 | A kind of information processing method, blocker, electronic equipment and storage medium |
CN110971529A (en) * | 2018-09-28 | 2020-04-07 | 杭州海康威视数字技术股份有限公司 | Data transmission method and device, electronic equipment and storage medium |
CN111428083A (en) * | 2020-03-19 | 2020-07-17 | 平安国际智慧城市科技股份有限公司 | Video monitoring warning method, device, equipment and storage medium |
CN112949547A (en) * | 2021-03-18 | 2021-06-11 | 北京市商汤科技开发有限公司 | Data transmission and display method, device, system, equipment and storage medium |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5026167B2 (en) * | 2007-07-02 | 2012-09-12 | パナソニック株式会社 | Stream transmission server and stream transmission system |
- 2021-09-29 CN CN202111151533.6A patent/CN113840160B/en active Active
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102665064A (en) * | 2012-03-01 | 2012-09-12 | 浙江大学 | A traffic video monitoring system based on standard labeling and quick search |
CN103731631A (en) * | 2012-10-16 | 2014-04-16 | 华为软件技术有限公司 | Method, device and system for transmitting video image |
CN104333732A (en) * | 2014-10-15 | 2015-02-04 | 广东中星电子有限公司 | Distributed video analysis method and system thereof |
CN105357495A (en) * | 2015-12-08 | 2016-02-24 | 浙江宇视科技有限公司 | Front-rear end cooperating intelligent analysis method and device |
CN106254782A (en) * | 2016-09-28 | 2016-12-21 | 北京旷视科技有限公司 | Image processing method and device and camera |
CN110971529A (en) * | 2018-09-28 | 2020-04-07 | 杭州海康威视数字技术股份有限公司 | Data transmission method and device, electronic equipment and storage medium |
CN110139128A (en) * | 2019-03-25 | 2019-08-16 | 北京奇艺世纪科技有限公司 | A kind of information processing method, blocker, electronic equipment and storage medium |
CN111428083A (en) * | 2020-03-19 | 2020-07-17 | 平安国际智慧城市科技股份有限公司 | Video monitoring warning method, device, equipment and storage medium |
CN112949547A (en) * | 2021-03-18 | 2021-06-11 | 北京市商汤科技开发有限公司 | Data transmission and display method, device, system, equipment and storage medium |
Non-Patent Citations (3)
Title |
---|
A Real-Time Video Stream Key Frame Identification Algorithm for QoS; Ruisi Wang et al.; 2010 Second International Conference on Multimedia and Information Technology; full text *
Instructional video compression algorithm based on key frames and a pointer motion model; Meng Chunning et al.; Laser & Optoelectronics Progress (No. 10); full text *
Research on a real-time object detection system in edge computing scenarios; Cheng Wenbo; China Masters' Theses Full-text Database, Information Science & Technology (No. 5); full text *
Also Published As
Publication number | Publication date |
---|---|
CN113840160A (en) | 2021-12-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101703931B1 (en) | Surveillance system | |
JP5572225B2 (en) | Method and apparatus for processing ECM packets | |
CN111787398A (en) | Video compression method, device, equipment and storage device | |
CN111866457B (en) | Monitoring image processing method, electronic device, storage medium and system | |
CN104639951A (en) | Video bitstream frame extraction process and device | |
CN112672381B (en) | Data association method, device, terminal equipment and medium | |
CN113840160B (en) | Event data transmission method, system, electronic device and readable storage medium | |
CN113225538A (en) | Monitoring video playing method and device and service equipment | |
WO2021121264A1 (en) | Snapshot picture transmission method, apparatus and system, and camera and storage device | |
CN111277800A (en) | Monitoring video coding and playing method and device, electronic equipment and storage medium | |
CN110650357B (en) | Video decoding method and device | |
CN111263113B (en) | Data packet sending method and device and data packet processing method and device | |
CN111263097A (en) | Media data transmission method and related equipment | |
CN109698932B (en) | Data transmission method, camera and electronic equipment | |
CN115379158B (en) | Video playing method and device, electronic equipment and computer readable storage medium | |
CN113038261A (en) | Video generation method, device, equipment, system and storage medium | |
CN104639892B (en) | Video recording system and method and apparatus for processing image in main system | |
CN116129316A (en) | Image processing method, device, computer equipment and storage medium | |
CN112866745B (en) | Streaming video data processing method, device, computer equipment and storage medium | |
CN112055174B (en) | Video transmission method and device and computer readable storage medium | |
CN106534137B (en) | Media stream transmission method and device | |
KR102593622B1 (en) | Video management system based on mobility terminal and method for managing video thereof | |
CN110855619A (en) | Processing method and device for playing audio and video data, storage medium and terminal equipment | |
CN109246434B (en) | Video encoding method, video decoding method and electronic equipment | |
CN111353133A (en) | Image processing method, device and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||