CN114025162B - Entropy decoding method, medium, program product, and electronic device - Google Patents
- Publication number
- CN114025162B (application CN202111338420.7A)
- Authority
- CN
- China
- Prior art keywords
- memory
- decoded
- target
- cdf table
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/13—Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/42—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
- H04N19/423—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation characterised by memory arrangements
Abstract
The application relates to an entropy decoding method, a medium, a program product, and an electronic device. The method is applied to an electronic device comprising a processor, a first memory disposed inside the processor, and a second memory disposed outside the processor. The method comprises the following steps: acquiring a first target CDF table required for entropy decoding a video frame to be decoded; acquiring the first target CDF table from the first memory when it is confirmed that the table is stored in the first memory, or from the second memory when it is confirmed that the table is stored in the second memory; and entropy decoding the video frame to be decoded based on the acquired first target CDF table to obtain a decoded video frame, the parameters in the first target CDF table being updated during entropy decoding to obtain a second target CDF table.
Description
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to an entropy decoding method, a medium, a program product, and an electronic device.
Background
With the development of terminal technology, intelligent terminals provide more and more functions, and people's daily lives depend increasingly on them, for example for chatting, watching video, reading, listening to music, and office work. Among these uses, higher definition, higher bit rates, and greater savings in network traffic are several of the most central demands of current online video. Supporting and driving these demands are the video codec and video transmission technologies behind video players.
AV1, a video coding standard promulgated by the Alliance for Open Media (AOM), has the advantage of supporting higher bit rates, wider color spaces, higher frame rates, and so on, and is therefore widely used. In the AV1 standard, the entropy coding stage adopts a multi-symbol coding scheme, which simplifies the entropy coding process and gives better performance. In general, however, when a video player decodes video encoded with the AV1 standard, a large number of cumulative distribution function (CDF) tables are required, and acquiring these CDF tables can introduce significant delay, so the player may be unable to decode the encoded video in time, which degrades the user experience.
Disclosure of Invention
In view of this, embodiments of the present application provide an entropy decoding method, medium, program product, and electronic device. In this technical scheme, at least part of the CDF tables are stored in a memory inside the processor of the electronic device, and the data read/write speed of such an internal memory (for example, an SRAM) is high. The scheme can therefore greatly reduce the delay of reading and writing back the CDF tables involved in entropy decoding the video to be decoded, so that the video is decoded quickly and the user experience is improved.
In a first aspect, an embodiment of the present application provides an entropy decoding method, applied to an electronic device, where the electronic device includes a processor, a first memory disposed inside the processor, and a second memory disposed outside the processor, and the method includes:
acquiring a first target CDF table required by entropy decoding of a video frame to be decoded;
in the event that it is confirmed that the first target CDF table is stored in the first memory, the first target CDF table is retrieved from the first memory,
in the case where it is confirmed that the first target CDF table is stored in the second memory, acquiring the first target CDF table from the second memory;
entropy decoding the video frame to be decoded based on the acquired first target CDF table to obtain a decoded video frame, and
updating parameters in the first target CDF table during the entropy decoding process to obtain a second target CDF table.
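The claimed lookup-then-decode flow can be sketched as follows (a minimal Python sketch with hypothetical names; `decode_with_cdf` is a stand-in for the actual multi-symbol entropy decoder, which the claim does not specify):

```python
def decode_with_cdf(frame, cdf):
    # Stand-in for real multi-symbol arithmetic decoding: returns the
    # "decoded" frame and an updated copy of the CDF table.
    return frame, list(cdf)

def entropy_decode_frame(frame, table_id, first_memory, second_memory):
    """Prefer the first target CDF table cached in the processor-internal
    first memory; fall back to the off-chip second memory on a miss."""
    if table_id in first_memory:
        cdf = first_memory[table_id]     # fast path: on-chip memory
    else:
        cdf = second_memory[table_id]    # slow path: off-chip memory
    # Entropy decoding updates the table's parameters, yielding the
    # second target CDF table.
    return decode_with_cdf(frame, cdf)
```

The two `acquiring` branches of the claim map onto the membership test; everything downstream of the lookup is identical regardless of which memory supplied the table.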
In a possible implementation of the first aspect, after it is determined that the first target CDF table is stored in the second memory, the method further includes:
in the event that the first memory is confirmed to have the capability to store the first target CDF table, storing the first target CDF table in the first memory;
In the event that it is determined that the first memory does not have the capability to store the first target CDF table, the third target CDF table stored in the first memory is replaced with the first target CDF table.
The third target CDF table may be the CDF table stored earliest in the first memory, or any CDF table among those stored within the earliest period of time.
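Under the assumption that the first memory behaves as a small fixed-capacity cache, the replacement rule above amounts to first-in-first-out eviction. A minimal sketch (hypothetical names, using an `OrderedDict` as the cache):

```python
from collections import OrderedDict

def store_in_first_memory(first_memory, table_id, cdf, capacity):
    """Store the first target CDF table in the first memory; if the memory
    cannot hold another table, evict the earliest-stored (third target)
    CDF table. The evicted table is returned so the caller can write it
    back to the second memory."""
    evicted = None
    if table_id not in first_memory and len(first_memory) >= capacity:
        _, evicted = first_memory.popitem(last=False)  # earliest-stored entry
    first_memory[table_id] = cdf
    first_memory.move_to_end(table_id)  # newest entries sit at the end
    return evicted
```

Whether eviction is strictly oldest-first or batched over "a certain period of time" is left open by the text; the sketch picks the simplest oldest-first reading.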
In a possible implementation of the first aspect, the method further includes: the third target CDF table is written to the second memory.
In a possible implementation of the first aspect, the method further includes: a second target CDF table is stored in the first memory.
In a possible implementation of the first aspect, storing the second target CDF table in the first memory includes:
in the case where it is confirmed that the first target CDF table is acquired from the first memory, the first target CDF table stored in the first memory is replaced with the second target CDF table.
In a possible implementation manner of the first aspect, the first memory is an SRAM.
In a possible implementation manner of the first aspect, the second memory is one of RAM and DDR SDRAM.
In a second aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform the entropy decoding method of the first aspect and any of the various possible implementations of the first aspect.
In a third aspect, embodiments of the present application provide a computer program product comprising instructions that, when executed by one or more processors, implement the entropy decoding method of the first aspect and any of its various possible implementations.
In a fourth aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing instructions for execution by one or more processors of the electronic device, and
A processor for performing the entropy decoding method of the first aspect described above and any of the various possible implementations of the first aspect when the instructions are executed by one or more processors.
Drawings
FIG. 1 illustrates a scenario of cross-device video playback, according to some embodiments of the present application;
FIG. 2 illustrates a block diagram of the hardware architecture of the electronic device 100 shown in FIG. 1, in accordance with some embodiments of the present application;
FIG. 3 illustrates a data flow of the electronic device 100 shown in FIG. 1 for obtaining a target CDF table when entropy decoding video to be decoded using the AV1 standard, according to some embodiments of the present application;
FIG. 4 illustrates another data flow of the electronic device 100 shown in FIG. 1 for obtaining a target CDF table when entropy decoding video to be decoded using the AV1 standard, in accordance with some embodiments of the present application;
FIG. 5 illustrates an entropy decoding flow diagram corresponding to FIG. 4, according to some embodiments of the present application;
FIG. 6 illustrates another data flow of the electronic device 100 shown in FIG. 1 to obtain a target CDF table when entropy decoding video to be decoded using the AV1 standard, according to some embodiments of the present application;
FIG. 7 illustrates another entropy decoding flow diagram corresponding to FIG. 6, according to some embodiments of the present application;
FIG. 8 illustrates another data flow for the electronic device 100 shown in FIG. 1 to obtain a target CDF table when entropy decoding video to be decoded using the AV1 standard, in accordance with some embodiments of the present application;
FIG. 9 illustrates another entropy decoding flow diagram corresponding to FIG. 8, according to some embodiments of the present application;
FIG. 10 illustrates a flow diagram of entropy decoding, according to some embodiments of the present application;
fig. 11 illustrates a flow diagram of another entropy decoding method, according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the present application include, but are not limited to, entropy decoding methods, media, program products, and electronic devices.
In order to facilitate understanding of the technical solutions of the present application, technical terms possibly related to embodiments of the present application will now be described, specifically as follows:
AV1 (Alliance for Open Media Video 1) standard: an open-source, royalty-free video coding format formulated by the Alliance for Open Media (AOM). Encoding an image with the AV1 standard involves the division of the image into coding blocks, intra prediction, inter prediction and motion compensation, transform coding, entropy coding, and loop filtering after encoding.
Entropy coding: multi-symbol arithmetic coding (Multi-Symbol Entropy Coding) based on Daala (a video codec). Each syntax element is drawn from an alphabet of N symbols, with symbol values ranging from 0 to 15. The technical scheme of the present application is based on the multi-symbol arithmetic entropy coding method of the AV1 standard.
Entropy decoding: i.e. the inverse of entropy coding.
CDF table: the method comprises the steps of entropy coding and parameters of syntax elements involved in the entropy decoding process, wherein in the entropy decoding process, each parameter in a CDF table corresponding to a frame image is updated when the entropy decoding of the frame image is completed.
The technical solutions of the embodiments of the present application are described in further detail below through the accompanying drawings and the embodiments.
Fig. 1 illustrates a scenario of cross-device video playback according to some embodiments of the present application, involving an electronic device 100 and an electronic device 200. The electronic device 200 and the electronic device 100 may be devices that communicate with each other in the same local area network or the same internet of things, may be devices that are far apart, for example in different places, or may be devices logged into the same user account, devices forming a super virtual terminal, and so on.
When a user wants to use an application installed on the electronic device 100 for capturing, editing, or playing video, and the capturing performance of the electronic device 200 is better, the electronic device 200 may serve as an external device of the electronic device 100 to capture the video and then send it to the electronic device 100, which can store, edit, and play the received video. For example, a commentator may use a sports live-streaming application installed on the electronic device 100 to provide live commentary on a sporting event, while the electronic device 200, with its better shooting performance, captures high-definition game video of the players; the electronic device 200 transmits the video acquired in real time to the electronic device 100 for display, and the commentator provides synchronized commentary on the received video through the application. As another example, a user may view, through the electronic device 100, surveillance video captured by the electronic device 200 within a specific area, the electronic device 200 transmitting the collected video to the electronic device 100 for display.
Typically, to reduce the amount of data in transmitting video across devices, the electronic device 200 compresses (i.e., encodes) the captured video, obtains compressed video data, and then transmits the compressed video data to the electronic device 100. The electronic device 100 may decode the compressed video data to obtain decoded video.
For example, in some embodiments of the present application, the electronic device 200 encodes the captured video using the AV1 standard and transmits the encoded video data to the electronic device 100. Accordingly, upon receiving encoded video data (hereinafter referred to as video to be decoded) transmitted from the electronic apparatus 200, the electronic apparatus 100 needs to decode the video to be decoded using the AV1 standard.
In some embodiments, the general process by which the electronic device 200 encodes the captured video using the AV1 standard is:
Taking the compression of a frame image A in the video to be encoded as an example, the electronic device 200 divides the image A into a plurality of coding blocks (also referred to as macroblocks) and extracts the information of each macroblock through intra prediction, inter prediction, motion estimation, motion compensation, and the like, representing that information with the values of syntax elements, which include: the motion vector difference (MVD), the motion mode (intra or inter prediction), the macroblock coding pattern (cbp), block prediction residual coefficients, the intra prediction mode, and so on. The electronic device 200 then entropy encodes the values of the syntax elements (i.e., the information of the macroblocks) to obtain the final compressed bitstream. Entropy coding is the last operation of video encoding and directly produces the code stream.
It will be appreciated that the electronic device 100, after receiving the video to be decoded transmitted by the electronic device 200, needs to decode it using the AV1 standard. Decoding is the inverse of encoding, and the video to be decoded must first be entropy decoded to recover the values of the syntax elements. Entropy decoding is the first operation of video decoding; the specific entropy decoding process performed by the electronic device 100 after receiving the video to be decoded is described in detail below with reference to its block diagram and is not repeated here.
In addition, it should be noted that the electronic device 100 and the electronic device 200 to which the entropy decoding method provided herein applies may be any electronic device having video encoding and decoding functions, including but not limited to a mobile phone, a vehicle-mounted device, a personal computer, an artificial intelligence device, a tablet computer, a personal digital assistant, a smart wearable device (such as a smart watch, a bracelet, or smart glasses), a smart television (also called a smart large screen, smart screen, or large-screen television), a virtual reality/mixed reality/augmented reality display device, a server, and so on.
Furthermore, it will be appreciated that fig. 1 above is merely illustrative of one distributed video capture scenario of the video decoding scheme provided herein. The entropy decoding method provided by the application can be applied to any scene which needs video encoding and decoding, for example, the scheme of the application can be applied to scenes such as panoramic sound movies, ultra-high definition televisions, internet broadband audio and video services, digital audio and video broadcasting wireless broadband multimedia communication and the like.
The following describes the system architecture of the electronic device 100, which needs to entropy decode the video to be decoded in the scenario shown in fig. 1. As shown in fig. 2, the electronic device 100 includes a processor 11, system control logic 12, non-volatile storage 13, system memory 14, input/output (I/O) devices 15, a communication interface 16, and firmware 17.
The processor 11 includes a control unit 111, a storage unit 112, and an arithmetic unit 113. In addition, the processor 11 may include one or more processing units, such as processing modules or processing circuits including a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a microcontroller unit (MCU), a field-programmable gate array (FPGA), an artificial intelligence processing unit (AIPU), a neural-network processing unit (NPU), and the like. The different processing units may be separate devices or may be integrated in one or more processors.
In some embodiments, when the electronic device 100 decodes the video to be decoded received from the electronic device 200 shown in fig. 1, the processor 11 is configured to determine whether the target CDF table corresponding to the frame currently to be decoded is stored in the storage unit 112, so as to decide whether to decode that frame using the target CDF table stored in the storage unit 112 or the one stored in the system memory 14.
In some embodiments, the storage unit 112 is configured to store, during decoding of the video by the electronic device 100, a plurality of CDF tables updated from the 4 default CDF tables provided in the AV1 standard specification (SPEC). The CDF tables stored in the storage unit 112 may be updated directly from these 4 default tables; calling a CDF table updated directly from the 4 default tables a reference CDF table, the stored tables may also be updated from a reference CDF table during entropy decoding. Typically there are only 7 reference CDF tables.
When the electronic device 100 decodes the video to be decoded, the code stream of the video to be decoded generally carries identification information for indicating a CDF table used when entropy decoding each frame of image in the video to be decoded.
For example, if the identification field parsed from the code stream for the CDF table used to entropy decode the current frame is one of 0 to 6, decoding that frame requires one of the aforementioned 7 reference CDF tables. If the identification field is 7, decoding the frame requires one of the aforementioned 4 default CDF tables. In some embodiments, a CDF table occupies approximately 110K bytes of memory.
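The selection by identification field can be sketched as follows (hypothetical names; how one of the 4 default tables is chosen when the field is 7 is not stated in the text, so the sketch uses a placeholder choice):

```python
def select_cdf_table(id_field, reference_tables, default_tables):
    """Map the identification field parsed from the code stream to a table:
    0..6 select one of the 7 reference CDF tables; 7 indicates that a
    default CDF table is used instead."""
    if 0 <= id_field <= 6:
        return reference_tables[id_field]
    if id_field == 7:
        return default_tables[0]  # placeholder: which of the 4 defaults applies is unspecified
    raise ValueError(f"unexpected identification field: {id_field}")
```

At roughly 110K bytes per table, the 7 reference tables alone occupy on the order of 770K bytes, which is why the on-chip storage unit may hold only a subset of them.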
In some embodiments, the control unit 111 is configured to maintain, at the hardware layer of the electronic device 100, a table containing index information, for example address information, of the CDF tables stored in the storage unit 112, so that when the arithmetic unit 113 decodes a frame of the video to be decoded, it can look up the address of the target CDF table required for decoding that frame according to the index information and obtain the target CDF table from the storage unit 112 at the queried address.
In some embodiments, the arithmetic unit 113 is configured to decode the frame currently to be decoded in the video using the target CDF table acquired from the storage unit 112, so as to obtain a decoded video frame.
The system memory 14 may include random-access memory (RAM), double data rate synchronous dynamic random-access memory (DDR SDRAM), and the like, and is used to store a plurality of CDF tables, for example the 7 reference CDF tables updated from the aforementioned 4 default CDF tables. When the target CDF table is present in the system memory 14, the arithmetic unit 113 decodes the video using that table and updates it to obtain a new CDF table. The system memory 14 may also temporarily store data or instructions of the electronic device 100.
Firmware 17, i.e., a program written into a programmable read-only memory of the electronic device 100, differs from ordinary software in that it is program code solidified inside an integrated circuit and is responsible for controlling and coordinating the functions of that circuit. For example, in some embodiments, the firmware 17 integrates some control functions of the electronic device 100 that do not depend on its operating system, such as the power-on self-test and the reading and writing of CDF tables between the processor 11 and the system memory 14.
The system control logic 12 may include any suitable interface controller to provide any suitable interface to other modules of the electronic device 100 such that the various modules of the electronic device 100 may communicate with one another.
The non-volatile memory 13 may be a tangible, non-transitory computer-readable medium comprising one or more instructions for permanently storing data and/or instructions. The nonvolatile memory 13 may include any suitable nonvolatile memory such as flash memory and/or any suitable nonvolatile storage device, for example, a Hard Disk Drive (HDD), compact Disc (CD), digital versatile Disc (Digital Versatile Disc, DVD), solid State Drive (SSD), and the like. In some embodiments, the nonvolatile memory 13 may also be a removable storage medium, such as a Secure Digital (SD) memory card or the like. In some embodiments, the non-volatile memory 13 is used to permanently store data or instructions of the electronic device 100, such as instructions for retrieving a target CDF table from the storage unit 112.
Input/output (I/O) devices 15 may include input devices such as a keyboard, mouse, touch screen, etc. for converting user operations into analog or digital signals and communicating to processor 11; and an output device, such as a speaker, printer, display, etc., for presenting information in the electronic device 100 to a user in the form of sound, text, images, etc.
The communication interface 16 provides a software/hardware interface for the electronic device 100 to communicate with other electronic devices, such that the electronic device 100 can exchange data with other electronic devices, e.g., the electronic device 100 may obtain encoded video from the electronic device 200 shown in fig. 1 through the communication interface 16.
It should be understood that the system structure of the electronic device 100 shown in fig. 2 is merely an example, and in other embodiments, the electronic device 100 may include more or fewer modules, and some modules may be combined or split, which is not limited in the embodiments herein.
The specific process of entropy decoding video to be decoded by the electronic device 100 in the scenario shown in fig. 1 will be described in detail below in conjunction with the system configuration diagram of the electronic device 100 shown in fig. 2, and fig. 3 to 11.
Fig. 3 illustrates a data flow of the electronic device 100 shown in fig. 1 for retrieving a target CDF table from the system memory 14 of the electronic device 100 when entropy decoding video to be decoded using the AV1 standard, according to some embodiments of the present application.
As shown in fig. 3, when the electronic device 100 entropy decodes a video to be decoded (not shown), it in essence decodes each encoded frame in that video. When each frame is entropy decoded using the AV1 standard, the target CDF table required for decoding it can be determined from the identification information carried in the code stream received from the electronic device 200, which indicates the CDF table used when entropy decoding each frame. Each required target CDF table is then retrieved from the system memory 14.
Since the decoding principle is the same for every frame, the following takes a single frame as an example to describe the data flow when entropy decoding the video to be decoded in the embodiment shown in fig. 3.
Specifically, assume the control unit 111 determines that the target CDF table corresponding to the image P1 currently to be decoded by the arithmetic unit 113 is CDF1. CDF1 is first read from the system memory 14 by the firmware 17 and written into the storage unit 112, and the arithmetic unit 113 reads CDF1 from the storage unit 112 when entropy decoding the image P1.
When the arithmetic unit 113 completes the entropy decoding of the image P1, CDF1 is updated (the update process is described below with the flowcharts and is not repeated here) to obtain CDF1', which is written into the storage unit 112; the control unit 111 then reads CDF1' from the storage unit 112 and sends an instruction to the firmware 17, which replaces CDF1 in the system memory 14 with CDF1' in response.
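The fig. 3 data flow described above can be summarized in a short sketch (hypothetical names; the decode and update steps are stand-ins, since the per-symbol arithmetic is not at issue here):

```python
def decode_and_write_back(image, table_id, storage_unit, system_memory):
    """Sketch of the fig. 3 flow: the firmware reads CDF1 from system
    memory into the storage unit, the arithmetic unit decodes the image
    and updates the table to CDF1', and the firmware writes CDF1' back,
    replacing CDF1 in system memory."""
    cdf = system_memory[table_id]              # firmware read (slow path)
    storage_unit[table_id] = cdf               # staged for the arithmetic unit
    decoded = image                            # stand-in for entropy decoding
    updated = [v + 1 for v in cdf]             # stand-in for the parameter update
    storage_unit[table_id] = updated           # CDF1' written to the storage unit
    system_memory[table_id] = updated          # firmware write-back replaces CDF1
    return decoded
```

Both the initial read and the final write-back cross the firmware on every frame, which is exactly the latency the scheme of the application removes.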
However, in the embodiment shown in fig. 3, every target CDF table required for decoding can be obtained from the system memory 14 only through the firmware 17, and after entropy decoding of a frame is completed, the updated CDF table must be written back to the system memory 14 through the firmware 17 to replace the corresponding table. That is, the entropy decoding of every frame involves the firmware 17 reading a CDF table from the system memory 14 and writing the updated table back to it. Because these firmware read/write operations introduce considerable delay, decoding a frame takes a long time, so the player may be unable to decode the video in time, which degrades the user experience.
To solve the long-latency problem of the embodiment shown in fig. 3, some embodiments of the present application store a plurality of the CDF tables required for decoding directly in the storage unit 112 of the processor 11, and maintain in each of the control unit 111 and the firmware 17 a table containing indexes of the CDF tables stored in the storage unit 112; through these indexes, the corresponding CDF table can be read from the storage unit 112. Since at least part of the target CDF tables required when entropy decoding the video can be found via an index in this table, the arithmetic unit 113 can read the target CDF table directly from the storage unit 112. The storage unit 112 is a memory space inside the processor 11, such as static random-access memory (SRAM), so the speed of reading and writing the storage unit 112 is substantially higher than that of the system memory 14. Compared with the embodiment shown in fig. 3, the technical scheme of the present application therefore greatly reduces the delay of reading and writing back CDF tables during entropy decoding, so that the video is decoded quickly and the user experience is improved.
It is understood that the table maintained by the control unit 111 may be stored in the storage unit 112, and the control unit 111 may read and update the content of the table.
It should be understood that, in specific implementation, the number of CDF tables stored in the storage unit 112 and which CDF tables are specifically stored may be specifically set according to the need and the size of the storage space of the storage unit 112, which is not limited in this application.
The following describes the technical solution of the present application in detail, taking as an example the electronic device 100 decoding the ith frame image in the video to be decoded by using the ith CDF table CDFi required for decoding that frame, where i is a positive integer greater than 1.
It should be noted that whether the CDFi table exists in the storage unit 112 can be determined by checking whether the index of CDFi is in the table maintained by the control unit 111. For example, if the index of CDFi is in the table maintained by the control unit 111, it is determined that the CDFi table is stored in the storage unit 112; conversely, if there is no index of CDFi in the table maintained by the control unit 111, it is determined that the CDFi table is not stored in the storage unit 112.
Assuming that the storage unit 112 stores M CDF tables in advance, whether CDFi is obtained directly from the storage unit 112 or from the system memory 14 depends, in combination with the size of the storage space of the storage unit 112, on which of the following three cases applies.
First case: the index corresponding to CDFi is found in the table maintained by the control unit 111, so that CDFi can be obtained directly from the storage unit 112 according to the index.
Second case: the index corresponding to CDFi cannot be found in the table maintained by the control unit 111, and the storage space of the storage unit 112 is not fully occupied, that is, the storage unit 112 can still store further data. CDFi therefore needs to be obtained from the system memory 14; the CDFi obtained from the system memory 14 is then stored in the storage unit 112, and the index information of the CDFi table is added to the table maintained by the control unit 111.
Third case: the index corresponding to CDFi cannot be found in the table maintained by the control unit 111, and the storage space of the storage unit 112 is full, that is, no further data can be written to the storage unit 112. CDFi therefore needs to be obtained from the system memory 14; one CDF table is deleted from the storage unit 112, the deleted CDF table is written back into the system memory 14, and the CDFi obtained from the system memory 14 is then written into the storage unit 112. The CDF table deleted from the storage unit 112 may be the CDF table that was written into the storage unit 112 first; it is understood that the CDF table written into the storage unit 112 first is the least likely to be used again and may therefore be replaced.
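The three cases above amount to a small cache of CDF tables with first-in-first-out replacement, backed by the slower system memory. The sketch below is a minimal illustration under that reading; the class and all names (CdfCache, system_memory, etc.) are hypothetical and not part of the described hardware.

```python
from collections import OrderedDict

class CdfCache:
    """Illustrative sketch of the storage-unit CDF cache with FIFO
    eviction, covering the three cases described in the text."""

    def __init__(self, capacity, system_memory):
        self.capacity = capacity            # number of CDF tables the fast memory can hold
        self.tables = OrderedDict()         # insertion-ordered: first written = first evicted
        self.system_memory = system_memory  # slower backing store: index -> CDF table

    def get(self, index):
        # Case 1: index found in the maintained table -> read directly from fast memory.
        if index in self.tables:
            return self.tables[index]
        table = self.system_memory[index]   # cache miss: fetch from system memory
        if len(self.tables) >= self.capacity:
            # Case 3: storage full -> evict the table written first and write it back.
            old_index, old_table = self.tables.popitem(last=False)
            self.system_memory[old_index] = old_table
        # Case 2 (and the tail of case 3): store the fetched table and record its index.
        self.tables[index] = table
        return table
```

In this sketch the `OrderedDict` plays the role of both the stored tables and the maintained index table, since membership in it is exactly the "index is present" test described above.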
These three cases are described in detail below with reference to fig. 4 to 10, respectively.
Example 1
The process of entropy decoding a video to be decoded in the first case described above by the electronic device 100 will be described in detail with reference to fig. 4 and 5.
To facilitate understanding of the entropy decoding process of the electronic device 100 shown in fig. 5, the entropy decoding process shown in fig. 5 is first generally described with reference to fig. 4.
When the operation unit 113 uses CDFi to entropy decode the ith frame image in the video to be decoded, all index information of the table B maintained by the control unit 111 is first obtained from the storage unit 112. If the index of CDFi is found in the index information of table B, that is, CDFi is stored in the storage unit 112, the operation unit 113 reads CDFi directly from the storage unit 112 according to the index of CDFi, for example, from the address addr_i-1 of the storage unit 112. After the operation unit 113 entropy decodes the ith frame image in the video to be decoded with CDFi, CDFi is updated to CDFi', and the CDFi stored at the address addr_i-1 of the storage unit 112 is then replaced with CDFi' for later use.
The execution subject of each step in the flowchart shown in fig. 5 is the operation unit 113 of the electronic device 100, specifically, as shown in fig. 5, the process of entropy decoding the video to be decoded by the electronic device 100 in the first case described above includes the following steps:
Step 501: and determining a target CDF table corresponding to the current frame of image to be decoded.
For example, in some embodiments, the electronic device 100 obtains the video to be decoded from the electronic device 200 shown in fig. 1 through wireless communication such as WiFi or Bluetooth. The video to be decoded is encoded by the electronic device 200 using the AV1 standard. In some embodiments, the electronic device 100 determines the target CDF table corresponding to the frame image currently to be decoded according to identification information carried in the code stream acquired from the electronic device 200, which indicates the CDF table used when entropy decoding each frame image in the video to be decoded.
Step 502: in the case where the target CDF table is stored in the storage unit 112, the target CDF table is read from the storage unit 112, and the image to be decoded is entropy decoded using the target CDF table.
For example, in some embodiments, when the operation unit 113 entropy decodes the ith frame image in the video to be decoded, the target CDF table corresponding to the ith frame image is determined to be CDFi according to the identification information carried in the code stream, which indicates the CDF table used when entropy decoding each frame image in the video to be decoded. The operation unit 113 first acquires from the storage unit 112 all index information of the table B maintained by the control unit 111 as shown in fig. 4, which includes the index information of all CDF tables already stored in the storage unit 112. Table B may be stored in the storage unit 112 or in a register of the processor 11. If the index of CDFi is found in the index information of table B, it is determined that CDFi is stored in the storage unit 112, the target CDF table can be read from the storage unit 112, and the ith frame image can be entropy decoded using CDFi.
The specific entropy decoding process will be separately introduced below and will not be described here.
Step 503: the target CDF table is updated.
For example, after the entropy decoding of the ith frame image is completed using the target CDF table, the operation unit 113 updates the parameters of the syntax elements referred to in the target CDF table, such as the value of the most probable symbol (Most Probable Symbol, MPS) and the value of lg_pmps, for use in subsequent decoding.
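As an illustration of what updating a CDF table can look like, the sketch below nudges each cumulative probability toward the decoded symbol. This is a simplified adaptive-update rule for illustration only, not the exact update defined by the AV1 specification; the Q15 fixed-point range and the adaptation rate are assumptions.

```python
def update_cdf(cdf, decoded_symbol, rate=5):
    """Simplified sketch of adaptive CDF update: after decoding a symbol,
    each cumulative probability is nudged so that the decoded symbol
    becomes more probable on subsequent decodes."""
    prob_max = 1 << 15  # probabilities assumed to be Q15 fixed point (0..32768)
    for i in range(len(cdf)):
        if i < decoded_symbol:
            cdf[i] -= cdf[i] >> rate               # boundaries below the symbol move down
        else:
            cdf[i] += (prob_max - cdf[i]) >> rate  # boundaries at or above it move up
    return cdf
```

Repeating this update after each decoded symbol is what makes the coding context adaptive: frequently decoded symbols acquire more probability mass, shortening their codes.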
Step 504: the target CDF table before update in the storage unit 112 is replaced with the target CDF table after update.
For example, after the operation unit 113 completes entropy decoding of the ith frame image, CDFi shown in fig. 4 is updated to CDFi', the CDFi stored in the storage unit 112 is replaced with CDFi', and the information of CDFi in table B is replaced with the information of CDFi'.
It can be seen that the operation unit 113 directly acquires from the storage unit 112 the CDF table (i.e., CDFi) required for entropy decoding the ith frame image in the video to be decoded. Moreover, after completing entropy decoding of the ith frame image, the operation unit 113 directly writes the updated CDFi' into the storage unit 112. Since the storage unit 112 is a memory space inside the processor 11, its data reading and writing speed is high. Therefore, compared with the embodiment shown in fig. 3, the embodiments shown in fig. 4 and fig. 5 can increase the speed of reading and writing the CDF table, thereby increasing the decoding speed and improving the user experience.
Example two
The process of entropy decoding the video to be decoded in the above-described second case by the electronic device 100 will be described in detail with reference to fig. 6 and 7 first.
To facilitate understanding of the entropy decoding process of the electronic device 100 shown in fig. 7, the entropy decoding process shown in fig. 7 is first generally described with reference to fig. 6.
As shown in fig. 6, when the operation unit 113 uses CDFi to entropy decode the ith frame image in the video to be decoded, if the index of CDFi is not found in the index information of table B, that is, CDFi is not stored in the storage unit 112, the operation unit 113 acquires CDFi from the system memory 14 through the firmware 17 and writes it into the storage unit 112, and the operation unit 113 then reads CDFi from the storage unit 112 to entropy decode the ith frame image.
The flowchart shown in fig. 7 is similar to the flowchart shown in fig. 5, except that step 702 shown in fig. 7 differs from step 502 shown in fig. 5. Only step 702 is described in detail below. As shown in fig. 7, step 702 is as follows:
step 702: in the case where the target CDF table is not stored in the storage unit 112 and the storage space of the storage unit 112 is not full, the target CDF table is acquired from the system memory 14, written into the storage unit 112, and the image to be decoded is entropy decoded using the target CDF table.
For example, in some embodiments, when the operation unit 113 entropy decodes the ith frame image in the video to be decoded, the target CDF table corresponding to the ith frame image is determined to be CDFi according to the identification information carried in the code stream, which indicates the CDF table used when entropy decoding each frame image in the video to be decoded. The operation unit 113 first acquires from the storage unit 112 all index information of the table B maintained by the control unit 111 as shown in fig. 4, which includes the index information of all CDF tables already stored in the storage unit 112. If the index of CDFi is not found in the index information of table B, it is determined that CDFi is not stored in the storage unit 112. The operation unit 113 then instructs the firmware 17 through the control unit 111 to acquire CDFi from the system memory 14. The firmware 17 writes the acquired CDFi into the storage unit 112 through the control unit 111, and the firmware 17 and the control unit 111 each add the index information of CDFi to the tables they respectively maintain.
Table A maintained by the firmware 17 is the software-layer counterpart of table B maintained by the control unit 111, so the CDF information included in table A and table B is the same. For example, when CDFi is written to the address addr_m of the storage unit 112, the control unit 111 adds the index information of CDFi to table B, and the firmware 17 likewise adds the index information of CDFi to table A, for example the address of CDFi in the storage unit 112 and the name information of CDFi.
The control unit 111 then sends the index of CDFi that has been added to table B to the operation unit 113, and the operation unit 113 reads CDFi from the storage unit 112 according to that index, for example from the address addr_m of the storage unit 112. After the operation unit 113 entropy decodes the ith frame image in the video to be decoded with CDFi, CDFi is updated to CDFi', and the CDFi stored at the address addr_m of the storage unit 112 is then replaced with CDFi' for use by the operation unit 113 when decoding subsequent images to be decoded.
The specific entropy decoding process will be separately introduced below and will not be described here.
It can be seen that although the operation unit 113 obtains the CDF table (i.e., CDFi) required for entropy decoding the ith frame image in the video to be decoded from the system memory 14, after completing the entropy decoding it writes the updated CDFi' directly into the storage unit 112. Since the storage unit 112 is a memory space inside the processor 11, its data reading and writing speed is high. Thus, in contrast to the embodiment shown in fig. 3, the embodiments shown in fig. 6 and fig. 7 do not require writing the updated CDF table back to the system memory 14. If the updated CDF table is needed when the next frame image is entropy decoded, it can be obtained directly from the storage unit 112, which increases the speed of reading and writing the CDF table when decoding the next frame image, thereby increasing the decoding speed of the next frame image and improving the user experience.
Example III
The process of entropy decoding the video to be decoded in the third case described above by the electronic device 100 will be described in detail with reference to fig. 8 and 9 first.
The scheme shown in fig. 8 is similar to that shown in fig. 6 in that the target CDF table is not in the storage unit 112 but in the system memory 14. The only difference is that in the scheme shown in fig. 8 the storage space of the storage unit 112 is full, whereas in fig. 6 it is not full and data can still be written.
The flowchart shown in fig. 9 is similar to the flowchart shown in fig. 7, except that step 902 shown in fig. 9 differs from step 702 shown in fig. 7. Only step 902 is described in detail below. As shown in fig. 9, step 902 is as follows:
step 902: in the case where the target CDF table is not stored in the storage unit 112 and the storage space of the storage unit 112 is full, the target CDF table is acquired from the system memory 14, one CDF table in the storage unit 112 is replaced with the target CDF table, and the image to be decoded is entropy decoded using the target CDF table.
The process by which the operation unit 113 obtains CDFi from the system memory 14 via the control unit 111 and the firmware 17 has been described in connection with fig. 7 and is not repeated here.
After the firmware 17 acquires CDFi from the system memory 14, it writes the acquired CDFi into the storage unit 112 through the control unit 111, and the firmware 17 and the control unit 111 each add the index information of CDFi to the tables they respectively maintain.
It should be noted that, since the storage space of the storage unit 112 is full, only a limited number of CDF tables, for example M CDF tables, can be stored in the storage unit 112. It is not difficult to understand that, in order to write the CDFi obtained from the system memory 14 into the storage unit 112, one CDF table needs to be deleted from the storage unit 112 first, so that CDFi can be written into the freed storage space.
In some embodiments, a CDF table that is unlikely to be used again may be selected from the storage unit 112; for example, the CDF1 stored at the address addr_0 is deleted, the deleted CDF1 is written back to the system memory 14, and the CDFi acquired from the system memory 14 is then written to the address addr_0.
Then, the control unit 111 adds the index information of CDFi to table B, and the firmware 17 likewise adds the index information of CDFi to table A, for example the address of CDFi in the storage unit 112 and the name information of CDFi.
The control unit 111 then sends the index of CDFi that has been added to table B to the operation unit 113, and the operation unit 113 reads CDFi from the storage unit 112 according to that index, for example from the address addr_0 of the storage unit 112.
After the operation unit 113 entropy decodes the ith frame image in the video to be decoded using CDFi, CDFi is updated to CDFi', and the CDFi stored at the address addr_0 of the storage unit 112 is then replaced with CDFi' for use by the operation unit 113 when entropy decoding subsequent images to be decoded.
In contrast to the embodiment shown in fig. 3, the embodiments shown in fig. 8 and fig. 9, while acquiring CDFi from the system memory 14, do not require writing the updated CDF table back to the system memory 14. If the updated CDF table is needed when the next frame image is entropy decoded, it can be obtained directly from the storage unit 112, which increases the speed of reading and writing the CDF table when decoding the next frame image, thereby increasing the decoding speed of the next frame image and improving the user experience.
The process of entropy decoding an i-th frame image in a video to be decoded using CDFi, which is referred to in the above steps 502, 702, and 902, will be exemplarily described with reference to fig. 10.
Fig. 10 shows a flowchart of entropy decoding an i-th frame image in a video to be decoded by using CDFi by the computing unit 113 of the electronic device 100 according to an embodiment of the present application, where an execution subject of each step is the computing unit 113, and a specific process includes:
Step 1001: decode the header information in the video code stream to be decoded, and determine the entropy decoding mode and the CDF table required for entropy decoding the ith frame image. The header information consists of the first few fields in the video bitstream acquired by the electronic device 100 and carries the entropy decoding mode, the identification information of the CDF table required for decoding each frame image, and the like.
In some embodiments, the header information includes basic information of the sequence and the image, such as a frame type (frame_type) field and a warp motion (all_motion) field. The header information related to AV1 entropy decoding includes a reference frame information (primary_ref_frame) field, which indicates which reference frame's CDF table is used for decoding the current video frame, and a quantization parameter reference value (base_q_idx) field. In some embodiments, when primary_ref_frame is 0-6, the CDF table of the corresponding reference frame is selected as the CDF table for decoding the current frame. In some embodiments, when primary_ref_frame is 7, one of 4 default tables (the 4 default CDF tables in the SPEC) is selected as the CDF table for decoding the current frame according to the value of base_q_idx. For example, when the value of base_q_idx is within the interval [0, 20], table No. 0 is selected; when the value of base_q_idx is within the interval (20, 60], table No. 1 is selected; when the value of base_q_idx is within the interval (60, 120], table No. 2 is selected; otherwise table No. 3 is selected.
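The selection rule just described can be sketched as follows. The function and parameter names are illustrative, and the treatment of the interval boundaries (half-open, closed on the right) is an assumption where the text leaves them ambiguous.

```python
def select_cdf_table(primary_ref_frame, base_q_idx, ref_frame_cdfs, default_tables):
    """Sketch of choosing the CDF table for the current frame from the
    header fields primary_ref_frame and base_q_idx (names illustrative)."""
    if primary_ref_frame != 7:
        # primary_ref_frame 0..6: reuse the CDF table of the indicated reference frame.
        return ref_frame_cdfs[primary_ref_frame]
    # primary_ref_frame == 7: pick one of the 4 default tables by base_q_idx.
    if base_q_idx <= 20:       # [0, 20]
        return default_tables[0]
    elif base_q_idx <= 60:     # (20, 60]
        return default_tables[1]
    elif base_q_idx <= 120:    # (60, 120]
        return default_tables[2]
    else:
        return default_tables[3]
```

A caller would invoke this once per frame, right after parsing the header fields and before starting the entropy decoding loop.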
The entropy decoding mode refers to the specific mode of entropy decoding in each video standard. For example, in some embodiments, by decoding the header information in the video code stream to be decoded, the electronic device 100 determines that the entropy decoding mode under the AV1 standard is to be used and obtains the identification information of the CDF table corresponding to the ith frame image.
Step 1002: and adopting the determined entropy decoding mode, and carrying out entropy decoding on the ith frame image by utilizing CDFi to obtain the ith frame image after entropy decoding.
For example, according to the AV1 standard, the i-th frame image is entropy decoded using the determined CDFi.
First, the syntax elements to be decoded corresponding to each block of the ith frame image need to be decoded. For example, each syntax element to be decoded is decoded in sequence using the syntax element structure of the video bitstream specified by the AV1 standard. When each syntax element is decoded, the binary strings corresponding to the syntax element are decoded in sequence: the current decoding context model is determined according to the results of adjacent decoded syntax elements, and the corresponding variables in the current CDFi table are selected according to the context model to decode the current syntax element. The decoded syntax element is then de-binarized according to the SPEC to obtain its decoding result; after the current syntax element is decoded, the corresponding variables in the CDFi table are updated, and decoding of the next syntax element begins.
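The per-block loop above (choose a context from neighbouring results, decode with that context's CDF variables, then update the CDF) can be sketched with a toy decoder. Everything here is illustrative: the toy decoder simply returns the most probable symbol rather than performing real arithmetic decoding, and all names are hypothetical.

```python
class ToyDecoder:
    """Toy stand-in for the arithmetic decoder, for illustration only."""
    def pick_context(self, element, decoded):
        # Real decoders derive the context from adjacent decoded syntax
        # elements; here the element carries its context index directly.
        return element["ctx"]

    def decode_symbol(self, cdf):
        # Toy rule: return the symbol with the largest probability mass.
        masses = [cdf[0]] + [cdf[i] - cdf[i - 1] for i in range(1, len(cdf))]
        return masses.index(max(masses))

    def update(self, cdf, symbol, rate=5):
        # Nudge the CDF toward the decoded symbol (simplified adaptation).
        return [c - (c >> rate) if i < symbol else c + ((32768 - c) >> rate)
                for i, c in enumerate(cdf)]

def decode_block(elements, cdfs, decoder):
    """Sketch of the loop described above: context -> decode -> update CDF."""
    out = []
    for element in elements:
        ctx = decoder.pick_context(element, out)
        symbol = decoder.decode_symbol(cdfs[ctx])
        cdfs[ctx] = decoder.update(cdfs[ctx], symbol)
        out.append(symbol)
    return out
```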
Then, a prediction process is performed on the current block of the ith frame image using the decoded syntax elements to obtain a predicted block of the current block, from which the image of the current block is reconstructed.
An entropy decoding method provided in the present application will be described in detail with reference to fig. 11 based on the above description of fig. 4 to 10.
Specifically, as shown in fig. 11, the entropy decoding method provided in the present application includes the following steps:
step 1101: and determining a target CDF table corresponding to the current frame of image to be decoded.
Step 1102: it is determined whether the target CDF table is in the storage unit 112. If so, indicating that the target CDF table is in the storage unit 112, proceeding to step 1103; otherwise, it indicates that the target CDF table is not in the storage unit 112, and the process proceeds to step 1104.
For example, in some embodiments, when the operation unit 113 decodes the ith frame image in the video to be decoded, the corresponding target CDF table is CDFi. The operation unit 113 first acquires all index information of table B shown in fig. 4 maintained by the control unit 111. If the index of CDFi is not found in the index information of table B, it is determined that CDFi is not stored in the storage unit 112, and the process proceeds to step 1104; otherwise, if the index of CDFi is found in the index information of table B, it is determined that CDFi is stored in the storage unit 112, and the process proceeds to step 1103.
Step 1103: the target CDF table is acquired from the storage unit 112, and the image to be decoded is entropy decoded using the target CDF table.
That is, in the case where the target CDF table is stored in the storage unit 112, the operation unit 113 can read the target CDF table from the storage unit 112 directly according to the index of the target CDF table in table B shown in fig. 4.
Step 1104: the target CDF table is acquired from the system memory 117, and the image to be decoded is entropy decoded using the target CDF table.
That is, in the case where the target CDF table is not stored in the storage unit 112, the operation unit 113 needs to acquire the target CDF table from the system memory 14.
For example, in the embodiments shown in fig. 6 and fig. 8, if the index of CDFi (i.e., the target CDF table) is not found in the index information of table B, that is, CDFi is not stored in the storage unit 112, the operation unit 113 issues an instruction to the firmware 17 through the control unit 111 so that the firmware 17 acquires CDFi from the system memory 14. The firmware 17 writes the acquired CDFi into the storage unit 112 through the control unit 111, and the firmware 17 and the control unit 111 each add the index information of CDFi to the tables they respectively maintain.
In some embodiments, as shown in fig. 6, if the storage space of the storage unit 112 is not fully occupied, CDFi is written to the address addr_m, the control unit 111 adds the index information of CDFi to table B, and the firmware 17 likewise adds the index information of CDFi to table A, for example the address of CDFi in the storage unit 112 and the name information of CDFi.
In some embodiments, as shown in fig. 8, when the storage space of the storage unit 112 is full, one CDF table is deleted from the storage unit 112, and CDFi is then written into the storage unit 112; for example, the CDF1 stored at the address addr_0 is replaced with CDFi. The control unit 111 adds the index information of CDFi to table B, and the firmware 17 likewise adds the index information of CDFi to table A.
Then, the operation unit 113 reads CDFi directly from the storage unit 112 according to the index of CDFi added to table B, and decodes the corresponding image in the video to be decoded.
Step 1105: the target CDF table is updated. For use by the arithmetic unit 113 in decoding other non-decoded images in the video to be decoded.
Step 1106: the updated target CDF table is stored in the storage unit 112.
For example, CDFi shown in fig. 6 and fig. 8 is updated to CDFi', the CDFi stored in the storage unit 112 is replaced with CDFi', and the index of CDFi in table A and table B is replaced with the index of CDFi'.
It can be seen that the present application decodes the video to be decoded by acquiring the target CDF table required for decoding either directly from the storage unit 112, whose data reading and writing speed is faster, or, in some cases, from the system memory 14, whose data reading and writing speed is slower. The target CDF table used for decoding is then updated, and the updated CDF table is written directly into the storage unit 112 with the faster data reading and writing speed, so that when the updated CDF table is needed subsequently it can be read quickly from the storage unit 112. In contrast, in the embodiment shown in fig. 3, when the operation unit 113 decodes the video to be decoded, all CDF tables are acquired from the system memory 14 with the slower data reading and writing speed, and the updated CDF tables are written back into the system memory 14; the present solution therefore decodes the video to be decoded more quickly.
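Steps 1101-1106 can be summarized in a short sketch. The cache object with get/put methods and all other names are hypothetical stand-ins for the storage unit 112 and the system memory 14; the decode and update callables stand in for the entropy decoding and CDF update steps.

```python
def entropy_decode_frame(frame, cache, decode_fn, update_fn):
    """Sketch of the overall flow of fig. 11 (names illustrative)."""
    table = cache.get(frame.cdf_index)    # steps 1102-1104: hit -> fast memory, miss -> system memory
    decoded = decode_fn(frame, table)     # entropy decode the frame with the target CDF table
    updated = update_fn(table)            # step 1105: update the CDF parameters
    cache.put(frame.cdf_index, updated)   # step 1106: write back to the fast internal memory
    return decoded
```

The key point the sketch captures is that the write-back in the last step targets the fast internal memory, not the system memory, so a following frame that reuses the same table avoids the slow path entirely.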
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the present application may be implemented as a computer program or program code that is executed on a programmable system including at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor such as, for example, a digital signal processor (Digital Signal Processor, DSP), microcontroller, application specific integrated circuit (Application Specific Integrated Circuit, ASIC), or microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. Program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in the present application are not limited in scope to any particular programming language. In either case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed over a network or through other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including but not limited to floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (Random Access Memory, RAM), erasable programmable read-only memories (Erasable Programmable Read Only Memory, EPROM), electrically erasable programmable read-only memories (Electrically Erasable Programmable Read-Only Memory, EEPROM), magnetic or optical cards, flash memory, or a tangible machine-readable memory for transmitting information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or ordering may not be required. Rather, in some embodiments, these features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of structural or methodological features in a particular figure is not meant to imply that such features are required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the present application, each unit/module is a logic unit/module, and in physical aspect, one logic unit/module may be one physical unit/module, or may be a part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules, where the physical implementation manner of the logic unit/module itself is not the most important, and the combination of functions implemented by the logic unit/module is the key to solve the technical problem posed by the present application. Furthermore, to highlight the innovative part of the present application, the above-described device embodiments of the present application do not introduce units/modules that are less closely related to solving the technical problems presented by the present application, which does not indicate that the above-described device embodiments do not have other units/modules.
It should be noted that, in the examples and descriptions of this patent, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.
Claims (9)
1. An entropy decoding method applied to an electronic device, the electronic device comprising a processor, a first memory arranged inside the processor and a second memory arranged outside the processor, the method comprising:
acquiring a first target CDF table required for entropy decoding of a video frame to be decoded, wherein the video frame to be decoded is an i-th frame image in the video to be decoded, the first target CDF table is the i-th CDF table required for decoding the i-th frame image in the video to be decoded, and i is a positive integer greater than 1;
confirming whether the first target CDF table is stored in the first memory according to whether an index of the first target CDF table is present in a form maintained by the processor; and retrieving the first target CDF table from the first memory upon confirming that the first target CDF table is stored in the first memory;
retrieving the first target CDF table from the second memory upon confirming that the first target CDF table is not stored in the first memory;
wherein, upon confirming that the first target CDF table is not stored in the first memory, after the first target CDF table is acquired from the second memory, the method further comprises:
storing the first target CDF table in the first memory upon confirming that the first memory has the capacity to store the first target CDF table; and
replacing a third target CDF table stored in the first memory with the first target CDF table upon confirming that the first memory does not have the capacity to store the first target CDF table, wherein the third target CDF table is the CDF table that was written into the first memory earliest;
entropy decoding the video frame to be decoded based on the acquired first target CDF table to obtain a decoded video frame; and
updating parameters in the first target CDF table during the entropy decoding process to obtain a second target CDF table.
2. The method as recited in claim 1, further comprising:
in the case that the first memory does not have the capacity to store the first target CDF table, writing the third target CDF table into the second memory after the third target CDF table stored in the first memory is replaced with the first target CDF table.
3. The method according to claim 1 or 2, further comprising:
storing the second target CDF table in the first memory.
4. The method of claim 3, wherein the storing the second target CDF table in the first memory comprises:
replacing the first target CDF table stored in the first memory with the second target CDF table, in the case of confirming that the first target CDF table was acquired from the first memory.
5. The method of claim 1, wherein the first memory is SRAM.
6. The method of claim 1, wherein the second memory is one of a RAM and a DDR SDRAM.
7. A computer readable storage medium having stored thereon instructions which, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-6.
8. A computer program product comprising instructions which, when executed by one or more processors, implement the method of any of claims 1-6.
9. An electronic device, comprising:
a memory for storing instructions for execution by one or more processors of the electronic device; and
a processor for performing the method of any one of claims 1-6 when the instructions are executed by the one or more processors.
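Taken together, claims 1-4 describe a two-level CDF-table store: a small on-chip first memory acting as a first-in-first-out cache in front of a larger off-chip second memory, with an evicted table written back to the second memory and a table updated during entropy decoding replacing its original copy. The mechanism can be sketched as follows in Python; all names here (`CdfTableCache`, `fetch`, `update`) are hypothetical, and plain dictionaries stand in for the SRAM/DRAM of the claims, so this is an illustrative model rather than the patent's actual implementation:

```python
from collections import OrderedDict

class CdfTableCache:
    """Model of the claimed two-level CDF-table storage: a small
    'first memory' with FIFO eviction in front of a larger
    'second memory' (both modelled as index -> table mappings)."""

    def __init__(self, capacity, second_memory):
        self.capacity = capacity            # how many CDF tables fit in the first memory
        self.first_memory = OrderedDict()   # insertion-ordered: oldest entry evicted first
        self.second_memory = second_memory  # backing store for all CDF tables

    def fetch(self, index):
        # Claim 1: the processor-maintained index form decides where to look.
        if index in self.first_memory:
            return self.first_memory[index]
        table = self.second_memory[index]
        if len(self.first_memory) >= self.capacity:
            # Claims 1-2: evict the table written in first (FIFO) and
            # write it back to the second memory.
            old_index, old_table = self.first_memory.popitem(last=False)
            self.second_memory[old_index] = old_table
        self.first_memory[index] = table
        return table

    def update(self, index, updated_table):
        # Claims 3-4: the second target CDF table produced by the
        # entropy-decoding update replaces the original in the first memory.
        self.first_memory[index] = updated_table
```

Here `OrderedDict.popitem(last=False)` supplies the first-written-first-evicted order that claim 1 requires; a real decoder would track table indices in hardware rather than by dictionary key.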
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111338420.7A CN114025162B (en) | 2021-11-12 | 2021-11-12 | Entropy decoding method, medium, program product, and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111338420.7A CN114025162B (en) | 2021-11-12 | 2021-11-12 | Entropy decoding method, medium, program product, and electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114025162A CN114025162A (en) | 2022-02-08 |
CN114025162B true CN114025162B (en) | 2024-04-09 |
Family
ID=80063768
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111338420.7A Active CN114025162B (en) | 2021-11-12 | 2021-11-12 | Entropy decoding method, medium, program product, and electronic device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114025162B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102823257A (en) * | 2010-03-15 | 2012-12-12 | MediaTek Singapore Pte. Ltd. | Methods of utilizing tables adaptively updated for coding/decoding and related processing circuits thereof |
CN108513138A (en) * | 2016-02-24 | 2018-09-07 | MediaTek Inc. | Video process apparatus and corresponding video processing method |
CN111726626A (en) * | 2020-06-18 | 2020-09-29 | Shanghai Zhaoxin Semiconductor Co., Ltd. | Integrated circuit and probability table storage method for video decoding |
US11039138B1 (en) * | 2012-03-08 | 2021-06-15 | Google Llc | Adaptive coding of prediction modes using probability distributions |
Also Published As
Publication number | Publication date |
---|---|
CN114025162A (en) | 2022-02-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5351020B2 (en) | Method and apparatus using virtual reference picture | |
RU2628319C2 (en) | Method and device for determination of supporting images for external prediction | |
JP6046839B2 (en) | Inter prediction method and apparatus, motion compensation method and apparatus | |
US9414059B2 (en) | Image processing device, image coding method, and image processing method | |
RU2608263C1 (en) | Method of entropy encoding slice segment and device therefor and method of entropy decoding segment slice and device therefor | |
US11356739B2 (en) | Video playback method, terminal apparatus, and storage medium | |
US9723308B2 (en) | Image processing apparatus and image processing method | |
US20230239464A1 (en) | Video processing method with partial picture replacement | |
KR20150092250A (en) | Jctvc-l0227: vps_extension with updates of profile-tier-level syntax structure | |
US20230022526A1 (en) | Video processing method and apparatus, device, and storage medium | |
CN105681893A (en) | Method and device for decoding stream media video data | |
CN114025162B (en) | Entropy decoding method, medium, program product, and electronic device | |
JP2007081756A (en) | Encoder/decoder, encoding and decoding method, encoding and decoding integrated circuit, and encoding and decoding program | |
JP2008072182A (en) | Moving picture decoding device, moving picture decoding method, moving picture decoding program, moving picture encoding device, moving picture encoding method, moving picture encoding program, and moving picture encoding and decoding device | |
JP2024517915A (en) | Data processing method, device, computer device and computer program | |
US20170070734A1 (en) | Coding apparatus, decoding apparatus, and video transmission system | |
CN114339249B (en) | Video decoding method, readable medium and electronic device thereof | |
JP5326724B2 (en) | Video processing apparatus and control program for video processing apparatus | |
JP2014103429A (en) | Moving image encoder and moving image encoding method | |
TW201635788A (en) | Coding of video and audio with initialization fragments | |
US20130287100A1 (en) | Mechanism for facilitating cost-efficient and low-latency encoding of video streams | |
CN118509633A (en) | Video transcoding method, device, electronic equipment and storage medium | |
CN117998162A (en) | Method and device for packaging dynamic pictures and electronic equipment | |
JP2012191397A (en) | Moving image encoding device and moving image encoding method | |
JP2011139291A (en) | Image playback controller, image playback control method and imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||