CN113838167A - Method and apparatus for generating animation - Google Patents
Method and apparatus for generating animation
- Publication number
- CN113838167A (application CN202010575687.7A)
- Authority
- CN
- China
- Prior art keywords
- animation
- target
- graph
- graphic
- generating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/36—Software reuse
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
Abstract
Embodiments of the present disclosure disclose a method and an apparatus for generating an animation. One embodiment of the method comprises: generating a target graphic using a graphics generation class according to an animation generation request, wherein the graphics generation class is used for abstracting the drawing methods of different types of graphics, and the target graphic is used for generating the animation indicated by the animation generation request; and drawing the target graphic according to a preset refresh rate and a change rate corresponding to the target graphic so as to generate the animation. This embodiment helps to increase the flexibility of the animation content that can be generated.
Description
Technical Field
The embodiments of the present disclosure relate to the field of computer technology, and in particular to a method and an apparatus for generating an animation.
Background
Currently, many mobile-terminal development platforms provide various tools, frameworks, and the like for implementing animations, which developers use to present various animation effects on mobile terminals. These existing tools and frameworks differ in how they implement animation, and even a single tool or framework may offer several different animation implementations.
Taking the Android platform as an example, a developer can use the animation libraries and frameworks provided by the Android platform to implement frame animation (Frame Animation), tween animation (Tween Animation), property animation, and the like. Generally, the cost of implementing an animation effect increases with the complexity of the animation, and the animation implementation process usually consumes considerable system resources.
Disclosure of Invention
Embodiments of the present disclosure propose methods and apparatuses for generating an animation.
In a first aspect, an embodiment of the present disclosure provides a method for generating an animation, the method including: generating a target graphic using a graphics generation class according to an animation generation request, wherein the graphics generation class is used for abstracting the drawing methods of different types of graphics, and the target graphic is used for generating the animation indicated by the animation generation request; and drawing the target graphic according to a preset refresh rate and a change rate corresponding to the target graphic so as to generate the animation.
In some embodiments, the number of target graphics is two or more, and the drawing of the target graphics according to the preset refresh rate and the change rates corresponding to the target graphics includes: drawing each target graphic according to its respective change rate, based on the preset refresh rate, so as to generate the animation.
In some embodiments, the target graphic belongs to a composite graphic, wherein the composite graphic includes at least two sub-graphics.
In some embodiments, the generating of the target graphic using the graphics generation class according to the animation generation request includes: acquiring the target graphic from a graphics cache pool using the graphics generation class according to the animation generation request, wherein the graphics cache pool includes different types of graphics generated in advance.
In some embodiments, the drawing of the target graphic according to the preset refresh rate and the change rate corresponding to the target graphic includes: in response to detecting that the target graphic is not visible, recycling the target graphic.
In some embodiments, after recycling the target graphic in response to detecting that the target graphic is not visible, the method further comprises: generating a new graphic, and drawing the newly generated graphic according to the refresh rate and a change rate corresponding to the newly generated graphic.
In some embodiments, the target graphic corresponds to at least one transformation attribute, wherein the transformation attribute comprises at least one of: transparency, rotation, zoom, translation.
In a second aspect, an embodiment of the present disclosure provides an apparatus for generating an animation, the apparatus including: a graphics generation unit configured to generate a target graphic using a graphics generation class according to an animation generation request, wherein the graphics generation class is used for abstracting the drawing methods of different types of graphics, and the target graphic is used for generating the animation indicated by the animation generation request; and an animation generation unit configured to draw the target graphic according to a preset refresh rate and a change rate corresponding to the target graphic so as to generate the animation.
In some embodiments, the number of target graphics is two or more; and the animation generation unit is further configured to draw each target graphic according to its respective change rate, based on a preset refresh rate, so as to generate the animation.
In some embodiments, the target graphic belongs to a composite graphic, wherein the composite graphic includes at least two sub-graphics.
In some embodiments, the graphics generation unit is further configured to obtain the target graphics from a graphics cache pool using a graphics generation class according to the animation generation request, wherein the graphics cache pool includes different types of graphics generated in advance.
In some embodiments, the animation generation unit is further configured to recycle the target graphic in response to detecting that the target graphic is not visible.
In some embodiments, the animation generation unit is further configured to generate a new graphic after the target graphic is recycled, and draw the newly generated graphic according to a refresh rate and a change rate corresponding to the newly generated graphic.
In some embodiments, the target graphic corresponds to at least one transformation attribute, wherein the transformation attribute comprises at least one of: transparency, rotation, zoom, translation.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; storage means for storing one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method as described in any implementation of the first aspect.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable medium on which a computer program is stored, which computer program, when executed by a processor, implements the method as described in any of the implementations of the first aspect.
According to the method and apparatus for generating an animation provided by embodiments of the present disclosure, the drawing methods of different types of graphics are abstracted by implementing a graphics generation class, various types of graphics (including custom graphics) are generated using the graphics generation class, and the graphics are drawn according to the refresh rate and the change rates of the graphics to form an animation effect. This improves the flexibility of the animation content that can be generated.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow diagram for one embodiment of a method for generating an animation according to the present disclosure;
FIG. 3 is a flow diagram of yet another embodiment of a method for generating an animation according to the present disclosure;
FIG. 4 is a flow diagram of yet another embodiment of a method for generating an animation according to the present disclosure;
FIG. 5 is a schematic diagram of an embodiment of an apparatus for generating an animation according to the present disclosure;
FIG. 6 is a schematic structural diagram of an electronic device suitable for use in implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and are not restrictive of the invention. It should also be noted that, for convenience of description, only the portions related to the relevant invention are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary architecture 100 to which embodiments of the presently disclosed method for generating an animation or apparatus for generating an animation may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The terminal devices 101, 102, 103 interact with a server 105 via a network 104 to receive or send messages or the like. Various client applications may be installed on the terminal devices 101, 102, 103. For example, browser-like applications, search-like applications, instant messaging-like applications, social platform software, information flow-like applications, image-like applications, animation-like applications, and so forth.
The terminal devices 101, 102, and 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices that support animation presentation, including but not limited to smart phones, tablet computers, e-book readers, laptop portable computers, desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they can be installed in the electronic devices listed above and may be implemented as multiple pieces of software or software modules (e.g., multiple pieces of software or software modules for providing distributed services) or as a single piece of software or software module. This is not specifically limited herein.
The server 105 may be a server that provides various services, such as a backend server that provides support for client applications installed on the terminal devices 101, 102, 103. The server 105 may respond to various requests sent by the terminal apparatuses 101, 102, 103, and may also return the response results to the terminal apparatuses 101, 102, 103.
It should be noted that the method for generating an animation provided by the embodiments of the present disclosure is generally executed by the terminal devices 101, 102, 103, and accordingly the apparatus for generating an animation is generally disposed in the terminal devices 101, 102, 103. In this case, the exemplary system architecture 100 may not include the server 105 and the network 104.
It should also be noted that the server 105 may likewise support animation rendering. In that case, the method for generating an animation may be executed by the server 105, and accordingly the apparatus for generating an animation may be provided in the server 105; the exemplary system architecture 100 may then not include the terminal devices 101, 102, 103 and the network 104.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed server cluster composed of a plurality of servers, or may be implemented as a single server. When the server 105 is software, it may be implemented as multiple pieces of software or software modules (e.g., multiple pieces of software or software modules used to provide distributed services), or as a single piece of software or software module. And is not particularly limited herein.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for generating an animation according to the present disclosure is shown. The method for generating an animation includes the following steps:

Step 201: generating a target graphic using a graphics generation class according to an animation generation request.
In the present embodiment, the execution subject of the method for generating animation (such as the terminal devices 101, 102, 103 shown in fig. 1) can generate a target graphic using a graphic generation class according to an animation generation request.
The animation generation request may be used to indicate the animation that is desired to be generated. Animation generation requests may be predetermined by a technician according to actual application requirements, or may be sent by a user through the terminal the user uses.
The graph generation class may be used to abstract the drawing methods of different types of graphs. The type of graphics is generally diverse. For example, types of graphics include, but are not limited to: circular, square, rectangular, quadrilateral, triangular, elliptical. In some cases, a Bitmap (Bitmap) may also be considered a type of graphics. The graphics generation class may be pre-written by a technician.
In particular, the graphics generation class may abstract the styles and rendering methods of various types of graphics. Based on this, drawing of various graphics can be realized by inheriting the graphics generation class, thereby generating various graphics. Moreover, the technician can generate various customized graphics using the graphics generation class.
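The following Kotlin sketch illustrates what such a graphics generation class and its inheriting graphic types might look like on Android; the names ShapeGraphic, CircleGraphic and SquareGraphic are illustrative assumptions rather than identifiers used in this disclosure, and only Android's standard Canvas and Paint APIs are relied on:

```kotlin
import android.graphics.Canvas
import android.graphics.Paint

// Illustrative base class abstracting the drawing method of different graphic types.
// Concrete graphics (circle, square, custom shapes, ...) inherit from it and supply
// their own drawing logic.
abstract class ShapeGraphic {
    var x = 0f                       // position on the canvas
    var y = 0f
    var alpha = 255                  // transparency transformation attribute (0..255)
    protected val paint = Paint(Paint.ANTI_ALIAS_FLAG)

    abstract fun draw(canvas: Canvas)   // each subclass implements its own drawing method
}

// Pre-written subclass for circles.
class CircleGraphic(var radius: Float) : ShapeGraphic() {
    override fun draw(canvas: Canvas) {
        paint.alpha = alpha
        canvas.drawCircle(x, y, radius, paint)
    }
}

// Pre-written subclass for squares.
class SquareGraphic(var side: Float) : ShapeGraphic() {
    override fun draw(canvas: Canvas) {
        paint.alpha = alpha
        canvas.drawRect(x, y, x + side, y + side, paint)
    }
}
```

Custom graphics would be added in the same way: a technician writes one more subclass and overrides its drawing method.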
Alternatively, files used to generate graphics of specified types may be written in advance by a technician. In this case, the classes in these files inherit from the above-described graphics generation class.
For example, a file for generating a triangle, a file for generating a circle, a file for generating a square, and the like may be written in advance.
On this basis, when a graphic of a specified type needs to be generated, the corresponding graphic can be generated directly from the pre-written file, which improves the generation efficiency of the target graphic and, in turn, the efficiency of the animation implementation process.
The target graphic may be used to generate the animation that the animation generation request indicates should be generated. It should be appreciated that animation effects can be presented through certain changes in the graphics.
Step 202: drawing the target graphic according to a preset refresh rate and a change rate corresponding to the target graphic to generate the animation.

In the present embodiment, the refresh rate may refer to the screen refresh rate of the above-described execution subject. The screen refresh rate may represent the number of times per second that the image on the screen is updated, typically measured in hertz (Hz).
The change rate corresponding to a graphic may represent how fast the graphic changes. Depending on the application requirements, the animation indicated by the animation generation request differs, so the way the graphics used to generate the animation change may differ, and their change rates may differ as well. It should be understood that graphics may change in many ways. Generally, the change rate corresponding to a graphic can be set according to the actual application requirements.
Optionally, each target graphic may correspond to at least one transformation attribute. Transformation attributes may include, but are not limited to, transparency, rotation, scaling, translation, and the like. It should be understood that different transformation attributes can be set flexibly according to different application requirements. In this case, the change of the target graphic is formed by transforming its transformation attributes.
The change rate corresponding to each target graphic may then include a change rate for each transformation attribute that the target graphic has. For each target graphic, each of its transformation attributes may correspond to its own change rate, and the change rates of different transformation attributes of the same target graphic may be the same or different.
By independently controlling the transformation attributes of the target graphs, various animation effects can be flexibly realized, and a plurality of complex animations can be realized, so that the flexibility of animation realization is improved.
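A minimal way to model such per-attribute change rates, continuing the hypothetical Kotlin sketch above (the type names TransformState and TransformRates are assumptions introduced here for illustration), is:

```kotlin
// Illustrative holder for the transformation attributes of one target graphic.
data class TransformState(
    var alpha: Float = 1f,          // transparency, 0..1
    var rotation: Float = 0f,       // degrees
    var scale: Float = 1f,
    var translateX: Float = 0f,
    var translateY: Float = 0f
)

// Illustrative per-attribute change rates, expressed per second; the increment
// actually applied at each refresh is rate * elapsed time.
data class TransformRates(
    val alphaPerSec: Float = 0f,
    val rotationPerSec: Float = 0f,
    val scalePerSec: Float = 0f,
    val translateXPerSec: Float = 0f,
    val translateYPerSec: Float = 0f
)
```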
The execution subject can continuously draw the target graphic on the screen according to the screen refresh rate and the change rate corresponding to the target graphic, thereby forming an animation. It should be understood that the amount of change of the target graphic can be obtained as the product of the change rate and the elapsed time; the target graphic is then continuously redrawn according to this change, and the dynamic change process of the target graphic forms the animation effect.
Different types of execution subjects (e.g., different operating systems) may draw graphics in different ways. For example, for an execution subject running the Android operating system, the target graphic may be drawn on the screen by calling the graphics drawing methods provided by the Android system, for example the Canvas-related application program interfaces (APIs). Meanwhile, the callback mechanism provided by the TimeAnimator class can be used to control the continuous drawing of the target graphic, thereby forming the animation.
Specifically, during each callback of the TimeAnimator, the increment of each transformation attribute of the target graphic is calculated using the change rate corresponding to that transformation attribute, and the target graphic is then redrawn according to the new attribute values.
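A hedged Kotlin sketch of this redraw loop on Android might look as follows; TimeAnimator, Canvas and View are real Android APIs, while AnimationView and the ShapeGraphic/TransformRates types reuse the illustrative names assumed in the earlier sketches:

```kotlin
import android.animation.TimeAnimator
import android.content.Context
import android.graphics.Canvas
import android.view.View

// Illustrative custom View that redraws its target graphics on every TimeAnimator
// callback, applying increments computed from each graphic's change rate.
class AnimationView(context: Context) : View(context) {

    // Each entry pairs a graphic with its transformation change rates.
    private val graphics = mutableListOf<Pair<ShapeGraphic, TransformRates>>()

    private val animator = TimeAnimator().apply {
        setTimeListener { _, _, deltaTime ->
            val dt = deltaTime / 1000f   // milliseconds -> seconds
            for ((graphic, rates) in graphics) {
                // Increment = change rate * elapsed time (rotation/scale handled similarly).
                graphic.x += rates.translateXPerSec * dt
                graphic.y += rates.translateYPerSec * dt
                graphic.alpha = (graphic.alpha + rates.alphaPerSec * dt * 255)
                    .toInt().coerceIn(0, 255)
            }
            invalidate()   // schedule a redraw, paced by the screen refresh rate
        }
    }

    fun addGraphic(graphic: ShapeGraphic, rates: TransformRates) {
        graphics += graphic to rates
        if (!animator.isStarted) animator.start()
    }

    override fun onDraw(canvas: Canvas) {
        super.onDraw(canvas)
        for ((graphic, _) in graphics) graphic.draw(canvas)
    }
}
```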
In some optional implementations of this embodiment, the number of target graphics generated using the graphics generation class according to the animation generation request may be two or more. In other words, the animation indicated by the animation generation request can result from changes of at least two graphics.
In this case, each target graphic may correspond to its own change rate, and the change rates of different target graphics may be the same or different. After the at least two target graphics are generated, each target graphic can be drawn according to its respective change rate, based on a preset refresh rate, so as to generate the animation.
Therefore, various custom graphics can be generated using the graphics generation class according to the actual application requirements, which enriches the animation effects that can be achieved. Meanwhile, the change of each generated graphic, and each transformation attribute of each graphic, can be controlled independently, which further improves the flexibility and complexity of the achievable animation effects. For example, effectively unlimited, randomized animation effects can be achieved on this basis.
In the prior art, realizing an animation effect usually requires writing many related configuration files or a large amount of code. In some cases, more complex animations cannot be realized by writing code or configuration files at all, and a video is used directly as the animation. However, writing code and configuration files, or using videos, increases file size, so that the whole system occupies more memory and other resources. For example, client applications that use video as animation are typically bulky, which also affects the performance of the client application at run time.
The method provided by the above embodiment of the present disclosure generates various types of graphics using a pre-written graphics generation class that abstracts the drawing methods of different types of graphics, and then draws each graphic based on the screen refresh rate and the change rate of that graphic, thereby forming an animation. Compared with the prior art, there is no need to write a large number of code files and related configuration files, nor to use videos to present the animation, so development cost can be saved, overall memory occupation reduced, and the fluency of the animation presentation process improved.
With further reference to FIG. 3, a flow 300 of yet another embodiment of a method for generating an animation is shown. The flow 300 of the method for generating an animation comprises the following steps:

Step 301: generating at least two target graphics using a graphics generation class according to an animation generation request, wherein the target graphics include a composite graphic.
In this embodiment, the generated at least two target graphics may include a composite graphic. A target graphic belonging to a composite graphic includes at least two sub-graphics.
For a target graphic belonging to a composite graphic, when the target graphic changes, the at least two sub-graphics it includes change correspondingly as a whole. The sub-graphics included in each composite graphic can be flexibly added or removed as required.
Therefore, various sub-graphics can be freely combined into a target graphic according to the actual application requirements, and that target graphic can then be changed and controlled as a whole. On this basis, many complex graphics can be realized by combining simple graphics, so that complex animation effects are achieved.
It should be noted that composite graphics support nesting. For example, for a target graphic belonging to a composite graphic, the sub-graphics it includes may themselves be composite graphics.
Step 302: drawing each target graphic according to its respective change rate, based on a preset refresh rate, to generate the animation.
Alternatively, a target graphic belonging to a composite graphic may be drawn by drawing each of its sub-graphics in sequence. Specifically, drawing the target graphic can be achieved by sequentially calling the drawing methods of the sub-graphics it includes.
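Continuing the same illustrative Kotlin sketch (CompositeGraphic is an assumed name, and ShapeGraphic comes from the earlier sketch), a composite graphic that draws its sub-graphics in sequence and can itself be nested might look like this:

```kotlin
import android.graphics.Canvas

// Illustrative composite graphic: a target graphic made of sub-graphics that are
// changed and drawn together as a whole. Because a CompositeGraphic is itself a
// ShapeGraphic, composites can be nested.
class CompositeGraphic : ShapeGraphic() {
    private val children = mutableListOf<ShapeGraphic>()

    fun add(child: ShapeGraphic) { children += child }       // sub-graphics can be added...
    fun remove(child: ShapeGraphic) { children -= child }    // ...or removed as required

    override fun draw(canvas: Canvas) {
        canvas.save()
        canvas.translate(x, y)                        // apply the composite's own transform once
        for (child in children) child.draw(canvas)    // draw each sub-graphic in sequence
        canvas.restore()
    }
}
```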
The specific execution process of the content not described in detail in the above steps 301 and 302 can refer to the related description of steps 201 and 202 in the corresponding embodiment of fig. 2, and is not described herein again.
The method provided by the above embodiment of the present disclosure, by supporting composite graphics, enables a technician to freely combine various types of graphics into more complex graphics according to actual application requirements, and to control and change the combined graphics as a whole, thereby realizing various complex animation effects. Meanwhile, the graphics in an animation realized in this way are fully decoupled, that is, their drawing processes are independent of one another, which further improves the flexibility of animation implementation.
With further reference to FIG. 4, a flow 400 of yet another embodiment of a method for generating an animation is shown. The flow 400 of the method for generating an animation comprises the following steps:

Step 401: acquiring a target graphic from a graphics cache pool using a graphics generation class according to an animation generation request.
In this embodiment, the graphics cache pool may include different types of graphics generated in advance. The target graphic can therefore be obtained directly from the graphics cache pool using the graphics generation class according to the animation generation request, which improves the generation efficiency of the target graphic.
Alternatively, it may first be determined whether the target graphic currently exists in the graphics cache pool. If so, the target graphic can be obtained from the graphics cache pool using the graphics generation class. If not, the target graphic can be created using the graphics generation class and, at the same time, cached in the graphics cache pool for later use.
It should be understood that the number of various types of graphics in the graphics cache pool, the change rate corresponding to each graphics, the transformation attribute of each graphics, the total number of graphics included in the graphics cache pool, and the like can be flexibly set according to the actual application requirements.
Therefore, using the graphics cache pool improves the generation efficiency of the target graphic to a certain degree, and allows the various types of graphics in the pool to be reused, thereby avoiding an impact on the performance of the execution subject during the animation implementation process.
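One possible reading of such a graphics cache pool, again as an illustrative Kotlin sketch (GraphicCachePool is an assumed name; here a newly created graphic only returns to the pool when it is later recycled), is:

```kotlin
// Illustrative graphics cache pool: graphics of each type are reused instead of being
// allocated anew while the animation is running.
class GraphicCachePool {
    private val pool = mutableMapOf<Class<out ShapeGraphic>, ArrayDeque<ShapeGraphic>>()

    // Return a cached graphic of the requested type if one exists; otherwise create a
    // new one with the supplied factory.
    fun <T : ShapeGraphic> obtain(type: Class<T>, factory: () -> T): T {
        val cached = pool[type]?.removeFirstOrNull()
        @Suppress("UNCHECKED_CAST")
        return (cached as? T) ?: factory()
    }

    // Put a graphic back into the pool so it can be reused later.
    fun recycle(graphic: ShapeGraphic) {
        pool.getOrPut(graphic.javaClass) { ArrayDeque() }.addLast(graphic)
    }
}
```

For example, `pool.obtain(CircleGraphic::class.java) { CircleGraphic(radius = 24f) }` would either reuse a pooled circle or create a new one.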
Step 402: drawing the target graphic according to a preset refresh rate and the change rate corresponding to the target graphic to generate the animation.

In some optional implementations of this embodiment, during the process of drawing the target graphic to generate the animation, the visibility of each target graphic may be monitored to determine whether it is currently in a visible state.
During the change of a graphic, there are many cases in which the target graphic may be in an invisible state. For example, when the transparency of the target graphic has changed to 0, the target graphic is invisible. As another example, when the target graphic has been translated out of the range that the screen can render, it is also invisible. As yet another example, when the target graphic has been scaled down to a size of 0, it is likewise invisible.
At this point, the target graphic may be recycled in response to detecting that it is not visible; specifically, the target graphic may be recycled into the graphics cache pool. This improves system performance and avoids unnecessary resource holding and resource consumption.
In some optional implementations of this embodiment, after it is detected that the target graphic is invisible and the target graphic has been recycled, a new graphic may further be generated, and the newly generated graphic is drawn according to the refresh rate and the change rate corresponding to the newly generated graphic to produce a new animation effect.
In this case, the further generated new graphic may be the same as or different from the recycled target graphic. Attributes such as the type and change rate of the new graphic can be set freely according to the actual application requirements; for example, the new graphic may be generated randomly. In this way, the animation presentation can be adapted freely to different application requirements and scenarios, further improving the flexibility of animation implementation and the complexity of the animations that can be realized.
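A hedged sketch of this recycling step, reusing the illustrative types above (the helper names isInvisible and recycleAndReplace are assumptions), might be:

```kotlin
// Illustrative visibility check: a graphic is treated as invisible when it is fully
// transparent, scaled down to nothing, or translated outside the drawable screen area.
fun isInvisible(g: ShapeGraphic, state: TransformState, screenW: Float, screenH: Float): Boolean =
    state.alpha <= 0f ||
    state.scale <= 0f ||
    g.x !in 0f..screenW ||
    g.y !in 0f..screenH

// When a graphic becomes invisible, recycle it into the cache pool and spawn a new
// (possibly randomly configured) graphic so that the animation keeps running.
fun recycleAndReplace(
    g: ShapeGraphic,
    state: TransformState,
    pool: GraphicCachePool,
    screenW: Float,
    screenH: Float,
    spawnNew: () -> ShapeGraphic
): ShapeGraphic =
    if (isInvisible(g, state, screenW, screenH)) {
        pool.recycle(g)
        spawnNew()
    } else {
        g
    }
```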
The specific execution process of the content not described in detail in steps 401 and 402 may refer to the related description of steps 201 and 202 in the corresponding embodiment of fig. 2, and is not described herein again.
According to the method provided by this embodiment of the present disclosure, the various types of graphics cached in advance in the graphics cache pool can be reused by setting up the graphics cache pool, which avoids system performance jitter caused by continually creating and drawing new graphics during animation implementation, and improves the efficiency and stability of animation implementation.
With further reference to fig. 5, as an implementation of the method shown in the above figures, the present disclosure provides an embodiment of an apparatus for generating an animation, which corresponds to the embodiment of the method shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 5, the apparatus 500 for generating an animation provided by the present embodiment includes a graphics generation unit 501 and an animation generation unit 502. The graphics generation unit is configured to generate a target graphic using a graphics generation class according to an animation generation request, wherein the graphics generation class is used for abstracting the drawing methods of different types of graphics, and the target graphic is used for generating the animation indicated by the animation generation request; and the animation generation unit is configured to draw the target graphic according to a preset refresh rate and a change rate corresponding to the target graphic so as to generate the animation.
In the present embodiment, in the apparatus 500 for generating animation: the specific processing of the graph generating unit 501 and the animation generating unit 502 and the technical effects thereof can refer to the related descriptions of step 201 and step 202 in the corresponding embodiment of fig. 2, which are not repeated herein.
In some optional implementations of this embodiment, the number of target graphics is two or more; and the animation generation unit 502 is further configured to draw each target graphic according to its respective change rate, based on a preset refresh rate, so as to generate the animation.
In some optional implementations of this embodiment, the target graphic belongs to a composite graphic, wherein the composite graphic includes at least two sub-graphics.
In some optional implementations of this embodiment, the graphics generating unit 501 is further configured to obtain the target graphics from a graphics cache pool by using a graphics generating class according to the animation generation request, where the graphics cache pool includes different types of graphics generated in advance.
In some optional implementations of the present embodiment, the animation generation unit 502 is further configured to recycle the target graphics in response to monitoring that the target graphics are not visible.
In some optional implementations of the embodiment, the animation generating unit 502 is further configured to generate a new graph after the target graph is recycled, and draw the newly generated graph according to the refresh rate and the change rate corresponding to the newly generated graph.
In some optional implementations of this embodiment, the target graph corresponds to at least one transformation attribute, where the transformation attribute includes at least one of: transparency, rotation, zoom, translation.
The apparatus provided by the foregoing embodiment of the present disclosure generates, via the graphics generation unit, a target graphic using a graphics generation class according to an animation generation request, where the graphics generation class abstracts the drawing methods of different types of graphics and the target graphic is used to generate the animation indicated by the animation generation request; the animation generation unit then draws the target graphic according to a preset refresh rate and the change rate corresponding to the target graphic to generate the animation. This saves animation development cost, reduces overall memory occupation, and improves the smoothness of the animation presentation process.
Referring now to fig. 6, shown is a schematic diagram of an electronic device (e.g., terminal device in fig. 1) 600 suitable for use in implementing embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a fixed terminal such as a digital TV, a desktop computer, and the like. The terminal device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the use range of the embodiments of the present disclosure.
As shown in fig. 6, electronic device 600 may include a processing means (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 6 may represent one device or may represent multiple devices as desired.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of embodiments of the present disclosure.
It should be noted that the computer readable medium described in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In embodiments of the present disclosure, however, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: generate a target graphic using a graphics generation class according to an animation generation request, wherein the graphics generation class is used for abstracting the drawing methods of different types of graphics, and the target graphic is used for generating the animation indicated by the animation generation request; and draw the target graphic according to a preset refresh rate and a change rate corresponding to the target graphic so as to generate the animation.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a graphics generation unit and an animation generation unit. The names of these elements do not in some cases constitute a limitation on the elements themselves, and for example, a graphics-generating element may also be described as an "element that generates a target graphic using a graphics-generating class according to an animation-generating request".
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to the specific combination of the above-mentioned features, but also encompasses other embodiments in which any combination of the above-mentioned features or their equivalents is made without departing from the inventive concept as defined above. For example, the above features and (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure are mutually replaced to form the technical solution.
Claims (10)
1. A method for generating an animation, comprising:
generating a target graphic using a graphics generation class according to an animation generation request, wherein the graphics generation class is used for abstracting drawing methods of different types of graphics, and the target graphic is used for generating an animation indicated by the animation generation request;
and drawing the target graph to generate animation according to a preset refresh rate and the change rate corresponding to the target graph.
2. The method according to claim 1, wherein the number of the target graphics is two or more; and
the drawing the target graph to generate the animation according to the preset refresh rate and the change rate corresponding to the target graph comprises the following steps:
and respectively drawing each target graph according to the change rate respectively corresponding to each target graph based on the preset refresh rate to generate the animation.
3. The method of claim 1, wherein the target graphic belongs to a composite graphic, wherein the composite graphic comprises at least two sub-graphics.
4. The method of claim 1, wherein the generating a target graphic using a graphic generation class in accordance with the animation generation request comprises:
and acquiring the target graph from a graph cache pool by utilizing a graph generation class according to the animation generation request, wherein the graph cache pool comprises different types of graphs generated in advance.
5. The method of claim 4, wherein the drawing the target graph to generate an animation according to a preset refresh rate and a corresponding change rate of the target graph comprises:
in response to monitoring that the target graphic is not visible, recycling the target graphic.
6. The method of claim 5, wherein after the recycling the target graphic in response to the monitoring that the target graphic is not visible, the method further comprises:
and generating a new graph, and drawing the newly generated graph according to the refresh rate and the change rate corresponding to the newly generated graph.
7. The method according to one of claims 1 to 6, wherein the target graphic corresponds to at least one transformation attribute, wherein the transformation attribute comprises at least one of: transparency, rotation, zoom, translation.
8. An apparatus for generating an animation, comprising:
a graphic generation unit configured to generate a target graphic using a graphic generation class for abstracting a drawing method of graphics of different types according to an animation generation request, the target graphic being used to generate an animation that the animation generation request indicates to generate;
and the animation generation unit is configured to draw the target graph according to a preset refresh rate and a change rate corresponding to the target graph so as to generate an animation.
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010575687.7A CN113838167A (en) | 2020-06-22 | 2020-06-22 | Method and apparatus for generating animation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010575687.7A CN113838167A (en) | 2020-06-22 | 2020-06-22 | Method and apparatus for generating animation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113838167A (en) | 2021-12-24
Family
ID=78963874
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010575687.7A Pending CN113838167A (en) | 2020-06-22 | 2020-06-22 | Method and apparatus for generating animation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113838167A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1134194A (en) * | 1993-11-02 | 1996-10-23 | 塔里根特公司 | Object-oriented graphic system |
CN102866886A (en) * | 2012-09-04 | 2013-01-09 | 北京航空航天大学 | Web-based visual algorithm animation development system |
CN106610828A (en) * | 2015-10-21 | 2017-05-03 | 广州市动景计算机科技有限公司 | Method and device for playing GIF animation based on Android system |
CN105824515A (en) * | 2016-03-22 | 2016-08-03 | 乐视网信息技术(北京)股份有限公司 | Element display method and device |
CN110471700A (en) * | 2019-08-06 | 2019-11-19 | Oppo广东移动通信有限公司 | Graphic processing method, device, storage medium and electronic equipment |
CN110730374A (en) * | 2019-10-10 | 2020-01-24 | 北京字节跳动网络技术有限公司 | Animation object display method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2023040443A1 (en) | Method and device for drawing canvas | |
US20220353587A1 (en) | Method and apparatus for generating music poster, electronic device, and medium | |
WO2022033131A1 (en) | Animation rendering method based on json data format | |
CN111324376B (en) | Function configuration method, device, electronic equipment and computer readable medium | |
CN110134905B (en) | Page update display method, device, equipment and storage medium | |
CN115269886A (en) | Media content processing method, device, equipment and storage medium | |
CN111694629A (en) | Information display method and device and electronic equipment | |
CN114428925A (en) | Page rendering method and device, electronic equipment and computer readable medium | |
WO2024067319A1 (en) | Method and system for creating stickers from user-generated content | |
CN113961280A (en) | View display method and device, electronic equipment and computer-readable storage medium | |
CN116301457A (en) | Target content and page display method, device, equipment and storage medium | |
CN110619615A (en) | Method and apparatus for processing image | |
CN110618811A (en) | Information presentation method and device | |
CN115454306A (en) | Display effect processing method and device, electronic equipment and storage medium | |
CN113138707B (en) | Interaction method, interaction device, electronic equipment and computer-readable storage medium | |
CN113838167A (en) | Method and apparatus for generating animation | |
CN114489910A (en) | Video conference data display method, device, equipment and medium | |
CN114090938A (en) | Page processing method and equipment | |
CN113778566A (en) | Native application calling method and device, electronic equipment and computer readable medium | |
CN112380821B (en) | Graphic display method and device and electronic equipment | |
CN112306339B (en) | Method and apparatus for displaying image | |
CN113553527B (en) | Data display method, device and equipment | |
CN114647472B (en) | Picture processing method, apparatus, device, storage medium, and program product | |
CN114510309B (en) | Animation effect setting method, device, equipment and medium | |
CN114357348B (en) | Display method and device and electronic equipment |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20211224 |