CN118567764A - Interface generation method and electronic equipment
- Publication number: CN118567764A
- Application number: CN202310225183.6A
- Authority: CN (China)
- Prior art keywords: rendering; electronic device; application; rendering mode
- Legal status: Pending
Classifications
- G06F9/451 — Execution arrangements for user interfaces
- G06F9/505 — Allocation of resources (e.g., the CPU) to service a request, the resource being a machine (e.g., CPUs, servers, terminals), considering the load
- G06T15/005 — 3D [three-dimensional] image rendering: general purpose rendering architectures
Abstract
The application discloses an interface generation method and an electronic device. The electronic device supports a unified rendering mode and a split rendering mode. In the unified rendering mode, the electronic device merges the rendering trees of one or more applications and, based on the merged target rendering tree, generates an interface containing the one or more application interfaces in a single render pass, reducing the number of render passes and the power consumption of the electronic device. In the split rendering mode, the electronic device renders each application's rendering tree to generate that application's interface, then composites the application interfaces into an interface containing all of them, improving rendering speed. The electronic device selects the rendering mode suited to its current state, bringing a better user experience.
Description
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an interface generating method and an electronic device.
Background
With the development of electronic technology, electronic devices play an ever larger role in users' daily lives. In addition, as screen resolution, screen size, and other display parameters increase, electronic devices can display more and more content.
However, before the electronic device displays the interfaces of applications, each application must first render its own display content, and a compositor then composites the display content of all applications into the user interface. When an application's load is high, generating its display content takes a long time, and the displayed interface stutters.
Disclosure of Invention
Embodiments of the application provide an interface generation method and an electronic device that supports a unified rendering mode and a split rendering mode. In the unified rendering mode, the electronic device merges the rendering trees of one or more applications and, based on the merged target rendering tree, generates an interface containing the one or more application interfaces in a single render pass, reducing the number of render passes and the power consumption of the electronic device. In the split rendering mode, the electronic device renders each application's rendering tree to generate that application's interface and then composites the application interfaces into an interface containing all of them, improving rendering speed. The electronic device selects the rendering mode suited to its current state, bringing a better user experience.
In a first aspect, an embodiment of the present application provides an interface generating method, including:
the electronic device renders and generates a first interface in a first rendering mode;
the electronic device displays the first interface;
the electronic device receives a first input for opening a first application;
in response to the first input, the electronic device obtains first running state information of one or more applications and first device state information of the electronic device, where the one or more applications include the first application;
the electronic device determines a second rendering mode based on the first running state information and the first device state information;
the electronic device renders and generates a second interface in the second rendering mode, where the second interface includes display content of the first application;
the electronic device displays the second interface.
In this way, when the state of the electronic device changes, the electronic device selects the corresponding rendering mode based on the state of the applications and the state of the device. The rendering mode is switched without the user perceiving it, so the electronic device always uses a rendering mode suited to its current situation and provides a better rendering experience. This also avoids the situation where a device supporting only one rendering mode suffers, in some cases, low rendering efficiency and poor rendering results because of that mode's limitations.
In one possible implementation, each of the first rendering mode and the second rendering mode is the unified rendering mode or the split rendering mode, and the first rendering mode may be the same as or different from the second rendering mode.
In this way, after the state of the electronic device changes, the newly determined rendering mode may be the same as or different from the rendering mode currently in use.
In one possible implementation, after the electronic device displays the second interface, the method further includes:
the electronic device obtains second running state information of the one or more applications and second device state information of the electronic device, where the one or more applications include the first application;
the electronic device determines a third rendering mode based on the second running state information and the second device state information;
the electronic device renders and generates a third interface in the third rendering mode, where the third interface includes display content of the first application;
the electronic device displays the third interface.
In this way, if the state of the electronic device changes while the first application is running, the electronic device changes its rendering mode accordingly and generates the interface in the new mode.
In one possible implementation, the third rendering mode is a unified rendering mode or a split rendering mode.
In one possible implementation, the second rendering mode is the unified rendering mode; before the electronic device renders and generates the second interface in the second rendering mode, the method further includes:
the method comprises the steps that a first application generates a first rendering tree, and the first rendering tree is used for drawing display content of the first application;
The first application sends the first rendering tree to a system rendering service of the electronic device;
the system rendering service renders and generates a second interface based on the first rendering tree.
In this way, in the unified rendering mode the electronic device generates the second interface through the system rendering service. The rendering tree is submitted to the GPU in a single pass, without invoking the GPU multiple times, which reduces the GPU load and improves the energy efficiency of the graphics rendering pipeline. The unified rendering mode also makes it easier for the electronic device to implement cross-window animation effects, making their development more convenient and uniform.
In one possible implementation, the second rendering mode is the split rendering mode; before the electronic device renders and generates the second interface in the second rendering mode, the method further includes:
the first application generates a first rendering tree;
the first application renders and generates its display content based on the first rendering tree;
the first application sends its display content to a system rendering service of the electronic device;
the system rendering service generates the second interface based on the display content of the first application.
In this way, in the split rendering mode each application of the electronic device performs its own rendering. When the electronic device generates a user interface in the split rendering mode, each application independently renders its layer, the system rendering service then composites the layers of all applications, and finally the interface is refreshed to the display screen of the terminal. Each application runs independently, so the degree of parallelism is high.
In one possible implementation, the first running state information includes one or more of the window display type of the one or more applications, the rendering load of the one or more applications, the task load of the one or more applications, and whether the one or more applications are in an animation process; the first device state information includes one or more of the CPU load of the electronic device, the load of the system rendering service of the electronic device, the GPU load of the electronic device, the hardware performance of the electronic device, and the battery level of the electronic device.
In this way, the electronic device decides its rendering mode jointly from multiple parameters, which yields a rendering mode that better fits the device's actual state.
In one possible implementation, the method further includes: the electronic device is configured with a weight value for each parameter in the first running state information and the first device state information, and determines the second rendering mode based on the values of the parameters and their weight values.
In this way, the electronic device sets a weight value for each parameter, with larger weights for more important parameters, so that the resulting rendering mode better matches the state of the electronic device.
In one possible implementation, the electronic device determines the second rendering mode based on the first running state information and the first device state information as follows:
the electronic device obtains an application rendering mode for each of the one or more applications based on the first running state information, where each application rendering mode is the split rendering mode or the unified rendering mode;
the electronic device obtains a device rendering mode based on the first device state information, where the device rendering mode is the split rendering mode or the unified rendering mode;
the electronic device sets, as the second rendering mode, whichever of the split rendering mode and the unified rendering mode appears more often among the one or more application rendering modes and the device rendering mode.
Therefore, when the electronic device runs a plurality of application programs, the rendering mode of the electronic device can be jointly determined based on the states of the application programs and the states of the device, so that the obtained rendering mode is more suitable for the electronic device.
In a second aspect, embodiments of the present application provide a chip system applied to an electronic device. The chip system includes one or more processors configured to invoke computer instructions to cause the electronic device to perform the method described in any possible implementation of the first aspect.
In a third aspect, embodiments of the present application provide a computer program product including instructions that, when run on an electronic device, cause the electronic device to perform the method described in any possible implementation of the first aspect.
In a fourth aspect, embodiments of the present application provide an electronic device including one or more processors and a memory. The memory is coupled to the one or more processors and stores computer program code comprising computer instructions; the one or more processors invoke the computer instructions to cause the electronic device to perform the method described in any possible implementation of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium including instructions that, when run on an electronic device, cause the electronic device to perform the method described in any possible implementation of the first aspect.
Drawings
Fig. 1A is a schematic flowchart of an electronic device generating a target bitmap in the split rendering mode according to an embodiment of the present application;
Fig. 1B is a schematic diagram of a scene in which an electronic device generates a target bitmap in the split rendering mode according to an embodiment of the present application;
Fig. 2A is a schematic flowchart of an electronic device generating a target bitmap in the unified rendering mode according to an embodiment of the present application;
Fig. 2B is a schematic diagram of a scene in which an electronic device generates a target bitmap in the unified rendering mode according to an embodiment of the present application;
Fig. 3 is a schematic flowchart of an interface generation method according to an embodiment of the present application;
Fig. 4 is a schematic flowchart of modules according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
Fig. 6 is a diagram of a software architecture according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in this disclosure refers to and encompasses any and all possible combinations of one or more of the listed items.
The terms "first," "second," and the like, are used below for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the application, unless otherwise indicated, the meaning of "a plurality" is two or more.
The term "User Interface (UI)" in the following embodiments of the present application is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. The user interface is a source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, and the interface source code is analyzed and rendered on the electronic equipment to finally be presented as content which can be identified by a user. A commonly used presentation form of a user interface is a graphical user interface (graphic user interface, GUI), which refers to a graphically displayed user interface that is related to computer operations. It may include visual interface elements such as text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and the like that are displayed in a display of the electronic device.
For ease of understanding, related terms and related concepts related to the embodiments of the present application are described below. The terminology used in the description of the embodiments of the application herein is for the purpose of describing particular embodiments of the application only and is not intended to be limiting of the application.
When the electronic device generates a user interface, it can perform the rendering operations in a specified rendering mode to obtain the user interface displayed on the display screen. The specified rendering mode may be the split rendering mode or the unified rendering mode.
When the electronic device generates a user interface in the split rendering mode, each application of the electronic device renders a bitmap of its own interface. The system rendering service (RenderService) of the electronic device then composites the bitmaps of the applications, and the resulting image is finally refreshed to the display screen of the electronic device.
Specifically, the interface that the electronic device shows on the display screen may contain the interfaces of one or more applications; that is, the electronic device generates an interface for each of the one or more applications and composites those interfaces to obtain the composited interface shown on the screen.
In this process, an application of the electronic device generates a rendering tree, renders the rendering tree into a bitmap through the graphics processing unit (GPU), and places the bitmap on the surface corresponding to the application. The application then stores the generated surface in a buffer of the buffer queue (BufferQueue) provided by the surface compositor (SurfaceFlinger). The surface compositor obtains the bitmaps generated by each application from the BufferQueue and composites them, and the electronic device displays the composited bitmap. In some examples, the surface compositor determines, through the window manager service (WMS), which applications' bitmaps are visible on the display screen. After determining the visible bitmaps, the surface compositor may also delegate the composition of the bitmaps, e.g., client composition or device composition, to the hardware composer (HWC). In some examples, when the electronic device implements a complex animation effect, for example starting an application, exiting an application, gesture navigation, color conversion, multi-window display, or screen recording, the surface compositor may first draw part of the bitmap content (referred to here as layers) onto an off-screen buffer using OpenGL ES, and then send it together with the remaining layers to the hardware composer for composition; this may be referred to as "off-screen rendering".
Illustratively, as shown in FIG. 1A, an application includes a UI thread and a render (Render) thread. The UI thread of the application generates the application's rendering tree and passes it to the render thread, which renders a bitmap based on the rendering tree. The surface compositor then composites the bitmaps of the applications, and the composited bitmap is displayed on the display screen.
Take application 1 and application 2 of the electronic device as an example. Application 1 generates rendering tree 1 and then generates bitmap 1 based on it; application 2 generates rendering tree 2 and then generates bitmap 2 based on it. After receiving bitmap 1 and bitmap 2, the surface compositor composites them into the target bitmap. As shown in FIG. 1B, suppose application 1 is the application providing the status bar and application 2 is a calculator. Application 1 generates rendering tree 1 as shown in FIG. 1B and renders bitmap 1 from it; application 2 generates rendering tree 2 as shown in FIG. 1B and renders bitmap 2 from it. The system rendering service then composites bitmap 1 and bitmap 2 into the target bitmap shown in FIG. 1B.
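By way of illustration only, the following Java sketch mirrors this split-rendering flow: each application builds a rendering tree on its UI thread, renders it to a bitmap on its render thread, and the compositor composites the submitted bitmaps into the target bitmap. All class and method names are stand-ins invented for this sketch, not APIs named by the embodiment, and strings stand in for real rendering trees and bitmaps.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-ins for the concepts above; all names are assumptions.
class RenderTree {
    final String appName;
    RenderTree(String appName) { this.appName = appName; }
}

class Bitmap {
    final String content;
    Bitmap(String content) { this.content = content; }
}

// In the split rendering mode, each application renders its own layer.
class Application {
    private final String name;
    Application(String name) { this.name = name; }

    RenderTree buildRenderTree() {      // done on the application's UI thread
        return new RenderTree(name);
    }

    Bitmap render(RenderTree tree) {    // done on the application's render thread
        return new Bitmap("layer of " + tree.appName);
    }
}

// The surface compositor collects each application's bitmap and composites them.
class SurfaceCompositor {
    private final List<Bitmap> bufferQueue = new ArrayList<>();

    void submit(Bitmap layer) { bufferQueue.add(layer); }

    Bitmap composite() {                // one extra GPU pass after the N app passes
        StringBuilder sb = new StringBuilder();
        for (Bitmap b : bufferQueue) sb.append(b.content).append("; ");
        return new Bitmap(sb.toString());  // target bitmap sent to the display
    }
}

public class SplitRenderingDemo {
    public static void main(String[] args) {
        SurfaceCompositor compositor = new SurfaceCompositor();
        for (String app : new String[] {"status bar", "calculator"}) {
            Application a = new Application(app);
            compositor.submit(a.render(a.buildRenderTree()));
        }
        System.out.println(compositor.composite().content);
    }
}
```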
When the electronic device generates the user interface in the split rendering mode, each application independently renders its own layer, the system rendering service (RenderService) then composites the layers of all applications, and finally the interface is refreshed to the display screen of the terminal. Each application runs independently, so the degree of parallelism is high.
However, in a multi-window scenario, from the moment the UI threads of multiple applications generate rendering trees until the surface compositor composites the layers of those applications, the process invoking the GPU keeps changing (application 1, application 2, ..., application N, then the surface compositor), so the electronic device must start the GPU N+1 separate times, where N is the number of applications. Moreover, because the surface compositor is responsible for compositing multiple layers, high demands are placed on its performance. If the interface displayed by the electronic device includes advanced cross-window effects (e.g., application switching or widget display), the system rendering service must process the layers provided by the applications according to the animation effect of each window, which increases the memory and computation overhead of the electronic device.
When the electronic device generates a user interface under the unified rendering framework, each application of the electronic device generates a rendering tree. The system rendering service obtains the rendering trees of one or more applications and reassembles them into a target rendering tree. The system rendering service then renders directly from the target rendering tree, directly obtaining a bitmap carrying the image information of the one or more application interfaces, so the electronic device performs no layer composition operation.
While merging the rendering trees of one or more applications into the target rendering tree, the system rendering service can determine the animation effect of each layer and modify the child nodes of the target rendering tree accordingly, so that it can generate the target bitmap directly from the target rendering tree without off-screen rendering. Moreover, applications do not need to spawn render threads; the system rendering service performs the rendering uniformly, which helps improve interface rendering speed.
It should be noted that when there is only one rendering tree, that rendering tree is the target rendering tree; when there is more than one, the electronic device merges the multiple rendering trees into one target rendering tree.
Further, compared with the split rendering mode, in the unified rendering mode the electronic device merges the rendering trees of one or more applications into the target rendering tree and need not generate multiple bitmaps in advance as layers for composition. While rendering the target rendering tree into the bitmap, the system rendering service can obtain the Z-order of the layer corresponding to each application's rendering tree and use it as the Z-order of that rendering tree; the Z-order identifies the stacking relationship of the layers.
Optionally, the electronic device may prune the child nodes corresponding to content that is not displayed or does not affect the display, thereby avoiding overdraw. After an application generates its rendering tree, it sends the rendering tree to the system rendering service. The system rendering service can traverse the child nodes of each rendering tree, determine the position of the view corresponding to each child node within its surface, and, combined with the Z-order of the rendering trees, determine which views are fully occluded (together with their child nodes). The system rendering service then deletes from the rendering tree the child nodes corresponding to fully occluded views.
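By way of illustration only, the following Java sketch shows one way such occlusion pruning could work: the service walks the children of a rendering tree and removes any child whose view is fully covered by an opaque layer above it in the Z-order. The ViewNode type, the single-rectangle occlusion test, and the use of java.awt.Rectangle are simplifying assumptions of this sketch.

```java
import java.awt.Rectangle;
import java.util.ArrayList;
import java.util.List;

// Hypothetical rendering-tree node; each child corresponds to a view with bounds.
class ViewNode {
    final Rectangle bounds;
    final List<ViewNode> children = new ArrayList<>();
    ViewNode(Rectangle bounds) { this.bounds = bounds; }
}

public class OcclusionPruner {
    // Removes subtrees fully covered by an opaque layer higher in the Z-order;
    // 'occluders' holds the bounds of the opaque layers stacked above this tree.
    static void prune(ViewNode node, List<Rectangle> occluders) {
        node.children.removeIf(child -> isFullyOccluded(child.bounds, occluders));
        for (ViewNode child : node.children) prune(child, occluders);
    }

    // Simplification: occlusion by a single rectangle; real occlusion may be
    // the union of several opaque regions.
    static boolean isFullyOccluded(Rectangle view, List<Rectangle> occluders) {
        for (Rectangle occ : occluders) {
            if (occ.contains(view)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        ViewNode root = new ViewNode(new Rectangle(0, 0, 1080, 2340));
        root.children.add(new ViewNode(new Rectangle(0, 0, 1080, 100)));    // visible
        root.children.add(new ViewNode(new Rectangle(100, 500, 200, 200))); // occluded
        List<Rectangle> occluders = List.of(new Rectangle(0, 400, 1080, 1000));
        prune(root, occluders);
        System.out.println("children kept: " + root.children.size());      // prints 1
    }
}
```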
Illustratively, as shown in FIG. 2A, the UI thread of application 1 generates rendering tree 1 and the UI thread of application 2 generates rendering tree 2, and both applications pass their rendering trees to the system rendering service. The system rendering service generates a new root node and attaches rendering tree 1 and rendering tree 2 to it, obtaining the target rendering tree. It then renders the target bitmap from the target rendering tree, and the electronic device displays it. As shown in FIG. 2B, suppose application 1 is the application providing the status bar and application 2 is a calculator. Application 1 generates rendering tree 1 and application 2 generates rendering tree 2, as shown in FIG. 2B. The system rendering service merges rendering tree 1 and rendering tree 2 into the target rendering tree and renders from it the target bitmap shown in FIG. 2B.
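By way of illustration only, the following Java sketch shows the merge described above: the system rendering service creates a new root node, attaches each application's rendering tree under it in Z-order, and a single traversal of the target tree stands in for the single render pass (one GPU submission) that produces the target bitmap. All names are invented for this sketch.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical rendering-tree node; a real node carries draw operations
// rather than a plain label.
class RenderNode {
    final String label;
    final List<RenderNode> children = new ArrayList<>();
    RenderNode(String label) { this.label = label; }
}

public class UnifiedRenderingDemo {
    // Merge: new root node, application trees attached in Z-order.
    static RenderNode mergeToTargetTree(List<RenderNode> appTrees) {
        RenderNode root = new RenderNode("target-root");
        root.children.addAll(appTrees);   // list order stands in for Z-order
        return root;
    }

    // Single traversal standing in for the one render pass that yields the
    // target bitmap directly, with no separate layer-composition step.
    static void renderOnce(RenderNode node, int depth) {
        System.out.println("  ".repeat(depth) + "draw " + node.label);
        for (RenderNode child : node.children) renderOnce(child, depth + 1);
    }

    public static void main(String[] args) {
        RenderNode statusBar = new RenderNode("status-bar tree");
        RenderNode calculator = new RenderNode("calculator tree");
        RenderNode target = mergeToTargetTree(List.of(statusBar, calculator));
        renderOnce(target, 0);
    }
}
```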
In this way, the rendering tree is submitted to the GPU in a single pass, without invoking the GPU multiple times, which reduces the GPU load and improves the energy efficiency of the graphics rendering pipeline. The unified rendering mode also makes cross-window animation effects easier to implement, and their development more convenient and uniform. However, compared with the split rendering mode, unified rendering adds inter-process communication between the applications and the rendering service to synchronize rendering instructions, and handing the rendering tasks of all applications to the system rendering service increases its burden.
The embodiment of the application provides an interface generation method. The electronic device supports a split rendering mode and a unified rendering mode. The electronic device selects a corresponding rendering mode based on the state of the application and the state of the electronic device.
In this way, the electronic device is provided with a rendering-mode selection policy and can select an appropriate rendering mode according to the state of the applications providing display content and the state of the device. For example, the electronic device may adopt the unified rendering mode when application load is high, and the split rendering mode when application load is low but system load is high. For another example, the electronic device may adopt the unified rendering mode when the battery is low, or when complex animation effects need to be achieved.
The electronic device can switch its rendering mode without the user perceiving it, so that it always uses a mode suited to its current situation and provides a better rendering experience. This also avoids the situation where a device supporting only one rendering mode suffers, in some cases, low rendering efficiency and poor rendering results because of that mode's limitations.
In some examples, the electronic device may determine its rendering mode based on the state of the applications (e.g., the window display type of an application, whether it is in an animation process, its task load, its rendering load) and the state of the device (e.g., the load of the system rendering service, the CPU load, the GPU load, the hardware performance, the battery level). For example, when the load of the system rendering service is high, the electronic device may adopt the split rendering mode to reduce that load and let the applications render their own bitmaps, which also improves rendering efficiency. For another example, when an application has many tasks, the electronic device may adopt the unified rendering mode so that the system rendering service executes the rendering tasks, reducing the application's rendering overhead and allowing the device to display an interface including the application's content smoothly even when the application is busy.
As shown in fig. 3, the interface generation method includes the steps of:
s301, the rendering mode of the electronic device is a first rendering mode, and the electronic device receives input for opening a first application.
Receiving the input to open the first application triggers the electronic device to execute step S302. The first application is an application provided by the electronic device; the electronic device may display an icon of the first application on the display screen, and the icon can be used to trigger the device to display a page of the first application. For example, the first application may be installed on the electronic device, or it may be provided by a cloud server.
S302, the electronic device determines a second rendering mode based on state information of one or more application programs and state information of the electronic device, wherein the one or more application programs comprise a first application, and the first rendering mode is different from the second rendering mode.
The electronic device may obtain, in response to an input to open the first application, state information for one or more applications of the electronic device and state information for the electronic device. Wherein the one or more applications are applications that provide visual display content to the display screen.
The state information of the application may include, but is not limited to, one or more of a window display type of the application, a task load of the application, a rendering load of the application, whether it is in an animation process, and the like.
The window display type of an application indicates whether its window is maximized or shown as a small window; a small window may be a floating window or a split-screen window. When the window is maximized, this parameter does not affect the choice of rendering mode. When the window is shown as a small window, this parameter tends to make the electronic device set the rendering mode to the unified rendering mode.
Whether an application is in an animation process indicates, for example, that the application is playing its launch animation or that its window size is being animated. When an application is in an animation process (or complex animation effects need to be achieved), this parameter tends to make the electronic device set the rendering mode to the unified rendering mode. When the application is not in an animation process, this parameter does not affect the decision.
The rendering load of an application represents the load of rendering its bitmap. When the rendering load is above a rendering-load threshold (e.g., 70% of the peak rendering load), this parameter tends toward the unified rendering mode; below the threshold, it does not affect the decision.
The task load of an application represents the load of the tasks it performs other than rendering. When the task load is above a task-load threshold (e.g., 70% of the peak task load), this parameter tends toward the unified rendering mode; below the threshold, it does not affect the decision.
The state information of the electronic device may include, but is not limited to, one or more of a load of a system rendering service, a load of a CPU, a load of a GPU, a hardware performance of the electronic device, a power level of the electronic device, and the like.
When the load of the system rendering service is above a system-load threshold (e.g., 70% of its peak load), this parameter tends toward the split rendering mode; below the threshold, it does not affect the decision.
When the CPU load is above a CPU-load threshold (e.g., 70% of the peak CPU load), this parameter tends toward the split rendering mode; below the threshold, it does not affect the decision.
When the GPU load is above a GPU-load threshold (e.g., 70% of the peak GPU load), this parameter tends toward the unified rendering mode; below the threshold, it does not affect the decision.
The hardware performance of the electronic device here refers to the performance of its compositor; different electronic devices have compositors of different performance. If the compositor installed in the electronic device performs well, this parameter tends toward the split rendering mode; if the hardware (e.g., the compositor) performs poorly, it tends toward the unified rendering mode. This avoids bitmap composition taking too long in the split rendering mode.
When the battery level of the electronic device is above a battery threshold (e.g., 30%), this parameter does not affect the decision; below the threshold, it tends toward the unified rendering mode.
In this way, when the task loads and rendering loads of the one or more applications are low and none is in an animation process, the second rendering mode is the split rendering mode, and the electronic device completes the rendering operation faster. When at least one of the applications is in an animation process, the second rendering mode is the unified rendering mode, and the electronic device exploits the advantage of unified rendering for cross-window animation. When the task load or rendering load of at least one application is high, the second rendering mode is the unified rendering mode, and the electronic device hands the rendering tasks to the unified system rendering service to reduce the load of the application processes. When the GPU load of the electronic device is high, the second rendering mode is the unified rendering mode; unified rendering uses a single GPU submission, which reduces the GPU load. When the load of the system rendering service and/or the CPU load is high, the second rendering mode is the split rendering mode, so that the application processes share and balance the rendering load. When the battery level is low (e.g., below 30%), the second rendering mode is the unified rendering mode; the power consumption in the unified rendering mode is lower than in the split rendering mode, which saves power and improves the system's energy efficiency. In some examples, when the hardware (e.g., the compositor) of the electronic device performs well and the battery is low, the second rendering mode is the split rendering mode, because the hardware performance of the device does not change and carries greater importance than other parameters.
In other examples, the electronic device may set different weights for the application state information of each application and for the device state information. The electronic device can derive an application rendering mode for each application from its application state information and the corresponding weights, and derive a device rendering mode from the device state information and its weights. It then derives the final rendering mode from the application rendering modes of the applications and the device rendering mode.
Illustratively, in the application state information, the weight ratio of the window display type, whether the application is in an animation process, the task load, and the rendering load is 2:4:2:2. In the device state information, the weight ratio of the load of the system rendering service, the CPU load, the GPU load, the hardware performance, and the battery level is 2:2:2:3:1.
Suppose the one or more applications of the electronic device include application 1 and application 2. If the window display type of application 1 is small-window display, it is not in an animation process, its task load is above the task-load threshold, and its rendering load is above the rendering-load threshold, the results for these parameters are, respectively: unified rendering mode, no influence, unified rendering mode, unified rendering mode; the application rendering mode of application 1 is therefore the unified rendering mode. If the window display type of application 2 is maximized display, it is not in an animation process, its task load is above the task-load threshold, and its rendering load is below the rendering-load threshold, the results are: no influence, no influence, unified rendering mode, no influence; the application rendering mode of application 2 is therefore the unified rendering mode. If, in the device state information, the load of the system rendering service is above the system-load threshold, the CPU load is above the CPU-load threshold, the GPU load is above the GPU-load threshold, the hardware performance is good, and the battery level is above the battery threshold, the results are: split rendering mode, split rendering mode, unified rendering mode, split rendering mode, no influence; the device rendering mode is therefore the split rendering mode. Since the application rendering modes of application 1 and application 2 are both the unified rendering mode and the device rendering mode is the split rendering mode, the unified rendering mode has more votes, and the electronic device obtains the unified rendering mode as the second rendering mode. It will be appreciated that the parameters used here to determine the second rendering mode are merely examples and should not be construed as limiting. It should also be noted that if the electronic device cannot reach a result from the parameters, for example when the two rendering modes receive equal numbers of votes, the electronic device keeps its rendering mode unchanged, i.e., the second rendering mode is the same as the first rendering mode.
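By way of illustration only, the following Java sketch reproduces this two-level decision: a weighted tally turns each application's parameters (and the device's) into one rendering-mode result, and a majority vote across those results selects the second rendering mode, with a tie keeping the current mode. The weights follow the 2:4:2:2 and 2:2:2:3:1 ratios of the example above; everything else is an assumption of the sketch.

```java
import java.util.List;

// Tendency contributed by a single parameter: toward unified rendering,
// toward split rendering, or no influence.
enum Tendency { UNIFIED, SPLIT, NONE }

public class RenderingModeDecision {
    record Vote(Tendency tendency, int weight) {}

    // Weighted tally of one entity's parameters (one application, or the device).
    static Tendency tallyOne(List<Vote> votes) {
        int unified = 0, split = 0;
        for (Vote v : votes) {
            if (v.tendency() == Tendency.UNIFIED) unified += v.weight();
            else if (v.tendency() == Tendency.SPLIT) split += v.weight();
        }
        if (unified == split) return Tendency.NONE;
        return unified > split ? Tendency.UNIFIED : Tendency.SPLIT;
    }

    // Majority vote across the application rendering modes and the device
    // rendering mode; a tie keeps the current mode, as in the example above.
    static Tendency decide(List<Tendency> modes, Tendency current) {
        long unified = modes.stream().filter(m -> m == Tendency.UNIFIED).count();
        long split = modes.stream().filter(m -> m == Tendency.SPLIT).count();
        if (unified == split) return current;
        return unified > split ? Tendency.UNIFIED : Tendency.SPLIT;
    }

    public static void main(String[] args) {
        // Application 1: small window, no animation, high task and rendering load.
        Tendency app1 = tallyOne(List.of(
            new Vote(Tendency.UNIFIED, 2), new Vote(Tendency.NONE, 4),
            new Vote(Tendency.UNIFIED, 2), new Vote(Tendency.UNIFIED, 2)));
        // Application 2: maximized, no animation, high task load, low rendering load.
        Tendency app2 = tallyOne(List.of(
            new Vote(Tendency.NONE, 2), new Vote(Tendency.NONE, 4),
            new Vote(Tendency.UNIFIED, 2), new Vote(Tendency.NONE, 2)));
        // Device: busy rendering service and CPU, busy GPU, good compositor,
        // battery above the threshold (weights 2:2:2:3:1).
        Tendency device = tallyOne(List.of(
            new Vote(Tendency.SPLIT, 2), new Vote(Tendency.SPLIT, 2),
            new Vote(Tendency.UNIFIED, 2), new Vote(Tendency.SPLIT, 3),
            new Vote(Tendency.NONE, 1)));
        System.out.println(decide(List.of(app1, app2, device), Tendency.SPLIT));
        // Prints UNIFIED: two unified results against one split result.
    }
}
```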
It should be further noted that, after determining the second rendering mode, the electronic device may notify all application programs and the system rendering service of the electronic device to perform the rendering operation in the second rendering mode.
Optionally, the electronic device may also detect the states of the one or more applications and of the device in real time and determine the rendering mode based on the acquired state information. In some examples, to save power, the electronic device may instead determine whether its rendering mode should be the unified or the split rendering mode once every preset interval (e.g., 16 ms).
S303, the electronic device renders, in the second rendering mode, an interface including display content of the first application.
After determining the second rendering mode, the electronic device renders the interface including the display content of the first application in that mode: it first switches from the first rendering mode to the second rendering mode and then performs the rendering. Specifically, when the first rendering mode is the split rendering mode and the second is the unified rendering mode, the electronic device notifies the one or more applications and the system rendering service to switch to the unified rendering mode; upon receiving the notification, the one or more applications (including the first application) send their data to be rendered to the system rendering service, which renders the user interface from that data. When the first rendering mode is the unified rendering mode and the second is the split rendering mode, the electronic device notifies the one or more applications and the system rendering service to switch to the split rendering mode; each application then renders its own bitmap and passes it to the system rendering service for bitmap composition and other operations to obtain the user interface.
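By way of illustration only, the following Java sketch shows the switch behavior: when the decision module broadcasts a new mode, each application either submits its rendering tree (unified rendering mode) or renders its own bitmap and submits that (split rendering mode), while the system rendering service adjusts what it expects to receive. The listener interface and all method names are assumptions of this sketch.

```java
import java.util.ArrayList;
import java.util.List;

enum RenderMode { UNIFIED, SPLIT }

// Hypothetical notification interface between the decision module, the
// applications, and the system rendering service.
interface RenderModeListener {
    void onRenderModeChanged(RenderMode mode);
}

class SystemRenderService implements RenderModeListener {
    @Override public void onRenderModeChanged(RenderMode mode) {
        System.out.println("render service now in " + mode + " mode");
    }
    void submitRenderTree(String tree) { System.out.println("merge and render " + tree); }
    void submitBitmap(String bitmap) { System.out.println("composite " + bitmap); }
}

class App implements RenderModeListener {
    private final String name;
    private final SystemRenderService service;
    App(String name, SystemRenderService service) { this.name = name; this.service = service; }

    @Override public void onRenderModeChanged(RenderMode mode) {
        if (mode == RenderMode.UNIFIED) {
            service.submitRenderTree(name + ":render-tree"); // service renders it
        } else {
            service.submitBitmap(name + ":bitmap");          // app rendered it itself
        }
    }
}

public class ModeSwitchDemo {
    public static void main(String[] args) {
        SystemRenderService service = new SystemRenderService();
        List<RenderModeListener> listeners = new ArrayList<>();
        listeners.add(service);
        listeners.add(new App("calculator", service));
        // The decision module notifies everyone only when the mode changes.
        for (RenderModeListener l : listeners) l.onRenderModeChanged(RenderMode.UNIFIED);
    }
}
```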
It should be noted that, here, only the case where the first rendering mode is different from the second rendering mode is described, and if the second rendering mode determined by the electronic device is the same as the first rendering mode, the electronic device does not need to switch the rendering modes.
In other examples, to save power, the electronic device may determine whether its rendering mode should be the unified or the split rendering mode once every preset interval (e.g., 16 ms). In this way, the electronic device can track the states of the applications and of the device and switch rendering modes in good time.
In some examples, the electronic device may include, but is not limited to, a device state awareness module, an application state awareness module, a rendering mode decision module, and the like. The application state awareness module monitors the state of each application and passes each application's state information to the rendering mode decision module. The device state awareness module monitors the state of the electronic device and passes the device state information to the rendering mode decision module. For the state information of the applications and of the device, refer to the embodiment shown in FIG. 3; details are not repeated here.
The rendering mode decision module determines the rendering mode of the electronic device based on the information fed back by the application state awareness module and the device state awareness module, and, after determining it, notifies the applications and the system rendering service of the result. For how the decision is made from the application and device states, refer to the embodiment shown in FIG. 3; details are not repeated here.
In some examples, after determining the rendering mode, the rendering mode decision module notifies each application and the system rendering service through a message mechanism. On receiving the notification, each application and the system rendering service prepare the data synchronization needed before switching; for example, when switching from the split rendering mode to the unified rendering mode, each application sends its rendering tree to the system rendering service. Once the data is synchronized, the electronic device can switch between the two rendering modes without the user perceiving it.
In some examples, the rendering mode decision module notifies the applications and the system rendering service through the message mechanism only when a switch is actually required; when no switch is needed, no notification is sent.
By way of example, FIG. 4 illustrates the data interaction flow between the modules of the electronic device. As shown in FIG. 4, the application state awareness module obtains application state information from the applications and passes it to the rendering mode decision module; the device state awareness module obtains device state information from the system rendering service and passes it to the rendering mode decision module. The rendering mode decision module determines the rendering mode according to a preset rendering-mode selection rule based on the application state information and the device state information, and sends the determined mode to the applications and the system rendering service, which then set their rendering mode accordingly. The data flow between the modules is shown by the solid arrows in FIG. 4.
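By way of illustration only, the following Java sketch wires the three modules of FIG. 4 together: the application state awareness module and the device state awareness module sample state information and feed it to the rendering mode decision module, which applies the selection rule. The module names follow the embodiment; the record fields, sampling methods, and the deliberately trivial decision rule are placeholders for the mechanisms described above.

```java
// Hypothetical state snapshots; fields mirror the parameters listed earlier.
record AppState(String app, boolean smallWindow, boolean inAnimation,
                double taskLoad, double renderLoad) {}

record DeviceState(double renderServiceLoad, double cpuLoad, double gpuLoad,
                   boolean fastCompositor, double batteryLevel) {}

class ApplicationStateAwareness {
    AppState sample(String app) {   // would poll the running application
        return new AppState(app, true, false, 0.8, 0.75);
    }
}

class DeviceStateAwareness {
    DeviceState sample() {          // would poll the rendering service, power manager, ...
        return new DeviceState(0.3, 0.4, 0.8, true, 0.9);
    }
}

class RenderingModeDecisionModule {
    // Placeholder rule; the threshold checks and weighted vote sketched
    // earlier would go here.
    String decide(java.util.List<AppState> apps, DeviceState device) {
        return device.gpuLoad() > 0.7 ? "UNIFIED" : "SPLIT";
    }
}

public class ModuleFlowDemo {
    public static void main(String[] args) {
        var appAware = new ApplicationStateAwareness();
        var devAware = new DeviceStateAwareness();
        var decision = new RenderingModeDecisionModule();
        String mode = decision.decide(
            java.util.List.of(appAware.sample("calculator")), devAware.sample());
        System.out.println("decided mode: " + mode); // then broadcast to apps and service
    }
}
```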
In some examples, the applications and the system rendering service are in the split rendering mode. After the rendering mode decision module determines that the rendering mode should be the unified rendering mode, it notifies the applications and the system rendering service to set the unified rendering mode. An application may then send a rendering message to the system rendering service, as indicated by the dashed arrow in FIG. 4; the rendering message may include the rendering tree generated by the application. In this way, the system rendering service can perform the rendering operations of the unified rendering mode.
It should be noted that the device state awareness module obtaining the device state information from the system rendering service is merely an example. The state information of the electronic device may also include the load of the GPU, the load of the CPU, the power of the electronic device, and the like, which the system rendering service cannot provide to the device state awareness module. The device state awareness module may obtain such state information from the corresponding service; for example, it may obtain the power of the electronic device from a power management module of the electronic device.
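A hedged sketch of this aggregation, again with invented service names (`RenderServiceStats`, `CpuGpuMonitor`, `PowerManagementModule`): each piece of state is read from the service that owns it, rather than solely from the system rendering service. The `DeviceState` type is the one from the earlier sketch.

```java
// Hypothetical sketch: the device state awareness module assembles a snapshot
// from several sources; only the render-service load comes from the system
// rendering service itself.
interface RenderServiceStats { double load(); }
interface CpuGpuMonitor { double cpuLoad(); double gpuLoad(); }
interface PowerManagementModule { int batteryPercent(); }

class DeviceStateAwarenessModule {
    private final RenderServiceStats renderStats;
    private final CpuGpuMonitor cpuGpu;
    private final PowerManagementModule power;

    DeviceStateAwarenessModule(RenderServiceStats renderStats,
                               CpuGpuMonitor cpuGpu,
                               PowerManagementModule power) {
        this.renderStats = renderStats;
        this.cpuGpu = cpuGpu;
        this.power = power;
    }

    DeviceState snapshot() {
        DeviceState state = new DeviceState();
        state.cpuLoad = cpuGpu.cpuLoad();              // from a CPU/GPU monitor
        state.gpuLoad = cpuGpu.gpuLoad();
        state.batteryPercent = power.batteryPercent(); // from power management
        // renderStats.load() would feed a render-service-load field as well.
        return state;
    }
}
```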
In some examples, the rendering mode decision module may detect the launch state of each application. When the rendering mode decision module detects that an application has been launched, it can determine the rendering mode of the electronic device according to the steps described above and notify the application of the determined rendering mode.
It will be appreciated that the rendering mode decision module, upon determining a rendering mode change, notifies all applications and the system rendering service to perform a rendering mode switch operation.
The electronic equipment provided by the embodiment of the application is described below.
Referring to fig. 5, fig. 5 schematically shows a hardware structure of an electronic device according to an embodiment of the present application.
The electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device. In other examples of the application, the electronic device may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller can be a neural center and a command center of the electronic device. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some examples, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some examples, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the connection relationships between the modules illustrated in the embodiments of the present application are only illustrative and do not limit the structure of the electronic device. In other examples of the present application, the electronic device may also use an interfacing manner different from those in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In other examples, the power management module 141 may also be provided in the processor 110. In other examples, the power management module 141 and the charge management module 140 may also be provided in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other examples, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on an electronic device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some examples, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some examples, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some examples, the modem processor may be a stand-alone device. In other examples, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (IR), etc., as applied to electronic devices. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some examples, antenna 1 and mobile communication module 150 of the electronic device are coupled, and antenna 2 and wireless communication module 160 are coupled, such that the electronic device may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques can include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a BeiDou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device implements display functions via a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diode, QLED), or the like. In some examples, the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device may implement shooting functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise and brightness of the image, and can optimize parameters such as the exposure and color temperature of a shooting scene. In some examples, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some examples, the electronic device may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and so on.
Video codecs are used to compress or decompress digital video. The electronic device may support one or more video codecs. In this way, the electronic device may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it can rapidly process input information and can also learn continuously. Applications such as intelligent cognition of the electronic device, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device (e.g., audio data, phonebook, etc.), and so forth. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The receiver 170B, also referred to as an "earpiece," is used to convert audio electrical signals into sound signals. The microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. When a touch operation acts on the display screen 194, the electronic device detects the intensity of the touch operation by means of the pressure sensor 180A, and may also calculate the location of the touch based on the detection signal of the pressure sensor 180A.

The gyro sensor 180B may be used to determine the motion gesture of the electronic device. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a Hall sensor. The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device in various directions (for example, the directions of the three axes in an x, y, z three-axis coordinate system of the electronic device).

The distance sensor 180F is used to measure distance; the electronic device may measure distance by infrared or laser. The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The ambient light sensor 180L is used to sense the ambient light level; the electronic device can adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The fingerprint sensor 180H is used to collect fingerprints. The temperature sensor 180J is used to detect temperature.

The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other examples, the touch sensor 180K may also be disposed on a surface of the electronic device at a location different from that of the display 194.

The bone conduction sensor 180M may acquire a vibration signal. In some examples, the bone conduction sensor 180M may acquire a vibration signal of a vibrating bone mass of the human vocal part, as well as a blood pressure beating signal.

The keys 190 may be mechanical keys or touch keys. The electronic device may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device.
Next, a software architecture diagram provided in an embodiment of the present application will be described.
The layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces. In some examples, the system is divided into four layers: from top to bottom, an application layer, an application framework layer, a system library, and a kernel layer.
As shown in fig. 6, the application layer may include a series of application packages. The application packages may include applications such as camera, gallery, calendar, phone, maps, navigation, WLAN, Bluetooth, music, video, and messaging.
In embodiments of the application, an application may be used to generate a rendering tree. In the separate rendering mode, the application may also render, based on the rendering tree, the bitmap to be displayed.
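As a toy illustration of what "generating a rendering tree" and "rendering the bitmap" might mean, consider the sketch below. The node structure and draw-operation strings are deliberate simplifications and are not the patent's data structures.

```java
import java.util.ArrayList;
import java.util.List;

// Toy model: each node of a rendering tree holds the draw operations of one
// view and links to its child views.
class RenderNode {
    final List<String> drawOps = new ArrayList<>();   // e.g. "drawRect", "drawText"
    final List<RenderNode> children = new ArrayList<>();

    // Separate rendering mode: the application itself walks the tree and
    // rasterizes it into the bitmap that will be displayed.
    void renderInto(Bitmap bitmap) {
        for (String op : drawOps) {
            bitmap.execute(op);
        }
        for (RenderNode child : children) {
            child.renderInto(bitmap);
        }
    }
}

class Bitmap {
    void execute(String drawOp) { /* rasterize one draw operation */ }
}
```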
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
The application framework layer may include a window management service, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window management service is used to manage window programs, such as the launching, adding, and deleting of windows. The window management service may also determine the applications displayed in a window, and determine the creation, destruction, and animation effects of the layers of the applications. The window management service may also be used to determine whether there is a status bar, to lock the screen, to take a screenshot, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The telephony manager is used to provide the communication functions of the electronic device, for example, the management of call status (including connected, hung up, and the like).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give a message alert, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or present notifications on the screen in the form of a dialog interface. For example, text may be prompted in the status bar, a prompt tone may be emitted, the electronic device may vibrate, or an indicator light may blink.
The system rendering service may be used to generate a user interface. In the separate rendering mode, the system rendering service may be responsible for layer composition, implementing animation effects, and the like. In the unified rendering mode, the system rendering service may obtain a target rendering tree based on the rendering trees of one or more applications, and generate a user interface based on the target rendering tree. The system rendering service can synchronize, through the window management service, layer information such as layer creation, destruction, and attribute changes. The system rendering service may also synchronize information of the display area, such as the size of the screen, from the display management service.
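Reusing the toy `RenderNode` and `Bitmap` types from the earlier sketch, the unified path might be pictured as follows. The merge step here simply appends the applications' trees as children of a target root; a real implementation would order the layers using the information synchronized from the window management service.

```java
import java.util.List;

// Hypothetical sketch of the unified rendering mode: merge the rendering
// trees of the visible applications into one target tree, then render the
// entire user interface from that tree.
class UnifiedRenderingService {
    RenderNode buildTargetTree(List<RenderNode> appTrees) {
        RenderNode target = new RenderNode();  // root of the target rendering tree
        // Simplification: z-order is taken to be the list order; real layer
        // ordering comes from the window management service.
        target.children.addAll(appTrees);
        return target;
    }

    Bitmap renderFrame(List<RenderNode> appTrees) {
        Bitmap frame = new Bitmap();           // the composed user interface
        buildTargetTree(appTrees).renderInto(frame);
        return frame;
    }
}
```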
The application state awareness module may be used to monitor the state of applications. In some examples, the application state awareness module monitors only the state of applications that provide visual display content. The device state awareness module may be used to monitor the state of the electronic device. The rendering mode decision module may be used to determine the rendering mode of the electronic device.
The runtime includes a core library and a virtual machine. The runtime is responsible for the scheduling and management of the operating system.
The core library consists of two parts: one part comprises the functions that the java language needs to call, and the other part is the core library of the system.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (Media Libraries), and graphics processing libraries, where the graphics processing libraries include a three-dimensional graphics processing library (e.g., OpenGL ES), a two-dimensional graphics engine (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provides fusion of two-dimensional (2D) and three-dimensional (3D) layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing 3D graphic drawing, image rendering, layer synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver and a virtual card driver.
Those of ordinary skill in the art will appreciate that all or part of the processes of the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, the processes of the above method embodiments may be included. The aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or the like.
Claims (11)
1. An interface generation method, comprising:
The electronic equipment generates a first interface through rendering in a first rendering mode;
the electronic equipment displays the first interface;
The electronic device receives a first input for opening a first application;
The electronic device responds to the first input to acquire first running state information of one or more application programs and first device state information of the electronic device, wherein the one or more application programs comprise the first application;
the electronic equipment determines a second rendering mode based on the first running state information and the first equipment state information;
The electronic equipment generates a second interface through rendering in the second rendering mode, wherein the second interface comprises display content of the first application;
the electronic device displays the second interface.
2. The method of claim 1, wherein the first rendering mode is a unified rendering mode or a separate rendering mode, the second rendering mode is the unified rendering mode or the separate rendering mode, and the first rendering mode is the same as or different from the second rendering mode.
3. The method of claim 1 or 2, wherein after the electronic device displays the second interface, the method further comprises:
the electronic equipment obtains second running state information of one or more application programs and second equipment state information of the electronic equipment, wherein the one or more application programs comprise the first application;
The electronic equipment determines a third rendering mode based on the second running state information and the second equipment state information;
The electronic equipment generates a third interface through rendering in the third rendering mode, wherein the third interface comprises display content of the first application;
and the electronic equipment displays the third interface.
4. The method according to claim 3, wherein the third rendering mode is the unified rendering mode or the separate rendering mode.
5. The method of any of claims 1-4, wherein the second rendering mode is a unified rendering mode; and before the electronic device generates the second interface through rendering in the second rendering mode, the method further includes:
the first application generates a first rendering tree, and the first rendering tree is used for drawing display content of the first application;
the first application sends the first rendering tree to a system rendering service of the electronic device;
The system rendering service renders and generates the second interface based on the first rendering tree.
6. The method of any of claims 1-4, wherein the second rendering mode is a separate rendering mode; and before the electronic device generates the second interface through rendering in the second rendering mode, the method further includes:
the first application generates a first rendering tree;
The first application renders and generates display content of the first application based on the first rendering tree;
The first application sends display content of the first application to a system rendering service of the electronic device;
the system rendering service generates the second interface based on the display content of the first application.
7. The method of any of claims 1-6, wherein the first running state information includes one or more of: a window display type of the one or more applications, a rendering load of the one or more applications, a task load of the one or more applications, and whether the one or more applications are in an animation process; and the first device state information includes one or more of: a load of a CPU of the electronic device, a load of a system rendering service of the electronic device, a GPU load of the electronic device, a hardware performance of the electronic device, and the power of the electronic device.
8. The method of claim 7, wherein the method further comprises:
The electronic device is provided with a weight value for each parameter in the first running state information and the first device state information, and the electronic device determines the second rendering mode based on the value of each parameter and the weight value of each parameter.
9. The method of claim 7, wherein the electronic device determines a second rendering mode based on the first operating state information and the first device state information, specifically comprising:
The electronic equipment obtains an application rendering mode corresponding to one or more application programs based on first running state information of the one or more application programs, wherein the application rendering mode is a separate rendering mode or a unified rendering mode;
The electronic equipment obtains a device rendering mode based on the first device state information, wherein the device rendering mode is the separate rendering mode or the unified rendering mode;
Based on the number of separate rendering modes and the number of unified rendering modes among the one or more application rendering modes and the device rendering mode, the electronic device sets whichever rendering mode occurs the greater number of times as the second rendering mode.
10. An electronic device, the electronic device comprising: one or more processors and memory; the memory is coupled with the one or more processors, the memory for storing computer program code comprising computer instructions that the one or more processors invoke to cause the electronic device to perform the method of any of claims 1-9.
11. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-9.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202310225183.6A | 2023-02-28 | 2023-02-28 | Interface generation method and electronic equipment |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN118567764A | 2024-08-30 |
Legal Events

| Code | Title |
| --- | --- |
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |