CN110874172B - Method, device, medium and electronic equipment for amplifying APP interface - Google Patents
Method, device, medium and electronic equipment for amplifying APP interface
- Publication number
- CN110874172B (granted publication of application CN201811014444.5A)
- Authority
- CN
- China
- Prior art keywords
- view
- mapping
- trigger event
- event
- operation position
- Prior art date
- 2018-08-31
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the invention provide a method for magnifying an APP interface, in which the trigger gesture of a magnifying glass control is registered on the lowest layer of the view hierarchy. The method comprises the following steps: after a trigger event triggered by a screen operation is obtained, determining the mapping position in the original interface corresponding to the operation position of the screen operation; setting the state of the topmost view to a non-responsive event state; passing the trigger event and the mapping position along the responder chain to the lowest layer of the view hierarchy; and having the magnifying glass control process the trigger event according to the mapping position. In the technical solution of the embodiments of the invention, setting the state of the topmost view to the non-responsive event state causes the trigger event to be passed to the lowest layer of the view hierarchy and handled by the magnifying glass control.
Description
Technical Field
The invention relates to the technical field of wireless communication, and in particular to a method, an apparatus, a medium and an electronic device for magnifying an APP interface.
Background
With the advent of the mobile internet era, mobile phones have become an indispensable part of people's lives.
Because mobile phone screens are limited in size, the pictures and text displayed on them are small, and users with poor eyesight, for example due to presbyopia, face certain obstacles when using a mobile phone.
When an APP (application) for a mobile terminal such as a mobile phone is developed on a mobile operating system, the needs of users with poor eyesight may be taken into account. For example, some APPs provide settings for font and picture size; in addition, the iOS operating system can provide an APP magnifier at the platform level, enlarging the whole APP while still supporting real-time taps.
However, when display content is enlarged by adjusting font and picture sizes, the adjustable range is constrained by the aesthetics and layout of the APP interface, and even the largest font may fail to meet the needs of users with poor eyesight. Supporting adjustable font and picture sizes also requires a large amount of development work, especially for an e-commerce APP whose complex interfaces display many categories of commodity information. After development, the software must also be maintained and upgraded, which further increases the workload. For example, during maintenance a developer has to keep several sets of page logic for the same page, so changing one position in an interface requires considering every variant of the page; each set of pages also needs its own resources, which increases the amount of code, inflates the size of the APP, and raises development and maintenance costs.
When display content is enlarged through the platform-wide APP magnifier, every page that uses the magnifier must implement the magnification gesture itself, and every clickable position or control must be specially handled, for example by being assigned a unique identifier and an action attribute. The cost of integrating the magnifying glass control into an application is therefore high.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present invention and therefore may include information that does not constitute prior art known to a person of ordinary skill in the art.
Disclosure of Invention
The embodiments of the invention aim to provide a method for magnifying an APP interface that, at least to some extent, solves the problem of the high cost of integrating a magnifying glass control into an application.
Additional features and advantages of the invention will be set forth in the detailed description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
According to a first aspect of the embodiments of the present invention, there is provided a method for magnifying an APP interface, in which the trigger gesture of a magnifying glass control is registered on the lowest layer of the view hierarchy, the method comprising: after a trigger event triggered by a screen operation is obtained, determining the mapping position in the original interface corresponding to the operation position of the screen operation; setting the state of the topmost view to a non-responsive event state; passing the trigger event and the mapping position along the responder chain to the lowest layer of the view hierarchy; and having the magnifying glass control process the trigger event according to the mapping position.
Preferably, the processing of the trigger event by the magnifying glass control according to the mapping position comprises: obtaining the coordinates of the center point of the magnifying glass according to the mapping position; and rendering the view corresponding to the original interface into a newly created first view by enlarging and translating it according to the center-point coordinates.
Preferably, passing the trigger event and the mapping position along the responder chain to the lowest layer of the view hierarchy comprises: passing the trigger event and the mapping position along the responder chain toward its root view until the lowest layer of the view hierarchy is reached.
Preferably, before the mapping position corresponding to the operation position in the original interface is determined according to the operation position of the screen operation, the method further comprises: when the screen operation indicates that the user has triggered the magnifying glass, rendering a screenshot into a newly created second view, the second view being the topmost view.
Preferably, before the mapping position corresponding to the operation position in the original interface is determined according to the operation position of the screen operation, the method further comprises: obtaining the mapping relationship between the screenshot and the original interface; and determining the mapping position corresponding to the operation position in the original interface according to the mapping relationship and the operation position of the screen operation.
According to a second aspect of the embodiments of the present invention, there is provided an apparatus for magnifying an APP interface, in which the trigger gesture of a magnifying glass control is registered on the lowest layer of the view hierarchy, the apparatus comprising: a mapping unit, configured to determine, after a trigger event triggered by a screen operation is obtained, the mapping position in the original interface corresponding to the operation position of the screen operation; a setting unit, configured to set the state of the topmost view to a non-responsive event state; a passing unit, configured to pass the trigger event and the mapping position along the responder chain to the lowest layer of the view hierarchy; and a processing unit, configured to cause the magnifying glass control to process the trigger event according to the mapping position.
Preferably, the processing unit is further configured to: obtain the coordinates of the center point of the magnifying glass according to the mapping position; and render the view corresponding to the original interface into a newly created first view by enlarging and translating it according to the center-point coordinates.
Preferably, the passing unit is further configured to: pass the trigger event and the mapping position along the responder chain toward its root view until the lowest layer of the view hierarchy is reached.
Preferably, the apparatus further comprises a screenshot unit, configured to render the screenshot into a newly created second view, the second view being the topmost view, when the screen operation indicates that the user has triggered the magnifying glass.
Preferably, the mapping unit is further configured to: obtain the mapping relationship between the screenshot and the original interface; and determine the mapping position corresponding to the operation position in the original interface according to the mapping relationship and the operation position of the screen operation.
According to a third aspect of embodiments of the present invention, there is provided a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the method of magnifying an APP interface as described in the first aspect of embodiments above.
According to a fourth aspect of embodiments of the present invention, there is provided an electronic apparatus, including: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method of magnifying an APP interface as described in the first aspect of the embodiments above.
The technical solutions provided by the embodiments of the invention can have the following beneficial effects:
in the technical solutions provided by some embodiments of the invention, setting the state of the topmost view to the non-responsive event state causes the trigger event to be passed to the lowest layer of the view hierarchy and handled by the magnifying glass control, which reduces the cost of integrating the magnifying glass control into an application and improves the application's running efficiency.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
FIG. 1 schematically illustrates a flow chart of a method of magnifying an APP interface according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the event responder chain of the iOS system;
FIG. 3 schematically illustrates a flow chart of one embodiment of a method of magnifying an APP interface according to the present invention;
FIG. 4 schematically illustrates a block diagram of an apparatus for magnifying an APP interface according to an embodiment of the present invention;
FIG. 5 schematically illustrates a block diagram of one embodiment of an apparatus for magnifying an APP interface according to the present invention;
FIG. 6 illustrates a schematic structural diagram of a computer system suitable for implementing the electronic device of an embodiment of the invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations or operations have not been shown or described in detail to avoid obscuring aspects of the invention.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Fig. 1 schematically illustrates a method of magnifying an APP interface according to an exemplary embodiment of the present disclosure. In this exemplary embodiment, the trigger gesture of the magnifying glass control is registered on the lowest layer of the view hierarchy. Referring to Fig. 1, the method of magnifying an APP interface may include the following steps:
Step S102: after a trigger event triggered by a screen operation is obtained, determine the mapping position in the original interface corresponding to the operation position of the screen operation.
Step S104: set the state of the topmost view to a non-responsive event state.
Step S106: pass the trigger event and the mapping position along the responder chain to the lowest layer of the view hierarchy.
Step S108: the magnifying glass control processes the trigger event according to the mapping position.
The present disclosure is implemented on a mobile operating system, specifically the iOS system. Fig. 2 is a schematic diagram of the event responder chain of the iOS system.
As shown in Fig. 2, the arrows indicate the direction in which an event is passed along the responder chain. If the originating view 202 that receives the event cannot handle it and is not the top-level view, the event is passed to its parent view, view 204. If view 204 cannot handle the event after receiving it, the event continues to be passed upward, and this process repeats. If the top-level view 206 still cannot handle the event, it is passed to its view controller 208; if the view controller 208 cannot handle it either, it is passed to the window (user interface window 210); if the window cannot handle it, the event is passed to the root of the chain, the application 212; and if the application cannot handle it either, the event is discarded.
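The following minimal Swift sketch mirrors the default UIResponder behaviour described above; the PassThroughView name is purely illustrative and is not part of the patent's scheme.

```swift
import UIKit

// A view that does not handle touches itself and simply hands them to the next
// responder (its superview, then the view controller, the window, and finally
// the application), illustrating the chain shown in Fig. 2.
class PassThroughView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Forward the unhandled event up the responder chain.
        next?.touchesBegan(touches, with: event)
    }
}
```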
According to an exemplary embodiment of the present disclosure, the APP magnifier is registered on the Window layer, i.e. the lowest layer of the view hierarchy, so that under the iOS event responder chain mechanism an event that the upper-layer page does not respond to is passed down to the Window layer and triggers the APP magnifier. This scheme is easy to implement and does not intrude into the business code; compared with enlarging display content by adjusting font and picture sizes, it reduces the amount of code as well as developers' workload and development cost.
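A minimal sketch of such a registration is shown below, assuming a long-press gesture as the trigger; the MagnifierController name, the 0.5-second threshold and the print placeholder are illustrative assumptions rather than details fixed by the patent.

```swift
import UIKit

// Sketch: attach the magnifier's trigger gesture to the key window, i.e. the
// lowest layer of the view hierarchy. UIWindow is a UIView, so a gesture
// recognizer can be added to it directly.
final class MagnifierController: NSObject {
    static let shared = MagnifierController()

    func install(on window: UIWindow) {
        let trigger = UILongPressGestureRecognizer(target: self,
                                                   action: #selector(handleTrigger(_:)))
        trigger.minimumPressDuration = 0.5
        window.addGestureRecognizer(trigger)
    }

    @objc private func handleTrigger(_ gesture: UILongPressGestureRecognizer) {
        guard let window = gesture.view as? UIWindow else { return }
        let location = gesture.location(in: window)
        // The touch location feeds the mapping and rendering steps sketched later.
        print("Magnifier triggered at \(location)")
    }
}
```

The controller could be installed once, for example during app start-up, with `MagnifierController.shared.install(on: window)`.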
In addition, after the mapping position of the trigger event's operation position in the original interface is obtained, the method sets the state of the topmost view to the non-responsive event state, so that the responder chain treats the parent view of the topmost view as the first responder. Because that first responder cannot handle the trigger event, the event is passed down to the lowest layer of the view hierarchy for processing by the APP magnifier.
In the prior art, each control on the enlarged interface must be assigned a unique identifier and an action attribute, and when an event is delivered the corresponding control must be looked up by position before the corresponding operation can be performed. In the exemplary embodiment of the present disclosure, the trigger event is instead passed to the lowest layer of the view hierarchy and processed by the APP magnifier, so the APP executes more efficiently than in the prior art and supports more event types. The prior art supports a limited set of event types and has difficulty delivering events such as force presses and double taps, whereas the technical scheme of the present disclosure can support multiple event types, including force presses and double taps.
In an exemplary embodiment of the disclosure, when a user opens an APP and performs the trigger gesture on the APP interface, the APP captures the screen and displays the resulting screenshot to the user. Specifically, when the screen operation indicates that the user has triggered the magnifying glass, the screenshot is rendered into a newly created second view, which becomes the topmost view. The screenshot may be an enlarged or reduced version of the image captured from the screen. In step S102, the mapping position of the operation position in the original interface can then be obtained from the enlargement or reduction ratio and the original coordinates of the operation position.
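A sketch of this screenshot step follows, under the same assumptions as the earlier sketches; the function name and the default scale factor are illustrative.

```swift
import UIKit

// Capture the window's current contents and show the screenshot as a newly
// created topmost view. `scale` is the enlargement (or reduction) ratio of the
// screenshot relative to the original interface.
func presentScreenshotView(in window: UIWindow, scale: CGFloat = 1.0) -> UIImageView {
    let renderer = UIGraphicsImageRenderer(bounds: window.bounds)
    let screenshot = renderer.image { _ in
        window.drawHierarchy(in: window.bounds, afterScreenUpdates: false)
    }

    // Render the screenshot into a newly created view placed above everything else.
    let screenshotView = UIImageView(image: screenshot)
    screenshotView.frame = window.bounds.applying(CGAffineTransform(scaleX: scale, y: scale))
    window.addSubview(screenshotView)
    return screenshotView
}
```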
Specifically, in step S102 the mapping relationship between the screenshot and the original interface, which includes the enlargement or reduction ratio, is obtained first, and the mapping position corresponding to the operation position in the original interface is then determined from that mapping relationship and the operation position of the screen operation.
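A small sketch of that mapping is shown below; the single `scale` parameter stands in for the mapping relationship and is an assumption for illustration.

```swift
import CoreGraphics

// Convert a touch location on the (possibly enlarged or reduced) screenshot back
// to the corresponding position in the original interface by undoing the ratio.
func mapToOriginalInterface(_ touchPoint: CGPoint, scale: CGFloat) -> CGPoint {
    precondition(scale > 0, "the scale ratio must be positive")
    return CGPoint(x: touchPoint.x / scale, y: touchPoint.y / scale)
}
```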
Step S104 is then executed to set the state of the topmost view to the non-responsive event state. When the trigger event is received and dispatched to the first responder, the parent view of the topmost view is the first responder, because the topmost view has been set to the non-responsive event state. The first responder is the first view on the event responder chain, i.e. its starting view.
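In UIKit terms this step could look like the sketch below, which assumes the screenshot view is the last subview of the window; disabling user interaction is one plausible way to realise the non-responsive event state, not the only one.

```swift
import UIKit

// Mark the topmost (screenshot) view as non-responsive so that hit-testing skips
// it and the trigger event travels toward the window where the magnifier's
// gesture is registered.
func disableTopmostView(in window: UIWindow) {
    guard let topmost = window.subviews.last else { return }
    // A view whose user interaction is disabled is ignored by hit-testing, so its
    // parent view effectively becomes the first responder for the touch.
    topmost.isUserInteractionEnabled = false
}
```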
After the trigger event has been dispatched to the first responder, in step S106 the trigger event and the mapping position are passed, starting from the first responder, along the responder chain toward its root view until they reach the lowest layer of the view hierarchy. Because the first responder cannot handle the trigger event, the event is passed to its parent view; once it is confirmed that the parent view cannot handle it either, the trigger event and the mapping position are passed on in turn until they finally reach the lowest layer of the view hierarchy.
After the trigger event and the mapping position reach the lowest layer of the view hierarchy, i.e. the Window layer, the APP magnifier begins processing the trigger event. Specifically, in step S108 the coordinates of the magnifying glass's center point are first obtained from the mapping position, and the view corresponding to the original interface is then rendered into a newly created first view by enlarging and translating it according to the center-point coordinates.
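The sketch below illustrates this rendering step under stated assumptions: a 2x magnification, a 120-point circular lens, and a window snapshot standing in for the view corresponding to the original interface.

```swift
import UIKit

// Take the mapping position as the magnifier's centre point and render the
// original interface into a newly created lens view, enlarged and translated so
// that the centre point sits in the middle of the lens.
func showMagnifier(over window: UIWindow, center: CGPoint, magnification: CGFloat = 2.0) {
    let diameter: CGFloat = 120
    let lens = UIView(frame: CGRect(x: center.x - diameter / 2,
                                    y: center.y - diameter / 2,
                                    width: diameter, height: diameter))
    lens.clipsToBounds = true
    lens.layer.cornerRadius = diameter / 2
    lens.layer.borderWidth = 1

    // Snapshot the interface; a full implementation would snapshot the layer
    // beneath the screenshot view rather than the whole window.
    guard let snapshot = window.snapshotView(afterScreenUpdates: false) else { return }
    lens.addSubview(snapshot)

    // Enlarge the snapshot and translate it so `center` lands in the lens centre.
    snapshot.transform = CGAffineTransform(scaleX: magnification, y: magnification)
    snapshot.center = CGPoint(
        x: diameter / 2 + (window.bounds.midX - center.x) * magnification,
        y: diameter / 2 + (window.bounds.midY - center.y) * magnification)

    window.addSubview(lens)
}
```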
Referring to fig. 3, a flow diagram of one embodiment of a method of magnifying an APP interface according to the present disclosure is shown, comprising the steps of:
Step S302: determine the mapping position in the original interface corresponding to the operation position of the trigger event.
Step S304: set the state of the topmost view to a non-responsive event state.
Step S306: obtain the first responder of the responder chain.
Step S308: pass the trigger event and the mapping position from the first responder along the responder chain toward the lowest layer of the view hierarchy.
Step S310: the magnifying glass control processes the trigger event according to the mapping position.
According to the method for magnifying an APP interface described above, setting the state of the topmost view to the non-responsive event state causes the trigger event to be passed to the lowest layer of the view hierarchy and processed by the magnifying glass control, which reduces the cost of integrating the magnifying glass control into an application and improves the application's running efficiency.
Embodiments of the apparatus of the present invention, which can be used to perform the above method of magnifying an APP interface, are described below. In these exemplary embodiments, the trigger gesture of the magnifying glass control is registered on the lowest layer of the view hierarchy. Specifically, referring to Fig. 4, an apparatus 400 for magnifying an APP interface comprises:
a mapping unit 402, configured to determine, after a trigger event triggered by a screen operation is obtained, the mapping position in the original interface corresponding to the operation position of the screen operation;
a setting unit 404, configured to set the state of the topmost view to a non-responsive event state;
a passing unit 406, configured to pass the trigger event and the mapping position along the responder chain to the lowest layer of the view hierarchy; and
a processing unit 408, configured to cause the magnifying glass control to process the trigger event according to the mapping position.
According to an exemplary embodiment of the present disclosure, the APP magnifier is registered on the Window layer, i.e. the lowest layer of the view hierarchy, so that under the iOS event responder chain mechanism an event that the upper-layer page does not respond to is passed down to the Window layer and triggers the APP magnifier. This scheme is easy to implement and does not intrude into the business code; compared with enlarging display content by adjusting font and picture sizes, it reduces the amount of code as well as developers' workload and development cost.
In addition, after the mapping position of the trigger event's operation position in the original interface is obtained, the state of the topmost view is set to the non-responsive event state, and the trigger event is then passed to the lowest layer of the view hierarchy and processed by the APP magnifier, so the APP executes more efficiently than in the prior art and supports more event types.
Specifically, according to an exemplary embodiment of the present disclosure, the mapping unit 402 may obtain the mapping position corresponding to the operation position in the original interface from the enlargement or reduction ratio of the screenshot relative to the original interface and the original coordinates of the operation position of the screen operation. In detail, the mapping unit 402 first obtains the mapping relationship between the screenshot and the original interface, and then determines the mapping position corresponding to the operation position in the original interface from that mapping relationship and the operation position of the screen operation.
The setting unit 404 then sets the state of the topmost view to the non-responsive event state, so that when a trigger event is received and dispatched to a first responder, the parent view of the topmost view is that first responder.
The passing unit 406 passes the trigger event and the mapping position from the first responder along the responder chain toward its root view until the lowest layer of the view hierarchy is reached. Because the first responder cannot handle the trigger event, the event is passed to its parent view; once it is confirmed that the parent view cannot handle it either, the trigger event and the mapping position are passed on in turn until they finally reach the lowest layer of the view hierarchy.
After the trigger event and the mapping position reach the lowest layer of the view hierarchy, i.e. the Window layer, the processing unit 408 causes the magnifying glass control to process the trigger event according to the mapping position: the coordinates of the magnifying glass's center point are first obtained from the mapping position, and the view corresponding to the original interface is then rendered into a newly created first view by enlarging and translating it according to the center-point coordinates, thereby achieving the magnification function.
According to an exemplary embodiment of the present disclosure, referring to Fig. 5, an apparatus 500 for magnifying an APP interface includes, in addition to the mapping unit 402, the setting unit 404, the passing unit 406 and the processing unit 408 of the apparatus 400, a screenshot unit 502.
The screenshot unit 502 is configured to render the screenshot into a newly created second view, which becomes the topmost view, when the screen operation indicates that the user has triggered the magnifying glass. The screenshot may be an enlarged or reduced version of the image captured from the screen. The mapping unit 402 can then obtain the mapping position of the operation position in the original interface from the enlargement or reduction ratio and the original coordinates of the operation position.
The remaining functions of the apparatus 500 for magnifying an APP interface are the same as those of the apparatus 400 and are not repeated here.
For details not disclosed in the apparatus embodiments of the present invention, please refer to the above embodiments of the method of magnifying an APP interface.
In the apparatus for magnifying an APP interface provided by the embodiments of the invention, setting the state of the topmost view to the non-responsive event state causes the trigger event to be passed to the lowest layer of the view hierarchy and processed by the magnifying glass control, which reduces the cost of integrating the magnifying glass control into an application and improves the application's running efficiency.
Referring now to FIG. 6, a block diagram of a computer system 600 suitable for implementing the electronic device of an embodiment of the invention is shown. The computer system 600 shown in FIG. 6 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the invention.
As shown in FIG. 6, the computer system 600 includes a central processing unit (CPU) 601 that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage section 608 into a random access memory (RAM) 603. The RAM 603 also stores the various programs and data needed for system operation. The CPU 601, the ROM 602 and the RAM 603 are connected to each other via a bus 604, to which an input/output (I/O) interface 605 is also connected.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse and the like; an output section 607 including a display such as a cathode ray tube (CRT) or liquid crystal display (LCD) and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 610 as needed, so that a computer program read from it can be installed into the storage section 608 as needed.
In particular, according to an embodiment of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the invention include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The above-described functions defined in the system of the present application are executed when the computer program is executed by the Central Processing Unit (CPU) 601.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present invention may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the method of magnifying an APP interface described in the foregoing embodiments.
For example, the electronic device may implement the steps shown in Fig. 1: Step S102, after a trigger event triggered by a screen operation is obtained, determining the mapping position in the original interface corresponding to the operation position of the screen operation; Step S104, setting the state of the topmost view to a non-responsive event state; Step S106, passing the trigger event and the mapping position along the responder chain to the lowest layer of the view hierarchy; and Step S108, the magnifying glass control processing the trigger event according to the mapping position.
As another example, the electronic device may implement the steps shown in fig. 3.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the invention. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which can be a personal computer, a server, a touch terminal, or a network device, etc.) to execute the method according to the embodiment of the present invention.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.
Claims (10)
1. A method of magnifying an APP interface, wherein a trigger gesture of a magnifying glass control is registered on the lowest layer of the view hierarchy, the method comprising:
after a trigger event triggered by a screen operation is obtained, determining the mapping position in an original interface corresponding to the operation position of the screen operation;
setting the state of the topmost view to a non-responsive event state;
passing the trigger event and the mapping position along a responder chain to the lowest layer of the view hierarchy; and
processing, by the magnifying glass control, the trigger event according to the mapping position, which comprises:
obtaining the coordinates of the center point of the magnifying glass according to the mapping position; and
rendering the view corresponding to the original interface into a newly created first view by enlarging and translating it according to the center-point coordinates.
2. The method of claim 1, wherein passing the trigger event and the mapping position along the responder chain to the lowest layer of the view hierarchy comprises:
passing the trigger event and the mapping position along the responder chain toward its root view until the lowest layer of the view hierarchy is reached.
3. The method according to claim 1 or 2, wherein, before the mapping position corresponding to the operation position in the original interface is determined according to the operation position of the screen operation, the method further comprises:
when the screen operation indicates that the user has triggered the magnifying glass, rendering a screenshot into a newly created second view, the second view being the topmost view.
4. The method according to claim 3, wherein, before the mapping position corresponding to the operation position in the original interface is determined according to the operation position of the screen operation, the method further comprises:
obtaining the mapping relationship between the screenshot and the original interface; and
determining the mapping position corresponding to the operation position in the original interface according to the mapping relationship and the operation position of the screen operation.
5. An apparatus for magnifying an APP interface, wherein a trigger gesture of a magnifying glass control is registered on the lowest layer of the view hierarchy, the apparatus comprising:
a mapping unit, configured to determine, after a trigger event triggered by a screen operation is obtained, the mapping position in an original interface corresponding to the operation position of the screen operation;
a setting unit, configured to set the state of the topmost view to a non-responsive event state;
a passing unit, configured to pass the trigger event and the mapping position along a responder chain to the lowest layer of the view hierarchy; and
a processing unit, configured to cause the magnifying glass control to process the trigger event according to the mapping position, which comprises:
obtaining the coordinates of the center point of the magnifying glass according to the mapping position; and
rendering the view corresponding to the original interface into a newly created first view by enlarging and translating it according to the center-point coordinates.
6. The apparatus of claim 5, wherein the passing unit is further configured to:
pass the trigger event and the mapping position along the responder chain toward its root view until the lowest layer of the view hierarchy is reached.
7. The apparatus of claim 5 or 6, further comprising:
a screenshot unit, configured to render a screenshot into a newly created second view, the second view being the topmost view, when the screen operation indicates that the user has triggered the magnifying glass.
8. The apparatus of claim 7, wherein the mapping unit is further configured to:
obtain the mapping relationship between the screenshot and the original interface; and
determine the mapping position corresponding to the operation position in the original interface according to the mapping relationship and the operation position of the screen operation.
9. A computer-readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method of magnifying an APP interface according to any one of claims 1 to 4.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method of magnifying an APP interface as claimed in any one of claims 1 to 4.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201811014444.5A | 2018-08-31 | 2018-08-31 | Method, device, medium and electronic equipment for amplifying APP interface

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201811014444.5A | 2018-08-31 | 2018-08-31 | Method, device, medium and electronic equipment for amplifying APP interface

Publications (2)

Publication Number | Publication Date
---|---
CN110874172A | 2020-03-10
CN110874172B | 2022-09-30

Family

ID=69715849

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN201811014444.5A (CN110874172B, Active) | Method, device, medium and electronic equipment for amplifying APP interface | 2018-08-31 | 2018-08-31

Country Status (1)

Country | Link
---|---
CN | CN110874172B (en)
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113835697B (en) * | 2020-06-23 | 2024-10-25 | 北京字节跳动网络技术有限公司 | Event response method and device |
CN111831386B (en) * | 2020-07-30 | 2024-04-09 | 抖音视界有限公司 | Page content display method and device, electronic equipment and computer readable medium |
CN113094134B (en) * | 2021-04-06 | 2022-10-18 | 中科美络科技股份有限公司 | Display method and device of software interaction interface suitable for old people |
CN114546219B (en) * | 2022-01-28 | 2023-09-29 | 青岛海信移动通信技术有限公司 | Picture list processing method and related device |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101238430A (en) * | 2005-08-04 | 2008-08-06 | 微软公司 | Virtual magnifying glass with on-the-fly control functionalities |
CN106095466A (en) * | 2016-06-24 | 2016-11-09 | 北京市育学林教育技术有限公司 | Electronic teaching material clicks on amplification method and system thereof |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150019523A1 (en) * | 2013-07-15 | 2015-01-15 | Adam Lior | Event-based social networking system and method |
- 2018-08-31: Application CN201811014444.5A filed in China; patent CN110874172B granted and active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101238430A (en) * | 2005-08-04 | 2008-08-06 | 微软公司 | Virtual magnifying glass with on-the-fly control functionalities |
CN106095466A (en) * | 2016-06-24 | 2016-11-09 | 北京市育学林教育技术有限公司 | Electronic teaching material clicks on amplification method and system thereof |
Non-Patent Citations (1)
Title |
---|
"史上最详细的iOS之事件的传递和响应机制-原理篇";VV木公子;《https://www.jianshu.com/p/2e074db792ba》;20160228;正文1-12页 * |
Also Published As
Publication number | Publication date |
---|---|
CN110874172A (en) | 2020-03-10 |
Similar Documents
Publication | Title
---|---
CN110874172B (en) | Method, device, medium and electronic equipment for amplifying APP interface
CN109460233B (en) | Method, device, terminal equipment and medium for updating native interface display of page
CN110634049B (en) | Page display content processing method and device, electronic equipment and readable medium
WO2022063158A1 (en) | Local screen adaptation method and device
CN111459364B (en) | Icon updating method and device and electronic equipment
CN110688829A (en) | Table generation method, device, equipment and storage medium
CN111240786A (en) | Walkthrough method and device, electronic equipment and storage medium
CN111294395A (en) | Terminal page transmission method, device, medium and electronic equipment
CN111309617A (en) | Application program control method and device, storage medium and electronic equipment
CN111352957A (en) | Remote dictionary service optimization method and related equipment
CN113553123B (en) | Data processing method, device, electronic equipment and storage medium
CN111984888A (en) | Page rendering method and device, electronic equipment and computer readable medium
CN113961280B (en) | View display method and device, electronic equipment and computer readable storage medium
CN112395535A (en) | Image lazy loading method and device, medium and electronic equipment
CN113760438A (en) | Page display method and device for webpage application
CN112445394B (en) | Screenshot method and screenshot device
CN111258582B (en) | Window rendering method and device, computer equipment and storage medium
CN110427584A (en) | Page generation method, device, electronic equipment and computer readable storage medium
CN110618811B (en) | Information presentation method and device
CN116596748A (en) | Image stylization processing method, apparatus, device, storage medium, and program product
CN112818267A (en) | Data processing method and device, computer readable storage medium and electronic equipment
CN114222317B (en) | Data processing method and device, electronic equipment and storage medium
CN110619028A(en) | Map display method, device, terminal equipment and medium for house source detail page
CN111010449B (en) | Image information output method, system, device, medium, and electronic apparatus
CN116628366A (en) | Floor page processing method and device
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant