CN110806831A - Touch screen response method and electronic equipment - Google Patents
- Publication number
- CN110806831A (application CN201910944437.3A)
- Authority
- CN
- China
- Prior art keywords
- sliding
- cursor
- touch screen
- electronic device
- distance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
Embodiments of this application provide a touch screen response method and an electronic device. The method is applied to an electronic device having a touch screen and includes: receiving a first operation of a user's finger in a preset area on a first side edge of the touch screen, and, when the first operation is determined to meet a first response threshold, displaying a cursor at the end position of the first operation in response to the first operation; when a second operation of the user's finger on the cursor is detected and the finger leaves the touch screen at the end of the second operation, triggering, in response to the second operation, a click operation at the cursor position corresponding to where the finger left the touch screen, where the moving distance of the cursor is directly proportional to the moving distance of the finger in the second operation and the ratio is greater than 1; and responding to the click operation when it is detected. By first triggering display of the cursor and then operating the cursor, the user can move the cursor a large distance with a small finger movement, so any position on the screen can be operated with one hand.
Description
Technical Field
The invention relates to the field of terminal technologies, and in particular to a touch screen response method and an electronic device.
Background
As electronic devices are updated and upgraded, the touch screens of devices such as smartphones and tablet computers keep growing. Taking a mobile phone as an example, the vertical dimension of current phones keeps increasing: the ratio of vertical to horizontal dimension has gone from the earlier 16:9 to today's 18:9 and 19.5:9, and may reach 21:9 in the future. As a result, when a user holds the lower half of the phone with one hand, it is difficult to reach the upper area of the screen with the thumb. Yet important buttons (such as back and settings) in many application interfaces are placed at the top, forcing the user to operate with the other hand. This increases operating difficulty and interrupts the immersive experience.
To address the problem that, with an overly tall screen, a user holding the phone with one hand cannot operate the upper half of the screen, phone systems provide a one-handed mode: double-clicking the home key moves the display interface down to the lower half of the screen, bringing its buttons closer to the user's finger. However, this approach shrinks or partially hides the window, which makes reading difficult, and the interface must be restored after the operation, which hurts efficiency. In addition, because the interface content changes substantially, other window-related operations (such as multitasking) must also be adapted to the one-handed mode, which increases development workload and maintenance cost.
Disclosure of Invention
Embodiments of this application provide a touch screen response method and an electronic device, which allow the finger of the hand holding the phone to operate any position on the display screen without changing the size or position of the user interface.
In a first aspect, an embodiment of this application provides a touch screen response method applied to an electronic device having a touch screen. The method includes: the electronic device receives a first operation of a user's finger in a preset area on a first side edge of the touch screen, where the first side edge is the left or right side of the touch screen; when the electronic device determines that the first operation meets a first response threshold, it displays a cursor at the end position of the first operation in response to the first operation; when the electronic device detects a second operation of the user's finger on the cursor and the finger leaves the touch screen at the end of the second operation, it triggers, in response to the second operation, a click operation at the cursor position corresponding to where the finger left the touch screen, where the moving distance of the cursor is directly proportional to the moving distance of the finger in the second operation and the ratio is greater than 1; and when the electronic device detects the click operation, it responds to the click operation.
With this scheme, operating a finger in the preset area on the first side edge of the touch screen triggers display of the cursor, and the cursor then follows the finger's movement, with the cursor's moving distance directly proportional to the finger's moving distance in the second operation and a ratio greater than 1. In other words, a small finger movement produces a large cursor movement, so the thumb of the hand holding the phone can operate the touch screen with one hand and click any position on the screen, without changing the position or size of the user interface window.
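The core of the scheme is the amplified mapping from finger displacement to cursor displacement. The following Python sketch illustrates one way such a mapping could work; the gain value, screen size, and clamping behaviour are illustrative assumptions, not taken from the patent text.

```python
def move_cursor(cursor_pos, finger_delta, gain=3.0, screen=(1080, 2340)):
    """Move the cursor by gain times the finger displacement, clamped to the screen.

    cursor_pos and finger_delta are (x, y) tuples in pixels. gain > 1, so a
    small thumb movement covers the whole screen, as the method requires.
    """
    x = min(max(cursor_pos[0] + gain * finger_delta[0], 0), screen[0] - 1)
    y = min(max(cursor_pos[1] + gain * finger_delta[1], 0), screen[1] - 1)
    return (x, y)
```

With the assumed gain of 3.0, a 10-pixel finger movement moves the cursor 30 pixels, until the cursor reaches a screen edge.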
In one possible design, the first operation includes a first sliding operation from the preset area on the first side edge into the screen and a second sliding operation continuing upward or downward from the end position of the first sliding operation; the first response threshold is a first sliding distance for the first sliding operation and a second sliding distance, in the vertical direction, for the second sliding operation. That is, when the electronic device receives a first sliding operation from the preset area into the screen, it determines whether the corresponding sliding distance reaches the first sliding distance. If it does, then when a second sliding operation continuing upward or downward from the end of the first sliding operation is received, the device determines whether the vertical sliding distance of the second sliding operation exceeds the second sliding distance. If it does, the cursor is displayed at the end position of the first operation, that is, at the end position of the second sliding operation.
In one possible design, the first operation includes a first sliding operation from the preset area on the first side edge into the screen and a first pressing operation at the end position of the first sliding operation; the first response threshold is a first sliding distance for the first sliding operation and a first pressing duration for the first pressing operation. That is, when the electronic device receives a first sliding operation from the preset area into the screen, it determines whether the corresponding sliding distance reaches the first sliding distance. If it does, then when a first pressing operation at the end of the first sliding operation is received, the device determines whether the pressing duration exceeds the first pressing duration. If it does, the cursor is displayed at the end position of the first operation, that is, at the end position of the first sliding operation.
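As a sketch of the slide-then-press variant of this trigger condition, the following Python fragment checks both thresholds; the concrete threshold values are assumptions chosen for illustration only.

```python
FIRST_SLIDE_DISTANCE_PX = 60    # assumed value of the first sliding distance
FIRST_PRESS_DURATION_MS = 300   # assumed value of the first pressing duration

def should_show_cursor(slide_distance_px, press_duration_ms):
    """True when a slide from the edge hot zone followed by a press meets
    the first response threshold, so the cursor should be displayed."""
    return (slide_distance_px >= FIRST_SLIDE_DISTANCE_PX
            and press_duration_ms >= FIRST_PRESS_DURATION_MS)
```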
In one possible design, the method further includes: the electronic equipment responds to the first sliding operation and displays an arc-shaped area on the first side edge. In this design, the user can see the result of the first sliding operation by displaying the arc-shaped area, thereby intuitively letting the user know the size of the distance that has slid.
In one possible design, the method further includes: the electronic equipment determines that the sliding distance corresponding to the first sliding operation reaches the first sliding distance, displays a return identifier in the arc-shaped area of the first side edge, and displays the cursor after the return identifier. In this design, the user is prompted by displaying a return identifier to return to the previous function.
In one possible design, the method further includes: after determining that the sliding distance of the first sliding operation reaches the first sliding distance, if the electronic device determines that the vertical sliding distance of the second sliding operation does not meet the second sliding distance, or that the pressing duration of the first pressing operation does not meet the first pressing duration, the electronic device responds to the first operation by executing the function of returning to the previous level.
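Taken together, the designs above describe a three-way decision: ignore the gesture, display the cursor, or fall back to the ordinary back function. A minimal Python sketch of that decision (function and return-value names are hypothetical):

```python
def handle_first_operation(first_slide_ok, second_stage_ok):
    """Decide how to respond to the first operation.

    first_slide_ok: the first sliding distance threshold was reached.
    second_stage_ok: the second sliding distance (or the first pressing
    duration) threshold was also met.
    """
    if not first_slide_ok:
        return "ignore"  # the slide never left the edge hot zone far enough
    return "show_cursor" if second_stage_ok else "back"
```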
In one possible design, the first operation is a third sliding operation upward or downward within the preset area on the first side edge, and the first response threshold is a third sliding distance in the vertical direction. That is, when the electronic device receives a third sliding operation upward or downward within the preset area, it determines whether the vertical sliding distance reaches the third sliding distance; if it does, the cursor is displayed at the end position of the first operation, that is, at the end position of the third sliding operation.
In one possible design, the first operation is a second pressing operation, and the first response threshold is a second pressing duration and a first pressing area. That is, when the electronic device receives a second pressing operation in the preset area on the first side edge, it determines whether the pressing duration exceeds the second pressing duration and whether the pressing area exceeds the first pressing area; if both conditions hold, the cursor is displayed at the position of the second pressing operation.
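A sketch of this duration-and-area check; the patent does not fix concrete numbers, so the threshold values below are illustrative assumptions:

```python
def press_triggers_cursor(duration_ms, area_px2,
                          min_duration_ms=500, min_area_px2=150):
    """True when both the pressing duration and the pressing area exceed
    their thresholds, so the cursor is displayed at the press position."""
    return duration_ms > min_duration_ms and area_px2 > min_area_px2
```

Requiring a large pressing area as well as a long duration helps distinguish a deliberate thumb-pad press on the edge from an incidental grip contact.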
In one possible design, the second operation is a sliding operation.
In one possible design, the shape of the cursor is any one of: circle, arrow, I-shape, diamond, rectangle, vertical line.
In one possible design, the preset area is the entire side area of the first side edge, or part of it.
In a second aspect, an embodiment of this application provides a touch screen response method applied to an electronic device having a touch screen. The method includes: the electronic device detects a third operation of a user's finger on a floating button displayed on the touch screen; when the electronic device determines that the third operation meets a second response threshold, it displays a cursor at the end position of the third operation in response to the third operation; when the electronic device detects a fourth operation of the user's finger on the cursor and the finger leaves the touch screen at the end of the fourth operation, it triggers, in response to the fourth operation, a click operation at the cursor position corresponding to where the finger left the touch screen, where the moving distance of the cursor is directly proportional to the moving distance of the finger in the fourth operation and the ratio is greater than 1; and when the click operation is detected, the electronic device responds to it.
With this scheme, operating the floating button with a finger triggers display of the cursor, and the cursor then follows the finger's movement, with the cursor's moving distance directly proportional to the finger's moving distance in the fourth operation and a ratio greater than 1. A small finger movement thus produces a large cursor movement, so the thumb of the hand holding the phone can operate the touch screen with one hand and click any position on the screen, without changing the position or size of the user interface window.
In one possible design, the third operation is a fourth sliding operation to the left or right in the horizontal direction, or up or down in the vertical direction, and the second response threshold is a fourth sliding distance. That is, the electronic device detects a fourth sliding operation of the user's finger on the floating button displayed on the touch screen, and when it determines that the corresponding sliding distance is greater than the fourth sliding distance, it displays the cursor at the end position of the fourth sliding operation in response to that operation.
In one possible design, the third operation includes a third pressing operation and a fifth sliding operation, and the second response threshold is a third pressing duration for the third pressing operation and a fifth sliding distance for the fifth sliding operation. That is, the electronic device detects a third pressing operation of the user's finger on the floating button displayed on the touch screen; when it determines that the pressing duration exceeds the third pressing duration, detects a fifth sliding operation on the floating button, and determines that the corresponding sliding distance exceeds the fifth sliding distance, it displays the cursor at the end position of the fifth sliding operation in response to the fifth sliding operation.
In one possible design, the method further includes: when the pressing duration of the third pressing operation reaches the third pressing duration, the electronic device gives a vibration prompt. With this design, the user can tell by feel, without looking at the screen, that the pressing duration has reached the third pressing duration.
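The press-then-slide trigger on the floating button, including the vibration prompt, can be sketched as a small state machine. The threshold values and the vibrate callback below are illustrative assumptions:

```python
class HoverButtonTrigger:
    """Tracks a long press on the floating button followed by a slide."""

    def __init__(self, press_ms=400, slide_px=40, vibrate=lambda: None):
        self.press_ms = press_ms    # assumed third pressing duration
        self.slide_px = slide_px    # assumed fifth sliding distance
        self.vibrate = vibrate      # haptic callback, e.g. driving the motor
        self.long_pressed = False

    def on_press(self, duration_ms):
        # Vibrate once the pressing duration reaches the threshold, so the
        # user can feel the state change without looking at the screen.
        if duration_ms >= self.press_ms:
            if not self.long_pressed:
                self.vibrate()
            self.long_pressed = True
        return self.long_pressed

    def on_slide(self, distance_px):
        # The cursor appears only if the long press preceded a long-enough slide.
        return self.long_pressed and distance_px > self.slide_px
```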
In one possible design, the cursor is in the shape of any one of: arrow, I-shape, circle, diamond, rectangle, vertical line.
In one possible design, the fourth operation is a sliding operation.
In a third aspect, an embodiment of this application provides an electronic device including a processor, a memory, and a touch screen. The touch screen is configured to receive user operations; the memory is configured to store one or more computer programs that, when executed by the processor, cause the electronic device to perform: receiving a first operation of a user's finger in a preset area on a first side edge of the touch screen, where the first side edge is the left or right side of the touch screen; when the first operation is determined to meet a first response threshold, displaying a cursor at the end position of the first operation in response to the first operation; detecting a second operation of the user's finger on the cursor and, when the finger leaves the touch screen at the end of the second operation, triggering, in response to the second operation, a click operation at the cursor position corresponding to where the finger left the touch screen, where the moving distance of the cursor is directly proportional to the moving distance of the finger in the second operation and the ratio is greater than 1; and responding to the click operation when it is detected.
In a possible design, the first operation includes a first sliding operation from a preset area of the first side edge into the screen and a second sliding operation continuing from an end position of the first sliding operation upwards or downwards, and the first response threshold is a first sliding distance corresponding to the first sliding operation and a second sliding distance corresponding to the second sliding operation along the vertical direction.
In a possible design, the first operation includes a first sliding operation from a preset area of the first side edge into the screen and a first pressing operation at an end position of the first sliding operation, and the first response threshold is a first sliding distance corresponding to the first sliding operation and a first pressing duration corresponding to the first pressing operation.
In one possible design, the processor is further configured to: in response to the first sliding operation, an arc-shaped area is displayed on the first side.
In one possible design, the processor is further configured to: determine that the sliding distance of the first sliding operation reaches the first sliding distance, display a return identifier in the arc-shaped area on the first side edge, and display the cursor behind the return identifier.
In one possible design, the processor is further configured to: after determining that the sliding distance of the first sliding operation reaches the first sliding distance, if the vertical sliding distance of the second sliding operation does not meet the second sliding distance, or the pressing duration of the first pressing operation does not meet the first pressing duration, respond to the first operation by executing the function of returning to the previous level.
In one possible design, the first operation is a third sliding operation upward or downward in a preset area of the first side edge, and the first response threshold is a third sliding distance in the vertical direction.
In one possible design, the first operation is a second pressing operation, and the first response threshold is a second pressing duration and a first pressing area.
In one possible design, the second operation is a sliding operation.
In one possible design, the cursor is in the shape of any one of: circle, arrow, I-shape, diamond, rectangle, vertical line.
In one possible design, the preset area is the entire side area of the first side edge, or part of it.
In a fourth aspect, an embodiment of this application provides an electronic device including a processor, a memory, and a touch screen. The touch screen is configured to receive user operations; the memory is configured to store one or more computer programs that, when executed by the processor, cause the electronic device to perform: detecting a third operation of a user's finger on a floating button displayed on the touch screen; when the third operation is determined to meet a second response threshold, displaying a cursor at the end position of the third operation in response to the third operation; detecting a fourth operation of the user's finger on the cursor and, when the finger leaves the touch screen at the end of the fourth operation, triggering, in response to the fourth operation, a click operation at the cursor position corresponding to where the finger left the touch screen, where the moving distance of the cursor is directly proportional to the moving distance of the finger in the fourth operation and the ratio is greater than 1; and responding to the click operation when it is detected.
In one possible design, the third operation is a fourth sliding operation to the left or right in the horizontal direction, or up or down in the vertical direction, and the second response threshold is a fourth sliding distance.
In a possible design, the third operation includes a third pressing operation and a fifth sliding operation, and the second response threshold is a third pressing duration corresponding to the third pressing operation and a fifth sliding distance corresponding to the fifth sliding operation.
In one possible design, the processor is further configured to: and when the pressing duration of the third pressing operation reaches the third pressing duration, the electronic equipment performs vibration prompt.
In one possible design, the cursor may be in the shape of any of: arrow, I-shape, circle, diamond, rectangle, vertical line.
In one possible design, the fourth operation is a sliding operation.
In a seventh aspect, the present application also provides an apparatus including a module/unit for performing the method of any one of the possible designs of any one of the above aspects. These modules/units may be implemented by hardware, or by hardware executing corresponding software.
In an eighth aspect, an embodiment of this application further provides a computer-readable storage medium including a computer program which, when run on an electronic device, causes the electronic device to perform the method in any possible design of any of the above aspects.
In a ninth aspect, the present application further provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method in any possible design of any of the above aspects.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
Fig. 1a is a schematic diagram of a hardware structure of a mobile phone according to an embodiment of the present disclosure;
fig. 1b is a schematic diagram of a software structure of a mobile phone according to an embodiment of the present application;
fig. 2 is a schematic view of a single-handed holding mobile phone according to an embodiment of the present disclosure;
fig. 3a to 3c are schematic views of hot zones provided in the embodiments of the present application;
FIG. 4 is a schematic diagram of a set of interfaces provided by an embodiment of the present application;
FIG. 5 is a schematic view of another set of interfaces provided by embodiments of the present application;
FIG. 6 is a schematic view of another set of interfaces provided by embodiments of the present application;
FIG. 7 is a schematic view of another set of interfaces provided by embodiments of the present application;
FIG. 8 is a schematic view of another set of interfaces provided by embodiments of the present application;
FIG. 9 is a schematic view of another set of interfaces provided by embodiments of the present application;
fig. 10 is a schematic flowchart of a response method of a touch screen according to an embodiment of the present application;
fig. 11 is a schematic flowchart of a response method of a touch screen according to an embodiment of the present application;
fig. 12 is a schematic diagram of a hardware structure of a mobile phone according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Hereinafter, some terms in the embodiments of the present application are explained to facilitate understanding by those skilled in the art.
The cursor referred to in the embodiments of the present application is a symbol or a graphic displayed on a screen or other display device of an electronic device for human-computer interaction, and is configured to move in response to a finger operation, and the cursor may indicate a position point where a touch operation of a finger of a user on a touch screen occurs.
Gesture navigation is an interaction mode, with accompanying animation effects, for operating a mobile phone with sliding gestures. For example, the phone's Emotion UI (EMUI) operating system added a gesture navigation function in version 9.0. The side-slide gesture in gesture navigation works as follows: sliding inward from either side edge of the screen produces a semi-transparent arc-shaped graphic, and lifting the finger makes the application go back to the upper-level interface.
Floating navigation is an interaction mode, with accompanying animation effects, for operating a floating button resident on the screen. The floating button may be a semi-transparent dot overlaid on the interface, and its position can be changed manually, for example by pressing the floating button and dragging it to the left. EMUI added a floating navigation feature in version 8.0.
The main screen and the sub-screen of the electronic device will be described below.
In some embodiments, an electronic device has two display screens: the one on the front of the phone is called the primary screen, and the one on the back is called the secondary screen. When the electronic device is held with one hand, the holding hand can touch the secondary screen to operate on it.
In other embodiments, for an electronic device having a foldable touch screen, the foldable touch screen may be considered a complete screen when the foldable touch screen is in a fully unfolded state; when the foldable touch screen is in a folded state, a screen displayed on the front side of the mobile phone is called a main screen, and a screen positioned on the back side of the mobile phone is called an auxiliary screen.
The term "user interface" in the description and claims and drawings of the present application is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
As shown in fig. 2, the user interface 200 is a main interface of the electronic device. User interface 200 may include, among other things, a status bar 201, a concealable navigation bar 202, time and weather gadgets (widgets) 203, and icons for various applications, such as WeChat icons, gallery icons, short message icons, and the like. The status bar 201 includes an operator name (e.g., china mobile), a mobile network (e.g., 4G), time, remaining power, and the like. A back key icon, a home key icon, and a forward key icon may be included in the navigation bar 202. Further, it is understood that in some embodiments, a bluetooth icon, a Wi-Fi icon, an alarm icon, an icon of an external device, etc. may also be included in the status bar 201. It is further understood that, in other embodiments, the user interface 200 shown in fig. 2 may further include a Dock bar 204, and the Dock bar 204 may include common application icons, such as the phone icon, the setting icon, the browser icon, and the microblog icon shown in fig. 2. When the processor 110 detects a touch event of a finger (or a stylus, etc.) of a user with respect to an application icon, in response to the touch event, a user interface of an application corresponding to the application icon is opened and displayed on the display screen 194.
It should be noted that the term "and/or" merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects, unless otherwise specified. Also, in the description of the embodiments of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not intended to indicate or imply relative importance or order.
The following describes electronic devices, Graphical User Interfaces (GUIs) for such electronic devices, and embodiments for using such electronic devices. In some embodiments of the present application, the electronic device may be a mobile phone, a tablet computer, a notebook computer, or a wearable device with a wireless communication function (such as a smart watch or smart glasses). The electronic device comprises means capable of performing data processing functions (such as a processor, application processor, image processor, or other processor) and means capable of displaying a user interface (such as a display screen). Exemplary embodiments of the electronic device include, but are not limited to, devices running various operating systems. The electronic device may also be another portable device, such as a laptop computer with a touch-sensitive surface (e.g., a touch panel). It should also be understood that in some other embodiments of the present application, the electronic device 01 may not be a portable electronic device but a desktop computer having a touch-sensitive surface (e.g., a touch panel).
The structure of the electronic device is further explained with reference to the accompanying drawings.
Taking a mobile phone as an example of the electronic device, fig. 1a shows one hardware structure of a mobile phone 100 provided in the embodiment of the present application; other structural variations may also exist on the basis of fig. 1a. As shown in fig. 1a, the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown in FIG. 1a, or combine certain components, or split certain components, or a different arrangement of components. The components shown in FIG. 1a may be implemented in hardware, software, or a combination of software and hardware.
The components of the handset 100 shown in fig. 1a are described in detail below.
The processor 110 may include one or more processing units, for example, the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller may be a neural center and a command center of the cell phone 100, among others. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
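The benefit of keeping recently used data close to the processor can be illustrated with a small software cache. This is only an analogy using Python's `functools.lru_cache`, not the hardware mechanism described here; the `fetch` function and its return value are assumptions for the example.

```python
from functools import lru_cache

call_count = 0  # counts only actual (slow) memory accesses

@lru_cache(maxsize=16)
def fetch(address):
    """Simulate a slow memory access; cached results skip the slow path."""
    global call_count
    call_count += 1
    return address * 2       # stand-in for the data stored at `address`

fetch(7)   # miss: performs the slow access
fetch(7)   # hit: served from the cache, no repeated access
```

After the two calls above, the slow path has run only once, which mirrors how a processor cache reduces repeated accesses and waiting time.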
In some embodiments, processor 110 may include one or more interfaces. For example, the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the mobile phone 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a display screen serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the camera function of the handset 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the mobile phone 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the mobile phone 100, and may also be used to transmit data between the mobile phone 100 and peripheral devices. It can also be used to connect earphones and play audio through them. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a limitation on the structure of the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the handset 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The mobile phone 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the cell phone 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The mobile phone 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin tone of the image, and can optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the handset 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
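The frequency-point energy mentioned above can be illustrated with one term of a discrete Fourier transform. This is a textbook DFT sketch in plain Python, not the DSP's actual implementation; the function name is an assumption.

```python
import cmath

def bin_energy(samples, k):
    """Energy of frequency bin k of `samples`, via one direct DFT term.

    A pure tone falling exactly on bin k concentrates its energy there,
    which is what a frequency-point selection step looks for.
    """
    n = len(samples)
    coeff = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
    return abs(coeff) ** 2
```

For a cosine at bin 3 of a 16-sample window, the energy concentrates in bin 3 while other bins are essentially zero.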
Video codecs are used to compress or decompress digital video. Handset 100 may support one or more video codecs. Thus, the handset 100 can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, and can also learn continuously by itself. Applications such as intelligent recognition on the mobile phone 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the handset 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the cellular phone 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The cellular phone 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the cellular phone 100 receives a call or voice information, it is possible to receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The handset 100 may be provided with at least one microphone 170C. In other embodiments, the handset 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the mobile phone 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.
The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the handset 100 determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the mobile phone 100 detects the intensity of the touch operation via the pressure sensor 180A, and can also calculate the touched position based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different touch operation intensities may correspond to different operation instructions.
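The idea that touches at the same position with different intensities trigger different instructions can be sketched as a simple threshold policy. The threshold value and the instruction names below are illustrative assumptions, not values from the patent.

```python
# Illustrative mapping from detected touch intensity (normalized 0..1)
# to an operation instruction, for touches at the same screen position.
LIGHT_PRESS_MAX = 0.5   # assumed boundary between light and firm presses

def instruction_for_touch(intensity):
    """Light press and firm press map to different example instructions."""
    if intensity < LIGHT_PRESS_MAX:
        return "light-press instruction"
    return "firm-press instruction"
```

A real implementation would calibrate the threshold against the sensor's capacitance-to-intensity curve rather than use a fixed constant.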
The gyro sensor 180B may be used to determine the motion attitude of the cellular phone 100. In some embodiments, the angular velocity of the handpiece 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the handset 100 calculates altitude, aiding in positioning and navigation, from the barometric pressure measured by the barometric pressure sensor 180C.
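A common way to convert the measured barometric pressure into altitude is the international barometric formula for the standard atmosphere. This is a standard formula offered for illustration, not one stated in the patent.

```python
def pressure_to_altitude(pressure_hpa, sea_level_hpa=1013.25):
    """Estimate altitude in meters from barometric pressure in hPa,
    using the international barometric formula (standard atmosphere)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

At standard sea-level pressure the formula yields 0 m; at 900 hPa it yields roughly 1 km, which is the kind of estimate used to aid positioning and navigation.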
The magnetic sensor 180D includes a Hall sensor. The handset 100 can detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the handset 100 is a flip phone, the handset 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the cellular phone 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the handset 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
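Portrait/landscape switching from the sensed gravity vector can be sketched as follows. The axis convention (x along the short edge, y along the long edge) and the simple comparison rule are assumptions for the example.

```python
def screen_orientation(ax, ay):
    """Infer portrait vs. landscape from the gravity components (m/s^2)
    along the device's x (short edge) and y (long edge) axes."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

When the phone is upright, gravity lies mostly along the long edge (portrait); when held sideways, it lies along the short edge (landscape). A production implementation would add hysteresis to avoid flickering near 45 degrees.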
The distance sensor 180F is used to measure distance. The handset 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the cell phone 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The cellular phone 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the cell phone 100; when insufficient reflected light is detected, the cell phone 100 can determine that there is no object nearby. Using the proximity light sensor 180G, the mobile phone 100 can detect that it is being held by the user close to the ear for a call, and automatically turn off the screen to save power. The proximity light sensor 180G may also be used in a holster mode or pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The handset 100 may adaptively adjust the brightness of the display 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the mobile phone 100 is in a pocket to prevent accidental touches.
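Adaptive brightness can be sketched as a clamped mapping from sensed illuminance to a display brightness level. The logarithmic curve shape, the level range, and the constants are illustrative assumptions, not the device's actual policy.

```python
import math

def auto_brightness(lux, min_level=10, max_level=255):
    """Map sensed ambient illuminance (lux) to a display brightness level.
    Human brightness perception is roughly logarithmic, hence log10;
    the curve and limits here are illustrative assumptions."""
    if lux <= 0:
        return min_level
    level = min_level + 40.0 * math.log10(1.0 + lux)
    return min(max_level, max(min_level, round(level)))
```

The mapping is monotonic (brighter surroundings give a brighter screen) and clamped at both ends, which is the essential behavior the paragraph describes.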
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 100 can use the collected fingerprint characteristics to unlock with the fingerprint, access the application lock, take a photo with the fingerprint, answer an incoming call with the fingerprint, and the like. For example, the fingerprint sensor may be disposed on the front side of the cellular phone 100 (below the display 194), or the fingerprint sensor may be disposed on the rear side of the cellular phone 100 (below the rear camera). In addition, the fingerprint recognition function can also be realized by configuring a fingerprint sensor in the touch screen; that is, the fingerprint sensor may be integrated with the touch screen to realize the fingerprint recognition function of the mobile phone 100. In this case, the fingerprint sensor may be disposed in the touch screen, may be a part of the touch screen, or may be otherwise disposed in the touch screen. In addition, the fingerprint sensor can also be implemented as a full-panel fingerprint sensor, so that the touch screen can be regarded as a panel on which fingerprint collection can be performed at any position. In some embodiments, the fingerprint sensor may process the collected fingerprint (e.g., verify whether the fingerprint matches) and send the processing result to the processor 110, which then acts according to the result. In other embodiments, the fingerprint sensor may also send the collected fingerprint to the processor 110 for processing (e.g., fingerprint verification, etc.). The fingerprint sensor in embodiments of the present application may employ any type of sensing technology, including but not limited to optical, capacitive, piezoelectric, or ultrasonic sensing technologies.
The temperature sensor 180J is used to detect temperature. In some embodiments, the handset 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the mobile phone 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the cell phone 100 heats the battery 142 when the temperature is below another threshold to avoid an abnormal shutdown of the cell phone 100 due to low temperatures. In other embodiments, when the temperature is lower than a further threshold, the mobile phone 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
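The three temperature cases described above amount to a threshold-based policy. The sketch below names the three actions from the text; the threshold values themselves are illustrative assumptions, in degrees Celsius.

```python
# Illustrative thermal-policy thresholds (assumed values, degrees Celsius).
THROTTLE_ABOVE = 45        # hot: reduce processor performance
HEAT_BATTERY_BELOW = 0     # cold: heat the battery
BOOST_VOLTAGE_BELOW = -10  # very cold: boost battery output voltage

def thermal_action(temp_c):
    """Select the thermal action for a reported temperature."""
    if temp_c > THROTTLE_ABOVE:
        return "reduce processor performance"
    if temp_c < BOOST_VOLTAGE_BELOW:
        return "boost battery output voltage"
    if temp_c < HEAT_BATTERY_BELOW:
        return "heat battery"
    return "normal operation"
```

Checking the coldest threshold before the merely cold one ensures the most aggressive protection wins when both conditions hold.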
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100, different from the position of the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The cellular phone 100 may receive key inputs and generate key signal inputs related to user settings and function control of the cellular phone 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to and detached from the cellular phone 100 by being inserted into the SIM card interface 195 or being pulled out from it. The handset 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The mobile phone 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the handset 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the mobile phone 100 and cannot be separated from it.
Although not shown in fig. 1a, the mobile phone 100 may further include a bluetooth device, a positioning device, a flash, a micro-projection device, a Near Field Communication (NFC) device, and the like, which are not described herein.
The software system of the electronic device 100 may adopt a layered architecture, and in the embodiment of the present application, the Android system of the layered architecture is taken as an example to exemplarily illustrate the software structure of the electronic device 100.
Fig. 1b is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 1b, the application package may include applications such as phone, camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 1b, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication functions of the electronic device 100, such as management of call status (including connected, hung up, etc.).
The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, etc., to the application.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example notifications of download completion or message alerts. The notification manager may also present notifications in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, it may prompt text information in the status bar, sound an alert tone, vibrate the electronic device, or flash an indicator light.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the function libraries that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The following embodiments can be implemented in an electronic device (e.g., the mobile phone 100) having the above hardware structure, and the following embodiments will take the mobile phone 100 as an example, and will be described in detail with reference to the drawings.
Fig. 2 illustrates a schematic view of a single-handed holding of a handset.
When the user holds the mobile phone with one hand, as shown in fig. 2, for example with the left hand, the thumb of the left hand is on the left side of the display screen 194 of the mobile phone 100 and the other fingers are on the back or the right side of the phone. To support the weight of the phone and keep it balanced, users are accustomed to gripping the lower half of the phone, so the left thumb can operate the lower half of the display screen. Because the length of the thumb is limited, it can hardly reach the upper half of the phone and therefore can hardly operate the upper half of the display screen 194. If the phone is also wide, the left thumb can operate positions close to the left side of the display screen, while positions close to the right side are difficult to reach. A similar problem exists when the user holds the terminal with the right hand.
To address this, the method may be applied to an electronic device with a display screen, taking the mobile phone 100 as an example. A hot zone is arranged in a side area of the display screen of the mobile phone 100. The user may operate on the side hot zone to trigger the display of a cursor; the cursor moves along with the sliding of the user's finger on the display screen, and the ratio of the moving distance of the cursor to the sliding distance of the finger is greater than 1. When the finger slides to a certain position and leaves the display screen, the mobile phone 100 triggers a click operation at the position of the cursor and responds to that click. In this way, the thumb of the hand holding the phone can perform one-handed operation and click any position on the screen, without changing the position or size of the user-interface window.
The following describes in detail a response method of a touch screen provided in an embodiment of the present application, with reference to the accompanying drawings and taking an electronic device as a mobile phone as an example.
First, the side hot zone is described. As shown in fig. 3a and 3b, the vertical length of the display screen of the mobile phone 100 is L. The hot zone may be the entire side area of the display screen, or only part of it. Considering user habits, when the user holds the phone with one hand the thumb generally stays in the middle or lower area of the side of the display screen and can hardly reach the upper quarter. To facilitate one-handed operation, the hot zone may therefore be disposed in the middle or lower area of the phone; for example, as shown in fig. 3a, the hot zone may be the lower three-quarters of the sides of the display screen, such as the region 310 and the region 320. The hot zone may also exclude parts of the sides near both the top and the bottom; for example, as shown in fig. 3b, the hot zone may be the middle of the sides excluding the top quarter and the bottom tenth, namely the regions 330 and 340.
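As an illustration, the hot-zone membership test described above can be sketched as follows. The edge margin and the excluded fractions (top quarter and bottom tenth, following fig. 3b) are assumptions for the sketch; the patent does not specify the horizontal width of the hot zone.

```python
def in_side_hot_zone(x, y, width, height,
                     edge_margin=20, top_excluded=0.25, bottom_excluded=0.10):
    """Return True if touch point (x, y) lies in a side hot zone.

    Coordinates are in pixels with the origin at the top-left corner.
    edge_margin is an assumed hot-zone width; the vertical band follows
    the fig. 3b layout (top quarter and bottom tenth excluded).
    """
    on_edge = x <= edge_margin or x >= width - edge_margin
    in_band = top_excluded * height <= y <= (1 - bottom_excluded) * height
    return on_edge and in_band
```

For a 1080 x 2000 screen, a touch at (5, 1000) on the left edge falls inside the zone, while a touch at (5, 100), in the excluded top quarter, does not.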
The following describes the functions of the side hot zone and the response method of the mobile phone to the user operation, taking the user's thumb acting on the side hot zone as an example.
In one example, the user's thumb may perform a sliding operation from the side hot zone into the screen, and the mobile phone displays a semi-transparent arc-shaped area in response. As shown in a in fig. 4, the thumb acts on position A on the side and slides into the screen along the direction of the arrow 411. During the sliding, the semi-transparent arc-shaped area 421 shown in b in fig. 4 is displayed and gradually becomes larger as the sliding distance of the finger increases. When the horizontal sliding distance is greater than or equal to a preset distance (e.g., the thumb slides to position B shown in b in fig. 4), a return mark 422 is displayed.
In one example, as shown in a in fig. 5, after the return mark 422 appears when the thumb slides to position B, the thumb continues to slide, as shown by the arrow 433. As shown in b in fig. 5, sliding to position C triggers the display of the cursor 432, where the vertical distance between position C and position B is greater than a distance threshold. In this example, the cursor 432 is displayed at position C when the thumb slides upward in the direction of the arrow 433 by a vertical distance greater than the distance threshold. In some other embodiments, the finger may also slide downward, and when the vertical distance of the downward slide is greater than the distance threshold, the cursor 432 is likewise displayed at the corresponding position.
Then, the cursor can move with the hand; that is, when the thumb moves, the cursor moves in the same direction. The following takes the hot zone shown in fig. 3a as an example to describe the specific follow-the-hand movement rule in detail.
After the cursor display is triggered, the size of the area in which the finger can move is related to the position at which the cursor display was triggered. Take a display screen with a vertical length of 100 pixels and a horizontal width of 30 pixels as an example; the aspect ratio of the mobile phone is then 10:3.
In one example, the finger movement distance and the cursor movement distance may be in a linear relationship, and the ratio between the vertical and horizontal cursor gains may be consistent with the aspect ratio of the phone; that is, if the finger moves 1 pixel vertically the cursor moves 10 pixels vertically, and if the finger moves 1 pixel horizontally the cursor moves 3 pixels horizontally. The ratio may also be inconsistent with the aspect ratio; for example, if the finger moves 1 pixel vertically the cursor moves 4 pixels vertically, and if the finger moves 1 pixel horizontally the cursor moves 1.5 pixels horizontally. As shown in fig. 3c, when the cursor display is triggered at a position M on the left side of the display screen (50 pixels from the top, 50 pixels from the bottom, and 30 pixels from the right edge), the finger can move up 12.5 pixels vertically, down 12.5 pixels vertically, and move 30 pixels of width to the right; that is, the area in which the finger can move is a rectangular area with a vertical length of 25 and a horizontal width of 30 (such as the rectangular dashed box 350 in fig. 3c).
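The linear follow-the-hand rule above amounts to multiplying each finger displacement by a per-axis gain. A minimal sketch, with default gains of 10 (vertical) and 3 (horizontal) taken from the aspect-ratio example in the text:

```python
def cursor_delta(finger_dx, finger_dy, gain_x=3.0, gain_y=10.0):
    """Linear finger-to-cursor mapping.

    The default gains mirror the 10:3 aspect-ratio example; the
    alternative 4 / 1.5 gains from the text can be passed explicitly.
    """
    return finger_dx * gain_x, finger_dy * gain_y
```

With the alternative gains, a 12.5-pixel vertical finger movement moves the cursor 50 pixels, matching the movable-area example at position M.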
In other examples, the relationship between the finger movement distance and the cursor movement distance may also be non-linear, such as an exponential function or a power function. For example, the function curve can make the cursor move quickly at first and then more slowly as the moving distance grows, so that when the target function button is far from the finger, the cursor slows down as it approaches the button and the operation point can be selected more accurately.
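One way to realize this fast-then-slow behavior is a power function with an exponent below 1, so that the incremental gain shrinks as the finger travels farther. The constants here are purely illustrative; the patent names the function families but not their parameters.

```python
def cursor_travel(finger_travel, k=12.0, p=0.6):
    """Sub-linear power-law mapping from finger travel to cursor travel.

    With p < 1 the cursor covers distance quickly for small finger
    movements, then decelerates, easing precise selection of a far
    target. k and p are illustrative constants, not from the patent.
    """
    return k * finger_travel ** p
```

Because p < 1, the total cursor travel for a 4-pixel finger slide is less than four times the travel for a 1-pixel slide, i.e., the effective gain falls off with distance.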
Consider the WeChat discovery interface 410 shown in b in fig. 5, which includes a variety of function buttons, such as a friend circle function button, a sweep function button, a pan function button, a see-while-see function button, a search-and-search function button, a nearby people function button, a shopping function button, a game function button, an applet function button, and so on. Suppose the user wants to click the friend circle function button. The thumb can continue to slide from position C in the direction of the arrow 433; as shown in c in fig. 5, when the thumb slides from position C to position D, the cursor 432 slides from position C to position E, where the thumb leaves the display screen. The mobile phone implements a click function at position E in response to this operation, displaying the friend circle interface 420 shown in d in fig. 5. The direction of the line from position C to position E is the same as the direction of the line from position C to position D, but the distance between position C and position E is greater than the distance between position C and position D. Thus the user's thumb only needs to slide a small distance on the display screen to operate the area in the upper half of the screen, improving the user experience.
In another example, the thumb slides from the side hot zone into the screen until the return mark is displayed, then leaves the display screen; the mobile phone performs the return function in response to this operation, i.e., returns to the previous interface. As shown in a in fig. 6, on the friend circle interface 420 the thumb slides into the screen at position A of the side hot zone in the direction of the arrow 601, displaying the arc 621 shown in b in fig. 6, until the return mark 602 appears in the arc 621 when the thumb slides to position B. At this time the thumb leaves the display screen, and the previous interface of the friend circle interface 420 is displayed, such as the WeChat discovery interface 410 shown in c in fig. 6.
In the embodiment of the present application, the manner of triggering the cursor display may be as in the embodiments described above: when the thumb slides to position B and the display screen displays the semi-transparent arc area 421 shown as B in fig. 4, the thumb continues to slide for a certain distance to trigger the cursor display.
In one example, when the thumb slides to position B and the display screen displays the semi-transparent arc area 421 shown in b in fig. 4, the thumb comes to rest without leaving the display screen, i.e., it remains in contact with the screen. When the time the thumb rests on the display screen is greater than or equal to a duration threshold, the cursor display is triggered; if the thumb leaves the display screen before the resting time reaches the duration threshold, the mobile phone performs the return function instead, i.e., returns to the previous interface. In this way, by measuring how long the thumb rests on the display screen while the semi-transparent arc area 421 is displayed, the user's intent can be determined: whether to perform the return function or to trigger the cursor display.
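This dwell-time disambiguation can be sketched as a small decision function. The 500 ms threshold and the state names are assumptions for the sketch; the patent only specifies that some duration threshold exists.

```python
def classify_after_arc(rest_ms, lifted, dwell_threshold_ms=500):
    """Decide the action once the arc / return mark is showing.

    rest_ms: how long the thumb has been stationary while touching.
    lifted:  whether the thumb has left the screen.
    Resting past the threshold triggers the cursor; lifting earlier
    triggers the return function; otherwise keep waiting.
    """
    if rest_ms >= dwell_threshold_ms:
        return "show_cursor"
    return "back" if lifted else "pending"
```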
In the above embodiment, the user's thumb slides within the side hot zone of the display screen and the semi-transparent arc area 421 is shown; when the thumb leaves the display screen, the semi-transparent arc area 421 disappears, i.e., the display screen no longer shows it. It will be appreciated that the cursor in the above embodiments may likewise disappear when the thumb leaves the display screen. In a specific implementation, the cursor may disappear immediately after the thumb leaves the display screen; in this case, when the cursor is needed again, the thumb must slide again from the side hot zone into the screen to trigger the cursor display anew. In some other embodiments, the cursor may disappear only after the thumb has been off the display screen for a certain time (for example, 3 s); in this case, if the thumb has been off the display screen for only, say, 2 s, the cursor can still be operated by sliding on the display screen again.
In another example, sliding up or down within the side hot zone can trigger the cursor display when the sliding distance is greater than a certain distance threshold. As shown in a in fig. 7, the thumb acts on position A in the side hot zone and slides upward in the direction of the arrow 711; when the sliding distance is greater than or equal to the distance threshold (e.g., the thumb reaches position F shown in b in fig. 7), the cursor 721 is displayed. After appearing at position F, the cursor 721 moves following the thumb's slide, and the click function is triggered at the cursor's position when the thumb leaves the display screen. The thumb at position F may continue to slide in the direction of the arrow 731, as shown in c in fig. 7; when the thumb slides from position F to position G, the cursor 721 slides from position F to position E, where the friend circle function button is located. There the thumb leaves the display screen, and the mobile phone implements a click function at position E in response to this operation, displaying the friend circle interface 420 shown in d in fig. 5. The direction of the line from position F to position E is the same as that from position F to position G, but the distance from position F to position E is greater than that from position F to position G, so the user's thumb only needs to slide a small distance on the display screen to operate the area in the upper half of the screen, improving the user experience.
In yet another example, a thumb press within the side hot zone may trigger the cursor display. The capacitance signal generated by the thumb pressing on the display screen is in fact approximately elliptical; that is, the thumb actually touches not a point but an elliptical area. The cursor display can be triggered when the area of this elliptical region is larger than an area threshold and the pressing duration is greater than or equal to a preset duration. When a user holds the mobile phone with one hand, fingers often touch the side area unintentionally; by requiring that the contact area of the press in the side hot zone exceed the area threshold and that the pressing time reach the preset duration, false triggers of the cursor display caused by accidental touches on the side area can be filtered out.
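The accidental-touch filter is a conjunction of the two conditions. A minimal sketch, where the area threshold, the minimum press duration, and the elliptical-patch helper are illustrative values and names, not taken from the patent:

```python
import math

def ellipse_area(semi_major, semi_minor):
    """Area of the approximately elliptical contact patch, in px^2."""
    return math.pi * semi_major * semi_minor

def accept_edge_press(contact_area_px2, press_ms,
                      area_threshold=900.0, min_press_ms=300):
    """Trigger the cursor only for a deliberate press: the contact
    patch must exceed the area threshold AND the press must last at
    least the preset duration. Both thresholds are assumptions."""
    return contact_area_px2 > area_threshold and press_ms >= min_press_ms
```

A full thumb pad (semi-axes 20 x 15 px, area about 942 px^2) held for 350 ms passes the filter; a grazing grip contact (10 x 8 px, about 251 px^2) is rejected regardless of duration.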
In other embodiments, the function of triggering the cursor display may also be implemented by a floating button, and the floating button may be a semi-transparent dot or may be in other shapes, which is not limited in this embodiment of the application.
The function of the hover button is described in detail below.
In one example, the hover button can be moved by dragging it with the thumb, for example up or down in the vertical direction, or left or right in the horizontal direction; the hover button moves the same distance as the finger. In this way, the hover button can be dragged to an area without display content, so that it does not block the display content, improving the user experience.
When the time for which the thumb presses the hover button reaches a time threshold, the mobile phone vibrates to prompt the user that further operation is possible. The thumb then continues to press the hover button and slides upward or downward in the vertical direction, and when the vertical displacement is greater than or equal to a distance threshold, the cursor display is triggered. As shown in a in fig. 8, after the thumb presses the hover button 801, the thumb slides upward (as indicated by the arrow 802) from position H to position K; as shown in b in fig. 8, the cursor 803 is displayed at position H and moved to the position of the thumb (i.e., position K), while the hover button 801 remains at position H, where the vertical distance between position K and position H is greater than or equal to the distance threshold. After the cursor 803 is displayed, it moves along with the sliding of the finger; for the specific follow-the-hand rules, reference is made to the relevant contents of the foregoing embodiments, which are not repeated here.
After the mobile phone vibrates to indicate that the user can perform further operations, the thumb continues to press the hover button and slides left or right in the horizontal direction, which may trigger the multitask management function. As shown in c in fig. 8, the thumb slides rightward in the horizontal direction (as indicated by the arrow 804) from position H to position L, displaying the multitask management interface 810 shown in d in fig. 8, where the multitask management interface 810 includes the pages of multiple applications running in the background, such as an information page 811 and a WeChat page 812.
In another example, when the thumb drags the hover button up or down in the vertical direction, the cursor display may be triggered once the dragging distance is greater than or equal to a certain distance threshold; when the thumb drags the hover button left or right in the horizontal direction, the multitask management function may be triggered once the dragging distance is greater than or equal to a certain distance threshold. In this way the cursor display can be triggered by directly dragging the hover button; compared with triggering it by long-pressing the hover button followed by a sliding operation, directly dragging simplifies the operation and improves efficiency.
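The two drag directions of the hover button can be sketched as a simple classifier. The distance threshold and the dominant-axis tie-breaking are assumptions for the sketch, since the patent does not state how a diagonal drag is resolved:

```python
def classify_hover_drag(dx, dy, dist_threshold=40):
    """Classify a drag on the hover button.

    dx, dy: finger displacement in pixels (y grows downward).
    A mostly-vertical drag past the threshold triggers the cursor;
    a mostly-horizontal one triggers multitask management; anything
    shorter does nothing. Threshold and tie-break are illustrative.
    """
    if abs(dy) >= dist_threshold and abs(dy) >= abs(dx):
        return "cursor"
    if abs(dx) >= dist_threshold and abs(dx) > abs(dy):
        return "multitask"
    return "none"
```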
When the time for which the thumb presses the hover button reaches the time threshold and the mobile phone has vibrated to prompt the user that further operation is possible, the thumb may continue to press the hover button and slide upward or downward in the vertical direction, or leftward or rightward in the horizontal direction, thereby dragging the hover button.
In yet another example, considering that the hover navigation functionality is used less frequently and overlaps with parts of the gesture navigation functionality, the function of the hover button may be defined solely as triggering the cursor display. For example, dragging the hover button up or down in the vertical direction, or left or right in the horizontal direction, may trigger the cursor display. For another example, when the time for which the thumb presses the hover button reaches the time threshold, the mobile phone vibrates to prompt the user that further operation is possible, and the cursor display is triggered when the thumb's sliding distance on the hover button, vertically upward or downward, or horizontally leftward or rightward, is greater than or equal to the distance threshold. This simplifies the function of the hover button and reduces the learning cost.
In other embodiments, if the mobile phone has display screens on both the front and the back, or a foldable-screen phone is in the fully folded state, the user holding the phone with one hand can operate the front display screen with the thumb and the back display screen with the index or middle finger. When the user performs a preset operation with a finger on the back display screen, the cursor display can be triggered; the preset operation may include, but is not limited to, a long press, a slide, a double click, a large-area press, drawing a specific shape, and the like. To facilitate operation, the user's finger can operate on the back display screen while the cursor moves on the front display screen, and when the finger leaves the back display screen at a certain position, a click function is triggered at the corresponding position on the front display screen. It should be noted that when the finger moves on the back display screen, the cursor displayed on the front display screen moves along with it; for the specific follow-the-hand rules, reference is made to the relevant contents of the foregoing embodiments, which are not repeated here.
The following description will be made with reference to fig. 9 by taking an example in which both the front and back sides of the mobile phone have display screens.
The display screen 901 on the back of the mobile phone is shown in a in fig. 9, and the display screen 902 on the front is shown in b in fig. 9; the display screen 901 and the display screen 902 have a position mapping relationship. For example, the index finger performs a sliding operation on the display screen 901 from position M to position N, and in response the mobile phone displays the cursor 903 at the position on the display screen 902 corresponding to position N.
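One plausible form of this position mapping is a horizontal mirror, since the two panels face in opposite directions. The mirroring (and the shared vertical axis) is an assumption for the sketch; the patent states only that a mapping relationship exists.

```python
def back_to_front(x_back, y_back, width):
    """Map a touch point on the rear display to the front display.

    Assumption: the panels are back to back with equal resolution,
    so the horizontal axis is mirrored and the vertical axis shared.
    """
    return width - x_back, y_back
```

On a 1080-pixel-wide pair of screens, a rear touch at (100, 300) maps to (980, 300) on the front, so a finger moving rightward behind the phone drives the cursor leftward as seen by the user.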
Referring to fig. 10, a flow of a response method of a touch screen provided by an embodiment of the present application is exemplarily shown, where the method is executed by an electronic device.
In step 1001, an electronic device receives a first operation of a finger of a user in a preset area of a first side of a touch screen, where the first side is one of a left side and a right side of the touch screen.
For example, the preset area may be a hot zone in the above embodiments, such as the region 310, the region 320, the region 330, or the region 340.
In step 1002, when the electronic device determines that the first operation satisfies the first response threshold, a cursor is displayed at an end position of the first operation in response to the first operation.
For example, the first operation includes a first sliding operation from a preset area of the first side edge into the screen and a second sliding operation continuing upward or downward from an end position of the first sliding operation, and the first response threshold is a first sliding distance corresponding to the first sliding operation and a second sliding distance in the vertical direction corresponding to the second sliding operation.
For another example, the first operation includes a first sliding operation from a preset area of the first side edge into the screen and a first pressing operation at an end position of the first sliding operation, and the first response threshold is a first sliding distance corresponding to the first sliding operation and a first pressing duration corresponding to the first pressing operation.
For another example, the first operation is a third sliding operation upward or downward in a preset area of the first side, and the first response threshold is a third sliding distance in the vertical direction.
For another example, the first operation is a second pressing operation, and the first response threshold is a second pressing duration and a first pressing area.
In step 1003, the electronic device receives a second operation of the user's finger on the touch screen, and the cursor moves along with the movement of the finger. For example, the second operation is a sliding operation, such as the sliding operation in the direction of the arrow 433 shown in b in fig. 5, or the sliding operation in the direction of the arrow 731 shown in c in fig. 7.
In step 1004, when the electronic device detects the click operation, it responds to the click operation.
Based on this method, the user's finger operates in the preset area of the first side of the touch screen and can trigger the cursor display; the cursor is then operated and moves along with the movement of the finger. The moving distance of the cursor is in direct proportion to the moving distance of the user's finger in the second operation, with a ratio greater than 1; that is, a small finger movement produces a larger cursor movement. Thus the thumb of the hand holding the phone can operate the touch screen with one hand, without changing the position or size of the user-interface window, and any position on the screen can be clicked.
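The flow of steps 1001 to 1004 can be condensed into a minimal state sketch. The gain value, the class name, and the amplified-move rule attributed to step 1003 are assumptions based on the summary above, not an implementation from the patent:

```python
class OneHandCursor:
    """Minimal state sketch of the fig. 10 flow."""

    def __init__(self, gain=4.0):
        self.gain = gain      # cursor-to-finger distance ratio, > 1
        self.cursor = None    # (x, y) once the cursor is displayed

    def trigger(self, x, y):
        # Steps 1001-1002: the first operation met the response
        # threshold; display the cursor at its end position.
        self.cursor = (x, y)

    def move(self, dx, dy):
        # Step 1003 (assumed): the cursor follows the finger's
        # displacement, amplified by the gain.
        if self.cursor is not None:
            cx, cy = self.cursor
            self.cursor = (cx + dx * self.gain, cy + dy * self.gain)

    def lift(self):
        # Step 1004: the finger leaves the screen; a click is
        # generated at the cursor's position and the cursor vanishes.
        pos, self.cursor = self.cursor, None
        return ("click", pos)
```

For instance, after triggering at (10, 50), a finger slide of (5, -10) moves the cursor by (20, -40) to (30, 10), and lifting the finger clicks there.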
In a possible design, the first operation includes a first sliding operation from the preset area of the first side edge into the screen and a second sliding operation continuing upward or downward from the end position of the first sliding operation, as described with reference to fig. 4 and fig. 5. When the electronic device receives the first sliding operation from the preset area of the first side edge into the screen (the sliding operation along arrow 411 shown in a in fig. 4), it determines whether the sliding distance corresponding to that operation reaches the first sliding distance. If it does, the electronic device detects whether a second sliding operation continuing upward or downward from the end position of the first sliding operation is received (the sliding operation along arrow 423 shown in a in fig. 5). If such a second sliding operation is received, the electronic device determines whether its sliding distance in the vertical direction is greater than the second sliding distance; if so, a cursor (e.g., cursor 432 shown in b in fig. 5) is displayed at the end position of the second sliding operation (e.g., position C shown in b in fig. 5).
In another possible design, the first operation includes a first sliding operation from the preset area of the first side edge into the screen and a first pressing operation at the end position of the first sliding operation. When the electronic device receives the first sliding operation from the preset area of the first side edge into the screen (the sliding operation along arrow 411 shown in a in fig. 4), it determines whether the sliding distance corresponding to that operation reaches the first sliding distance. If it does, and a first pressing operation is received at the end position of the first sliding operation (position B shown in b in fig. 4), the electronic device determines whether the pressing duration corresponding to the first pressing operation is longer than the first pressing duration; if so, the cursor is displayed at the end position of the first operation, that is, at the end position of the first sliding operation.
Further, as explained in conjunction with fig. 4, the electronic device displays an arc-shaped area (e.g., arc-shaped area 421 shown in b in fig. 4) on the first side edge in response to the first sliding operation (e.g., the sliding operation along arrow 411 shown in a in fig. 4). The user can see the effect of the first sliding operation through the displayed arc-shaped area and thus intuitively gauge the sliding distance.
Further, when the electronic device determines that the sliding distance corresponding to the first sliding operation (the sliding operation along arrow 411 shown in a in fig. 4) reaches the first sliding distance, it displays a return identifier (return identifier 422 shown in b in fig. 4) in the arc-shaped area of the first side edge, and displays the cursor after the return identifier. In this design, the displayed return identifier prompts the user that the return-to-previous function is available.
Further, after determining that the sliding distance corresponding to the first sliding operation reaches the first sliding distance, if the electronic device determines that the sliding distance of the second sliding operation in the vertical direction does not satisfy the second sliding distance, or that the pressing duration corresponding to the first pressing operation does not satisfy the first pressing duration, it performs the return-to-previous function in response to the first sliding operation. For example, as shown in b in fig. 6, after the user's finger slides from position A to position B in the first sliding operation and then continues with the second sliding operation, if the finger leaves the touch screen before the sliding distance in the vertical direction satisfies the second sliding distance, the electronic device performs the return-to-previous function and returns from the interface 420 shown in b in fig. 6 to the previous page, namely the interface 410 shown in c in fig. 6. Likewise, if after the first sliding operation from position A to position B the finger leaves the touch screen before the pressing duration at position B satisfies the first pressing duration, the electronic device performs the return-to-previous function.
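The branching between showing the cursor and falling back to the ordinary back gesture can be sketched as follows; all threshold values (d1, d2, t1) are assumptions, since the patent does not specify concrete numbers:

```python
def edge_gesture_result(slide_in, vertical_slide, press_duration,
                        d1=30.0, d2=20.0, t1=0.3):
    """Classify an edge gesture once the finger leaves the screen.

    d1 = first sliding distance, d2 = second sliding distance,
    t1 = first pressing duration (all assumed values).
    """
    if slide_in < d1:
        return "ignore"        # first sliding distance never reached
    if vertical_slide > d2 or press_duration > t1:
        return "show_cursor"   # first operation fully satisfied
    return "back"              # fall back to return-to-previous
```

The key design point is that the same edge swipe stays usable as a back gesture: only when the continuation (vertical slide or press) also crosses its threshold does the cursor appear.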
In a possible design, when the electronic device receives a third sliding operation (a sliding operation in the direction of the arrow 711 shown in a in fig. 7) that is performed upward or downward on the preset area on the first side, it may be determined whether the sliding distance of the third sliding operation in the vertical direction reaches a third sliding distance, and if the third sliding distance is reached, a cursor (a cursor 721 shown in b in fig. 7) is displayed at the end position of the first operation, that is, the cursor is displayed at the end position of the third sliding operation.
In a possible design, when the electronic device receives a second pressing operation on the preset area of the first side edge, it may determine whether the pressing duration corresponding to the second pressing operation is greater than the second pressing duration and the pressing area corresponding to the second pressing operation is greater than the first pressing area. If both determinations are yes, a cursor is displayed at the end position of the first operation, that is, at the position of the second pressing operation.
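A sketch of the duration-plus-area check; the threshold values t2 and a1 are assumptions:

```python
def press_shows_cursor(duration_s, contact_area, t2=0.5, a1=80.0):
    """The cursor is shown only when BOTH the pressing duration exceeds
    the second pressing duration (t2) and the contact area exceeds the
    first pressing area (a1); both thresholds are assumed values.

    Requiring both conditions helps distinguish a deliberate firm
    press from an accidental grip contact on the edge of the screen.
    """
    return duration_s > t2 and contact_area > a1
```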
In one possible design, the shape of the cursor is any one of, but not limited to: circle, arrow, I-shape, diamond, rectangle, vertical line.
In one possible design, the preset area is the entire side area of the first side edge, or a partial side area of the first side edge.
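A hit-test for the preset area might look like the following; the strip width and the vertical band are assumptions used to model "the entire side area" versus "a partial side area":

```python
def in_preset_area(x, y, screen_w=1080, screen_h=2340,
                   side="right", strip_w=20, band=(0.0, 1.0)):
    """Return True when (x, y) lies in the preset edge area.

    `strip_w` is the assumed pixel width of the edge strip; `band`
    selects the fraction of the edge height that is active:
    (0.0, 1.0) models the entire side area, (0.3, 0.7) a middle part.
    """
    in_strip = x >= screen_w - strip_w if side == "right" else x < strip_w
    return in_strip and band[0] * screen_h <= y <= band[1] * screen_h
```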
Referring to fig. 11, a flow of another touch screen response method provided in an embodiment of the present application is exemplarily shown, and the method is performed by an electronic device.
In step 1101, the electronic device detects a third operation of the user's finger on the hover button displayed on the touch screen.
As an example, the hover button may be the hover button 801 in the above example.
In step 1102, when the electronic device determines that the third operation satisfies the second response threshold, the electronic device displays a cursor at an end position of the third operation in response to the third operation.
For example, the third operation is a fourth sliding operation to the left or right in the horizontal direction, or upward or downward in the vertical direction, and the second response threshold is a fourth sliding distance.
For another example, the third operation includes a third pressing operation and a fifth sliding operation, and the second response threshold is a third pressing duration corresponding to the third pressing operation and a fifth sliding distance corresponding to the fifth sliding operation.
In step 1103, the electronic device detects a fourth operation of the user's finger on the cursor. When the user's finger leaves the touch screen at the end of the fourth operation, in response to the fourth operation a click operation is triggered at the cursor position corresponding to the position where the finger left the touch screen. The moving distance of the cursor is directly proportional to the moving distance of the user's finger in the fourth operation, and the ratio is greater than 1.
For example, the fourth operation is a slide operation.
In step 1104, when the electronic device detects the click operation, it responds to the click operation.
With this method, the user operates the hover button with a finger to trigger display of the cursor, and then operates the cursor, which moves with the finger. The moving distance of the cursor is directly proportional to the moving distance of the user's finger in the fourth operation, and the ratio is greater than 1; that is, a small finger movement produces a large cursor movement. The user can therefore operate the touch screen one-handed with the thumb of the hand holding the phone and click any position on the screen, without changing the position or size of the user interface window.
In one possible design, the electronic device detects a fourth sliding operation of the user's finger on the hover button displayed on the touch screen, to the left or right in the horizontal direction or upward or downward in the vertical direction. When the electronic device determines that the sliding distance corresponding to the fourth sliding operation is greater than the fourth sliding distance, it displays the cursor at the end position of the fourth sliding operation in response to that operation. Illustratively, the fourth sliding operation is a sliding operation in the direction of arrow 802 shown in b in fig. 8, sliding from position H to position K; if the distance between position K and position H is greater than the fourth sliding distance, the cursor 802 is displayed at position K.
In one possible design, the electronic device detects a third pressing operation of a finger of the user on a hover button displayed on the touch screen, detects a fifth sliding operation on the hover button when the electronic device determines that a pressing duration corresponding to the third pressing operation is longer than the third pressing duration, and displays a cursor at an end position of the fifth sliding operation in response to the fifth sliding operation when it is determined that a sliding distance corresponding to the fifth sliding operation is longer than the fifth sliding distance.
Further, when the pressing duration of the third pressing operation reaches the third pressing duration, the electronic device performs vibration prompting. Thus, the user can determine that the pressing time period of the third pressing operation reaches the third pressing time period by sensing the vibration without looking at the screen.
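The press-then-slide hover-button flow, including the one-shot vibration prompt, can be sketched as a small tracker; t3 and d5 are assumed threshold values:

```python
class HoverButtonTracker:
    """Sketch of the hover-button flow: a third pressing operation
    followed by a fifth sliding operation (thresholds assumed)."""

    def __init__(self, t3=0.5, d5=15.0):
        self.t3 = t3          # third pressing duration (assumed)
        self.d5 = d5          # fifth sliding distance (assumed)
        self.armed = False    # has the press threshold been reached?

    def on_press(self, duration_s):
        """Emit a one-shot 'vibrate' when the press reaches t3, so the
        user can feel the threshold without looking at the screen."""
        if duration_s >= self.t3 and not self.armed:
            self.armed = True
            return "vibrate"
        return None

    def on_slide(self, distance):
        """After the press threshold, a slide longer than d5 shows the
        cursor at the slide's end position."""
        if self.armed and distance > self.d5:
            return "show_cursor"
        return None
```

The one-shot flag mirrors the design intent: the vibration marks the moment the pressing duration is satisfied, and only then does the slide threshold become relevant.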
In one possible design, the shape of the cursor includes, but is not limited to, any of: arrow, I-shape, circle, diamond, rectangle, vertical line.
In the embodiments provided in the present application, the method is described from the perspective of the electronic device as the execution body. To implement the functions in the method provided by the embodiments of the present application, the electronic device may include a hardware structure and/or a software module, implementing the functions as a hardware structure, a software module, or a combination of the two. Whether a given function is implemented as a hardware structure, a software module, or both depends on the particular application and the design constraints of the technical solution.
When implemented in hardware, the hardware implementation of the electronic device may refer to fig. 12 and its associated description.
Referring to fig. 12, the electronic device 100 includes: a touch screen 1201, where the touch screen 1201 includes a touch panel 1207 and a display screen 1208; one or more processors 1202; a memory 1203; one or more application programs (not shown); one or more computer programs 1204; and sensors 1205. The above devices may be connected by one or more communication buses 1206. The one or more computer programs 1204 are stored in the memory 1203 and configured to be executed by the one or more processors 1202, and include instructions that can be used to perform the method of any of the embodiments described above.
An embodiment of the present application further provides a computer storage medium, where computer instructions are stored; when the computer instructions run on an electronic device, the electronic device is caused to execute the above related method steps to implement the response method of the touch screen in the foregoing embodiments.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the response method of the touch screen in the above embodiment.
In addition, an embodiment of the present application further provides an apparatus, which may specifically be a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is used to store computer-executable instructions, and when the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip executes the response method of the touch screen in the above method embodiments.
In addition, the electronic device, the computer storage medium, the computer program product, or the chip provided in the embodiments of the present application are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and brevity of description, the division into the above functional modules is only used as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative: the division into modules or units is only a logical function division, and there may be other divisions in actual implementation; multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (31)
1. A response method of a touch screen is applied to an electronic device with the touch screen, and is characterized by comprising the following steps:
receiving a first operation of a finger of a user in a preset area of a first side edge of the touch screen, wherein the first side edge is one of the left side and the right side of the touch screen;
when the first operation is determined to meet a first response threshold value, displaying a cursor at the end position of the first operation in response to the first operation;
detecting a second operation of the user finger on the cursor, and triggering a click operation at the cursor position corresponding to the position where the user finger leaves the touch screen in response to the second operation when the user finger leaves the touch screen at the end of the second operation, wherein the moving distance of the cursor is in direct proportion to the moving distance of the user finger corresponding to the second operation, and the ratio is greater than 1;
and responding to the click operation when the click operation is detected.
2. The method according to claim 1, wherein the first operation includes a first sliding operation from a preset area of the first side edge into a screen and a second sliding operation continuing from an end position of the first sliding operation upward or downward, and the first response threshold is a first sliding distance corresponding to the first sliding operation and a second sliding distance in a vertical direction corresponding to the second sliding operation.
3. The method according to claim 1, wherein the first operation includes a first sliding operation from a preset area of the first side edge into a screen and a first pressing operation at an end position of the first sliding operation, and the first response threshold is a first sliding distance corresponding to the first sliding operation and a first pressing duration corresponding to the first pressing operation.
4. The method of claim 2 or 3, wherein the method further comprises:
in response to the first sliding operation, an arc-shaped area is displayed on the first side.
5. The method of claim 2 or 3, wherein the method further comprises:
and determining that the sliding distance corresponding to the first sliding operation reaches the first sliding distance, displaying a return identifier in the arc-shaped area of the first side edge, and displaying the cursor after the return identifier.
6. The method of claim 2 or 3, wherein the method further comprises:
after determining that the sliding distance corresponding to the first sliding operation reaches the first sliding distance, if it is determined that the sliding distance of the second sliding operation in the vertical direction does not satisfy the second sliding distance, or that the pressing duration corresponding to the first pressing operation does not satisfy the first pressing duration, performing a function of returning to the previous level in response to the first operation.
7. The method of claim 1, wherein the first operation is a third sliding operation upward or downward in a preset area of the first side edge, and the first response threshold is a third sliding distance in a vertical direction.
8. The method of claim 1, wherein the first operation is a second press operation and the first response threshold is a second press duration and a first press area.
9. The method of any one of claims 1-8, wherein the predetermined area is the entire side area of the first side edge or a portion of the side area of the first side edge.
10. A response method of a touch screen is applied to an electronic device with the touch screen, and is characterized by comprising the following steps:
detecting a third operation of a finger of a user on a floating button displayed on the touch screen;
when the third operation is determined to meet a second response threshold value, displaying a cursor at the end point position of the third operation in response to the third operation;
detecting a fourth operation of the user finger on the cursor, and triggering a click operation at the cursor position corresponding to the position where the user finger leaves the touch screen in response to the fourth operation when the user finger leaves the touch screen at the end of the fourth operation, wherein the moving distance of the cursor is in direct proportion to the moving distance of the user finger corresponding to the fourth operation, and the ratio is greater than 1;
and responding to the click operation when the click operation is detected.
11. The method of claim 10, wherein the third operation is a fourth sliding operation in a horizontal direction to the left or the right, or in a vertical direction upward or downward, and the second response threshold is a fourth sliding distance.
12. The method of claim 10, wherein the third operation comprises a third press operation and a fifth swipe operation, and the second response threshold is a third press duration corresponding to the third press operation and a fifth swipe distance corresponding to the fifth swipe operation.
13. The method of claim 12, wherein the method further comprises:
and when the pressing duration of the third pressing operation reaches the third pressing duration, the electronic equipment performs vibration prompt.
14. An electronic device comprising a processor, a memory, a touch screen;
the touch screen is used for receiving the operation of a user;
the memory for storing one or more computer programs that, when executed by the processor, cause the electronic device to perform:
receiving a first operation of a finger of a user in a preset area of a first side edge of the touch screen, wherein the first side edge is one of the left side and the right side of the touch screen;
when the first operation is determined to meet a first response threshold value, displaying a cursor at the end position of the first operation in response to the first operation;
detecting a second operation of the user finger on the cursor, and triggering a click operation at the cursor position corresponding to the position where the user finger leaves the touch screen in response to the second operation when the user finger leaves the touch screen at the end of the second operation, wherein the moving distance of the cursor is in direct proportion to the moving distance of the user finger corresponding to the second operation, and the ratio is greater than 1;
and responding to the click operation when the click operation is detected.
15. The electronic device according to claim 14, wherein the first operation includes a first sliding operation from a preset area on the first side into the screen and a second sliding operation continuing upward or downward from an end position of the first sliding operation, and the first response threshold is a first sliding distance corresponding to the first sliding operation and a second sliding distance in a vertical direction corresponding to the second sliding operation.
16. The electronic device according to claim 14, wherein the first operation includes a first sliding operation from a preset area of the first side edge into the screen and a first pressing operation at an end position of the first sliding operation, and the first response threshold is a first sliding distance corresponding to the first sliding operation and a first pressing duration corresponding to the first pressing operation.
17. The electronic device of claim 15 or 16, wherein the processor is further configured to:
in response to the first sliding operation, an arc-shaped area is displayed on the first side.
18. The electronic device of claim 15 or 16, wherein the processor is further configured to:
and determining that the sliding distance corresponding to the first sliding operation reaches the first sliding distance, displaying a return identifier in the arc-shaped area of the first side edge, and displaying the cursor after the return identifier.
19. The electronic device of claim 15 or 16, wherein the processor is further configured to:
after determining that the sliding distance corresponding to the first sliding operation reaches the first sliding distance, if it is determined that the sliding distance of the second sliding operation in the vertical direction does not satisfy the second sliding distance, or that the pressing duration corresponding to the first pressing operation does not satisfy the first pressing duration, performing a function of returning to the previous level in response to the first operation.
20. The electronic device of claim 14, wherein the first operation is a third sliding operation upward or downward in a preset area of the first side, and the first response threshold is a third sliding distance in a vertical direction.
21. The electronic device of claim 14, wherein the first operation is a second press operation, and the first response threshold is a second press duration and a first press area.
22. The electronic device of any of claims 14-21, wherein the second operation is a sliding operation.
23. The electronic device of any one of claims 14-22, wherein the cursor is in the shape of any one of: circle, arrow, I-shape, diamond, rectangle, vertical line.
24. The electronic device according to any one of claims 14-23, wherein the predetermined area is an entire side area of the first side edge or a partial side area of the first side edge.
25. An electronic device comprising a processor, a memory, a touch screen;
the touch screen is used for receiving the operation of a user;
the memory for storing one or more computer programs that, when executed by the processor, cause the electronic device to perform:
detecting a third operation of a finger of a user on a floating button displayed on the touch screen;
when the third operation is determined to meet a second response threshold value, displaying a cursor at the end point position of the third operation in response to the third operation;
detecting a fourth operation of the user finger on the cursor, and triggering a click operation at the cursor position corresponding to the position where the user finger leaves the touch screen in response to the fourth operation when the user finger leaves the touch screen at the end of the fourth operation, wherein the moving distance of the cursor is in direct proportion to the moving distance of the user finger corresponding to the fourth operation, and the ratio is greater than 1;
and responding to the click operation when the click operation is detected.
26. The electronic device of claim 25, wherein the third operation is a fourth sliding operation to the left or right in a horizontal direction, or upward or downward in a vertical direction, and the second response threshold is a fourth sliding distance.
27. The electronic device of claim 25, wherein the third operation comprises a third press operation and a fifth swipe operation, and the second response threshold is a third press duration corresponding to the third press operation and a fifth swipe distance corresponding to the fifth swipe operation.
28. The electronic device of claim 27, wherein the processor is further configured to:
and when the pressing duration of the third pressing operation reaches the third pressing duration, the electronic equipment performs vibration prompt.
29. The electronic device of any one of claims 25-28, wherein the cursor is in the shape of any one of: arrow, I-shape, circle, diamond, rectangle, vertical line.
30. The electronic device of any of claims 25-29, wherein the fourth operation is a sliding operation.
31. A computer-readable storage medium, comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method of any of claims 1 to 13.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910944437.3A CN110806831A (en) | 2019-09-30 | 2019-09-30 | Touch screen response method and electronic equipment |
PCT/CN2020/105489 WO2021063098A1 (en) | 2019-09-30 | 2020-07-29 | Touch screen response method, and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910944437.3A CN110806831A (en) | 2019-09-30 | 2019-09-30 | Touch screen response method and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110806831A true CN110806831A (en) | 2020-02-18 |
Family
ID=69488124
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910944437.3A Pending CN110806831A (en) | 2019-09-30 | 2019-09-30 | Touch screen response method and electronic equipment |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110806831A (en) |
WO (1) | WO2021063098A1 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021063098A1 (en) * | 2019-09-30 | 2021-04-08 | 华为技术有限公司 | Touch screen response method, and electronic device |
CN112764623A (en) * | 2021-01-26 | 2021-05-07 | 维沃移动通信有限公司 | Content editing method and device |
CN113515217A (en) * | 2021-04-08 | 2021-10-19 | Oppo广东移动通信有限公司 | Touch processing method and device, storage medium and electronic equipment |
CN114115629A (en) * | 2020-08-26 | 2022-03-01 | 华为技术有限公司 | Interface display method and equipment |
CN114168022A (en) * | 2021-11-04 | 2022-03-11 | 厦门知本家科技有限公司 | Vibration feedback system and method for editing house type structure model |
CN114997186A (en) * | 2021-09-02 | 2022-09-02 | 荣耀终端有限公司 | Control method of translation control and electronic equipment |
CN115657863A (en) * | 2022-12-29 | 2023-01-31 | 北京东舟技术股份有限公司 | Non-invasive follow-up chirality detection method and device of touch screen equipment |
CN117008772A (en) * | 2022-04-28 | 2023-11-07 | 华为技术有限公司 | Display method of application window and electronic equipment |
CN117827054A (en) * | 2022-05-31 | 2024-04-05 | 荣耀终端有限公司 | Screen capturing method, device and storage medium |
US12073071B2 (en) | 2020-07-29 | 2024-08-27 | Huawei Technologies Co., Ltd. | Cross-device object drag method and device |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20110138743A (en) * | 2010-06-21 | 2011-12-28 | 안현구 | Mobile device having back and side touch pad |
CN103412725A (en) * | 2013-08-27 | 2013-11-27 | 广州市动景计算机科技有限公司 | Touch operation method and device |
CN103543913A (en) * | 2013-10-25 | 2014-01-29 | 小米科技有限责任公司 | Terminal device operation method and device, and terminal device |
CN104615352A (en) * | 2015-01-28 | 2015-05-13 | 上海华豚科技有限公司 | Mobile terminal desktop, mobile terminal and operation method for mobile terminal desktop |
CN104932809A (en) * | 2014-03-19 | 2015-09-23 | 索尼公司 | Device and method for controlling a display panel |
CN105204754A (en) * | 2014-06-26 | 2015-12-30 | 深圳Tcl新技术有限公司 | One-handed operation method and device of touch screen |
CN106371749A (en) * | 2016-08-30 | 2017-02-01 | 青岛海信移动通信技术股份有限公司 | Method and device for terminal control |
TW201706817A (en) * | 2015-08-06 | 2017-02-16 | 兆霆科技股份有限公司 | Operating structure and method of operation of the communication device |
CN108008868A (en) * | 2016-10-28 | 2018-05-08 | 南宁富桂精密工业有限公司 | Interface control method and electronic device |
CN108958580A (en) * | 2018-06-28 | 2018-12-07 | 维沃移动通信有限公司 | Display control method and terminal device |
US10628000B2 (en) * | 2015-11-25 | 2020-04-21 | Misumi Group Inc. | Electronic book browsing assistance method and browsing assistance program |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI536249B (en) * | 2013-08-14 | 2016-06-01 | Nat Chengchi University | Handheld touch device and method for single-handed operation over its full touch range |
US9304680B2 (en) * | 2013-11-25 | 2016-04-05 | At&T Mobility Ii Llc | Methods, devices, and computer readable storage device for touchscreen navigation |
CN110806831A (en) * | 2019-09-30 | 2020-02-18 | 华为技术有限公司 | Touch screen response method and electronic equipment |
- 2019
- 2019-09-30: application CN201910944437.3A filed in China; published as CN110806831A (status: active, Pending)
- 2020
- 2020-07-29: PCT application PCT/CN2020/105489 filed; published as WO2021063098A1 (status: active, Application Filing)
Also Published As
Publication number | Publication date |
---|---|
WO2021063098A1 (en) | 2021-04-08 |
Similar Documents
Publication | Title
---|---
CN110489043B (en) | Management method and related device for floating window
KR102470275B1 (en) | Voice control method and electronic device
CN113645351B (en) | Application interface interaction method, electronic device and computer-readable storage medium
CN114816210B (en) | Full screen display method and device of mobile terminal
CN112714901B (en) | Display control method of system navigation bar, graphical user interface and electronic equipment
CN110362244B (en) | Screen splitting method and electronic equipment
WO2021036571A1 (en) | Desktop editing method and electronic device
CN111669459B (en) | Keyboard display method, electronic device and computer readable storage medium
WO2021063098A1 (en) | Touch screen response method, and electronic device
CN110798552A (en) | Volume adjusting method and electronic equipment
CN111078091A (en) | Split screen display processing method and device and electronic equipment
CN111758263A (en) | Display method of flexible screen and terminal
CN112751954B (en) | Operation prompting method and electronic equipment
CN110633043A (en) | Split screen processing method and terminal equipment
CN114327666A (en) | Application starting method and device and electronic equipment
CN112698756A (en) | Display method of user interface and electronic equipment
CN112068907A (en) | Interface display method and electronic equipment
CN113746961A (en) | Display control method, electronic device, and computer-readable storage medium
CN114077365A (en) | Split screen display method and electronic equipment
CN113010076A (en) | Display element display method and electronic equipment
CN113961115B (en) | Object editing method, electronic device, medium, and program product
JP2024513773A (en) | Display methods, electronic devices, storage media, and program products
CN112449101A (en) | Shooting method and electronic equipment
CN113885973A (en) | Translation result display method and device and electronic equipment
CN113050864A (en) | Screen capturing method and related equipment
Legal Events
Code | Title | Description
---|---|---
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
RJ01 | Rejection of invention patent application after publication | Application publication date: 2020-02-18