CN113448482B - Sliding response control method and device of touch screen and electronic equipment - Google Patents
Sliding response control method and device of touch screen and electronic equipment
- Publication number
- CN113448482B CN202010219561.6A CN202010219561A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Abstract
The embodiments of the present application provide a sliding response control method and apparatus for a touch screen, and an electronic device. The application relates to the field of machine learning. The method includes: determining, when current report point data is received in the current report point period, whether to perform report point prediction; if so, determining whether the current report point data meets a preset condition, where the preset condition includes that the current report point data is generated by a MOVE event; if the preset condition is met, predicting the next report point data based on cached report point data to obtain predicted report point data; and replacing the current report point data with the predicted report point data in the current report point period and reporting the predicted report point data to a foreground application, so that the foreground application triggers a sliding response operation in advance. This solution shortens the follow-hand response time of the electronic device.
Description
Technical Field
The present application relates to the field of touch technologies, and in particular, to a sliding response control method and apparatus for a touch screen, and an electronic device.
Background
The follow-hand response time is a key index for measuring the touch experience. Within the range perceivable by the human body, shortening the follow-hand response time improves follow-hand responsiveness and thus the user's touch experience; meanwhile, mobile phone manufacturers and media reviews pay increasing attention to follow-hand performance.
Hand-following performance refers to the end-to-end delay from the user touching the Touch Panel (TP) to the touch screen displaying the response; as end-to-end performance data, it reflects the overall processing performance of the mobile phone. The delay consists of several stages. First, from the finger touching the screen to the TP kernel layer scanning the touch point: the first report point delay. Second, the report point data of the kernel layer is transmitted to the input subsystem, which packages it and transmits it to the User Interface (UI) window. Third, the UI control responds to the report point event, triggering UI drawing and Render-thread rendering. Then, after rendering finishes, the result is sent to the Android layer composition module (SurfaceFlinger, SF for short) for composition. Finally, the composed data is sent to the touch screen for display. These delays together form the overall hand-following performance delay, so improving any stage improves hand-following performance.
In the prior art, one class of schemes for improving sliding hand-following performance reduces the time consumed on the TP side by reporting the first point faster, for example by raising the TP scan frequency. For instance, the Black Shark mobile phone raises the TP scan frequency to 240 Hz, shortening the delay from touching the touch screen to reporting the first report point data to the input subsystem. However, this raises the requirements on the touch screen and increases cost. Another class of schemes improves the hand-following response by raising CPU frequency points, reducing the delay from the input subsystem to the UI. For example, mobile phones such as the Samsung S10 and the Xiaomi Mi 9 raise CPU frequency points for the key threads from the input subsystem to the UI. However, this increases the power consumption of the electronic device.
Disclosure of Invention
The embodiments of the present application provide a sliding response control method and apparatus for a touch screen, and an electronic device, which shorten the follow-hand response time of the electronic device.
In a first aspect of the embodiments of the present application, a sliding response control method for a touch screen is provided. The method includes: determining, when current report point data is received in the current report point period, whether to perform report point prediction; if so, determining whether the current report point data meets a preset condition, where the preset condition includes that the current report point data is generated by a MOVE event; if the preset condition is met, predicting the next report point data based on cached report point data to obtain predicted report point data; and replacing the current report point data with the predicted report point data in the current report point period and reporting the predicted report point data to a foreground application, so that the foreground application triggers a sliding response operation in advance.
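For orientation, this flow can be sketched in a few lines of Java. The sketch below is illustrative only: the names (ReportPoint, SlideResponseController, shouldPredict, predictNext) are assumptions rather than terms from this application, and the gating and prediction steps are stubs that the designs below elaborate.

```java
// Hypothetical sketch of the claimed flow; all names are illustrative.
import java.util.ArrayDeque;
import java.util.Deque;

enum EventType { DOWN, MOVE, UP }

final class ReportPoint {
    final float x, y;
    final EventType type;
    ReportPoint(float x, float y, EventType type) { this.x = x; this.y = y; this.type = type; }
}

final class SlideResponseController {
    private final Deque<ReportPoint> cache = new ArrayDeque<>();

    /** Called once per report point period; returns the data actually reported upward. */
    ReportPoint onReportPoint(ReportPoint current) {
        if (shouldPredict(current) && current.type == EventType.MOVE) {
            ReportPoint predicted = predictNext(current);   // based on the cached points
            cache.addLast(current);                          // still cache the real point
            return predicted;          // reported in place of the current report point data
        }
        if (current.type == EventType.UP) cache.clear();     // stroke ended
        else cache.addLast(current);
        return current;                                      // normal reporting path
    }

    private boolean shouldPredict(ReportPoint p) { return cache.size() >= 2; } // stub gate
    private ReportPoint predictNext(ReportPoint p) { return p; }              // stub predictor
}
```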
In one possible design, determining whether to perform report point prediction includes: determining the current sliding speed according to the current report point data and the cached report point data, where the current sliding speed is the sliding speed at the sampling time corresponding to the current report point data; determining whether the current sliding speed is within a preset sliding speed range; if so, deciding to perform report point prediction; if not, deciding not to perform report point prediction.
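A minimal sketch of this speed gate, assuming a fixed report point period and illustrative speed bounds (none of the three constants is specified by this application):

```java
// Illustrative speed gate: predict only when the current sliding speed falls
// inside a preset range. PERIOD_MS and both bounds are assumed values.
static boolean withinPredictableSpeedRange(float prevX, float prevY,
                                           float curX, float curY) {
    final float PERIOD_MS = 8.3f;  // assumed 120 Hz report point period
    final float MIN_SPEED = 0.1f;  // px/ms, assumed lower bound
    final float MAX_SPEED = 3.0f;  // px/ms, assumed upper bound
    float speed = (float) Math.hypot(curX - prevX, curY - prevY) / PERIOD_MS;
    return speed >= MIN_SPEED && speed <= MAX_SPEED;
}
```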
In one possible design, predicting the next report point data based on the cached report point data to obtain the predicted report point data includes: determining a prediction algorithm according to the current application scene, and determining the predicted report point data based on the prediction algorithm, where the prediction algorithm includes a machine-learning-based algorithm.
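Read as a per-scene dispatch, this design might look like the following sketch; the scene keys and both predictors are invented for illustration, and learned() merely stands in for a machine-learning-based model:

```java
// Hypothetical scene-to-algorithm dispatch; scene keys and predictors are illustrative.
import java.util.List;
import java.util.Map;

final class PredictorRegistry {
    interface Predictor { float[] predict(List<float[]> cached); }

    // Kinematic fallback: repeat the last step once more (linear extrapolation).
    static float[] linear(List<float[]> c) {
        float[] a = c.get(c.size() - 2), b = c.get(c.size() - 1);
        return new float[] { 2 * b[0] - a[0], 2 * b[1] - a[1] };
    }

    // Stand-in for a machine-learning-based predictor (e.g. a small regression model).
    static float[] learned(List<float[]> c) { return linear(c); } // placeholder body

    private static final Map<String, Predictor> BY_SCENE = Map.of(
            "list_scroll", PredictorRegistry::linear,
            "drawing", PredictorRegistry::learned);

    static Predictor forScene(String scene) {
        return BY_SCENE.getOrDefault(scene, PredictorRegistry::linear);
    }
}
```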
In one possible design, predicting the next report point data based on the cached report point data to obtain the predicted report point data includes: calculating the acceleration at the touch point position on the touch screen corresponding to the current report point data, according to the sliding speeds at the touch point positions corresponding to the cached report point data and the report point period; determining the predicted position track of the next report point data on the touch screen according to the predicted time interval; determining, based on the acceleration at the touch point position corresponding to the current report point data, the predicted position of the next report point data on the predicted position track; and determining the predicted report point data based on the predicted position. The predicted time interval refers to the time taken to slide from the touch point position corresponding to the current report point data to the predicted position.
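Under the kinematics described here, a minimal sketch assuming uniform acceleration and a straight-line predicted track (both modeling choices are assumptions, as are all names):

```java
// Minimal uniformly-accelerated extrapolation along the recent track.
// Assumes three cached points sampled one report point period apart.
static float[] predictNext(float[] p0, float[] p1, float[] p2,
                           float periodMs, float predictMs) {
    // Sliding speeds at the previous and current samples (px/ms).
    float v1 = (float) Math.hypot(p1[0] - p0[0], p1[1] - p0[1]) / periodMs;
    float v2 = (float) Math.hypot(p2[0] - p1[0], p2[1] - p1[1]) / periodMs;
    float a  = (v2 - v1) / periodMs;               // acceleration at the current touch point

    // Distance travelled over the predicted time interval: s = v*t + a*t^2 / 2.
    float s = v2 * predictMs + 0.5f * a * predictMs * predictMs;

    // Predicted position track: continue along the direction of the last movement.
    float dx = p2[0] - p1[0], dy = p2[1] - p1[1];
    float len = (float) Math.hypot(dx, dy);
    if (len == 0f) return new float[] { p2[0], p2[1] };
    return new float[] { p2[0] + s * dx / len, p2[1] + s * dy / len };
}
```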
In one possible design, the predicted time interval is determined based on the current application scenario.
In one possible design, determining whether to perform report point prediction includes: determining whether the current application scene is an application scene suitable for report point prediction.
Determining whether the preset condition is met further includes: determining whether the quantity of cached report point data meets the requirement of report point prediction.
In one possible design, the method further includes: when the current report point data is received in the current report point period, if report point prediction is not performed, reporting the current report point data to the foreground application in the current report point period.
In one possible design, replacing the current report point data with the predicted report point data in the current report point period includes: storing the current report point data into a designated area for the foreground application to acquire.
In one possible design, the method further includes: the foreground application distributes the acquired current report point data to a window control of the foreground application, so that the window control triggers the foreground application to respond to the sliding operation in advance according to the current report point data.
In one possible design, the method further includes: if the current report point data is generated by a DOWN event or a MOVE event, caching the current report point data; and if the current report point data is generated by an UP event, clearing the cached report point data.
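A sketch of this buffer lifecycle (the event names follow the DOWN/MOVE/UP events above; the history bound MAX_POINTS is an assumed value):

```java
// Illustrative report point cache driven by the event type.
import java.util.ArrayDeque;
import java.util.Deque;

final class ReportPointCache {
    private static final int MAX_POINTS = 4;          // assumed history length
    private final Deque<float[]> points = new ArrayDeque<>();

    void onEvent(String type, float x, float y) {
        switch (type) {
            case "DOWN":
            case "MOVE":
                if (points.size() == MAX_POINTS) points.removeFirst();
                points.addLast(new float[] { x, y });  // cache the current report point
                break;
            case "UP":
                points.clear();                        // stroke ended: drop the history
                break;
        }
    }
}
```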
In one possible design, the method further includes: if the touch point position corresponding to the predicted report point data on the touch screen exceeds the boundary of the touch screen, adjusting the predicted report point data based on the boundary of the touch screen.
In one possible design, the method further includes: determining the sliding track direction according to the cached report point data; and if the touch point position corresponding to the predicted report point data on the touch screen is opposite to the sliding track direction, adjusting the predicted report point data.
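Both corrections might be sketched together as follows; the screen dimensions and the fallback of snapping back to the last real point are assumptions:

```java
// Illustrative post-checks on a predicted point: clamp to the screen boundary and
// reject predictions that run against the cached sliding direction.
static float[] sanitize(float[] predicted, float[] last, float[] prev,
                        float screenW, float screenH) {
    // Clamp to the touch-screen boundary.
    float x = Math.max(0f, Math.min(predicted[0], screenW - 1));
    float y = Math.max(0f, Math.min(predicted[1], screenH - 1));

    // Direction of the cached track vs. direction of the predicted step.
    float tx = last[0] - prev[0], ty = last[1] - prev[1];
    float px = x - last[0],       py = y - last[1];
    if (tx * px + ty * py < 0) {       // dot product < 0: prediction reverses the track
        return new float[] { last[0], last[1] };  // assumed fallback: keep the last real point
    }
    return new float[] { x, y };
}
```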
In a second aspect of the embodiments of the present application, a sliding response control apparatus for a touch screen is provided, including: a report point prediction determining module, configured to determine whether to perform report point prediction when current report point data is received in the current report point period;
a preset condition judging module, configured to determine, if the judgment result of the report point prediction determining module is yes, whether the current report point data meets a preset condition, where the preset condition includes that the current report point data is generated by a MOVE event; a predicted report point data algorithm module, configured to predict, if the preset condition is met, the next report point data based on the cached report point data to obtain predicted report point data; and a report point data reporting module, configured to replace the current report point data with the predicted report point data in the current report point period and report the predicted report point data to a foreground application, so that the foreground application triggers a sliding response operation in advance.
In a third aspect of the embodiments of the present application, an electronic device is provided, including a memory and a processor. The memory is configured to store information including program instructions, and the processor is configured to control the execution of the program instructions; when the program instructions are loaded and executed by the processor, the electronic device performs the above sliding response control method for a touch screen.
In a fourth aspect of the embodiments of the present application, a storage medium is provided. The storage medium includes a stored program, where, when the program runs, the device on which the storage medium is located is controlled to perform the above sliding response control method for a touch screen.
Compared with the prior art, the technical scheme has the following beneficial effects:
According to the sliding response control method for a touch screen, when current report point data is received in the current report point period, it is determined whether to perform report point prediction; if so, it is further determined whether a preset condition is met, where the preset condition at least includes that the current report point data is generated by a MOVE event. If report point prediction is performed, the next report point data is predicted according to the cached report point data to obtain predicted report point data. Then, in the current report point period, the current report point data is replaced with the predicted report point data, and the predicted report point data is reported to the foreground application, so that the foreground application triggers the sliding response operation in advance. In this way, the touch screen can start the sliding response at least one report point period earlier, reducing the system delay from the input subsystem to the UI window meeting the sliding threshold.
Further, it is determined whether the current application scene of the electronic device is an application scene suitable for report point prediction; if so, when performing report point prediction, a corresponding prediction algorithm can be selected according to the current application scene, and the received report point data is processed with the selected prediction algorithm to obtain the predicted report point data.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a coordinate position of a point event on a touch screen in a sliding response control method of the touch screen;
Fig. 3 is a flowchart of a specific embodiment of a sliding response control method of a touch screen according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a coordinate position of a point event on a touch screen in a sliding response control method of the touch screen according to an embodiment of the present application;
Fig. 5 is a schematic block diagram of an input subsystem in a sliding response control method of a touch screen according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a specific embodiment of a sliding response control device for a touch screen according to an embodiment of the present application.
Detailed Description
The terminology used in the description of the embodiments of the application herein is for the purpose of describing particular embodiments of the application only and is not intended to be limiting of the application.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Referring to fig. 1, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may generate an operation control signal according to the instruction operation code and a timing signal, to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as the display screen 194 and the camera 193. The MIPI interfaces include a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the electronic device 100. The processor 110 and the display screen 194 communicate through a DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It may also be used to connect a headset and play audio through the headset. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or videos. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then transfers the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 100 answers a call or a voice message, the voice can be heard by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic" or "mouthpiece", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user may speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is configured to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates made of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the intensity of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction to view the SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction to create a new SMS message is executed.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby realizing image stabilization. The gyro sensor 180B may also be used in navigation and somatosensory game scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip opening according to the detected open or closed state of the case or of the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically on three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the attitude of the electronic device, and is applied in landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is configured to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch panel". The touch sensor 180K is configured to detect a touch operation acting on or near it. The touch sensor may transmit the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, combined into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, to implement a voice function. The application processor may parse heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195, to achieve contact with or separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
As described in the background, the hand-following performance delay includes: 1) the delay from the finger touching the touch screen to the kernel layer scanning the touch point, i.e., the first report point delay; 2) the delay for the report point data of the kernel layer to be transmitted to the input subsystem, and from the input subsystem to the UI window; 3) the delay for the UI control to respond to the report point event and trigger UI drawing and Render-thread rendering; 4) the delay from the end of rendering to SF composition; 5) the delay for the composed data to be transmitted to the touch screen for display. The delays of these five stages form the overall hand-following performance delay.
The technical solution of the present application mainly aims at improving the delay of the second stage (i.e., the delay for the kernel layer report point data to be transmitted to the input subsystem and then from the input subsystem to the UI window), reducing the system delay from the input subsystem to the UI window meeting the sliding threshold.
Fig. 2 is a schematic diagram of the coordinate position of a report point event on a touch screen in a sliding response control method of the touch screen. Referring to fig. 2, the coordinate system of the touch screen of the electronic device takes the lower-left vertex of the touch screen as the origin, with the directions of the X axis and the Y axis as shown in fig. 2.
The electronic device may be a device with a touch screen, such as a mobile phone, a watch, or a tablet computer. The hardware architecture of the electronic device may be as described in the embodiment of fig. 1.
Referring to fig. 1 and 2 in combination, when a user performs a touch operation on the touch screen of the electronic device shown in fig. 1, the touch sensor 180K may detect the touch operation performed on the touch screen by the user. The touch sensor 180K may transmit the detected touch operation to the application processor to determine the type of the touch event. Specifically, in the process from the user touching the touch screen, through sliding, until lifting off the touch screen, the report point events generated by the kernel layer are, in order: a DOWN event, a number of MOVE events, and an UP event, where the number of MOVE events depends on the duration of the touch.
In fig. 2, position point 1 is the coordinate position of the DOWN event on the touch screen, and position point 2, position point 3, and position point 4 are the coordinate positions of the MOVE events following the DOWN event. Under the current UI sliding scenario, the touch screen starts to respond to the user's touch slide only after the sliding response threshold is reached; that is, after the DOWN event, the subsequent MOVE events must reach a certain threshold before the touch screen triggers the sliding response.
With continued reference to fig. 2, assume that during the slide, when the user reaches position point 4, the input subsystem reports the report point data generated by the MOVE event to the UI window; at this time, the distance between position point 4 and position point 1 meets the sliding response threshold, so the touch screen starts to respond to the slide in the report point period corresponding to position point 4.
Fig. 3 is a flowchart of a specific embodiment of a sliding response control method of a touch screen according to an embodiment of the present application. Referring to fig. 3, the method includes:
Step 301, judging whether to perform report point prediction when current report point data is received in the current report point period;
Step 302, if yes, judging whether the current report point data meets a preset condition, wherein the preset condition includes that the current report point data is generated by a MOVE event;
Step 303, if the preset condition is met, predicting the next report point data based on the cached report point data to obtain predicted report point data;
Step 304, replacing the current report point data with the predicted report point data in the current report point period and reporting it to the foreground application, so that the foreground application triggers the sliding response operation in advance.
It should be noted that the execution subject of each step of the sliding response control method described in this embodiment is the input subsystem.
This embodiment adds a report point prediction function to the existing sliding response control method: when the input subsystem receives the current report point data in the current report point period, it first determines whether to perform report point prediction. If not, processing still follows the existing sliding response control method. If so, it further judges whether the current report point data meets the preset condition, which includes that the current report point data is generated by a MOVE event. If the condition is met, the next report point data is predicted as described in this embodiment, so that once the predicted report point data meets the sliding response threshold, the touch screen can start the sliding response at least one report point period earlier, reducing the system delay from the input subsystem to the UI window meeting the sliding threshold. A minimal sketch of this per-report-point flow follows.
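For concreteness, the following Python sketch mirrors the flow of steps 301 to 304 under stated assumptions. Every identifier here (ReportPoint, should_predict, predict_next, handle_report_point) is illustrative and does not come from the patent or from any Android API; the gating and prediction bodies are placeholders refined later in the text.

```python
from dataclasses import dataclass

@dataclass
class ReportPoint:
    kind: str   # "DOWN", "MOVE", or "UP"
    x: float
    y: float

def should_predict(point, cache):
    # Stand-in for the step 301 gating (speed range, scenario whitelist,
    # cache size); concrete sketches appear later in the text.
    return len(cache) >= 2

def predict_next(point, cache):
    # Stand-in for step 303; the arc-based version is sketched later.
    return point

def handle_report_point(point, cache, report):
    """Steps 301-304 for one report point period."""
    if should_predict(point, cache) and point.kind == "MOVE":
        report(predict_next(point, cache))  # step 304: replace and report early
    else:
        report(point)                       # existing sliding response path
```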
As described in step 301, when the current report point period receives the current report point data, it is determined whether to perform report point prediction.
Specifically, the method comprises the following steps:
Step 3011, determining the current sliding speed according to the current report point data and the cached report point data, wherein the current sliding speed is the sliding speed at the sampling time corresponding to the current report point data;
Step 3012, judging whether the current sliding speed is within a preset sliding speed range;
Step 3013, if yes, determining to perform report point prediction;
Step 3014, if not, determining not to perform report point prediction.
In this embodiment, the input subsystem may determine, from the received current report point data, the touch point position corresponding to that data on the touch screen, and may then determine the current sliding speed by combining the cached report point data with the current report point data. The cached report point data comprises the report point data of the DOWN event and of the MOVE events generated by the Kernel layer since the user started touching the touch screen. When the Kernel layer generates an UP event, the user has completed the sliding operation, and the cache is cleared at that point. The cache may be implemented by writing the report point data of the DOWN and MOVE events into a queue.
For example, from the distance between the touch point position of the previous report point data on the touch screen and that of the current report point data, together with the time interval between these two adjacent report points (one report point period), the sliding speed of the user's slide at the sampling time of the current report point data (i.e., the current sliding speed) can be calculated.
Then, it is judged whether the current sliding speed is within a preset sliding speed range, which may be preset by the input subsystem. If the current sliding speed is within the range, report point prediction is performed; if not, report point prediction is not performed.
When the user slides to the touch point position of the current report point data, if the current sliding speed is too high, the report point data generated by subsequent MOVE events will quickly reach the sliding response threshold anyway, so report point prediction can be omitted and processing can still follow the existing sliding response control method. If the current sliding speed is too low, report point prediction is prone to turning a tap into a slide, that is, misjudging the user's tap operation as a sliding operation. Only when the current sliding speed is within the preset sliding speed range (neither too fast nor too slow) is the next MOVE event's report point data considered likely to reach the sliding response threshold, and report point prediction is determined. A minimal sketch of this speed gate follows.
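The speed gate of steps 3011 to 3014 can be sketched as follows. The 120 Hz report rate and the speed bounds are assumed values for illustration only, since the patent leaves the preset sliding speed range unspecified.

```python
import math

REPORT_PERIOD_S = 1 / 120       # assumed 120 Hz report point rate
V_MIN, V_MAX = 50.0, 3000.0     # assumed preset speed range, px/s

def current_speed(prev_xy, cur_xy, period=REPORT_PERIOD_S):
    # Sliding speed at the sampling time of the current report point:
    # distance between adjacent report points divided by one report period.
    return math.dist(prev_xy, cur_xy) / period

def speed_in_range(prev_xy, cur_xy):
    # Predict only when the slide is neither so slow that a tap could be
    # misjudged as a slide, nor so fast that the threshold is reached anyway.
    return V_MIN <= current_speed(prev_xy, cur_xy) <= V_MAX
```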
Further, determining whether to perform report point prediction may include judging whether the current application scenario is suitable for report point prediction. Specifically, an application scenario whitelist may be set, containing all application scenarios suitable for report point prediction. The electronic device may identify its current application scenario (e.g., gaming, browsing a web page, operating an APP) when the user operates the touch screen, and judge whether that scenario is in the whitelist.
If yes, it is judged whether the current report point data meets the preset condition, as described in step 302, where the preset condition includes that the current report point data is generated by a MOVE event.
In this embodiment, the input subsystem needs to judge, for the current report point data received in the current report point period, whether the preset condition is met, where the preset condition includes that the current report point data is generated by a MOVE event. As can be understood from the embodiment of fig. 2 described above, report point prediction is not required if the report point data is generated by a DOWN event or an UP event.
In step 303, if the preset condition is met, the next report data is predicted based on the cached report data to obtain predicted report data.
In one embodiment, the step includes:
Step 3031, determining a prediction algorithm according to the current application scenario;
Step 3032, determining the predicted report point data based on the prediction algorithm, wherein the prediction algorithm includes a machine learning-based algorithm.
As depicted in step 3031, a prediction algorithm is determined based on the current application scenario.
The sliding response control method described in this embodiment is applicable to various application scenarios of the electronic device, including but not limited to: 2D application scenarios, game scenarios, stylus scenarios, and game-controller scenarios with associated Bluetooth or wireless peripherals. The current application scenario may be judged from the foreground application; for example, if the foreground application is a game APP, the current scenario is determined to be a game scenario. The current application scenario may also be judged from the application area activated by the position of the report point data on the touch screen (e.g., in multi-window or split-screen scenarios).
As described in step 3032, the predicted report point data is determined based on the prediction algorithm, where the prediction algorithm includes a machine learning-based algorithm.
In this embodiment, the prediction algorithm may be based on machine learning: a deep learning model is built and trained, and the trained model implements the prediction algorithm to output the predicted report point data.
In another embodiment, the step includes:
Step 3033, calculating the acceleration of the current report point data at its corresponding touch point position on the touch screen according to the sliding speed and report point period of each cached report point at its corresponding touch point position;
Step 3034, determining a predicted position track of the next report point data on the touch screen according to a prediction time interval, wherein the prediction time interval is determined based on the prediction algorithm;
Step 3035, determining the predicted position of the next report point data on the predicted position track based on the acceleration of the current report point data at its corresponding touch point position on the touch screen;
Step 3036, determining the predicted report point data based on the predicted position; the prediction time interval refers to the time taken to slide from the touch point position of the current report point data to the predicted position.
Fig. 4 is a schematic diagram of a coordinate position of a point event on a touch screen in a sliding response control method of the touch screen according to an embodiment of the present application.
Referring to fig. 4, position point 1 is the corresponding coordinate position of the DOWN event on the touch screen, and position point 2 and position point 3 are the corresponding coordinate positions of the MOVE event on the touch screen after the DOWN event, respectively. The position point 3 is a coordinate position corresponding to the current report point data received by the input subsystem in the current report point period on the touch screen.
As described in step 3033, the input subsystem uses the sliding speed and report point period of each cached report point at its corresponding touch point position on the touch screen (including position point 1, position point 2, and position point 3) to calculate the acceleration of the current report point data at its touch point position (i.e., position point 3) according to the formula a = Δv/Δt, where a is the acceleration, Δv is the change in sliding speed, and Δt is the time interval.
For example, if the sliding speed at position point 2 is v2, the sliding speed at position point 3 is v3, and the time interval between position point 2 and position point 3 is one report point period T, then the acceleration at position point 3 is a = (v3 - v2)/T.
For another example, if the sliding speed at position point 1 is v1, the sliding speed at position point 3 is v3, and the time interval between position point 1 and position point 3 is two report point periods (2T), then the acceleration at position point 3 is a = (v3 - v1)/(2T).
As described in step 3034, the predicted position track of the next report point data on the touch screen is determined according to the prediction time interval. The prediction time interval is determined based on the prediction algorithm and refers to the time taken to slide from the touch point position of the current report point data to the predicted position; the prediction time intervals corresponding to different application scenarios differ.
The distance is calculated according to the formula s = v0·t + (1/2)·a·t², where s is the distance between the current report point data and the next report point data, v0 is the sliding speed of the current report point data, t is the prediction time interval, and a is the acceleration of the current report point data.
In practical applications, a deep learning model may also be built and trained, so that the trained model determines the predicted position track of the next report point data on the touch screen based on the prediction time interval.
With continued reference to fig. 4, based on the above distance formula and the prediction time interval, sliding speed, and acceleration at position point 3, the predicted position track of the next report point data on the touch screen can be determined: an arc centered on position point 3 with radius s.
As described in step 3035, the predicted position of the next report point data on the predicted position track is determined based on the acceleration of the current report point data at its corresponding touch point position on the touch screen.
With continued reference to fig. 4, based on the predicted position track determined in step 3034, the predicted position of the next report point data on that track is further determined according to the acceleration of the current report point data at its corresponding touch point position (i.e., position point 3) on the touch screen.
Because acceleration is a vector, the direction of the sliding track from the current report point data to the next report point data can be predicted from the acceleration direction at position point 3, so that the predicted position of the next report point data is determined on the predicted position track. As shown in fig. 4, position point 4 (drawn hatched in the figure) is determined on the predicted position track as the predicted position according to the acceleration direction at position point 3.
As described in step 3036, the predicted report point data is determined based on the predicted position.
Based on the above steps, the touch point position of the predicted report point data on the touch screen (the hatched position point 4), as well as the sliding speed and acceleration at position point 4, can be determined. A sketch of this arc-based prediction follows.
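Under the stated assumptions, an arc-based prediction in the spirit of steps 3033 to 3036 might look as follows. The function name and the use of the two most recent cached points to fix the direction of travel are illustrative choices, not the patent's exact procedure, which selects the point on the arc from the acceleration direction.

```python
import math

def predict_point(p2, p3, v0, a, t):
    """Predict the next touch point from the current point p3.

    p2, p3: (x, y) of the previous and current report points
    v0:     sliding speed at p3 (px/s)
    a:      acceleration at p3 (px/s^2, signed along the track)
    t:      prediction time interval (s)
    """
    s = v0 * t + 0.5 * a * t * t    # arc radius: s = v0*t + a*t^2/2
    # Base direction of travel from the two most recent cached points.
    heading = math.atan2(p3[1] - p2[1], p3[0] - p2[0])
    # Place the predicted point on the arc of radius s around p3 along the
    # heading; a fuller model would bend it by the lateral acceleration.
    return (p3[0] + s * math.cos(heading), p3[1] + s * math.sin(heading))
```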
As described in step 304, the current report point data is replaced with the predicted report point data in the current report point period, and the predicted report point data is reported to the foreground application, so that the foreground application triggers the sliding response operation in advance.
Specifically, in this embodiment, regardless of whether the distance between the touch point position of the predicted report point data on the touch screen (e.g., the hatched position point 4 in fig. 4) and that of the first report point data (e.g., position point 1 in fig. 4) meets the sliding response threshold, the input subsystem reports the predicted report point data as the current report point data to the foreground application in the current report point period. The foreground application then delivers the predicted report point data to the UI window through internal processing.
For example, the input subsystem may store the current report point data in a designated area for the foreground application to retrieve. The foreground application distributes the retrieved current report point data to its window control, so that the window control triggers the foreground application to respond to the sliding operation in advance according to the current report point data.
If the distance between the touch point position of the predicted report point data reported in the current report point period and that of the first report point data does not meet the sliding response threshold, the UI window does not trigger the sliding response operation. However, because the predicted report point data reported as the current report point data in each report point period is always closer to the sliding response threshold than the actually received current report point data, the UI window can still trigger the sliding response operation in advance.
If the distance between the touch point position of the predicted report point data reported in the current report point period and that of the first report point data meets the sliding response threshold, the predicted report point data is reported as the current report point data to the foreground application in the current report point period, so that the UI window triggers the sliding response operation. In other words, the UI window triggers the sliding response operation at least one report point period in advance. A minimal sketch of the threshold check follows.
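A minimal sketch of the sliding-threshold check, assuming an illustrative threshold value; the patent does not fix a number (Android's touch slop is a comparable concept):

```python
import math

SLIDE_THRESHOLD_PX = 24.0   # assumed threshold; the patent fixes no value

def meets_slide_threshold(down_xy, xy):
    # Distance between the first (DOWN) report point and the given point.
    return math.dist(down_xy, xy) >= SLIDE_THRESHOLD_PX
```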
Further, in this embodiment, judging whether the preset condition is met further includes:
judging whether the number of cached report point data meets the requirement of report point prediction.
Specifically, to improve the accuracy of report point prediction, the present application predicts the next report point data only when the number of cached report points meets a certain requirement. In this embodiment, the required number is set to 2; that is, report point prediction is performed when at least 2 report points are cached (for example, report point data generated by MOVE events, or by a DOWN event combined with a MOVE event). It can be appreciated that the more report point data is cached, the higher the accuracy of predicting the next report point based on that cache.
In addition, in this embodiment, if the judgment result is no, that is, if report point prediction is not performed when the current report point data is received in the current report point period, the current report point data is reported to the foreground application in the current report point period.
As described above, this embodiment adds the report point prediction function to the existing sliding response control method, so if report point prediction is not performed, processing still follows the existing method; that is, the input subsystem reports the current report point data received in the current report point period to the foreground application.
Further, this embodiment also includes: if the current report point data is generated by a DOWN event or a MOVE event, caching the current report point data; and if the current report point data is generated by an UP event, clearing the cached report point data.
From the moment the user touches the touch screen and slides until lifting off, the report point events generated by the Kernel layer comprise, in order: a DOWN event, several MOVE events, and an UP event. Report point data generated by a DOWN or MOVE event needs to be cached, so that the cached data can be retrieved to predict the next report point when report point prediction is performed. Report point data generated by an UP event indicates that the user has finished the sliding operation on the touch screen, and the cached report point data is cleared. A minimal sketch of this cache behavior follows.
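A minimal sketch of the cache management described above, assuming a deque-based queue as one possible realization of the queue the text mentions:

```python
from collections import deque

cache = deque()   # queue of cached DOWN/MOVE report points

def update_cache(point):
    if point.kind in ("DOWN", "MOVE"):
        cache.append(point)   # keep history for predicting the next point
    elif point.kind == "UP":
        cache.clear()         # slide finished: empty the cache
```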
Further, in the present embodiment, it further includes: and if the corresponding touch point position of the predicted report point data on the touch screen exceeds the boundary of the touch screen, adjusting the predicted report point data based on the boundary of the touch screen.
Because the display area of the touch screen has boundaries, in one possible embodiment the valid coordinate range is determined from the length and width boundaries of the touch screen. If the coordinate of the touch point position of the predicted report point data obtained by the prediction algorithm falls outside the coordinate range of the display area boundary of the touch screen, the predicted report point data needs to be adjusted.
For example, if the coordinate of the touch point position of the predicted report point data exceeds the coordinate range of the display area boundary of the touch screen, the sliding track toward the predicted position intersects the display area boundary, and the coordinate of that intersection point can be used as the touch point position of the predicted report point data on the touch screen. A minimal boundary-adjustment sketch follows.
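A minimal boundary-adjustment sketch, assuming an illustrative screen resolution. Simple clamping stands in for the track-boundary intersection described above and matches it for near-straight tracks:

```python
SCREEN_W, SCREEN_H = 1080, 2340   # assumed display resolution, px

def clamp_to_screen(xy):
    # Keep the predicted point inside the display area.
    x = min(max(xy[0], 0.0), SCREEN_W - 1.0)
    y = min(max(xy[1], 0.0), SCREEN_H - 1.0)
    return (x, y)
```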
Further, this embodiment also includes:
determining the sliding track direction according to the cached report point data;
and if the touch point position of the predicted report point data on the touch screen is opposite to the sliding track direction, adjusting the predicted report point data.
Specifically, as described in the above embodiments, the direction of the user's sliding track on the touch screen can be determined from the touch point positions of the cached report points on the touch screen and the sliding speed and acceleration at each of those positions. For example, as shown in fig. 4, the sliding track direction through position point 1, position point 2, and position point 3 follows the arrow of the sliding track in the figure.
If the touch point position of the predicted report point data calculated by the prediction algorithm runs opposite to the sliding track direction, the predicted report point data is considered wrong. For example, with continued reference to fig. 4, this is the case if the predicted point falls back onto the sliding track between position point 2 and position point 3, or between position point 1 and position point 2. The predicted report point data must then be adjusted, for example by recalculating it. A minimal sketch of this rollback check follows.
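A minimal sketch of this rollback-protection check; the dot-product test is an assumed concrete reading of "opposite to the sliding track direction":

```python
def moves_backward(p2, p3, predicted):
    # Dot product of the cached direction (p2 -> p3) with the predicted step
    # (p3 -> predicted); a negative value means the prediction retreats
    # against the sliding track direction and should be recalculated.
    dx, dy = p3[0] - p2[0], p3[1] - p2[1]
    sx, sy = predicted[0] - p3[0], predicted[1] - p3[1]
    return dx * sx + dy * sy < 0
```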
Fig. 5 is a schematic block diagram of an input subsystem in a sliding response control method of a touch screen according to an embodiment of the present application.
Referring to fig. 5, the input subsystem 5 includes: an input subsystem management module 51, a service communication module (including a first service communication module 521 and a second service communication module 522), and a touch core module 53. Wherein the first service communication module 521 and the second service communication module 522 are newly added modules.
The input subsystem management module 51 includes an input subsystem reading module 511, an input subsystem scheduling module 512, and an input subsystem transmission module 513. A prediction algorithm module 531 is newly added to the touch core module 53, where the prediction algorithm module 531 includes a plurality of prediction algorithms (such as a prediction algorithm 1, a prediction algorithm 2, and a prediction algorithm 3) that are applicable to different application scenarios of the electronic device.
Specifically, unlike the existing input subsystem, this embodiment modifies the function of the input subsystem transmission module 513. When report point prediction is performed, the received report point data is passed through the first service communication module 521 to the second service communication module 522, which forwards it to the prediction algorithm module 531 in the touch core module 53. There, a suitable prediction algorithm is selected based on the application scenario of the electronic device, and the received report point data is processed with that algorithm to obtain the predicted report point data. The touch core module 53 then feeds the predicted report point data back to the input subsystem management module 51 via the second service communication module 522 and the first service communication module 521, and the input subsystem 5 reports it to the foreground application.
Fig. 6 is a schematic structural diagram of a specific embodiment of a sliding response control device for a touch screen according to an embodiment of the present application.
Referring to fig. 6, the sliding response control device 6 of the touch screen includes: the report point prediction determining module 61 is configured to determine whether to perform report point prediction when current report point data is received in a current report point period; the preset condition judging module 62 is configured to judge whether the current report data meets a preset condition if the judgment result of the report prediction determining module is yes; wherein the preset condition includes that the current report point data is generated by a MOVE event; the predicted report point data algorithm module 63 is configured to predict, if the determination result of the report point prediction determining module meets a preset condition, the next report point data based on the cached report point data, so as to obtain predicted report point data; and the report point data reporting module 64 is configured to replace the current report point data with the predicted report point data in a current report point period, and report the predicted report point data to a foreground application, so that the foreground application triggers a sliding response operation in advance.
The report point prediction determining module 61 includes: a sliding speed determining unit 611, configured to determine the current sliding speed according to the current report point data and the cached report point data, where the current sliding speed is the sliding speed at the sampling time of the current report point data; a sliding speed judging unit 612, configured to judge whether the current sliding speed is within a preset sliding speed range; and a report point prediction judging unit 613, configured to determine to perform report point prediction if the judgment result of the sliding speed judging unit is yes, and to determine not to perform report point prediction if the judgment result is no.
The report point prediction determining module 61 is further configured to determine whether the current application scenario is an application scenario suitable for report point prediction.
In a specific embodiment, the predictive point data algorithm module 63 includes: the acceleration determining unit 631 is configured to calculate, according to the sliding speed and the reporting period of each cached reporting data at the corresponding touch point position on the touch screen, the acceleration of the current reporting data at the corresponding touch point position on the touch screen; a predicted position track determining unit 632, configured to determine a predicted position track of the next report data on the touch screen according to a predicted time interval; wherein the prediction time interval is determined based on the prediction algorithm; a predicted position determining unit 633, configured to determine a predicted position of the next report data on the predicted position track based on an acceleration of the current report data on the touch screen corresponding to the touch point position; a predicted report point data determining unit 634, configured to determine the predicted report point data based on the predicted position; the predicted time interval refers to the time from the current report data to the predicted position when the current report data slides to the corresponding touch point position on the touch screen.
In another specific embodiment, the predicted report point data algorithm module 63 includes: a prediction algorithm determining unit (not shown), configured to determine a prediction algorithm according to the current application scenario; and a predicted report point data determining unit (not shown), configured to determine the predicted report point data based on the prediction algorithm, where the prediction algorithm includes a machine learning-based algorithm.
The preset condition judging module 62 is further configured to judge whether the number of cached report point data meets the requirement of report point prediction.
The report point data reporting module 64 is further configured to report, when current report point data is received in a current report point period, the current report point data to a foreground application in the current report point period if report point prediction is not performed.
The device 6 further comprises: a buffer processing module (not shown) for buffering the current report point data if the current report point data is generated by a DOWN event or a MOVE event; and if the current report point data is generated by the UP event, clearing the cached report point data.
The device 6 further comprises: and the exception processing module (not shown) is used for adjusting the predicted report point data based on the boundary of the touch screen if the corresponding touch point position of the predicted report point data on the touch screen exceeds the boundary of the touch screen.
The device 6 further comprises: a rollback protection processing module (not shown) for determining a sliding track direction according to all received report data; and if the position of the touch point corresponding to the predicted report point data on the touch screen is opposite to the sliding track direction, adjusting the predicted report point data.
In the foregoing embodiments, specific implementations of each module in the sliding response control device 6 of the touch screen may refer to the detailed descriptions in the foregoing method embodiments, which are not repeated herein.
An embodiment of the present application provides an electronic device comprising a memory and a processor, where the memory is configured to store information including program instructions and the processor is configured to control execution of the program instructions, and where the program instructions, when loaded and executed by the processor, cause the electronic device to execute the sliding response control method of the touch screen described above.
An embodiment of the present application provides a storage medium comprising a stored program, where the program, when run, controls the device in which the storage medium is located to execute the sliding response control method of the touch screen described above.
It is to be understood that some or all of the steps or operations in the above embodiments are merely examples, and that embodiments of the present application may also perform other operations or variations of the various operations. Furthermore, the steps may be performed in an order different from that presented in the above embodiments, and it is possible that not all of the operations in the above embodiments need to be performed.
An embodiment of the present application also provides a computer-readable storage medium having stored therein a computer program which, when run on a computer, causes the computer to execute the method described in the above embodiments.
Furthermore, an embodiment of the present application provides a computer program product comprising a computer program which, when run on a computer, causes the computer to execute the method described in the above embodiments.
In the above embodiments, the implementation may be entirely or partly by software, hardware, firmware, or any combination thereof. When implemented in software, it may take the form, in whole or in part, of a computer program product, which comprises one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wired means (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)).
Claims (26)
1. A sliding response control method for a touch screen, the method comprising:
judging whether to carry out reporting point prediction or not when current reporting point data is received in the current reporting point period; if yes, judging whether the current report data meets a preset condition or not; wherein the preset condition includes that the current report point data is generated by a MOVE event;
If the preset condition is met, predicting the next report data based on the cached report data to obtain predicted report data;
Replacing the current report point data with the predicted report point data in the current report point period, and reporting the predicted report point data to a foreground application, so that the foreground application triggers a sliding response operation in advance;
predicting the next report data based on the cached report data to obtain predicted report data comprises:
Calculating the acceleration of the current report data at the corresponding touch point position on the touch screen according to the sliding speed and the report period of the cached report data at the corresponding touch point position on the touch screen;
determining a predicted position track of the next report point data on the touch screen according to a prediction time interval, wherein the predicted position track is an arc centered on the current report point data, and the radius of the arc is determined based on the formula s = v0·t + (1/2)·a·t², where s is the radius of the arc, v0 is the sliding speed of the current report point data, t is the prediction time interval, and a is the acceleration of the current report point data;
Determining a predicted position of the next report data on the predicted position track based on the acceleration of the current report data on the touch screen corresponding to the touch point position;
determining the predicted report point data based on the predicted position;
The predicted time interval refers to the time from the current report data to the predicted position when the current report data slides to the corresponding touch point position on the touch screen.
2. The method of claim 1, wherein the determining whether to perform the point prediction comprises:
determining the current sliding speed according to the current report point data and the cached report point data; the current sliding speed is the sliding speed of the sampling time corresponding to the current report data;
judging whether the current sliding speed is within a preset sliding speed range or not;
If yes, determining to carry out report point prediction; if not, determining not to carry out the reporting point prediction.
3. The method of claim 1 or 2, wherein predicting the next report data based on the cached report data to obtain predicted report data comprises:
determining a prediction algorithm according to the current application scene;
Determining the predicted report point data based on the prediction algorithm; wherein the predictive algorithm comprises a machine learning based algorithm.
4. The method of claim 1, wherein the predicted time interval is determined based on a current application scenario.
5. The method of claim 1, wherein the determining whether to perform the point prediction comprises: and judging whether the current application scene is an application scene suitable for reporting point prediction.
6. The method of claim 1, wherein the determining whether the preset condition is met further comprises:
and judging whether the number of the cached report point data meets the requirement of report point prediction.
7. The method as recited in claim 1, further comprising:
And when the current report point period receives the current report point data, if the report point prediction is not performed, reporting the current report point data to the foreground application in the current report point period.
8. The method of claim 1, wherein replacing the current reporting point data with the predicted reporting point data for the current reporting point period comprises: and storing the current report data into a designated area for the foreground application to acquire.
9. The method as recited in claim 8, further comprising: and the foreground application distributes the acquired current report data to a window control of the foreground application, so that the window control triggers the foreground application to respond to sliding operation in advance according to the current report data.
10. The method as recited in claim 1, further comprising:
if the current report point data is generated by a DOWN event or a MOVE event, caching the current report point data;
and if the current report point data is generated by the UP event, clearing the cached report point data.
11. The method as recited in claim 1, further comprising:
and if the corresponding touch point position of the predicted report point data on the touch screen exceeds the boundary of the touch screen, adjusting the predicted report point data based on the boundary of the touch screen.
12. The method as recited in claim 1, further comprising:
determining the sliding track direction according to the cached report data;
and if the position of the touch point corresponding to the predicted report point data on the touch screen is opposite to the sliding track direction, adjusting the predicted report point data.
13. A sliding response control device for a touch screen, the device comprising:
The report point prediction determining module is used for judging whether to carry out report point prediction or not when the current report point data is received in the current report point period;
the preset condition judging module is used for judging whether the current report data meets preset conditions or not if the judgment result of the report prediction determining module is yes; wherein the preset condition includes that the current report point data is generated by a MOVE event;
the prediction report point data algorithm module is used for predicting the next report point data based on the cached report point data to obtain prediction report point data if the judgment result of the report point prediction determination module meets the preset condition;
The report point data reporting module is used for replacing the current report point data with the predicted report point data in the current report point period and reporting the predicted report point data to a foreground application so that the foreground application triggers a sliding response operation in advance;
The predictive point data algorithm module comprises:
The acceleration determining unit is used for calculating the acceleration of the current report data at the corresponding touch point position on the touch screen according to the sliding speed and the report period of each cached report data at the corresponding touch point position on the touch screen;
The predicted position track determining unit is configured to determine a predicted position track of the next report point data on the touch screen according to a prediction time interval, wherein the predicted position track is an arc centered on the current report point data, and the radius of the arc is determined based on the formula s = v0·t + (1/2)·a·t², where s is the radius of the arc, v0 is the sliding speed of the current report point data, t is the prediction time interval, and a is the acceleration of the current report point data;
the predicted position determining unit is used for determining the predicted position of the next report data on the predicted position track based on the acceleration of the current report data on the corresponding touch point position on the touch screen;
A predicted report point data determining unit configured to determine the predicted report point data based on the predicted position;
The predicted time interval refers to the time from the current report data to the predicted position when the current report data slides to the corresponding touch point position on the touch screen.
14. The apparatus of claim 13, wherein the point prediction determination module comprises:
The sliding speed determining unit is used for determining the current sliding speed according to the current report point data and the cached report point data; the current sliding speed is the sliding speed of the sampling time corresponding to the current report data;
A sliding speed judging unit for judging whether the current sliding speed is within a preset sliding speed range;
The report point prediction judging unit is used for determining to carry out report point prediction if the judging result of the sliding speed judging unit is yes; and if the judgment result of the sliding speed judging unit is negative, determining that the reporting prediction is not performed.
15. The apparatus of claim 13 or 14, wherein the predictive point data algorithm module comprises:
the prediction algorithm determining unit is used for determining a prediction algorithm according to the current application scene;
The prediction report point data determining unit is used for determining the prediction report point data based on the prediction algorithm; wherein the predictive algorithm comprises a machine learning based algorithm.
16. The apparatus of claim 13, wherein the predicted time interval is determined based on a current application scenario.
17. The apparatus of claim 13, wherein the point prediction determination module is further configured to determine whether the current application scenario is an application scenario suitable for point prediction.
18. The apparatus of claim 13, wherein the preset condition determining module is further configured to determine whether the number of cached reporting point data meets a reporting point prediction requirement.
19. The apparatus of claim 13, wherein the reporting module is further configured to: and when the current report point period receives the current report point data, if the report point prediction is not performed, reporting the current report point data to the foreground application in the current report point period.
20. The apparatus of claim 13, wherein the report data reporting module is further configured to store the current report data in a designated area for acquisition by the foreground application.
21. The apparatus of claim 13, wherein the report data reporting module is further configured to control the foreground application to distribute the obtained current report data to a window control of the foreground application, so that the window control triggers the foreground application to respond to a sliding operation in advance according to the current report data.
22. The apparatus as recited in claim 13, further comprising: the buffer processing module is used for buffering the current report point data if the current report point data is generated by a DOWN event or a MOVE event; and if the current report point data is generated by the UP event, clearing the cached report point data.
23. The apparatus of claim 13, further comprising an exception handling module to adjust the predicted point data based on boundaries of the touch screen if a corresponding touch point location of the predicted point data on the touch screen exceeds the boundaries of the touch screen.
24. The apparatus of claim 13, further comprising a rollback protection processing module configured to determine a sliding track direction based on all received respective report data; and if the position of the touch point corresponding to the predicted report point data on the touch screen is opposite to the sliding track direction, adjusting the predicted report point data.
25. An electronic device comprising a memory for storing information including program instructions and a processor for controlling execution of the program instructions, wherein the program instructions, when loaded and executed by the processor, cause the electronic device to perform the method of any one of claims 1 to 12.
26. A storage medium, comprising: the storage medium comprising a stored program, characterized in that the program, when run, controls a device in which the storage medium is located to perform the method of any one of claims 1 to 12.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010219561.6A (CN113448482B) | 2020-03-25 | 2020-03-25 | Sliding response control method and device of touch screen and electronic equipment
PCT/CN2021/080095 (WO2021190314A1) | 2020-03-25 | 2021-03-11 | Sliding response control method and apparatus of touch screen, and electronic device
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010219561.6A (CN113448482B) | 2020-03-25 | 2020-03-25 | Sliding response control method and device of touch screen and electronic equipment
Publications (2)
Publication Number | Publication Date |
---|---|
CN113448482A CN113448482A (en) | 2021-09-28 |
CN113448482B true CN113448482B (en) | 2024-09-13 |
Family
ID=77806798
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010219561.6A (CN113448482B, active) | Sliding response control method and device of touch screen and electronic equipment | 2020-03-25 | 2020-03-25
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113448482B (en) |
WO (1) | WO2021190314A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114115589A (en) * | 2021-10-08 | 2022-03-01 | 北京小米移动软件有限公司 | Point reporting information processing method, device, terminal and storage medium |
CN114466006B (en) * | 2021-12-22 | 2024-01-02 | 天翼云科技有限公司 | Touch screen information sending and responding method and device |
CN115328345A (en) * | 2022-04-19 | 2022-11-11 | 天津先楫半导体科技有限公司 | Method, system, equipment and medium for refreshing display control |
CN116521018B (en) * | 2023-07-04 | 2023-10-20 | 荣耀终端有限公司 | False touch prompting method, terminal equipment and storage medium |
CN117692483B (en) * | 2023-08-18 | 2024-09-24 | 荣耀终端有限公司 | Input event transmission method, electronic equipment and storage medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10452188B2 (en) * | 2012-01-13 | 2019-10-22 | Microsoft Technology Licensing, Llc | Predictive compensation for a latency of an input device |
CN102929433A (en) * | 2012-11-06 | 2013-02-13 | 山东大学 | Method for reducing dragging delay on embedded device through contact prediction |
US20150153890A1 (en) * | 2013-12-03 | 2015-06-04 | Elwha Llc | Compensating for a latency in displaying a portion of a hand-initiated movement |
CN104035714B (en) * | 2014-06-24 | 2017-05-03 | 中科创达软件股份有限公司 | Event processing method, device and equipment based on Android system |
US10514799B2 (en) * | 2016-09-08 | 2019-12-24 | Google Llc | Deep machine learning to perform touch motion prediction |
- 2020-03-25: CN application CN202010219561.6A filed; granted as CN113448482B (active)
- 2021-03-11: PCT application PCT/CN2021/080095 filed; published as WO2021190314A1
Also Published As
Publication number | Publication date |
---|---|
CN113448482A (en) | 2021-09-28 |
WO2021190314A1 (en) | 2021-09-30 |
Legal Events

Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant