CN109298791B - Terminal control method, device, storage medium and mobile terminal - Google Patents
- Publication number: CN109298791B (application CN201811244092.2A)
- Authority: CN (China)
- Prior art keywords: mobile terminal, ultrasonic signal, user, limb, user limb
- Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Abstract
The embodiment of the application discloses a terminal control method, a terminal control device, a storage medium and a mobile terminal. The method comprises the following steps: when it is detected that an object is close to the mobile terminal, judging whether the object is a user limb; when the object is a user limb, controlling an ultrasonic signal transmitter of the mobile terminal to transmit an ultrasonic signal and controlling an ultrasonic signal receiver to receive the signal reflected off the user limb; determining the movement track of the user limb according to the transmitted ultrasonic signal and the received reflected signal; and generating a control instruction according to the movement track of the user limb and executing the control instruction. With this scheme, the movement track of a user limb close to the mobile terminal is obtained based on the reflection characteristics of the ultrasonic signal, and a corresponding control instruction is generated to control the mobile terminal. This adds a control mode for the mobile terminal that is convenient, fast and highly flexible.
Description
Technical Field
The embodiment of the application relates to the technical field of mobile terminals, in particular to a terminal control method, a terminal control device, a storage medium and a mobile terminal.
Background
With the continuous development of intelligent terminals such as mobile phones, intelligent terminals are widely used in users' daily life and work, and users' requirements for them keep rising.
At present, an intelligent terminal provides many functions that can be controlled through touch gestures. However, a touch gesture generally requires contact with the body of the mobile terminal, and touch gestures become increasingly complex as the number of functions grows.
Disclosure of Invention
The embodiment of the application provides a terminal control method, a terminal control device, a storage medium and a mobile terminal, which improve the flexibility and intelligence of terminal control.
In a first aspect, an embodiment of the present application provides a terminal control method, which is applied to a mobile terminal, where the mobile terminal includes an ultrasonic signal transmitter and an ultrasonic signal receiver, and includes:
when detecting that an object is close to the mobile terminal, judging whether the object is a user limb;
when the object is a user limb, controlling an ultrasonic signal transmitter of the mobile terminal to transmit an ultrasonic signal, and controlling an ultrasonic signal receiver to receive a reflected signal of the user limb to the ultrasonic signal;
determining the movement track of the user limb according to the transmitted ultrasonic signal and the received reflected signal;
and generating a control instruction according to the movement track of the user limb, and executing the control instruction.
In a second aspect, an embodiment of the present application provides a terminal control apparatus, including:
the user limb judging module is used for judging whether the object is a user limb or not when detecting that the object is close to the mobile terminal;
the ultrasonic signal transmitting module is used for controlling an ultrasonic signal transmitter of the mobile terminal to transmit an ultrasonic signal and controlling the ultrasonic signal receiver to receive a reflected signal of the user limb to the ultrasonic signal when the object is the user limb;
the mobile track determining module is used for determining the mobile track of the user limb according to the transmitted ultrasonic signal and the received reflected signal;
and the control instruction generating module is used for generating a control instruction according to the movement track of the user limb and executing the control instruction.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements a terminal control method according to an embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a mobile terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the terminal control method according to the embodiment of the present application.
According to the terminal control method provided by the embodiment of the application, when it is detected that an object is close to the mobile terminal, whether the object is a user limb is judged; when the object is a user limb, the ultrasonic signal transmitter of the mobile terminal is controlled to transmit an ultrasonic signal and the ultrasonic signal receiver is controlled to receive the signal reflected off the user limb; the movement track of the user limb is determined according to the transmitted ultrasonic signal and the received reflected signal; and a control instruction is generated according to the movement track of the user limb and executed. With this scheme, the movement track of a user limb close to the mobile terminal is obtained based on the reflection characteristics of the ultrasonic signal, and a corresponding control instruction is generated to control the mobile terminal. This adds a control mode for the mobile terminal that is convenient, fast and highly flexible.
Drawings
Fig. 1 is a schematic flowchart of a terminal control method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of another terminal control method according to an embodiment of the present application;
fig. 3 is a schematic diagram of a sound field of a mobile terminal according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another terminal control method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal control device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of another mobile terminal according to an embodiment of the present application.
Detailed Description
The technical solution of the application is further explained below through specific embodiments with reference to the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the application and do not limit it. It should be further noted that, for convenience of description, the drawings show only the structures related to the present application rather than all structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Fig. 1 is a schematic flowchart of a terminal control method provided in an embodiment of the present application. The method may be performed by a terminal control apparatus, which may be implemented by software and/or hardware and may generally be integrated in a mobile terminal. As shown in fig. 1, the method includes:
Step 101: when detecting that an object is close to the mobile terminal, judging whether the object is a user limb.
Step 102: when the object is a user limb, controlling an ultrasonic signal transmitter of the mobile terminal to transmit an ultrasonic signal, and controlling an ultrasonic signal receiver to receive a reflected signal of the user limb to the ultrasonic signal.
Step 103: determining the movement track of the user limb according to the transmitted ultrasonic signal and the received reflected signal.
Step 104: generating a control instruction according to the movement track of the user limb, and executing the control instruction.
For example, the mobile terminal in the embodiment of the present application may include a smart device such as a mobile phone and a tablet computer, which is configured with an ultrasonic signal transmitter and an ultrasonic signal receiver.
Whether an object is close to the mobile terminal can be monitored by an infrared sensor, a distance sensor or an ultrasonic transmitter. For example, infrared sensors provided on the front and rear surfaces of the mobile terminal emit infrared rays; when an infrared sensor switches from receiving no reflected signal to receiving a reflected signal, it is determined that an object is approaching the mobile terminal. As another example, an ultrasonic transmitter provided in the mobile terminal transmits an ultrasonic signal; if a reflected signal of the ultrasonic signal is received and the object distance determined from the time difference between transmission and reception keeps decreasing, it is determined that the object is approaching the mobile terminal. In some embodiments, detecting that an object is near the mobile terminal includes: forming a sound field based on an ultrasonic signal transmitted by the ultrasonic signal transmitter of the mobile terminal; and monitoring the sound field of the mobile terminal and determining that an object is close to the mobile terminal when the sound field changes. When no object enters the sound field, the sound field formed by the ultrasonic waves is a sphere centred on the ultrasonic signal transmitter; since the transmitter is arranged on the mobile terminal, the sound field can be regarded as a sphere centred on the mobile terminal. When an object enters the sound field, the sound field is deformed into an irregular sphere, and different objects affect the sound field to different degrees. In this embodiment, the sound field of the mobile terminal may be monitored in real time, and when the sound field changes, the position change of the object near the mobile terminal can be determined. Further, whether the object is approaching the mobile terminal is determined from the trend of the sound field change. Illustratively, when the sound field changes from a regular sphere to an irregular sphere, it is determined that an object is approaching the mobile terminal; when the sound field changes back from an irregular sphere to a regular sphere, it is determined that the object is moving away from the mobile terminal.
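As a rough Python sketch of the time-of-flight variant described above (an object is treated as approaching when the ultrasonic echo distance keeps shrinking), consider the following illustration; the function names, constants and sample values are assumptions and are not taken from the patent.

```python
# Minimal sketch (not from the patent text): approach detection from
# successive ultrasonic time-of-flight measurements.

SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 degrees C

def echo_distance(t_transmit_s: float, t_receive_s: float) -> float:
    """Round-trip time of flight converted to a one-way distance in metres."""
    return (t_receive_s - t_transmit_s) * SPEED_OF_SOUND_M_S / 2.0

def object_approaching(distances: list[float]) -> bool:
    """Treat the object as approaching when successive distance estimates
    are strictly decreasing, as described in the embodiment above."""
    return len(distances) >= 2 and all(b < a for a, b in zip(distances, distances[1:]))

# Three consecutive echoes, each received a little sooner after transmission.
samples = [echo_distance(0.0, 0.0030),
           echo_distance(0.1, 0.1024),
           echo_distance(0.2, 0.2018)]
print(object_approaching(samples))  # -> True
```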
In this embodiment, when it is determined that an object is close to the mobile terminal, the type of the object is determined. When the object is a user limb, the movement track of the user limb is collected so that a control instruction can be generated from it; when the object is something other than a user limb (for example, another mobile phone, a desk or a wall), the mobile terminal performs no processing, which avoids incorrect operation of the mobile terminal triggered by other objects and improves the control accuracy of the mobile terminal. The user limb may be, but is not limited to, the user's hand, head or arm.
In this embodiment, when it is determined that the object is a limb of the user, the ultrasonic signal transmitter and the ultrasonic signal receiver are activated to transmit the ultrasonic signal and receive a reflected signal of the ultrasonic signal. The relative spatial position of the user limb and the mobile terminal at each moment can be determined according to the transmitted ultrasonic signal and the received reflected signal, and the relative spatial positions at each moment are combined according to time to form the movement track of the user limb. For example, the relative spatial position of the user limb and the mobile terminal may be represented in the form of three-dimensional spatial coordinates, for example, the three-dimensional spatial coordinates may be represented by spherical coordinates with the ultrasonic signal transmitter as a sphere center, and further, the movement track of the user limb is a three-dimensional track generated by connecting a plurality of spatial coordinates. Optionally, when the object is determined to be a user limb in a preset time period after the user limb approaches the mobile terminal, the relative spatial position of the user limb and the mobile terminal is determined in real time, and the movement track of the user limb is further obtained. For example, the movement track of the limb of the user can be close to the screen of the mobile terminal in the vertical direction or move in the horizontal direction.
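As an illustration of how the per-moment relative positions could be combined into a three-dimensional track, the sketch below uses spherical coordinates centred on the ultrasonic signal transmitter, as mentioned above; the class and function names are assumptions rather than the patent's implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class LimbSample:
    """One sampled position of the user limb, in spherical coordinates
    with the ultrasonic signal transmitter assumed to be the sphere centre."""
    t: float          # timestamp in seconds
    r: float          # radial distance in metres
    azimuth: float    # radians, in the plane of the screen
    elevation: float  # radians, out of the screen plane

def to_cartesian(s: LimbSample) -> tuple[float, float, float]:
    x = s.r * math.cos(s.elevation) * math.cos(s.azimuth)
    y = s.r * math.cos(s.elevation) * math.sin(s.azimuth)
    z = s.r * math.sin(s.elevation)
    return (x, y, z)

def movement_track(samples: list[LimbSample]) -> list[tuple[float, float, float]]:
    """Connect the per-moment positions in time order into a 3-D track."""
    return [to_cartesian(s) for s in sorted(samples, key=lambda s: s.t)]

track = movement_track([
    LimbSample(t=0.0, r=0.30, azimuth=0.0, elevation=1.2),
    LimbSample(t=0.1, r=0.20, azimuth=0.0, elevation=1.2),
    LimbSample(t=0.2, r=0.10, azimuth=0.0, elevation=1.2),
])
print(track)  # a limb approaching the terminal along a fixed direction
```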
In some embodiments, when a limb of a user enters a track acquisition range of the mobile terminal, a movement track of the limb of the user is acquired, wherein the track acquisition range of the mobile terminal may be a range of 30cm from the mobile terminal, so as to avoid misoperation caused by limb movement of other users.
It should be noted that the mobile terminal includes a preset instruction database containing a plurality of preset gestures, where each preset gesture describes a spatial movement track of a user limb relative to the mobile terminal and a correspondence is established between each preset gesture and a control instruction. Optionally, generating a control instruction according to the movement track of the user limb includes: matching the movement track of the user limb with the preset gestures in the preset instruction database, and when the matching is successful, calling the control instruction corresponding to the successfully matched preset gesture. The preset gestures in the preset instruction database may be set by the system or set by the user according to personal needs, and the user may edit them according to real-time requirements. When the movement trend of the movement track of the user limb is the same as that of a preset gesture, and the direction of the user limb relative to the mobile terminal in the movement track is the same as the direction of the preset gesture relative to the mobile terminal, it is determined that the movement track of the user limb matches the preset gesture successfully. Illustratively, the control instructions may include, but are not limited to, volume adjustment instructions, image capture instructions, screen highlight instructions, incoming call processing instructions and the like.
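A minimal sketch of the matching rule just described (matching succeeds when both the movement trend and the direction relative to the mobile terminal agree with a preset gesture) might look as follows; the database entries, field names and instruction names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PresetGesture:
    trend: str        # e.g. "toward_terminal", "away_from_terminal", "horizontal"
    side: str         # face of the terminal the limb moves relative to: "screen" or "rear_shell"
    instruction: str  # control instruction bound to this gesture

# Hypothetical preset instruction database; in the patent it may be
# system-defined or edited by the user.
PRESET_DB = [
    PresetGesture("toward_terminal", "screen", "volume_down"),
    PresetGesture("away_from_terminal", "screen", "volume_up"),
    PresetGesture("toward_terminal", "rear_shell", "take_photo"),
]

def match_instruction(trend: str, side: str) -> Optional[str]:
    """Matching succeeds only when both the movement trend and the
    direction relative to the terminal agree with a preset gesture."""
    for gesture in PRESET_DB:
        if gesture.trend == trend and gesture.side == side:
            return gesture.instruction
    return None

print(match_instruction("toward_terminal", "screen"))  # -> "volume_down"
```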
For example, the first preset gesture may be performed facing the screen of the mobile terminal and approaching the screen in the vertical direction, while the second preset gesture may be performed facing the rear shell of the mobile terminal. Because the movement tracks input by the user differ in their direction relative to the mobile terminal, the first preset gesture and the second preset gesture are two different gestures and may correspond to different control instructions, which supports a diversity of control instructions. In some embodiments, several preset gestures may correspond to the same control instruction; for example, the first preset gesture and the second preset gesture may both correspond to a volume adjustment instruction, which improves the convenience of user operation and relaxes the constraints on how the user inputs a movement track.
In some embodiments, when the mobile terminal is in an audio playing state, if the movement trajectory of the user's limb is a movement trajectory along a vertical direction of the mobile terminal, the generated control instruction is a volume adjustment instruction. The audio playing state may be, but is not limited to, playing music through a music player, and may also be an alarm clock ringing or an incoming call ringing, etc. For example, in an alarm clock ringing state, when it is detected that a user limb approaches the mobile terminal, the ultrasonic signal transmitter of the mobile terminal is controlled to transmit an ultrasonic signal within a preset time period (15 seconds or 30 seconds), the ultrasonic signal receiver is controlled to receive a reflection signal of the user limb to the ultrasonic signal, and a movement track of the user limb is determined according to the transmitted ultrasonic signal and the received reflection signal. And matching the movement track of the user limb with a preset gesture in a preset command database, generating a volume adjusting command when the movement track of the user limb is successfully matched with the preset gesture in the vertical direction of the mobile terminal, executing the volume adjusting command, and adjusting the volume of the alarm clock ringing. For example, when the movement track of the limbs of the user is close to the mobile terminal along the vertical direction of the mobile terminal, a volume turning-down instruction is generated to reduce the volume of the alarm clock ringing, and when the limbs of the user contact the mobile terminal, the volume of the alarm clock ringing is adjusted to be mute; and when the movement track of the limbs of the user is far away from the mobile terminal along the vertical direction of the mobile terminal, generating a volume increasing instruction so as to increase the volume of the alarm clock ringing.
Similarly, when the music player is playing music or an incoming call is ringing, if the movement track of the user limb is a movement track along the vertical direction of the mobile terminal, a volume adjustment instruction is generated to adjust the volume of the music or the incoming call ring.
In some embodiments, when the mobile terminal is in an incoming call state and when it is detected that a user limb is close to the mobile terminal, the ultrasonic signal transmitter of the mobile terminal is controlled to transmit an ultrasonic signal, the ultrasonic signal receiver is controlled to receive a reflected signal of the user limb to the ultrasonic signal, and a movement track of the user limb is determined according to the transmitted ultrasonic signal and the received reflected signal. When the movement track of the user limb is the movement track along the horizontal direction of the mobile terminal, points to the answering key direction and is matched with a preset gesture corresponding to the incoming call answering instruction, the generated control instruction is the incoming call answering instruction, the incoming call answering instruction is executed, and the incoming call is connected; and if the movement track of the user limb is the movement track along the horizontal direction of the mobile terminal, points to the hang-up key direction and is matched with the preset gesture corresponding to the incoming call hang-up instruction, the generated control instruction is the incoming call hang-up instruction, and the incoming call hang-up instruction is executed to hang up the incoming call.
Optionally, the same movement track may correspond to different control instructions. After the movement track of the user limb is obtained, the current state of the mobile terminal is determined, and the control instruction is generated according to the current state and the movement track. Illustratively, if the current state of the mobile terminal is a photographing state and the obtained movement track approaches the mobile terminal along its vertical direction, a photographing instruction may be generated; if the current state of the mobile terminal is a ringing state and the obtained movement track approaches the mobile terminal along its vertical direction, a volume reduction instruction may be generated.
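The state-dependent interpretation of one and the same trajectory can be pictured as a lookup keyed by the pair (current state, trajectory); the table below is purely illustrative and its state and instruction names are assumptions.

```python
from typing import Optional

# Hypothetical mapping: the same vertical "approach" trajectory produces
# different control instructions depending on the terminal's current state.
STATE_TRAJECTORY_TABLE = {
    ("photographing", "vertical_approach"): "capture_photo",
    ("ringing", "vertical_approach"): "volume_down",
    ("ringing", "vertical_retreat"): "volume_up",
    ("incoming_call", "horizontal_toward_answer_key"): "answer_call",
    ("incoming_call", "horizontal_toward_hangup_key"): "hang_up_call",
}

def generate_instruction(current_state: str, trajectory: str) -> Optional[str]:
    """Return the control instruction for this state/trajectory pair,
    or None when the trajectory should simply be ignored."""
    return STATE_TRAJECTORY_TABLE.get((current_state, trajectory))

print(generate_instruction("ringing", "vertical_approach"))        # -> "volume_down"
print(generate_instruction("photographing", "vertical_approach"))  # -> "capture_photo"
```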
According to the terminal control method provided by the embodiment of the application, when it is detected that an object is close to the mobile terminal, whether the object is a user limb is judged; when the object is a user limb, the ultrasonic signal transmitter of the mobile terminal is controlled to transmit an ultrasonic signal and the ultrasonic signal receiver is controlled to receive the signal reflected off the user limb; the movement track of the user limb is determined according to the transmitted ultrasonic signal and the received reflected signal; and a control instruction is generated according to the movement track of the user limb and executed. With this scheme, the movement track of a user limb close to the mobile terminal is obtained based on the reflection characteristics of the ultrasonic signal, and a corresponding control instruction is generated to control the mobile terminal. This adds a control mode for the mobile terminal that is convenient, fast and highly flexible.
Fig. 2 is a schematic flowchart of another terminal control method provided in the embodiment of the present application, and referring to fig. 2, the method of the embodiment includes the following steps:
Step 203: generating ultrasonic signal absorption data of the object according to the reflection signals of the ultrasonic signals of the frequencies.
Step 204: when the ultrasonic signal absorption data of the object are matched with preset human body ultrasonic signal absorption data, determining that the object is the limb of the user.
Step 206: generating a control instruction according to the movement track of the user limb, and executing the control instruction.
In step 202, when it is determined that the object is close to the mobile terminal, the ultrasonic signal transmitter may be controlled to transmit a spherical or hemispherical ultrasonic signal, the type of the object may be detected, and the movement track of the object may be acquired. Optionally, the direction of the object relative to the mobile terminal is determined according to the change data of the sound field. Exemplarily, referring to fig. 3, fig. 3 is a schematic sound field diagram of a mobile terminal provided in an embodiment of the present application, where the sound field in fig. 3 is only a schematic plan view. When an object approaches, irregular data changes such as recesses or protrusions appear in the sound field, and the direction of the object relative to the mobile terminal is determined from the position of the data change. For example, when the data change of the sound field is located in the vertical direction of the central position of the front surface of the mobile terminal (e.g., the surface on which the screen is disposed), it may be determined that the object lies in the vertical direction of the front surface of the mobile terminal. After the direction of the object relative to the mobile terminal is determined, an ultrasonic signal is transmitted in that direction, the type of the object is detected, and the movement track of the object is acquired. This avoids the waste caused by transmitting invalid ultrasonic signals in all directions and the increase in computation caused by processing invalid data.
In the present embodiment, different types of objects differ in their absorption ability for ultrasonic signals, and the type of object is detected based on the above-described principle. The method comprises the steps of obtaining absorption data of an object on ultrasonic signals under various frequencies to form a variation curve of the absorption data along with the frequency, respectively matching the variation curve of the absorption data of the object along with the frequency with a preset standard variation curve of the absorption data of various types of objects along with the frequency, and determining the type of the object. The ultrasonic signal absorption data may be displayed in a form of a curve or a form of a table, which is not limited in this embodiment. For example, when the ultrasonic signal absorption data of the object matches with the preset human body ultrasonic signal absorption data, the object is determined to be the user's limb. Optionally, the generating ultrasonic signal absorption data of the object according to the reflected signal of the ultrasonic signal of each frequency includes: for the ultrasonic signal of any frequency, determining the absorption value of the ultrasonic signal of the object at the frequency according to the energy difference of the transmitted ultrasonic signal and the reflected signal; and counting the absorption value of the object to the ultrasonic signal under each frequency to generate ultrasonic signal absorption data of the object.
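A minimal sketch of this frequency-dependent absorption check is given below, assuming invented frequencies, an invented reference human-absorption profile and an invented matching tolerance; none of these values come from the patent.

```python
# Sketch of the absorption-based limb check described above; all numbers
# are illustrative assumptions.

HUMAN_ABSORPTION_PROFILE = {  # kHz -> expected absorbed fraction of energy
    40: 0.62,
    60: 0.68,
    80: 0.74,
}
TOLERANCE = 0.05

def absorption_value(tx_energy: float, rx_energy: float) -> float:
    """Absorption at one frequency: fraction of transmitted energy not reflected."""
    return (tx_energy - rx_energy) / tx_energy

def absorption_data(measurements: dict[int, tuple[float, float]]) -> dict[int, float]:
    """measurements: frequency (kHz) -> (transmitted energy, reflected energy)."""
    return {f: absorption_value(tx, rx) for f, (tx, rx) in measurements.items()}

def is_user_limb(data: dict[int, float]) -> bool:
    """The object counts as a user limb when every frequency in the reference
    profile was measured and lies within TOLERANCE of the human profile."""
    return all(f in data for f in HUMAN_ABSORPTION_PROFILE) and all(
        abs(data[f] - ref) <= TOLERANCE
        for f, ref in HUMAN_ABSORPTION_PROFILE.items()
    )

measured = absorption_data({40: (1.0, 0.40), 60: (1.0, 0.33), 80: (1.0, 0.27)})
print(is_user_limb(measured))  # -> True with these illustrative numbers
```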
In the embodiment, when the object close to the mobile terminal is determined to be the limb of the user, the ultrasonic signal transmitter is controlled to continuously transmit the ultrasonic signal, and the ultrasonic signal receiver is controlled to continuously receive the reflected signal of the ultrasonic signal, so as to determine the movement track of the limb of the user.
According to the terminal control method provided by the embodiment of the application, the ultrasonic signal is transmitted to the object, the type of the object is determined according to the energy absorption condition of the object on the ultrasonic signal, whether the object close to the mobile terminal is a user limb or not is accurately judged, misoperation caused by the fact that other objects are close to the mobile terminal is avoided, and the control precision of the mobile terminal is improved. Furthermore, a corresponding control instruction is generated according to the movement track of the user limb to control the mobile terminal, so that the control mode of the mobile terminal is increased, the operation is convenient and fast, and the flexibility is high.
Fig. 4 is a schematic flow chart of another terminal control method provided in an embodiment of the present application, where the present embodiment is an alternative to the foregoing embodiment, and correspondingly, as shown in fig. 4, the method of the present embodiment includes the following steps:
Step 406: generating a movement track of the user limb according to a plurality of spatial positions of the user limb relative to the mobile terminal within a preset time when the user limb is detected to be close to the mobile terminal.
Step 408: determining the type of the user limb according to the outline of the user limb, and screening the preset instruction database according to the type of the user limb.
Step 409: matching the movement track of the user limb with the preset gestures corresponding to the screened preset instruction data, and when the matching is successful, calling the control instruction corresponding to the successfully matched preset gesture. The preset instruction database comprises preset gestures and control instructions corresponding to the preset gestures.
In this embodiment, a matrix type ultrasonic signal receiver may be provided, wherein each sub-ultrasonic signal receiver may be configured to receive a reflected signal of an ultrasonic signal corresponding to the ultrasonic signal transmission. Based on the reflection characteristic of the ultrasonic signal, the propagation time of the ultrasonic signal in the air can be determined according to the transmitting time of the ultrasonic signal and the receiving time of the reflection signal, and based on the propagation speed of the ultrasonic signal in the air, the distance parameter between the limb of the user and the mobile terminal can be determined. When the user limb is positioned in the front direction of the mobile terminal (the direction opposite to the screen of the mobile terminal), the obtained distance parameter is set to be a positive number, and when the user limb is positioned in the back direction of the mobile terminal (the direction opposite to the rear shell of the mobile terminal), the obtained distance parameter is set to be a negative number so as to distinguish the directions.
A plane coordinate system parallel to the screen of the mobile terminal is established; for example, the coordinate system may be established with the lower left corner of the screen as the origin, the short side of the screen as the horizontal axis and the long side of the screen as the vertical axis. The matrix type ultrasonic signal receiver is arranged in the mobile terminal, its layout range is the same as the screen range, and the sub ultrasonic signal receivers are uniformly distributed. The direction parameter of the user limb is determined according to the positions of the sub ultrasonic signal receivers that receive the reflected signal and the ultrasonic transmitting direction. For example, when only one sub ultrasonic signal receiver of the matrix type ultrasonic signal receiver receives a reflected signal, the direction parameter of the user limb is the position of that sub receiver in the pre-established plane coordinate system, or the corresponding ultrasonic transmitting direction; when several sub ultrasonic signal receivers receive reflected signals, the centre of their positions in the pre-established plane coordinate system, or the centre of the corresponding ultrasonic transmitting directions, is determined and taken as the direction parameter of the user limb. The distance parameter and the direction parameter of the user limb are combined to determine the spatial coordinates of the user limb, and the spatial position of the user limb relative to the mobile terminal is represented by these coordinates. The spatial positions detected within the preset time after the user limb is detected to be close to the mobile terminal are connected in time order to form the movement track of the user limb. In this embodiment, a three-dimensional spatial coordinate is formed by determining the horizontal and vertical coordinates of the user limb relative to the mobile terminal, so that the position of the user limb is represented with high precision, which improves the precision of the movement track and reduces the risk of misoperation.
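The computation described in the last two paragraphs (a signed time-of-flight distance combined with a direction taken from the centre of the sub-receivers that received the echo) could be sketched roughly as follows; the data layout and the explicit front/back flag are simplifying assumptions.

```python
from dataclasses import dataclass

SPEED_OF_SOUND_M_S = 343.0

@dataclass
class Echo:
    receiver_xy: tuple[float, float]  # sub-receiver position in the screen-plane frame (m)
    t_transmit: float                 # transmission time (s)
    t_receive: float                  # reception time (s)

def signed_distance(echo: Echo, limb_in_front: bool) -> float:
    """One-way distance; positive in front of the screen, negative behind
    the rear shell, following the sign convention described above."""
    d = (echo.t_receive - echo.t_transmit) * SPEED_OF_SOUND_M_S / 2.0
    return d if limb_in_front else -d

def limb_position(echoes: list[Echo], limb_in_front: bool = True) -> tuple[float, float, float]:
    """Direction parameter = centroid of the sub-receivers that heard an echo;
    distance parameter = mean signed time-of-flight distance."""
    cx = sum(e.receiver_xy[0] for e in echoes) / len(echoes)
    cy = sum(e.receiver_xy[1] for e in echoes) / len(echoes)
    cz = sum(signed_distance(e, limb_in_front) for e in echoes) / len(echoes)
    return (cx, cy, cz)

# Two sub-receivers near the screen centre report echoes about 1 ms after transmission.
echoes = [Echo((0.03, 0.07), 0.0, 0.0010), Echo((0.04, 0.07), 0.0, 0.0011)]
print(limb_position(echoes))  # roughly (0.035, 0.07, 0.18)
```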
The user limb may be, but is not limited to, a palm, a finger, an arm or a head, and for the same movement track the control instructions corresponding to different user limbs may differ. Illustratively, for the same movement track, the palm corresponds to a volume adjustment instruction while the head corresponds to an incoming call answering instruction. In this embodiment, the matrix type ultrasonic signal receiver receives the reflected signals, the outline of the user limb is determined from them, and the type of the user limb is determined according to the outline. Illustratively, when the mobile terminal is in an alarm clock ringing state and the movement track corresponding to the volume adjustment instruction is a track input by the user's palm along the vertical direction of the mobile terminal, if the limb inputting the vertical movement track is detected to be the user's arm, i.e. a different limb type, the alarm clock keeps ringing until a movement track input by the user's palm along the vertical direction of the mobile terminal is detected. This avoids misoperation caused by other limb actions and at the same time increases the diversity of movement tracks to support a diversity of control instructions.
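A rough sketch of the limb-type screening described here and in Step 408/Step 409 above (filter the preset instruction database by the detected limb type before matching the trajectory) follows; the entries and field names are hypothetical.

```python
from typing import Optional

# Hypothetical preset instruction database keyed by limb type and trajectory.
PRESET_DB = [
    {"limb": "palm", "trajectory": "vertical_approach", "instruction": "volume_down"},
    {"limb": "palm", "trajectory": "vertical_retreat", "instruction": "volume_up"},
    {"limb": "head", "trajectory": "vertical_approach", "instruction": "answer_call"},
]

def screen_by_limb(limb_type: str) -> list[dict]:
    """Screening step: keep only the entries registered for this limb type."""
    return [entry for entry in PRESET_DB if entry["limb"] == limb_type]

def match(limb_type: str, trajectory: str) -> Optional[str]:
    for entry in screen_by_limb(limb_type):
        if entry["trajectory"] == trajectory:
            return entry["instruction"]
    return None  # e.g. an arm gesture while only palm gestures are registered

print(match("palm", "vertical_approach"))  # -> "volume_down"
print(match("arm", "vertical_approach"))   # -> None: keep the current state
```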
It should be noted that, in the embodiment of the present application, step 407 and step 408 may be omitted, the movement trajectory of the limb of the user obtained in step 406 is matched with a preset gesture of a preset instruction database, a control instruction is determined according to a matching result, and the determined control instruction is executed. Meanwhile, step 407 and step 408 may be set before step 403, and screening may be performed in a preset instruction database according to the type of the user limb, and when there is no control instruction corresponding to the user limb type in the preset instruction database, returning to step 401, continuing to monitor whether there is an object close to the mobile terminal, and when there is a control instruction corresponding to the user limb type, continuing to execute step 403, so as to avoid resource waste caused by acquisition of an invalid movement trajectory.
In some embodiments, determining the spatial position of the user's limb may also be: determining a distance parameter between the user limb and the mobile terminal according to the transmitting time of the ultrasonic signal and the receiving time of the reflected signal; according to the position of the change data in the sound field of the mobile terminal, determining the azimuth angle and the elevation angle of the user limb relative to the position of the sound field spherical center to determine the direction parameter of the user limb, determining the spherical coordinate value of the user limb relative to the sound field spherical center according to the distance parameter and the direction parameter, and representing the space position of the user limb based on the spherical coordinate value. And connecting the detected spatial positions of the limbs of the user within the preset time when the limbs of the user are detected to be close to the mobile terminal according to the moment to form the movement track of the limbs of the user.
According to the terminal control method provided by the embodiment of the application, the movement track input by the user limb is accurately determined by determining the spatial position of the user limb relative to the mobile terminal at each moment, so that the control precision of the mobile terminal is improved. Furthermore, by determining the limb type of the user, misoperation caused by user action is avoided.
Fig. 5 is a block diagram of a terminal control apparatus according to an embodiment of the present disclosure. The apparatus may be implemented by software and/or hardware and is generally integrated in a mobile terminal, which can be controlled by executing a terminal control method. As shown in fig. 5, the apparatus includes: a user limb judging module 501, an ultrasonic signal transmitting module 502, a movement track determining module 503 and a control instruction generating module 504.
A user limb judgment module 501, configured to judge whether an object is a user limb when it is detected that the object is close to the mobile terminal;
an ultrasonic signal transmitting module 502, configured to control an ultrasonic signal transmitter of the mobile terminal to transmit an ultrasonic signal and control the ultrasonic signal receiver to receive a reflected signal of the user's limb to the ultrasonic signal when the object is the user's limb;
a movement track determining module 503, configured to determine a movement track of the user limb according to the transmitted ultrasonic signal and the received reflected signal;
and a control instruction generating module 504, configured to generate a control instruction according to the movement trajectory of the user limb, and execute the control instruction.
The terminal control device provided in the embodiment of the application acquires the movement track of the limb of the user close to the mobile terminal based on the reflection characteristic of the ultrasonic signal, generates the corresponding control instruction, controls the mobile terminal, increases the control mode of the mobile terminal, and is convenient and fast to operate and high in flexibility.
On the basis of the above embodiment, the user limb determination module 501 includes:
the ultrasonic signal transmitting unit is used for transmitting ultrasonic signals with different frequencies to the object based on an ultrasonic signal transmitter of the mobile terminal and receiving reflected signals of the ultrasonic signals with the different frequencies;
an absorption data determining unit for generating ultrasonic signal absorption data of the object according to the reflection signal of the ultrasonic signal of each frequency;
and the user limb determining unit is used for determining the object as the limb of the user when the ultrasonic signal absorption data of the object is matched with the preset human body ultrasonic signal absorption data.
On the basis of the above embodiment, the absorption data determining unit is further configured to:
for the ultrasonic signal of any frequency, determining the absorption value of the ultrasonic signal of the object at the frequency according to the energy difference of the transmitted ultrasonic signal and the reflected signal;
and counting the absorption value of the object to the ultrasonic signal under each frequency to generate ultrasonic signal absorption data of the object.
On the basis of the above embodiment, the movement track determining module 503 may include:
the distance parameter determining unit is used for determining the distance parameter between the user limb and the mobile terminal according to the transmitting time of the ultrasonic signal and the receiving time of the reflected signal;
the direction parameter determining unit is used for determining the direction parameters of the user limb relative to the mobile terminal according to the position of an ultrasonic signal receiver for receiving the reflected signal, wherein the ultrasonic signal receiver is a matrix type ultrasonic signal receiver;
the space position determining unit is used for determining the space position of the user limb relative to the mobile terminal according to the distance parameter and the direction parameter;
the mobile track determining unit is used for generating a mobile track of the user limb according to a plurality of spatial positions of the user limb relative to the mobile terminal within a preset time when the user limb is detected to be close to the mobile terminal.
On the basis of the above embodiment, the control instruction generating module 504 may include:
the mobile track matching unit is used for matching the mobile track of the user limb with a preset gesture in a preset instruction database, wherein the preset instruction database comprises the preset gesture and a control instruction corresponding to the preset gesture;
and the control instruction generating unit is used for calling the control instruction corresponding to the successfully matched preset gesture when the matching is successful.
On the basis of the above embodiment, the control instruction generating module 504 further includes:
the limb contour determining unit is used for determining the contour of the user limb according to a reflection signal received by the matrix type ultrasonic signal receiver before matching the movement track of the user limb with a preset gesture in a preset instruction database;
the limb type determining unit is used for determining the type of the limb of the user according to the outline of the limb of the user;
correspondingly, the movement track matching unit is used for:
screening in a preset instruction database according to the user limb type;
and matching the movement track of the user limb with the preset gesture corresponding to the screened preset instruction data.
On the basis of the above embodiment, the apparatus further comprises:
a sound field forming module, which is used for forming a sound field based on an ultrasonic signal transmitted by an ultrasonic signal transmitter of the mobile terminal;
and an object detection module, which is used for monitoring the sound field of the mobile terminal and determining that an object is close to the mobile terminal when the sound field changes.
On the basis of the above embodiment, when the mobile terminal is in an audio playing state, if the movement track of the user limb is a movement track along the vertical direction of the mobile terminal, the generated control instruction is a volume adjustment instruction.
On the basis of the above embodiment, when the mobile terminal is in an incoming call state, if the movement track of the user limb is a movement track along the horizontal direction of the mobile terminal and points toward the answer key, the generated control instruction is an incoming call answering instruction;
and if the movement track of the user limb is a movement track along the horizontal direction of the mobile terminal and points toward the hang-up key, the generated control instruction is an incoming call hang-up instruction.
Embodiments of the present application also provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform a terminal control method, the method including:
when detecting that an object is close to the mobile terminal, judging whether the object is a user limb;
when the object is a user limb, controlling an ultrasonic signal transmitter of the mobile terminal to transmit an ultrasonic signal, and controlling an ultrasonic signal receiver to receive a reflected signal of the user limb to the ultrasonic signal;
determining the movement track of the user limb according to the transmitted ultrasonic signal and the received reflected signal;
and generating a control instruction according to the movement track of the user limb, and executing the control instruction.
Storage medium - any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROM, floppy disk or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory or magnetic media (e.g., a hard disk or optical storage); registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the first computer system in which the program is executed, or in a different, second computer system connected to the first computer system through a network (such as the internet). The second computer system may provide program instructions to the first computer for execution. The term "storage medium" may include two or more storage media that may reside in different locations, such as in different computer systems connected by a network. The storage medium may store program instructions (e.g., embodied as a computer program) that are executable by one or more processors.
Of course, the storage medium provided in the embodiments of the present application and containing computer-executable instructions is not limited to the terminal control operation described above, and may also perform related operations in the terminal control method provided in any embodiment of the present application.
The embodiment of the application provides a mobile terminal, and the terminal control device provided by the embodiment of the application can be integrated in the mobile terminal. Fig. 6 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application. The mobile terminal 600 may include: a memory 601, a processor 602, and a computer program stored on the memory 601 and executable by the processor 602, wherein the processor 602 implements the terminal control method according to the embodiment of the application when executing the computer program.
The mobile terminal provided by the embodiment of the application acquires the movement track of the limb of the user close to the mobile terminal based on the reflection characteristic of the ultrasonic signal, generates the corresponding control instruction, controls the mobile terminal, increases the control mode of the mobile terminal, and is convenient and fast to operate and high in flexibility.
Fig. 7 is a schematic structural diagram of another mobile terminal according to an embodiment of the present application. The mobile terminal may include: a housing (not shown), a memory 701, a Central Processing Unit (CPU) 702 (also called a processor, hereinafter referred to as CPU), a circuit board (not shown), and a power circuit (not shown). The circuit board is arranged in a space enclosed by the shell; the CPU702 and the memory 701 are provided on the circuit board; the power supply circuit is used for supplying power to each circuit or device of the mobile terminal; the memory 701 is used for storing executable program codes; the CPU702 executes a computer program corresponding to the executable program code by reading the executable program code stored in the memory 701 to implement the steps of:
when detecting that an object is close to the mobile terminal, judging whether the object is a user limb;
when the object is a user limb, controlling an ultrasonic signal transmitter of the mobile terminal to transmit an ultrasonic signal, and controlling an ultrasonic signal receiver to receive a reflected signal of the user limb to the ultrasonic signal;
determining the movement track of the user limb according to the transmitted ultrasonic signal and the received reflected signal;
and generating a control instruction according to the movement track of the user limb, and executing the control instruction.
The mobile terminal further includes: a peripheral interface 703, RF (Radio Frequency) circuitry 705, audio circuitry 706, a speaker 711, a power management chip 708, an input/output (I/O) subsystem 709, other input/control devices 710, a touch screen 712 and an external port 704, which communicate via one or more communication buses or signal lines 707.
It should be understood that the illustrated mobile terminal 700 is merely one example of a mobile terminal and that the mobile terminal 700 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The following describes in detail the mobile terminal for controlling operations provided in this embodiment, taking a mobile phone as an example.
A memory 701, which is accessible by the CPU 702, the peripheral interface 703 and the like; the memory 701 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
A peripheral interface 703, said peripheral interface 703 may connect input and output peripherals of the device to the CPU702 and the memory 701.
An I/O subsystem 709, which I/O subsystem 709 may connect input and output peripherals on the device, such as a touch screen 712 and other input/control devices 710, to the peripheral interface 703. The I/O subsystem 709 may include a display controller 7091 and one or more input controllers 7092 for controlling other input/control devices 710. Where one or more input controllers 7092 receive electrical signals from or transmit electrical signals to other input/control devices 710, the other input/control devices 710 may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels. It is worth noting that the input controller 7092 may be connected to any one of the following: a keyboard, an infrared port, a USB interface, and a pointing device such as a mouse.
A touch screen 712, which is the input interface and output interface between the mobile terminal and the user, and displays visual output to the user; the visual output may include graphics, text, icons, video and the like.
The display controller 7091 in the I/O subsystem 709 receives electrical signals from the touch screen 712 or transmits electrical signals to the touch screen 712. The touch screen 712 detects a contact on the touch screen, and the display controller 7091 converts the detected contact into an interaction with a user interface object displayed on the touch screen 712, i.e., implements a human-computer interaction, and the user interface object displayed on the touch screen 712 may be an icon for running a game, an icon networked to a corresponding network, or the like. It is worth mentioning that the device may also comprise a light mouse, which is a touch sensitive surface that does not show visual output, or an extension of the touch sensitive surface formed by the touch screen.
The RF circuit 705 is mainly used to establish communication between the mobile phone and the wireless network (i.e., network side), and implement data reception and transmission between the mobile phone and the wireless network. Such as sending and receiving short messages, e-mails, etc. In particular, RF circuitry 705 receives and transmits RF signals, also referred to as electromagnetic signals, through which RF circuitry 705 converts electrical signals to or from electromagnetic signals and communicates with communication networks and other devices. RF circuitry 705 may include known circuitry for performing these functions including, but not limited to, an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC (CODEC) chipset, a Subscriber Identity Module (SIM), and so forth.
The audio circuit 706 is mainly used to receive audio data from the peripheral interface 703, convert the audio data into an electric signal, and transmit the electric signal to the speaker 711.
The speaker 711 is used to convert the voice signal received by the handset from the wireless network through the RF circuit 705 into sound and play the sound to the user.
And a power management chip 708 for supplying power and managing power to the hardware connected to the CPU702, the I/O subsystem, and the peripheral interface.
The terminal control device, the storage medium and the mobile terminal provided in the above embodiments may execute the terminal control method provided in any embodiment of the present application, and have corresponding functional modules and beneficial effects for executing the method. For technical details that are not described in detail in the above embodiments, reference may be made to a terminal control method provided in any embodiment of the present application.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.
Claims (10)
1. A terminal control method applied to a mobile terminal, the mobile terminal comprising an ultrasonic signal transmitter and an ultrasonic signal receiver, the terminal control method being characterized by comprising the following steps:
when detecting that an object is close to the mobile terminal, judging whether the object is a user limb;
when the object is a user limb, controlling an ultrasonic signal transmitter of the mobile terminal to transmit an ultrasonic signal, and controlling an ultrasonic signal receiver to receive a reflected signal of the user limb to the ultrasonic signal;
determining the movement track of the user limb according to the transmitted ultrasonic signal and the received reflected signal;
generating a control instruction according to the movement track of the user limb, and executing the control instruction;
the movement track of the user limb is a three-dimensional track generated by connecting a plurality of space coordinates;
the determining the movement track of the user limb according to the transmitted ultrasonic signal and the received reflected signal comprises:
determining a distance parameter between the user limb and the mobile terminal according to the transmitting time of the ultrasonic signal and the receiving time of the reflected signal;
determining the direction parameters of the user limb relative to the mobile terminal according to the position of an ultrasonic signal receiver for receiving the reflected signal, wherein the ultrasonic signal receiver is a matrix type ultrasonic signal receiver;
determining the spatial position of the user limb relative to the mobile terminal according to the distance parameter and the direction parameter;
and generating a movement track of the user limb according to a plurality of spatial positions of the user limb relative to the mobile terminal within a preset time when the user limb is detected to be close to the mobile terminal.
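For orientation only, the following Python sketch walks through the track-determination steps recited in claim 1 under simplifying assumptions that are not part of the claim: the speed of sound is fixed at 343 m/s, the matrix-type receiver is assumed to report a unit direction vector for the element that registers the strongest reflection, and the Echo record, function names, and sampling scheme are illustrative inventions of this sketch rather than the claimed implementation.

```python
import math
from dataclasses import dataclass

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed constant)

@dataclass
class Echo:
    t_transmit: float                      # transmit time of the ultrasonic signal, in seconds
    t_receive: float                       # receive time of the reflected signal, in seconds
    direction: tuple[float, float, float]  # direction reported by the matrix receiver element

def distance_parameter(echo: Echo) -> float:
    """Distance parameter: half the round-trip time of flight times the speed of sound."""
    return (echo.t_receive - echo.t_transmit) * SPEED_OF_SOUND / 2.0

def spatial_position(echo: Echo) -> tuple[float, float, float]:
    """Combine the distance parameter with the direction parameter into one spatial coordinate."""
    d = distance_parameter(echo)
    dx, dy, dz = echo.direction
    norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
    return (d * dx / norm, d * dy / norm, d * dz / norm)

def movement_track(echoes: list[Echo]) -> list[tuple[float, float, float]]:
    """Connect the spatial positions sampled within the preset time into a three-dimensional track."""
    return [spatial_position(e) for e in echoes]
```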
2. The method of claim 1, wherein judging whether the object is a user limb comprises:
transmitting ultrasonic signals with different frequencies to the object based on an ultrasonic signal transmitter of the mobile terminal, and receiving reflected signals of the ultrasonic signals with different frequencies;
generating ultrasonic signal absorption data of the object according to the reflection signals of the ultrasonic signals of all the frequencies;
and when the ultrasonic signal absorption data of the object are matched with preset human body ultrasonic signal absorption data, determining that the object is the limb of the user.
3. The method of claim 2, wherein generating ultrasonic signal absorption data of the object from the reflected signals of the ultrasonic signals of the respective frequencies comprises:
for the ultrasonic signal of any frequency, determining the absorption value of the ultrasonic signal of the object at the frequency according to the energy difference of the transmitted ultrasonic signal and the reflected signal;
and counting the absorption value of the object to the ultrasonic signal under each frequency to generate ultrasonic signal absorption data of the object.
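As a rough, non-authoritative illustration of claims 2 and 3, the snippet below derives a per-frequency absorption value from the energy difference between the transmitted ultrasonic signal and its reflection, then compares the resulting profile against preset human-body absorption data; the dictionary layout, the tolerance-based comparison, and the default tolerance of 0.1 are assumptions of this sketch.

```python
def absorption_profile(tx_energy: dict[float, float],
                       rx_energy: dict[float, float]) -> dict[float, float]:
    """Per-frequency absorption value: the fraction of transmitted energy not reflected back."""
    return {f: (tx_energy[f] - rx_energy.get(f, 0.0)) / tx_energy[f]
            for f in tx_energy if tx_energy[f] > 0.0}

def is_user_limb(profile: dict[float, float],
                 human_reference: dict[float, float],
                 tolerance: float = 0.1) -> bool:
    """Match the measured profile against preset human absorption data within a tolerance."""
    shared = [f for f in human_reference if f in profile]
    return bool(shared) and all(abs(profile[f] - human_reference[f]) <= tolerance
                                for f in shared)
```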
4. The method of claim 1, wherein generating control instructions according to the movement trajectory of the user's limb comprises:
matching the movement track of the user limb with a preset gesture in a preset instruction database, wherein the preset instruction database comprises the preset gesture and a control instruction corresponding to the preset gesture;
and when the matching is successful, calling a control instruction corresponding to the successfully matched preset gesture.
5. The method of claim 4, wherein before matching the movement track of the user limb with a preset gesture in a preset instruction database, the method further comprises:
determining the outline of the user limb according to the reflected signal received by the matrix type ultrasonic signal receiver;
determining the type of the user limb according to the outline of the user limb;
correspondingly, matching the movement track of the user limb with a preset gesture in a preset instruction database, including:
screening in a preset instruction database according to the user limb type;
and matching the movement track of the user limb with the preset gesture corresponding to the screened preset instruction data.
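One possible, purely illustrative organization of the screening and matching in claims 4 and 5 keys the preset instruction database by limb type, filters it first, and then scores the movement track against each remaining preset gesture; the database contents, the cosine-style similarity measure, and the 0.8 threshold are all assumptions of this sketch.

```python
# Hypothetical preset instruction database: limb type -> list of (gesture template, instruction).
# Gesture templates are short lists of spatial coordinates, matching the track sketch above.
PRESET_DB = {
    "palm":   [([(0.0, 0.0, 0.30), (0.0, 0.0, 0.10)], "adjust_volume")],
    "finger": [([(0.0, 0.0, 0.10), (0.20, 0.0, 0.10)], "answer_call")],
}

def net_displacement(track):
    """Vector from the first to the last point of a track."""
    return tuple(b - a for a, b in zip(track[0], track[-1]))

def naive_similarity(track, template):
    """Illustrative placeholder: cosine similarity of the two net displacement vectors."""
    u, v = net_displacement(track), net_displacement(template)
    dot = sum(x * y for x, y in zip(u, v))
    nu = sum(x * x for x in u) ** 0.5
    nv = sum(x * x for x in v) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

def match_instruction(track, limb_type, threshold=0.8):
    """Screen the database by limb type, then match the track against each remaining preset gesture."""
    for template, instruction in PRESET_DB.get(limb_type, []):
        if naive_similarity(track, template) >= threshold:
            return instruction
    return None
```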
6. The method of claim 1, wherein the detecting that the object is near the mobile terminal comprises:
forming a sound field based on an ultrasonic signal transmitted by an ultrasonic signal transmitter of the mobile terminal;
and monitoring the sound field of the mobile terminal, and determining that an object is close to the mobile terminal when the sound field changes.
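Claim 6 infers approach from a change in the monitored sound field; one minimal way to decide that "the sound field changes", assuming the field is summarized as a single received-energy value and using an arbitrary relative threshold, is sketched below.

```python
def sound_field_changed(baseline_energy: float,
                        current_energy: float,
                        relative_threshold: float = 0.2) -> bool:
    """Report an approaching object when received ultrasonic energy deviates from the baseline."""
    if baseline_energy == 0.0:
        return current_energy > 0.0
    return abs(current_energy - baseline_energy) / baseline_energy > relative_threshold
```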
7. The method according to claim 1, wherein when the mobile terminal is in an audio playing state, if the movement track of the user limb is a movement track along the vertical direction of the mobile terminal, the generated control instruction is a volume adjustment instruction;
when the mobile terminal is in a call state, if the movement track of the user limb is the movement track along the horizontal direction of the mobile terminal and points to the direction of an answering key, the generated control instruction is a call answering instruction;
and if the movement track of the user limb is the movement track along the horizontal direction of the mobile terminal and points to the hang-up key direction, the generated control instruction is an incoming call hang-up instruction.
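Claim 7 binds the same kind of movement track to different control instructions depending on the terminal state; a compact way to express that mapping, with state names and direction labels invented for this sketch, is the following.

```python
from typing import Optional

def instruction_for(state: str, direction: str) -> Optional[str]:
    """Map terminal state plus track direction to a control instruction (claim 7 cases only)."""
    if state == "audio_playing" and direction in ("vertical_up", "vertical_down"):
        return "adjust_volume"
    if state == "in_call":
        if direction == "horizontal_toward_answer_key":
            return "answer_call"
        if direction == "horizontal_toward_hangup_key":
            return "hang_up"
    return None
```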
8. A terminal control apparatus, comprising:
the user limb judging module is used for judging whether the object is a user limb or not when detecting that the object is close to the mobile terminal;
the ultrasonic signal transmitting module is used for controlling an ultrasonic signal transmitter of the mobile terminal to transmit an ultrasonic signal and controlling the ultrasonic signal receiver to receive a reflected signal of the user limb to the ultrasonic signal when the object is the user limb;
the mobile track determining module is used for determining the mobile track of the user limb according to the transmitted ultrasonic signal and the received reflected signal;
the control instruction generating module is used for generating a control instruction according to the movement track of the user limb and executing the control instruction;
the movement track of the user limb is a three-dimensional track generated by connecting a plurality of space coordinates;
the movement trajectory determination module includes:
the distance parameter determining unit is used for determining the distance parameter between the user limb and the mobile terminal according to the transmitting time of the ultrasonic signal and the receiving time of the reflected signal;
the direction parameter determining unit is used for determining the direction parameters of the user limb relative to the mobile terminal according to the position of an ultrasonic signal receiver for receiving the reflected signal, wherein the ultrasonic signal receiver is a matrix type ultrasonic signal receiver;
the space position determining unit is used for determining the space position of the user limb relative to the mobile terminal according to the distance parameter and the direction parameter;
the mobile track determining unit is used for generating a mobile track of the user limb according to a plurality of spatial positions of the user limb relative to the mobile terminal within a preset time when the user limb is detected to be close to the mobile terminal.
9. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements a terminal control method according to any one of claims 1-7.
10. A mobile terminal, characterized in that it comprises a memory, a processor and a computer program stored on the memory and executable on the processor, said processor implementing the terminal control method according to any one of claims 1 to 7 when executing said computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811244092.2A CN109298791B (en) | 2018-10-24 | 2018-10-24 | Terminal control method, device, storage medium and mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811244092.2A CN109298791B (en) | 2018-10-24 | 2018-10-24 | Terminal control method, device, storage medium and mobile terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109298791A CN109298791A (en) | 2019-02-01 |
CN109298791B true CN109298791B (en) | 2022-03-29 |
Family
ID=65158587
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811244092.2A Expired - Fee Related CN109298791B (en) | 2018-10-24 | 2018-10-24 | Terminal control method, device, storage medium and mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109298791B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113050788A (en) * | 2019-12-26 | 2021-06-29 | 华为技术有限公司 | Sound playing control method and device |
CN111722715A (en) * | 2020-06-17 | 2020-09-29 | 上海思立微电子科技有限公司 | Gesture control system and electronic equipment |
CN112130737B (en) * | 2020-09-30 | 2022-09-02 | Oppo广东移动通信有限公司 | Electronic apparatus, control method, and storage medium |
CN112987925B (en) * | 2021-03-04 | 2022-11-25 | 歌尔科技有限公司 | Earphone and control method and device thereof |
CN114422762B (en) * | 2021-12-25 | 2023-10-13 | 深圳市幕工坊科技有限公司 | Projection screen motion control system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105245712A (en) * | 2015-10-15 | 2016-01-13 | 广东欧珀移动通信有限公司 | Mobile terminal and call processing method and device of mobile terminal |
CN105242861A (en) * | 2015-10-15 | 2016-01-13 | 广东欧珀移动通信有限公司 | Ultrasonic wave-based parameter adjustment method and device |
CN105260024A (en) * | 2015-10-15 | 2016-01-20 | 广东欧珀移动通信有限公司 | Method and apparatus for stimulating gesture motion trajectory on screen |
CN107942335A (en) * | 2017-11-28 | 2018-04-20 | 达闼科技(北京)有限公司 | A kind of object identification method and equipment |
EP3324204A1 (en) * | 2016-11-21 | 2018-05-23 | HTC Corporation | Body posture detection system, suit and method |
- 2018-10-24: Application CN201811244092.2A filed in China (CN); granted as CN109298791B (en); status: not active, Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105245712A (en) * | 2015-10-15 | 2016-01-13 | 广东欧珀移动通信有限公司 | Mobile terminal and call processing method and device of mobile terminal |
CN105242861A (en) * | 2015-10-15 | 2016-01-13 | 广东欧珀移动通信有限公司 | Ultrasonic wave-based parameter adjustment method and device |
CN105260024A (en) * | 2015-10-15 | 2016-01-20 | 广东欧珀移动通信有限公司 | Method and apparatus for stimulating gesture motion trajectory on screen |
EP3324204A1 (en) * | 2016-11-21 | 2018-05-23 | HTC Corporation | Body posture detection system, suit and method |
CN107942335A (en) * | 2017-11-28 | 2018-04-20 | 达闼科技(北京)有限公司 | A kind of object identification method and equipment |
Also Published As
Publication number | Publication date |
---|---|
CN109298791A (en) | 2019-02-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109298791B (en) | Terminal control method, device, storage medium and mobile terminal | |
CN107636893B (en) | Multi-antenna communication system configured to detect objects | |
JP6100286B2 (en) | Gesture detection based on information from multiple types of sensors | |
CN109218535B (en) | Method and device for intelligently adjusting volume, storage medium and terminal | |
CN108829320A (en) | Exchange method, device, storage medium, mobile terminal and interactive system | |
US20200158556A1 (en) | Power management | |
CN110060680B (en) | Electronic equipment interaction method and device, electronic equipment and storage medium | |
WO2019105376A1 (en) | Gesture recognition method, terminal and storage medium | |
CN111757241B (en) | Sound effect control method and device, sound box array and wearable device | |
KR20150005020A (en) | coordinate measuring apparaturs which measures input position of coordinate indicating apparatus and method for controlling thereof | |
WO2023284418A1 (en) | Positioning method and apparatus, and electronic device and computer-readable storage medium | |
CN106487984B (en) | A kind of method and apparatus adjusting volume | |
CN112738886B (en) | Positioning method, positioning device, storage medium and electronic equipment | |
WO2021160000A1 (en) | Wearable device and control method | |
CN108055644B (en) | Positioning control method and device, storage medium and terminal equipment | |
KR20220123036A (en) | Touch keys, control methods and electronics | |
CN109029252B (en) | Object detection method, object detection device, storage medium, and electronic apparatus | |
Wang et al. | Faceori: Tracking head position and orientation using ultrasonic ranging on earphones | |
WO2017031647A1 (en) | Method and apparatus for detecting touch mode | |
CN112098929A (en) | Method, device and system for determining relative angle between intelligent devices and intelligent devices | |
CN109088982A (en) | Terminal security improvement method, device, storage medium and terminal device | |
CN108614263A (en) | Mobile terminal, method for detecting position and Related product | |
CN111356932A (en) | Method for managing multiple devices and electronic device | |
CN108632713B (en) | Volume control method and device, storage medium and terminal equipment | |
CN110730266B (en) | Alarm clock volume control method and device, storage medium and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20220329 |