CN113778315A - Data interaction method, device and system and electronic equipment - Google Patents
- Publication number
- CN113778315A (application number CN202110995623.7A)
- Authority
- CN
- China
- Prior art keywords
- received
- interface
- presenting
- preset area
- display screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the present disclosure provide a data interaction method, apparatus, system, and electronic device. The method includes: detecting whether a touch signal from a remote control device is received; when the touch signal is received, presenting a voice recognition interface in a first predetermined area on a display screen and an application icon interface in a second predetermined area; detecting whether an application icon in the second predetermined area receives a start trigger operation; and, when the start trigger operation is received, presenting the interface of the application corresponding to the launched application icon on the display screen. In the embodiments of the present disclosure, when the touch signal from the remote control device is received, two areas with different functions are displayed on the display screen: the first predetermined area can receive a voice signal, while the application icons displayed in the second predetermined area can be controlled by the user directly, so the user can precisely choose which application to launch. The interaction is simple, and the user experience is good.
Description
Technical Field
The present disclosure relates to the field of communications, and in particular, to a data interaction method, apparatus, system, and electronic device.
Background
At present, an intelligent electronic device that has a remote control device (for example, a smart television) can be controlled by voice recognition: the user presses a key on the remote control device (or on the electronic device itself) to start voice recognition, a voice recognition interface is presented on the display screen of the electronic device, the specific content of the user's voice signal may also be presented on the display screen, and the electronic device is controlled to execute the corresponding function according to the recognition result of the voice signal.
Although the electronic device can execute the function corresponding to the recognition result, that function may not reflect the user's actual intention; the accuracy is low and the interaction is inconvenient, which gives the user a poor experience. For example, a movie the user wants to watch may be available on several applications, and an application opened at random may not be the one the user wants to use.
Disclosure of Invention
In view of this, the embodiments of the present disclosure provide a data interaction method, apparatus, system, and electronic device to solve the following problem in the prior art: when an electronic device is controlled by voice, the function actually executed may not reflect the user's intention, the accuracy is low, and the interaction is inconvenient, which gives the user a poor experience.
In one aspect, an embodiment of the present disclosure provides a data interaction method, including: detecting whether a touch signal from a remote control device is received; when the touch signal is received, presenting a voice recognition interface in a first predetermined area on a display screen and an application icon interface in a second predetermined area; detecting whether an application icon in the second predetermined area receives a start trigger operation; and, when the start trigger operation is received, presenting the interface of the application corresponding to the launched application icon on the display screen.
In some embodiments, presenting the voice recognition interface in the first predetermined area and the application icon interface in the second predetermined area on the display screen includes: detecting whether a voice signal is received within a predetermined time and, when the voice signal is received, presenting the voice recognition interface in the first predetermined area and the application icon interface in the second predetermined area on the display screen; or detecting whether the duration of the touch signal reaches the predetermined time and, when the duration reaches the predetermined time, presenting the voice recognition interface in the first predetermined area and the application icon interface in the second predetermined area on the display screen.
In some embodiments, after the voice recognition interface is presented in the first predetermined area on the display screen and the application icon interface is presented in the second predetermined area, the method further includes: when a voice signal is received, recognizing the received voice signal, and adjusting the application icons presented in the second predetermined area according to the recognition result of the voice signal.
In some embodiments, the method further includes: when no voice signal is received or the duration does not reach the predetermined time, restoring, on the display screen, the interface shown before the touch signal was received.
In some embodiments, detecting whether the application icon in the second predetermined area receives a start trigger operation includes: detecting whether the position of the cursor moving on the display screen, when a touch signal from the remote control device is received, is within the area corresponding to the application icon; or detecting whether the recognition result of the received voice signal matches the name of the application presented in the second predetermined area.
In another aspect, an embodiment of the present disclosure provides a data interaction apparatus, including: a first detection module, configured to detect whether a touch signal from a remote control device is received; a first presentation module, configured to present a voice recognition interface in a first predetermined area on the display screen and an application icon interface in a second predetermined area when the touch signal is received; a second detection module, configured to detect whether an application icon in the second predetermined area receives a start trigger operation; and a second presentation module, configured to present the interface of the application corresponding to the launched application icon on the display screen when the start trigger operation is received.
In some embodiments, the first presentation module includes: a first presentation unit, configured to detect whether a voice signal is received within a predetermined time and, when the voice signal is received, present the voice recognition interface in the first predetermined area on the display screen and the application icon interface in the second predetermined area; and a second presentation unit, configured to detect whether the duration of the touch signal reaches the predetermined time and, when the duration reaches the predetermined time, present the voice recognition interface in the first predetermined area on the display screen and the application icon interface in the second predetermined area.
In some embodiments, the apparatus further includes: an adjusting unit, configured to recognize a received voice signal and adjust the application icons presented in the second predetermined area according to the recognition result of the voice signal.
In some embodiments, the apparatus further includes: a third presentation unit, configured to restore, on the display screen, the interface shown before the touch signal was received when no voice signal is received or the duration does not reach the predetermined time.
In some embodiments, the second detection module is specifically configured to: detect whether the position of the cursor moving on the display screen, when a touch signal from the remote control device is received, is within the area corresponding to the application icon; or detect whether the recognition result of the received voice signal matches the name of the application presented in the second predetermined area.
In another aspect, an embodiment of the present disclosure provides an electronic device, which includes at least the data interaction apparatus of any embodiment of the present disclosure.
In another aspect, an embodiment of the present disclosure provides a data interaction system, including a remote control device and an electronic device, wherein the remote control device is configured to send a touch signal to the electronic device in response to a touch operation received by its detector, and to send, to the electronic device, a start trigger operation directed at an application icon in a second predetermined area on the display screen.
According to the embodiments of the present disclosure, when the touch signal from the remote control device is received, two areas with different functions are displayed on the display screen: the first predetermined area can receive a voice signal, while the application icons displayed in the second predetermined area can be controlled by the user directly, so the user can precisely choose which application to launch. The interaction is simple, and the user experience is good.
Drawings
To illustrate the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below cover only some embodiments of the present disclosure; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a data interaction method according to a first embodiment of the disclosure;
fig. 2 is a first schematic interface diagram of a display screen according to a first embodiment of the present disclosure;
fig. 3 is a second schematic interface diagram of a display screen according to a first embodiment of the disclosure;
fig. 4 is a schematic diagram illustrating an application icon adjustment according to a first embodiment of the disclosure;
fig. 5 is an exemplary flowchart of a data interaction method provided in a first embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a data interaction device according to a second embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present disclosure. All other embodiments that a person skilled in the art can derive from the described embodiments without any inventive step fall within the scope of protection of the present disclosure.
Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The use of "first," "second," and similar terms in this disclosure is not intended to indicate any order, quantity, or importance, but rather is used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
To keep the following description of the embodiments of the present disclosure clear and concise, detailed descriptions of known functions and known components have been omitted from the present disclosure.
A first embodiment of the present disclosure provides a data interaction method, where a flow of the method is shown in fig. 1, and the method includes steps S101 to S104:
s101, detecting whether a touch signal from the remote control equipment is received.
The remote control device may be a remote controller, or may be a trigger button belonging to the electronic device itself; the remote control device may be separate from the electronic device or integrated with it. In particular, an electronic device equipped with a remote controller is usually a smart television.
When a user presses a key on the remote control device, a touch signal is generated and sent to the electronic device. Of course, a position on the remote control device may be touched to generate the touch signal, which is not limited herein.
S102, under the condition that the touch signal is received, a voice recognition interface is presented in a first preset area on the display screen, and an application icon interface is presented in a second preset area.
If the touch signal is received, the user presumably wants to control the electronic device. Therefore, upon receiving the touch signal, the embodiment of the present disclosure divides the display screen, which in the prior art presents only the voice recognition interface, into two parts: one part presents the voice recognition interface and the other presents the application icon interface. The display screen layout may be as shown in fig. 2 or as shown in fig. 3; the embodiment of the present disclosure is not limited in this respect.
The voice recognition interface presented in the first predetermined area can still receive the voice signal of the user, and the application icon interface presented in the second predetermined area can receive a start triggering operation of the user.
In a specific implementation, to prevent accidental triggering from causing an unintended operation, the embodiment of the present disclosure may further detect, before presenting the voice recognition interface and the application icon interface, whether a voice signal is received within a predetermined time, or whether the duration of the touch signal reaches the predetermined time. The predetermined time may be, for example, 2 or 3 seconds; if the voice signal is received, or the duration reaches the predetermined time, the user is operating deliberately rather than by mistake.
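The false-trigger check described above can be sketched as a single predicate. The following Python fragment is illustrative only; the function name and parameters are hypothetical, and the 2-second default is just the example value given in the text.

```python
def should_present_interfaces(voice_delay, touch_duration, predetermined_time=2.0):
    """Decide whether the touch is treated as intentional.

    voice_delay: seconds until a voice signal arrived, or None if no voice
    signal was received; touch_duration: how long the touch signal lasted.
    Either condition from the text suffices: a voice signal within the
    predetermined time, or a touch held at least that long.
    """
    if voice_delay is not None and voice_delay <= predetermined_time:
        return True
    return touch_duration >= predetermined_time
```

If neither condition holds, the touch is classified as a misoperation and the previous interface is restored, as described below.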
The voice signal here need not be a complete utterance: any detected sound suffices, because at this stage the signal serves only as a switch that triggers simultaneous presentation of the voice recognition interface and the application icon interface.
The presented application icons may be ordered by how often the user uses the corresponding applications. In general, the icons are sorted from high to low by frequency of use, or from long to short by duration of use, and then presented according to the sorting result and the number of icons the application icon interface can display. For example, if six icons can be displayed, the icons of the six most frequently used applications are selected from the ranking as the content of the application icon interface.
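The icon selection just described amounts to a sort-and-truncate over usage statistics. A minimal Python sketch, assuming a hypothetical mapping from application name to usage count:

```python
def pick_icons(usage_counts, slots=6):
    """Select the icons to present in the application icon interface.

    usage_counts maps application name -> frequency of use. Applications are
    ranked from most to least used, and only as many icons are kept as the
    interface can display (six in the example from the text).
    """
    ranked = sorted(usage_counts, key=usage_counts.get, reverse=True)
    return ranked[:slots]
```

Ranking by duration of use instead would only change the values in the mapping, not the logic.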
If no voice signal is received within the predetermined time after the touch signal is received, or the duration of the touch signal does not reach the predetermined time, the touch is treated as a misoperation, and the interface that was shown before the touch signal was received is restored on the display screen.
S103, detecting whether the application program icon in the second preset area receives the starting triggering operation.
The above-mentioned start trigger operation may come either from the remote controller or from a received voice signal.
For example, it may be detected whether the position of the cursor moving on the display screen, at the moment a touch signal from the remote control device is received, falls within the area corresponding to an application icon. In that case the touch signal acts as a confirmation: the application at the cursor position is the application the user wants to start. As an example, when a user wants to open an application named "ABC" and the electronic device contains several different versions of applications associated with "ABC", the user can move the cursor on the display screen onto the desired application and then press a key on the remote control device; the resulting touch signal serves as the start trigger operation for that application icon.
For another example, it may be detected whether the recognition result of a received voice signal matches the name of an application presented in the second predetermined area. In this case the user specifies by voice which application to open. As an example, when a user wants to open an application named "ABC" and several different versions of applications associated with "ABC" exist on the electronic device, the user can send a voice signal such as "open ABC universal version"; this voice signal serves as the start trigger operation.
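The two trigger paths above — cursor hit-testing and voice-name matching — can be sketched together. The following Python fragment is a simplified illustration; the `IconRegion` type and substring matching are assumptions, since the patent does not specify how icon areas or name matching are represented.

```python
from dataclasses import dataclass

@dataclass
class IconRegion:
    """An application icon in the second predetermined area (hypothetical)."""
    name: str
    x: int
    y: int
    w: int
    h: int

def detect_start_trigger(icons, cursor=None, recognized_text=None):
    """Return the name of the icon that received a start trigger, or None.

    Path (a): the cursor position when the touch signal arrives falls inside
    an icon's region. Path (b): the voice recognition result contains the
    name of an application presented in the second predetermined area.
    A real system would use fuzzier matching than substring containment.
    """
    if cursor is not None:
        cx, cy = cursor
        for icon in icons:
            if icon.x <= cx < icon.x + icon.w and icon.y <= cy < icon.y + icon.h:
                return icon.name
    if recognized_text is not None:
        for icon in icons:
            if icon.name in recognized_text:
                return icon.name
    return None
```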
And S104, under the condition that the starting triggering operation is received, displaying an interface of the application program corresponding to the started application program icon on the display screen.
According to the embodiments of the present disclosure, when the touch signal from the remote control device is received, two areas with different functions are displayed on the display screen: the first predetermined area can receive a voice signal, while the application icons displayed in the second predetermined area can be controlled by the user directly, so the user can precisely choose which application to launch. The interaction is simple, and the user experience is good.
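Steps S101–S104 can be condensed into a small sketch of the screen state transitions. The `Screen` type and field names below are hypothetical; the patent describes only the sequence of checks and presentations, not an API.

```python
from dataclasses import dataclass

@dataclass
class Screen:
    first_area: str = ""                    # voice recognition interface
    second_area: str = ""                   # application icon interface
    foreground: str = "previous-interface"  # what the display currently shows

def handle_interaction(screen, touch_received, launched_app=None):
    """Apply S101-S104 to a screen state.

    S101/S102: when a touch signal from the remote control device is
    received, present the two predetermined areas. S103/S104: if an icon in
    the second area received a start trigger operation, present that
    application's interface on the display screen.
    """
    if touch_received:
        screen.first_area = "voice-recognition-interface"
        screen.second_area = "application-icon-interface"
        if launched_app is not None:
            screen.foreground = launched_app
    return screen
```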
In the above process, if a voice signal is received within the predetermined time, the received voice signal may be recognized, the recognition result may be presented in the first predetermined area, and the application icons presented in the second predetermined area may be adjusted according to the recognition result; the adjustment process is illustrated in fig. 4. Adjusting the content of the second predetermined area through the voice signal brings the presented content closer to the user's intention and improves the user experience.
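One simple way to realize the icon adjustment is to promote icons whose names appear in the recognized text. This is only an assumed strategy — the patent does not fix the adjustment rule — and substring matching stands in for a real relevance model:

```python
def adjust_icons(presented, recognition_result):
    """Reorder the icons in the second predetermined area.

    Icons whose names occur in the recognized text move to the front, so the
    presented content moves closer to the user's stated intention; the
    remaining icons keep their original order.
    """
    matched = [name for name in presented if name in recognition_result]
    others = [name for name in presented if name not in recognition_result]
    return matched + others
```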
The above process is described below by way of example with reference to fig. 5. The flow in fig. 5 does not limit the embodiment of the present disclosure; it is only one possible implementation.
S1, the terminal (which may be a television, a smart display, an all-in-one whiteboard, or the like) is powered on and displays non-voice-system interface content (the voice system interface consists of the voice recognition interface presented in the first predetermined area and the application icon interface presented in the second predetermined area).
S2, the user presses the voice button of the remote controller to send a touch signal.
S3, the user speaks into the microphone of the remote controller.
S4, at the instant the user speaks into the remote controller, the terminal switches from the current interface to the voice system interface and the voice recognition function is turned on.
S5, the voice signal is transmitted to the terminal through a protocol shared by the remote controller and the terminal (for example, a network or Bluetooth connection); on the voice system interface, the voice signal is recognized as text and displayed in the first predetermined area, and the result returned by the voice recognition function (whether network-based or local) is displayed in the application area (the second predetermined area).
S6, the application area displays the returned result (that is, the icon adjustment process), and the voice recognition function is turned off at this point.
S7, the application area then shows a cursor synchronized with the remote controller's gyroscope.
S8, the cursor in the application area of the voice system interface can be moved to make a selection by moving the remote controller (gyroscope function).
S9, the voice button is pressed while the cursor has selected an application in the voice system interface.
S10, the application selected by the cursor when the voice button was pressed is opened, and the voice system interface exits.
S11, the voice key of the remote controller is pressed while the cursor is not within the selectable application range of the application area.
S12, the voice system interface does not respond at this time.
S13, moving the remote controller upward (gyroscope function) synchronously moves the cursor of the voice system interface to the voice recognition area.
S14, the speech recognition function is turned on again, and the application area shows the last returned result.
S15, the voice system interface is now in a state of waiting for voice recognition.
S16, if the speech recognition is successful, the flow returns to S5.
S17, moving the remote controller downward (gyroscope function) synchronously moves the cursor of the voice system interface to the application area.
S18, return to S7.
S19, on the non-voice system interface, the user presses the voice button of the remote controller and chooses not to speak.
S20, the voice button is held down for more than 3 seconds (the time limit can be user-defined).
S21, the interface display of S1 is maintained, and after 3 seconds the voice system interface is displayed.
S22, at the same time as the voice system interface is entered, the voice recognition function is started.
S23, return to S15.
S24, the voice button is pressed and released within 3 seconds (the time limit can be user-defined).
S25, the display returns to the content of S1.
When the voice key of the remote controller is pressed on the voice system interface, it functions the same as the confirmation key; releasing the voice key has a timing function only on the non-voice system interface.
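The S1–S25 flow above can be summarized as a small state machine over three terminal states. The following Python sketch is illustrative only; the state and event names are hypothetical and not part of the patent.

```python
# Transition table for the flow of fig. 5. The terminal moves between the
# non-voice interface, the voice system interface, and a running application.
TRANSITIONS = {
    ("non_voice", "press_and_speak"): "voice_system",           # S2-S4
    ("non_voice", "press_hold_past_limit"): "voice_system",     # S19-S22
    ("non_voice", "press_release_early"): "non_voice",          # S24-S25
    ("voice_system", "press_on_selected_app"): "application",   # S9-S10
    ("voice_system", "press_outside_apps"): "voice_system",     # S11-S12
}

def step(state, event):
    # Events with no listed transition leave the state unchanged.
    return TRANSITIONS.get((state, event), state)
```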
A second embodiment of the present disclosure provides a data interaction apparatus, a structural schematic of the apparatus is shown in fig. 6, and the apparatus includes:
the first detection module 10 is configured to detect whether a touch signal from a remote control device is received; the first presentation module 20 is coupled to the first detection module 10, and configured to present a voice recognition interface in a first predetermined area on the display screen and present an application icon interface in a second predetermined area on the display screen when the touch signal is received; the second detection module 30 is coupled to the first presentation module 20, and is configured to detect whether the application icon in the second predetermined area receives a start trigger operation; and the second presenting module 40 is coupled to the second detecting module 30, and is configured to present an interface of an application corresponding to the launched application icon on the display screen in a case that the launch trigger operation is received.
The remote control device may be a remote controller, or may be a trigger button belonging to the electronic device itself; the remote control device may be separate from the electronic device or integrated with it. In particular, an electronic device equipped with a remote controller is usually a smart television.
When a user presses a key on the remote control device, a touch signal is generated and sent to the electronic device. Of course, a position on the remote control device may be touched to generate the touch signal, which is not limited herein.
If the touch signal is received, the user presumably wants to control the electronic device. Therefore, upon receiving the touch signal, the embodiment of the present disclosure divides the display screen, which in the prior art presents only the voice recognition interface, into two parts: one part presents the voice recognition interface and the other presents the application icon interface.
The voice recognition interface presented in the first predetermined area can still receive the voice signal of the user, and the application icon interface presented in the second predetermined area can receive a start triggering operation of the user.
In particular, to prevent accidental triggering from causing an unintended operation, the first presentation module of the embodiment of the present disclosure may further include: a first presentation unit, configured to detect whether a voice signal is received within a predetermined time and, when the voice signal is received, present the voice recognition interface in the first predetermined area on the display screen and the application icon interface in the second predetermined area; and a second presentation unit, configured to detect whether the duration of the touch signal reaches the predetermined time and, when the duration reaches the predetermined time, present the voice recognition interface in the first predetermined area on the display screen and the application icon interface in the second predetermined area.
The predetermined time may be, for example, 2 or 3 seconds. If the voice signal is received, or the duration of the touch signal reaches the predetermined time, the user is operating deliberately and no misoperation has occurred.
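The two conditions above can be sketched as a single predicate; the function name, timestamp-based API, and 2-second default are illustrative assumptions rather than the patent's implementation.

```python
PREDETERMINED_TIME = 2.0  # seconds; the embodiment suggests 2 or 3 seconds


def should_present_dual_interface(voice_received_at, touch_down_at, touch_up_at):
    """Return True if either condition from the embodiment holds:
    (a) a voice signal arrived within the predetermined time after the
        touch signal, or
    (b) the touch signal itself lasted at least the predetermined time.
    Timestamps are seconds as floats; None means the event did not occur.
    This exact API is an assumption for illustration."""
    if (voice_received_at is not None
            and voice_received_at - touch_down_at <= PREDETERMINED_TIME):
        return True  # condition (a): voice within the predetermined time
    if (touch_up_at is not None
            and touch_up_at - touch_down_at >= PREDETERMINED_TIME):
        return True  # condition (b): long press reaching the predetermined time
    return False     # otherwise treated as a misoperation
```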
The voice signal need not be a complete utterance; any sound suffices, because at this stage the voice signal serves only as a switch for presenting the voice recognition interface and the application icon interface simultaneously.
The presented application icons may be ordered by how often the user uses each application. In general, the applications are sorted from high to low by frequency of use, or from long to short by duration of use, and then presented according to the sorting result and the number of icons the application icon interface can display. For example, if six icons can be displayed, the icons of the six most frequently used applications are selected from the sorted list as the content of the application icon interface.
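The selection just described is a sort-and-truncate step; a minimal sketch follows, where the app names and launch counts are invented for illustration.

```python
def select_icons(usage, capacity=6):
    """Sort applications by frequency of use (high to low) and keep only
    as many icons as the application icon interface can display.
    `usage` maps app name -> launch count; an analogous version could
    sort by duration of use instead."""
    ranked = sorted(usage, key=usage.get, reverse=True)
    return ranked[:capacity]


# Hypothetical usage statistics for illustration only.
apps = {"Video": 42, "Music": 17, "News": 9, "Games": 30,
        "Weather": 3, "Mail": 12, "Maps": 5}
print(select_icons(apps))  # the six most used apps, most frequent first
```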
If no voice signal is received within the predetermined time after the touch signal is received, and the duration of the touch signal does not reach the predetermined time, the touch is judged to be a misoperation. Therefore, the first presentation module of the embodiment of the present disclosure may further include a third presentation unit, configured to resume presenting, on the display screen, the interface that was shown before the touch signal was received, when no voice signal is received or the duration does not reach the predetermined time.
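The save-and-restore behavior of the third presentation unit can be sketched as a small state holder; the class and method names are assumptions made for illustration.

```python
class InterfaceState:
    """Minimal sketch of the third presentation unit's fallback:
    snapshot the current interface when the touch signal arrives, and
    restore it if neither condition (voice received, duration reached)
    is met. All names here are illustrative assumptions."""

    def __init__(self, current_interface):
        self.saved = None
        self.current = current_interface

    def on_touch(self):
        # Remember what was on screen, then present both interfaces.
        self.saved = self.current
        self.current = ("voice_recognition_interface", "application_icon_interface")

    def on_timeout_without_voice(self):
        # Misoperation: resume the interface shown before the touch signal.
        self.current = self.saved
        self.saved = None
```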
The start trigger operation mentioned above may come from the remote controller or from a received voice signal. Accordingly, the embodiment of the present disclosure may further include a second detection module, specifically configured to: detect, when a touch signal from the remote control device is received, whether the position of the cursor moving on the display screen lies within the area corresponding to an application icon; or detect whether the recognition result of a received voice signal matches the name of an application presented in the second predetermined area.
For example, when the second detection module detects, upon receiving the touch signal from the remote control device, whether the cursor position on the display screen lies within the area corresponding to an application icon, the touch signal serves as a confirmation operation: it confirms that the application at the cursor position is the one the user wants to start. As an example, when a user wants to open an application named "ABC", and several different versions of applications associated with "ABC" exist on the electronic device, the user can aim at the desired application by moving the cursor on the display screen and then press a key on the remote control device; the resulting touch signal acts as the start trigger operation for that application icon.
For another example, when the second detection module detects whether the recognition result of the received voice signal matches the name of an application presented in the second predetermined area, the user can specify by voice which application to open. As an example, when a user wants to open an application named "ABC", and several versions associated with "ABC" exist on the electronic device, the user can say "open ABC universal version"; this voice signal, once recognized by the electronic device, serves as the start trigger operation.
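The two detection paths above amount to a rectangle hit test and a name match against the recognition result. A sketch under assumed data shapes (icon rectangles as `(x, y, w, h)` tuples; names and coordinates invented):

```python
def cursor_hit(cursor, icon_rects):
    """Return the app whose icon rectangle contains the cursor, or None.
    `icon_rects` maps app name -> (x, y, w, h). Illustrative sketch."""
    cx, cy = cursor
    for name, (x, y, w, h) in icon_rects.items():
        if x <= cx < x + w and y <= cy < y + h:
            return name
    return None


def voice_match(recognition_result, app_names):
    """Return the presented app whose name appears in the recognition
    result, e.g. 'open ABC universal version' matches the app named
    'ABC universal version'. Longer names are tried first so the more
    specific version wins over a shorter prefix like 'ABC'."""
    for name in sorted(app_names, key=len, reverse=True):
        if name in recognition_result:
            return name
    return None
```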
The embodiment of the present disclosure may further include an adjusting unit, configured to recognize the received voice signal when the voice signal is received within the predetermined time, present the recognition result in the first predetermined area, and adjust the application icons presented in the second predetermined area according to the recognition result. This process tailors the content of the second predetermined area to the voice signal, so that the presented content comes closer to the user's intention and yields a better user experience.
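The patent only says the icons are adjusted "according to the recognition result"; one plausible reading, sketched below under that assumption, is to move matching applications to the front of the second predetermined area.

```python
def adjust_icons(presented, recognition_result, capacity=6):
    """Re-rank the icons in the second predetermined area so that apps
    whose names occur in the recognition result come first; the rest
    keep their original order. This ranking rule is an assumption for
    illustration, not the patent's mandated behavior."""
    text = recognition_result.lower()
    matched = [a for a in presented if a.lower() in text]
    rest = [a for a in presented if a not in matched]
    return (matched + rest)[:capacity]
```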
According to the embodiment of the present disclosure, when the touch signal from the remote control device is received, two areas with different functions are displayed on the display screen: the first predetermined area can receive the voice signal, while the application icons displayed in the second predetermined area can be controlled directly by the user. The user can thus precisely choose which application to start through the second predetermined area. The interaction is simple and intuitive, and the user experience is good.
A third embodiment of the present disclosure further provides an electronic device, which includes at least the data interaction apparatus of the foregoing embodiments; the specific structure of the data interaction apparatus is not repeated here.
A fourth embodiment of the present disclosure further provides a data interaction system, where the system includes a remote control device and the electronic device of the foregoing embodiments. The remote control device is configured to send a touch signal to the electronic device in response to a touch operation received by the detector, and to send to the electronic device a start trigger operation for an application icon in the second predetermined area on the display screen.
The electronic device may be a terminal with a voice recognition function, such as a television, a smart display, or an all-in-one whiteboard. The remote control device may be a remote controller with a voice function key and a built-in gyroscope; the voice signal can be sent to the terminal over a network, Bluetooth, or similar channels, and the built-in gyroscope enables precise movement of the terminal's cursor.
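The gyroscope-driven cursor mentioned above boils down to mapping an angular displacement to a clamped on-screen displacement. A sketch, where the sensitivity constant, axis convention, and screen size are all illustrative assumptions:

```python
def move_cursor(cursor, angular_delta, sensitivity=600.0, screen=(1920, 1080)):
    """Map a gyroscope angular displacement (radians around the yaw and
    pitch axes) to a cursor displacement on the terminal's screen,
    clamped to the screen bounds. Sensitivity (pixels per radian) and
    axis convention are illustrative assumptions."""
    dx = angular_delta[0] * sensitivity
    dy = angular_delta[1] * sensitivity
    x = min(max(cursor[0] + dx, 0), screen[0] - 1)
    y = min(max(cursor[1] + dy, 0), screen[1] - 1)
    return (x, y)
```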
Through the interaction of the remote controller and voice recognition, the embodiment of the present disclosure provides a more convenient user experience and addresses the technical problem of voice recognition opening the wrong application. After a voice search, multiple related application results can be displayed, and the user can open an application with the cursor or a voice command instead of stepping through candidates with the up and down keys of the remote controller. Switching between the voice interface and the application interface satisfies the user's direct need to open an application, and this interaction mode reduces the rate of voice misrecognition.
Moreover, although exemplary embodiments have been described herein, the scope of the disclosure includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations, or alterations based on the present disclosure. The elements of the claims are to be interpreted broadly based on the language employed in the claims and are not limited to examples described in the present specification or during the prosecution of the application, which examples are to be construed as non-exclusive. It is intended, therefore, that the specification and examples be considered as exemplary only, with the true scope and spirit being indicated by the following claims and their full scope of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more versions thereof) may be used in combination with each other. Other embodiments will be apparent to those of ordinary skill in the art upon reading the above description. In addition, in the foregoing detailed description, various features may be grouped together to streamline the disclosure. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that such embodiments may be combined with each other in various combinations or permutations. The scope of the disclosure should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
While the present disclosure has been described in detail with reference to the embodiments, the present disclosure is not limited to the specific embodiments, and those skilled in the art can make various modifications and alterations based on the concept of the present disclosure, and the modifications and alterations should fall within the scope of the present disclosure as claimed.
Claims (10)
1. A method for data interaction, comprising:
detecting whether a touch signal from remote control equipment is received;
under the condition that the touch signal is received, presenting a voice recognition interface in a first predetermined area on a display screen, and presenting an application icon interface in a second predetermined area;
detecting whether an application program icon in the second predetermined area receives a start trigger operation;
and under the condition that the start trigger operation is received, presenting, on the display screen, an interface of the application program corresponding to the started application program icon.
2. The data interaction method of claim 1, wherein presenting a voice recognition interface in a first predetermined area and an application icon interface in a second predetermined area on the display screen comprises:
detecting whether a voice signal is received within a predetermined time, and presenting the voice recognition interface in the first predetermined area and the application icon interface in the second predetermined area on the display screen under the condition that the voice signal is received; or,
detecting whether the duration of the touch signal reaches the predetermined time, and presenting the voice recognition interface in the first predetermined area and the application icon interface in the second predetermined area on the display screen under the condition that the duration reaches the predetermined time.
3. The data interaction method of claim 2, wherein after presenting the voice recognition interface in the first predetermined area on the display screen and the application icon interface in the second predetermined area, the method further comprises:
under the condition that the voice signal is received, recognizing the received voice signal, and adjusting the application program icon presented in the second predetermined area according to the recognition result of the voice signal.
4. The data interaction method of claim 2, further comprising:
and under the condition that the voice signal is not received or the duration does not reach the predetermined time, resuming presentation, on the display screen, of the interface displayed before the touch signal was received.
5. The data interaction method of any one of claims 1 to 4, wherein the detecting whether the application icon in the second predetermined area receives a start trigger operation comprises:
detecting, when a touch signal from the remote control device is received, whether the position of a cursor moving on the display screen is in an area corresponding to the application program icon; or,
detecting whether the recognition result of a received voice signal matches the name of the application program presented in the second predetermined area.
6. A data interaction device, comprising:
the first detection module is used for detecting whether a touch signal from the remote control equipment is received or not;
the first presentation module is used for presenting a voice recognition interface in a first predetermined area on the display screen and presenting an application icon interface in a second predetermined area under the condition that the touch signal is received;
the second detection module is used for detecting whether the application program icon in the second predetermined area receives a start trigger operation;
and the second presentation module is used for presenting, on the display screen, an interface of the application program corresponding to the started application program icon under the condition that the start trigger operation is received.
7. The data interaction device of claim 6, wherein the first rendering module comprises:
the first presentation unit is used for detecting whether a voice signal is received within a predetermined time, and presenting the voice recognition interface in the first predetermined area and the application icon interface in the second predetermined area on the display screen under the condition that the voice signal is received;
and the second presentation unit is used for detecting whether the duration of the touch signal reaches the predetermined time, and presenting the voice recognition interface in the first predetermined area and the application icon interface in the second predetermined area on the display screen under the condition that the duration reaches the predetermined time.
8. The data interaction device of claim 7, further comprising:
and the adjusting unit is used for recognizing the received voice signal under the condition that the voice signal is received, and adjusting the application program icon presented in the second predetermined area according to the recognition result of the voice signal.
9. An electronic device, comprising at least a memory and a processor, the memory having a computer program stored thereon, wherein the processor, when executing the computer program on the memory, implements the steps of the data interaction method of any one of claims 1 to 5.
10. A data interaction system, comprising:
a remote control device and the electronic device of claim 9,
the remote control device is configured to send a touch signal to the electronic device in response to a touch operation received by the detector, and to send to the electronic device a start trigger operation for an application program icon in a second predetermined area on the display screen.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110995623.7A CN113778315A (en) | 2021-08-27 | 2021-08-27 | Data interaction method, device and system and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113778315A true CN113778315A (en) | 2021-12-10 |
Family
ID=78839568
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110995623.7A Pending CN113778315A (en) | 2021-08-27 | 2021-08-27 | Data interaction method, device and system and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113778315A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050159950A1 (en) * | 2001-09-05 | 2005-07-21 | Voice Signal Technologies, Inc. | Speech recognition using re-utterance recognition |
CN104536647A (en) * | 2014-12-16 | 2015-04-22 | 广东欧珀移动通信有限公司 | Application icon position adjusting method and device |
CN104853250A (en) * | 2014-02-19 | 2015-08-19 | 三星电子株式会社 | Remote controller and method for controlling screen thereof |
CN106201427A (en) * | 2016-07-15 | 2016-12-07 | 东莞酷派软件技术有限公司 | A kind of application program launching method and terminal unit |
CN108121490A (en) * | 2016-11-28 | 2018-06-05 | 三星电子株式会社 | For handling electronic device, method and the server of multi-mode input |
CN110647274A (en) * | 2019-08-15 | 2020-01-03 | 华为技术有限公司 | Interface display method and equipment |
CN110825469A (en) * | 2019-09-18 | 2020-02-21 | 华为技术有限公司 | Voice assistant display method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108701000B (en) | Method for processing notification and electronic equipment | |
US11243615B2 (en) | Systems, methods, and media for providing an enhanced remote control having multiple modes | |
US8134538B2 (en) | Touch panel input device and processing execution method | |
US7817143B2 (en) | Method of inputting function into portable terminal and button input apparatus of portable terminal using the same | |
EP2041957B1 (en) | Voice remote control | |
RU2625439C2 (en) | Electronic device and method for providing user interface for it | |
CN102203705B (en) | A touch-off method and wireless hand-held device | |
US20100121876A1 (en) | Information entry mechanism for small keypads | |
RU2609101C2 (en) | Touch control assembly, device control method, controller and electronic device | |
CN103716451A (en) | Image display control apparatus, image display apparatus and image display control method | |
US20100245272A1 (en) | Mobile terminal apparatus and method of starting application | |
US12003804B2 (en) | Information processing device, information processing method, and computer program | |
US20090327979A1 (en) | User interface for a peripheral device | |
CN105809695A (en) | Terminal searching method and device based on wearable device | |
US10089899B2 (en) | Hearing and speech impaired electronic device control | |
KR20130097331A (en) | Apparatus and method for selecting object in device with touch screen | |
EP3261324B1 (en) | Method and device for application switching | |
CN110933772A (en) | Connection method of wireless device, mobile terminal and computer readable storage medium | |
US10498874B2 (en) | Display apparatus having ability of voice control and method of instructing voice control timing | |
CN108062952B (en) | Voice control method, device and system | |
CN105183159A (en) | Interface jump method and apparatus | |
CN113778315A (en) | Data interaction method, device and system and electronic equipment | |
CN111459272A (en) | Interaction method, interaction device, storage medium and electronic equipment | |
KR20060007148A (en) | Method for initiating voice recognition | |
CN110602325B (en) | Voice recommendation method and device for terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |