CN110620845A - Information processing method, electronic device and computing device - Google Patents
- Publication number
- CN110620845A (application number CN201910873664.1A)
- Authority
- CN
- China
- Prior art keywords
- control
- display
- electronic device
- display data
- image frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72412—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
- H04M1/72439—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72469—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W76/00—Connection management
- H04W76/10—Connection setup
- H04W76/14—Direct-mode setup
Abstract
The present disclosure provides an information processing method, including: obtaining display data, where the display data corresponds to an image frame and is dynamically updated; determining a display output mode; and displaying and outputting the image frame based on the determined display output mode and the display data. If the determined display output mode is a first display output mode, at least one first control is displayed and a first image frame is displayed and output based on first display data; if the determined display output mode is a second display output mode, a second image frame is displayed and output based on second display data, where the second image frame includes at least one second control. The response result of triggering the first control is consistent with the response result of triggering the second control. The disclosure also provides an electronic device and a computing device.
Description
Technical Field
The present disclosure relates to an information processing method, an electronic device, and a computing device.
Background
Electronic devices in the related art provide a variety of functions, for example, a function of displaying the current interface of another device; in other words, the other device may project its current interface onto the electronic device for display. However, the related art does not consider the interactivity between the electronic device and other devices, so the interaction between them is not intelligent enough in use, cannot meet users' usage requirements, and results in a poor user experience. For example, when the electronic device is a computer, the other device is a mobile phone, and the mobile phone uses full-screen gestures, it is difficult, after the current interface of the mobile phone is projected onto the computer, to operate that interface with the computer's mouse or keyboard.
Disclosure of Invention
One aspect of the present disclosure provides an information processing method, the method including: obtaining display data, where the display data corresponds to an image frame and is dynamically updated; determining a display output mode; and displaying and outputting the image frame based on the determined display output mode and the display data. If the determined display output mode is a first display output mode, at least one first control is displayed and a first image frame is displayed and output based on first display data; if the determined display output mode is a second display output mode, a second image frame is displayed and output based on second display data, where the second image frame includes at least one second control. The response result of triggering the first control is consistent with the response result of triggering the second control.
Optionally, the display area in which the at least one first control is displayed and the display area in which the first image frame is displayed and output based on the first display data do not overlap.
Optionally, the obtaining of display data includes: obtaining display data of a display screen of a second electronic device in real time through a connection channel with the second electronic device.
Optionally, the determining of the display output mode includes: obtaining configuration information of the second electronic device through the connection channel with the second electronic device. If the configuration information indicates that the second electronic device determines system instructions based on input gestures, the display output mode is determined to be the first display output mode; if the configuration information indicates that the second electronic device determines system instructions based on controls, the display output mode is determined to be the second display output mode.
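The mode-selection logic described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the mode names, the `config` dictionary shape, and the `"navigation"` key are all assumptions made for the example.

```python
from enum import Enum, auto

class DisplayOutputMode(Enum):
    """Hypothetical names for the two modes the method distinguishes."""
    SEPARATE_CONTROLS = auto()  # first mode: controls drawn outside the mirrored frame
    EMBEDDED_CONTROLS = auto()  # second mode: controls already inside the frame

def determine_display_output_mode(config: dict) -> DisplayOutputMode:
    # If the phone reports gesture navigation, its frames carry no navigation
    # controls, so the computer must add its own (first display output mode).
    if config.get("navigation") == "gesture":
        return DisplayOutputMode.SEPARATE_CONTROLS
    # Otherwise the phone renders navigation controls into the frame itself.
    return DisplayOutputMode.EMBEDDED_CONTROLS
```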
Optionally, if the first display data and the second display data are the same display data, the displaying and outputting of the first image frame based on the first display data further includes: separating the control from the image picture in the image frame corresponding to the display data, and displaying and outputting only the image picture.
Optionally, the method further includes: determining a system instruction corresponding to a first control based on a trigger signal for the first control, and updating the currently displayed first image frame in response to that system instruction; or determining a system instruction corresponding to a second control based on a trigger signal for the second control, and updating the currently displayed second image frame in response to that system instruction.
Optionally, the trigger signal for the first control is generated by a click operation on the first control received through an input device, and updating the currently displayed first image frame in response to the corresponding system instruction includes: sending the system instruction corresponding to the first control through the connection channel with the second electronic device, so that the second electronic device responds to it by updating the display data of its display screen. Alternatively, determining the system instruction corresponding to the second control based on the trigger signal for the second control includes: converting the input coordinates into converted coordinates within the display area in which the second image frame is displayed based on the second display data, and sending the converted coordinates and the input operation through the connection channel with the second electronic device, so that the second electronic device determines a click operation on the second control from the converted coordinates and the input operation, and thereby determines the corresponding system instruction. Updating the currently displayed second image frame in response to that system instruction then includes: the second electronic device responding to the system instruction corresponding to the second control by updating the display data of its display screen.
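The coordinate conversion step above — mapping a click inside the mirrored display area on the computer to a point on the phone's own screen — can be sketched as a simple linear rescale. The function name, the rectangle representation, and the rounding behavior are assumptions for this example; the patent only requires that input coordinates be converted into coordinates of the display area.

```python
def convert_coordinates(x, y, view_rect, phone_size):
    """Map a click inside the mirrored display area to phone-screen coordinates.

    view_rect  -- (left, top, width, height) of the mirrored area on the computer
    phone_size -- (width, height) of the phone's screen in pixels
    Returns converted (x, y) on the phone, or None if the click missed the area.
    """
    vx, vy, vw, vh = view_rect
    pw, ph = phone_size
    if not (vx <= x < vx + vw and vy <= y < vy + vh):
        return None  # click fell outside the mirrored area; nothing to forward
    # Linear rescale from the computer-side view rectangle to the phone screen.
    return (round((x - vx) * pw / vw), round((y - vy) * ph / vh))
```

The converted point, together with the input operation (e.g., a click), is what would be forwarded over the connection channel so the phone can resolve it against its own controls.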
Optionally, after the display output mode is determined, the display data obtained in real time is displayed and output in the determined display output mode until the second electronic device leaves the scannable range.
Another aspect of the present disclosure provides an electronic device including: an operating system and a screen projection function. The screen projection function is a functional module of the operating system, or the screen projection function is an application program with the screen projection function installed based on the operating system.
Another aspect of the disclosure provides a computing device comprising: a display function module; and the screen projection functional module is used for realizing the method.
Another aspect of the disclosure provides a non-transitory readable storage medium storing computer-executable instructions for implementing the method as above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically shows an application scenario of an information processing method and an information processing apparatus according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow chart of an information processing method according to an embodiment of the present disclosure;
FIGS. 3A-3B schematically illustrate display output diagrams according to embodiments of the present disclosure;
FIG. 4 schematically shows a flow chart of an information processing method according to another embodiment of the present disclosure;
fig. 5 schematically shows a block diagram of an information processing apparatus according to an embodiment of the present disclosure;
fig. 6 schematically shows a block diagram of an information processing apparatus according to another embodiment of the present disclosure; and
FIG. 7 schematically shows a block diagram of a computer system for implementing information processing in accordance with an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable control apparatus to produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of this disclosure, a computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer readable medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
An embodiment of the present disclosure provides an information processing method, including: the method comprises the steps of obtaining display data, wherein the display data correspond to an image frame, dynamically updating the display data, determining a display output mode, and displaying and outputting the image frame based on the determined display output mode and the display data. If the determined display output mode is the first display output mode, displaying at least one first control and displaying and outputting a first image frame based on first display data, and if the determined display output mode is the second display output mode, displaying and outputting a second image frame based on second display data, wherein the second image frame comprises at least one second control, and the response result for triggering the first control is consistent with the response result for triggering the second control.
Fig. 1 schematically shows an application scenario of an information processing method and an information processing apparatus according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, the application scenario 100 includes, for example, a first electronic device 110, a second electronic device 120, and a connection channel 130.
The first electronic device 110 may be, for example, a computer, a tablet, or the like. The second electronic device 120 may be, for example, a cell phone or the like.
According to the embodiment of the present disclosure, the second electronic device 120 may, for example, transmit an image frame to the first electronic device 110 through the connection channel 130, so that the first electronic device 110 displays the image frame of the second electronic device 120; the image frame may be, for example, the current display interface of the second electronic device 120.
In an initial stage, for example, the first electronic device 110 and the second electronic device 120 establish a connection in a low-power-consumption mode, which may be, for example, a Bluetooth broadcast mode. After the first electronic device 110 establishes a connection with the second electronic device 120, the first electronic device 110 may create a connection channel 130, which may be, for example, a Wi-Fi Direct channel, so that the second electronic device 120 can send image frames to the first electronic device 110 through the connection channel 130.
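The two-phase setup described above — discovery over a low-power link, then a high-bandwidth direct channel for the actual frames — can be sketched as a small state machine. The class, state names, and URI format here are illustrative assumptions, not the patent's protocol.

```python
class ConnectionManager:
    """Sketch of the two-phase connection setup on the first electronic device:
    phase 1 pairs over a low-power Bluetooth broadcast; phase 2 opens the
    Wi-Fi Direct channel that carries the display data."""

    def __init__(self):
        self.state = "idle"
        self.peer = None

    def on_ble_advertisement(self, device_id):
        # Phase 1: a Bluetooth broadcast lets the devices find each other
        # cheaply, without keeping a high-bandwidth radio active.
        self.peer = device_id
        self.state = "paired"

    def open_data_channel(self):
        # Phase 2: create the direct channel used for the image frames.
        if self.state != "paired":
            raise RuntimeError("pairing must complete before opening the channel")
        self.state = "connected"
        return f"wifi-direct://{self.peer}"
```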
An information processing method according to an exemplary embodiment of the present disclosure is described below with reference to fig. 2 to 4 in conjunction with an application scenario of fig. 1. It should be noted that the above application scenarios are merely illustrated for the convenience of understanding the spirit and principles of the present disclosure, and the embodiments of the present disclosure are not limited in this respect. Rather, embodiments of the present disclosure may be applied to any scenario where applicable.
Fig. 2 schematically shows a flow chart of an information processing method according to an embodiment of the present disclosure.
Fig. 3A-3B schematically illustrate display output diagrams in accordance with embodiments of the disclosure. The method of embodiments of the present disclosure is set forth below in conjunction with fig. 2 and 3A-3B.
First, as shown in FIG. 2, the method includes operations S210 to S230. The method may be applied to a first electronic device, which may be, for example, a computer.
In operation S210, display data is obtained, the display data corresponds to an image frame, and the display data is dynamically updated.
According to the embodiment of the present disclosure, the display data may be, for example, display data of the first electronic device, or may also be display data of the second electronic device. The first electronic device may be a computer, for example, and the second electronic device may be a mobile phone, for example.
The display data may be display data of the first electronic device. For example, when an application program of the first electronic device is running, if the application is a video playback program, the image frames of the played video may serve as the display data, and those image frames are dynamically updated.
Alternatively, the display data may be display data of the second electronic device. The display data may be, for example, a current interface displayed on a display screen of the second electronic device, the display data including image frames of the display screen of the second electronic device, which may be dynamically updated. For example, when an operation of returning to the previous interface is performed for the second electronic device, the display data of the second electronic device is changed from the current interface to the previous interface, and dynamic update of the display data is realized.
According to the embodiment of the present disclosure, if the display data is the display data of the second electronic device, the first electronic device may obtain the display data of the display screen of the second electronic device in real time based on the connection channel with the second electronic device.
The connection channel may be created by the first electronic device; for example, it may be a Wi-Fi Direct channel, over which the second electronic device sends the display data to the first electronic device. Alternatively, the connection channel may include a plurality of sub-channels, and the second electronic device may transmit the display data over a first sub-channel of the plurality. The first sub-channel may, for example, carry data with a relatively large volume.
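The sub-channel routing above amounts to dispatching payloads by their typical size. A minimal sketch, with assumed channel names (the text only distinguishes a "first" and a "second" sub-channel):

```python
def pick_subchannel(payload_kind: str) -> str:
    """Route a payload to the sub-channel suited to its typical size.
    Channel names here are assumptions made for illustration."""
    # Bulk, continuously updated frame data goes over the first sub-channel;
    # small, infrequent metadata (e.g., configuration info) over the second.
    return "first" if payload_kind == "display_data" else "second"
```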
In operation S220, a display output manner is determined.
In operation S230, the image frame is displayed and output based on the determined display output manner and the display data.
As shown in fig. 3A, if the determined display output mode of the first electronic device is the first display output mode, at least one first control 320a is displayed and the first image frame 310a is displayed and output based on the first display data.
For example, the display area displaying the at least one first control 320a and the display area displaying the first image frame 310a based on the first display data do not overlap. That is, the first control 320a is separate from the first image frame 310a; for example, the two are adjacent, and the first control 320a may be displayed directly below the first image frame 310a.
As shown in fig. 3B, if the determined display output mode of the first electronic device is the second display output mode, a second image frame 310b is displayed and output based on the second display data, and the second image frame 310b includes at least one second control 320b. That is, the second image frame 310b itself contains the second control 320b.
In the disclosed embodiment, the response result of triggering the first control 320a is consistent with the response result of triggering the second control 320b. For example, when both the first control 320a and the second control 320b include a back navigation key, triggering the back navigation key in either control produces the same response, for example, returning to the previous interface.
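The consistency requirement above can be made concrete with a shared dispatch table: the computer-side ("first") control and the in-frame ("second") control both resolve to the same system instruction, so triggering either yields the same response. The instruction names and handler signatures are assumptions for this sketch.

```python
# Both kinds of control map control names to one shared instruction table,
# so the response result is consistent by construction.
SYSTEM_INSTRUCTIONS = {"back": "GO_BACK", "home": "GO_HOME"}

def handle_first_control(name: str) -> str:
    """Control drawn by the computer, outside the mirrored frame."""
    return SYSTEM_INSTRUCTIONS[name]

def handle_second_control(name: str) -> str:
    """Control embedded inside the mirrored frame itself."""
    return SYSTEM_INSTRUCTIONS[name]
```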
The embodiment of the disclosure displays and outputs display data in real time through different display output modes. For example, when an image frame of a mobile phone is projected onto a computer: if the frame includes a control, the frame projected onto the computer also includes that control, and the related operation can be performed through the control displayed on the computer. If, on the other hand, the mobile phone is operated with full-screen gestures, its image frames generally contain no controls, so the frames projected onto the computer also contain none, and operating them with a mouse or keyboard is cumbersome and difficult for the user. Therefore, according to the embodiment of the disclosure, controls are additionally added in the display area of the computer, so that the related operations can be performed conveniently through them; the interaction between the computer and the mobile phone becomes more intelligent, meets users' usage requirements, and improves the user experience.
According to an embodiment of the present disclosure, if the display data is display data of the second electronic device, operation S220 includes, for example: obtaining configuration information of the second electronic device through the connection channel with the second electronic device. The configuration information may be, for example, system configuration information of the second electronic device. The connection channel may include a plurality of sub-channels, and the second electronic device may send the configuration information over a second sub-channel of the plurality. The second sub-channel may, for example, carry data with a small volume; it is understood that the configuration information may also be transmitted over the first sub-channel.
On one hand, if the configuration information indicates that the second electronic device determines system instructions based on input gestures, the display output mode is determined to be the first display output mode. That is, if the second electronic device is operated with input gestures, no control that triggers system instructions is displayed on its display screen; such a control may be, for example, a navigation key of the second electronic device. In this case, the first electronic device can display at least one first control together with the first image frame, so that the user can conveniently perform related operations on the first electronic device through the first control. This solves the problem that, when the display screen content of the second electronic device (a mobile phone) is projected onto a computer, operating it with input gestures is inconvenient; the related input can instead be performed through the first control, which is convenient for the user.
For example, the second electronic device determining system instructions based on the input gesture may include: when the user slides on the display screen of the second electronic device from bottom to top, the display interface of the second electronic device returns to the desktop, or when the user slides from the side to the middle on the display screen of the second electronic device, the second electronic device returns to the previous level interface.
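The two gesture examples above can be sketched as a classifier that maps a swipe to a system instruction. The thresholds, instruction names, and coordinate convention (origin at top-left, y growing downward) are assumptions made for this illustration.

```python
def gesture_to_instruction(start, end, screen_w, screen_h):
    """Classify a swipe into one of the system instructions described:
    bottom-up swipe -> return to desktop; edge-inward swipe -> previous interface.
    Thresholds are illustrative assumptions."""
    (x0, y0), (x1, y1) = start, end
    # Swipe starting near the bottom edge and travelling well upward.
    if y0 > screen_h * 0.9 and (y0 - y1) > screen_h * 0.3:
        return "GO_HOME"
    # Swipe starting at the side edge and travelling toward the middle.
    if x0 < screen_w * 0.05 and (x1 - x0) > screen_w * 0.2:
        return "GO_BACK"
    return None  # not a recognized system gesture
```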
On the other hand, if the configuration information indicates that the second electronic device determines system instructions based on controls, the display output mode is determined to be the second display output mode. That is, if the second electronic device is operated through controls, it displays, on its display screen, controls that trigger system instructions; such a control may be, for example, a navigation key, through which the user can, for example, return to the desktop or to the previous interface. In this case, the first electronic device may display the second image frame, which includes the second control.
According to the embodiment of the disclosure, the display output mode is determined according to whether the second electronic device determines system instructions based on input gestures or based on manipulation controls, so that the display data is displayed and output in real time through the appropriate display output mode. This makes the interaction between the first electronic device and the second electronic device more intelligent, meets the usage requirements of users, and improves the user experience.
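The mode selection described above can be sketched as follows. This is a minimal illustrative sketch, not an implementation from the patent: the function name `choose_display_output_mode`, the dictionary form of the configuration information, and the key `instruction_source` are all assumptions.

```python
# Hypothetical sketch of selecting the display output mode from the
# second device's configuration information. All names are illustrative.

GESTURE = "gesture"   # system instructions determined by input gestures
CONTROL = "control"   # system instructions determined by on-screen controls

FIRST_MODE = 1   # first display output mode: added first controls + image frame
SECOND_MODE = 2  # second display output mode: frame already containing controls

def choose_display_output_mode(configuration_info: dict) -> int:
    """Return the display output mode based on how the second electronic
    device determines system instructions."""
    source = configuration_info.get("instruction_source")
    if source == GESTURE:
        return FIRST_MODE
    if source == CONTROL:
        return SECOND_MODE
    raise ValueError("unknown instruction source in configuration information")
```

Under these assumptions, a gesture-operated phone yields the first mode and a control-operated phone yields the second mode, matching the two branches above.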
According to an embodiment of the present disclosure, if the first display data and the second display data are the same display data, displaying and outputting the first image frame based on the first display data further includes: separating the manipulation control from the image picture in the image frame corresponding to the display data, and displaying and outputting the image picture. For example, in a case where the first display data and the second display data both include the manipulation control, the image picture in the first image frame corresponding to the first display data is separated from the manipulation control, and the manipulation control is hidden.
For example, when the image frame corresponding to the display data is an image frame in a camera scene, the image frame includes an image picture (e.g., a preview picture) and a camera control, and a photograph can be taken by clicking the camera control. The camera control in the display data typically overlaps the image picture, occluding part of it. Therefore, according to the embodiment of the disclosure, by separating the control from the image picture, the control can be hidden and the image picture displayed in the corresponding display area of the first electronic device, while the first manipulation control can additionally be displayed in an area (for example, an adjacent area) that does not overlap the display area of the image picture, and shooting can be performed by clicking the first manipulation control. This makes full use of the large-screen advantage of the first electronic device, so that no control occludes the image picture when it is displayed on the first electronic device.
Alternatively, when the image frame corresponding to the display data is an image frame in a game application, the image frame includes an image picture (e.g., a game picture) and a game control, and game operations can be performed through the game control. The game control in the display data typically overlaps the game picture, occluding part of it. Therefore, the game control can be separated from the game picture and hidden, the game picture can be displayed in the corresponding display area of the first electronic device, the first manipulation control can additionally be displayed in an area (for example, an adjacent area) that does not overlap the display area of the image picture, and game operations can be performed by clicking the first manipulation control.
It can be understood that the display data in the camera scene or of the game application may be display data of the second electronic device (mobile phone) or display data of the first electronic device (computer). On one hand, when the display data is display data of the second electronic device (mobile phone), the display data is sent to the first electronic device by the second electronic device; the first electronic device separates the control from the image picture in the image frame corresponding to the display data, then displays the image picture and hides the control. On the other hand, if the display data is display data of the first electronic device, for example when the first electronic device (computer) downloads and runs an application program that follows the operation habits of the second electronic device (mobile phone), the first electronic device likewise separates the control from the image picture in the corresponding image frame, then outputs the image picture and hides the control.
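The separation-and-placement step above can be sketched with simple rectangle geometry. This is an illustrative sketch only: the frame layout as a dictionary, the `Rect` helper, and the fixed placement to the right of the picture are assumptions, not the patent's method.

```python
# Illustrative sketch: hide the manipulation control that overlaps the
# image picture and place a first manipulation control in an adjacent,
# non-overlapping area. Geometry and data layout are assumptions.

from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def overlaps(self, other: "Rect") -> bool:
        # Two axis-aligned rectangles overlap unless one is entirely
        # to the left of, right of, above, or below the other.
        return not (self.x + self.w <= other.x or other.x + other.w <= self.x
                    or self.y + self.h <= other.y or other.y + other.h <= self.y)

def separate_and_place(frame: dict) -> dict:
    """Keep the image picture, hide the original overlapping controls,
    and add a first manipulation control beside the picture."""
    picture: Rect = frame["picture"]
    shown = {"picture": picture, "hidden_controls": frame["controls"]}
    # Place the first control just to the right of the picture, so it
    # never occludes the displayed image picture.
    shown["first_control"] = Rect(picture.x + picture.w + 10, picture.y, 80, 40)
    return shown
```

In this sketch the original controls (e.g., a camera or game control) remain in the data but are never drawn, while the added control is guaranteed not to overlap the picture's display area.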
Fig. 4 schematically shows a flow chart of an information processing method according to another embodiment of the present disclosure.
As shown in FIG. 4, the method includes operations S210 to S230 and operations S410a to S420a or S410b to S420b.
In operation S410a, a system instruction corresponding to the first manipulation control is determined based on the trigger signal for the first manipulation control. The trigger signal for the first manipulation control is generated by obtaining, through an input device, a click operation on the first manipulation control. The input device may be, for example, a touch display screen, a touch pad, or a mouse of the first electronic device.
In operation S420a, the first image frame currently displayed and output is updated in response to a system instruction corresponding to the first manipulation control.
For example, a system instruction corresponding to the first manipulation control is sent through a connection channel of the second electronic device, so that the second electronic device responds to the system instruction corresponding to the first manipulation control to update display data of a display screen of the second electronic device.
For example, the display data is display data of the second electronic device (mobile phone) in a camera scene. The first electronic device separates the image picture from the camera control in the camera scene, displays the image picture on the first electronic device, and additionally displays the first manipulation control in an area that does not overlap the image picture. Clicking the first manipulation control triggers shooting: a system instruction is generated by clicking the first manipulation control on the first electronic device and sent to the second electronic device through the connection channel. After the second electronic device receives and executes the system instruction to perform the photographing operation (for example, taking a picture), the display data of the second electronic device is updated: for example, the taken picture is stored in a storage space of the second electronic device, and the current preview picture of the second electronic device is updated. The updated preview picture is then transmitted to the first electronic device as updated display data through the connection channel; the first electronic device again separates the image picture from the control in the updated display data, displays the image picture, and additionally displays the first manipulation control in an area that does not overlap the image picture. Alternatively, the first electronic device may not add the first manipulation control again, but continue to use the first manipulation control that was displayed before the update.
The connection channel for transmitting the system instruction may include a plurality of sub-channels, and the first electronic device may send the system instruction to the second electronic device through a third sub-channel of the plurality of sub-channels. The third sub-channel may be used, for example, to transmit control instructions. It can be understood that the system instruction may also be transmitted through the first sub-channel or the second sub-channel.
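A simple way to picture the sub-channel arrangement is a routing table. This is an assumption-laden sketch: the dictionary form, the `route` function, and the fallback to the first sub-channel are illustrative; the patent only states that system instructions go, for example, through the third sub-channel and converted coordinates through the fourth.

```python
# Hypothetical sub-channel routing for the connection channel between
# the two devices. Only the third and fourth sub-channel roles come
# from the description above; everything else is illustrative.

SUB_CHANNELS = {
    "system_instruction": 3,  # third sub-channel: control instructions
    "input_event": 4,         # fourth sub-channel: converted coordinates + input operation
}

def route(payload_kind: str, default: int = 1) -> int:
    """Return the sub-channel a payload of this kind is sent through.
    Unlisted payloads fall back to the first sub-channel, since the
    description permits transmission over any sub-channel."""
    return SUB_CHANNELS.get(payload_kind, default)
```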
Alternatively, in operation S410b, a system instruction corresponding to the second manipulation control is determined based on the trigger signal for the second manipulation control.
For example, the trigger signal for the second manipulation control is generated by obtaining input coordinates and an input operation through an input device, which may be, for example, a touch display screen, a touch pad, or a mouse of the first electronic device.
For example, the display data is display data of the second electronic device (mobile phone) in a camera scene. When the second image frame is output in the second display output mode, the second image frame includes the image picture and the camera control in the camera scene. Clicking the second manipulation control triggers shooting: a system instruction is generated by clicking the second manipulation control on the first electronic device and sent to the second electronic device through the connection channel, so that the second electronic device can receive the system instruction and perform the photographing operation (for example, taking a picture).
Before sending the system instruction to the second electronic device through the connection channel, the first electronic device first converts the input coordinates into converted coordinates in the display area that displays and outputs the second image frame based on the second display data. For example, the input coordinates (a, b) of the input operation on the first electronic device are converted into converted coordinates (c, d) in the display area of the second image frame. Since the display area of the second image frame on the first electronic device corresponds to the current interface of the display screen of the second electronic device, the converted coordinates correspond to coordinates on the current interface of the display screen of the second electronic device.
The converted coordinates and the input operation are sent to the second electronic device through the connection channel, so that the second electronic device determines a click operation on the second manipulation control based on the converted coordinates and the input operation, and determines the system instruction corresponding to the second manipulation control. For example, by sending the converted coordinates (c, d) to the second electronic device, the second electronic device can determine that the click operation on the first electronic device was a click on the camera control, can therefore determine that the system instruction is a shooting instruction, and can execute the system instruction to perform the photographing operation (e.g., taking a picture).
The connection channel for transmitting the converted coordinates and the input operation may include a plurality of sub-channels, and the first electronic device may transmit the converted coordinates and the input operation to the second electronic device through a fourth sub-channel of the plurality of sub-channels. It will be appreciated that the transformed coordinates and input operations may also be transmitted through the first sub-channel, the second sub-channel, or the third sub-channel.
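The coordinate conversion from (a, b) to (c, d) described above can be worked through concretely. This sketch assumes the display area mirrors the phone screen with independent per-axis scaling; the function name and all geometry values are illustrative, not from the patent.

```python
# Minimal sketch of the coordinate conversion: map input coordinates on
# the first device into the display area showing the second image frame,
# then scale into the second device's screen coordinate space.

def convert_coordinates(input_xy, area_origin, area_size, phone_size):
    """Map (a, b) on the first device to (c, d) on the second device's
    current interface, assuming the display area mirrors the phone
    screen with per-axis linear scaling."""
    x, y = input_xy
    ax, ay = area_origin          # top-left of the second image frame's display area
    aw, ah = area_size            # size of that display area on the first device
    pw, ph = phone_size           # resolution of the second device's screen
    # Coordinates relative to the display area of the second image frame.
    rel_x, rel_y = x - ax, y - ay
    # Scale into the second device's screen coordinate space.
    return (rel_x * pw / aw, rel_y * ph / ah)
```

For instance, with the mirrored area at (100, 100) sized 400 x 800 and a 1080 x 2160 phone screen, a click at (300, 500) lands at (540, 1080) on the phone's current interface, where the second device can hit-test it against the camera control.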
In operation S420b, the second image frame currently displayed and output is updated in response to the system command corresponding to the second manipulation control.
For example, the second electronic device responds to a system instruction corresponding to the second control to update the display data of the display screen of the second electronic device, so that the second image frame currently displayed and output by the first electronic device is updated based on the update operation of the display screen of the second electronic device.
For example, after the second electronic device executes the system instruction to perform the photographing operation (e.g., taking a picture), the display data of the second electronic device is updated: the taken picture is stored in a storage space of the second electronic device, and the current preview picture of the second electronic device is updated. The updated preview picture, which includes, for example, a manipulation control, is transmitted to the first electronic device through the connection channel as updated display data. The first electronic device continues to output, based on the second display output mode, a second image frame corresponding to the updated display data; the updated second image frame displayed on the first electronic device thus also includes, for example, the second manipulation control.
According to the embodiment of the disclosure, after the display output mode is determined, the display data obtained in real time is displayed and output based on the determined display output mode until the second electronic device leaves the scannable range.
For example, the first electronic device and the second electronic device establish a connection in a low-power-consumption manner, for example through Bluetooth broadcast. After the connection is established, the first electronic device may create a connection channel, for example a Wi-Fi direct connection channel. On one hand, during the process in which the first electronic device and the second electronic device transmit image frames through the connection channel, if the display output mode (the first or the second display output mode) was determined at the beginning, then after the first electronic device subsequently acquires the display data of the second electronic device in real time, the image frames corresponding to the later-acquired display data continue to be output based on the initially determined display output mode. If the second electronic device leaves the scannable range, it is disconnected from the first electronic device; if it later enters the scannable range again and reconnects, the first electronic device can acquire the display data of the second electronic device again and re-determine the display output mode, thereby improving the accuracy of the display output mode after the devices reconnect.
On the other hand, after the second electronic device and the first electronic device reconnect, the first electronic device may determine, through the device identifier of the second electronic device, whether the second electronic device is the previously connected device. If the device identifier indicates that it is, the image frames acquired in real time may be displayed and output based on the previously determined display output mode without determining the display output mode again, which saves computing resources and improves processing efficiency.
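The device-identifier shortcut above amounts to caching the determined mode per device. The following sketch is an assumption-level illustration: the cache dictionary, the `mode_on_connect` function, and passing the mode-determination step as a callable are all hypothetical.

```python
# Illustrative cache of the previously determined display output mode,
# keyed by device identifier, so a reconnecting known device reuses its
# earlier mode instead of triggering a fresh determination.

_mode_cache = {}

def mode_on_connect(device_id, determine_mode):
    """Return the display output mode for a connecting device.
    Known devices hit the cache; new devices run determine_mode() once
    and the result is remembered for later reconnections."""
    if device_id in _mode_cache:
        return _mode_cache[device_id]
    mode = determine_mode()
    _mode_cache[device_id] = mode
    return mode
```

On a reconnection from the same device identifier, `determine_mode` is never called again, which is the computing-resource saving the paragraph describes.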
In addition, the present disclosure also provides an electronic device including a screen projection function, where the screen projection function is a functional module of the operating system, or is an application program with the screen projection function installed on the operating system. The electronic device can implement the methods described with reference to fig. 1 to 4, for example operations S210 to S230 shown in fig. 2, and operations S410a to S420a or S410b to S420b shown in fig. 4.
Additionally, the present disclosure also provides a computing device comprising a display function module and a screen projection function module. The screen projection function module is used to implement the methods described with reference to fig. 1 to 4, for example operations S210 to S230 shown in fig. 2, and operations S410a to S420a or S410b to S420b shown in fig. 4.
Fig. 5 schematically shows a block diagram of an information processing apparatus according to an embodiment of the present disclosure.
As shown in fig. 5, the information processing apparatus 500 includes an acquisition module 510, a first determination module 520, and a display output module 530.
The obtaining module 510 may be configured to obtain display data, where the display data corresponds to an image frame, and the display data is dynamically updated. According to the embodiment of the present disclosure, the obtaining module 510 may perform, for example, the operation S210 described above with reference to fig. 2, which is not described herein again.
The first determination module 520 may be used to determine a display output manner. According to the embodiment of the present disclosure, the first determining module 520 may perform, for example, operation S220 described above with reference to fig. 2, which is not described herein again.
The display output module 530 may be configured to display the output image frame based on the determined display output mode and the display data. According to the embodiment of the disclosure, the display output module 530 may perform the operation S230 described above with reference to fig. 2, for example, and is not described herein again.
According to the embodiment of the disclosure, if the determined display output mode is the first display output mode, at least one first manipulation control is displayed and a first image frame is displayed and output based on the first display data. If the determined display output mode is the second display output mode, a second image frame is displayed and output based on the second display data, the second image frame including at least one second manipulation control. The response result of triggering the first manipulation control is consistent with the response result of triggering the second manipulation control.
According to an embodiment of the present disclosure, a display area displaying the at least one first manipulation control and a display area displaying the output first image frame based on the first display data do not overlap.
According to an embodiment of the present disclosure, obtaining display data includes: and obtaining display data of a display screen of the second electronic equipment in real time based on a connecting channel with the second electronic equipment.
According to an embodiment of the present disclosure, determining a display output mode includes: configuration information of the second electronic device is obtained based on a connection channel with the second electronic device. And if the configuration information represents that the second electronic equipment determines the system instruction based on the input gesture, determining that the display output mode is the first display output mode, and if the configuration information represents that the second electronic equipment determines the system instruction based on the control, determining that the display output mode is the second display output mode.
According to an embodiment of the present disclosure, if the first display data and the second display data are the same display data, displaying and outputting the first image frame based on the first display data further includes: and separating the control and the image picture in the image frame corresponding to the display data, and displaying and outputting the image picture.
Fig. 6 schematically shows a block diagram of an information processing apparatus according to another embodiment of the present disclosure.
As shown in fig. 6, the information processing apparatus 600 includes an acquisition module 510, a first determination module 520, a display output module 530, a second determination module 610a, a first update module 620a, a third determination module 610b, and a second update module 620b.
The second determining module 610a may be configured to determine a system instruction corresponding to the first manipulation control based on the trigger signal for the first manipulation control. According to an embodiment of the present disclosure, the second determining module 610a may perform, for example, operation S410a described above with reference to fig. 4, which is not described herein again.
The first updating module 620a may be configured to update the first image frame currently displayed and output in response to a system instruction corresponding to the first manipulation control. According to the embodiment of the present disclosure, the first updating module 620a may perform, for example, the operation S420a described above with reference to fig. 4, which is not described herein again.
The third determining module 610b may be configured to determine a system instruction corresponding to the second manipulation control based on the trigger signal for the second manipulation control. According to an embodiment of the present disclosure, the third determining module 610b may perform, for example, operation S410b described above with reference to fig. 4, which is not described herein again.
The second updating module 620b may be configured to update the currently displayed and output second image frame in response to a system instruction corresponding to the second control. According to the embodiment of the present disclosure, the second updating module 620b may perform, for example, the operation S420b described above with reference to fig. 4, which is not described herein again.
According to the embodiment of the present disclosure, the trigger signal for the first manipulation control is generated by obtaining a click operation on the first manipulation control through an input device. Updating the currently displayed and output first image frame in response to the system instruction corresponding to the first manipulation control includes: sending the system instruction corresponding to the first manipulation control to the second electronic device through the connection channel, so that the second electronic device responds to the system instruction by updating the display data of its display screen. Alternatively, determining the system instruction corresponding to the second manipulation control based on the trigger signal for the second manipulation control includes: converting the input coordinates into converted coordinates in the display area that displays and outputs the second image frame based on the second display data, and sending the converted coordinates and the input operation to the second electronic device through the connection channel, so that the second electronic device determines a click operation on the second manipulation control based on the converted coordinates and the input operation, and determines the system instruction corresponding to the second manipulation control. Updating the currently displayed and output second image frame in response to the system instruction corresponding to the second manipulation control includes: the second electronic device responding to the system instruction by updating the display data of its display screen.
According to the embodiment of the disclosure, after the display output mode is determined, the display data obtained in real time is displayed and output based on the determined display output mode until the second electronic device leaves the scannable range.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any plurality of the obtaining module 510, the first determining module 520, the display output module 530, the second determining module 610a, the first updating module 620a, the third determining module 610b, and the second updating module 620b may be combined into one module to be implemented, or any one of the modules may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the obtaining module 510, the first determining module 520, the display output module 530, the second determining module 610a, the first updating module 620a, the third determining module 610b, and the second updating module 620b may be at least partially implemented as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or implemented by any one of three implementations of software, hardware, and firmware, or by a suitable combination of any of them. Alternatively, at least one of the obtaining module 510, the first determining module 520, the display output module 530, the second determining module 610a, the first updating module 620a, the third determining module 610b, and the second updating module 620b may be at least partially implemented as a computer program module, which may perform a corresponding function when executed.
FIG. 7 schematically shows a block diagram of a computer system for implementing information processing in accordance with an embodiment of the present disclosure. The computer system illustrated in FIG. 7 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 7, a computer system 700 implementing information processing includes a processor 701 and a computer-readable storage medium 702. The system 700 may perform a method according to an embodiment of the present disclosure.
In particular, the processor 701 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 701 may also include on-board memory for caching purposes. The processor 701 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage medium 702 may be, for example, any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The computer-readable storage medium 702 may comprise a computer program 703, which computer program 703 may comprise code/computer-executable instructions that, when executed by the processor 701, cause the processor 701 to perform a method according to an embodiment of the disclosure, or any variant thereof.
The computer program 703 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, code in the computer program 703 may include one or more program modules, for example module 703A, module 703B, and so on. It should be noted that the division and number of the modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, so that when these program modules are executed by the processor 701, the processor 701 may perform the method according to the embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the obtaining module 510, the first determining module 520, the display output module 530, the second determining module 610a, the first updating module 620a, the third determining module 610b, and the second updating module 620b may be implemented as a computer program module described with reference to fig. 7, which, when executed by the processor 701, may implement the respective operations described above.
The present disclosure also provides a computer-readable medium, which may be embodied in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The above-mentioned computer-readable medium carries one or more programs which, when executed, implement the above-mentioned information processing method.
According to embodiments of the present disclosure, a computer readable medium may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, optical fiber cable, radio frequency signals, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or combinations of features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or combinations are not expressly recited in the present disclosure. In particular, various combinations and/or combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or associations are within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.
Claims (10)
1. An information processing method, the method comprising:
acquiring display data, wherein the display data corresponds to an image frame and is dynamically updated;
determining a display output mode; and
displaying the image frame based on the determined display output mode and the display data;
wherein:
if the determined display output mode is a first display output mode, at least one first control is displayed and a first image frame is displayed based on first display data; and
if the determined display output mode is a second display output mode, a second image frame is displayed based on second display data, the second image frame including at least one second control,
and a response result of triggering the first control is consistent with a response result of triggering the second control.
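The two display output modes of claim 1 can be sketched as follows. This is a minimal illustration, not the patent's implementation; all names (`DisplayOutputMode`, `DisplayData`, `display_output`) are hypothetical, and the drawing operations are stand-ins for actual rendering.

```python
from dataclasses import dataclass
from enum import Enum

class DisplayOutputMode(Enum):
    FIRST = 1   # controls drawn separately, alongside the bare image frame
    SECOND = 2  # controls already embedded in the projected image frame

@dataclass
class DisplayData:
    frame: bytes            # encoded image frame from the source device
    embeds_controls: bool   # True if the frame already contains controls

def display_output(mode: DisplayOutputMode, data: DisplayData) -> list:
    """Return the drawing operations for one display update cycle."""
    if mode is DisplayOutputMode.FIRST:
        # First mode: draw local controls plus the image frame.
        return ["draw_controls", "draw_frame"]
    # Second mode: the frame itself carries the controls.
    return ["draw_frame"]
```

In both modes, triggering a control ultimately produces the same response result; only where the control is rendered differs.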
2. The method of claim 1, wherein a display area in which the at least one first control is displayed does not overlap with a display area in which the first image frame is displayed based on the first display data.
3. The method of claim 1, wherein the acquiring display data comprises:
obtaining, in real time, display data of a display screen of a second electronic device through a connection channel with the second electronic device.
4. The method of claim 1, wherein the determining a display output mode comprises:
obtaining configuration information of a second electronic device through a connection channel with the second electronic device;
if the configuration information indicates that the second electronic device determines system instructions based on input gestures, determining that the display output mode is the first display output mode; and
if the configuration information indicates that the second electronic device determines system instructions based on controls, determining that the display output mode is the second display output mode.
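The mode determination in claim 4 amounts to a mapping from the second device's configuration information to a mode. In this sketch, the key `"instruction_source"` and its values are hypothetical; the patent only states that the configuration indicates whether system instructions come from input gestures or from controls.

```python
def determine_display_mode(config: dict) -> str:
    """Map the second device's configuration info to a display output mode.

    Assumed (illustrative) schema: config["instruction_source"] is either
    "gesture" (device derives system instructions from input gestures) or
    "control" (device derives them from on-screen controls).
    """
    source = config.get("instruction_source")
    if source == "gesture":
        return "first"
    if source == "control":
        return "second"
    raise ValueError("unrecognized configuration info")
```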
5. The method of claim 2, wherein, if the first display data and the second display data are the same display data, the displaying a first image frame based on the first display data further comprises:
separating the controls from the image picture in the image frame corresponding to the display data; and
displaying the image picture.
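The separation step in claim 5 could, under one assumption, work from known control regions in the frame. The sketch below treats a frame as a list of pixel rows and the control regions as row bands; this representation and the function name are illustrative, not from the patent.

```python
def separate_controls(frame_rows, control_bands):
    """Split a frame into (image_rows, control_rows).

    frame_rows: list of pixel rows making up the image frame;
    control_bands: list of (start, end) row ranges (end exclusive)
    assumed to be covered by on-screen controls.
    """
    covered = set()
    for start, end in control_bands:
        covered.update(range(start, end))
    # The image picture is everything outside the control regions.
    image = [row for i, row in enumerate(frame_rows) if i not in covered]
    controls = [row for i, row in enumerate(frame_rows) if i in covered]
    return image, controls
```

After separation, only the image picture is displayed; the controls are rendered locally as first controls.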
6. The method of claim 1 or 5, further comprising:
determining a system instruction corresponding to the first control based on a trigger signal for the first control;
responding to the system instruction corresponding to the first control to update the currently displayed first image frame;
or:
determining a system instruction corresponding to the second control based on a trigger signal for the second control;
responding to the system instruction corresponding to the second control to update the currently displayed second image frame.
7. The method of claim 6, wherein the trigger signal for the first control is generated by obtaining, through an input device, a click operation on the first control, and the updating the currently displayed first image frame in response to the system instruction corresponding to the first control comprises:
sending the system instruction corresponding to the first control through a connection channel with the second electronic device, so that the second electronic device responds to the system instruction corresponding to the first control to update the display data of its display screen;
or:
the trigger signal for the second control is input coordinates and an input operation obtained through an input device;
the determining a system instruction corresponding to the second control based on the trigger signal for the second control comprises:
converting the input coordinates into converted coordinates in a display area in which the second image frame is displayed based on the second display data; and
sending the converted coordinates and the input operation through the connection channel with the second electronic device, so that the second electronic device determines a click operation on the second control based on the converted coordinates and the input operation, and determines the system instruction corresponding to the second control;
and the responding to the system instruction corresponding to the second control to update the currently displayed second image frame comprises:
the second electronic device responding to the system instruction corresponding to the second control to update the display data of its display screen.
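The coordinate conversion in claim 7 maps a click in the local display area showing the projected frame to the coordinate space of the second device's screen. A minimal sketch, assuming simple linear scaling (the patent does not specify the conversion formula, and the parameter layout here is hypothetical):

```python
def convert_coordinates(x, y, display_rect, remote_size):
    """Convert input coordinates to the second device's screen coordinates.

    x, y: input coordinates from the local input device;
    display_rect: (left, top, width, height) of the display area in which
    the second image frame is displayed;
    remote_size: (width, height) of the second device's display screen.
    """
    left, top, width, height = display_rect
    remote_w, remote_h = remote_size
    # Offset into the display area, scaled to the remote resolution.
    return ((x - left) * remote_w / width, (y - top) * remote_h / height)
```

The converted coordinates, together with the input operation, are then sent over the connection channel so the second device can resolve the click against its own controls.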
8. The method according to any one of claims 1-7, wherein, after the display output mode is determined, display output is performed, based on the determined display output mode, on the display data obtained in real time, until the second electronic device leaves the scannable range.
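The real-time behavior of claim 8 is essentially a render loop gated on the second device remaining in scannable range. This sketch abstracts the data acquisition, range check, and rendering behind callables; all of these names are illustrative.

```python
def projection_loop(get_display_data, in_range, render, mode):
    """Render real-time display data in the chosen mode until the second
    electronic device leaves the scannable range.

    get_display_data: callable returning the latest display data;
    in_range: callable returning True while the device is still scannable;
    render: callable(mode, data) performing one display output;
    mode: the display output mode determined earlier.
    """
    frames = 0
    while in_range():
        render(mode, get_display_data())
        frames += 1
    return frames
```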
9. An electronic device, comprising:
an operating system; and
a screen projection function, wherein the screen projection function is a functional module of the operating system, or an application program with a screen projection function installed on the operating system.
10. A computing device, comprising:
a display function module; and
a screen projection function configured to implement the method of any one of claims 1 to 8.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910873664.1A CN110620845B (en) | 2019-09-12 | 2019-09-12 | Information processing method, electronic device and computing device |
US17/018,832 US11349976B2 (en) | 2019-09-12 | 2020-09-11 | Information processing method, file transmission method, electronic apparatus, and computing apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910873664.1A CN110620845B (en) | 2019-09-12 | 2019-09-12 | Information processing method, electronic device and computing device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110620845A true CN110620845A (en) | 2019-12-27 |
CN110620845B CN110620845B (en) | 2021-01-15 |
Family
ID=68923070
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910873664.1A Active CN110620845B (en) | 2019-09-12 | 2019-09-12 | Information processing method, electronic device and computing device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110620845B (en) |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2224325A1 (en) * | 2009-02-27 | 2010-09-01 | Research In Motion Limited | A handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device |
CN104077074A (en) * | 2013-03-28 | 2014-10-01 | 三星电子株式会社 | Electronic device including projector and method for controlling the electronic device |
CN104777991A (en) * | 2015-04-22 | 2015-07-15 | 四川华立德科技有限公司 | Remote interactive projection system based on mobile phone |
KR20160014759A (en) * | 2016-01-26 | 2016-02-11 | 삼성전자주식회사 | Apparatus and method for controlling image screen using portable terminal |
CN106293037A (en) * | 2015-06-12 | 2017-01-04 | 联想(北京)有限公司 | A kind of exchange method and electronic equipment |
CN107493375A (en) * | 2017-06-30 | 2017-12-19 | 北京超卓科技有限公司 | Mobile terminal expanded type throws screen method and throws screen system |
US20180151150A1 (en) * | 2016-11-28 | 2018-05-31 | Displaylink (Uk) Limited | Displaying Image Data Based on Level of System Performance |
CN108491131A (en) * | 2018-01-19 | 2018-09-04 | 广州视源电子科技股份有限公司 | Operation method and device of intelligent interaction panel and intelligent interaction panel |
CN108874342A (en) * | 2018-06-13 | 2018-11-23 | 深圳市东向同人科技有限公司 | Projection view switching method and terminal device |
CN108900697A (en) * | 2018-05-30 | 2018-11-27 | 武汉卡比特信息有限公司 | Terminal word information input system and method when mobile phone and computer terminal interconnect |
CN108932144A (en) * | 2018-06-11 | 2018-12-04 | 广州视源电子科技股份有限公司 | display interface control method, device, equipment and storage medium |
CN208367339U (en) * | 2018-04-18 | 2019-01-11 | 深圳众赢时代科技有限公司 | A kind of projection mobile phone based on virtual reality technology |
CN110109636A (en) * | 2019-04-28 | 2019-08-09 | 华为技术有限公司 | Throw screen method, electronic equipment and system |
CN110121158A (en) * | 2019-04-26 | 2019-08-13 | 天派电子(深圳)有限公司 | A kind of control method and device of vehicle multimedia navigation terminal |
CN110147199A (en) * | 2019-05-23 | 2019-08-20 | 北京硬壳科技有限公司 | Display on the same screen method, apparatus and touch control display |
Also Published As
Publication number | Publication date |
---|---|
CN110620845B (en) | 2021-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110597474B (en) | Information processing method and electronic equipment | |
CN111913628B (en) | Sharing method and device and electronic equipment | |
WO2020015333A1 (en) | Video shooting method and apparatus, terminal device, and storage medium | |
WO2021115194A1 (en) | Application icon display method and electronic device | |
CN112558825A (en) | Information processing method and electronic equipment | |
CN110602805B (en) | Information processing method, first electronic device and computer system | |
CN104866262B (en) | Wearable device | |
US11425466B2 (en) | Data transmission method and device | |
WO2021031623A1 (en) | Display apparatus, file sharing method, and server | |
CN112527174B (en) | Information processing method and electronic equipment | |
US11954200B2 (en) | Control information processing method and apparatus, electronic device, and storage medium | |
US9411488B2 (en) | Display apparatus and method for controlling display apparatus thereof | |
CN113325996A (en) | Split screen display method and device | |
US11606581B2 (en) | Method, system, and non-transitory computer-readable record medium for cancelling delay of guest broadcasting in live broadcast | |
US20230328197A1 (en) | Display method and apparatus based on augmented reality, device, and storage medium | |
CN112527222A (en) | Information processing method and electronic equipment | |
JP7236551B2 (en) | CHARACTER RECOMMENDATION METHOD, CHARACTER RECOMMENDATION DEVICE, COMPUTER AND PROGRAM | |
WO2023185647A1 (en) | Media content display method and apparatus, and device, storage medium and program product | |
WO2020248681A1 (en) | Display device and method for displaying bluetooth switch states | |
KR20140140720A (en) | Method and apparatus for application execution control | |
CN112004041B (en) | Video recording method, device, terminal and storage medium | |
US20140380161A1 (en) | Information processing apparatus, information processing method, and program | |
CN110022445B (en) | Content output method and terminal equipment | |
CN110908638A (en) | Operation flow creating method and electronic equipment | |
WO2021197260A1 (en) | Note creating method and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||