
CN115514883A - Cross-device collaborative shooting method, related device and system


Info

Publication number: CN115514883A (granted as CN115514883B)
Application number: CN202210973390.5A
Authority: CN (China)
Prior art keywords: image, shooting, master device, slave device
Other languages: Chinese (zh)
Inventor: 冯可荣
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202210973390.5A
Legal status: Active (application granted)

Classifications

    • H: Electricity
    • H04: Electric communication technique
    • H04N: Pictorial communication, e.g. television
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

Provided are a cross-device collaborative shooting method, a related device, and a system. In the method, a master device and a slave device can perform collaborative shooting; the master device can receive a user operation for adjusting the shooting effect of the slave device, generate a control command, and send the control command to the slave device. The slave device can adjust its shooting effect in response to the control command and then send the image acquired after the adjustment to the master device. By implementing the cross-device collaborative shooting method, the master device can not only provide the user with a multi-view shooting experience but also control the shooting effect of the slave device during collaborative shooting, thereby meeting the user's need to control the remote shooting effect.

Description

Cross-device collaborative shooting method, related device and system
Technical Field
The present application relates to the field of shooting technologies, and in particular, to a cross-device collaborative shooting method, a related apparatus, and a system.
Background
With the development of smart mobile devices, the shooting capabilities of their cameras have become increasingly powerful.
Because a single device has a limited shooting angle of view, current smart mobile devices can perform cross-device collaborative shooting with other devices to obtain more diverse shooting angles and achieve a better shooting effect. During cross-device collaborative shooting, a user wants to be able to control the shooting effect of the other devices. How to meet this shooting requirement is a problem that urgently needs to be solved.
Disclosure of Invention
The application provides a cross-device collaborative shooting method, a related device and a system, and solves the problem that a master device cannot control the shooting effect of a slave device when cross-device collaborative shooting is performed.
In a first aspect, an embodiment of the present application provides a cross-device collaborative shooting method. The method is applied to a master device, the master device establishes a communication connection with m slave devices, and m is an integer greater than or equal to 1. The method includes: the master device displays an interface of an application; the master device receives a first image sent by a slave device, where the first image is an image acquired and processed by the slave device according to a first shooting parameter; the master device displays m first images on the interface; the master device receives at least one operation; the master device, in response to the at least one operation, sends a control command carrying a second shooting parameter to the slave device, where the second shooting parameter is used to adjust the shooting effect of the slave device; the master device receives a second image sent by the slave device, where the second image is an image acquired and processed by the slave device according to the second shooting parameter; and the master device displays the second image on the interface and no longer displays the first image.
By implementing the method provided in the first aspect, during cross-device collaborative shooting the master device may first obtain an image acquired and processed by the slave device according to the first shooting parameter. From this image the user can identify the shooting parameter that needs adjusting, and the user can then send, through the master device, a control command for adjusting the shooting effect to the slave device. After the slave device adjusts its shooting parameters, the master device can display the adjusted image returned by the slave device, so that the user controls the shooting effect of the slave device through the master device.
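As an illustration only, the first-aspect flow on the master device can be sketched in Java as follows. The class and method names (MasterDeviceSketch, SlaveLink, sendControlCommand, and so on) are assumptions introduced for this sketch and do not appear in the original description.

    import java.util.Map;

    // Hypothetical sketch of the first-aspect flow on the master device.
    public class MasterDeviceSketch {

        // A control command carrying the second shooting parameters (names assumed).
        static class ControlCommand {
            final Map<String, Object> shootingParameters;
            ControlCommand(Map<String, Object> shootingParameters) {
                this.shootingParameters = shootingParameters;
            }
        }

        // Abstraction of the communication connection with one slave device.
        interface SlaveLink {
            byte[] receiveImage();                        // first or second image from the slave
            void sendControlCommand(ControlCommand cmd);  // command adjusting the shooting effect
        }

        // Show the first image, forward the user's adjustment, then show the second image.
        void onUserAdjusts(SlaveLink slave, Map<String, Object> secondParameters) {
            byte[] firstImage = slave.receiveImage();     // captured with the first parameters
            display(firstImage);
            slave.sendControlCommand(new ControlCommand(secondParameters));
            byte[] secondImage = slave.receiveImage();    // captured with the second parameters
            display(secondImage);                         // replaces the first image on the interface
        }

        void display(byte[] image) { /* render on the application interface */ }
    }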
With reference to the first aspect, the interface of the application displayed by the master device further includes a plurality of shooting options corresponding to the slave device, where the plurality of shooting options respectively correspond to shooting capabilities of the slave device. The at least one operation includes an operation acting on one of the plurality of shooting options, and the second shooting parameter includes the shooting parameter corresponding to the shooting option acted on by the operation.
In this way, the user can directly view the shooting capability of the slave device on the master device, and a user operation on the master device controls the slave device to adjust accordingly, so that the master device remotely controls the shooting effect of the slave device.
In combination with the above embodiment, before the master device displays the multiple shooting options, the master device may further obtain the shooting capabilities of the slave devices; wherein the second shooting parameter is within the shooting capability range of the slave device.
In this way, the master device can acquire the shooting capabilities of the slave devices, and thus, the master device can display the capabilities of the slave devices to a user for viewing.
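The capability exchange described above can be illustrated with a minimal sketch, again under the assumption of hypothetical names (ShootingCapability, queryCapability); the point is only that the options shown to the user are derived from, and bounded by, the capability the slave reports.

    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch: before showing shooting options, the master asks the slave
    // for its shooting capabilities and only offers parameters within that range.
    public class CapabilityExchangeSketch {

        static class ShootingCapability {
            List<String> supportedModes;   // e.g. night mode, portrait mode (assumed names)
            double minZoom;
            double maxZoom;
            boolean hasFlash;
        }

        interface SlaveLink {
            ShootingCapability queryCapability();  // capability report returned by the slave
        }

        // Build the shooting options shown on the interface from the reported capability,
        // so any second shooting parameter stays within the slave's capability range.
        Map<String, Object> buildOptions(SlaveLink slave) {
            ShootingCapability cap = slave.queryCapability();
            return Map.of(
                    "zoomRange", new double[] { cap.minZoom, cap.maxZoom },
                    "flashAvailable", cap.hasFlash,
                    "modes", cap.supportedModes);
        }
    }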
With reference to the first aspect, in some embodiments, the method further includes: the master device acquires and processes an image to obtain a third image; the master device also displays the third image on the interface.
In this way, the master device can display the image it acquires and processes itself while displaying the image acquired and processed by the slave device. The user can simultaneously see the pictures shot by the master device and the slave device on the display screen of the master device, which improves the user experience and meets the need for multi-angle, multi-picture shooting.
In combination with the above embodiment, the interface further includes a plurality of shooting options corresponding to the master device, where the plurality of shooting options corresponding to the master device correspond to shooting capabilities of the master device. The master device receives another operation acting on one of the plurality of shooting options corresponding to the master device, and acquires and processes an image according to the shooting parameter corresponding to the shooting option acted on by this other operation to obtain a fourth image; the master device displays the fourth image on the interface and no longer displays the third image.
In this way, the user can also control the shooting effect of the master device itself, which gives the user more shooting choices.
With reference to the first aspect, in some embodiments, before the master device sends, in response to the at least one operation, the control command carrying the second shooting parameter to the slave device, the method further includes: the master device determines a first number and a first type, where the first number is the number of image streams required for displaying the second image, and the first type includes the types of the image streams required for displaying the second image; the master device determines a second number and a second type, where the second number is smaller than the first number, and the first type includes the second type. The control command further carries the second number and the second type, and the second image includes the second number and second type of image streams acquired and processed by the slave device according to the second shooting parameter.
In this way, the master device can reduce the number of image streams requested from the slave device, and accordingly, when the slave device transmits the image streams to the master device, the number of transmitted streams can be reduced, thereby reducing the network data transmission load and improving the transmission efficiency.
In combination with the above embodiment, after the master device receives the second image sent by the slave device, the method further includes: the master device processes the second image into the first number and first type of image streams; and the master device displays the second image on the interface according to the first number and first type of image streams.
In this way, the master device can reproduce more image streams from a smaller number of received image streams.
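A minimal sketch of this stream negotiation, with assumed names (ImageStream, negotiatedStreamCount, expand), could look like the following: the master requests fewer streams over the network and locally re-creates the full first number and type of streams from what it receives.

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical sketch of the stream-count negotiation: the master needs a first number
    // of streams (e.g. preview plus recording) but requests a smaller second number from
    // the slave, then locally reproduces the missing streams after receiving them.
    public class StreamNegotiationSketch {

        static class ImageStream {
            final String type;    // e.g. "preview" or "record" (assumed type names)
            final byte[] frames;
            ImageStream(String type, byte[] frames) { this.type = type; this.frames = frames; }
        }

        // Request only one stream over the network instead of every required stream.
        int negotiatedStreamCount(int requiredCount) {
            return Math.min(1, requiredCount);   // the second number is smaller than the first
        }

        // Re-create the first number and type of streams from the single received stream.
        List<ImageStream> expand(ImageStream received, List<String> requiredTypes) {
            List<ImageStream> result = new ArrayList<>();
            for (String type : requiredTypes) {
                result.add(new ImageStream(type, received.frames));  // copy or convert locally
            }
            return result;
        }
    }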
With reference to the first aspect, in some embodiments, the master device runs a shooting application, and the interface is provided by the shooting application.
With reference to the first aspect, in some embodiments, the master device runs a live broadcast application, and the interface is provided by the live broadcast application; after the master device receives the first image sent by the slave device, the method further includes: the master device sends the first image to a server corresponding to the live broadcast application, and the server sends the first image to a first device.
In a second aspect, an embodiment of the present application provides a cross-device collaborative shooting method. The method is applied to a slave device, and the slave device establishes a communication connection with a master device. The method includes: the slave device acquires and processes an image according to a first shooting parameter to obtain a first image; the slave device sends the first image to the master device; the slave device receives a control command sent by the master device and carrying a second shooting parameter, where the second shooting parameter is used to adjust the shooting effect of the slave device; the slave device acquires and processes an image according to the second shooting parameter to obtain a second image; and the slave device sends the second image to the master device.
By implementing the method provided by the second aspect, in the process of cross-device collaborative shooting, the slave device may first send the image acquired and processed according to the first shooting parameter to the master device for use by the master device. Then, the slave device may also respond to a control command for adjusting the photographing effect, which is transmitted by the master device. After the slave device adjusts the photographing parameters, the slave device may transmit the adjusted image to the master device, so that the user can use the image of the slave device on the master device and can control the photographing effect of the slave device through the master device.
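For illustration, the slave-side behavior of the second aspect can be sketched as below; Camera, MasterLink, and awaitControlCommand are hypothetical names for this sketch and are not defined by the patent.

    import java.util.Map;

    // Hypothetical sketch of the second-aspect flow on the slave device: capture with the
    // first parameters, then re-capture whenever a control command with second parameters arrives.
    public class SlaveDeviceSketch {

        interface Camera {
            byte[] capture(Map<String, Object> shootingParameters);
        }

        interface MasterLink {
            void sendImage(byte[] image);
            Map<String, Object> awaitControlCommand();  // blocks until the master adjusts the effect
        }

        void run(Camera camera, MasterLink master, Map<String, Object> firstParameters) {
            byte[] firstImage = camera.capture(firstParameters);
            master.sendImage(firstImage);                          // first image, first parameters

            Map<String, Object> secondParameters = master.awaitControlCommand();
            byte[] secondImage = camera.capture(secondParameters);
            master.sendImage(secondImage);                         // second image, second parameters
        }
    }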
With reference to the second aspect, in some embodiments, the method further includes: the slave device displays an interface of an application; the slave device displays the first image on the interface; after the slave device acquires and processes the second image according to the second shooting parameter, the slave device displays the second image on the application interface and no longer displays the first image.
In this way, the slave device can display the image acquired by its camera according to the first shooting parameter, and can also display the adjusted image after responding to the control command for adjusting the shooting effect sent by the master device. Therefore, the user of the slave device can view the images acquired by the slave device at any time after agreeing to the collaborative shooting with the master device.
In a third aspect, an embodiment of the present application provides a cross-device collaborative shooting method, applied to a communication system including one master device and m slave devices. The method includes: the m slave devices acquire and process images according to first shooting parameters to obtain first images; the m slave devices send the first images to the master device; the master device displays the m first images on the interface; the master device, in response to at least one operation, sends a control command carrying second shooting parameters to n slave devices, where the second shooting parameters are used to adjust the shooting effect of the slave devices; the n slave devices respectively acquire and process images according to the second shooting parameters to obtain n second images; the n slave devices respectively send the obtained second images to the master device; and the master device replaces, in the interface, the first image of the i-th slave device with the second image from the i-th slave device, where n is less than or equal to m, and i is greater than or equal to 1 and less than or equal to n.
By implementing the method provided in the third aspect, during cross-device collaborative shooting the master device can establish connections with multiple slave devices and send them control commands for adjusting the shooting effect; each slave device that receives a control command adjusts its shooting effect according to the shooting parameters carried in the command and transmits the image acquired and processed after the adjustment back to the master device. In this way, the user can simultaneously control the shooting effects of multiple slave devices through the master device and view the images shot by the multiple slave devices on one master device.
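A minimal sketch of the third-aspect dispatch, assuming hypothetical names (MultiSlaveSketch, SlaveLink), is given below; it only illustrates that the second shooting parameters are sent to the n adjusted slaves and that each returned second image replaces that slave's first image on the interface.

    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of the third aspect: one master drives m slaves and sends the
    // second shooting parameters only to the n slaves the user actually adjusted (n <= m).
    public class MultiSlaveSketch {

        interface SlaveLink {
            void sendControlCommand(Map<String, Object> parameters);
            byte[] receiveImage();
        }

        // Adjust a subset of slaves and replace their images on the interface.
        void adjust(List<SlaveLink> adjustedSlaves, Map<String, Object> secondParameters) {
            for (SlaveLink slave : adjustedSlaves) {
                slave.sendControlCommand(secondParameters);  // may differ per slave in practice
            }
            for (SlaveLink slave : adjustedSlaves) {
                byte[] secondImage = slave.receiveImage();   // the i-th second image
                replaceOnInterface(slave, secondImage);      // replaces that slave's first image
            }
        }

        void replaceOnInterface(SlaveLink slave, byte[] image) { /* update the preview grid */ }
    }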
With reference to the third aspect, the first shooting parameters corresponding to different slave devices are different. The first shooting parameter may be a default shooting parameter, or may be a shooting parameter carried in a control command sent by the master device and received by the slave device last time.
With reference to the third aspect, the second shooting parameters corresponding to different slave devices may be the same or different, and the second shooting parameters corresponding to one slave device depend on the operation of the slave device by the user.
Each of the m slave devices corresponds to a first image, and the first image corresponding to a slave device is acquired and processed according to the first shooting parameter corresponding to that slave device.
Each of the n slave devices corresponds to a second image, and the second image corresponding to a slave device is acquired and processed according to the second shooting parameter corresponding to that slave device.
With reference to the third aspect, the interface further includes a plurality of shooting options, the plurality of shooting options respectively correspond to the m slave devices, and the shooting options correspond to the shooting capabilities of the slave devices. The operation in the third aspect includes an operation acting on a shooting option, and the second shooting parameter includes the shooting parameter corresponding to the shooting option acted on by the operation.
In this way, the user can directly view the shooting capabilities of the plurality of slave devices on one master device, and then the master device controls the plurality of slave devices to adjust accordingly.
In combination with the above embodiment, before the master device displays the plurality of shooting options, the master device may further obtain shooting capabilities of the m slave devices; and the second shooting parameter corresponding to one slave device is within the shooting capability range of the slave device.
In this way, the master device can acquire the shooting capabilities of the plurality of slave devices, and thus, the master device can display the shooting capabilities of the respective slave devices to a user for viewing.
With reference to the third aspect, in some embodiments, the method further includes: the slave device displays an interface of an application; the slave device displays the first image on the interface; after the slave device acquires and processes the second image according to the second shooting parameter, the slave device displays the second image on the application interface and no longer displays the first image.
In this way, each slave device can display the image obtained by its camera according to the first shooting parameter, and can also display the adjusted image after responding to the control command for adjusting the shooting effect sent by the master device. Therefore, the user of a slave device can view the images collected by that slave device at any time after agreeing to the collaborative shooting with the master device.
With reference to the third aspect, in some embodiments, when the master device displays the m first images on the interface, the method further includes: the master device acquires and processes an image to obtain a third image; the master device also displays the third image on the interface.
In this way, the master device can display the image it acquires and processes itself while displaying the images acquired and processed by the multiple slave devices. The user can simultaneously see the pictures shot by the master device and the multiple slave devices on the display screen of the master device, which improves the user experience and meets the need for multi-angle, multi-picture shooting.
In combination with the above embodiment, the interface further includes a plurality of shooting options corresponding to the master device, where the plurality of shooting options corresponding to the master device correspond to shooting capabilities of the master device. The master device receives another operation acting on one of the plurality of shooting options corresponding to the master device, and acquires and processes an image according to the shooting parameter corresponding to the shooting option acted on by this other operation to obtain a fourth image; the master device displays the fourth image on the interface and no longer displays the third image.
In this way, the user can also control the shooting effect of the master device itself, which gives the user more shooting choices.
With reference to the third aspect, before the master device sends, in response to the at least one operation, the control command carrying the second shooting parameters to the slave devices, the method further includes: the master device determines a first number and a first type, where the first number is the number of image streams required for displaying the second image, and the first type includes the types of the image streams required for displaying the second image; the master device determines a second number and a second type, where the second number is smaller than the first number, and the first type includes the second type. The control command further carries the second number and the second type, and the second image includes the second number and second type of image streams acquired and processed by the slave device according to the second shooting parameter.
In this way, the master device can reduce the number of image streams requested from the slave device, and accordingly, when the slave device transmits the image streams to the master device, the number of transmitted streams can be reduced, thereby reducing the network data transmission load and improving the transmission efficiency.
In combination with the above embodiment, before the master device displays the second image on the interface, the method further includes: the master device processes the second image into the first number and first type of image streams; and the master device displays the second image on the interface according to the first number and first type of image streams.
In this way, the master device can reproduce more image streams from a smaller number of received image streams.
In a fourth aspect, embodiments of the present application provide an electronic device that includes one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method as described in the first aspect or any one of the embodiments of the first aspect; alternatively, the method as described in the third aspect or any one of the embodiments of the third aspect.
In a fifth aspect, embodiments of the present application provide an electronic device, which includes one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions which, when executed by the one or more processors, cause the electronic device to perform the method as described in the second aspect or any one of the embodiments of the second aspect.
In a sixth aspect, embodiments of the present application provide a computer program product containing instructions, which, when run on an electronic device, cause the electronic device to perform the method as described in the first aspect or any one of the implementation manners of the first aspect; alternatively, the method as described in the third aspect or any one of the embodiments of the third aspect.
In a seventh aspect, an embodiment of the present application provides a computer-readable storage medium, including instructions, which, when executed on an electronic device, cause the electronic device to perform a method as described in the first aspect or any one of the implementation manners of the first aspect; alternatively, the method as described in the third aspect or any one of the embodiments of the third aspect.
In an eighth aspect, the present application provides a computer program product containing instructions, which when run on an electronic device, causes the electronic device to perform the method as described in the second aspect or any one of the embodiments of the second aspect.
In a ninth aspect, the present application provides a computer-readable storage medium including instructions that, when executed on an electronic device, cause the electronic device to perform the method as described in the second aspect or any one of the embodiments of the second aspect.
In a tenth aspect, an embodiment of the present application provides a communication system, including a master device and a slave device. The master device is configured to perform the method described in the first aspect or any implementation of the first aspect, and the slave device is configured to perform the method described in the second aspect or any implementation of the second aspect; alternatively, the master device and the slave device are configured to jointly perform the method described in the third aspect or any implementation of the third aspect.
By implementing the cross-device collaborative shooting method provided by the embodiments of this application, a user can connect one or more slave devices through the master device, which not only provides the user with a multi-view shooting experience but also allows the shooting effect of the slave devices to be controlled, meeting the user's need to control the remote shooting effect. Implementing the cross-device collaborative shooting method can also solve the problem of distributed control among devices with operating systems, extend the functions of an electronic device to other common hardware shooting devices, and flexibly expand the available lenses.
Drawings
Fig. 1A-1B are schematic diagrams of two cross-device collaborative shooting provided by an embodiment of the present application;
FIG. 2A is a block diagram of a system provided by an embodiment of the present application;
fig. 2B is a schematic hardware structure diagram of an electronic device 400 according to an embodiment of the present application;
fig. 3 is a software architecture framework of the main device 100 provided by the embodiment of the present application;
fig. 4 is a software architecture framework of the slave device 200 provided by the embodiment of the present application;
fig. 5 is a schematic view of a service scenario provided in an embodiment of the present application;
FIGS. 6A-6D, 7A-7B, 8A-8B, 9A-9D, 10A-10C, 11A-11D are some schematic illustrations of user interfaces provided by embodiments of the present application;
FIG. 12 is a flow chart of a method provided by an embodiment of the present application;
FIG. 13 is a schematic diagram of a dynamic pipeline processing provided by an embodiment of the present application;
fig. 14 is a schematic diagram of a multiplexing and demultiplexing processing principle provided in an embodiment of the present application;
fig. 15 is a schematic diagram illustrating a frame synchronization processing principle according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of this application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the listed items.
The term "User Interface (UI)" in the specification, claims and drawings of the present application is a medium interface for interaction and information exchange between an application program or operating system and a user, and it implements conversion between an internal form of information and a form acceptable to the user. The user interface of the application program is a source code written by a specific computer language such as java, extensible markup language (XML), and the like, and the interface source code is analyzed and rendered on the terminal device, and finally presented as content that can be identified by the user, such as controls such as pictures, characters, buttons, and the like. A control (control), also called a component (widget), is a basic element of a user interface, and typical controls are a toolbar (toolbar), a menu bar (menu bar), a text box (text box), a button (button), a scroll bar (scrollbar), a picture, and a text. The properties and contents of the controls in the interface are defined by tags or nodes, such as XML defining the controls contained by the interface by nodes < Textview >, < ImgView >, < VideoView >, and the like. A node corresponds to a control or attribute in the interface, and the node is displayed as content visible to a user after being parsed and rendered. In addition, many applications, such as hybrid applications (hybrid applications), typically include web pages in their interfaces. A web page, also called a page, can be understood as a special control embedded in an application program interface, where the web page is a source code written by a specific computer language, such as hypertext markup language (GTML), cascading Style Sheets (CSS), java script (JavaScript, JS), etc., and the web page source code can be loaded and displayed as content recognizable to a user by a browser or a web page display component similar to a browser function. The specific content contained in the web page is also defined by tags or nodes in the source code of the web page, for example, GTML defines elements and attributes of the web page by < p >, < img >, < video >, < canvas >.
A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Fig. 1A illustrates a process of cross-device collaborative shooting.
As shown in FIG. 1A, cross-device collaborative shooting involves a master device and a slave device. First, the master device may discover the slave device, and then the slave device may register with the master device. After registration, the master device may send a command, for example a command for turning on the camera, to the slave device through the wireless network, and the slave device may, in response to the command, start its camera to capture a picture, compress the captured picture, and send the compressed picture to the master device. Finally, the master device can display the pictures collected by the slave device and the pictures collected by its own camera on a preview interface, so that cross-device collaborative shooting is realized.
However, in the method shown in FIG. 1A, the master device cannot acquire the capability information of the slave device, and therefore cannot control or adjust the shooting picture of the slave device; for example, it cannot issue control commands for adjusting the shooting effect, such as zoom or flash commands, to the slave device.
Fig. 1B shows a cross-device collaborative shooting scenario involved in an instant messaging process.
As shown in FIG. 1B, device A and device B may conduct a video call through an instant messaging server, for example a video call provided by an instant messaging application. Specifically, device A may acquire an image through a camera, and then send the image to device B through the instant messaging server. Device B can directly display the image sent by device A on the video call interface and can also display the image acquired by device B itself, so that cross-device collaborative shooting is realized.
In the method shown in FIG. 1B, similar to the case in FIG. 1A, device A cannot acquire the capability information of device B, and therefore cannot control or adjust the shooting picture of device B; for example, it cannot issue control commands for adjusting the shooting effect, such as zoom or flash commands, to device B. Moreover, since device A and device B communicate via a network, a large time delay may arise during cross-device collaborative shooting, which affects the user experience.
In order to solve the problem that the shooting effect of other devices cannot be controlled in cross-device collaborative shooting, the following embodiments of this application provide a cross-device collaborative shooting method, a related apparatus, and a system. The cross-device collaborative shooting method involves a master device and a slave device. In the method, while the master device and the slave device perform collaborative shooting, the master device can receive a control operation of a user for the slave device; the slave device can adjust the shooting effect in response to the control operation and then send the adjusted image to the master device. The master device may then present the image returned by the slave device. In some cases, the master device may also simultaneously display the image it acquires and processes itself. In addition, the master device may also, in response to a user operation, perform processing such as photographing, video recording, and forwarding on the displayed images.
By implementing the cross-device collaborative shooting method, the master device can not only provide the user with a multi-view shooting experience, but also control the shooting effect of the slave device during collaborative shooting, thereby meeting the user's need to control the remote shooting effect. In addition, the master device can also implement previewing, photographing, video recording, forwarding, clipping, and other processing of the respective pictures of the slave device and the master device.
Implementing the cross-device collaborative shooting method can meet the control requirements of various cross-device collaborative shooting scenarios, such as a mobile phone controlling the shooting effect of a television, a watch controlling the shooting effect of a mobile phone, a mobile phone controlling the shooting effect of a tablet computer, and the like.
In the following embodiments of this application, the number of master devices is one. The number of slave devices is not limited and may be one or more. Thus, the image ultimately presented by the master device may include images from multiple devices; for example, the preview screen finally displayed by the master device includes an image of the master device and images returned by the plurality of slave devices.
The collaborative shooting referred to in the following embodiments of this application means that the master device and the slave device establish a communication connection, both the master device and the slave device shoot and process images with their cameras, the slave device transmits the shot image to the master device based on the communication connection, and the master device displays the shot image of the slave device. In some cases, during collaborative shooting, the master device can also simultaneously display the image it acquires and processes itself.
The communication connection between the master device and the slave device may be a wired connection or a wireless connection. The wireless connection may be a short-range connection such as a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, an infrared connection, an NFC connection, or a ZigBee connection, or a long-range connection (including, but not limited to, a mobile network supporting 2G, 3G, 4G, 5G, and subsequent standard protocols). For example, the master device and the slave device may log in to the same user account (for example, a Huawei account) and then connect over a long distance through a server (for example, a Huawei multi-device collaborative shooting server).
The adjustment of the shooting effect according to the following embodiments of the present application refers to adjustment of shooting parameters of an electronic device. The shooting parameters of the electronic equipment comprise: hardware parameters of the camera involved in acquiring the image, and/or software parameters involved in processing the image. The shooting parameters also include some combination of hardware parameters and software parameters. Such as compound zoom range, night mode, portrait mode, time-lapse shooting, slow motion, panoramic mode, HDR, and so forth.
The hardware parameters include one or more of the following: the number of cameras, the type of camera, the optical zoom value, whether optical image stabilization is turned on, the aperture size, whether the flash is turned on, whether the fill light is turned on, the shutter time, the ISO value, the pixel count, the video frame rate, and the like. The types of camera may include, but are not limited to, a normal camera, a wide-angle camera, and an ultra-wide-angle camera; the optical zoom value may be 1x, 2x, or 5x zoom; the aperture size may be f/1.8, f/1.9, or f/3.4; the shutter time may be 1/40, 1/60, 1/200, and so on.
The software parameters include one or more of the following: the digital zoom value, the image cropping size, the color temperature calibration mode of the image, the noise reduction mode of the image, the beauty/body-shaping type, the filter type, the sticker option, whether the selfie mirror is turned on, and so on. The digital zoom value may be 10x or 15x zoom; the image cropping size may be 3; the color temperature calibration mode may be daylight, fluorescent, incandescent, shadow, or cloudy-day calibration; the beauty/body-shaping type may be face slimming, body slimming, skin smoothing, skin whitening, eye enlarging, blemish removal, and the like; the filter type may be daily series, texture, brightness, soft light, cyberpunk, and the like; the sticker may be, for example, an emoticon, animal, landscape, or poster sticker.
When the electronic device responds to a specific shooting mode or enables a specific algorithm, it adjusts the shooting parameters accordingly. For example, when the camera uses the face mode function, the electronic device may reduce the focal length, increase the aperture, turn on the fill light, and use a default beauty algorithm.
When the electronic device shoots an image, the shooting parameters and the processing parameters used by the electronic device can also refer to the shooting capability of the camera or the camera group recorded in the subsequent embodiment. The parameter ranges of the shooting parameters and the processing parameters can be determined according to the shooting capabilities.
The default shooting parameters indicate parameters used by the master device and the slave device when the camera is enabled. The default shooting parameters can be preset parameters when the camera leaves a factory, and can also be parameters used when the user uses the camera last time. Similarly, the parameters include a plurality of hardware parameters used when the camera acquires an image and a plurality of software parameters used when the image processing module processes the image.
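Purely as an illustration of how the hardware and software shooting parameters listed above could be grouped in code, a sketch follows; the field names are assumptions and not terms defined by this application.

    // Hypothetical grouping of the hardware and software shooting parameters listed above;
    // field names are assumptions, not terms defined by this application.
    public class ShootingParametersSketch {

        static class HardwareParameters {
            int cameraCount;
            String cameraType;             // normal, wide-angle, ultra-wide-angle
            double opticalZoom;            // e.g. 1x, 2x, 5x
            boolean opticalStabilization;
            double aperture;               // e.g. f/1.8
            boolean flashOn;
            boolean fillLightOn;
            double shutterTime;            // e.g. 1/60 s
            int iso;
            int videoFrameRate;
        }

        static class SoftwareParameters {
            double digitalZoom;            // e.g. 10x, 15x
            String colorTemperatureMode;   // daylight, fluorescent, incandescent, ...
            String noiseReductionMode;
            String beautyType;
            String filterType;
            String sticker;
            boolean selfieMirror;
        }

        // Default parameters used when the camera is enabled, per the description above.
        HardwareParameters defaultHardware = new HardwareParameters();
        SoftwareParameters defaultSoftware = new SoftwareParameters();
    }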
The system provided by the embodiment of the present application is first described below. Fig. 2A illustrates the structure of the system 10.
As shown, the system 10 includes a master device 100 and a slave device 200. The number of slave devices 200 may be one or more; FIG. 2A shows one slave device 200 as an example.
The master device 100 and the slave device 200 are each an electronic device equipped with a camera. The number of cameras of the master device 100 and the slave device 200 is not limited in the embodiments of this application. For example, the slave device 200 may be configured with five cameras (2 front cameras and 3 rear cameras).
Electronic devices include, but are not limited to, smart phones, tablet computers, personal digital assistants (PDAs), wearable electronic devices with wireless communication functions (e.g., smart watches and smart glasses), augmented reality (AR) devices, virtual reality (VR) devices, and the like. Exemplary embodiments of the electronic device include, but are not limited to, devices running Linux or another operating system. The electronic device may also be another portable electronic device, such as a laptop computer. It should also be understood that in other embodiments the electronic device may not be a portable electronic device but, for example, a desktop computer.
The master device 100 and the slave device 200 establish a communication connection therebetween, which may be a wired connection or a wireless connection.
In some embodiments, the wireless connection may be a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, an infrared connection, an NFC connection, a ZigBee connection, or another short-range connection. The master device 100 may directly send a control command for adjusting the shooting effect to the slave device 200 through the short-range connection. The slave device 200 may respond to the control command issued by the master device 100 and transmit the adjusted image back to the master device 100. Thereafter, the master device 100 may display the image transmitted back by the slave device 200. In addition, the master device 100 may also use the above images to complete recording, photographing, and forwarding tasks. Specific implementations, such as how the master device 100 sends the control command for adjusting the shooting effect to the slave device 200 and how the slave device 200 adjusts the image according to the control command, are described in detail in the subsequent method embodiments and are not repeated here.
In other embodiments, the wireless connection may also be a long-range connection, including but not limited to a mobile network supporting 2G, 3G, 4G, 5G, and subsequent standard protocols.
Optionally, the system 10 shown in FIG. 2A may further include a server 300. The master device and the slave device may log in to the same user account (for example, a Huawei account) and then connect with each other over a long distance through the server 300 (for example, a multi-device collaborative shooting server provided by Huawei). The server 300 may be used for data transmission between the master device 100 and the slave device 200. That is, the master device 100 may send a control command to the slave device 200 through the server 300, and the slave device 200 may send an image to the master device 100 through the server 300.
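A minimal sketch of the account-based relay through server 300, under the assumption of hypothetical names (RelayServerSketch, relay, deliver), is shown below; it only illustrates that the server forwards control commands and images between devices logged in to the same account.

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch of the long-distance case: master and slave log in with the same
    // account, and server 300 simply forwards control commands and images between them.
    public class RelayServerSketch {

        private final Map<String, String> accountOfDevice = new HashMap<>();  // deviceId -> account

        void register(String deviceId, String account) {
            accountOfDevice.put(deviceId, account);
        }

        // Forward a payload only if both devices are logged in to the same account.
        boolean relay(String fromDevice, String toDevice, byte[] payload) {
            String a = accountOfDevice.get(fromDevice);
            String b = accountOfDevice.get(toDevice);
            if (a == null || !a.equals(b)) {
                return false;              // different accounts: nothing is relayed
            }
            deliver(toDevice, payload);    // the payload is a control command or an image stream
            return true;
        }

        void deliver(String deviceId, byte[] payload) { /* push to the target device */ }
    }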
Fig. 2B is a schematic structural diagram of an electronic device 400 according to an embodiment of the present disclosure. The electronic device 400 may be the master device 100 or the slave device 200 in the system 10 shown in fig. 2A.
The electronic device 400 may include: the mobile terminal includes a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 400. In other embodiments of this application, the electronic device 400 may include more or fewer components than shown, or some components may be combined, some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to an instruction operation code and a timing signal, to control instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves the efficiency of the system.
The wireless communication function of the electronic device 400 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 400 may be used to cover a single communication band or multiple communication bands. Different antennas can also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch. The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G applied to the electronic device 400. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 can receive an electromagnetic wave through the antenna 1, filter and amplify the received electromagnetic wave, and transmit it to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into an electromagnetic wave through the antenna 1 for radiation. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium- or high-frequency signal. The demodulator is used for demodulating a received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio output device (not limited to the speaker 170A, the receiver 170B, and the like) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and disposed in the same device as the mobile communication module 150 or another functional module.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 400, including wireless local area networks (WLANs) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives an electromagnetic wave via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signal, and sends the processed signal to the processor 110. The wireless communication module 160 can also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into an electromagnetic wave through the antenna 2 for radiation. Illustratively, the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, and the like.
In some embodiments, antenna 1 of electronic device 400 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 400 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), wideband Code Division Multiple Access (WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long Term Evolution (LTE), 5G and subsequent standard protocols, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, and the like. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 400 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program commands to generate or change display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. In some embodiments, the electronic device 400 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 400 may implement the camera function via the ISP, camera 193, video codec, GPU, display screen 194, application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats.
In some embodiments, electronic device 400 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 400 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 400 may support one or more video codecs. In this way, the electronic device 400 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting thereon or nearby. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194.
When the electronic device 400 shown in FIG. 2B is the master device 100 in FIG. 2A, the mobile communication module 150 and the wireless communication module 160 may be used to provide communication services for the master device 100. Specifically, in the embodiments of this application, the master device 100 may establish a communication connection with another electronic device having a camera 193 (i.e., the slave device 200) through the mobile communication module 150 or the wireless communication module 160. Further, through the above connection, the master device 100 may send a control command to the slave device 200 and receive an image transmitted back from the slave device 200.
The ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like provide the master device 100 with the functions of capturing and displaying images. When the master device 100 turns on the camera 193, the master device 100 may obtain the optical image captured by the camera 193 and convert the optical signal into an electrical signal through the ISP. In the embodiments of this application for controlling the shooting effect, the ISP can also adjust shooting parameters such as the exposure and color temperature of the shooting scene, and optimize image processing parameters such as noise, brightness, and skin tone.
In cross-device collaborative shooting, a video codec may be used to compress or decompress digital video. The master device 100 may encode the images captured in cross-device collaborative shooting into video files of various formats through the video codec.
Through the GPU, the display screen 194, the application processor, and the like, the master device 100 may implement a display function. Specifically, the master device 100 may display the image captured by the camera 193 on the display screen 194. The image that the master device 100 receives from the slave device 200 may also be displayed on the display screen 194. In some embodiments, the display screen 194 may also display only the images transmitted by the slave device 200. Meanwhile, the master device 100 may respond to a user operation acting on a user interface control through the touch sensor 180K, i.e., the "touch panel".
When the electronic device 400 shown in fig. 2B is the slave device 200 in fig. 2A,
the mobile communication module 150 and the wireless communication module 160 may be used to provide communication services for the slave device 200. Specifically, the slave device 200 may establish a communication connection with the master device 100 through the mobile communication module 150 or the wireless communication module 160. Through the above connection, the slave device 200 receives a control command for controlling a photographing effect transmitted by the master device 100, and can photograph an image in response to the control command, and transmit the image acquired and processed by the camera 193 to the master device 100.
Like the master device 100, the ISP, the camera 193, and the video codec may provide the slave device 200 with the functions of capturing and transmitting images. In cross-device collaborative shooting, the video codec may be used to compress or decompress digital video when the slave device 200 transmits an image to the master device 100. The slave device 200 may encode the images captured in cross-device collaborative shooting into an image stream through the video codec and then transmit the stream to the master device 100.
In some embodiments, the slave device 200 may also display the images acquired by its own camera 193. In this case, the slave device 200 may implement the display through a GPU, the display screen 194, an application processor, and the like.
Meanwhile, the slave device 200 may respond to user operations acting on user interface controls through the touch sensor 180K.
The software systems of the master device 100 and the slave device 200 may each adopt a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present invention takes an Android system with a hierarchical architecture as an example to illustrate the software structures of the master device 100 and the slave device 200. Of course, the solution of the present application can also be implemented on other operating systems (e.g., the HarmonyOS system, the Linux system, etc.), as long as the functions implemented by the respective functional modules are similar to those in the embodiments of the present application.
Fig. 3 is a block diagram of the software configuration of the master device 100 according to the embodiment of the present invention.
As shown in fig. 3, the software structure block diagram of the master device 100 may include an application layer, a framework layer, a service layer, and a Hardware Abstraction Layer (HAL). The framework layer may further include a device virtualization kit (DVKit) and a device virtualization platform (DMSDP).
The DVkit is a Software Development Kit (SDK). The DVkit may provide a capability interface to the application layer. Through the above interfaces, the application layer may invoke services and capabilities provided in the DVkit, such as discovering slaves, etc. DMSDP is a framework layer service. After the DVkit initiates connection with the slave device, the DVkit may pull up the DMSDP service, and then the DMSDP may implement transmission of a control session, a data session, and the like in the process of connecting the slave device. The application layer may include a series of application packages. For example, applications may include cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc. In the embodiment of the present application, the application layer includes various applications using a camera, such as a camera application, a live application, a video call application, and the like. The video call application refers to an application having both a voice call and a video call, such as an instant messaging application
(not shown in fig. 3) and so on. The camera applications may include native camera applications, and third party camera applications. The application layer may request the framework layer to use the camera's shooting capabilities.
The framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The framework layer includes some predefined functions.
As shown in fig. 3, the framework layer may include a Camera kit (CameraKit) and a Camera interface (cameraapi).
The camera kit (CameraKit) may include a mode management module, and the mode management module may be configured to adjust a shooting mode used when the main device 100 runs various applications using a camera. The capture mode may include, but is not limited to, a preview mode, a take picture mode, a record mode, a cross device mode, and the like. Wherein, after the master device 100 enters the cross-device mode, the device virtualization kit (DVKit) can be used to discover the slave device 200. These shooting modes can be realized by calling a Camera interface (Camera API).
The Camera interface (Camera API) may include two parts of a Camera management (Camera manager) and a Camera device (Camera device).
Among them, camera management (CameraManager) may be used to manage the photographing capability of the master device 100 and the photographing capability of the slave device 200 connected to the master device 100.
The shooting capabilities of a device may include the hardware capabilities of the camera and the software capabilities of image processing software such as the ISP/GPU. The hardware capability refers to adjustable capabilities of the camera itself. The software capability refers to the capability of image processing modules such as the ISP and the GPU to process the images converted from electrical signals. The shooting capabilities also include capabilities that combine both hardware and software capabilities, such as the hybrid zoom range, night mode, portrait mode, time-lapse shooting, slow motion, panoramic mode, and so on. Taking the portrait mode as an example: the master device 100 (or the slave device 200) may adjust the camera focal length, add beauty algorithms, and the like.
The hardware capabilities include one or more of: the number of cameras, the type of camera, the optical zoom range, optical image stabilization, the aperture adjustment range, the flash, the fill light, the shutter time, the ISO sensitivity, the pixel count, the video frame rate, and so on. The type of the camera may include, but is not limited to, a general camera, a wide-angle camera, a super wide-angle camera, and the like. The optical zoom range may be 1-5 times zoom; the aperture size may be f/1.8-f/17; the shutter time may be 1/40, 1/60, 1/200, etc.
The software capabilities include one or more of: the digital zoom range, the supported image cropping specifications, the supported color temperature calibration of images, the supported image noise reduction, the supported beauty/body-shaping types, the supported filter types, the supported stickers, and the supported self-portrait mirroring. The digital zoom range may be 10-15 times zoom; the supported image cropping specifications may include common aspect ratios; the color temperature calibration modes may include daylight, fluorescent, incandescent, shadow, and cloudy calibration modes; the beauty/body-shaping algorithms may include face slimming, body slimming, skin smoothing, whitening, eye enlarging, acne removal, and body beautification; the filter algorithms may include Japanese-style, texture, brightness, soft light, and cyberpunk; and the stickers may include expressions, animals, landscapes, picture-in-picture frames, and the like.
Table 1 exemplarily shows respective photographing capabilities of the master device 100 and the slave device 200.
TABLE 1 (shooting capabilities of the master device 100 and the slave device 200; the table content is provided as an image in the source)
As shown in table 1, camera management (CameraManager) may record the numbers of the cameras of the master device 100 and the cameras of the slave device 200. For example, the three cameras of the master device 100 may be numbered 1, 2, 3, respectively. The three cameras of the slave device 200 may be numbered 1001, 1002, 1003, respectively. This number is used to uniquely identify the camera. In particular, the numbering of the cameras of the slave device 200 may be done by the virtual camera HAL of the master device 100. For specific numbering rules, reference may be made to the following description of the virtual HAL, which is not further described herein.
It is understood that the above hardware capability and software capability may also include other capabilities, respectively, and are not limited to the above-mentioned ones, which is not limited by the embodiments of the present application.
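The capability sets above are described only in prose and in Table 1. The following is a minimal Java sketch of how a camera management module might record the shooting capability of each local or remote camera. All class and field names here are illustrative assumptions, not part of the document or of any real API:

```java
import java.util.List;
import java.util.Map;

// Illustrative capability record for one camera (local or remote).
public class CameraCapability {
    int cameraId;            // e.g. 1..3 for local cameras, 1001..1003 for slave-device cameras
    String cameraType;       // "general", "wide-angle", "super wide-angle", ...

    // Hardware capabilities of the camera itself
    double minOpticalZoom = 1.0;
    double maxOpticalZoom = 5.0;
    double minAperture = 1.8;          // f/1.8
    double maxAperture = 17.0;         // f/17
    List<Double> shutterTimes;         // e.g. 1/40, 1/60, 1/200 s
    boolean hasFlash;
    boolean hasOpticalStabilization;

    // Software (ISP/GPU) capabilities
    double maxDigitalZoom = 15.0;
    List<String> colorTemperatureModes; // "daylight", "fluorescent", "incandescent", ...
    List<String> filters;               // "texture", "soft light", "cyberpunk", ...
    Map<String, Integer> beautyLevels;  // e.g. "skin smoothing" -> 0..10

    boolean isRemote() {
        // Convention taken from the text: slave-device cameras are numbered from 1000.
        return cameraId >= 1000;
    }
}
```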
A camera device (CameraDevice) may be used to forward control commands for streams to the service layer for further processing in response to various applications in the application layer. A stream in the embodiments of the present application refers to a set of data sequences that arrive sequentially, massively, rapidly, and continuously. In general, a data stream may be viewed as a dynamic data collection that grows indefinitely over time. The stream in the present application is an image stream composed of successive frames of images. The control commands for a stream are created by the applications of the application layer and issued to the modules below the application layer.
The camera device (CameraDevice) may also cooperate with a device virtualization kit (DVKit) and a device virtualization platform (DMSDP) to establish a communication connection between the master device 100 and the slave device 200 and bind the slave device 200 and the master device 100.
Specifically, a device virtualization kit (DVKit) may be used to discover the slave device 200, and a device virtualization platform (DMSDP) may be used to establish a session channel with the discovered slave device 200. DMSDP may include a control session and a data session. The control session is used to transmit control commands between the master device 100 and the slave device 200 (e.g., a request to use a camera of the slave device 200, a photographing command, a recording command, a control command to adjust the shooting effect, etc.). The data session is used to transmit the streams sent back by the slave device 200, such as preview streams, photo streams, video streams, and the like.
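A hedged sketch of the two session channels described above. The interface and method names are assumptions made for illustration; the document only states that DMSDP carries a control session for commands and a data session for the streams sent back by the slave device 200:

```java
// Illustrative sketch, not the real DMSDP interface: a session to one slave device
// carries a control channel (commands) and a data channel (returned image streams).
public interface DeviceSession {
    // Control session: photographing, recording, effect-adjustment commands, etc.
    void sendControlCommand(byte[] command);

    // Data session: preview/photo/video streams transmitted back by the slave device.
    void setStreamListener(StreamListener listener);

    interface StreamListener {
        void onFrame(int streamId, byte[] encodedFrame);
    }
}
```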
The service layer may include modules such as Camera service (CameraService), cameraDeviceClient, camera3Device, cameraProviderManager, device co-management, dynamic pipelining, and stream processing.
The Camera service (Camera service) provides various services for implementing interface functions for a Camera interface (Camera API).
CameraDeviceClient is the instance corresponding to a camera; one CameraDeviceClient corresponds to one camera. The camera service may include a function for creating a CameraDeviceClient, such as connectDevice(). When responding to a request from the application layer to create a camera instance, the camera service calls this function, thereby creating the CameraDeviceClient instance corresponding to the camera.
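For illustration, a minimal sketch of the relationship described above: the camera service keeps one client object per opened camera and creates it on demand. The class names follow the document; connectDevice() is used only as a stand-in for the creation function the text mentions, and the body is an assumption, not the actual service implementation:

```java
import java.util.HashMap;
import java.util.Map;

public class CameraServiceSketch {
    private final Map<Integer, CameraDeviceClient> clients = new HashMap<>();

    // One CameraDeviceClient corresponds to one camera; create it on first use.
    public CameraDeviceClient connectDevice(int cameraId) {
        return clients.computeIfAbsent(cameraId, CameraDeviceClient::new);
    }

    public static class CameraDeviceClient {
        final int cameraId;
        CameraDeviceClient(int cameraId) { this.cameraId = cameraId; }
    }
}
```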
The Camera3Device may be used to manage the life cycle of various types of streams, including but not limited to creating, stopping, clearing, and destroying stream information, and the like.
The CameraProviderManager can be used to acquire virtual camera HAL information including the shooting capabilities of the slave device 200. The detailed description of the photographing capability of the slave device 200 may refer to table 1.
The device cooperation management may be used to control the time delay between the respective pictures of the master device 100 and the slave device 200. When the master device 100 and the slave device 200 perform cross-device collaborative shooting, a certain time is required for the slave device 200 to transmit its captured image to the master device 100, so the two preview pictures displayed by the master device 100 may not correspond to the same moment and may exhibit a time delay. By adding a buffer, the device cooperation management module can cache some frames of the picture acquired by the master device 100, so that the preview pictures of both devices are acquired at time points that differ only slightly. In this way, the time delay between the two preview pictures can be kept within the range of the user's visual perception and does not affect the user experience.
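A possible reading of this buffering idea, sketched in Java. The class name, the fixed frame-count delay, and the byte[] frame type are illustrative assumptions; the document only says that the module caches some locally acquired frames so that both preview pictures correspond to nearly the same capture time:

```java
import java.util.ArrayDeque;

// Hold back a few locally captured frames so the local preview is shown together
// with the slave-device frame that was captured at roughly the same moment.
public class PreviewAligner {
    private final ArrayDeque<byte[]> localBuffer = new ArrayDeque<>();
    private final int delayFrames; // roughly networkDelayMs / frameIntervalMs

    public PreviewAligner(int delayFrames) {
        this.delayFrames = delayFrames;
    }

    /** Called for every frame captured by the master device's own camera;
     *  returns the frame that should be displayed now, or null while filling the buffer. */
    public byte[] submitLocalFrame(byte[] frame) {
        localBuffer.addLast(frame);
        return (localBuffer.size() > delayFrames) ? localBuffer.pollFirst() : null;
    }
}
```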
The dynamic pipeline may be configured to generate a pending command queue in response to a stream creation command issued by a camera device (CameraDevice). In particular, the dynamic pipeline may generate one or more pending command queues based on the type of stream creation command, the device being acted upon. The types of stream creation commands may include, for example, preview commands, video recording commands, photo taking commands, and the like. The devices on which the stream creation command is acted may include the master device 100, the slave device 200, and may even be further detailed to the specific camera of the device. The specific workflow of the dynamic pipeline will be described in detail in the following method embodiments, which are not repeated herein.
The dynamic pipeline may add an Identification (ID) or tag (tag) of the device on which the command is acting to each command of the generated pending command queue. The dynamic pipeline may also add a single frame or a continuous frame of request tags to each command of the generated pending command queue to indicate the type of the command. The shooting command is a single-frame command, and the preview command or the video recording command is a continuous-frame command.
The dynamic pipeline may also be configured to distribute the commands in the pending command queue that act on the slave device 200 to the stream processing module, and distribute the commands that act on the master device 100 to the local camera HAL module of the HAL layer, which may refer to the description in fig. 13 and is not described herein again.
When the user controls the shooting effect through the application program of the application layer, the dynamic pipeline can also refresh or add various parameters for controlling the shooting effect, such as a zoom value, a beautifying algorithm (such as a skin grinding level, a whitening level and the like), a filter, a color temperature, exposure and the like, in the command of the command queue to be processed.
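A sketch of what one entry of the pending command queue might look like, based only on the attributes the text lists (target device ID/tag, single-frame or continuous-frame tag, command type, and shooting-effect parameters). All names are assumptions; this is not the actual dynamic pipeline implementation:

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

public class DynamicPipelineSketch {
    enum FrameTag { SINGLE_FRAME, CONTINUOUS_FRAME }

    static class PendingCommand {
        String targetDeviceId;                  // master or slave camera the command acts on
        FrameTag frameTag;                      // photo -> single frame; preview/record -> continuous
        String type;                            // "preview", "record", "photo", ...
        Map<String, Object> effectParams = new HashMap<>(); // zoom, filter, color temperature, ...
    }

    private final Queue<PendingCommand> pendingQueue = new ArrayDeque<>();

    void enqueue(String targetDeviceId, String type, Map<String, Object> effectParams) {
        PendingCommand cmd = new PendingCommand();
        cmd.targetDeviceId = targetDeviceId;
        cmd.type = type;
        cmd.frameTag = "photo".equals(type) ? FrameTag.SINGLE_FRAME : FrameTag.CONTINUOUS_FRAME;
        cmd.effectParams.putAll(effectParams);
        pendingQueue.add(cmd);
    }
}
```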
Stream processing may include: a life cycle management module, a pretreatment module, a post-treatment module, a frame synchronization module and the like.
The lifecycle management module can be used to monitor the entire lifecycle of the stream. When the camera device (CameraDevice) transmits a command to create a stream to the stream process, the lifecycle management can record information of the stream, such as a timestamp requesting creation of the stream, whether the slave device 200 responds to create the stream, and the like. When the main device 100 turns off the camera or ends running the application, the life cycle of the corresponding stream is stopped, and the life cycle management module may record the end time of the stream.
The preprocessing module is used for processing each command issued by the dynamic pipeline and includes a multi-stream configuration module, a multiplexing control module, and the like.
The multi-stream configuration may be used to configure the type and number of streams required according to the type of the stream creation command issued by the camera device (CameraDevice). Different types of control commands correspond to different types and numbers of required streams. The types of streams may include, but are not limited to, preview streams, photo streams, video streams, analysis streams, and the like. For example: the camera device (CameraDevice) may issue a control command "create a photographing stream", and the multi-stream configuration may configure four streams for the control command: one preview stream, one analysis stream, one photographing stream, and one recording stream.
The method of configuring multiple streams for commands to create streams can refer to table 2:
TABLE 2 (the stream configuration corresponding to each type of stream creation command; the table content is provided as an image in the source)
It is understood that, in some embodiments, there may be other configuration methods for the multi-stream configuration, and this is not limited in this application.
The multiplexing control may be used to multiplex the multiple streams requested by the camera device (CameraDevice), i.e., to streamline the multiple streams configured by the multi-stream configuration module. For example, a preview stream with a requested picture quality of 1080P and an analysis stream with a requested picture quality of 720P can be multiplexed into a single 1080P preview stream.
The following method embodiments will describe in detail the specific implementation of the multi-stream configuration module and the multiplexing control module, which is not described herein again.
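As a concrete illustration of the multi-stream configuration and multiplexing control described above, the following sketch shows how two requested streams could be reduced to one actually created stream. The class, record, and method names are assumptions; only the 1080P/720P example comes from the text:

```java
import java.util.ArrayList;
import java.util.List;

public class StreamPreprocessSketch {
    record StreamRequest(String type, int height) { }

    // Multiplexing control (simplified): a lower-resolution request can reuse an
    // already-configured higher-resolution stream, so fewer streams are actually
    // requested from the slave device. Real logic would also check formats, etc.
    static List<StreamRequest> multiplex(List<StreamRequest> requested) {
        List<StreamRequest> sorted = new ArrayList<>(requested);
        sorted.sort((a, b) -> Integer.compare(b.height(), a.height()));   // highest first
        List<StreamRequest> actual = new ArrayList<>();
        for (StreamRequest r : sorted) {
            boolean covered = actual.stream().anyMatch(a -> a.height() >= r.height());
            if (!covered) {
                actual.add(r);
            }
        }
        return actual;
    }

    public static void main(String[] args) {
        // The example from the text: a 1080P preview stream and a 720P analysis
        // stream are multiplexed into a single 1080P stream.
        List<StreamRequest> requested =
                List.of(new StreamRequest("preview", 1080), new StreamRequest("analysis", 720));
        System.out.println(multiplex(requested));   // [StreamRequest[type=preview, height=1080]]
    }
}
```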
The post-processing module is used to process the image streams returned by the slave device 200. The post-processing module may include a smart streaming module and a multi-stream output module. Smart streaming can be used to expand the streams returned by the slave device 200 into streams consistent with the type and number of streams requested by the camera device (CameraDevice). For example, the camera device (CameraDevice) requests a 1080P preview stream and a 720P analysis stream. Through multiplexing control, the control commands requesting these two streams are multiplexed into a control command requesting one 1080P preview stream. After this control command is executed, the slave device 200 transmits a 1080P preview stream back to the master device 100. After the stream processing module receives the 1080P preview stream, it can restore it into a 1080P preview stream and a 720P analysis stream through the smart streaming module.
The multi-stream output may be used to output the streams actually needed after smart streaming and send the output streams to the camera device (CameraDevice).
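The post-processing side can be sketched in the same spirit: from the single stream actually returned by the slave device 200, reproduce every stream originally requested, for example by downscaling frames for the lower-resolution analysis stream. scaleFrame() is a placeholder, not a real API, and the overall structure is an assumption:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class SmartStreamSplitSketch {
    record RequestedStream(String type, int height) { }

    // Expand one returned frame into a frame for every originally requested stream.
    static Map<RequestedStream, byte[]> split(byte[] returnedFrame,
                                              int returnedHeight,
                                              List<RequestedStream> requested) {
        return requested.stream().collect(Collectors.toMap(
                r -> r,
                r -> r.height() == returnedHeight
                        ? returnedFrame
                        : scaleFrame(returnedFrame, r.height())));
    }

    private static byte[] scaleFrame(byte[] frame, int targetHeight) {
        // Placeholder for a real image-scaling step (e.g. performed by the GPU/ISP).
        return frame;
    }
}
```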
The post-processing module may also include processing for mirroring, rotating, etc. of the image, which is not limited herein.
The frame synchronization module can be used to perform frame synchronization processing on image frames during photographing. Specifically, the cross-device transfer may delay the arrival of a command sent by the master device 100 at the slave device 200; that is, the slave device 200 receives the same instruction later than the master device 100. Therefore, the result obtained when the slave device 200 executes the instruction may differ from the result expected by the master device 100. For example, if the master device 100 issues a control command at a first time, the slave device 200 may respond to the control command at a second time, while the master device 100 expects to receive the image stream of the slave device 200 corresponding to the first time. Frame synchronization can therefore adjust the result returned by the slave device 200 at the second time toward a result closer to the first time (the result the user expects), thereby reducing the effect of network latency.
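One way to read this frame-synchronization step is to cache recent frames with their capture timestamps and return the frame closest to the expected time; the sketch below illustrates that interpretation. It is an assumption about the mechanism, not the module's actual implementation:

```java
import java.util.NavigableMap;
import java.util.TreeMap;

public class FrameSyncSketch {
    private final NavigableMap<Long, byte[]> recentFrames = new TreeMap<>();

    void cacheFrame(long captureTimestampMs, byte[] frame) {
        recentFrames.put(captureTimestampMs, frame);
    }

    /** Return the cached frame whose capture time is closest to the expected time
     *  (e.g. the first time at which the master device issued the command). */
    byte[] frameClosestTo(long expectedTimestampMs) {
        Long before = recentFrames.floorKey(expectedTimestampMs);
        Long after = recentFrames.ceilingKey(expectedTimestampMs);
        if (before == null) return after == null ? null : recentFrames.get(after);
        if (after == null) return recentFrames.get(before);
        long distBefore = expectedTimestampMs - before;
        long distAfter = after - expectedTimestampMs;
        return recentFrames.get(distBefore <= distAfter ? before : after);
    }
}
```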
The Hardware Abstraction Layer (HAL) may include a local camera HAL and a virtual camera HAL.
The local camera HAL may comprise a camera session (camera session) and a camera provider module. A camera session (camera session) may be used for master device 100 to issue control commands to the hardware. The camera provider module may be used to manage the shooting capability of the camera of the main device 100. The shooting capability of the master device 100 can refer to table 1 described above. The camera provider may manage only the shooting capability of the local camera of the main device 100.
The virtual camera HAL also includes a camera session (camera session) and a camera provider module. The camera session (camera session) may be used to locally register the slave device 200 that has established a communication connection with the master device 100, feed back the connection status of the slave device 200 to the DVKit, and send the control commands issued by the master device 100 to the slave device 200. The camera provider module is responsible for managing the shooting capability of the slave device 200. Similarly, the shooting capability of the slave device 200 can be managed according to Table 1, which is not described herein again.
Furthermore, the virtual camera HAL also provides the function of numbering the cameras of the registered slave device 200. When the virtual camera HAL of the master device acquires the shooting capability of the slave device, it may acquire the number of cameras that the slave device has and establish an ID for each camera. This ID may be used by the master device 100 to distinguish among the multiple cameras of the slave device 200. When numbering these cameras, the virtual camera HAL may adopt a numbering method different from that used for the cameras of the master device 100 itself. For example, when the cameras of the master device 100 itself are numbered starting from 1, the cameras of the slave device 200 may be numbered starting from 1000.
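A small sketch of the numbering rule stated above (local cameras from 1, slave-device cameras from 1000). The allocator class and its method names are illustrative assumptions:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;

public class CameraIdAllocatorSketch {
    private final AtomicInteger nextLocalId = new AtomicInteger(1);
    private final AtomicInteger nextRemoteId = new AtomicInteger(1000);
    private final Map<Integer, String> idToDevice = new LinkedHashMap<>();

    int registerLocalCamera() {
        int id = nextLocalId.getAndIncrement();
        idToDevice.put(id, "master");
        return id;
    }

    int registerSlaveCamera(String slaveDeviceName) {
        int id = nextRemoteId.getAndIncrement();
        idToDevice.put(id, slaveDeviceName);
        return id;
    }

    // An ID alone tells the master device whether a command targets a local camera
    // (handled by the local camera HAL) or a virtual camera of a slave device.
    boolean isVirtualCamera(int cameraId) {
        return cameraId >= 1000;
    }
}
```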
It is to be understood that the structure of the master device 100 illustrated in fig. 3 does not constitute a specific limitation on the master device 100. In other embodiments of the present application, the master device 100 may include more or fewer modules than shown, or combine certain modules, or split certain modules, or use a different arrangement of modules.
Fig. 4 illustrates a system framework diagram of the slave device 200.
As shown in fig. 4, the slave device 200 may include an application layer, a framework layer, a service layer, and a Hardware Abstraction Layer (HAL).
The application layer may include a series of application packages, which may include, for example, a camera proxy service. The camera proxy service may include modules such as pipe control, multi-stream adaptation, multi-Operating System (OS) adaptation, and the like.
Among other things, pipe control can be used to establish a communication connection with the master device 100 (including establishing a control session and a data session), transmit control commands, and stream images.
Multi-stream adaptation may be used for the camera proxy service to return streaming data according to the configuration information of the stream sent by the master device 100. Referring to the introduction of the foregoing embodiment, the master device 100 may configure a required stream for a camera device session in response to a request to create a stream. Performing the configuration process described above may generate corresponding flow configuration information. The camera proxy service may generate a corresponding flow creation command according to the configuration information described above. Accordingly, in response to the above control command to create a stream, the underlying service of the slave device 200 may create a stream matching the above command.
Multi-Operating System (OS) adaptation may be used to solve compatibility issues between different operating systems, such as the Android system and the HarmonyOS system, among others.
The framework layer comprises a camera management (CameraManager) module and a camera device (CameraDevice) module; the service layer includes the camera service (CameraService), CameraDeviceClient, Camera3Device, and CameraProviderManager modules.
The CameraManager, CameraDevice, CameraService, CameraDeviceClient, Camera3Device, and CameraProviderManager have the same functions as the corresponding modules in the master device 100; reference may be made to the description of fig. 3 above.
The local camera HAL of the slave device 200 may refer to the local camera HAL of the master device 100, and is not described in detail herein. The camera HAL layer of the slave device 200 may also include a camera session (camera session) and a camera provider module. The camera session (camera session) may be used for the communication between control commands and the hardware. The camera provider module may be used to manage the shooting capability of the slave device 200. The shooting capability of the slave device 200 may refer to Table 1 described above; likewise, the camera provider of the slave device 200 may manage only the shooting capability of the local camera of the slave device 200.
It is to be understood that the structure of the slave device 200 illustrated in fig. 4 does not constitute a specific limitation of the slave device 200. In other embodiments of the present application, the slave device 200 may include more or fewer modules than shown, or combine certain modules, or split certain modules, or a different arrangement of modules.
Based on the software and hardware structures of the system 10, the master device 100, and the slave device 200 described above, the cross-device cooperative shooting method provided in the embodiment of the present application is described in detail below.
The cross-device collaborative shooting method provided by the embodiment of the application can be applied to various scenes, including but not limited to:
(1) Live broadcast scene
In a live scene, the master device 100 may be connected to the slave device 200, the master device 100 and the slave device 200 may take the same object or different objects at different angles, the slave device 200 may send the taken image to the master device 100, and the master device 100 may upload the images of both parties to a live server and distribute the images to more users for viewing by the live server. In this process, the master device 100 may control the shooting effect of the slave device 200, for example, the camera focal length of the slave device 200 or parameters such as a filter used may be adjusted by the master device 100.
By using the cross-device collaborative shooting method provided by the embodiment of the application in the live scene, the anchor initiating the live broadcast can conveniently control the shooting effect of the slave device 200 on the master device 100, and a user watching the live broadcast can see a plurality of images displayed at different angles for the same object.
(2) Camera application scenarios
After the master device 100 starts the camera application, the slave device 200 may be connected to the master device 100 and may capture images at different angles, and the slave device 200 may transmit the captured images to the master device 100. The master device 100 may control the photographing effect of the slave device 200, and may simultaneously display the photographed images of both sides and perform processes such as preview, photographing, and recording on the photographed images. The camera application may be a native camera application or a third party camera application.
Therefore, the functions of cross-device double-scene video recording, multi-scene video recording and the like can be realized, more free visual angles can be provided for users, the users can conveniently control the shooting effect of the slave device 200 on the master device 100, and the shooting interest is increased.
(3) Video call scenario
The master device 100 may be a cell phone and the slave device 200 may be a large screen television. The mobile phone can carry out video call with other equipment, in the process, the mobile phone can be connected with the large-screen television and controls the camera of the large-screen television to shoot, the large-screen television can send shot images to the mobile phone, and then the mobile phone sends the images to equipment at the other end of the video call. Therefore, the mobile phone can realize video call through the large-screen television, the user does not need to hold the mobile phone by hand to shoot images at a specific angle, and more convenient video call experience can be provided for the user.
(4) Scene that wearable equipment control intelligent electronic equipment shot
A wearable device such as a smart watch can be connected to a smart electronic device such as a mobile phone and control the camera of the mobile phone to shoot. The mobile phone can send the captured image to the smart watch, so that the user can directly view, on the smart watch, the picture captured and processed by the mobile phone. In this process, the user can also control the shooting effect of the mobile phone through the smart watch.
In this way, the user can view and control the picture captured by the mobile phone through the smart watch, and can conveniently and quickly complete shooting without the help of others in scenarios such as group photos and scene shots.
It can be understood that the above scenarios are only examples, and the cross-device cooperative shooting method provided in the embodiment of the present application may also be applied to other scenarios, which are not limited herein.
Taking a live scene as an example, a cross-device collaborative shooting method is described in combination with a UI in the live scene.
Fig. 5 illustrates a live scene provided by an embodiment of the present application.
As shown in fig. 5, the live scene may include a master device 100, a slave device 200, an object a, and an object B.
The master device 100 and the slave device 200 establish a communication connection, and the master device 100 and the slave device 200 may be at different positions or angles. The master device 100 photographs the object A, and the slave device 200 photographs the object B. The slave device 200 may display the captured image, process it, and transmit it to the master device 100. The master device 100 can simultaneously display the image transmitted by the slave device 200 and the image it obtains by photographing the object A itself.
During the live broadcast process, the main device 100 may also upload the two displayed images to the live broadcast server. Further, the server may distribute the two images to devices of other users entering the live room.
Based on the live broadcast scenario, some User Interfaces (UIs) on the master device 100 and the slave device 200 provided in the embodiment of the present application are described below. Fig. 6A-6D, 7A-7B, 8A-8B, 9A-9D, 10A-10C illustrate some user interfaces implemented on the master device 100 and the slave device 200 in a live scene.
Fig. 6A-6C illustrate one manner in which the master device 100 and the slave device 200 establish a communication connection. Fig. 6A to 6C are user interfaces implemented on the master device 100, and fig. 6D is a user interface implemented on the slave device 200.
Fig. 6A illustrates an exemplary user interface 60 on the host device 100 for exposing installed applications. The user interface 60 displays: status bar, calendar indicator, weather indicator, tray with frequently used application icons, navigation bar, icons 601 of live type applications, icons 602 of camera applications and icons of other applications, etc. Wherein, the status bar can include: one or more signal strength indicators for mobile communication signals (which may also be referred to as cellular signals), operator name (e.g., "china mobile"), one or more signal strength indicators for Wi-Fi signals, battery status indicators, time indicators, and the like. The navigation bar may include a return key, a home screen key, a multi-task key, and other system navigation keys. In some embodiments, the user interface 60 illustratively shown in FIG. 6A may be a Home screen (Home Screen).
As shown in fig. 6A, the host device 100 may detect a user operation on an icon 601 of the live-class application, and in response to the user operation, display a user interface 61 shown in fig. 6B.
The user interface 61 may be a main interface provided by a live-class application. The user interface 61 may include: a field 611, an interactive message window 612, a preview box 613, an add control 614, and a settings control 615.
Area 611 may be used to show some information of the anchor, such as the avatar, the duration of the live broadcast, the number of viewers, and the account number of the live broadcast, among others.
The interactive message window 612 can be used to display messages from the anchor or viewer during the live broadcast, or system messages generated by interactive operations such as "like" or "like".
The preview box 613 may be used to display an image that is captured and processed in real time by a camera of the main apparatus 100. The host device 100 may refresh the display content therein in real time, so that the user can preview the image captured and processed by the camera of the host device 100 in real time. The camera may be a rear camera of the main apparatus 100 or a front camera.
The settings control 615 may be used to adjust the capture effect of the primary device 100. When a user operation (e.g., a click operation, a touch operation, etc.) acting on the settings control 615 is detected, the primary device 100 may display: options for adjusting the shooting parameters of the main apparatus 100, and/or image processing parameters. These options can refer to the related description of the subsequent user interface, which is not repeated here.
An add control 614 can be used to find the slave device 200. When detecting the user operation on the add control 614, the main device 100 may discover other nearby electronic devices by using the aforementioned short-range communication technology, such as bluetooth, wi-Fi, and NFC, or may discover other remote electronic devices by using the aforementioned long-range communication technology, and query whether the discovered other electronic devices have cameras.
Upon receiving the responses of the other electronic devices, the host device 100 may display the found electronic device with a camera on the user interface 62. For example, referring to fig. 6C, the host device 100 may display a window 622, and the window 622 may include information for two electronic devices, including respective: icon, name, distance, location, etc. of the electronic device.
Icon 623 may be used to show the type of electronic device. For example, the first slave device displayed by the master device 100 may be a tablet computer. The user can quickly and conveniently preliminarily recognize whether the slave device is a device to which the user wants to connect through the icon 623.
Name 624 may be used to display the name of the slave device. In some embodiments, the name 624 may be the model of the slave device. In other embodiments, the name may also be a user-defined name for the slave device. The names may also be a combination of device model and user-defined names. This is not limited by the present application. It is understood that the names listed in the user interface 62, such as "pad C1", "Phone P40-LouS", etc., are exemplary names.
As shown in fig. 6C, the master device 100 may detect a user operation acting on the icon 623 of the electronic device, and in response to the user operation, the master device 100 transmits a request for establishing a communication connection to the electronic device (slave device 200) corresponding to the icon 623.
Referring to fig. 6D, fig. 6D shows a user interface 63 displayed by the electronic device (slave device 200) after the slave device 200 receives the request for establishing the communication connection sent by the master device 100. As shown in fig. 6D, the user interface 63 includes: device information 631 of the master device 100, a confirmation control 632, and a cancel control 633.
The device information 631 may be used to present identity information of the master device 100. That is, the user can determine the information of the master 100 that issued the request through the device information 631. When the user can determine the information of the host device 100 through the device information 631 and trust the host device, the user can approve the host device 100 to use the camera of the electronic device through the confirmation control 632. The electronic device may detect an operation acting on the confirmation control 632, and in response to the user operation, the slave device 200 may agree that the master device 100 uses its own camera, that is, the master device 100 may establish a communication connection with the electronic device.
The communication connection may be the wired connection or the wireless connection described above. For example, the master device and the slave device may log in to the same user account (for example, a Huawei account) and then connect over a long distance through a server (for example, a Huawei multi-device collaborative shooting server). The embodiments of the present application do not limit this. It should be understood that this electronic device corresponds to a slave device of the master device 100.
The user interface 63 also includes a cancel control 633. When the master 100 cannot be determined through the device information 631 or the master 100 is not trusted, the user can reject the request to use the camera of the electronic device, which is sent by the master 100, through the cancel control 633. The electronic device may detect an operation acting on the cancel control 633, and in response to the user operation, the electronic device may reject the host device 100 from using its own camera, i.e., the electronic device does not agree to establish a communication connection with the host device 100.
Fig. 7A-7B illustrate another way of establishing a communication connection between the master device 100 and the slave device 200. Fig. 7A is a user interface 71 implemented on the slave device 200, and fig. 7B is a user interface 72 implemented on the master device 100.
When the master device 100 detects an operation acting on the icon 623 in the user interface 62, in response to the user operation, the master device 100 transmits a request for establishing a communication connection to the electronic device (slave device 200) corresponding to the icon 623.
As shown in fig. 7A, the user interface 71 may include a validation code 712 and a cancel control 714.
The verification code 712 may be used for connection confirmation of the master device 100 and the slave device 200. In response to the above-described request sent by the master device 100, the slave device 200 may generate a verification code 712. In some embodiments, the verification code 712 may also be generated by the server 300 and then sent to the slave device 200 over a wireless network. The slave device 200 may then display the aforementioned verification code on the user interface 71.
The cancel control 714 can be used to reject the request, sent by the master device 100, to use the camera of the slave device 200. The slave device 200 may detect a user operation acting on the cancel control 714, and in response to the user operation, the slave device 200 may close the dialog box 711.
Fig. 7B illustrates the user interface 72 where the master device 100 enters the verification code. When the slave device 200 displays the user interface 71, the master device 100 may display the user interface 72. The user interface 72 may display a dialog box 721.
Dialog box 721 may include a validation code 7211, a confirmation control 7212. The authentication code 7211 may represent an authentication code entered by a user into the host device 100. The master device 100 may detect an operation acting on the confirmation control 7212. In response to the user operation, the master device 100 may transmit the verification code 7211 to the slave device 200. The user operation is, for example, a click operation, a long press operation, or the like.
The slave device 200 may check whether the received authentication code 7211 matches the authentication code 712 displayed by itself. If the two verification codes are the same, the slave device 200 agrees to the master device 100 to use its own camera. Further, the slave device 200 may turn on its own camera, and transmit an image collected by the camera and processed by the ISP to the master device 100. Conversely, the slave device 200 may deny the master device 100 a request to use the camera of the slave device 200.
In some embodiments, when master device 100 enters authentication code 7211 different from authentication code 712 displayed by slave device 200, slave device 200 may maintain display of authentication code 712 and wait for master device 100 to enter a new authentication code. The slave device 200 may also agree to the master device 100 to use its own camera when the new authentication code received by the slave device 200 coincides with the authentication code 712.
In other embodiments, when the verification code 7211 transmitted by the master device 100 is different from the verification code 712 displayed by the slave device 200, the slave device 200 may regenerate another verification code M, and the master device 100 may acquire a new verification code N entered by the user. When the verification code N acquired by the master device 100 is identical to the verification code M, the slave device 200 may also agree that the master device 100 uses its own camera.
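The verification-code exchange described in the last few paragraphs can be summarized by the following sketch of the slave-device side. The 6-digit code length and all names are assumptions; the document only specifies the compare-and-optionally-regenerate behavior:

```java
import java.security.SecureRandom;

public class PairingCodeSketch {
    private final SecureRandom random = new SecureRandom();
    private String displayedCode;

    /** Generate and display a new verification code (6 digits here, purely illustrative). */
    String newCode() {
        displayedCode = String.format("%06d", random.nextInt(1_000_000));
        return displayedCode;
    }

    /** Returns true when the master device may start using this device's camera. */
    boolean verify(String codeFromMaster, boolean regenerateOnMismatch) {
        if (displayedCode != null && displayedCode.equals(codeFromMaster)) {
            return true;
        }
        if (regenerateOnMismatch) {
            newCode();   // in this variant, the master device must obtain and send the new code
        }
        return false;    // otherwise keep the current code and wait for a new input
    }
}
```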
The manner of establishing the communication connection is not limited to the two manners shown in fig. 6D-7B; the communication connection may also be established in other ways. For example, using Near Field Communication (NFC) technology, the master device 100 and the slave device 200 may complete authentication through a touch-and-tap user operation. The authentication method of the present application is not limited to the two authentication methods mentioned above.
After the connection is established, the master device 100 and the slave device 200 may display a prompt message, respectively. The prompt may prompt the user that the master device 100 and the slave device 200 have established a communication connection. As shown in fig. 8A, user interface 81 shows a user interface displaying prompt information from device 200.
When the authorization shown in fig. 6C (or the authorization shown in fig. 7A-7B) is completed, the slave device 200 may display the user interface 81. The user interface 81 may include a prompt box 811 and a preview box 812. Preview box 812 may be used to display images captured from a camera of device 200. The prompt box 811 may be used to display prompt information. The above-mentioned prompt information is, for example, "the camera of the slave device is being used by the master device 100".
After the slave device 200 grants the master device 100 to use the camera of the slave device 200, the slave device 200 may turn on its own camera. Then, the slave device 200 may display a screen captured and processed by its own camera in a preview frame 812. Above the display level of the preview pane 812, the slave device 200 may display a prompt pane 811.
In some embodiments, the slave device 200 may also display the image captured and processed by its own camera through the floating window. In particular, the slave device 200 may display a floating window in the upper right hand corner of the user interface 60 as shown in FIG. 6A. The floating window may display images captured and processed from the camera of device 200.
When the slave device 200 displays the user interface 81, the master device 100 may display the user interface 82 as shown in fig. 8B. User interface 82 may include a prompt window 821, a window 822, and a window 823.
The prompt window 821 may be used to display prompt information. The above-mentioned prompt information is, for example, "the master device 100 has connected the slave device 200". Window 822 may be used to display images captured and processed from the camera of device 200. The window 823 may display an image captured and processed by the camera of the main apparatus 100.
When the slave device 200 agrees that the master device 100 uses the slave device 200 camera, the master device 100 may obtain an image captured and processed by the slave device 200 camera from the slave device 200. Then, the main device 100 may display the above-described image on the window 822. Meanwhile, the main device 100 may also display a prompt window 821. The user can know that the master device 100 is connected to the slave device 200 through the contents of the hint displayed in the hint window 821.
Additionally, the user interface 82 may also add a setup control 824. The setup control may be used to display the shooting capability options of the slave device 200. For specific description, reference may be made to the following embodiments, which are not repeated herein.
In some embodiments, the master device 100 may also exchange the content displayed by window 823 and window 822. Specifically, the main apparatus 100 can detect a user operation acting on the window 822, and in response to the user operation, the main apparatus 100 can display an image captured and processed from the camera of the apparatus 200 in the window 823. Meanwhile, the main apparatus 100 may display an image captured and processed by the camera of the main apparatus 100 in the window 822. The user operation may be a click operation, a left slide operation, or the like.
In some embodiments, the master device may also divide the window 823 into two separate sections. One part is used for displaying the images collected and processed by the camera of the master device 100, and the other part is used for displaying the images collected and processed by the camera of the slave device 200. The present application does not limit the arrangement of the preview images of the master device 100 and the slave device 200 displayed on the master device 100.
Fig. 6A-8B illustrate a set of user interfaces for the master device 100 to establish a communication connection with the slave device 200 and to display images captured and processed by the camera of the slave device 200. After that, the master device 100 may acquire the capability of the slave device 200 to control the photographic effect, and may transmit a command to control the photographic effect to the slave device 200.
Fig. 9A to 9D exemplarily show a set of user interfaces by which the master device 100 controls the photographing effect of the slave device 200. Where fig. 9A-9C are user interfaces on the master device 100 and fig. 9D are user interfaces on the slave device 200.
When the cueing window 821 showing cueing contents shown in fig. 8B is closed, the main apparatus 100 may display the user interface 91 shown in fig. 9A. The user interface 91 may include a window 911, a window 912, a delete control 913, a settings control 915, a settings control 916.
The window 911 may display an image captured and processed by the camera of the master device 100. The window 912 may display an image captured and processed by the camera of the slave device 200. The delete control 913 may be used to close the window 912. The master device 100 may detect a user operation acting on the delete control 913, and in response to the user operation, the master device 100 may close the window 912.
The setting control 915 may be used to display the capability option of the main apparatus 100 to control the photographing effect. The settings control 916 may be used to display the capability options for controlling the effects of the shot from the device 200. A prompt such as "click to adjust the far end screen" may also be displayed next to the settings control 916. The master device 100 may detect a user operation acting on the setting control 916, and in response to the user operation, the master device 100 may display a capability option of controlling the photographic effect from the device 200, referring to fig. 9B.
In some embodiments, the user interface 91 may also include a delete control 913, an add control 914. Delete control 913 may be used to close one or more windows in user interface 91, such as window 911, window 912. The add control 914 can be used to look up other slave devices for connection. Upon the host device 100 detecting a user operation on the add control 914, the host device 100 may display the query results shown in window 622 in fig. 6C.
When the master device 100 detects a user operation acting on another slave device displayed in the window 622, the master device 100 may send a request to use its camera to that slave device. Similarly, the other slave device may approve the request sent by the master device 100, enable its own camera to capture and process images according to the default shooting parameters, and then send the processed images to the master device. Meanwhile, the master device 100 may add a window to display these images.
Implementing the above method, the master device 100 may display images transmitted from a plurality of slave devices, thereby providing a richer photographing experience to the user.
In some embodiments, the master device 100 may also multiplex the settings controls 915 and 916. Specifically, the user interface 91 may display a single overall settings control. When an image captured and processed by the master device 100 is displayed in the window 911, this settings control may display the capability options of the master device 100 for controlling the shooting effect. When an image captured and processed by the slave device 200 is displayed in the window 911, this settings control may display the capability options of the slave device 200 for controlling the shooting effect.
Fig. 9B exemplarily shows the user interface 92 in which the master device 100 displays the capability option of controlling the photographic effect of the slave device 200. The user interface 92 may include a photographic effect window 921 of the slave device 200. The window 921 may display various capability options that the slave device 200 has to control the effects of shooting, such as aperture, flash, smart follow, white balance 922, ISO sensitivity, zoom range, beauty, filters, and so on.
Taking the adjustment of the white balance 922 as an example, the present application specifically describes the user interfaces in which the master device 100 sends a control command to the slave device 200 and the slave device 200 executes the control command. White balance can be used to calibrate the color temperature deviation of the camera. The white balance 922 may include a daylight mode, an incandescent lamp mode 923, a fluorescent mode, a cloudy mode, and a shadow mode. The master device 100 can detect a user operation acting on any of the above modes. When the master device 100 detects a user operation acting on the incandescent lamp mode 923, the master device 100 may, in response to the user operation, issue a control command to the slave device 200 to change the white balance mode to the incandescent lamp mode. The slave device 200 receiving the above command may change the white balance 922 to the incandescent lamp mode 923. Then, the master device 100 can receive and display the image, transmitted by the slave device 200, captured after switching to the incandescent lamp mode 923; refer to fig. 9C. At the same time, the image displayed in the viewfinder of the slave device 200 may also be adjusted to the image after the white balance mode is changed; refer to fig. 9D.
In some embodiments, the master device 100 may also use a dedicated page to display the capability options of the slave device 200 for controlling the shooting effect; that is, the master device 100 may display the capability options of the window 921 on a separate page. The embodiment of the present application does not limit this.
As shown in fig. 9C, the user interface 93 may include a window 931.
The window 931 may be used to display images captured and processed by the camera of the slave device 200. After the master device 100 issues the control command to the slave device 200 to change the white balance mode to the incandescent lamp mode, the master device 100 can receive the image captured by the slave device 200 with the white balance mode changed to the incandescent lamp mode. The window 931 may display this image.
While the user interface 93 displays the white-balanced image, the slave device 200 may also display the white-balanced user interface 94. Refer to fig. 9D. The user interface 94 is a user interface displayed on the slave device 200. The user interface 94 may include a preview window 941.
The preview window 941 may be used to display images captured and processed by the camera of the slave device 200. Upon receiving the control command sent by the master device 100 to change the white balance mode to the incandescent lamp mode, the slave device 200 may change the white balance mode to the incandescent lamp mode. The preview window 941 may then display images captured and processed by the camera of the slave device 200 in the incandescent lamp mode. Meanwhile, the slave device 200 may transmit these images to the master device 100, and the master device 100 may display them as shown in fig. 9C.
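To summarize the white-balance flow above in code form, the following sketch shows a master-side command being built and applied on the slave side. The command format, key names, and camera ID are purely illustrative assumptions; the document does not define the actual message format:

```java
import java.util.Map;

public class WhiteBalanceCommandSketch {
    // Master side: package the user's choice into a control command for the control session.
    static Map<String, String> buildCommand(String whiteBalanceMode) {
        return Map.of("target", "slaveCamera-1001",      // illustrative camera ID
                      "action", "setWhiteBalance",
                      "mode", whiteBalanceMode);          // e.g. "incandescent"
    }

    // Slave side: apply the command to the local capture settings; subsequent frames
    // are processed with the new white-balance mode and returned over the data session.
    static void applyCommand(Map<String, String> command, Map<String, String> ispSettings) {
        if ("setWhiteBalance".equals(command.get("action"))) {
            ispSettings.put("whiteBalance", command.get("mode"));
        }
    }
}
```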
In some embodiments, the capability of the slave device 200 to control the photographic effect may have some or all of the capabilities in the list above, or may also have other capabilities to control the photographic effect not mentioned in the window 921. This is not limited by the present application.
The master device 100 may also obtain its own capability of controlling the shooting effect and issue control instructions to itself. Fig. 10A-10C show a set of user interfaces in which the master device 100 issues control commands for the shooting effect to itself. Fig. 10A exemplarily shows the user interface 101 in which the master device 100 acquires its own capability of controlling the shooting effect. The user interface 101 may include a settings control 1011.
The setting control 1011 may be used to display the capability option of the main apparatus 100 to control the photographic effect. A prompt message such as "click adjust home screen" may also be displayed next to the settings control 1011. The main apparatus 100 may detect a user operation acting on the setting control 1011, and in response to the user operation, the main apparatus 100 may display a list of the photographic effect capabilities of the main apparatus 100, as shown in the user interface 102 shown in fig. 10B.
The user interface 102 may include a window 1021. The window may display the capability options of the master device 100 for controlling the shooting effect, such as the aperture 1022, the flash, smart follow, beauty, filters, and the like. This embodiment takes the aperture as an example to describe how the master device 100 sends a control command for adjusting the shooting effect to itself.
The size of aperture 1022 may be adjusted by dial 1024. The main apparatus 100 may detect a user operation acting on the dial 1024, and in response to the user operation, the main apparatus 100 may issue a control command to the main apparatus 100 to adjust the aperture.
Specifically, the initial scale of the dial 1024 may be "f/8". The user can slide the float on the dial 1024 to the "f/17" aperture scale through a rightward slide operation. The master device 100 may detect this user operation, and in response to it, the camera of the master device 100 may replace the aperture "f/8" with "f/17". According to this embodiment, replacing the aperture with "f/17" can obtain a shallower depth of field; accordingly, the window displaying the image captured by the master device 100 can display an image with a shallower depth of field, as shown in the window 1031 in fig. 10C.
Fig. 10C exemplarily shows the user interface 103 in which the depth of field of the preview window of the master device 100 becomes shallower. The user interface 103 may include a preview window 1031. The preview window 1031 may be used to display images captured and processed by the camera of the master device 100.
Upon processing the control command that it issued to itself for replacing the aperture "f/8" with "f/17", the master device 100 may replace the aperture "f/8" with "f/17". Then, the preview window 1031 may display an image captured and processed by the master device 100 with an aperture size of "f/17".
Fig. 6A to 6D, fig. 7A to 7B, fig. 8A to 8B, fig. 9A to 9D, fig. 10A to 10C, and fig. 11A to 11C describe a series of user interfaces for the master device 100 to establish a communication connection with the slave device 200 and control the shooting effect of the slave device 200 in a live scene. The above method of controlling the photographing effect of the slave device 200 may also be used in a photographing scene. A series of user interfaces for establishing a communication connection between the master device 100 and the slave device 200 and controlling the photographing effect of the slave device 200 in the photographing application scenario will be described below.
Fig. 11A illustrates the master device 100 displaying a user interface 111 to add a slave device. The user interface 111 may include an add control 1112, a dialog box 1113.
The master device 100 may detect a user operation acting on the add control 1112, and in response to the user operation, the master device 100 may query for electronic devices having a camera. When receiving a message transmitted back from an electronic device having a camera, the master device 100 may display the dialog box 1113. The dialog box 1113 may be used to present information about electronic devices having a camera. For example, the dialog box 1113 shows information of two electronic devices having cameras (the electronic device 1114 and the electronic device 1115). Similarly, the information includes the name, location, and the like of the slave device.
Taking the electronic device 1115 as an example, the master device 100 may detect a user operation acting on the electronic device 1115, and in response to the user operation, the master device 100 may transmit a request for using the camera of the electronic device 1115. The electronic device 1115 may detect an operation in which the user agrees to let the master device 100 use its camera, and in response to the operation, the electronic device 1115 may agree that the master device 100 uses its camera. For the user interfaces of the process in which the electronic device 1115 grants the use right to the master device 100, refer to the user interfaces in the live broadcast scene, as shown in fig. 6C-6D or fig. 7A-7B. This is not described in detail herein.
When the user of the slave device 200 agrees that the master device 100 uses the camera of the slave device 200, the slave device 200 may display the user interface 112, referring to fig. 11B. Meanwhile, the main device 100 may display a user interface 113, as shown in fig. 11C.
The user interface 112 is a user interface on the slave device 200. The user interface 112 exemplarily shows a user interface in which the slave device 200 displays prompt information. The user interface 112 may include a prompt window 1121 and a preview window 1122.
After the master device 100 is granted access to the camera of the slave device 200, the slave device 200 may turn on its own camera, and further, the preview window 1122 may display an image captured and processed by the camera of the slave device 200. The user interface 112 may also display the prompt window 1121. The prompt window 1121 prompts the user that the camera of the device is being used by another device, for example, "the current picture is being used by LISA".
As shown in fig. 11C, the user interface 113 is a user interface on the master device 100. The user interface 113 exemplarily shows a user interface in which the main device 100 displays the prompt information.
When the slave device 200 displays the user interface 112, the master device 100 may display the user interface 113. The user interface 113 may include a window 1131, a window 1132, and a prompt window 1134.
The window 1131 may be used to display images captured and processed by the camera of the master device 100. Window 1132 may display images captured and processed by the camera of the slave device 200; when receiving an image transmitted from the slave device 200, the window 1132 may display that image. The prompt window 1134 may be used to display prompt information such as "connected camera: phone P40-LouS". The above prompt information may be used to prompt the user that the master device 100 has connected to the slave device 200.
In some embodiments, window 1131 and window 1132 may also exchange display content. That is, window 1131 may display images captured and processed by the camera of the slave device 200, and window 1132 may be used to display images captured and processed by the camera of the master device 100. In particular, when window 1131 displays an image captured and processed by the camera of the slave device 200, the master device 100 may provide a delete key 1133 in the window 1131. In response to a user operation acting on the delete key 1133, the master device 100 may close the window 1131.
In some embodiments, window 1132 may be displayed over window 1131 in the form of a floating window in user interface 113. In another embodiment, window 1132 may also be tiled with window 1131. Refer to fig. 11D.
Fig. 11D illustrates another user interface 114 for the host device 100 to display reminder information. The user interface 114 can include a window 1141, a window 1142, a prompt window 1143, a settings control 1147, a settings control 1148.
The window 1141 may be used to display images captured and processed by the camera of the master device 100. Window 1142 may be used to display images captured and processed by the camera of the slave device 200. Likewise, window 1141 and window 1142 may exchange display content; that is, window 1141 may display images captured and processed by the camera of the slave device 200, and window 1142 may be used to display images captured and processed by the camera of the master device 100. This is not limited by the present application. The prompt window 1143 is used for displaying prompt information.
The setting control 1147 may be used to display shooting capability options of a camera of the host device 100. A settings control 1148 may be used to display the shooting capability options of the slave device 200's camera.
The master device 100 may detect a user operation acting on the setting control 1148, and in response to the user operation, the master device 100 may display the shooting capability option of the slave device 200. The shooting capability option described above may refer to the shooting capability option displayed in the dialog 921 in the user interface 92 shown in fig. 9B. This is not described in detail herein.
Further, the master device 100 may detect a user operation acting on a certain shooting capability option, and in response to the user operation, the master device 100 may transmit a control command to adjust a shooting effect to the slave device 200.
Also taking the user interface 92 shown in fig. 9B as an example, in response to a user operation acting on the incandescent lamp 923 in the above-described dialog 921, the master device 100 may send a control command to the slave device 200 to adjust the white balance mode to the incandescent lamp mode. In response to the above control command, the slave device 200 may perform incandescent lamp mode color temperature calibration on the images captured by its camera. Then, the slave device 200 may transmit the image calibrated in the incandescent lamp mode to the master device 100.
Upon receiving the image transmitted from the slave device 200, the master device 100 may display the image obtained after the shooting parameters were adjusted.
In some embodiments, user interface 114 may also include a delete control 1144, an add control 1145, and a capture control 1146. Delete control 1144 may be used to close window 1142. Add control 1145 may be used by master device 100 to discover other electronic devices with cameras.
The master device 100 can detect a user operation acting on the photographing control 1146. In response to the user operation, the master device 100 may store the contents displayed by the windows 1141 and 1142 as pictures or videos. In a live broadcast or video call scenario, the live broadcast application/video call application may also obtain the picture or video and send it to the server providing the live broadcast/video call.
In some embodiments, the number of slave devices may also not be limited to one. For example, the host device 100 may detect user operation of an add control, such as the add control 914 of the user interface 91, the add control 1145 of the user interface 114, and so on. In response to the user operation, the main apparatus 100 can establish connection with other electronic apparatuses with cameras. Further, the master device 100 may establish a connection with a plurality of slave devices and use images transmitted by the plurality of slave devices.
In some embodiments, the master device 100 may also only display images sent by the slave devices. For example, when the master device 100 establishes a communication connection with the slave device 200, the master device 100 may turn off its camera and display only an image transmitted by the slave device 200.
The following describes a detailed flow of the cross-device collaborative shooting method provided in the embodiment of the present application. Fig. 12 shows a detailed flow of the cross-device cooperative shooting method. As shown in fig. 12, the method may include the steps of:
s101: the master device 100 establishes a communication connection with the slave device 200.
In a specific implementation, the communication connection established between the master device 100 and the slave device 200 may be the aforementioned wired connection or wireless connection.
In some embodiments, the host device 100 may first discover other camera-equipped electronic devices in response to a received user operation (e.g., the user operation shown in fig. 6B acting on the add control 614), and then send a connection request to the user-selected electronic device. After the electronic device responds to the user action to grant the request (e.g., the user action on confirmation control 632 shown in fig. 6D), host device 100 and the electronic device successfully establish a communication connection.
In other embodiments, the master device 100 may scan a two-dimensional code of the electronic device 200 and establish a connection with the electronic device 200. Specifically, the electronic device may display the two-dimensional code for using its camera, and the master device 100 may scan the two-dimensional code. In response to this operation, the master device 100 may transmit a use request to the electronic device and obtain the consent of the electronic device.
Not limited to the above methods, the master device 100 may also establish a communication connection by other means, for example, a tap-to-connect operation based on NFC technology. The embodiments of the present application do not limit this.
Specifically, the main device 100 may search for other electronic devices having cameras through a device virtualization kit (DVKit), send a request for establishing a communication connection to the found electronic devices, and after receiving a message of agreeing to establish a communication connection fed back by the other electronic devices, the DVKit may establish a communication connection with the electronic devices. Further, the DVKit establishes a communication connection with the electronic device through a distributed device virtualization platform (DMSDP), where the DMSDP is specifically used for establishing a session with the electronic device. The sessions described above include control sessions and data sessions.
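As an illustration of the connection flow described above, the following Java-style sketch organizes discovery through the DVKit and session setup through the DMSDP; all class and method names (DvKit, Dmsdp, requestConnection, and so on) are assumptions made for this sketch and do not correspond to any published interface.

// Illustrative sketch of S101 (assumed names; not a published API).
import java.util.List;

public final class ConnectionSetup {
    /** Assumed wrapper around the device virtualization kit (DVKit). */
    interface DvKit {
        List<String> discoverCameraDevices();        // search for nearby devices with cameras
        boolean requestConnection(String deviceId);  // ask the remote user for consent
    }
    /** Assumed wrapper around DMSDP: one control session plus one data session. */
    interface Dmsdp {
        void openControlSession(String deviceId);    // later carries control commands (S104)
        void openDataSession(String deviceId);       // later carries image streams (S107)
    }
    /** Establishes a communication connection with the device the user selected. */
    static void establish(DvKit dvKit, Dmsdp dmsdp, String selectedDeviceId) {
        if (!dvKit.requestConnection(selectedDeviceId)) {
            return;                                  // remote user declined; stay unconnected
        }
        dmsdp.openControlSession(selectedDeviceId);
        dmsdp.openDataSession(selectedDeviceId);
        // From here on, the selected device acts as a slave device of the master device 100.
    }
}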
At this time, the above-described electronic device establishing a session with the master device 100 may be referred to as a slave device 200 of the master device 100.
S102: the master device 100 acquires the shooting capability information of the slave device 200 through communication connection.
Based on the session channel established by the master device 100 and the slave device 200, the DMSDP may register the slave device into the virtual camera HAL. Meanwhile, the DMSDP may request the photographing capability of the slave device 200 from the slave device 200.
Specifically, the slave device 200 may acquire its own photographing capability through its own camera service module (CameraService) module. The shooting capabilities include hardware capabilities of the camera and software capabilities of image processing modules such as ISP and GPU, and some shooting capabilities combining the hardware capabilities and the software capabilities, which can be specifically referred to in table 1 in fig. 3.
Then, the slave device 200 may transmit the above-described shooting capability to the master device 100. Upon receiving the shooting capability information transmitted from the slave device 200, the DMSDP may transmit the shooting capability information to the virtual camera HAL module of the HAL layer. Further, the virtual camera HAL may also send the above-described shooting capability to the camera manager (CameraManager).
Table 1 shows hardware capabilities that the slave device 200 may include, and a photographing function combining the hardware capabilities and the software capabilities. Such as the number of cameras, camera ID, pixels, aperture size, zoom range, filters, beauty, and various shooting modes that the slave device 200 is equipped with. In particular, the shooting mode such as the night view mode, the portrait mode, and the like includes not only the hardware capability of the camera of the slave device 200 but also the image processing capability of the slave device 200.
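The shooting capability report of S102 can be pictured as a simple data structure. The following Java-style sketch is only illustrative; the field names and types are assumptions, and the actual capability information exchanged between the slave device and the virtual camera HAL is not limited to this form.

// Illustrative sketch of the capability information of S102 (assumed field names).
import java.util.List;

public final class ShootingCapability {
    public int cameraId;                 // e.g. 1001 front camera, 1002 rear camera of the slave
    public long pixelWidth;              // e.g. 8192
    public long pixelHeight;             // e.g. 6144
    public double maxOpticalZoom;        // optical zoom range
    public double minAperture;           // aperture size range
    public List<String> filters;         // supported filter types
    public List<String> shootingModes;   // combined capabilities, e.g. "night", "portrait", "HDR"
    public boolean beautySupported;
    // Collected by the slave's CameraService, sent over the session, registered with the
    // virtual camera HAL, and forwarded to the CameraManager of the master device.
}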
The device cooperation management and dynamic pipeline modules may learn that the slave device 200 is currently registered in the virtual camera HAL.
When the slave device 200 is registered in the virtual camera HAL, the device cooperation management module needs to perform cooperation management on the slave device 200. Specifically, when the master device 100 simultaneously displays images from the master device 100 and the slave device 200, the device cooperation management can distinguish which camera of which device an image comes from according to the ID of the camera, for example, the camera 1 of the master device or the camera 1001 of the slave device. Then, the device cooperation management may keep the delay between displaying the images from the master device 100 and the slave device 200 within a range acceptable to the human eye by repeating or buffering the image of the master device 100.
Similarly, the dynamic pipeline may distinguish a control command to the master device 100 or the slave device 200 based on information such as the ID of the camera, and transmit the control command transmitted to the slave device 200 to the virtual camera HAL.
After completion of the registration, the virtual camera HAL may send a notification to the DVKit to change the connection state. Specifically, the virtual camera HAL may notify the DVKit to change the unconnected state to the connected state. The unconnected state means that the master device 100 has not established a connection with another electronic device for using the camera of that electronic device. Accordingly, the connected state means that the master device 100 has established a connection with another electronic device for using the camera of that electronic device.
S103: the slave device 200 collects and processes an image and then transmits the collected and processed image to the master device 100.
After the slave device 200 and the master device 100 successfully establish a communication connection, the slave device 200 may automatically start capturing and processing images.
Refer to the user interfaces shown in fig. 6D and 8A. When the slave device 200 agrees to connect with the master device 100, the slave device 200 may open its own camera application. The user interface of the camera application is shown in fig. 8A. Meanwhile, the slave device 200 may display an image captured and processed by its own camera, referring to a preview box 812 of fig. 8A.
Then, the slave device 200 may transmit the above-described image to the master device 100. The master device 100 may display the image. As shown in fig. 8B, the master device 100 may add a window 822 in the user interface 82. The window 822 may display images captured and processed by the slave device 200. Therefore, the master device 100 can realize cross-device collaborative shooting and simultaneously display images acquired by a plurality of cameras.
In some embodiments, the master device 100 may also only control and use the cameras of the slave devices 200, i.e. the master device 100 only displays the images of the slave devices 200 and not the captured and processed images of the master device 100.
In other embodiments, after the master device 100 and the slave device 200 establish a communication connection, the master device 100 may transmit a control command to turn on the camera to the slave device 200 in response to an operation of a user to turn on the camera of the slave device 200. The slave device 200 may respond to the command and turn on the camera.
The master device 100 may also display a prompt window, for example, following the user interface 63 shown in fig. 6D. The prompt window may ask the user whether to turn on the camera of the slave device 200. When the master device 100 detects a user operation acting on turning on the camera of the slave device 200, the master device 100 may transmit a control command to turn on the camera to the slave device 200. In response to the control command, the slave device 200 may turn on its camera and obtain an image captured and processed by the camera.
Further, the slave device 200 may transmit the above-described image to the master device 100. The master device 100 may display the image as described above, such as a window 822 shown in fig. 8B. Meanwhile, the slave device 200 may also display the above-described image on its own display screen. Such as user interface 81 shown in fig. 8A.
In some embodiments, the slave device 200 may not display the image captured and processed by its own camera.
In both schemes, the shooting parameters used by the camera of the slave device 200 may be defaults. For example, the slave device 200 may by default use the rear normal camera for capture, with a 1× focal length, the default daylight mode for color temperature calibration, an aperture size of f/1.6, optical image stabilization turned on, flash turned off, a shutter time of 1/60, an ISO sensitivity of 400, a pixel resolution of 8192 × 6144, a crop box size of 3, the beauty/body algorithm turned off, no filter, no sticker, and so on.
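For illustration, the default shooting parameters listed above could be represented as follows; the class and field names in this Java-style sketch are assumptions, and the values simply mirror the example defaults given in this paragraph.

// Illustrative defaults used by the slave camera before any control command arrives
// (values mirror the example above; names are assumptions).
public final class DefaultShootingParameters {
    public int cameraId = 1002;               // rear normal camera
    public double zoom = 1.0;                 // 1x focal length
    public String whiteBalance = "daylight";  // default color temperature calibration
    public double aperture = 1.6;             // f/1.6
    public boolean opticalStabilization = true;
    public boolean flash = false;
    public double shutterSeconds = 1.0 / 60;
    public int iso = 400;
    public int pixelWidth = 8192;
    public int pixelHeight = 6144;
    public boolean beauty = false;
    public String filter = "none";
    public String sticker = "none";
}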
Then, the master device 100 may send a series of control commands for cooperative shooting, such as a control command for taking a picture, recording a video, or adjusting a shooting effect, such as replacing a filter, to the slave device 200. The slave device 200 may adjust the acquisition and processing of images according to the commands described above.
S104: the master device 100 transmits a command for controlling a photographing effect to the slave device 200 in response to the received user operation.
The control command includes the following information: the shooting parameters adjusted in response to a specific user operation, and the type of the stream creation command (i.e., a preview command, a video recording command, or a photographing command). The multi-stream configuration module may configure different numbers and types of streams according to different stream creation commands.
The user-adjusted photographing parameters depend on the user operation received by the main device 100, and may include, but are not limited to: hardware parameters of the camera involved in acquiring the image, and/or software parameters involved in processing the image. The shooting parameters also include some combination of hardware parameters and software parameters. Such as compound zoom range, night mode, portrait mode, time-lapse shooting, slow motion, panoramic mode, HDR, and so forth.
Wherein the hardware parameters include one or more of the following: the ID of the camera, the optical zoom range, whether to turn on optical image stabilization, the aperture size adjustment range, whether to turn on the flash, whether to turn on the fill light, the shutter time, the ISO sensitivity value, the pixel resolution, the video frame rate, and the like.
The software parameters include one or more of: digital zoom value, image cropping size, color temperature calibration mode of image, whether to perform noise reduction on image, beauty/body type, filter type, sticker option, whether to turn on self-timer mirror, etc.
In some embodiments, the control command may further include default values of other photographing parameters. The other shooting parameters are shooting parameters other than the parameters adjusted by the user operation.
The control command carrying the photographing parameters may be transmitted to the slave device 200 through a communication connection established by the master device 100 and the slave device 200. In particular, the virtual camera HAL may send the created control commands to a device virtualization platform (DMSDP), which may include a data session channel and a control session channel of the master device 100 and the slave device 200. The control command may be transmitted to the slave device 200 through a control session channel.
For example, the "set the white balance mode of the slave device 200 to the incandescent lamp mode" control command may include: modified shooting parameters (white balance = incandescent mode), the stream creation command may be a preview command. The control commands may also include default shooting parameters such as one-time focus, no filter, close beauty, etc.
As shown in dialog 921 of user interface 92. According to the shooting capability information of the slave device 200 acquired in S102, the master device 100 may display a shooting capability option corresponding to the shooting capability on the display screen. The main device 100 can detect a user operation applied to a certain shooting-capability option. In response to the user operation, the master device 100 may transmit a command to control a photographic effect to the slave device 200.
Taking the control command of "setting the white balance mode of the slave device 200 to the incandescent lamp mode" as an example, the master device 100 may detect a user operation acting on the incandescent lamp mode, for example, a user operation acting on the incandescent lamp 923 in the user interface 92, and in response to the user operation, the master device 100 may transmit the above-described control command to the slave device 200, the control command being for controlling the slave device 200 to set the white balance mode to the incandescent lamp mode.
In the process of creating and sending the control command, the master device 100 also performs various processes on the command, such as dynamic pipelining, multi-stream configuration, and multiplexing control.
Dynamic pipeline processing is described with reference to fig. 13.
When the application program generates a control command to set the white balance mode of the slave device 200 to the incandescent lamp mode, the camera application 131 may generate a control command containing the preview control 1311 and the adjust shooting effect 1312. The control command corresponds to a repeat frame control 1314. A repeat frame control may indicate that the control command acts on multiple frames. The repeat frame control may include the fields Cmd and Surfaces. The Cmd field may be used to represent the control command. In some embodiments, Cmd may also include the number of the control command. The Surfaces field can be used to receive the view on which the picture is rendered, and the rendered result is sent to SurfaceFlinger for image composition and display on the screen.
The dynamic pipeline may tag the control command described above. The tags may include an on-demand tag, a flow direction tag, and a repeat frame tag. The camera service 132 may include a pending command queue 1321. When the repeat frame control command reaches the pending command queue 1321, it may replace the basic command 1322 (e.g., "cmd + streamIds + buffer") in the original pending command queue 1321; the basic command further includes other default parameters for controlling the shooting effect, such as the default white balance mode (daylight mode), the default filter (no filter), and so on.
After the replacement, a repeat frame control command 1324 (e.g., "cmd + streamIds + buffer + incandescent lamp mode + IsNew + Repeating") is added to the pending command queue 1321. The repeat frame control command 1324 may add two fields, IsNew and Repeating. IsNew may be used to indicate that the command is an on-demand control issued by the application. Repeating may be used to indicate that the command is a repeat frame control. Meanwhile, the repeat frame control command 1324 may replace the original default white balance mode (e.g., the daylight mode described above) with the incandescent lamp mode (i.e., white balance = incandescent lamp mode).
In addition, the dynamic pipeline may also mark the flow direction tag used to send the control command described above to the slave device 200. For example, the dynamic pipeline may add a flow direction tag Device to the control command. The Device field can indicate, through a camera number (ID), the object on which the control command acts. Referring to table 1, when Device = 1, the control command flows to the front lens of the master device 100, and the local camera HAL of the master device 100 may receive the control command. When Device = 1002, the control command flows to the rear common lens of the slave device 200, and the virtual camera HAL of the master device 100 may receive the control command. Accordingly, the Device field of the control command that sets the white balance mode of the slave device 200 to the incandescent lamp mode may be set to 1002.
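The flow direction tag can be pictured as a simple routing step. In the Java-style sketch below, the assumption that local cameras use small IDs while slave cameras use IDs of 1000 and above merely follows the examples Device = 1 and Device = 1002; the class and interface names are illustrative.

// Illustrative sketch of routing by the flow direction tag (assumed ID convention:
// IDs below 1000 are local master cameras, IDs of 1000 and above are slave cameras).
public final class CommandRouter {
    interface CameraHal { void enqueue(Object command); }

    private final CameraHal localCameraHal;
    private final CameraHal virtualCameraHal;

    CommandRouter(CameraHal localCameraHal, CameraHal virtualCameraHal) {
        this.localCameraHal = localCameraHal;
        this.virtualCameraHal = virtualCameraHal;
    }

    /** Routes a tagged command according to its Device (camera ID) field. */
    void route(int deviceId, Object command) {
        if (deviceId >= 1000) {
            virtualCameraHal.enqueue(command);   // e.g. Device = 1002: slave rear camera
        } else {
            localCameraHal.enqueue(command);     // e.g. Device = 1: master front camera
        }
    }
}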
Then, the multi-stream configuration in the stream processing module may add stream configuration information to the control command. The control commands include a preview control 1311 and a photographing effect 1312 of "setting the white balance mode to the incandescent lamp mode". That is, the multi-stream configuration may configure one preview stream (1080P) and one analysis stream (720P) for preview control 1311. The specific configuration rule of the multi-stream configuration may refer to the introduction of table 2 in fig. 3, which is not described herein again.
The multiplexing control may multiplex the plurality of streams configured by the multi-stream configuration module. Through multiplexing control, the number of streams transmitted back from the slave device 200 to the master device 100 can be reduced, thereby reducing the network load and improving transmission efficiency. In particular, the multiplexing control may use a high-quality stream to cover a low-quality stream. For example, an analysis stream with a picture quality of 720P may be multiplexed onto a 1080P preview stream. Thus, for a control command requiring one preview stream (1080P) and one analysis stream (720P), the slave device 200 may only transmit back a single 1080P preview stream.
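The covering rule can be illustrated as follows. The sketch assumes that the preview stream and the analysis stream are compatible enough for one to be derived from the other, as in the 1080P/720P example above; names and types are illustrative.

// Illustrative sketch of "a high-quality stream covers a low-quality stream" (assumed names).
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public final class MultiplexControl {
    record StreamSpec(String kind, int height) { }  // kind: "preview", "analysis", "record", "photo"

    /** Keeps only the highest-quality stream of each compatible group to request from the slave. */
    static List<StreamSpec> multiplex(List<StreamSpec> configured) {
        Map<String, StreamSpec> best = new LinkedHashMap<>();
        for (StreamSpec s : configured) {
            // Assumption: an analysis stream can be derived from a preview stream after transmission.
            String group = s.kind().equals("analysis") ? "preview" : s.kind();
            StreamSpec current = best.get(group);
            if (current == null || s.height() > current.height()) {
                best.put(group, s);
            }
        }
        return new ArrayList<>(best.values());
    }
    // Example: [preview 1080, analysis 720] -> [preview 1080]; the slave returns a single stream.
}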
Further, the above control command may be sent to the virtual camera HAL module of the HAL layer according to the flow direction tag Device = 1002. As shown in fig. 13, the virtual camera HAL may filter out the on-demand control command 1324. The filtered on-demand control command may be stored in the send command queue 1331. The master device 100 may transmit the control command in the send command queue 1331 to the slave device 200.
In some embodiments, when the control command is a command to start photographing/recording, the multi-stream configuration may further configure a photographing stream and a recording stream. The multiplexing control will change accordingly according to the multi-stream configuration.
Fig. 14 illustrates an example of multiplexing and demultiplexing of a different stream when the control command is a photograph command. Fig. 14 may include a pre-processing portion and a post-processing portion. The pre-processing part can be completed by a pre-processing module of the stream processing, and the post-processing part can be completed by a post-processing module of the stream processing, which can be referred to as the introduction of fig. 3.
When the camera device session 141 sends a photographing command, the multi-stream configuration 142 module may configure a preview stream (1080P), an analysis stream (720P), a video stream (4K), and a photographing stream (4K) for the command. Then, the multiplexing control 143 module may multiplex the configuration information of the four streams into two streams: one preview stream (1080P) and one video stream (4K). Finally, the master device 100 transmits the configured and multiplexed photographing control command to the slave device 200.
In the process in which the master device 100 transmits a command for controlling a photographic effect to the slave device 200 in response to a received user operation, the master device 100 may multiplex an image stream (stream) required in the control command. In response to the multiplexed control command, the slave device 200 may only send the multiplexed image stream, thereby reducing the image streams transmitted in the network, reducing the network load, and further improving the transmission efficiency.
S105: The slave device 200 receives the command for controlling the shooting effect, and in response to the command, the slave device 200 may capture and process an image.
The camera agent service of the slave device 200 may receive a control command for adjusting a photographing effect transmitted by the master device 100. Specifically, DMSDP can establish a data session channel and a control session channel between the master device 100 and the slave device 200. The camera agent service module of the slave device 200 may receive the control command transmitted through the control session channel. In response to the control command, the slave device 200 may acquire and process an image according to the photographing parameters carried in the control command.
Specifically, the slave device 200 may perform a corresponding operation according to the shooting parameters carried in the control command. When the control command carries a hardware parameter, the slave device 200 may acquire an image according to the hardware parameter, which may include but is not limited to: the slave device 200 captures images using the front or rear camera, optical zoom range, turn on optical image anti-shake, turn on flash, turn on fill light, frame rate of 30fps indicated in the control command. When the control command carries a software parameter, the slave device 200 may process the acquired image according to the software parameter, which may include but is not limited to: and (4) cutting, color temperature calibration, noise reduction, filter effect addition and the like are carried out on the acquired image.
Also taking the control command of "setting the white balance mode of the slave device 200 to the incandescent lamp mode" as an example, after the slave device receives the command, the slave device may adapt and parse the command, and then send the command to the local camera HAL of the slave device 200, and finally obtain the processing result.
The slave device 200 part of fig. 13 illustrates an example of the processing procedure of the control command in the slave device.
The camera proxy service 134 may include a receive command queue 1341, a Surface mapping table 1344, and a repeat frame control 1342. The receive command queue 1341 may receive the control command sent by the master device 100 to set the white balance mode of the slave device 200 to the incandescent lamp mode. The control command may refer to the command 1343 in the figure (i.e., "cmd + streamIds + Repeating + incandescent lamp mode + 1002, one 1080P preview stream"). The incandescent lamp mode indicates that the white balance mode of the slave device 200 is to be set to the incandescent lamp mode, and 1002 identifies the object to which the command is sent as the camera 1002 of the slave device 200.
First, the camera proxy service 134 may convert the StreamIds of the control command in the receive command queue into Surfaces of the slave device 200 according to the Surface mapping table 1344. In the camera service there is a one-to-one correspondence between a streamId and a surfaceId, i.e., one streamId corresponds to one surfaceId. The surfaceId may be used to identify a Surface. The above-mentioned Surfaces can serve as carriers for displaying images of the streams generated by the slave device 200. With reference to the description of the previous embodiments, a streamId may indicate a particular stream and a Surface may indicate a particular carrier on which display images are rendered. Since the slave device 200 needs to generate the corresponding streams according to the request transmitted from the master device 100, the slave device 200 needs to generate the Surfaces corresponding to the StreamIds according to the mapping table 1344 after receiving the control command.
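The Surface mapping table can be pictured as a one-to-one map that lazily allocates a surfaceId for each incoming streamId. The following is only a Java-style sketch with assumed names; the real mapping maintained by the camera proxy service may carry more information.

// Illustrative sketch of the Surface mapping table (assumed names).
import java.util.HashMap;
import java.util.Map;

public final class SurfaceMappingTable {
    private final Map<Integer, Integer> streamToSurface = new HashMap<>(); // one streamId -> one surfaceId
    private int nextSurfaceId = 1;

    /** Returns the slave-side surfaceId for a master-side streamId, allocating one on first use. */
    int surfaceFor(int streamId) {
        return streamToSurface.computeIfAbsent(streamId, id -> nextSurfaceId++);
    }
}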
The pending command queue 1351 may then replace the basic command in the original queue with the repeat frame control described above, resulting in a pending command 1352. The slave device 200 can transmit the above-described command to the command processing queue 1361 of the camera 1002 according to Device = 1002. In response to the above command, the ISP of camera 1002 may set the white balance mode to the incandescent lamp mode, so that the ISP of camera 1002 may perform incandescent lamp color temperature calibration on the image captured by camera 1002. The ISP may be an image video processor (IVP), a neural processing unit (NPU), a digital signal processor (DSP), or the like, which is not limited in the present application.
After the white balance mode is adjusted, the slave device 200 may obtain a new image.
Referring to the user interface 91 shown in fig. 9A, before the slave device 200 responds to the above-described control command for setting the white balance mode to the incandescent lamp mode, the image captured and processed by the camera 1002 of the slave device 200 may be displayed as shown in window 912. As shown in window 912, under the default white balance option (e.g., daylight mode), the image captured and processed by camera 1002 may be less bright and grayish. After the slave device 200 responds to the control command to set the white balance mode to the incandescent lamp mode, the image captured and processed by the camera 1002 may be as shown in window 931 in the user interface 93. At this time, the image has higher brightness, and the overall tone of the picture is closer to the colors observed by human eyes.
It is to be understood that the above description of the image effects before and after the slave device 200 responds to the control command for adjusting the photographing effect is an example. The above description should not be construed as limiting the embodiments of the present application.
S106: the slave device 200 displays the processed image.
This step is optional. As described in S103, in some embodiments, after the slave device 200 uses the own camera simultaneously with the master device 100, the slave device 200 may display an image acquired and processed by the own camera on a display screen of the device. Refer to the user interface 81 shown in fig. 8A. Accordingly, the slave device 200 may also display the updated image when receiving a control command in response to adjusting the photographing effect. Such as user interface 94 shown in fig. 9D.
In other embodiments, the slave device 200 may not display the image captured and processed by its own camera when the camera is turned on. Therefore, after the slave device 200 responds to the control command for adjusting the shooting effect transmitted by the master device 100, the slave device 200 may not display the adjusted image. That is, the slave device only sends the obtained image to the master device 100 for use, and the display screen of the slave device does not display the image captured and processed by its camera.
S107: the slave device 200 transmits the processed image to the master device 100.
In response to the control command for adjusting the shooting effect transmitted by the master device 100, the slave device 200 may obtain a new image. According to the type and number of streams required by the master device 100, the slave device 200 may generate the required streams accordingly. In some embodiments, the slave device 200 may capture one set of image streams. This set of image streams is the one with the highest image quality among the types and numbers of streams required by the master device 100, where the image quality may be, for example, the highest resolution. The slave device 200 may then copy the captured set of image streams into a plurality of streams, and compress, adjust, etc. the copied streams according to the type and number of streams required by the master device 100, for example, copying a 720P analysis stream from a captured 1080P video stream.
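The copy-and-adjust behavior can be sketched as follows in Java; the Frame type, the downscale helper, and the resolution-based selection are assumptions used only to illustrate deriving lower-quality copies from a single highest-quality capture.

// Illustrative sketch of deriving the requested streams from one highest-quality capture
// (Frame and the downscale helper are assumptions).
import java.util.ArrayList;
import java.util.List;

public final class StreamReplicator {
    record Frame(int height, byte[] data) { }

    /** Produces one output frame per requested resolution from the single captured frame. */
    static List<Frame> replicate(Frame captured, List<Integer> requestedHeights) {
        List<Frame> out = new ArrayList<>();
        for (int height : requestedHeights) {
            if (height >= captured.height()) {
                out.add(captured);                     // already the best available quality
            } else {
                out.add(downscale(captured, height));  // e.g. 1080P video -> 720P analysis copy
            }
        }
        return out;
    }

    private static Frame downscale(Frame source, int targetHeight) {
        // Placeholder: a real implementation would rescale or compress the image data.
        return new Frame(targetHeight, source.data());
    }
}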
Also taking the control command of "setting the white balance mode of the slave device 200 to the incandescent lamp mode" as an example, as described above, the multiplexing control module may multiplex the two streams configured by the multi-stream configuration module into one stream. Therefore, the master device 100 finally transmits to the slave device 200 a control command requiring one 1080P preview stream. Reference is made specifically to the introduction of S104.
In response to the above control command sent by the master device 100, the local camera HAL of the slave device 200 may generate a 1080P preview stream. The preview stream described above has set the white balance mode to the incandescent lamp mode. The local camera HAL may then send the stream to the camera proxy service of the slave device 200, which may in turn transmit the stream back to the master device 100.
In some embodiments, more than one stream may be transmitted back from the slave device 200 to the master device 100, such as in the multi-stream scenario illustrated in fig. 14. When the master device 100 sends a photographing control command to the slave device 200, the multi-stream configuration and multiplexing control finally configure one preview stream (1080P) and one video stream (4K) for the photographing control command, which may be referred to in the description of S104.
In response to the above-described photographing control command, the slave device 200 can generate a 1080P preview stream (stream 1), a 4K video stream (stream 2), and a photographed image stream (stream 3). In an alternative embodiment, the slave device 200 may capture only one set of image streams, namely stream 2. From stream 2, the slave device 200 can copy out two other sets of streams (stream 1', stream 2'); according to the requirements of the master device 100 on the remaining streams, the slave device 200 can process stream 1' and stream 2' to obtain them, for example, compress the 4K stream 1' to obtain stream 1 (1080P), and obtain the photographed image from the 4K stream 2'. Then, the camera proxy service of the slave device 200 may transmit the above three streams to the master device 100.
S108: the master device 100 displays the image transmitted from the slave device 200.
The master device 100 can restore the streams returned from the slave device 200 to the number and type of streams actually required by the master device 100. The analysis stream may be used for image processing, the preview stream may be used for display, and so on. The master device 100 may send each stream to the corresponding module for processing and utilization according to the type of the restored stream. The master device 100 may implement the function of displaying an image by using the preview stream transmitted from the slave device 200.
The post-processing module of the master device 100 can perform intelligent splitting and multi-stream output on the streams returned by the slave device 200.
First, the intelligent splitting module may record the number and types of streams configured by the multi-stream configuration module. After receiving the multiplexed stream returned from the slave device 200, the intelligent splitting module may duplicate the received stream into multiple streams according to the aforementioned record. The duplicated streams may have the same quality as the received stream, or a slightly lower quality, so as to obtain streams having the same number and types as those configured by the multi-stream configuration module. Then, the streams obtained after splitting can be sent to the corresponding modules for processing and utilization.
For example, in the example where the white balance mode of the slave device 200 is set to the incandescent lamp mode, the master device 100 may receive one 1080P preview stream transmitted back from the slave device 200. At this time, the intelligent splitting module can restore the preview stream to the preview stream (1080P) and the analysis stream (720P) configured by the multi-stream configuration module.
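On the master side, the intelligent splitting can be pictured as replaying the recorded stream configuration against the single returned stream. The Java-style sketch below is illustrative; the record/split method names and the simplified downscaling are assumptions.

// Illustrative sketch of intelligent splitting on the master side (assumed names and types).
import java.util.ArrayList;
import java.util.List;

public final class IntelligentSplitter {
    record StreamSpec(String kind, int height) { }
    record Frame(int height, byte[] data) { }

    private final List<StreamSpec> recordedConfiguration = new ArrayList<>();

    /** Records the streams configured by the multi-stream configuration module before multiplexing. */
    void record(List<StreamSpec> configured) {
        recordedConfiguration.clear();
        recordedConfiguration.addAll(configured);
    }

    /** Duplicates the single multiplexed return stream into the originally configured streams. */
    List<Frame> split(Frame received) {
        List<Frame> restored = new ArrayList<>();
        for (StreamSpec spec : recordedConfiguration) {
            // e.g. a 1080P preview frame is reused as the preview stream and reduced to 720P
            // for the analysis stream; quality never exceeds that of the received stream.
            int height = Math.min(spec.height(), received.height());
            restored.add(new Frame(height, received.data()));
        }
        return restored;
    }
}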
Fig. 14 also shows another example of intelligent splitting. This figure shows the splitting process in which the master device 100 receives multiple streams back from the slave device 200.
As shown in the figure, the master device 100 may receive one 1080P stream (stream 1), one 4K stream (stream 2), and one picture stream (stream 3). The intelligent splitting module can split stream 1 into a 1080P preview stream and a 720P analysis stream. Stream 2 can be split into a 1080P stream (stream 1) and a 4K video stream, and the 1080P stream can be further split in the same way as stream 1. Stream 3 may be split into a 4K photo stream.
The splitting process can restore the multiplexed streams to the streams originally required by the master device 100, so that the reduced network load and improved transmission efficiency brought by multiplexing are retained while the original requirements of the application are still met, without affecting normal use of the application.
The streams obtained after splitting can be respectively sent to the corresponding modules of the application layer for use by applications. For example, the application may display the preview stream. In some embodiments, the master device 100 may only display the images transmitted back from the slave device 200. In other embodiments, the master device 100 may simultaneously display the images transmitted back from the slave device 200 and the images captured and processed by its own camera, referring to the user interface 93 shown in fig. 9C.
As shown in fig. 9C, the master device 100 may display the image transmitted back from the slave device 200 in a floating window, i.e., the window 931. In some embodiments, the master device 100 may detect a user operation on the window 931, in response to which the master device 100 may move the floating window to any position in the user interface 93; the operation is, for example, a long-press drag operation. In some embodiments, the master device 100 may detect another user operation acting on the window 931, in response to which the master device 100 may exchange the contents displayed in the floating window and the preview window; the operation is, for example, a tap operation. In addition, the master device 100 may also adjust the size of the floating window, and so on, in response to other user operations.
In other embodiments, when the master device 100 simultaneously displays the image transmitted back from the slave device 200 and the image captured and processed by the own camera, the display may be further divided into two tiled windows, such as the user interface 114.
It is understood that when the master device 100 connects a plurality of slave devices, the master device 100 may display images transmitted back by the plurality of slave devices. Similarly, the main device 100 may be displayed through a floating window, and may be displayed in a tiled manner, which is not limited in this embodiment of the application.
In some embodiments, when the control command is a photographing command issued after adjusting parameters, the frame synchronization module may synchronize the stream returned from the slave device 200 before the post-processing splits it. Frame synchronization can reduce delay errors caused by network transmission. Particularly for the photo stream, the photographing result obtained from the frame-synchronized photo stream can be closer to what the user intended.
Referring to fig. 15, fig. 15 illustrates a frame synchronization method diagram.
As shown in fig. 15, the frame synchronization may include three parts. The first portion 151 and the third portion 153 may represent processes occurring on the master device 100. The second portion 152 may represent a process occurring on the slave device 200. Taking a stream composed of 7 frames and a photographing command as an example, the frame synchronization process performed by the master device 100 on the photographing command is as follows.
At the second frame 1511, the master device 100 may create a photographing command 1512. The master device 100 may then send the command to the slave device 200 through a wired or wireless connection. The slave device 200 receives the command at the third frame 1521. Accordingly, the slave device 200 takes the image of the third frame 1521 as the result of executing the photographing command 1512. Having finished processing the photographing command, the slave device 200 may transmit the processing result back to the master device 100. The processing result received by the master device 100 is the third frame 1521 image. At this time, the master device 100 may perform synchronization processing on this result.
In the photographing scene, the frame synchronization may shift the processing result forward. For example, the processing result received by the master device 100 is the third frame 1521 image; the master device 100 may shift it forward by one frame and take the second frame 1511 as the synchronized processing result. It will be appreciated that the slave device 200 receiving the command one frame later is merely an example of network delay. That is, in some embodiments, the slave device 200 may also receive the command at the fourth frame, the fifth frame, and so on; the delay with which the slave device 200 receives the command varies with the actual network communication quality. Similarly, the master device 100 shifting the result forward by one frame is also exemplary.
In some embodiments, the master device 100 may not take the second frame 1511 as the synchronized processing result. For example, the master device 100 may create a photographing command at the second frame; the slave device 200 then receives the command at the fourth frame. Accordingly, the slave device 200 takes the image of the fourth frame as the result of executing the photographing command. The master device 100 performs forward-shift synchronization on the received fourth frame image to obtain the third frame image.
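The forward-shift synchronization described in these examples can be summarized by a small helper in Java; the idea of expressing the delay in whole frames and clamping the shift at the command frame is an assumption made only for illustration.

// Illustrative sketch of frame synchronization for the photographing result (the whole-frame
// delay model and the clamp at the command frame are assumptions).
public final class FrameSynchronizer {
    /**
     * @param commandFrame frame index at which the master created the photographing command (e.g. 2)
     * @param resultFrame  frame index of the image the slave actually captured (e.g. 3 or 4)
     * @param forwardShift how many frames to shift the result forward (e.g. 1); 0 disables the shift
     * @return the frame index treated as the synchronized photographing result
     */
    static int synchronize(int commandFrame, int resultFrame, int forwardShift) {
        int synced = resultFrame - forwardShift;
        return Math.max(synced, commandFrame); // never earlier than the frame the command was issued
    }
    // Example from the text: command at frame 2, result at frame 3, shift 1 -> synchronized frame 2.
    // With a larger delay: command at frame 2, result at frame 4, shift 1 -> synchronized frame 3.
}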
When the control command is a command for adjusting only parameters or a video recording command, such as the above-described control command for controlling the photographing effect of "setting the white balance mode to the incandescent lamp mode", the host device 100 may not frame-synchronize the received image.
S109: the host device 100 performs processing such as photographing, recording, or forwarding of the displayed image.
When the application of the master device 100 obtains the image transmitted from the slave device 200, the master device 100 may further utilize the image.
As shown in fig. 9C, in the live application scenario, the live application of the host device 100 may forward the obtained image. Specifically, the host device 100 may forward the image to a live server. The server may distribute the images to user devices watching the live broadcast.
As shown in fig. 11D, in the application scene of photographing or recording, the host device 100 may detect a user operation acting on the photographing control 1146, and in response to the user operation, the host device 100 may perform photographing storage or recording storage on the obtained image captured from the camera of the device 200.
In the embodiments of the present application:
In the method S103, before the slave device receives the control command for adjusting the shooting effect sent by the master device, the image acquired and processed by the slave device may be referred to as a first image, for example, the image shown in window 912 in the user interface 91.
In the method S108, after the master device sends the control command for adjusting the shooting effect to the slave device, the image acquired and processed by the slave device according to the control command may be referred to as a second image, such as the image shown in window 931 in the user interface 93.
The first shooting parameter: the shooting parameters used by the slave device to acquire and process the first image may be referred to as first shooting parameters; the first shooting parameters may be default parameters or the shooting parameters carried in the most recently received control command sent by the master device.
The second shooting parameter: the shooting parameters used by the slave device to acquire and process the second image may be referred to as second shooting parameters, that is, the shooting parameters carried in the control command for adjusting the shooting effect that the master device sends to the slave device.
An image acquired and processed by the main device according to the default shooting parameters of the camera of the main device may be referred to as a third image, for example, an image displayed in the window 911 in the user interface 91.
The user can adjust the shooting effect of the master device, and in response to the adjustment operation, the master device can control its camera to adjust the shooting parameters. The master device acquires and processes an image according to the adjusted shooting parameters to obtain a new image, which may be referred to as a fourth image, such as the image displayed in window 1031 in the user interface 103.
In the method S104, the number of streams configured by the master device according to the requirements of the application layer is a first number, and the number of streams sent to the slave device after the master device multiplexes the reusable streams is a second number. Similarly, the type of the streams configured by the master device according to the requirements of the application layer is a first type, and the type of the streams sent to the slave device after multiplexing is a second type.
Another possible software structure of the master device 100 is described below.
The difference from the software structure shown in fig. 3 is that the software framework provided by this embodiment implements the management of the entire stream life cycle in the HAL layer. That is, in this embodiment of the present application, the stream processing module of the service layer in fig. 3 is moved to the HAL layer. The other modules are unchanged. For the other parts of this software structure, refer to the description of fig. 3, which is not repeated in this embodiment of the present application.
The above-described pre-processing and post-processing operations on the streams, such as multi-stream configuration and multiplexing control, are implemented in the HAL layer, and thus higher processing efficiency can be achieved.
In the cross-device cooperative shooting, the master device can be connected with one or more slave devices, so that not only can a multi-view shooting experience be provided for a user, but also the shooting effect of the slave devices can be controlled, for example, the focusing, exposure, zooming and the like of the slave devices are controlled, and the requirement of the user for controlling the far-end shooting effect is met. Further, by implementing the cross-device collaborative shooting method, the master device can acquire all preview pictures, shooting results and video recording results of the slave devices. For example, the master device may store the screen of the slave device by taking a picture or recording a video, or forward the screen of the slave device to a third-party server in a live application scenario.
The implementation of the cross-device collaborative shooting method can also solve the problem of distributed control among devices with operating systems, extend the functions of the electronic device to other common hardware shooting devices, and flexibly expand the available lenses. For example, the method can support obtaining the data streams of multiple devices through mobile phone control, and realize collaborative recording among a mobile phone, a large screen, a watch, an in-vehicle device, and the like.
In addition, the stream multiplexing and splitting in the cross-device collaborative shooting can also be used in network transmission, which improves transmission efficiency, further reduces transmission delay, and ensures clear and smooth image quality.
The method, system, and apparatus for cross-device collaborative shooting can be further extended to a distributed audio scenario, for example, by applying a distributed audio framework in the same way as the distributed camera framework. The distributed audio scenario can unify distributed audio and video and improve the overall cross-device communication efficiency.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …", "after …", "in response to determining …", or "in response to detecting …", depending on the context. Similarly, the phrase "when it is determined that …" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined that …", "in response to determining that …", "upon detecting (a stated condition or event)", or "in response to detecting (a stated condition or event)", depending on the context.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer commands. The procedures or functions described in accordance with the embodiments of the present application are generated in whole or in part when the computer program commands are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer commands may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, the computer commands may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, digital subscriber line) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.
Those skilled in the art will understand that all or part of the processes in the methods of the above embodiments may be implemented by a computer program; the program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Claims (11)

1. A cross-device collaborative shooting method, applied to a master device, wherein the master device establishes a communication connection with m slave devices, m being an integer greater than or equal to 1, and the method comprises:
the master device displays an interface of an application;
the master device receives a first image sent by each of the m slave devices, wherein the first image is an image obtained by the slave device according to a first shooting parameter;
the master device displays the m first images on the interface;
the master device receives at least one operation;
the master device responds to the at least one operation and sends a control command carrying a second shooting parameter to the slave device, wherein the second shooting parameter is used for adjusting the shooting effect of the slave device;
the master device receives a second image sent by the slave device, wherein the second image is an image obtained by the slave device according to the second shooting parameter;
the master device displays the second image on the interface.
2. The method of claim 1, wherein
the interface further comprises a plurality of shooting options corresponding to the slave devices, and the shooting options respectively correspond to the shooting capabilities of the slave devices;
wherein the at least one operation comprises an operation acting on one of the plurality of shooting options; and the second shooting parameter comprises a shooting parameter corresponding to the shooting option on which the operation acts.
3. The method of claim 2, wherein, before the master device displays the plurality of shooting options on the interface, the method further comprises:
the master device acquires the shooting capability of the slave device; wherein the second shooting parameter is within a shooting capability range of the slave device.
4. The method according to any one of claims 1-3, further comprising:
the master device acquires and processes an image to obtain a third image;
the master device also displays the third image on the interface.
5. The method of claim 4, wherein
the interface further comprises a plurality of shooting options corresponding to the master device, and the plurality of shooting options corresponding to the master device correspond to the shooting capability of the master device;
the master device receives another operation acting on one of the plurality of shooting options corresponding to the master device, and acquires and processes an image according to a shooting parameter corresponding to the shooting option on which the other operation acts, to obtain a fourth image;
and the master device displays the fourth image on the interface and no longer displays the third image.
6. The method according to any one of claims 1-5, wherein, before the master device sends the control command carrying the second shooting parameter to the slave device in response to the at least one operation, the method further comprises:
the master device determines a first number and a first type, wherein the first number is the number of image streams required for displaying the second image, and the first type comprises the type of the image streams required for displaying the second image;
the master device determines a second number and a second type, wherein the second number is smaller than the first number and the first type contains the second type;
wherein the control command further carries the second number and the second type;
and the second image comprises the second number of image streams of the second type, which are obtained by the slave device through acquisition and processing according to the second shooting parameter.
7. The method of claim 6, wherein after the master device receives the second image sent by the slave device, the method further comprises:
the master device processes the second image into the first number of image streams of the first type;
and the master device displays the second image on the interface, which comprises: the master device displays the second image on the interface according to the first number and the first type of image streams.
8. The method according to any one of claims 1-7, wherein the application comprises a shooting application, and the interface comprises a user interface of the shooting application.
9. The method according to any one of claims 1-8, wherein the application comprises a live-streaming application, and the interface comprises a user interface of the live-streaming application;
after the master device receives the second image sent by the slave device, the method further comprises:
the master device sends the second image to a server corresponding to the live-streaming application, and the server sends the second image to another device.
10. An electronic device, comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors and are configured to store computer program code, the computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any one of claims 1-9.
11. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any one of claims 1-9.
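Purely as an informal illustration of the interaction recited in claim 1, and without limiting or restating the claims, the following sketch walks through the master-device steps in order; all interfaces, types and names are hypothetical assumptions.

```java
// Informal, hypothetical walk-through of the claim-1 interaction on the master device.
// Ui and SlaveLink are assumed abstractions, not a real API.
public final class Claim1FlowSketch {

    interface Ui {
        void showApplicationInterface();
        void showImage(String slaveDeviceId, byte[] image);
    }

    interface SlaveLink {
        byte[] receiveImage(String slaveDeviceId);                        // a first or second image
        void sendControlCommand(String slaveDeviceId, double secondShootingParameter);
    }

    static void run(Ui ui, SlaveLink link, String[] slaveDeviceIds, double secondShootingParameter) {
        ui.showApplicationInterface();                                    // master displays the interface
        for (String id : slaveDeviceIds) {
            byte[] firstImage = link.receiveImage(id);                    // image obtained with the first parameter
            ui.showImage(id, firstImage);                                 // master displays the m first images
        }
        String target = slaveDeviceIds[0];                                // a user operation selects a slave device
        link.sendControlCommand(target, secondShootingParameter);        // command carries the second parameter
        byte[] secondImage = link.receiveImage(target);                   // image obtained with the second parameter
        ui.showImage(target, secondImage);                                // master displays the second image
    }
}
```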
CN202210973390.5A 2021-02-04 2021-02-04 Cross-equipment collaborative shooting method, related device and system Active CN115514883B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210973390.5A CN115514883B (en) 2021-02-04 2021-02-04 Cross-equipment collaborative shooting method, related device and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110154962.2A CN114866681B (en) 2021-02-04 2021-02-04 Cross-equipment collaborative shooting method, related device and system
CN202210973390.5A CN115514883B (en) 2021-02-04 2021-02-04 Cross-equipment collaborative shooting method, related device and system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202110154962.2A Division CN114866681B (en) 2021-02-04 2021-02-04 Cross-equipment collaborative shooting method, related device and system

Publications (2)

Publication Number Publication Date
CN115514883A true CN115514883A (en) 2022-12-23
CN115514883B CN115514883B (en) 2023-05-12

Family

ID=82623054

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202210973390.5A Active CN115514883B (en) 2021-02-04 2021-02-04 Cross-equipment collaborative shooting method, related device and system
CN202110154962.2A Active CN114866681B (en) 2021-02-04 2021-02-04 Cross-equipment collaborative shooting method, related device and system
CN202280009678.9A Pending CN116724560A (en) 2021-02-04 2022-01-07 Cross-equipment collaborative shooting method, related device and system

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN202110154962.2A Active CN114866681B (en) 2021-02-04 2021-02-04 Cross-equipment collaborative shooting method, related device and system
CN202280009678.9A Pending CN116724560A (en) 2021-02-04 2022-01-07 Cross-equipment collaborative shooting method, related device and system

Country Status (2)

Country Link
CN (3) CN115514883B (en)
WO (1) WO2022166521A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116320597A (en) * 2022-09-06 2023-06-23 北京字跳网络技术有限公司 Live image frame processing method, device, equipment, readable storage medium and product
CN115379126B (en) * 2022-10-27 2023-03-31 荣耀终端有限公司 Camera switching method and related electronic equipment
CN118741301A (en) * 2023-03-28 2024-10-01 荣耀终端有限公司 Image shooting method, electronic equipment and system
CN116471429B (en) * 2023-06-20 2023-08-25 上海云梯信息科技有限公司 Image information pushing method based on behavior feedback and real-time video transmission system

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130093907A1 (en) * 2011-10-14 2013-04-18 Canon Kabushiki Kaisha Image capturing apparatus and method of controlling the same
CN104601960A (en) * 2015-01-30 2015-05-06 深圳市视晶无线技术有限公司 Video shooting control and management method and system
US20160050351A1 (en) * 2014-08-14 2016-02-18 Samsung Electronics Co., Ltd. Image photographing apparatus, image photographing system for performing photographing by using multiple image photographing apparatuses, and image photographing methods thereof
CN106803879A (en) * 2017-02-07 2017-06-06 努比亚技术有限公司 Cooperate with filming apparatus and the method for finding a view
CN107707862A (en) * 2017-05-25 2018-02-16 北京小米移动软件有限公司 Treating method and apparatus, first terminal, the second terminal of Video Remote assistance
CN108668071A (en) * 2017-03-29 2018-10-16 至美世界(北京)网络科技有限公司 A kind of image pickup method, device, system and a kind of mobile terminal
CN108900764A (en) * 2018-06-06 2018-11-27 三星电子(中国)研发中心 Image pickup method and electronic device and filming control method and server
CN110291774A (en) * 2018-03-16 2019-09-27 深圳市大疆创新科技有限公司 A kind of image processing method, equipment, system and storage medium
CN111050072A (en) * 2019-12-24 2020-04-21 Oppo广东移动通信有限公司 Method, equipment and storage medium for remote co-shooting
CN111083379A (en) * 2019-12-31 2020-04-28 维沃移动通信(杭州)有限公司 Shooting method and electronic equipment
CN111327865A (en) * 2019-11-05 2020-06-23 杭州海康威视系统技术有限公司 Video transmission method, device and equipment
CN111988528A (en) * 2020-08-31 2020-11-24 北京字节跳动网络技术有限公司 Shooting method, shooting device, electronic equipment and computer-readable storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104427228B (en) * 2013-08-22 2017-09-08 展讯通信(上海)有限公司 Cooperate camera system and its image pickup method
KR20150051776A (en) * 2013-11-05 2015-05-13 삼성전자주식회사 Display apparatus and method for controlling of display apparatus
CN103634524A (en) * 2013-11-15 2014-03-12 北京智谷睿拓技术服务有限公司 Control method and control equipment of camera system and camera system
CN104113697B (en) * 2014-08-01 2017-10-13 广东欧珀移动通信有限公司 Cooperate with take pictures treating method and apparatus, treating method and apparatus of taking pictures
CN106657791A (en) * 2017-01-03 2017-05-10 广东欧珀移动通信有限公司 Method and device for generating synthetic image
CN109120504B (en) * 2017-06-26 2023-05-09 深圳脸网科技有限公司 Image equipment sharing method and social contact method thereof
CN112261430A (en) * 2020-10-21 2021-01-22 深圳市炫刷刷网络科技有限公司 Live broadcast system with more than one mobile camera device and live broadcast method thereof

Also Published As

Publication number Publication date
CN114866681A (en) 2022-08-05
CN114866681B (en) 2023-12-01
CN116724560A (en) 2023-09-08
WO2022166521A1 (en) 2022-08-11
CN115514883B (en) 2023-05-12

Similar Documents

Publication Publication Date Title
US11765463B2 (en) Multi-channel video recording method and device
WO2020168956A1 (en) Method for photographing the moon and electronic device
US12096120B2 (en) Photographing method in telephoto scenario and mobile terminal
CN114866681B (en) Cross-equipment collaborative shooting method, related device and system
EP4064684A1 (en) Method for photography in long-focal-length scenario, and terminal
WO2022160985A1 (en) Distributed photographing method, electronic device, and medium
CN116074634B (en) Exposure parameter determination method and device
WO2022222773A1 (en) Image capture method, and related apparatus and system
CN115359105B (en) Depth-of-field extended image generation method, device and storage medium
WO2023160295A1 (en) Video processing method and apparatus
CN115529413A (en) Shooting method and related device
CN114466131B (en) Cross-device shooting method and related device
CN114866659A (en) Shooting method and electronic equipment
CN114827439A (en) Panoramic image shooting method and electronic equipment
WO2023143171A1 (en) Audio acquisition method and electronic device
CN117082295B (en) Image stream processing method, device and storage medium
WO2023142731A1 (en) Method for sharing multimedia file, sending end device, and receiving end device
CN115802144B (en) Video shooting method and related equipment
CN117479008B (en) Video processing method, electronic equipment and chip system
CN116777740A (en) Screen capturing method, electronic equipment and system
CN118264889A (en) Image processing method and electronic equipment
CN118695090A (en) Image processing method and related device by end cloud cooperation
CN117221707A (en) Video processing method and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant