
CN118450249A - Camera switching method and electronic equipment - Google Patents

Camera switching method and electronic equipment Download PDF

Info

Publication number
CN118450249A
CN118450249A (application CN202311267256.4A)
Authority
CN
China
Prior art keywords
camera
session
service
management service
link
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311267256.4A
Other languages
Chinese (zh)
Inventor
徐秀月 (Xu Xiuyue)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202311267256.4A priority Critical patent/CN118450249A/en
Publication of CN118450249A publication Critical patent/CN118450249A/en
Pending legal-status Critical Current

Landscapes

  • Studio Devices (AREA)

Abstract

The application discloses a camera switching method and electronic equipment. The method comprises the following steps: in a scenario where a first device switches the invoked camera from a first camera of a second device to a second camera of the second device, a second session may be created between the first device and the second device to keep a first link alive; the first link may carry both the second session and the session used for transmitting the video stream. Thus, when the session for transmitting the video stream is closed or newly established, the first link is neither disconnected nor rebuilt, so the time delay of camera switching is reduced.

Description

Camera switching method and electronic equipment
Technical Field
The embodiment of the application relates to the technical field of terminals, in particular to a camera switching method and electronic equipment.
Background
In some applications on electronic devices such as tablets and notebook computers, it may be necessary to call a camera at any time to take a picture or record video. For example, when a video conference is held on a notebook computer, it may be necessary to capture video with the notebook computer's own camera and display that video while the notebook computer shows a conference document. However, due to the large size of the notebook computer or the limited imaging capability of its built-in camera, the flexibility of shooting with the built-in camera is not high.
To improve shooting flexibility, a camera can be called across devices. For example, when a video conference is held on a notebook computer, the notebook computer calls a camera of a mobile phone to shoot and displays the picture captured by the phone's camera. However, when the camera is called across devices and then switched, for example from the phone's front camera to its rear camera, the link between the notebook computer and the phone must be disconnected when the front camera is closed, a new link must be re-established, and only then can the rear camera be opened. The link disconnection and reestablishment in this process results in a larger delay in camera switching. How to reduce the time delay of camera switching when calling a camera across devices has therefore become one of the problems to be solved.
Disclosure of Invention
The application provides a camera switching method and electronic equipment, which can reduce the time delay of camera switching when a camera is called across equipment.
In a first aspect, the present application provides a camera switching method applied to a first device, where the method includes: displaying a first control when the first device receives a first video stream acquired by a first camera of a second device, where the first device has established a first link with the second device, the first video stream is transmitted via a first session carried on the first link, and the first control is used to switch the camera invoked by the first device to a second camera of the second device; in response to a user operation on the first control, establishing a second session with the second device, where the second session is carried on the first link and is used to keep the first link alive; sending a first close request to the second device, where the first close request instructs the second device to close the first session and the first camera; after the first session is successfully closed, establishing a third session with the second device and sending a first open request to the second device, where the first open request instructs the second device to start the second camera, and the third session is carried on the first link; and receiving a second video stream, where the second video stream is acquired by the second camera of the second device and transmitted to the first device via the third session.
In summary, for a scenario in which the first device switches the invoked camera from the first camera of the second device to the second camera, the present application may create a second session between the first device and the second device to keep the first link alive before sending the first close request to the second device. Based on this, even after the second device turns off the first camera and the first session with the first device is disconnected, the first link is not disconnected; accordingly, there is no need to create a new link to carry the third session before the second device turns on the second camera and the third session is created. The application can therefore reduce the time delay of camera switching by omitting the steps of disconnecting and rebuilding the first link.
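The first-aspect sequence can be sketched in Python. This is a simplified illustration only, not the actual protocol stack; the `Link`, `Session`, and `switch_camera` names are invented for illustration, and the key point shown is ordering: the keep-alive second session is opened before the first (video) session is closed, so the link is never session-free.

```python
# Minimal stubs standing in for the real link/session machinery (illustrative).
events = []

class Link:
    def __init__(self):
        self.sessions = set()
        self.connected = True

class Session:
    def __init__(self, link, kind):
        self.link, self.kind = link, kind
        link.sessions.add(self)
        events.append(f"open {kind} session")

    def close(self):
        self.link.sessions.discard(self)
        events.append(f"close {self.kind} session")
        if not self.link.sessions:
            # A link with no sessions is torn down and would need rebuilding.
            self.link.connected = False
            events.append("link disconnected")

def switch_camera(link, first_session):
    # 1. Create the keep-alive second session BEFORE closing the video session.
    second_session = Session(link, "keepalive")
    # 2. Close the first session (the second device also closes the first camera).
    first_session.close()
    # 3. Establish the third session for the new stream and open the second camera;
    #    the link was never session-free, so no disconnect/rebuild occurs.
    third_session = Session(link, "video")
    second_session.close()
    return third_session

link = Link()
first_session = Session(link, "video")   # carries the first video stream
switch_camera(link, first_session)
assert link.connected                    # the link never dropped
assert "link disconnected" not in events
```

Reversing steps 1 and 2 would empty the session set and disconnect the link, which is exactly the delay the method avoids.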
In one possible implementation, the keep-alive duration of the second session is greater than or equal to a time interval between a time when the second session was successfully established and a time when the third session was successfully established.
For the first link, as long as at least one session is carried on it, the first link is not broken. Therefore, when the second session is created, its keep-alive duration can be set to last until the third session is successfully established, so that the second session fully ensures that the first link is not disconnected before the third session is created, i.e. the first link does not need to be reestablished.
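The keep-alive constraint above can be written out directly. This is a sketch; the function name and the timestamps are illustrative, not part of the claimed method.

```python
def keepalive_is_sufficient(t_second_established, t_third_established,
                            keepalive_duration):
    """The second session's keep-alive duration must be at least the gap
    between its own establishment and the third session's establishment."""
    return keepalive_duration >= (t_third_established - t_second_established)

# Example: second session up at t=0.0 s, third session up at t=0.8 s.
assert keepalive_is_sufficient(0.0, 0.8, keepalive_duration=1.0)
assert not keepalive_is_sufficient(0.0, 0.8, keepalive_duration=0.5)
```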
In one possible implementation, the second device is a device within the near field communication range of the first device and/or a device logged in to the same user account as the first device.
In this way, cross-device camera invocation and video streaming may be implemented through near field communication between mutually trusted devices.
In one possible implementation, the first control includes a camera flip control, where the camera flip control is configured to implement flip switching between the first camera and the second camera; or the first control comprises an option control of the second camera.
That is, the first device may display the camera flip control, and the user may directly click the camera flip control to switch the camera invoked by the first device from the first camera to the second camera. Alternatively, the first device displays option controls for a plurality of cameras, including an option control for the second camera, so that the user can directly click the option control of the second camera (equivalent to the user selecting the second camera) to switch the camera invoked by the first device from the first camera to the second camera.
In a second aspect, the present application provides a camera switching method applied to a second device, where the method includes: establishing a second session with the first device while sending a first video stream acquired by a first camera of the second device to the first device, where the first device has established a first link with the second device, the first video stream is transmitted via a first session carried on the first link, and the second session is carried on the first link and is used to keep the first link alive; receiving a first close request sent by the first device, and closing the first session and the first camera in response to the first close request; after the first session is successfully closed, establishing a third session with the first device; receiving a first open request sent by the first device, and starting a second camera in response to the first open request; and transmitting a second video stream to the first device, where the second video stream is acquired by the second camera and transmitted to the first device via the third session.
In summary, for a scenario in which the first device switches the invoked camera from the first camera of the second device to the second camera, the present application may create a second session between the first device and the second device to keep the first link alive before the second device closes the first camera. Based on this, even after the second device turns off the first camera and the first session with the first device is disconnected, the first link is not disconnected; accordingly, there is no need to create a new link to carry the third session before the second device turns on the second camera and the third session is created. The application can therefore reduce the time delay of camera switching by omitting the steps of disconnecting and rebuilding the first link.
In one possible implementation, the keep-alive duration of the second session is greater than or equal to a time interval between a time when the second session was successfully established and a time when the third session was successfully established.
In this way, the keep-alive duration of the second session can be set to last until the third session is successfully established, so that the second session fully ensures that the first link is not disconnected before the third session is created, i.e. the first link does not need to be reestablished.
In one possible implementation, the first device includes a camera management service, and the second device includes a camera business service and a camera management service. Establishing the second session with the first device includes: the camera management service of the second device establishes the second session with the camera management service of the first device. Closing the first session and the first camera includes: after establishing the second session with the camera management service of the first device, the camera management service of the second device detects whether a first instruction is received, where the first instruction instructs it to close the first session and the first camera; if the camera management service of the second device receives the first instruction, it closes the first session and the first camera; or, if it does not receive the first instruction, it queries the camera business service for the service state of the first camera, and if the service state of the first camera is the no-service state, it closes the first session and the first camera.
In this way, after establishing the second session with the camera management service of the first device, the camera management service of the second device may close the first session and the first camera upon detecting the first instruction. Meanwhile, if the first instruction is not received (for example, because of a transmission abnormality or transmission delay while the camera business service was sending it), the camera management service can actively query the camera business service for the service state of the first camera and close the first session and the first camera according to that state. Actively querying the service state when the first instruction is not received keeps the working state of the first camera (e.g., on/off) consistent with its service state.
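The close-decision fallback described above might look like the following. This is a hedged sketch of the control flow only; the service interfaces (`instruction_received`, `query_service_state`, `close_session_and_camera`) are invented stand-ins for the camera management and camera business services.

```python
NO_SERVICE = "no_service"
IN_SERVICE = "in_service"

def maybe_close_first_camera(instruction_received, query_service_state,
                             close_session_and_camera):
    """Close the first session/camera on an explicit first instruction, or,
    if the instruction was lost or delayed, fall back to querying the camera
    business service for the first camera's service state."""
    if instruction_received:
        close_session_and_camera()
        return "closed (instruction)"
    # Instruction not received: it may have been lost or delayed in transit,
    # so actively query the service state instead of waiting indefinitely.
    if query_service_state() == NO_SERVICE:
        close_session_and_camera()
        return "closed (state query)"
    return "kept open"

closed = []
assert maybe_close_first_camera(True, lambda: IN_SERVICE,
                                lambda: closed.append(1)) == "closed (instruction)"
assert maybe_close_first_camera(False, lambda: NO_SERVICE,
                                lambda: closed.append(1)) == "closed (state query)"
assert maybe_close_first_camera(False, lambda: IN_SERVICE,
                                lambda: closed.append(1)) == "kept open"
assert len(closed) == 2   # the camera was actually closed in the first two cases
```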
In one possible implementation manner, before closing the first session and the first camera, the method further includes: in response to the first close request sent by the first device, the camera business service of the second device updates the service state of the first camera from the in-service state to the no-service state; and the camera business service of the second device sends the first instruction to the camera management service of the second device.
In this way, when the second device receives the first close request, the camera business service of the second device updates the service state of the first camera to the no-service state and sends the first instruction to the camera management service of the second device, so that the camera management service can determine whether to close the first camera and the first session based on either the first instruction or the service state of the first camera.
In one possible implementation manner, starting the second camera in response to the first open request sent by the first device includes: in response to the first open request, the camera business service of the second device sends a second instruction to the camera management service of the second device, where the second instruction instructs it to start the second camera; in response to the second instruction, the camera management service of the second device queries the working states of the first camera and the second camera; if the working state of the first camera is the on state and the working state of the second camera is the off state, it closes the first camera and starts the second camera after the first camera is successfully closed; or, if the working states of both the first camera and the second camera are the off state, it starts the second camera.
There may be cases where, by the time the first open request is received, the camera management service of the second device has not yet executed the closing flow of the first camera. Therefore, before the second camera is opened, the working states of the first camera and the second camera need to be queried, so that no error or abnormality occurs when the second camera is opened, improving the accuracy of camera switching.
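The state reconciliation above can be sketched as a small dispatch function. This is illustrative only; the state names and callback parameters (`close_first`, `open_second`) are invented, and a real implementation would query the camera management service rather than take states as arguments.

```python
ON, OFF = "on", "off"

def open_second_camera(first_state, second_state, close_first, open_second):
    """Before opening the second camera, reconcile working states: if the
    first camera is somehow still on (its close flow did not run yet),
    close it first so the switch cannot leave both cameras active."""
    if first_state == ON and second_state == OFF:
        close_first()          # close the first camera, then open the second
        open_second()
    elif first_state == OFF and second_state == OFF:
        open_second()          # normal path: just open the second camera
    # (second camera already on: nothing to do)

log = []
open_second_camera(ON, OFF, lambda: log.append("close1"), lambda: log.append("open2"))
assert log == ["close1", "open2"]   # stale first camera is closed first
log.clear()
open_second_camera(OFF, OFF, lambda: log.append("close1"), lambda: log.append("open2"))
assert log == ["open2"]             # normal path opens the second camera directly
```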
In one possible implementation, the second device is a device within the near field communication range of the first device and/or a device logged in to the same user account as the first device.
In this way, cross-device camera invocation and video streaming may be implemented through near field communication between mutually trusted devices.
In a third aspect, the present application provides an electronic device comprising one or more processors and one or more memories. The one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the camera switching method of the first aspect and any of the possible implementations thereof, or to perform the camera switching method of the second aspect and any of the possible implementations thereof.
In a fourth aspect, the present application provides a communication system comprising the first device in the first aspect or any of its possible implementation manners and the second device in the second aspect or any of its possible implementation manners, configured to perform the camera switching method in any of the possible implementation manners of the first aspect or the second aspect.
In a fifth aspect, the present application provides a camera switching device comprising a function/unit for performing the camera switching method in the first aspect and any of its possible implementation manners, or a function/unit for performing the camera switching method in the second aspect and any of its possible implementation manners.
In a sixth aspect, the present application provides a chip system applied to an electronic device, the chip system including at least one processor and an interface, the interface being configured to receive computer instructions and transmit them to the at least one processor; the at least one processor executes the computer instructions to cause the electronic device to perform the camera switching method in the first aspect and any of its possible implementations, or in the second aspect and any of its possible implementations.
In a seventh aspect, the present application provides a computer readable storage medium having stored therein computer instructions that, when executed on an electronic device, cause the electronic device to perform the camera switching method in the first aspect and any of its possible implementations, or in the second aspect and any of its possible implementations.
In an eighth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the camera switching method of the first aspect and any of its possible implementation manners, or to perform the camera switching method of the second aspect and any of its possible implementation manners.
It may be appreciated that, for the advantages of the electronic device, communication system, camera switching device, chip system, computer readable storage medium, and computer program product provided above, reference may be made to the advantages of the first aspect, the second aspect, and any of their possible implementation manners, which are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a communication system according to an embodiment of the present application;
Fig. 2A is a schematic diagram of a camera switching scenario provided in an embodiment of the present application;
Fig. 2B is a schematic diagram of another camera switching scenario provided in an embodiment of the present application;
Fig. 3 is a hardware structure diagram of an electronic device according to an embodiment of the present application;
Fig. 4A is a software architecture diagram of a first device according to an embodiment of the present application;
Fig. 4B is a software architecture diagram of a second device according to an embodiment of the present application;
Fig. 5 is a schematic flow chart of a camera switching method according to an embodiment of the present application;
Fig. 6 is a schematic view of a scene for displaying a first control according to an embodiment of the present application;
Fig. 7 is a software interaction flow chart in a camera switching method according to an embodiment of the present application;
Fig. 8 is a block diagram of a chip system according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate the three cases where A exists alone, A and B exist together, and B exists alone. Furthermore, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first", "second", and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features.
The term "user interface" (UI) in the following embodiments of the present application refers to a media interface for interaction and information exchange between an application program or an operating system and a user; it converts between an internal form of information and a form acceptable to the user. A user interface is defined by source code written in a specific computer language, such as Java or extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and finally presented as content that the user can recognize. A commonly used presentation form of a user interface is a graphical user interface (GUI), which refers to a graphically displayed user interface related to computer operations. It may include visual interface elements such as times, dates, text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets displayed on the display of the electronic device.
For easy understanding, the following describes the communication system in the embodiment of the present application:
An embodiment of the present application provides a communication system, referring to fig. 1, the communication system includes a first device (such as a notebook 110 shown in fig. 1) and at least one second device (such as a mobile phone 120 and a tablet 130 shown in fig. 1).
Through the first device, the user can call the camera of any second device to shoot; that second device returns the image data (such as a video stream) collected by the called camera to the first device, and the first device displays the image data.
When a user calls the camera of any second device through the first device to shoot, the cases of opening the camera, closing the camera, and switching cameras are involved. The following takes the notebook computer 110 calling a camera of the tablet 130 as an example:
Opening a camera: when a user needs to open any camera of the tablet 130 through the notebook computer 110, the notebook computer 110 can send an opening request of the camera to the tablet 130; after receiving the opening request, the tablet 130 starts the corresponding camera to take a picture in response to the opening request. The tablet 130 may then send the image data acquired by the activated camera to the notebook computer 110 for display.
Closing the camera: when a user needs to close any camera of the tablet 130 through the notebook computer 110, the notebook computer 110 may send a closing request of the camera to the tablet 130, and after receiving the closing request, the tablet 130 closes the corresponding camera in response to the closing request, and stops sending the image data collected by the camera.
Switching cameras: when the user needs to switch the called camera of the tablet 130 through the notebook computer 110, the notebook computer 110 may send a close request for the first camera (e.g., the front camera) and an open request for the second camera (e.g., the rear camera) to the tablet 130. After receiving the close request for the first camera and the open request for the second camera, the tablet 130 closes the first camera and activates the second camera. The tablet 130 may then send the image data acquired by the activated second camera to the notebook computer 110 for display. Based on this, cross-device camera switching by the notebook computer 110 is achieved (the called camera is switched from the front camera of the tablet 130 to the rear camera of the tablet 130).
Before the first device calls the camera of any second device to shoot, a communication connection can be established between the first device and the second device. The communication connection can be a near field communication connection, where near field communication includes Bluetooth, wireless fidelity (Wi-Fi), near field communication (NFC), and the like.
Through the communication connection established between the first device and any second device, the first device may send the related request (such as a closing request and an opening request) to any second device based on a control link in a point-to-point manner, so that any second device turns on or off the camera. And, any of the second devices may send image data (e.g., video stream) collected by the camera to the first device based on the data link in a point-to-point manner. It will be appreciated that the data link may also be used to transmit other data, such as text data, audio data, etc.
For the data link between the first device and any second device, different types of sessions may be created in advance on the data link to transmit different types of data. For example, session a is of video type for transmitting video streams; session B is a bytes type for transmitting audio data or the like; session C is of the HTML type for transmitting text data. When there is no session in the data link between the first device and any of the second devices, the data link is disconnected.
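The link/session relationship described above can be modeled in a few lines. This is a simplified illustration, not the actual transport implementation; the `Link` class and session names are invented, and the point shown is that a data link with no remaining sessions is disconnected, while any surviving session keeps it up.

```python
class Link:
    """A point-to-point data link that stays up only while it carries sessions."""

    def __init__(self):
        self.sessions = {}      # session name -> session type
        self.connected = True

    def open_session(self, name, session_type):
        self.sessions[name] = session_type

    def close_session(self, name):
        self.sessions.pop(name, None)
        if not self.sessions:
            # No sessions left: the link is torn down and must be
            # re-established before any new session can be created.
            self.connected = False

# Without a second session, closing the only video session drops the link:
link = Link()
link.open_session("session_a", "video")
link.close_session("session_a")
assert link.connected is False

# With a keep-alive second session, the link survives the switch:
link2 = Link()
link2.open_session("session_a", "video")
link2.open_session("keepalive", "bytes")   # second session keeps the link alive
link2.close_session("session_a")
assert link2.connected is True
```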
For the case of switching cameras described above: when the tablet 130 closes the first camera, the session for transmitting the image data acquired by the first camera is also closed. In this case, since the session for transmitting the image data collected by the first camera is closed, the data link between the notebook 110 and the tablet 130 is disconnected. Therefore, before the tablet 130 opens the second camera, it is necessary to reestablish the data link between the notebook 110 and the tablet 130, then create a session for transmitting the image data collected by the second camera, and open the second camera. It follows that the disconnection and reconstruction of the data link involved in this process increases the delay in camera switching.
By way of example, the first device/second device may be a handheld device, a vehicle-mounted device, or the like, such as a mobile phone, a tablet, a laptop, a palmtop computer, a mobile internet device (MID), a virtual reality (VR) device, an augmented reality (AR) device, a wireless device in industrial control, a wireless device in self driving, a wireless device in remote medical surgery, a wireless device in a smart grid, a wireless device in transportation safety, a wireless device in a smart city, a smart home device, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with a wireless communication function, a personal computer (PC), a computing device or another processing device connected to a wireless modem, or a terminal device in a future evolved public land mobile network (PLMN), etc.; the embodiment of the present application is not limited thereto.
The first device/second device may also be a wearable device, such as a smart watch or smart glasses that focus only on certain application functions and are used in cooperation with another device such as a smartphone, or various smart bracelets and smart jewelry for physical sign monitoring.
The first device/second device may also be a terminal device in an internet of things (internet of things, ioT) system.
The application provides a camera switching method and electronic equipment, which can reduce the time delay of camera switching when a camera is called across equipment.
The method provided by the application can be applied to switching scenes of calling cameras across devices. When the camera is called across equipment, the switching scene of the camera is described by way of example:
In one example, referring to fig. 2A, when the notebook computer 110 invokes the front camera of the tablet 130 for shooting, the notebook computer 110 may display the interface 20 shown in fig. 2A, where the interface 20 includes a shot picture (e.g., a portrait), call information of the camera (e.g., "in front camera call of the tablet 130"), a flip control 210, a pause control, and a disconnect control. The tablet 130 displays an interface 21 as shown in fig. 2A, the interface 21 including camera usage information (e.g., "front camera in use"), device connection information (e.g., "connected to notebook 110"), a pause control, and a disconnect control. The notebook computer 110 may receive a click operation on the flip control 210 by the user, and switch the camera of the tablet 130 called by the notebook computer 110 from the front camera to the rear camera. For example, after the switch is completed, the notebook computer 110 displays the interface 22 shown in fig. 2A, where the interface 22 includes a shot picture (e.g., a puppy), call information of the camera (e.g., "in rear camera call of the tablet 130"), the flip control 210, a pause control, and a disconnect control. The tablet 130 displays an interface 23 as shown in fig. 2A, the interface 23 including camera usage information (e.g., "rear camera in use"), device connection information (e.g., "connected to notebook 110"), a pause control, and a disconnect control.
In another example, referring to fig. 2B, when the notebook computer 110 invokes the front camera of the tablet 130 to shoot, the user may view a list of selectable cameras on the notebook computer 110. For example, before switching, the notebook computer displays the interface 24 as shown in fig. 2B; the interface 24 includes a selectable camera list with selection controls for the front camera of the notebook computer 110, the rear camera of the notebook computer 110, the front camera of the mobile phone 120, the rear camera of the mobile phone 120, the front camera of the tablet 130, and a selection control 220 for the rear camera of the tablet 130. The selection control of the front camera of the tablet 130 is also used to prompt the user that the front camera of the tablet 130 is in call. The user may click the selection control 220 of the rear camera of the tablet 130 to switch the camera invoked by the notebook computer 110 to the rear camera of the tablet 130. After the switching is completed, the notebook computer 110 may display an interface 25 as shown in fig. 2B, where the interface 25 still includes the selection control of each camera, and the selection control 220 of the rear camera of the tablet 130 is used to prompt the user that the rear camera of the tablet 130 is in call. In this example, the interface 21 and the interface 23 displayed by the tablet 130 may refer to the description of fig. 2A.
It will be appreciated that, in addition to the two examples described above, the user may trigger camera switching across devices in other ways, which is not limited in this application.
In addition, invoking a camera across devices can be applied to video conferencing, video teaching, picture insertion, and other scenarios. For example, the method of the present application may invoke a camera across devices when using a teleconferencing application, a live streaming application, a remote teaching application, and the like.
The electronic device according to the present application includes a first device and a second device, and the relationship between the first device and the second device may be referred to the description in fig. 1, and the method according to the present application may be applied to the communication system in fig. 1. The hardware architecture and software architecture of the first device/second device are described below:
Referring to fig. 3, a hardware configuration diagram of an electronic device (e.g., a first device/a second device) according to an embodiment of the present application is provided. As shown in fig. 3, taking an example that the electronic device is a mobile phone, it may include: processor 310, external memory interface 320, internal memory 321, universal serial bus (universal serial bus, USB) interface 330, charge management module 340, power management module 341, battery 342, antenna 1, antenna 2, mobile communication module 350, wireless communication module 360, audio module 370, speaker 370A, receiver 370B, microphone 370C, headset interface 370D, sensor module 380, keys 390, motor 391, indicator 392, camera 393, display 394, and subscriber identity module (subscriber identification module, SIM) card interface 395, among others.
It is to be understood that the configuration illustrated in this embodiment does not constitute a specific limitation on the electronic apparatus. In other embodiments, the electronic device may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 310 may include one or more processing units. For example, the processor 310 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a nerve center and command center of the electronic device. The controller can generate an operation control signal according to an instruction operation code and a timing signal, so as to complete control of instruction fetching and instruction execution.
The charge management module 340 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. The battery 342 is charged by the charge management module 340, and the electronic device may be powered by the power management module 341.
The power management module 341 is configured to connect the battery 342, the charge management module 340 and the processor 310. The power management module 341 receives input from the battery 342 and/or the charge management module 340 to power the processor 310, the internal memory 321, the external memory, the display screen 394, the camera 393, the wireless communication module 360, and the like. The power management module 341 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance), and other parameters.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 350, the wireless communication module 360, a modem processor, a baseband processor, and the like.
The electronic device implements display functions through the GPU, display screen 394, and application processor, etc. The GPU is a microprocessor for image processing, connected to the display screen 394 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 310 may include one or more GPUs that execute program instructions to generate or change display information.
The electronic device may implement shooting functions through the ISP, the camera 393, the video codec, the GPU, the display screen 394, the application processor, and the like.
In some embodiments, when the electronic device is the second device, it may invoke the camera 393 to take an image based on the request of the first device and return to the first device.
The external memory interface 320 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 310 through an external memory interface 320 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 321 may be used to store computer executable program code comprising instructions. The processor 310 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 321. The internal memory 321 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device (e.g., audio data, phonebook, etc.), and so forth. In addition, the internal memory 321 may include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device may implement audio functions through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the headset interface 370D, the application processor, and the like, for example, music playing, recording, etc.
The keys 390 include a power on key, a volume key, etc. Key 390 may be a mechanical key. Or may be a touch key. The electronic device may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device.
The motor 391 may generate a vibration alert. The motor 391 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 391 may also correspond to different vibration feedback effects by touch operations applied to different areas of the display screen 394. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 392 may be an indicator light, which may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 395 is for interfacing with a SIM card. The SIM card may be inserted into the SIM card interface 395 or removed from the SIM card interface 395 to enable contact and separation with the electronic device. The electronic device may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 395 may support Nano SIM cards, micro SIM cards, and the like.
In the embodiment of the present application, a software system of an electronic device (such as the first device/second device) may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. Herein, a system with a layered architecture is taken as an example to illustrate the software architecture of the first device and the second device.
Referring to fig. 4A, a software architecture diagram of a first device according to an embodiment of the present application is provided. Referring to fig. 4B, a software architecture diagram of a second device according to an embodiment of the present application is provided. As shown in fig. 4A and 4B, the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate and transmit data through software interfaces. In some embodiments, the system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, a hardware abstraction layer (hardware abstraction layer, HAL), and a kernel layer.
① The application layer may comprise a series of application packages. As shown in fig. 4A, the application layer of the first device includes a third-party application and a camera business service. As shown in fig. 4B, the application layer of the second device includes a camera business service.
A third-party application refers to an application that may invoke a camera of a trusted device; for example, third-party applications include, but are not limited to, teleconferencing applications, live streaming applications, remote teaching applications, and the like. The camera business service is used to provide related capabilities for invoking a camera across devices, such as discovering and enabling a trusted device, starting/ending a camera service, and updating the service state (e.g., no-service state/in-service state) of a camera. In the present application, the camera business service of the first device is used to receive a related instruction sent by the camera management service of the first device, so as to notify the camera business service of the second device to manage the camera service provided by the second device.
② The application framework layer provides an application programming interface (application programming interface, API) framework and various services and management tools for application developers to access core functions. The application framework layers of the first device and the second device each include a camera management service, which may be a virtualization platform (distributed mobile sensing development platform, DMSDP) service, for providing virtual (e.g., cross-device invocation) camera capabilities, including: turning the camera off/on, and creating or closing a session for carrying image data acquired by the camera. In the present application, the camera management service of the first device is used to send related instructions to the camera business service of the first device, so as to notify the camera business service of the second device to manage the camera service provided by the second device; and the camera management service of the first device is used to manage the related sessions between the first device and the second device. The camera management service of the second device is configured to receive a related instruction sent by the camera business service of the second device, so as to close/open the camera and close the session.
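As an illustrative sketch only (the class and method names below are hypothetical, not the actual DMSDP interface), the session and camera management responsibilities of the camera management service can be modeled as:

```python
class CameraManagementService:
    """Hypothetical sketch of a camera management service.

    Names are illustrative only; the real DMSDP API is not shown in
    this document.
    """

    def __init__(self):
        self.sessions = {}     # session id -> service type ("video"/"bytes")
        self.camera_on = None  # id of the currently opened camera, if any

    def create_session(self, session_id, service_type):
        # A session is carried on the data link between the two devices.
        self.sessions[session_id] = service_type

    def close_session(self, session_id):
        self.sessions.pop(session_id, None)

    def open_camera(self, camera_id):
        self.camera_on = camera_id

    def close_camera(self):
        self.camera_on = None
```

On the first device, such a service would issue these operations toward the second device via the camera business services; on the second device, it would execute them against the camera hardware abstraction.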
③ The hardware abstraction layer is an interface layer between the operating system kernel and the hardware circuitry, which aims at abstracting the hardware. It hides the hardware interface details of the specific platform and can provide a virtual hardware platform for the operating system. The hardware abstraction layer of the first device comprises communication hardware abstraction and display hardware abstraction; the hardware abstraction layer of the second device includes a communication hardware abstraction, a display hardware abstraction, and a camera hardware abstraction.
The communication hardware abstraction is used to enable cross-device data transmission, device discovery, and the like. By way of example, through the communication hardware abstraction, the first device may discover a second device located in the near field, and establish a data link and a control link with the second device. The data link may then transmit the image data, and the control link may transmit related requests.
The display hardware abstraction is used to implement the display function of the device, control the display area, and so on. For example, the first device may, through the display hardware abstraction, display the picture corresponding to the received image data and display the invocation information of the camera.
The camera hardware abstraction is used to control camera startup/shutdown in response to the camera management service.
④ The kernel layer is the layer between hardware and software. The kernel layers of the first device and the second device may each include a display driver, a communication driver, etc., including but not limited to a Bluetooth driver, a Wi-Fi driver, and the like. The kernel layer of the second device may also include a camera driver. These drivers are used to respond to instructions from the corresponding hardware abstractions in the hardware abstraction layer, thereby implementing the corresponding functions. For example, the camera driver in the second device is used to control camera on/off in response to the camera hardware abstraction.
It can be understood that the above software architecture is merely an example, and in a specific implementation, the first device/second device may further include further functional modules in the above layers, which is not described in detail in the present disclosure.
The camera switching method provided by the embodiment of the present application can be completed in a communication system formed by the first device and the second device having the above software and hardware structures, and the camera switching method provided by the embodiment of the present application is further described in detail below with reference to the accompanying drawings.
Referring to fig. 5, the method for switching the camera provided by the embodiment of the application includes S501 to S513, where S501 to S505 are steps executed before the camera is switched when the camera is called across devices; s506 to S513 are steps executed when the camera is switched. The following first describes the steps performed before the camera is switched:
S501, the first device acquires an operation of opening the first camera.
In the embodiment of the application, communication connection is established between the first equipment and the second equipment, and the first camera is one camera of the second equipment. For example, the first device is the notebook computer 110 in fig. 1, the second device is the tablet 130, the tablet 130 includes a front camera and a rear camera, and then the first camera is the front camera or the rear camera of the tablet 130.
Optionally, the first device may obtain the operation of opening the first camera when running a target application. The first camera may be the camera that was invoked in the last cross-device call to the second device, or a camera manually selected by the user when invoking a camera across devices. For example, the target application is any one of a teleconferencing application, a live streaming application, a remote teaching application, and the like. Taking the target application as a teleconferencing application as an example, when a user is accessing a conference using the teleconferencing application, the user may choose whether to activate the camera. If the user chooses to start the camera without specifying which camera to start, the first device may invoke the camera invoked in the last cross-device call; the device invoked last time is the second device, and the camera invoked last time is the first camera. If the user chooses to start the camera, the first device displays a selectable camera list (for the selectable camera list, refer to the interface 25 in fig. 2B), and if the user performs a selection operation on one camera of the second device in the selectable camera list, then the camera of the second device selected by the user is the first camera.
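The selection logic above can be sketched as follows (illustrative only; the tuple format used to identify a camera is an assumption of this sketch, not part of the described method):

```python
def pick_first_camera(user_selection, last_invoked):
    # The user's explicit choice from the selectable camera list wins;
    # otherwise fall back to the camera used in the last cross-device call.
    return user_selection if user_selection is not None else last_invoked
```

For example, with no explicit choice the last invoked camera of the second device becomes the first camera; an explicit selection overrides it.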
Optionally, before the first device obtains the operation of opening the first camera, the first device needs to determine that the second device is an available device that supports remote invocation of the camera and that can transmit image data using near field communication.
The first device may determine that the second device supports remote camera invocation, and record this, by receiving device synchronization information. The device synchronization information may be sent when the first device and the second device establish a connection in advance. Alternatively, if the second device is logged in to the same user account as the first device, the second device may synchronize to the first device the fact that it has the capability of remote camera invocation.
Meanwhile, before the first device obtains the operation of opening the first camera, the first device needs to scan whether a recorded second device supporting remote camera invocation is within near field communication range, so as to judge whether the first device and the second device can perform near field communication. If the first device and the second device both have Bluetooth turned on, or the first device and the second device are connected to the same Wi-Fi, it can be determined that the first device and the second device are capable of near field communication; accordingly, the second device is an available device (or called an online device) for the current cross-device camera invocation. It should be noted that, only after the first device determines that the second device is an available device for the current cross-device camera invocation, can the first device obtain, on the selectable camera list, the operation of opening the first camera, or determine that the first camera is the camera of the second device that was invoked in the last cross-device call.
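The availability check above can be summarized in a small sketch (hypothetical function, assuming the two near-field conditions named in the text):

```python
def is_available_device(supports_remote_camera, both_bluetooth_on, same_wifi):
    # A recorded device is an "available device" (online device) for the
    # current cross-device camera call only if it supports remote camera
    # invocation AND is reachable over near field communication: either
    # both devices have Bluetooth on, or both are on the same Wi-Fi.
    near_field_ok = both_bluetooth_on or same_wifi
    return supports_remote_camera and near_field_ok
```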
S502, a first session is established between the first device and the second device, and the first session is carried on a first link.
The first link is a data link established between the first device and the second device; the data link may be a P2P link. The first device may establish the first link with the second device after determining that the second device is an available device that supports remote camera invocation and can transmit image data using near field communication.
The first session is used to transmit image data acquired by the first camera, such as a first video stream. In the present application, the first device may send a negotiation message to the second device; accordingly, the second device receives the negotiation message and establishes the first session with the first device. The negotiation message includes the device identification, the service type of the session, and the service parameters of the session. The device identification is used to indicate the peer device of the first device in the first session, the service type is used to indicate the type of data (including video type, bytes type, etc.) transmitted by the first session, and the service parameters include a keep-alive duration (or referred to as a duration) of the first session, etc. Illustratively, the device identification is an identifier of the second device, the service type is the video type, and the keep-alive duration of the first session included in the service parameters is not set, or is set to the maximum value for which a video type session is allowed to survive, so as to ensure reliable transmission of the first video stream.
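The fields of the negotiation message can be sketched as a small data structure (field names and the string encodings are assumptions for illustration only):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class NegotiationMessage:
    device_id: str                # identifies the peer device of the session
    service_type: str             # type of data the session transmits, e.g. "video"/"bytes"
    keep_alive_s: Optional[int]   # None means the keep-alive duration is not set

# First session: video type; keep-alive left unset so that transmission
# of the first video stream is not cut off by a session expiry.
first_session_msg = NegotiationMessage(
    device_id="second-device", service_type="video", keep_alive_s=None)
```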
S503, the first device sends an opening request of the first camera to the second device.
Accordingly, the second device may receive an open request of the first camera.
In the present application, the first device may send the open request of the first camera to the second device in the process of establishing the first session with the second device; that is, the first device may perform S502 and S503 simultaneously. The open request of the first camera includes the identifier of the first device and the identifier of the first camera, and the open request is sent on the control link between the first device and the second device; the control link and the first link are not the same link.
S504, the second device responds to the opening request of the first camera to start the first camera.
The second device may activate the first camera after receiving the open request of the first camera.
Optionally, after receiving the open request of the first camera, the second device may first obtain user authorization, and then start the first camera after the user authorization is obtained.
S505, the second device sends a first video stream acquired by the first camera to the first device, and the first video stream is transmitted in the first session.
After the first session is successfully created and the first camera is successfully started, the first camera of the second device starts to collect the first video stream, and sends the collected first video stream to the first device through the first session. Accordingly, the first device may receive the first video stream and display according to the first video stream.
Optionally, the second device may display a prompt interface for camera use. For example, the second device displays the interface 21 shown in fig. 2A, for prompting the user that the second device is using the first camera (e.g., the front camera) for shooting.
The steps S501 to S505 are steps executed by the first device and the second device before the camera is switched when the camera is called across devices. The following describes steps performed in the camera switching process in the present application:
S506, the first device displays a first control, where the first control is used to switch the camera invoked by the first device to a second camera of the second device.
In one possible implementation, the first control includes a camera flip control for implementing a flip switch between the first camera and the second camera. For example, the camera flip control is control 210 in fig. 2A.
In another possible implementation, the first control includes an option control (or referred to as a selection control) of the second camera. For example, the option control of the second camera is control 220 in fig. 2B.
Optionally, when the first device runs the target application, a main interface of the target application may be displayed first, and after receiving an operation on the main interface of the target application, the interface where the first control is located in the target application is displayed. For example, referring to fig. 6, the target application is a teleconferencing application, and the first device displays the interface 60 shown in fig. 6. The interface 60 includes a document display area, a function area, a video area 610 (which displays the picture captured by the first camera), and a video area 620; the function area includes a camera management control 630, a microphone management control, and a document sharing control. The first device may display the interface 20 after receiving a click operation on the video area 610, the interface 20 including the camera flip control 210; or the first device may display the interface 24 after receiving an operation on the camera management control 630, the interface 24 including a selectable camera list that includes the selection control 220 of the rear camera of the tablet 130.
S507, the first device acquires operation on the first control.
S508, the first device establishes a second session with the second device, wherein the second session is carried on the first link and used for keeping the first link alive.
After the first device acquires the operation of the user on the first control, the first device establishes a second session with the second device; when the second session is established, the first device may still send a negotiation message corresponding to the second session to the second device, and accordingly, the second device receives the negotiation message and establishes the second session with the first device. The negotiation message includes the device identification, the service type of the session, and the service parameters of the session. Wherein the device identification is used to indicate the peer device of the first device in the second session, the service type is used to indicate the type of data (including video type, bytes type, etc.) used for transmission by the second session, and the service parameter includes a keep-alive period (or referred to as a duration period) of the second session, etc.
Optionally, the service type of the second session is different from the service type of the first session; the transmission resources required for the service type of the second session are smaller than the transmission resources required for the service type of the first session. For example, the service type of the second session may be set as bytes type in the negotiation message corresponding to the second session, where the bandwidth that needs to be occupied by the bytes type session is far smaller than the bandwidth that needs to be occupied by the video type session. Based on this, the consumption of transmission resources can be reduced.
Optionally, the keep-alive duration of the second session is not set, or is set to the maximum value for which a session of this type is allowed to survive, or is set to an empirical value used to ensure that the first link can stay alive (i.e., the first link does not break) after the first session is closed and before the new video type session is successfully created. The case where the keep-alive duration of the second session is set to an empirical value is described in detail later.
S509, the first device sends a first closing request to the second device, wherein the first closing request is used for indicating the second device to close the first session and close the first camera.
In the embodiment of the present application, after the first device obtains the operation of the user on the first control, S508 and S509 described above may be executed simultaneously in response to the operation on the first control; that is, the first device sends the first close request to the second device while establishing the second session with the second device. Accordingly, the second device may receive the first close request.
S510, the second device closes the first session and closes the first camera.
The second device closes the first session and closes the first camera after receiving the first close request. The process of closing the first session does not affect the process of closing the first camera, so the second device may close the first session and close the first camera at the same time.
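Because the two teardown paths are independent, they can run concurrently. A minimal sketch (simulated latencies and hypothetical function names, for illustration only):

```python
import threading
import time

results = {}

def close_first_session():
    time.sleep(0.01)             # simulate session-teardown latency
    results["session"] = "closed"

def close_first_camera():
    time.sleep(0.01)             # simulate camera power-down latency
    results["camera"] = "closed"

# The second device may run both teardown steps in parallel rather
# than waiting for one to finish before starting the other.
t_session = threading.Thread(target=close_first_session)
t_camera = threading.Thread(target=close_first_camera)
t_session.start(); t_camera.start()
t_session.join(); t_camera.join()
```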
S511, the first device establishes a third session with the second device and sends a first opening request, wherein the third session is carried on the first link, and the first opening request is used for indicating the second device to start the second camera.
After the second device successfully closes the first session, the first device may establish a third session with the second device and send a first open request; the third session is used for transmitting a second video stream acquired by the second camera. Accordingly, the second device receives the first open request.
When establishing the third session, the first device may still send a negotiation message corresponding to the third session to the second device; accordingly, the second device receives the negotiation message and establishes the third session with the first device. The negotiation message includes the device identification, the service type of the session, and the service parameters of the session. The device identification is used to indicate the peer device of the first device in the third session, the service type is used to indicate the type of data (including video type, bytes type, etc.) transmitted by the third session, and the service parameters include a keep-alive duration (or referred to as a duration) of the third session, etc. Illustratively, the device identification is an identifier of the second device, the service type is the video type, and the keep-alive duration of the third session included in the service parameters is not set, or is set to the maximum value for which a video type session is allowed to survive, so as to ensure reliable transmission of the second video stream.
Optionally, when the keep-alive duration of the second session is set to an empirical value in the foregoing S508, the empirical value is greater than or equal to the time interval between the time when the second session is successfully established and the time when the third session is successfully established. For example, the time when the second session is successfully established is t1, and the time when the third session is successfully established is t2. From the above description, it can be known that in the period from t1 to t2, the first session on the first link is disconnected. Therefore, if there were no second session on the first link in the period from the disconnection of the first session to the successful establishment of the third session, the first link would be disconnected, and the first device would need to reestablish the first link with the second device before establishing the third session. In contrast, when the keep-alive duration of the second session is greater than or equal to the time interval from t1 to t2, the first link is not disconnected, so that the time for reestablishing the link can be effectively reduced, and the delay of camera switching is reduced.
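The survival condition can be expressed directly (an illustrative sketch of the inequality described above, with a hypothetical function name):

```python
def first_link_survives(t1, t2, keep_alive):
    """The first link stays up only while at least one session is alive on it.

    Between t1 (second session established, first session torn down) and
    t2 (third session established), only the second session remains on the
    first link, so its keep-alive duration must cover the whole interval.
    keep_alive=None models a duration that is not set (no expiry).
    """
    return keep_alive is None or keep_alive >= (t2 - t1)
```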
S512, the second device opens the second camera.
The second device may turn on the second camera after confirming that the first camera was successfully turned off. How the second device confirms that the first camera is successfully turned off can be referred to the following embodiment corresponding to fig. 7.
S513, the second device sends a second video stream acquired by the second camera to the first device, where the second video stream is transmitted in the third session.
After the third session is successfully created and the second camera is successfully started, the second camera of the second device starts to collect the second video stream, and the collected second video stream is sent to the first device through the third session. Accordingly, the first device may receive the second video stream and display according to the second video stream.
Optionally, the second device may display a prompt interface for camera use. For example, the second device displays the interface 23 shown in fig. 2A, for prompting the user that the second device is using the second camera (e.g., the rear camera) for shooting.
In this embodiment, for a scenario in which the first device switches the invoked camera from the first camera of the second device to the second camera of the second device, a second session may be created between the first device and the second device to keep the first link alive before the first close request is sent to the second device. Based on this, even after the second device turns off the first camera and the first session is disconnected, the first link is not disconnected; accordingly, there is no need to create a new link to carry the third session before the second device turns on the second camera and the third session is created. Therefore, the present application can reduce the delay of camera switching by omitting the steps of disconnecting and rebuilding the first link.
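The saving can be illustrated with assumed (not measured) step costs, where rebuilding the P2P link is taken to be much more expensive than negotiating a session:

```python
LINK_SETUP_MS = 300     # assumed cost of (re)building the P2P data link
SESSION_SETUP_MS = 50   # assumed cost of negotiating one session

def switch_latency(link_kept_alive):
    # Creating the third (video) session is always needed; rebuilding the
    # first link is needed only if no keep-alive session held it open
    # after the first session was closed.
    latency = SESSION_SETUP_MS
    if not link_kept_alive:
        latency += LINK_SETUP_MS
    return latency
```

Under these assumed costs, keeping the link alive with the second session removes the dominant link-rebuild term from the switching delay.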
Fig. 5 illustrates the flow of the camera switching method according to the embodiment of the present application; the method is further described below with reference to the software modules in fig. 4A and fig. 4B. The following description mainly introduces the interaction between the application layer and the application framework layer, and omits the invocation of the hardware abstraction layer and the kernel layer by the application layer and the application framework layer.
Referring to fig. 7, fig. 7 is a schematic flow chart of software interaction in a camera switching method according to an embodiment of the present application. S701-S709 are steps executed before camera switching when a cross-device camera is called; the steps subsequent to S709 are performed when the camera is switched. S701 to S709 are described first:
S701, the third party application of the first device obtains an operation of opening the first camera.
The implementation of S701 may refer to the corresponding description in S501.
S702, the third-party application of the first device sends an opening request of the first camera to a camera management service of the first device.
Accordingly, the camera management service of the first device receives the opening request of the first camera.
S703, the camera management service of the first device and the camera management service of the second device establish a first session.
After the camera management service of the first device receives the opening request of the first camera, the camera management service of the first device may send a negotiation message to the camera management service of the second device to establish the first session. Illustratively, the negotiation message may include the following:
open P2P session1 DvService {
    std::string deviceId;      // device identification
    DvServiceType serviceType; // service type of the session
    std::string serviceParams; // service parameters of the session
}
Illustratively, the device identifier is an identifier of the second device, the service type is a video type, the keep-alive duration of the first session included in the service parameter is not set, or is set to a maximum value that the video type session can survive, etc., so as to ensure reliable transmission of the first video stream. The specific implementation of S703 may refer to the corresponding description of S502, which is not described here in detail.
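As an illustration of the negotiation message in S703, the following sketch builds a comparable message as a plain dictionary. The field names and the assumed maximum keep-alive value are hypothetical; the patent does not specify the actual wire format.

```python
# Illustrative sketch of the S703 negotiation message as a plain dictionary.
# Field names and the maximum keep-alive value are assumptions.
VIDEO_SESSION_MAX_KEEP_ALIVE_S = 3600  # assumed maximum survival time

def build_open_session_message(device_id, service_type, keep_alive_s=None):
    """Build a session-negotiation message. For a video-type session the
    keep-alive duration is left unset by the caller, so it defaults to the
    maximum value, ensuring reliable transmission of the video stream."""
    if service_type == "video" and keep_alive_s is None:
        keep_alive_s = VIDEO_SESSION_MAX_KEEP_ALIVE_S
    return {
        "deviceId": device_id,        # identifier of the second device
        "serviceType": service_type,  # e.g. "video" for the first session
        "serviceParams": {"keepAliveSeconds": keep_alive_s},
    }

msg = build_open_session_message("second-device-01", "video")
print(msg["serviceParams"]["keepAliveSeconds"])  # 3600
```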
S704, the camera management service of the first device sends an opening request of the first camera to the camera business service of the first device.
After the camera management service of the first device receives the opening request of the first camera, the camera management service of the first device forwards the opening request of the first camera to the camera service of the first device. It is understood that the camera management service of the first device may perform S703 and S704 at the same time. Accordingly, the camera business service of the first device receives an opening request of the first camera.
S705, the camera business service of the first device sends an opening request of the first camera to the camera business service of the second device.
After the camera service of the first device receives the opening request of the first camera, the camera service of the first device may send the opening request of the first camera to the camera service of the second device through a control link between the first device and the second device.
Optionally, the service state of the first camera is recorded in the camera business service of the first device. When the camera business service of the first device receives the opening request of the first camera, it can update the service state of the first camera from the no-service state to the service state.
S706, the camera business service of the second device sends an opening instruction of the first camera to the camera management service of the second device.
After receiving the opening request of the first camera, the camera service of the second device may send an opening instruction of the first camera to the camera management service of the second device, where the opening instruction may be a Stream On instruction, and is used to instruct to start the first camera.
Optionally, the camera service of the second device may further obtain the user authorization first, and after obtaining the user authorization, send an opening instruction of the first camera to the camera management service of the second device.
Optionally, the service state of the first camera is recorded in the camera service of the second device, and the camera service of the second device can update the service state of the first camera from the no-service state to the service state.
S707, enabling the first camera by the camera management service of the second device.
The camera management service of the second device may start the first camera after receiving the opening instruction of the first camera.
S708, the camera management service of the second device sends the first video stream to the camera management service of the first device.
After the first camera is successfully started, the first camera collects a first video stream. The camera management service of the second device may send a first video stream to the camera management service of the first device over the first session. Accordingly, the camera management service of the first device receives the first video stream.
S709, the camera management service of the first device sends the first video stream to the third party application of the first device.
After the camera management service of the first device receives the first video stream, it can send the first video stream to the third-party application of the first device for display.
The steps subsequent to S709 are described below:
S710, the third party application of the first device displays the first control.
S711, the third-party application of the first device acquires the operation on the first control.
The implementation of S710 may refer to the corresponding description in S506, and the implementation of S711 may refer to the corresponding description in S507.
And S712, the third-party application of the first device sends a first closing request to the camera management service of the first device.
The third-party application of the first device sends a first closing request to the camera management service of the first device in response to the operation on the first control; the first closing request is used to instruct closing of the first camera and the first session.
Accordingly, the camera management service of the first device receives the first close request. The camera management service of the first device may simultaneously perform S713 and S714 after receiving the first close request.
S713, the camera management service of the first device establishes a second session with the camera management service of the second device.
After the camera management service of the first device receives the first close request, it may send a negotiation message corresponding to the second session to the camera management service of the second device to establish the second session. For example, in this negotiation message the service type of the second session may be set to the bytes type, where the bandwidth occupied by a bytes-type session is far smaller than that occupied by a video-type session. The keep-alive time of the second session is an empirical value, which may be, for example, 5 s, so that the first link is kept alive for 5 s after the second session is successfully established.
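The role of the second session on the first link can be sketched as follows; the class and field names are illustrative assumptions of this description, not the patent's actual data structures.

```python
# Hedged sketch of S713: the keep-alive session is a low-bandwidth
# "bytes"-type session with a short empirical keep-alive (5 s here),
# carried on the same link as the video session.
class Link:
    def __init__(self):
        self.sessions = []  # sessions currently carried on the link

    def add_session(self, service_type, keep_alive_s):
        self.sessions.append({"type": service_type, "keepAlive": keep_alive_s})

    def is_alive(self):
        return len(self.sessions) > 0  # a link with no session is torn down

link = Link()
link.add_session("video", None)  # first session carrying the video stream
link.add_session("bytes", 5)     # second session, keeps the link alive
link.sessions.pop(0)             # first session closed with the first camera
print(link.is_alive())           # True: the bytes session keeps the link up
```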
S714, the camera management service of the first device sends a first closing request to the camera business service of the first device.
Accordingly, the camera business service of the first device receives the first closing request.
S715, the camera business service of the first device sends a first closing request to the camera business service of the second device.
After the camera service of the first device receives the first close request, it may be sent to the camera service of the second device through a control link between the first device and the second device. Accordingly, the camera business service of the second device receives the first closing request.
Optionally, the service state of the first camera is recorded in the camera business service of the first device. When the camera business service of the first device receives the first closing request, it can update the service state of the first camera from the service state to the no-service state.
S716, the camera business service of the second device updates the service state of the first camera from the service state to the no-service state.
S717, the camera business service of the second device sends a first instruction to the camera management service of the second device.
The first instruction is a far down instruction, and is used for indicating to close the first camera and the first session.
S718, the camera management service of the second device detects whether a first instruction is received.
In the present application, after the camera management service of the second device establishes the second session with the camera management service of the first device, it continuously checks whether the first instruction sent by the camera business service of the second device has been received. If the first instruction is received, the second device performs S719-1-1 and S719-1-2; if the first instruction is not received, the second device performs S719-2-1, S719-2-2, S719-2-3, and S719-2-4.
Optionally, the camera management service of the second device further detects whether an instruction indicating abnormal termination of the session is received. If such an instruction is received, the first camera and the first session can be closed, so that communication security is ensured.
S719-1-1, if the camera management service of the second device receives the first instruction, closing the first camera.
S719-1-2, if the camera management service of the second device receives the first instruction, closing the first session.
Optionally, the process of closing the first camera is different from the process of closing the first session, so that the camera management service of the second device may execute S719-1-1 and S719-1-2 simultaneously when receiving the first instruction.
S719-2-1, if the camera management service of the second device does not receive the first instruction, inquiring the service state of the first camera from the camera service of the second device.
S719-2-2, the camera business service of the second device sends the service state of the first camera to the camera management service of the second device.
The camera service of the second device may send the service state (i.e., the no-service state) of the first camera updated in S716 to the camera management service of the second device.
S719-2-3, if the service state of the first camera is the no-service state, the camera management service of the second device turns off the first camera.
S719-2-4, if the service state of the first camera is the no-service state, the camera management service of the second device and the camera management service of the first device close the first session.
Alternatively, S719-2-3 and S719-2-4 may be performed simultaneously when the service state of the first camera is the out-of-service state.
Optionally, if the service state of the first camera is the service state, it indicates that the camera business service of the second device has not executed S716, meaning that it may not have received the first close request due to a transmission abnormality or other problem. In this case, the camera management service of the second device maintains the first session and keeps the first camera enabled. Based on this, the states of the first camera and the first session remain consistent with the service state of the first camera.
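The close decision of S718-S719 can be summarized in the following sketch; the function name and the state strings are illustrative assumptions of this description.

```python
# Illustrative decision logic for S718-S719: whether the second device's
# camera management service closes the first camera and the first session.
def should_close(first_instruction_received: bool, service_state: str) -> bool:
    """Close the first camera and the first session if the explicit close
    instruction arrived (S719-1-1 / S719-1-2), or if the recorded service
    state is already 'no-service' (S719-2-3 / S719-2-4); otherwise keep
    both, consistent with the recorded service state."""
    if first_instruction_received:
        return True
    return service_state == "no-service"

print(should_close(True, "in-service"))   # True: first instruction received
print(should_close(False, "no-service"))  # True: state already updated
print(should_close(False, "in-service"))  # False: keep camera and session
```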
And S720, the third-party application of the first device sends a first opening request to the camera management service of the first device.
Wherein the first opening request is used for indicating to open the second camera. Accordingly, the camera management service of the first device receives the first opening request, and may simultaneously perform S721 and S722 after receiving it.
S721, the camera management service of the first device and the camera management service of the second device establish a third session.
After the camera management service of the first device receives the first opening request, the camera management service of the first device may send a negotiation message corresponding to the third session to the camera management service of the second device to establish the third session. Illustratively, the device identifier is an identifier of the second device, the service type is a video type, the keep-alive duration of the third session included in the service parameter is not set, or is set to a maximum value that the video type session can survive, etc., for ensuring reliable transmission of the second video stream.
S722, the camera management service of the first device sends a first opening request to the camera service of the first device.
Accordingly, the camera business service of the first device receives the first opening request.
S723, the camera service of the first device sends a first opening request to the camera service of the second device.
After the first open request is received by the camera service of the first device, it may be sent to the camera service of the second device through a control link between the first device and the second device. Accordingly, the camera business service of the second device receives the first opening request.
Optionally, the service state of the second camera is recorded in the camera business service of the first device. When the camera business service of the first device receives the first opening request, it can update the service state of the second camera from the no-service state to the service state.
And S724, the camera business service of the second device sends a second instruction to the camera management service of the second device.
After the camera service of the second device receives the first opening request, a second instruction may be sent to the camera management service of the second device. The second instruction is a Stream On instruction, and is used for indicating to open the second camera. Accordingly, the camera management service of the second device receives the second instruction.
Optionally, the camera service of the second device further updates the service state of the second camera from the no-service state to the service state.
S725, the camera management service of the second device queries the working state of the first camera and the working state of the second camera.
The camera management service of the second device records the working state of the first camera and the working state of the second camera. The working state is either an on state (open state) or an off state (close state). Since the camera management service of the second device requires a certain time to close the first camera, after receiving the second instruction it may first query the working state of the first camera and the working state of the second camera, and then execute S726-1 or S726-2 accordingly.
S726-1, if the working state of the first camera is the starting state and the working state of the second camera is the closing state, the camera management service of the second device starts the second camera after closing the first camera.
S726-2, if the working state of the first camera is the closed state and the working state of the second camera is the closed state, the camera management service of the second device starts the second camera.
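The state check of S725-S726 can be sketched as follows; the function name, state strings, and action labels are illustrative assumptions of this description.

```python
# Hedged sketch of S725-S726: before enabling the second camera, the
# camera management service checks both cameras' working states. The
# first camera may still be closing when the open instruction arrives,
# so it is explicitly closed first if it is still on.
def open_second_camera(cam1_state: str, cam2_state: str) -> list:
    """Return the ordered actions to take for the given working states."""
    if cam1_state == "open" and cam2_state == "close":
        return ["close_cam1", "open_cam2"]  # S726-1
    if cam1_state == "close" and cam2_state == "close":
        return ["open_cam2"]                # S726-2
    return []                               # no action for other states

print(open_second_camera("open", "close"))   # ['close_cam1', 'open_cam2']
print(open_second_camera("close", "close"))  # ['open_cam2']
```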
S727, the camera management service of the second device sends the second video stream to the camera management service of the first device.
After the second camera is successfully started and the third session is successfully created, the second camera collects a second video stream. The camera management service of the second device may send the second video stream to the camera management service of the first device via the third session. Accordingly, the camera management service of the first device receives the second video stream.
S728, the camera management service of the first device sends a second video stream to the third party application of the first device.
After the camera management service of the first device receives the second video stream, it can send the second video stream to the third-party application of the first device for display, thereby achieving the effect of switching from the first camera to the second camera.
It should be noted that, if the keep-alive time of the second session is set to an empirical value, the empirical value is greater than or equal to the time interval between the time when the second session is successfully established and the time when the third session is successfully established. The second session is automatically closed when the keep-alive time expires. For example, if the empirical value equals this time interval and no anomaly occurs during the establishment of the third session, the second session is, ideally, automatically closed exactly when the third session is successfully established.
In this embodiment, for a scenario in which the first device switches the invoked camera from the first camera of the second device to the second camera, a second session may be created between the first device and the second device to keep the first link alive before sending the first close request to the second device. Based on this, the delay of the camera switching can be reduced by omitting the disconnection and reconstruction steps of the first link. Meanwhile, when the second equipment closes the first camera, whether the first camera needs to be closed or not is determined through the first instruction and the service state of the first camera, so that the working state of the first camera is guaranteed to be consistent with the service state of the first camera. And when the second equipment opens the second camera, the working states of the two cameras are inquired, so that the situation that errors or anomalies are not caused when the second camera is opened can be ensured, and the accuracy of camera switching is improved.
The embodiment of the application also provides electronic equipment, which can comprise: one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and the one or more memories are configured to store computer program code that includes computer instructions that, when executed by the one or more processors, cause the electronic device to perform the functions or steps performed by the first device/second device in the above-described method embodiments.
The embodiment of the application also provides a camera switching device, which comprises a function/unit for executing the first device/the second device in the embodiment.
The present application also provides a chip system, as shown in fig. 8, the chip system 800 includes at least one processor 801 and at least one interface circuit 802. The processor 801 and the interface circuit 802 may be interconnected by wires. For example, interface circuit 802 may be used to receive signals from other devices (e.g., a memory of an electronic apparatus). For another example, interface circuit 802 may be used to send signals to other devices (e.g., processor 801). The interface circuit 802 may, for example, read instructions stored in a memory and send the instructions to the processor 801. The instructions, when executed by the processor 801, may cause the electronic device to perform the various steps of the embodiments described above. Of course, the system-on-chip may also include other discrete devices, which are not particularly limited in accordance with embodiments of the present application.
The present embodiment also provides a computer readable storage medium, where computer instructions are stored, which when executed on an electronic device, cause the electronic device to perform the functions or steps performed by the mobile phone in the above-described method embodiment.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the functions or steps performed by the handset in the method embodiments described above.
In addition, embodiments of the present application also provide an apparatus, which may be embodied as a chip, component or module, which may include a processor and a memory coupled to each other; the memory is configured to store computer-executable instructions, and when the device is operated, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the functions or steps executed by the mobile phone in the above method embodiment.
The electronic device, the communication system, the computer readable storage medium, the computer program product or the chip provided in this embodiment are used to execute the corresponding method provided above, so that the benefits achieved by the electronic device, the communication system, the computer readable storage medium, the computer program product or the chip can refer to the benefits in the corresponding method provided above, and are not repeated herein.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated unit may be stored in a readable storage medium if implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that the above-mentioned embodiments are merely for illustrating the technical solution of the present application and not for limiting the same, and although the present application has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made to the technical solution of the present application without departing from the spirit and scope of the technical solution of the present application.

Claims (14)

1. A camera switching method, applied to a first device, the method comprising:
Displaying a first control under the condition that the first device receives a first video stream acquired by a first camera of a second device; the first device establishes a first link with the second device; the first video stream is transmitted via a first session carried in the first link; the first control is used for switching the camera called by the first device into a second camera of the second device;
Responding to the operation of the user on the first control, and establishing a second session with the second equipment; the second session is carried on the first link and is used to keep alive the first link;
sending a first closing request to the second device; the first closing request is used for indicating the second equipment to close the first session and close the first camera;
After the first session is successfully closed, establishing a third session with the second device, and sending a first opening request to the second device; the first opening request is used for indicating the second device to start the second camera; the third session is carried on the first link;
Receiving a second video stream; the second video stream is acquired via the second camera of the second device and transmitted to the first device via the third session.
2. The method of claim 1, wherein a keep-alive time period of the second session is greater than or equal to a time interval between a time when the second session was successfully established and a time when the third session was successfully established.
3. Method according to claim 1 or 2, characterized in that the second device is a device that registers for the same user account in the first device and/or in the near field communication range of the first device.
4. The method of claim 1 or 2, wherein the first control comprises a camera flip control for effecting a flip switch between the first camera and the second camera; or the first control comprises an option control of the second camera.
5. A camera switching method, applied to a second device, the method comprising:
Establishing a second session with a first device under the condition that a first video stream acquired by a first camera of the second device is sent to the first device; the first device establishes a first link with the second device; the first video stream is transmitted via a first session carried in the first link; the second session is carried on the first link and is used to keep alive the first link;
receiving a first closing request sent by the first device, and closing the first session and the first camera in response to the first closing request;
After the first session is successfully closed, establishing a third session with the first device;
Receiving a first opening request sent by the first device, and starting a second camera in response to the first opening request;
transmitting a second video stream to the first device; the second video stream is collected via the second camera and transmitted to the first device via the third session.
6. The method of claim 5, wherein the keep-alive time period of the second session is greater than or equal to a time interval between a time when the second session was successfully established and a time when the third session was successfully established.
7. The method of claim 5 or 6, wherein the first device comprises a camera management service and the second device comprises a camera business service and a camera management service;
The establishing a second session with the first device includes:
the camera management service of the second device establishes a second session with the camera management service of the first device;
the closing the first session and the first camera includes:
After the camera management service of the second device establishes a second session with the camera management service of the first device, the camera management service of the second device detects whether a first instruction is received; the first instruction is used for indicating to close the first session and the first camera;
if the camera management service of the second device receives the first instruction, the camera management service of the second device closes the first session and the first camera;
Or if the camera management service of the second device does not receive the first instruction, the camera management service of the second device queries the service state of the first camera to the camera service; and if the service state of the first camera is a no-service state, the camera management service of the second device closes the first session and the first camera.
8. The method of claim 7, wherein prior to the closing the first session and the first camera, the method further comprises:
Responding to a first closing request sent by the first equipment, and updating the service state of the first camera from a service state to a no-service state by the camera business service of the second equipment;
And the camera business service of the second equipment sends the first instruction to the camera management service of the second equipment.
9. The method of claim 7, wherein the initiating a second camera in response to the first open request sent by the first device comprises:
Responding to a first opening request sent by the first equipment, and sending a second instruction to a camera management service of the second equipment by a camera business service of the second equipment; the second instruction is used for indicating to start the second camera;
The camera management service of the second device responds to the second instruction to inquire the working state of the first camera and the working state of the second camera;
If the working state of the first camera is a starting state and the working state of the second camera is a closing state, closing the first camera, and starting the second camera after the first camera is successfully closed;
Or if the working state of the first camera is a closing state and the working state of the second camera is a closing state, starting the second camera.
10. The method of claim 5, wherein the second device is a device that logs in to the same user account within near field communication range of the first device and/or in the first device.
11. An electronic device, comprising: one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors and are configured to store computer program code, the computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any one of claims 1-4 or claims 5-10.
12. A communication system, characterized in that the communication system comprises a first device and a second device as claimed in any one of claims 1-10.
13. A chip system, applied to an electronic device, wherein the chip system comprises at least one processor and an interface, the interface being configured to receive computer instructions and transmit the computer instructions to the at least one processor; and the at least one processor executes the computer instructions to cause the electronic device to perform the method of any one of claims 1-4 or claims 5-10.
14. A computer readable storage medium having stored therein computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-4 or claims 5-10.
CN202311267256.4A 2023-09-26 2023-09-26 Camera switching method and electronic equipment Pending CN118450249A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311267256.4A CN118450249A (en) 2023-09-26 2023-09-26 Camera switching method and electronic equipment

Publications (1)

Publication Number Publication Date
CN118450249A (en) 2024-08-06

Family

ID=92307870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311267256.4A Pending CN118450249A (en) 2023-09-26 2023-09-26 Camera switching method and electronic equipment

Country Status (1)

Country Link
CN (1) CN118450249A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100251158A1 (en) * 2009-03-30 2010-09-30 Avaya Inc. System and method for graphically managing communication sessions
CN102347855A (en) * 2011-07-21 2012-02-08 福建星网锐捷网络有限公司 Method, device and network equipment for realizing bidirectional forwarding detection
CN112073664A (en) * 2019-06-11 2020-12-11 聚好看科技股份有限公司 Video call method and display device
CN114915747A (en) * 2021-02-09 2022-08-16 华为技术有限公司 Video call method and related equipment
CN116346884A (en) * 2023-03-29 2023-06-27 青岛海尔智能家电科技有限公司 Method, device, system and storage medium for session link keep-alive

Similar Documents

Publication Publication Date Title
JP6669151B2 (en) Data processing device, data processing method and program
CN105847317B (en) Data processing apparatus, data processing system, data processing method, and storage medium
EP4084486B1 (en) Cross-device content projection method, and electronic device
WO2022257977A1 (en) Screen projection method for electronic device, and electronic device
WO2021027666A1 (en) Bluetooth reconnection method and related apparatus
EP4013003A1 (en) Communication protocol switching method, apparatus and system
JP6311078B2 (en) Terminal test method, apparatus, program, and recording medium
US11687332B2 (en) Communication apparatus for wirelessly communicating with another apparatus, information processing method, and program
CN109194972B (en) Live stream acquisition method and device, computer equipment and storage medium
JP2017531974A (en) Network connection method, device, system, program, and recording medium
EP4199562A1 (en) Method for transmitting data and electronic device
CN115514882A (en) Distributed shooting method, electronic device and medium
CN113703849B (en) Screen-casting application opening method and device
CN105578017B (en) Photographing and photo sharing system and method
JP2016103703A (en) Information processing device, electronic apparatus, control method for them, program, and storage medium
JP6869746B2 (en) Communication device, its control method, program
KR20180105676A (en) How to set up service connection, device, program and storage medium
CN118450249A (en) Camera switching method and electronic equipment
CN115242994A (en) Video call system, method and device
KR102113550B1 (en) Method for Operating Functions and Resources of Electric Device
JP2020091787A (en) Communication device and control method thereof
CN113271577B (en) Media data playing system, method and related device
WO2024078251A1 (en) Display method and electronic device
WO2023236939A1 (en) Application component interaction method and related device
WO2024001812A1 (en) Message management method, electronic device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination