CN107705275B - Photographing method and mobile terminal
- Publication number
- CN107705275B (application CN201710803176.4A)
- Authority
- CN
- China
- Prior art keywords
- image
- mobile terminal
- feedback
- original image
- original
- Prior art date
- Legal status
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Abstract
The invention relates to the technical field of communications, and provides a photographing method and a mobile terminal, aiming to solve the problem that the image processing efficiency of a mobile terminal is low. The method comprises the following steps: controlling the shooting device to shoot an original image; sending the original image to each second mobile terminal in at least one second mobile terminal; receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image; and determining a target image according to the feedback image. In this way, the first mobile terminal can obtain feedback images already processed by the second mobile terminals, and the user does not need to process the images one by one, so that the efficiency of image processing can be improved.
Description
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a photographing method and a mobile terminal.
Background
With the continuous development of mobile terminals, the pixel counts of mobile terminal cameras are getting higher and higher, and mobile terminals have become common shooting tools. Especially in scenes such as gatherings with friends or traveling, a mobile terminal is often used to shoot group photos as keepsakes. However, some of the person images in the captured photo are inevitably in a poor state or have a poor effect, so the user needs to manually process the person images one by one, which takes a large amount of time when the number of person images to be processed is large.
Therefore, in the prior art, the efficiency of image processing by the mobile terminal is low.
Disclosure of Invention
The embodiment of the invention provides a photographing method and a mobile terminal, and aims to solve the problem that the efficiency of image processing of the mobile terminal is low.
In a first aspect, an embodiment of the present invention provides a photographing method applied to a first mobile terminal having a photographing apparatus, including:
controlling the shooting device to shoot an original image;
sending the original image to each second mobile terminal in at least one second mobile terminal;
receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image;
and determining a target image according to the feedback image.
In a second aspect, an embodiment of the present invention further provides a photographing method applied to a second mobile terminal, including:
receiving an original image sent by a first mobile terminal, wherein the original image is an original image shot by a shooting device of the first mobile terminal;
processing the original image;
and sending the processed image to the first mobile terminal.
In a third aspect, an embodiment of the present invention provides a mobile terminal, including:
the shooting module is used for controlling the shooting device to shoot an original image;
the first sending module is used for sending the original image shot by the shooting module to each second mobile terminal in at least one second mobile terminal;
the receiving module is used for receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image sent by the first sending module;
and the determining module is used for determining a target image according to the feedback image received by the receiving module.
In a fourth aspect, an embodiment of the present invention further provides a mobile terminal, including:
the first receiving module is used for receiving an original image sent by a first mobile terminal, wherein the original image is an original image shot by a shooting device of the first mobile terminal;
the processing module is used for processing the original image received by the first receiving module;
and the first sending module is used for sending the image processed by the processing module to the first mobile terminal.
In a fifth aspect, an embodiment of the present invention further provides a mobile terminal, including: the system comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the computer program to realize the steps of the photographing method.
In a sixth aspect, the embodiment of the present invention further provides a readable storage medium, where a computer program is stored on the readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps in the photographing method as described above.
In the embodiment of the invention, the shooting device is controlled to shoot an original image; sending the original image to each mobile terminal in at least one second mobile terminal; receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image; and determining a target image according to the feedback image. In this way, the first mobile terminal can obtain the feedback image processed by the second mobile terminal, and the user does not need to process the images one by one, so that the efficiency of image processing can be improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a first flowchart of a photographing method according to an embodiment of the present invention;
Fig. 2 is a second flowchart of a photographing method according to an embodiment of the present invention;
Fig. 3 is a first structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 4 is a second structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 5 is a third structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 6 is a fourth structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 7 is a structural diagram of a determining module in a mobile terminal according to an embodiment of the present invention;
Fig. 8 is a fifth structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 9 is a sixth structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 10 is a structural diagram of a processing module in a mobile terminal according to an embodiment of the present invention;
Fig. 11 is a seventh structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 12 is an eighth structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 13 is a ninth structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 14 is a structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a photographing method according to an embodiment of the present invention, where the photographing method can be applied to a mobile terminal having a photographing device, and the mobile terminal is taken as a first mobile terminal as an example. As shown in fig. 1, the method comprises the following steps:
Step 101, controlling the shooting device to shoot an original image.
In this step, after receiving a shooting instruction, the first mobile terminal controls the shooting device to shoot, so as to obtain an original image. The original image may be understood as an image that has not been processed after being captured by the shooting device.
Optionally, before the step of controlling the photographing device to photograph the original image, the method further includes: in the shooting preview process, sending a shooting preview image to each second mobile terminal in the at least one second mobile terminal; and if the confirmation information sent by each second mobile terminal in the at least one second mobile terminal based on the shot preview image is received, controlling the shooting device to shoot the original image.
In a group-photographing scene, the first mobile terminal may be the mobile terminal that performs the shooting, and a second mobile terminal may be a mobile terminal used to control the shooting performed by the first mobile terminal. The at least one second mobile terminal may include one, two, three or more second mobile terminals, and the number of second mobile terminals may be determined according to the actual situation. For example, when the subjects are three persons, the first mobile terminal may be connected to the three second mobile terminals of these three persons.
When the first mobile terminal shoots the preview, the shooting preview image is displayed on the display interface of the first mobile terminal, and the first mobile terminal can send the shooting preview image to all the second mobile terminals connected with the first mobile terminal. When the number of the second mobile terminals is at least two, the shooting preview image displayed in the display interface of the first mobile terminal can be displayed on each second mobile terminal, so that the shot person can view the shooting preview image through the second mobile terminals, and confirmation information can be sent to the first mobile terminal.
When the number of the second mobile terminals is at least two, each second mobile terminal can send the confirmation information to the first mobile terminal, and the first mobile terminal shoots after receiving the confirmation information sent by each second mobile terminal, so that the obtained image has a good effect. For example, after the first mobile terminal establishes connection with the three second mobile terminals, when the first mobile terminal receives the confirmation information sent by the three second mobile terminals, the first mobile terminal controls the shooting device to shoot.
In a specific implementation, the first mobile terminal may establish a connection with the at least one second mobile terminal before sending the shooting preview image to the second mobile terminals, and preferably, the first mobile terminal may establish connections with the second mobile terminals within a preset area. For example, connections are established with mobile terminals within a preset distance through a Wireless Fidelity (Wi-Fi) hotspot or Bluetooth. In this way, the first mobile terminal can acquire information of the second mobile terminals, for example, the models and the number of the second mobile terminals that have established connections with the first mobile terminal, and can thus send the shooting preview image to the second mobile terminals.
Thus, the quality of the image is controlled during photographing, and the area of the obtained image that needs to be processed can be reduced, so that the efficiency of image processing can be improved.
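To make this gating logic concrete, the following is a minimal, transport-agnostic Python sketch of the preview/confirmation flow. The class name, the terminal identifiers, and the send callback are illustrative assumptions rather than part of the claimed method; the Wi-Fi/Bluetooth transport described above is abstracted away.

```python
class PreviewConfirmationGate:
    """Collects confirmations from every connected second terminal before
    the first terminal is allowed to capture the original image."""

    def __init__(self, connected_terminal_ids):
        self.connected = set(connected_terminal_ids)
        self.confirmed = set()

    def broadcast_preview(self, preview_frame, send):
        # 'send' abstracts one Wi-Fi/Bluetooth transfer to a single terminal.
        for terminal_id in self.connected:
            send(terminal_id, preview_frame)

    def on_confirmation(self, terminal_id):
        if terminal_id in self.connected:
            self.confirmed.add(terminal_id)

    def ready_to_shoot(self):
        # Shoot only once every connected second terminal has confirmed.
        return self.confirmed == self.connected


# Usage sketch: three photographed subjects each hold a second terminal.
gate = PreviewConfirmationGate({"phone-A", "phone-B", "phone-C"})
gate.broadcast_preview(b"<jpeg preview bytes>", send=lambda tid, frame: None)
for tid in ("phone-A", "phone-B", "phone-C"):
    gate.on_confirmation(tid)
if gate.ready_to_shoot():
    pass  # at this point the first terminal would control the camera to shoot
```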
Optionally, before the step of sending the shooting preview image to each of the at least one second mobile terminal in the shooting preview process, the method further includes: and when the display interface of the first mobile terminal is a shooting preview interface, adding a shielding layer on the shooting preview interface.
In this embodiment, the shielding layer may be a layer that obscures the shooting preview interface or prevents the shooting preview interface from being displayed directly, for example, a mosaic or a preset image added to the display interface. After the shielding layer is added to the display interface of the first mobile terminal, it is difficult for other people to view the displayed image through the display interface of the first mobile terminal, so the privacy of the user can be protected.
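As an illustration only, the Python sketch below shows one way such a shielding layer could be produced as a coarse mosaic of the preview frame; the block size and the use of NumPy arrays for frames are assumptions, not part of the described method.

```python
import numpy as np

def mosaic_occlusion(frame: np.ndarray, block: int = 32) -> np.ndarray:
    """Return a mosaicked copy of an H x W x C preview frame (uint8)."""
    h, w = frame.shape[:2]
    out = frame.copy()
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = frame[y:y + block, x:x + block]
            # Fill every block with its mean colour so details are unreadable.
            out[y:y + block, x:x + block] = tile.mean(axis=(0, 1), keepdims=True)
    return out

preview = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
occluded = mosaic_occlusion(preview)   # shown on the first terminal's screen
# 'preview' itself, not 'occluded', is what would be sent to the second terminals.
```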
Step 102, sending the original image to each second mobile terminal in at least one second mobile terminal.
The number of the at least one second mobile terminal may be one, two, or more than three. The first mobile terminal can send the original image to all the second mobile terminals in the at least one second mobile terminal, so that the shot person can view the effect of the original image through the second mobile terminals, and the image can be optimized according to the effect of the original image.
In a specific implementation, the first mobile terminal may further divide the original image into regions, so that each second mobile terminal processes the image within its assigned region. For example, a group photo is divided into regions person by person, and each second mobile terminal is made to process one person image.
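A minimal sketch of such an optional region assignment is given below; the terminal identifiers and pixel coordinates are purely illustrative.

```python
import numpy as np

# region = (left, top, right, bottom) in pixels of the original image;
# one region per photographed person / per second terminal.
region_assignment = {
    "phone-A": (0,   0, 400,  1080),
    "phone-B": (400, 0, 800,  1080),
    "phone-C": (800, 0, 1200, 1080),
}

def crop_region(image: np.ndarray, box):
    """Cut out the region a given second terminal is responsible for."""
    left, top, right, bottom = box
    return image[top:bottom, left:right]

original = np.zeros((1080, 1200, 3), dtype=np.uint8)
patch_for_phone_b = crop_region(original, region_assignment["phone-B"])
```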
Step 103, receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image.
In this step, the second mobile terminals may respectively process the original image (or their assigned regions of it) and send the processed images, namely the feedback images, to the first mobile terminal.
The feedback images may come from a part of or all of the at least one second mobile terminal; in a specific implementation, the number of second mobile terminals whose feedback images are used may be determined according to the actual situation. For example, when the number of second mobile terminals is large, only the feedback images sent to the first mobile terminal within a preset time may be accepted, that is, the feedback images of only a part of the second mobile terminals are obtained; when the number of second mobile terminals is small, the feedback images sent by all the second mobile terminals may be acquired.
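The following Python sketch illustrates one possible feedback-collection policy along these lines; the queue-based transport, the timeout value, and the small-group threshold are assumptions introduced for illustration.

```python
import queue
import time

def collect_feedback(inbox: "queue.Queue", expected: int,
                     timeout_s: float = 10.0, small_group: int = 3):
    """Return the (terminal_id, feedback_image) pairs that will be used.

    Small groups: wait for every terminal. Large groups: accept whatever
    arrived within the time window. A real implementation would likely
    also bound the small-group wait.
    """
    deadline = time.monotonic() + timeout_s
    received = []
    while len(received) < expected:
        if expected > small_group and time.monotonic() > deadline:
            break  # large group: proceed with the partial set
        try:
            received.append(inbox.get(timeout=0.1))
        except queue.Empty:
            continue
    return received

# Usage sketch: two feedback images have already arrived on the queue.
inbox = queue.Queue()
inbox.put(("phone-A", b"<edited jpeg>"))
inbox.put(("phone-B", b"<edited jpeg>"))
feedback = collect_feedback(inbox, expected=2)
```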
Step 104, determining a target image according to the feedback image.
In this step, when the number of the second mobile terminals is one, the feedback image is taken as the target image; and when the number of the second mobile terminals is at least two, carrying out image synthesis by using the original image and the feedback image to obtain a target image.
When there is one second mobile terminal, only one feedback image is obtained, and because there is only one image it can be used directly as the target image; this implementation is simple and the image processing efficiency is high. When there are at least two second mobile terminals, at least two feedback images may be received, and image areas in the feedback images can replace the corresponding image areas in the original image. In this way, each user can process the original image according to his or her own habits, the resulting image is personalized, and since the image is processed by multiple people, the image processing efficiency can be improved.
For convenience of understanding, a specific synthesis process of the original image and the feedback image is described as an example of synthesis of the original image and one of the feedback images.
Specifically, identifying a distinctive image feature of the feedback image relative to the original image; and replacing the target image characteristic of the original image with the distinguishing image characteristic, wherein the position of the distinguishing image characteristic in the feedback image is the same as the position of the target image characteristic in the original image.
The first mobile terminal may compare the received feedback image with the original image, identify a difference image feature of the feedback image relative to the original image, and determine a location of the difference image feature in the feedback image. The first mobile terminal can determine a position corresponding to the position in the original image according to the position, so that the target image feature can be determined in the original image. After the image features are replaced, the boundary of the replaced image and the original image can be subjected to blurring processing, so that the replaced image and the original image are more harmonious, and the image quality is improved.
When at least two feedback images are included, the first mobile terminal may take the image synthesized from the original image and the first feedback image as a new original image, and then synthesize this new original image with the second feedback image, and so on for the remaining feedback images.
In this way, the partial image of the original image is replaced by the feedback image, and other areas of the original image are not processed, so that the target image can be more personalized, and the image processing efficiency can be improved.
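For illustration, the following Python sketch shows one way the difference-feature identification, replacement, and boundary blurring described above could be realized for NumPy image arrays. The difference threshold, the rectangular difference region, and the crude seam softening are simplifying assumptions and not the exact patented processing.

```python
import numpy as np

def difference_box(original: np.ndarray, feedback: np.ndarray, thresh: int = 12):
    """Bounding box (top, bottom, left, right) of pixels that were edited."""
    diff = np.abs(original.astype(np.int16) - feedback.astype(np.int16)).max(axis=2)
    ys, xs = np.nonzero(diff > thresh)
    if ys.size == 0:
        return None            # this feedback image changed nothing
    return int(ys.min()), int(ys.max()) + 1, int(xs.min()), int(xs.max()) + 1

def soften_seam(composite: np.ndarray, original: np.ndarray, box, feather: int = 4):
    """Blend thin strips straddling the box edges (stands in for boundary blurring)."""
    top, bottom, left, right = box
    h, w = composite.shape[:2]

    def blend(t, b, l, r):
        t, b, l, r = max(t, 0), min(b, h), max(l, 0), min(r, w)
        if t < b and l < r:
            mixed = (composite[t:b, l:r].astype(np.uint16) +
                     original[t:b, l:r].astype(np.uint16)) // 2
            composite[t:b, l:r] = mixed.astype(np.uint8)

    blend(top - feather, top + feather, left - feather, right + feather)        # top edge
    blend(bottom - feather, bottom + feather, left - feather, right + feather)  # bottom edge
    blend(top - feather, bottom + feather, left - feather, left + feather)      # left edge
    blend(top - feather, bottom + feather, right - feather, right + feather)    # right edge

def synthesize(original: np.ndarray, feedback_images) -> np.ndarray:
    """Fold every feedback image into a running composite of the original."""
    composite = original.copy()
    for feedback in feedback_images:
        # Each feedback image is compared against the image that was actually
        # distributed, which identifies the region that terminal edited.
        box = difference_box(original, feedback)
        if box is None:
            continue
        top, bottom, left, right = box
        composite[top:bottom, left:right] = feedback[top:bottom, left:right]
        soften_seam(composite, original, box)
    return composite
```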
Optionally, after the step of determining a target image according to the feedback image, the method further includes: sending the target image to each second mobile terminal in the at least one second mobile terminal; and if feedback information sent by the at least one second mobile terminal is received, deleting the target image, wherein the feedback information is information sent to the first mobile terminal by the second mobile terminal which successfully receives the target image.
In this embodiment, the at least one second mobile terminal may include one second mobile terminal, two second mobile terminals, or at least three second mobile terminals.
The feedback information sent by the at least one second mobile terminal may be feedback information sent by all or only part of the at least one second mobile terminal; in a specific implementation, this may be determined according to the actual situation. For example, when the number of second mobile terminals is large, only the feedback information received by the first mobile terminal within a preset time may be counted; when the number of second mobile terminals is small, the target image may be deleted only after the feedback information sent by all the second mobile terminals has been received. In this way, the processed target image is not retained on the first mobile terminal, and the privacy of the users can be protected.
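A minimal sketch of this distribute-then-delete behaviour is shown below; the send and wait_for_ack callables stand in for the real wireless link and are assumptions made for illustration.

```python
import os

def distribute_and_cleanup(target_path: str, terminal_ids, send, wait_for_ack,
                           require_all: bool = True) -> bool:
    """Send the target image to every second terminal, then delete the local copy."""
    with open(target_path, "rb") as f:
        payload = f.read()
    for tid in terminal_ids:
        send(tid, payload)
    acked = {tid for tid in terminal_ids if wait_for_ack(tid)}
    # Mirror the policy above: small groups wait for everyone, large groups
    # may settle for the acknowledgements that arrive within the preset time.
    if acked == set(terminal_ids) or (not require_all and acked):
        os.remove(target_path)   # the first terminal no longer keeps the photo
        return True
    return False
```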
In this embodiment of the present invention, the mobile terminal may be: a Mobile phone, a Tablet Personal Computer (Tablet Personal Computer), a Laptop Computer (Laptop Computer), a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), a Wearable Device (Wearable Device), or the like.
The photographing method of the embodiment of the invention controls the photographing device to photograph the original image; sending the original image to each mobile terminal in at least one second mobile terminal; receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image; and determining a target image according to the feedback image. In this way, the first mobile terminal can obtain the feedback image processed by the second mobile terminal, and the user does not need to process the images one by one, so that the efficiency of image processing can be improved.
Referring to fig. 2, the main difference between the present embodiment and the above-mentioned embodiment is that the photographing method is applied to the second mobile terminal, and the second mobile terminal processes the original image and then sends the processed image to the first mobile terminal. Fig. 2 is a flowchart of a photographing method according to an embodiment of the present invention, as shown in fig. 2, including the following steps:
The original image may be an image that has not been processed after being captured by the capturing device of the first mobile terminal. The second mobile terminal may receive the original image transmitted by the first mobile terminal after establishing a connection with the first mobile terminal.
Optionally, before the step of receiving the original image sent by the first mobile terminal, the method further includes: receiving a shooting preview image sent by the first mobile terminal, wherein the shooting preview image is a preview image displayed on a display interface in the shooting preview process of the first mobile terminal; and sending confirmation information to the first mobile terminal based on the shooting preview image so that the first mobile terminal shoots the original image according to the confirmation information.
In this embodiment, the photographed preview image displayed in the display interface of the first mobile terminal may be displayed on the second mobile terminal screen, so that the user can view the photographed preview image on the second mobile terminal. When the user is a photographed person, the user can adjust the posture according to the photographed preview image to obtain a better image effect. When the effect of shooting the preview image is good, the user can operate the second mobile terminal to send the confirmation information to the first mobile terminal. Therefore, in the shooting process, the user can adjust the posture according to the preview image effect, and a better image effect can be obtained.
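Illustratively, the second terminal's side of this step might look like the following sketch; the callbacks are placeholders for the real display and sending operations.

```python
def handle_preview(preview_frame, show_on_screen, user_is_satisfied, send_to_first_terminal):
    show_on_screen(preview_frame)            # the subject checks his or her pose here
    if user_is_satisfied():                  # e.g. the user taps a "looks good" button
        send_to_first_terminal({"type": "CONFIRM"})

# Example wiring with trivial stand-ins for the real display and radio link:
handle_preview(
    b"<jpeg preview bytes>",
    show_on_screen=lambda frame: None,
    user_is_satisfied=lambda: True,
    send_to_first_terminal=print,
)
```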
In this step, a target image is determined in the original image, and the image in the preset area where the target image is located is processed.
The second mobile terminal may perform optimization processing on the image. Specifically, the first mobile terminal may preset an area, and the second mobile terminal may acquire the area where the target image is located according to the preset area and process the image within that area. For example, when the original image is a group photo, only the person image of the user himself or herself may be processed. In this way, different second mobile terminals process different areas, a personalized target image can be obtained, and the image processing efficiency can be improved.
The specific processing modes include adding a preset expression image or text in the preset area where the target image is located, adjusting the brightness of the image, whitening the person image, and the like.
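As a simple illustration, the sketch below applies one such processing mode, a brightness lift, only within the preset region; the gain value and the region coordinates are assumptions.

```python
import numpy as np

def process_own_region(original: np.ndarray, box, gain: float = 1.15) -> np.ndarray:
    """Brighten only the (left, top, right, bottom) region; leave the rest untouched."""
    left, top, right, bottom = box
    out = original.copy()
    region = out[top:bottom, left:right].astype(np.float32) * gain
    out[top:bottom, left:right] = np.clip(region, 0, 255).astype(np.uint8)
    return out

original = np.zeros((1080, 1200, 3), dtype=np.uint8)
feedback_image = process_own_region(original, box=(400, 0, 800, 1080))
# 'feedback_image' is what this second terminal sends back to the first terminal.
```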
In this step, the processed image is sent to the first mobile terminal, so that the first mobile terminal can perform image synthesis on the processed image and the original image, and the efficiency of image processing can be improved.
In this embodiment of the present invention, the mobile terminal may be: a Mobile phone, a Tablet Personal Computer (Tablet Personal Computer), a Laptop Computer (Laptop Computer), a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), a Wearable Device (Wearable Device), or the like.
The photographing method of the embodiment of the invention receives an original image sent by a first mobile terminal; processing the original image; and sending the processed image to the first mobile terminal. Therefore, the second mobile terminals can process the images respectively, the image processing efficiency can be improved, and different second mobile terminals can process the images differently so as to obtain personalized images and improve the visual effect of the images.
Referring to fig. 3, fig. 3 is a structural diagram of a mobile terminal according to an embodiment of the present invention, where the mobile terminal is a first mobile terminal having a camera. As shown in fig. 3, the mobile terminal 300 includes: a photographing module 301, a first transmitting module 302, a receiving module 303 and a determining module 304. The shooting module 301 is connected with the first sending module 302, the first sending module 302 is connected with the receiving module 303, and the receiving module 303 is connected with the determining module 304.
The shooting module 301 is configured to control the shooting device to shoot an original image; a first sending module 302, configured to send the original image captured by the capturing module 301 to each second mobile terminal in at least one second mobile terminal; a receiving module 303, configured to receive a feedback image sent by the at least one second mobile terminal, where the feedback image is an image obtained by processing the original image sent by the first sending module 302; a determining module 304, configured to determine a target image according to the feedback image received by the receiving module 303.
Optionally, as shown in fig. 4, the mobile terminal 300 further includes: a second sending module 305, configured to send a shooting preview image to each second mobile terminal in the at least one second mobile terminal in a shooting preview process; the shooting module 301 is specifically configured to control the shooting device to shoot an original image if receiving confirmation information sent by each of the at least one second mobile terminal based on the shooting preview image.
Optionally, when the number of the second mobile terminals is one, the determining module 304 is specifically configured to use the feedback image as the target image; when the number of the second mobile terminals is at least two, the determining module 304 is specifically configured to perform image synthesis by using the original image and the feedback image to obtain a target image.
Optionally, as shown in fig. 5, the mobile terminal 300 further includes: a third sending module 306, configured to send the target image to each of the at least one second mobile terminal; a deleting module 307, configured to delete the target image sent by the third sending module 306 if feedback information sent by the at least one second mobile terminal is received, where the feedback information is information sent to the first mobile terminal by a second mobile terminal that successfully receives the target image.
Optionally, as shown in fig. 6, the mobile terminal 300 further includes: and the shielding module 308 is configured to add a shielding layer on the shooting preview interface when the display interface of the first mobile terminal is the shooting preview interface.
Optionally, as shown in fig. 7, the determining module 304 includes: an identifying sub-module 3041, configured to identify a difference image feature of the feedback image relative to the original image; a replacing submodule 3042 configured to replace a target image feature of the original image with the difference image feature identified by the identifying submodule 3041, wherein a position of the difference image feature in a feedback image is the same as a position of the target image feature in the original image.
The mobile terminal 300 can implement each process implemented by the mobile terminal in the method embodiment corresponding to fig. 1, and is not described herein again to avoid repetition.
The mobile terminal 300 of the embodiment of the present invention controls the photographing device to photograph an original image; sending the original image to each mobile terminal in at least one second mobile terminal; receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image; and determining a target image according to the feedback image. In this way, the first mobile terminal can obtain the feedback image processed by the second mobile terminal, and the user does not need to process the images one by one, so that the efficiency of image processing can be improved.
Referring to fig. 8, fig. 8 is a structural diagram of a mobile terminal according to an embodiment of the present invention, where the mobile terminal is a second mobile terminal. As shown in fig. 8, the mobile terminal 800 includes: a first receiving module 801, a processing module 802 and a first sending module 803, wherein the first receiving module 801 is connected with the processing module 802, and the processing module 802 is connected with the first sending module 803. A first receiving module 801, configured to receive an original image sent by a first mobile terminal, where the original image is an original image captured by a capturing device of the first mobile terminal; a processing module 802, configured to process the original image received by the first receiving module 801; a first sending module 803, configured to send the image processed by the processing module 802 to the first mobile terminal.
Optionally, as shown in fig. 9, the mobile terminal 800 further includes: a second receiving module 804, configured to receive a shooting preview image sent by the first mobile terminal, where the shooting preview image is a preview image displayed on a display interface during a shooting preview process of the first mobile terminal; a second sending module 805, configured to send confirmation information to the first mobile terminal based on the shooting preview image received by the second receiving module 804, so that the first mobile terminal shoots the original image according to the confirmation information.
Optionally, as shown in fig. 10, the processing module 802 includes: a determination submodule 8021 for determining a target image in the original image; the processing submodule 8022 is configured to process the image in the preset area where the target image determined by the determining submodule 8021 is located.
The mobile terminal 800 can implement each process implemented by the mobile terminal in the method embodiment corresponding to fig. 2, and is not described herein again to avoid repetition.
The mobile terminal 800 of the embodiment of the present invention receives an original image sent by a first mobile terminal; processing the original image; and sending the processed image to the first mobile terminal. Thus, the effect of the shot image is good, and the efficiency of image processing can be improved.
Referring to fig. 11, fig. 11 is a structural diagram of a mobile terminal according to an embodiment of the present invention, where the mobile terminal is a first mobile terminal. As shown in fig. 11, the mobile terminal 1100 includes: at least one processor 1101, memory 1102, at least one network interface 1104, and a user interface 1103. Various components in mobile terminal 1100 are coupled together by a bus system 1105. It is understood that the bus system 1105 is used to enable communications among the components. The bus system 1105 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled in fig. 11 as the bus system 1105. The mobile terminal 1100 further comprises a camera 1106, the camera 1106 being connected to the various components of the mobile terminal via the bus system 1105.
The user interface 1103 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, track ball, touch pad, or touch screen, etc.).
It is to be understood that the memory 1102 in embodiments of the present invention can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable Programmable ROM (EPROM), an Electrically Erasable Programmable ROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which is used as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 1102 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 1102 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof: an operating system 11021 and application programs 11022.
The operating system 11021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application 11022 contains various applications such as a Media Player (Media Player), a Browser (Browser), etc. for implementing various application services. Programs that implement methods in accordance with embodiments of the invention may be included in application 11022.
In the embodiment of the present invention, the processor 1101 is configured to, by calling a program or an instruction stored in the memory 1102, specifically, a program or an instruction stored in the application 11022: controlling the shooting device to shoot an original image; sending the original image to each mobile terminal in at least one second mobile terminal; receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image; and determining a target image according to the feedback image.
The methods disclosed in the embodiments of the present invention described above may be implemented in the processor 1101 or by the processor 1101. The processor 1101 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware or instructions in the form of software in the processor 1101. The processor 1101 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM or an EPROM, or registers. The storage medium is located in the memory 1102, and the processor 1101 reads the information in the memory 1102 and completes the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, the computer program when executed by the processor 1101 may further implement the steps of: in the shooting preview process, sending a shooting preview image to each second mobile terminal in the at least one second mobile terminal; and if the confirmation information sent by each second mobile terminal in the at least one second mobile terminal based on the shot preview image is received, controlling the shooting device to shoot the original image.
Optionally, the computer program when executed by the processor 1101 may further implement the steps of: when the number of the second mobile terminals is one, taking the feedback image as the target image; and when the number of the second mobile terminals is at least two, carrying out image synthesis by using the original image and the feedback image to obtain a target image.
Optionally, the computer program when executed by the processor 1101 may further implement the steps of: sending the target image to each second mobile terminal in the at least one second mobile terminal; and if feedback information sent by the at least one second mobile terminal is received, deleting the target image, wherein the feedback information is information sent to the first mobile terminal by the second mobile terminal which successfully receives the target image.
Optionally, the computer program when executed by the processor 1101 may further implement the steps of: and when the display interface of the first mobile terminal is a shooting preview interface, adding a shielding layer on the shooting preview interface.
Optionally, the computer program when executed by the processor 1101 may further implement the steps of: identifying a distinctive image feature of the feedback image relative to the original image; and replacing the target image characteristic of the original image with the distinguishing image characteristic, wherein the position of the distinguishing image characteristic in the feedback image is the same as the position of the target image characteristic in the original image.
The mobile terminal 1100 is capable of implementing each process implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition.
The mobile terminal 1100 of the embodiment of the present invention controls the photographing device to photograph an original image; sending the original image to each mobile terminal in at least one second mobile terminal; receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image; and determining a target image according to the feedback image. In this way, the first mobile terminal can obtain the feedback image processed by the second mobile terminal, and the user does not need to process the images one by one, so that the efficiency of image processing can be improved.
Referring to fig. 12, fig. 12 is a structural diagram of a mobile terminal according to an embodiment of the present invention, where the mobile terminal is a second mobile terminal. As shown in fig. 12, the mobile terminal 1200 includes: at least one processor 1201, memory 1202, at least one network interface 1204, and a user interface 1203. Various components in mobile terminal 1200 are coupled together by bus system 1205. It is understood that bus system 1205 is used to enable connected communication between these components. Bus system 1205 includes, in addition to a data bus, a power bus, a control bus, and a status signal bus. But for clarity of illustration the various buses are labeled as bus system 1205 in figure 12.
The user interface 1203 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, track ball, touch pad, or touch screen, etc.).
It is to be understood that the memory 1202 in embodiments of the present invention may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable Programmable ROM (EPROM), an Electrically Erasable Programmable ROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which is used as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 1202 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 1202 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof: an operating system 12021 and application programs 12022.
The operating system 12021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application 12022 contains various applications such as a Media Player (Media Player), a Browser (Browser), and the like, and is used to implement various application services. A program implementing a method according to an embodiment of the present invention may be included in the application 12022.
In the embodiment of the present invention, by calling a program or an instruction stored in the memory 1202, specifically, a program or an instruction stored in the application program 12022, the processor 1201 is configured to: receiving an original image sent by a first mobile terminal, wherein the original image is an original image shot by a shooting device of the first mobile terminal; processing the original image; and sending the processed image to the first mobile terminal.
The method disclosed by the embodiments of the present invention can be applied to the processor 1201 or implemented by the processor 1201. The processor 1201 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware or instructions in the form of software in the processor 1201. The processor 1201 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM or an EPROM, or registers. The storage medium is located in the memory 1202, and the processor 1201 reads the information in the memory 1202 and completes the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, the computer program may further implement the following steps when executed by the processor 1201: receiving a shooting preview image sent by the first mobile terminal, wherein the shooting preview image is a preview image displayed on a display interface in the shooting preview process of the first mobile terminal; and sending confirmation information to the first mobile terminal based on the shot preview image so that the first mobile terminal shoots an original image according to the confirmation information.
Optionally, the computer program may further implement the following steps when executed by the processor 1201: determining a target image in the original image; and processing the image in the preset area where the target image is located.
The mobile terminal 1200 can implement the processes implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition.
The mobile terminal 1200 of the embodiment of the present invention receives an original image sent by a first mobile terminal; processing the original image; and sending the processed image to the first mobile terminal. Thus, the effect of the shot image is good, and the efficiency of image processing can be improved.
Referring to fig. 13, fig. 13 is a structural diagram of a mobile terminal according to an embodiment of the present invention, where the mobile terminal is a first mobile terminal. Specifically, the mobile terminal 1300 in fig. 13 may be a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), or a vehicle-mounted computer.
As shown in fig. 13, the mobile terminal 1300 includes a Radio Frequency (RF) circuit 1310, a memory 1320, an input unit 1330, a display unit 1340, a processor 1350, an audio circuit 1360, a communication module 1370, a power supply 1380, and a photographing device 1390.
The input unit 1330 may be used to receive numeric or character information input by a user and generate signal inputs related to user settings and function control of the mobile terminal 1300, among other things. Specifically, in the embodiment of the present invention, the input unit 1330 may include a touch panel 1331. Touch panel 1331, also referred to as a touch screen, can collect touch operations by a user (e.g., operations performed by the user on touch panel 1331 using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 1331 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1350, and receives and executes commands transmitted from the processor 1350. In addition, the touch panel 1331 may be implemented by various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to touch panel 1331, input unit 1330 may include other input devices 1332, where other input devices 1332 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
Among other things, the display unit 1340 may be used to display information input by a user or information provided to the user and various menu interfaces of the mobile terminal 1300. The display unit 1340 may include a display panel 1341, and optionally, the display panel 1341 may be configured in the form of an LCD or an Organic Light-Emitting Diode (OLED), or the like.
It should be noted that the touch panel 1331 may cover the display panel 1341 to form a touch display screen. When the touch display screen detects a touch operation on or near it, the touch operation is transmitted to the processor 1350 to determine the type of the touch event, and then the processor 1350 provides a corresponding visual output on the touch display screen according to the type of the touch event.
The touch display screen comprises an application program interface display area and a common control display area. The arrangement modes of the application program interface display area and the common control display area are not limited, and can be an arrangement mode which can distinguish two display areas, such as vertical arrangement, left-right arrangement and the like. The application interface display area may be used to display an interface of an application. Each interface may contain at least one interface element such as an icon and/or widget desktop control for an application. The application interface display area may also be an empty interface that does not contain any content. The common control display area is used for displaying controls with high utilization rate, such as application icons like setting buttons, interface numbers, scroll bars, phone book icons and the like. The touch screen is a flexible screen, and the two surfaces of the flexible screen are both pasted with the organic transparent conductive films of the carbon nanotubes.
The processor 1350 is a control center of the mobile terminal 1300, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal 1300 and processes data by operating or executing software programs and/or modules stored in the first memory 1321 and calling data stored in the second memory 1322, thereby integrally monitoring the mobile terminal 1300. Optionally, processor 1350 may include one or more processing units.
In an embodiment of the present invention, the processor 1350 is configured to, by invoking software programs and/or modules stored in the first memory 1321 and/or data stored in the second memory 1322: controlling the shooting device to shoot an original image; sending the original image to each mobile terminal in at least one second mobile terminal; receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image; and determining a target image according to the feedback image.
Optionally, the computer program when executed by the processor 1350 may also implement the following steps: in the shooting preview process, sending a shooting preview image to each second mobile terminal in the at least one second mobile terminal; and if the confirmation information sent by each second mobile terminal in the at least one second mobile terminal based on the shot preview image is received, controlling the shooting device to shoot the original image.
Optionally, the computer program when executed by the processor 1350 may also implement the following steps: when the number of the second mobile terminals is one, taking the feedback image as the target image; and when the number of the second mobile terminals is at least two, carrying out image synthesis by using the original image and the feedback image to obtain a target image.
Optionally, the computer program when executed by the processor 1350 may also implement the following steps: sending the target image to each second mobile terminal in at least one second mobile terminal; and if feedback information sent by the at least one second mobile terminal is received, deleting the target image, wherein the feedback information is information sent to the first mobile terminal by the second mobile terminal which successfully receives the target image.
Optionally, the computer program when executed by the processor 1350 may also implement the following steps: and when the display interface of the first mobile terminal is a shooting preview interface, adding a shielding layer on the shooting preview interface.
Optionally, the computer program, when executed by the processor 1350, may also implement the following steps: identifying a distinguishing image feature of the feedback image relative to the original image; and replacing the target image feature of the original image with the distinguishing image feature, wherein the position of the distinguishing image feature in the feedback image is the same as the position of the target image feature in the original image.
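One possible realization of this feature replacement, sketched under the assumption that the original and feedback images are pixel-aligned color arrays of identical size, is to treat every pixel that differs beyond a small threshold as part of the distinguishing image feature, copy those pixels into the original at the same positions, and then soften the synthesis boundary. NumPy and OpenCV are used here purely for illustration; the patent does not prescribe any particular library or differencing method.

```python
import cv2
import numpy as np


def replace_distinguishing_features(original, feedback, diff_threshold=10, blur_ksize=15):
    """Copy the regions where `feedback` differs from `original` back into the
    original at the same positions, then soften the pasted boundary.

    Both images are expected to be aligned uint8 color arrays of the same shape,
    since the feedback image was produced from this very original.
    """
    # 1. Locate the distinguishing image features: pixels that changed noticeably.
    diff = cv2.absdiff(original, feedback)
    mask = (diff.max(axis=2) > diff_threshold).astype(np.uint8) * 255

    # 2. Replace the corresponding features of the original at the same positions.
    result = original.copy()
    result[mask > 0] = feedback[mask > 0]

    # 3. Blur the synthesis boundary: blend a blurred copy only near the mask edge.
    edge = cv2.morphologyEx(mask, cv2.MORPH_GRADIENT, np.ones((5, 5), np.uint8))
    blurred = cv2.GaussianBlur(result, (blur_ksize, blur_ksize), 0)
    result[edge > 0] = blurred[edge > 0]
    return result
```

The boundary blur in step 3 corresponds to the boundary blurring recited in claim 1; the threshold and kernel sizes are arbitrary example values.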
The mobile terminal 1300 can implement each process implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition.
The mobile terminal 1300 of the embodiment of the present invention controls the photographing device to photograph an original image, sends the original image to each mobile terminal in the at least one second mobile terminal, receives a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image, and determines a target image according to the feedback image. In this way, the first mobile terminal can obtain the feedback image already processed by the second mobile terminal, and the user does not need to process the images one by one, so that the efficiency of image processing can be improved.
Referring to fig. 14, fig. 14 is a structural diagram of a mobile terminal according to an embodiment of the present invention, where the mobile terminal is a second mobile terminal. Specifically, the mobile terminal 1400 in fig. 14 may be a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), or a vehicle-mounted computer.
As shown in fig. 14, the mobile terminal 1400 includes a Radio Frequency (RF) circuit 1410, a memory 1420, an input unit 1430, a display unit 1440, a processor 1450, an audio circuit 1460, a communication module 1470, and a power supply 1480.
The input unit 1430 may be used to receive numeric or character information input by a user and to generate signal inputs related to the user settings and function control of the mobile terminal 1400. Specifically, in the embodiment of the present invention, the input unit 1430 may include a touch panel 1431. The touch panel 1431, also referred to as a touch screen, may collect touch operations performed by the user on or near it (for example, operations performed by the user on the touch panel 1431 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 1431 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1450, and receives and executes commands from the processor 1450. In addition, the touch panel 1431 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave types. Besides the touch panel 1431, the input unit 1430 may also include other input devices 1432, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 1440 may be used to display information input by or provided to the user, and various menu interfaces of the mobile terminal 1400, among others. The display unit 1440 may include a display panel 1441, and optionally, the display panel 1441 may be configured in the form of an LCD or an Organic Light-Emitting Diode (OLED), or the like.
It should be noted that the touch panel 1431 may cover the display panel 1441 to form a touch display screen. When the touch display screen detects a touch operation on or near it, the touch operation is transmitted to the processor 1450 to determine the type of the touch event, and the processor 1450 then provides a corresponding visual output on the touch display screen according to the type of the touch event.
The touch display screen comprises an application program interface display area and a common control display area. The arrangement of the two display areas is not limited; any arrangement that keeps the two areas distinguishable may be used, for example arranging them one above the other or side by side. The application program interface display area may be used to display the interface of an application, and each interface may contain at least one interface element such as an application icon and/or a widget desktop control; the application program interface display area may also be an empty interface containing no content. The common control display area is used to display frequently used controls, such as setting buttons, interface numbers, scroll bars, and application icons such as a phone book icon. The touch screen is a flexible screen, and an organic transparent conductive carbon nanotube film is attached to each of its two surfaces.
The processor 1450 is the control center of the mobile terminal 1400. It connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal 1400 and processes data by running or executing the software programs and/or modules stored in the first memory 1421 and calling the data stored in the second memory 1422, thereby monitoring the mobile terminal 1400 as a whole. Optionally, the processor 1450 may include one or more processing units.
In an embodiment of the present invention, the processor 1450, by invoking software programs and/or modules stored in the first memory 1421 and/or data in the second memory 1422, is configured to: receiving an original image sent by a first mobile terminal, wherein the original image is an original image shot by a shooting device of the first mobile terminal; processing the original image; and sending the processed image to the first mobile terminal.
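On the second-terminal side, the receive/process/return loop can be sketched as follows. The concrete editing operation is left open by the patent (beautification, filtering, and so on), so `process` and `send_back` are hypothetical hooks, and JPEG encoding is merely an example interchange format.

```python
import cv2
import numpy as np


def handle_original_image(original_bytes: bytes, process, send_back) -> None:
    """Second-terminal side: decode the received original image, process it,
    and return the processed (feedback) image to the first terminal."""
    image = cv2.imdecode(np.frombuffer(original_bytes, dtype=np.uint8),
                         cv2.IMREAD_COLOR)
    processed = process(image)            # e.g. edits made by the user of this terminal
    ok, encoded = cv2.imencode(".jpg", processed)
    if ok:
        send_back(encoded.tobytes())      # feedback image returned to the first terminal
```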
Optionally, the computer program, when executed by the processor 1450, may further implement the following steps: receiving a shooting preview image sent by the first mobile terminal, wherein the shooting preview image is a preview image displayed on the display interface of the first mobile terminal in the shooting preview process; and sending confirmation information to the first mobile terminal based on the shooting preview image, so that the first mobile terminal shoots an original image according to the confirmation information.
Optionally, the computer program when executed by the processor 1450 further performs the steps of: determining a target image in the original image; and processing the image in the preset area where the target image is located.
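A plausible, purely illustrative way to "determine a target image in the original image" and process only the preset area where it is located is to detect a face and smooth its bounding rectangle while leaving the rest of the original image untouched. The Haar-cascade detector and bilateral filter below are example choices, not something the patent mandates.

```python
import cv2


def process_preset_area(image):
    """Detect a target (here: the largest face) and process only the preset
    area around it, leaving the rest of the original image untouched."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return image                                   # nothing to process
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    region = image[y:y + h, x:x + w]
    # Illustrative processing: skin-smoothing style bilateral filter on the region only.
    image[y:y + h, x:x + w] = cv2.bilateralFilter(region, 9, 75, 75)
    return image
```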
The mobile terminal 1400 can implement each process implemented by the mobile terminal in the foregoing embodiments, and is not described here again to avoid repetition.
The mobile terminal 1400 of the embodiment of the present invention receives an original image sent by a first mobile terminal, processes the original image, and sends the processed image to the first mobile terminal. In this way, a better effect is obtained for the captured image, and the efficiency of image processing can be improved.
Embodiments of the present invention also provide a computer-readable storage medium having stored thereon a computer program (instructions) which, when executed by a processor, implements the following steps:
controlling the shooting device to shoot an original image; sending the original image to each mobile terminal in at least one second mobile terminal; receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image; and determining a target image according to the feedback image.
Alternatively, the program (instructions), when executed by a processor, implements the following steps:
receiving an original image sent by a first mobile terminal; processing the original image; and sending the processed image to the first mobile terminal.
Optionally, the program (instructions), when executed by the processor, implements the following steps: in the shooting preview process, sending a shooting preview image to each second mobile terminal in the at least one second mobile terminal; and, if confirmation information sent by each second mobile terminal in the at least one second mobile terminal based on the shooting preview image is received, controlling the shooting device to shoot the original image.
Optionally, the program (instructions), when executed by the processor, implements the following steps: when the number of second mobile terminals is one, taking the feedback image as the target image; and when the number of second mobile terminals is at least two, carrying out image synthesis by using the original image and the feedback image to obtain the target image.
Optionally, the program (instructions), when executed by the processor, implements the following steps: sending the target image to each second mobile terminal in the at least one second mobile terminal; and, if feedback information sent by the at least one second mobile terminal is received, deleting the target image, wherein the feedback information is information sent to the first mobile terminal by a second mobile terminal that has successfully received the target image.
Optionally, the program (instructions), when executed by the processor, implements the following step: when the display interface of the first mobile terminal is a shooting preview interface, adding a shielding layer on the shooting preview interface.
Optionally, the program (instructions), when executed by the processor, implements the following steps: identifying a distinguishing image feature of the feedback image relative to the original image; and replacing the target image feature of the original image with the distinguishing image feature, wherein the position of the distinguishing image feature in the feedback image is the same as the position of the target image feature in the original image.
Optionally, the program (instructions), when executed by the processor, implements the following steps: receiving a shooting preview image sent by the first mobile terminal, wherein the shooting preview image is a preview image displayed on the display interface of the first mobile terminal in the shooting preview process; and sending confirmation information to the first mobile terminal based on the shooting preview image, so that the first mobile terminal shoots an original image according to the confirmation information.
Optionally, the program (instructions), when executed by the processor, implements the following steps: determining a target image in the original image; and processing the image in the preset area where the target image is located.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (4)
1. A photographing method is applied to a first mobile terminal with a photographing device, and is characterized by comprising the following steps:
controlling the shooting device to shoot an original image;
sending the original image to each second mobile terminal in at least one second mobile terminal, wherein the first mobile terminal divides the original image into regions, and the second mobile terminal processes the image in the regions according to the divided regions;
receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image;
determining a target image according to the feedback image;
before the step of controlling the photographing device to photograph the original image, the method further includes:
in the shooting preview process, sending a shooting preview image to each second mobile terminal in the at least one second mobile terminal;
if receiving confirmation information sent by each second mobile terminal of the at least one second mobile terminal based on the shot preview image, executing the step of controlling the shooting device to shoot the original image;
when the number of the second mobile terminals is one, the step of determining the target image according to the feedback image includes:
taking the feedback image as the target image;
when the number of the second mobile terminals is at least two, the step of determining the target image according to the feedback image includes:
performing image synthesis by using the original image and the feedback image to obtain a target image, wherein the received feedback image comprises at least two images, and replacing an image area in the original image with an image area in the feedback image;
before the step of sending the shooting preview image to each of the at least one second mobile terminal during the shooting preview process, the method further includes:
when the display interface of the first mobile terminal is a shooting preview interface, adding a shielding layer on the shooting preview interface;
the step of performing image synthesis using the original image and the feedback image includes:
identifying a distinctive image feature of the feedback image relative to the original image;
and replacing the target image characteristic of the original image by using the distinguishing image characteristic, wherein the position of the distinguishing image characteristic in a feedback image is the same as that of the target image characteristic in the original image, and after replacing the image characteristic, performing boundary blurring processing on the synthesized boundary of the replaced image and the original image.
2. The photographing method according to claim 1, wherein after the step of determining a target image from the feedback image, the method further comprises:
sending the target image to each second mobile terminal in the at least one second mobile terminal;
and if feedback information sent by the at least one second mobile terminal is received, deleting the target image, wherein the feedback information is information sent to the first mobile terminal by the second mobile terminal which successfully receives the target image.
3. A mobile terminal, which is a first mobile terminal having a camera, comprising:
the shooting module is used for controlling the shooting device to shoot an original image;
a first sending module, configured to send the original image captured by the capturing module to each of at least one second mobile terminal, where the first mobile terminal divides the original image into regions, so that the second mobile terminal processes images in the regions according to the divided regions;
the receiving module is used for receiving a feedback image sent by the at least one second mobile terminal, wherein the feedback image is an image obtained by processing the original image sent by the first sending module;
the determining module is used for determining a target image according to the feedback image received by the receiving module;
the mobile terminal further includes:
the second sending module is used for sending the shooting preview image to each second mobile terminal in the at least one second mobile terminal in the shooting preview process;
the shooting module is specifically configured to control the shooting device to shoot an original image if receiving confirmation information sent by each second mobile terminal of the at least one second mobile terminal based on the shot preview image;
when the number of the second mobile terminals is one, the determining module is specifically configured to use the feedback image as the target image;
when the number of the second mobile terminals is at least two, the determining module is specifically configured to perform image synthesis by using the original image and the feedback image to obtain a target image, where the received feedback image includes at least two, and replace an image area in the original image with an image area in the feedback image;
the mobile terminal further includes:
the shielding module is used for adding a shielding layer on a shooting preview interface when the display interface of the first mobile terminal is the shooting preview interface;
the determining module comprises:
an identification submodule for identifying a distinctive image feature of the feedback image relative to the original image;
and the replacing submodule is used for replacing the target image characteristic of the original image by using the distinguishing image characteristic identified by the identifying submodule, wherein the position of the distinguishing image characteristic in the feedback image is the same as that of the target image characteristic in the original image, and after the image characteristic is replaced, the boundary of the replaced image and the original image is subjected to blurring processing.
4. The mobile terminal of claim 3, wherein the mobile terminal further comprises:
a third sending module, configured to send the target image to each of the at least one second mobile terminal;
and the deleting module is configured to delete the target image sent by the third sending module if feedback information sent by the at least one second mobile terminal is received, where the feedback information is information sent to the first mobile terminal by the second mobile terminal that successfully receives the target image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710803176.4A CN107705275B (en) | 2017-09-08 | 2017-09-08 | Photographing method and mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710803176.4A CN107705275B (en) | 2017-09-08 | 2017-09-08 | Photographing method and mobile terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107705275A CN107705275A (en) | 2018-02-16 |
CN107705275B true CN107705275B (en) | 2021-02-26 |
Family
ID=61172297
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710803176.4A Active CN107705275B (en) | 2017-09-08 | 2017-09-08 | Photographing method and mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107705275B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109389550B (en) * | 2018-09-17 | 2023-12-26 | 联想(北京)有限公司 | Data processing method, device and computing equipment |
CN109799936B (en) * | 2019-01-22 | 2022-05-24 | 上海联影医疗科技股份有限公司 | Image generation method, device, equipment and medium |
CN111953904B (en) * | 2020-08-13 | 2022-08-12 | 北京达佳互联信息技术有限公司 | Shooting method, shooting device, electronic equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103631768A (en) * | 2012-08-20 | 2014-03-12 | 三星电子株式会社 | Collaborative data editing and processing system |
CN105099875A (en) * | 2015-06-24 | 2015-11-25 | 努比亚技术有限公司 | Method and device for multiple users to collaboratively edit and publish picture information |
CN105743973A (en) * | 2016-01-22 | 2016-07-06 | 上海科牛信息科技有限公司 | Multi-user multi-device real-time synchronous cloud cooperation method and system |
CN106611402A (en) * | 2015-10-23 | 2017-05-03 | 腾讯科技(深圳)有限公司 | Image processing method and device |
CN106791197A (en) * | 2017-02-21 | 2017-05-31 | 维沃移动通信有限公司 | A kind of method taken pictures and mobile terminal |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7697040B2 (en) * | 2005-10-31 | 2010-04-13 | Lightbox Network, Inc. | Method for digital photo management and distribution |
CN103197910B (en) * | 2013-04-17 | 2016-06-22 | 东软集团股份有限公司 | Image updating method and device |
CN104680480B (en) * | 2013-11-28 | 2019-04-02 | 腾讯科技(上海)有限公司 | A kind of method and device of image procossing |
CN103823678B (en) * | 2014-02-21 | 2018-07-03 | 联想(北京)有限公司 | Image processing method and image processing apparatus |
CN105825534B (en) * | 2016-03-15 | 2021-06-04 | 北京金山安全软件有限公司 | Picture processing method and device |
CN106657620A (en) * | 2016-11-30 | 2017-05-10 | 努比亚技术有限公司 | Picture synthesis method and device, and mobile terminal |
- 2017-09-08 CN CN201710803176.4A patent/CN107705275B/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103631768A (en) * | 2012-08-20 | 2014-03-12 | 三星电子株式会社 | Collaborative data editing and processing system |
CN105099875A (en) * | 2015-06-24 | 2015-11-25 | 努比亚技术有限公司 | Method and device for multiple users to collaboratively edit and publish picture information |
CN106611402A (en) * | 2015-10-23 | 2017-05-03 | 腾讯科技(深圳)有限公司 | Image processing method and device |
CN105743973A (en) * | 2016-01-22 | 2016-07-06 | 上海科牛信息科技有限公司 | Multi-user multi-device real-time synchronous cloud cooperation method and system |
CN106791197A (en) * | 2017-02-21 | 2017-05-31 | 维沃移动通信有限公司 | A kind of method taken pictures and mobile terminal |
Non-Patent Citations (2)
Title |
---|
Collaborative Image Coding and Transmission over Wireless Sensor Networks;Min Wu 等;《EURASIP Journal on Advances in Signal Processing》;20061201;全文 * |
分布式协作模型及应用研究;张全海 等;《计算机科学》;20030615;第30卷(第6期);全文 * |
Also Published As
Publication number | Publication date |
---|---|
CN107705275A (en) | 2018-02-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107197169B (en) | high dynamic range image shooting method and mobile terminal | |
CN110100251B (en) | Apparatus, method, and computer-readable storage medium for processing document | |
EP2701152B1 (en) | Media object browsing in a collaborative window, mobile client editing, augmented reality rendering. | |
KR102013331B1 (en) | Terminal device and method for synthesizing a dual image in device having a dual camera | |
CN107678644B (en) | Image processing method and mobile terminal | |
CN106657793B (en) | A kind of image processing method and mobile terminal | |
CN107172346B (en) | Virtualization method and mobile terminal | |
CN107509030B (en) | focusing method and mobile terminal | |
CN106791437B (en) | Panoramic image shooting method and mobile terminal | |
CN106454086B (en) | Image processing method and mobile terminal | |
CN106937055A (en) | A kind of image processing method and mobile terminal | |
US9509733B2 (en) | Program, communication apparatus and control method | |
CN108776822B (en) | Target area detection method, device, terminal and storage medium | |
CN108024073B (en) | Video editing method and device and intelligent mobile terminal | |
CN112230914B (en) | Method, device, terminal and storage medium for producing small program | |
US20170214856A1 (en) | Method for controlling motions and actions of an apparatus including an image capture device having a moving device connected thereto using a controlling device | |
CN106162150B (en) | A kind of photographic method and mobile terminal | |
CN106713747A (en) | Focusing method and mobile terminal | |
CN107705275B (en) | Photographing method and mobile terminal | |
US10063781B2 (en) | Imaging control device, imaging control method, imaging system, and program | |
CN107346332A (en) | A kind of image processing method and mobile terminal | |
CN111159449A (en) | Image display method and electronic equipment | |
CN106412432A (en) | Photographing method and mobile terminal | |
US20240080543A1 (en) | User interfaces for camera management | |
CN108200477B (en) | Method, device and equipment for generating and playing video file |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |