CN109981965B - Focusing method and electronic equipment - Google Patents

Focusing method and electronic equipment

Info

Publication number
CN109981965B
CN109981965B (application CN201711447201.6A)
Authority
CN
China
Prior art keywords
camera
focusing
contrast
phase
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711447201.6A
Other languages
Chinese (zh)
Other versions
CN109981965A (en)
Inventor
谢琼 (Xie Qiong)
蔡西蕾 (Cai Xilei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN201711447201.6A
Priority to PCT/CN2018/123942 (published as WO2019129077A1)
Publication of CN109981965A
Application granted
Publication of CN109981965B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The application provides a focusing method and an electronic device, where the electronic device comprises a processor and at least two cameras. A first camera of the at least two cameras is used for detecting phase information of an image; a second camera of the at least two cameras is used for detecting contrast information of the image. The processor is used for determining, during phase detection autofocus by the first camera, at least one phase focus position of the second camera according to one or more phase focus positions of the first camera, and controlling the second camera to move to the at least one phase focus position; and for determining a contrast in-focus position of the second camera based on contrast information of images detected by the second camera at one or more of the at least one phase focus positions, where the contrast in-focus position of the second camera comprises a contrast focusing moving direction of the second camera. Embodiments of the application can realize rapid focusing.

Description

Focusing method and electronic equipment
Technical Field
The present application relates to the field of electronic devices, and more particularly, to a focusing method and an electronic device.
Background
With the development of technology, and to meet users' diverse shooting demands, more and more electronic devices support dual or multiple cameras; for example, existing smartphones generally have a rear dual camera or a front dual camera.
In existing dual-camera electronic devices, due to cost considerations, usually only one camera (which may be called the main camera; the other may be called the auxiliary camera) supports phase detection autofocus (PDAF).
When shooting, one focusing method of existing electronic devices is as follows: the main camera first determines its in-focus position using PDAF; after the main camera finishes focusing, the auxiliary camera determines and adjusts its lens position according to the in-focus position of the main camera; finally, the auxiliary camera fine-tunes its lens using contrast detection autofocus (CDAF) to determine its own in-focus position. However, in this focusing mode the main camera must focus first and the auxiliary camera only afterwards, so the overall focusing time is long, which affects user experience.
Another focusing method of existing electronic devices is as follows: the main camera and the auxiliary camera focus simultaneously, where the main camera focuses using PDAF, and the auxiliary camera focuses using CDAF or by measuring the distance to the subject with laser light (which may be referred to simply as laser focusing). However, the focusing methods adopted by the auxiliary camera also have disadvantages: for example, the focus hunting ("bellows") phenomenon of CDAF is serious, its focusing time is long, and the picture is blurred during focusing; similarly, laser focusing takes a long time and the picture is blurred during focusing, which affects user experience.
Therefore, how to provide a fast focusing method becomes an urgent problem to be solved.
Disclosure of Invention
The application provides a focusing method and electronic equipment, which can realize quick focusing and improve user experience.
In a first aspect, a method for focusing is provided, which is applied in an electronic device including at least two cameras, a first camera of the at least two cameras is used for detecting phase information of an image, and a second camera of the at least two cameras is used for detecting contrast information of the image, and the method includes:
in the process of carrying out phase detection automatic focusing by the first camera, determining at least one phase focusing position of the second camera according to one or more phase focusing positions of the first camera, and controlling the second camera to move to the at least one phase focusing position;
determining a contrast in focus position of the second camera based on contrast information of images detected by the second camera at one or more of the at least one phase in focus position, wherein the contrast in focus position of the second camera comprises a contrast in focus movement direction of the second camera.
It should be understood that in the embodiments of the present application, "during the phase detection autofocus by the first camera" may refer to all or part of the period from when the first camera starts phase detection focusing until phase detection focusing ends. In other words, it may refer to the period before the first camera starts CDAF focusing; the embodiments of the present application are not limited thereto.
Specifically, the processor may determine at least one phase focus position of the second camera according to one or more phase focus positions of the first camera during the phase detection autofocus of the first camera, and control the second camera to move to the at least one phase focus position. The processor controls the focusing of the two cameras in parallel, so that the second camera, which does not support the PDAF function, synchronously follows the focusing behavior of the first camera, which does support the PDAF function. This realizes fast focusing of the second camera, reduces the overall focusing time, and improves user experience.
It should be understood that in the embodiment of the present application, the processor controls the second camera to move to the at least one phase focus position during the phase detection autofocus of the first camera; it can also be said that the processor controls the first camera and the second camera in parallel. Controlling the two cameras in parallel may be understood as controlling them simultaneously, but this is not limited to strictly identical timing; for example, a certain time interval is permitted between the movements of the two cameras, and the embodiments of the present application are not limited thereto.
For example, after the focusing position of the first camera is determined, the first camera may be controlled to move, then during the moving process of the first camera, the focusing position of the second camera is determined according to the focusing position of the first camera, and then the second camera is controlled to move. For another example, after the focusing position of the first camera is determined, the focusing position of the second camera may be determined according to the focusing position of the first camera, and then the processor controls the first camera and the second camera to move to the corresponding focusing positions respectively in parallel.
In particular, the processor may move the camera to the corresponding focus position by controlling the lens motor driver.
Therefore, in the embodiment of the application, the processor can control at least two cameras to focus in parallel, so that the camera which does not support the PDAF function synchronously follows the focusing behavior of the camera which does support the PDAF function. This realizes fast focusing of the camera that does not support PDAF, reduces the overall focusing time, and improves user experience.
It should be understood that in the embodiments of the present application, the one or more phase focus positions of the first camera may include one or a plurality of consecutive phase focus positions of the first camera during the phase detection autofocus. The one or more phase-focus positions may include a first phase-focus position, one or more intermediate phase-focus positions in a phase-focus process, or a last phase-focus position, and the embodiments of the present application are not limited thereto.
It should be understood that, in the embodiment of the present application, the phase focus position is a focus position calculated based on phase information of an image. The contrast focus position is a focus position calculated based on contrast information of an image.
It should be understood that the "in-focus position" in the embodiments of the present application may include a moving direction and/or a moving distance, where the moving direction indicates a direction in which a movable lens, a lens group or a lens in the camera needs to be moved in order to obtain a clear image; the movement distance indicates a distance that a movable lens, a lens group, or a lens in the camera needs to be moved in the movement direction in order to obtain a clear image. In other words, the in-focus position may indicate a position that a movable lens, lens group or lens in the camera needs to be in order to obtain a sharp image.
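To make this concrete, the direction-plus-distance notion of an in-focus position could be modeled as below. This is a minimal illustrative sketch in Python; the type names are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class MoveDirection(Enum):
    """Direction along the lens axis in which the movable lens must travel."""
    TOWARD_NEAR = -1   # toward the macro (near-focus) end
    TOWARD_FAR = 1     # toward the infinity (far-focus) end

@dataclass
class InFocusPosition:
    """An in-focus position: the direction a movable lens, lens group or lens
    needs to move, and optionally how far, to obtain a sharp image."""
    direction: MoveDirection
    distance: Optional[float] = None   # e.g. lens-motor steps; None if not yet known
```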
It should be understood that the embodiment of the present application may determine the contrast focusing moving direction according to the recorded contrast information of the images; for example, when the recorded contrast values are improving, the contrast focusing moving direction may be determined to be the direction of that improving trend.
Specifically, in the embodiment of the present application, while the second camera synchronizes with the first camera, the contrast in-focus position of the second camera may be determined according to contrast information of images detected by the second camera at one or more of the at least one phase focus positions. That is to say, during the synchronization, contrast information of images detected by the second camera may be recorded, and the contrast in-focus position of the second camera may be determined according to that contrast information. Merely synchronizing with the first camera may leave the focusing moving direction of the second camera inaccurate, or the second camera may need to focus by contrast focusing after the first camera finishes. According to the embodiment of the application, the recorded contrast information can ensure the accuracy of the focusing moving direction, or directly yield the contrast in-focus position, so that determining the in-focus position of the second camera again according to the in-focus position of the first camera or by CDAF is avoided; the focusing time can be shortened and user experience improved.
It should be understood that after controlling the first camera and the second camera to move to the first focusing position and the second focusing position respectively, the two cameras can be considered to be in focus, and then the image signal processor can acquire images through the first camera and the second camera respectively and combine the images acquired by the two cameras to form a final image. Optionally, in the case that the electronic device has a display screen, the electronic device may also display the final image through the display screen.
It should be understood that, the image processing process after the two cameras respectively move to the corresponding in-focus positions may refer to an existing image synthesis algorithm of multiple cameras, which is not limited in this embodiment of the application.
Optionally, in a possible implementation, the method further includes:
controlling whether the second camera detects contrast information of an image or not according to the moving state of the electronic equipment and/or the image stable state of the second camera;
wherein the second camera is controlled to detect contrast information of the image in one or more of the at least one phase focus positions.
For example, when the moving state of the electronic device and/or the image stability state of the second camera meets the detection condition, the processor controls the second camera to detect contrast information of the image; otherwise, when the moving state and/or the image stability state does not satisfy the detection condition, the processor does not control the second camera to detect contrast information of the image.
For example, the detection condition is considered satisfied when the moving state of the electronic device is relatively stable or slow (for example, the gyroscope or accelerometer of the electronic device detects that the movement of the electronic device is less than a preset movement threshold), and/or when the image is stable in that its contrast changes little (for example, the contrast of the images detected by the second camera at the current and previous focus positions differs by less than a preset contrast change threshold).
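As a rough illustration of such a detection condition, the sketch below combines a gyroscope movement check with a frame-to-frame contrast change check. The threshold values, and the choice of requiring both checks rather than either one, are assumptions for illustration only.

```python
def detection_condition_met(gyro_motion: float,
                            contrast_current: float,
                            contrast_previous: float,
                            move_threshold: float = 0.05,
                            contrast_change_threshold: float = 0.02) -> bool:
    """Return True when the scene can be treated as still, so the contrast
    information detected by the second camera is worth recording."""
    device_steady = gyro_motion < move_threshold                  # movement check
    contrast_stable = abs(contrast_current - contrast_previous) < contrast_change_threshold
    return device_steady and contrast_stable
```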
It should be understood that, when the detection condition is satisfied, the embodiment of the present application may also consider that the captured picture is still. When the detection condition is not satisfied, the shot picture may be considered to be changed, and the embodiment of the present application is not limited thereto.
Specifically, in the embodiment of the present application, while the second camera synchronizes with the first camera, whether the second camera detects contrast information of an image may further be controlled according to the moving state of the electronic device and/or the image stability state of the second camera when the second camera is at the at least one phase focus position. For example, the processor determines the contrast information of images detected by the second camera at one or more of the at least one phase focus positions, and determines the contrast in-focus position of the second camera according to that contrast information.
That is to say, in the embodiment of the present application, the second camera detects contrast information of the image while synchronizing with the first camera, and only when the picture is determined to be still. Because contrast information acquired from a still picture is relatively reliable, the processor can accurately determine the contrast in-focus position from it. The embodiment of the present application can thus ensure the accuracy of the focusing moving direction through the recorded contrast information, or directly determine the contrast in-focus position from it, avoiding having to determine the in-focus position of the second camera again according to the in-focus position of the first camera or by CDAF, reducing the focusing time and improving user experience.
It was described above that the processor controls the second camera to acquire contrast information of an image only when the picture is still, and determines the contrast in-focus position of the second camera from it. Alternatively, the embodiment of the present application may detect contrast information of images at all phase focus positions of the second camera, and then extract the valid information from the detected contrast information.
Optionally, in a possible implementation, the method further includes:
determining whether contrast information of an image detected by the second camera is valid according to the moving state of the electronic equipment and/or the image stable state of the second camera;
wherein contrast information of images detected by the second camera at one or more of the at least one phase-in-focus positions is valid.
For example, the contrast information of the image detected by the second camera is considered valid when the moving state of the electronic device is relatively stable or slow (for example, the gyroscope or accelerometer of the electronic device detects that the movement of the electronic device is less than a preset movement threshold), and/or when the image is stable in that its contrast changes little (for example, the contrast of the image detected by the second camera at the current focus position changes by less than a preset contrast change threshold, i.e., when the captured picture is still).
It should be understood that in the embodiments of the present application, the picture is still at one or more of the at least one phase focus positions, and the contrast information collected at a plurality of consecutive phase focus positions constitutes continuous valid information; that is, the second camera captures a still picture at each of those phase focus positions.
That is to say, while the second camera synchronizes with the first camera, the embodiment of the present application detects contrast information of the images and treats that information as valid when the picture is still. Because contrast information acquired from a still picture is relatively reliable, the processor can accurately determine the contrast in-focus position from it. The embodiment of the present application can thus ensure the accuracy of the focusing moving direction through the recorded contrast information, or directly determine the contrast in-focus position from it, avoiding having to determine the in-focus position of the second camera again according to the in-focus position of the first camera or by CDAF, reducing the focusing time and improving user experience.
Optionally, in a possible implementation, the contrast in-focus position of the second camera further comprises a contrast in-focus movement distance of the second camera,
the method further comprises the following steps:
controlling the second camera to move to the contrast in-focus position.
It should be understood that when the recorded curve of contrast information of images acquired by the second camera has a peak, the in-focus position of the second camera (i.e., the contrast focusing moving direction and moving distance) can be determined from the recorded contrast information; for example, the in-focus position may be the position corresponding to the peak, or an in-focus position obtained by curve fitting of the contrast information.
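One common way to realize the curve-fitting variant is three-point parabolic interpolation around the recorded contrast peak. The sketch below is an assumed realization, not an algorithm stated in the patent.

```python
def contrast_peak_position(positions: list[float], contrasts: list[float]) -> float:
    """Estimate the contrast in-focus position from (lens position, contrast)
    samples recorded while the second camera synchronized with the first.
    Falls back to the sampled maximum when a parabola cannot be fitted."""
    i = max(range(len(contrasts)), key=contrasts.__getitem__)
    if i == 0 or i == len(contrasts) - 1:
        return positions[i]  # peak on the boundary: no three-point fit possible
    x0, x1, x2 = positions[i - 1], positions[i], positions[i + 1]
    y0, y1, y2 = contrasts[i - 1], contrasts[i], contrasts[i + 1]
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2**2 * (y0 - y1) + x1**2 * (y2 - y0) + x0**2 * (y1 - y2)) / denom
    if a >= 0:
        return positions[i]  # not a concave parabola: keep the sampled maximum
    return -b / (2 * a)      # vertex of the fitted parabola
```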
In addition, when the in-focus position of the second camera cannot be calculated from the recorded contrast information of the images acquired by the second camera, the second camera falls back to determining its in-focus position by CDAF, and how to carry out the CDAF search can still be decided from the recorded contrast information. For example, when the recorded contrast values are improving, the lens may continue moving in the current direction in the CDAF manner to determine the final in-focus position; conversely, when the recorded contrast values are worsening, the lens may be moved in the reverse direction in the CDAF manner to determine the final in-focus position.
That is to say, in the embodiment of the present application, while the second camera synchronizes with the phase detection autofocus of the first camera, as long as the contrast focusing moving direction and moving distance of the second camera can be determined from the recorded contrast information of the images, the second camera may be directly controlled to move that distance in that direction to complete its focusing, regardless of whether the first camera has finished focusing.
Specifically, in the embodiment of the present application, while the second camera synchronizes with the first camera, the contrast focusing moving direction and the contrast focusing moving distance of the second camera may be determined according to contrast information of images detected by the second camera at one or more of the at least one phase focus positions. Because the in-focus position of the second camera is then determined, the processor can directly control the second camera to move the contrast focusing moving distance in the contrast focusing moving direction, completing the focusing of the second camera. The second camera no longer needs to be controlled to synchronize with the first camera, nor to determine its in-focus position by CDAF, so the focusing time can be reduced and user experience improved.
Optionally, in a possible implementation, the method further includes:
determining a next phase focusing position of the second camera according to the next phase focusing position of the first camera, wherein the next phase focusing position of the second camera comprises a phase focusing moving direction and a phase focusing moving distance of the second camera;
and when the contrast focusing moving direction is consistent with the phase focusing moving direction, controlling the second camera to move to the next phase focusing position.
Specifically, during the phase detection autofocus of the first camera (i.e., while the first camera has not finished phase focusing), after the first camera has moved to the current phase focus position, the first camera is controlled to move to a next phase focus position if the distance described in step 230 below is greater than the first threshold. The next phase focus position of the second camera is then determined based on the next phase focus position of the first camera, and when the contrast focusing moving direction is consistent with the phase focusing moving direction, the second camera is controlled to move to that next phase focus position. This process is repeated until the phase focusing of the first camera ends, or until the contrast focusing moving direction and moving distance of the second camera can be determined from the recorded contrast information. A sketch of this synchronization loop is given below.
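The repetition described above can be summarized as the following loop. The camera objects and helper functions (convert_focus_position, cdaf_step, and the rest) are assumed interfaces standing in for the operations the text describes, so this is a structural sketch rather than runnable driver code.

```python
def sync_second_camera(first_cam, second_cam, first_threshold: float) -> None:
    """Sketch: while the first camera's PDAF has not converged, mirror each of
    its phase focus positions onto the second camera, but only when the
    recorded contrast trend agrees with the proposed move."""
    while not first_cam.pdaf_converged(first_threshold):
        next_pos_1 = first_cam.compute_phase_focus_position()  # from PD information
        first_cam.move_to(next_pos_1)
        next_pos_2 = convert_focus_position(next_pos_1)        # e.g. via 1/f = 1/u + 1/v
        trend_dir = second_cam.contrast_trend_direction()      # from recorded contrast
        if trend_dir is not None and trend_dir != next_pos_2.direction:
            # Phase direction contradicts the contrast trend: stop synchronizing
            # and fall back to CDAF along the contrast focusing direction.
            second_cam.cdaf_step(trend_dir)
            return
        second_cam.move_to(next_pos_2)
        if second_cam.contrast_peak_known():
            # Direction and distance already follow from the recorded contrast:
            # finish focusing directly, regardless of the first camera.
            second_cam.move_to(second_cam.contrast_in_focus_position())
            return
```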
Optionally, in a possible implementation, the method further includes:
and when the phase detection automatic focusing of the first camera is finished, or the contrast focusing moving direction is inconsistent with the phase focusing moving direction, controlling the second camera to move for a preset distance in the contrast focusing moving direction.
Specifically, when the phase focusing moving direction is inconsistent with the contrast focusing moving direction, the embodiment of the present application may stop the second camera from synchronizing with the first camera and directly control the second camera to move a preset distance in the contrast focusing moving direction in the CDAF manner, for example moving in the contrast focusing moving direction with a fixed CDAF step to perform CDAF focusing.
Alternatively, after the phase detection autofocus of the first camera has finished, the second camera is controlled to move a preset distance in the contrast focusing moving direction in the CDAF manner, for example moving in the contrast focusing moving direction with a fixed CDAF step to perform CDAF focusing.
Therefore, in the embodiment of the application, the contrast focusing moving direction of the second camera is determined from the contrast information of images recorded while the second camera synchronizes with the first camera. When the phase focusing direction of the second camera is wrong, i.e., opposite to the contrast focusing direction, the second camera stops synchronizing with the first camera and CDAF is adopted directly in the contrast focusing direction. This avoids unnecessary movement of the second camera and ensures accurate and fast focusing of the second camera.
Alternatively, in the embodiment of the application, because the contrast focusing moving direction of the second camera is determined from the contrast information of images recorded while the second camera synchronizes with the first camera, CDAF focusing can be adopted directly in the contrast focusing direction after the focusing of the first camera ends. This prevents the second camera from starting its CDAF search in a random direction, and ensures accurate and fast focusing of the second camera.
In a second aspect, a processor is provided, the processor comprising: a processing unit and a storage unit, wherein,
the storage unit is configured to store code, and the processing unit is configured to execute the code in the storage unit to implement the first aspect or the method in any feasible implementation manner of the first aspect.
In a third aspect, an electronic device is provided, which includes: a processor and at least two cameras;
a first camera of the at least two cameras is used for detecting phase information of an image;
the second camera of the at least two cameras is used for detecting contrast information of the image;
the processor is configured to:
in the process of carrying out phase detection automatic focusing by the first camera, determining at least one phase focusing position of the second camera according to one or more phase focusing positions of the first camera, and controlling the second camera to move to the at least one phase focusing position;
determining a contrast in focus position of the second camera based on contrast information of images detected by the second camera at one or more of the at least one phase in focus position, wherein the contrast in focus position of the second camera comprises a contrast in focus movement direction of the second camera.
Therefore, in the embodiment of the present application, during the phase detection focusing of the first camera, at least one phase focus position of the second camera may be determined according to one or more phase focus positions of the first camera, and the second camera may be controlled to move to the at least one phase focus position. The camera which does not support the PDAF function thus synchronously follows the focusing behavior of the camera which does support the PDAF function; fast focusing of the camera that does not support PDAF is realized, the overall focusing time can be reduced, and user experience is improved.
Furthermore, in this embodiment of the application, while the second camera synchronizes with the first camera, the contrast in-focus position of the second camera may also be determined according to contrast information of images detected by the second camera at one or more of the at least one phase focus positions. That is to say, during the synchronization, contrast information of images detected by the second camera may be recorded, and the contrast in-focus position of the second camera determined from it. Merely synchronizing with the first camera may leave the focusing moving direction of the second camera inaccurate, or the second camera may need to focus by contrast focusing after the first camera finishes. In view of this, the recorded contrast information can ensure the accuracy of the focusing moving direction, or directly yield the contrast in-focus position, so that determining the in-focus position of the second camera again according to the in-focus position of the first camera or by CDAF is avoided; the focusing time can be reduced and user experience improved.
It is to be understood that the third aspect corresponds to the first aspect, and the processor is capable of implementing the method of the first aspect and its possible implementations, and the detailed description is omitted here where appropriate to avoid repetition.
Optionally, in a possible implementation, the processor is further configured to:
controlling whether the second camera detects contrast information of an image or not according to the moving state of the electronic equipment and/or the image stable state of the second camera;
wherein the processor controls the second camera to detect contrast information of the image in one or more of the at least one phase-in-focus positions.
Optionally, in a possible implementation, the processor is further configured to:
determining whether contrast information of an image detected by the second camera is valid according to the moving state of the electronic equipment and/or the image stable state of the second camera;
wherein contrast information of images detected by the second camera at one or more of the at least one phase-in-focus positions is valid.
Optionally, in a possible implementation, the contrast in-focus position of the second camera further comprises a contrast in-focus movement distance of the second camera,
the processor is further configured to:
controlling the second camera to move to the contrast in-focus position.
Optionally, in a possible implementation, the processor is further configured to:
determining a next phase focusing position of the second camera according to the next phase focusing position of the first camera, wherein the next phase focusing position of the second camera comprises a phase focusing moving direction and a phase focusing moving distance of the second camera;
and when the contrast focusing moving direction is consistent with the phase focusing moving direction, controlling the second camera to move to the next phase focusing position.
Optionally, in a possible implementation, the processor is further configured to:
and when the phase detection automatic focusing of the first camera is finished, or the contrast focusing moving direction is inconsistent with the phase focusing moving direction, controlling the second camera to move for a preset distance in the contrast focusing moving direction.
Alternatively, in one possible design, the processor-implemented scheme described above may be implemented by a chip.
In a fourth aspect, there is provided a computer program product comprising: computer program (also called code, or instructions), which when executed, causes a computer to perform the method of any of the possible implementations of the first aspect and the first aspect described above.
In a fifth aspect, a computer-readable medium is provided, which stores a computer program (which may also be referred to as code or instructions) that, when executed on a computer, causes the computer to perform the method of any one of the above-mentioned first aspect and possible implementation manner of the first aspect.
Drawings
Fig. 1 is a schematic diagram of a scenario in which an embodiment of the present application is applicable.
FIG. 2 is a schematic focusing flow diagram according to an embodiment of the present application.
FIG. 3 is a schematic diagram of a focusing method according to an embodiment of the present application.
Fig. 4 is a schematic view of the principle of lens imaging.
FIG. 5 is a schematic diagram of a focusing process according to an embodiment of the present application.
FIG. 6 is a schematic diagram of a focusing process according to an embodiment of the present application.
FIG. 7 is a schematic diagram of a focusing process according to an embodiment of the present application.
FIG. 8 is a schematic focusing flow diagram according to another embodiment of the present application.
Fig. 9 is a schematic block diagram of an image signal processor according to an embodiment of the present application.
FIG. 10 is a schematic block diagram of an electronic device according to one embodiment of the present application.
FIG. 11 is a block diagram of an electronic device according to one embodiment of the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings.
Fig. 1 shows a scene diagram applicable to the embodiment of the present application. As shown in fig. 1, the electronic device 100 may include at least two cameras, for example a first camera 110 and a second camera 120, and the electronic device 100 may, through a processor (not shown in the figure), control the first camera 110 and the second camera 120 to focus on an object 130 and acquire an image of the object 130.
The electronic device in the embodiment of the present application may include a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a notebook computer, a desktop computer, a point of sale (POS), a monitoring device, and other devices including at least two cameras.
It should be understood that the processor in the embodiments of the present application may also be referred to as an image signal processor, an image processing unit, a processing unit or a processing module, etc. The processor may be a CPU of the electronic device, and the processor may also be a separate device different from the CPU, and the embodiment of the present application is not limited thereto.
It has been explained above that, for cost reasons, in existing electronic devices having at least two cameras typically only one camera supports the phase detection autofocus (PDAF) function.
It should be understood that the implementation principle of PDAF is as follows: some shielded (masked) pixels are reserved on the photosensitive element specifically for performing phase detection on an image, and a focusing offset value is determined from the separation between these pixels, changes in that separation, and the like, so that accurate focusing is realized.
However, at present, the focusing process of such an electronic device, with only one camera supporting PDAF, takes a long time when capturing an object, and the picture is blurred during focusing, resulting in poor user experience.
In view of the above problems, an embodiment of the present application provides a focusing method in which a camera that does not support the PDAF function (a second camera) synchronously follows the focusing behavior of a camera that does support the PDAF function (a first camera), so as to implement fast focusing of the camera that does not support PDAF, reduce the focusing time, reduce or avoid the problem of a blurred picture during focusing, and improve user experience.
Specifically, in the embodiment of the present application, during the phase detection focusing of the first camera, at least one phase focus position of the second camera may be determined according to one or more phase focus positions of the first camera, and the second camera may be controlled to move to the at least one phase focus position. The camera which does not support the PDAF function thus synchronously follows the focusing behavior of the camera which does support the PDAF function; fast focusing of the camera that does not support PDAF is realized, the overall focusing time can be reduced, and user experience is improved.
Furthermore, in this embodiment of the application, while the second camera synchronizes with the first camera, the contrast in-focus position of the second camera may also be determined according to contrast information of images detected by the second camera at one or more of the at least one phase focus positions. That is to say, during the synchronization, contrast information of images detected by the second camera may be recorded, and the contrast in-focus position of the second camera determined from it. Merely synchronizing with the first camera may leave the focusing moving direction of the second camera inaccurate, or the second camera may need to focus by contrast focusing after the first camera finishes. In view of this, the recorded contrast information can ensure the accuracy of the focusing moving direction, or directly yield the contrast in-focus position, so that determining the in-focus position of the second camera again according to the in-focus position of the first camera or by CDAF is avoided; the focusing time can be reduced and user experience improved.
Hereinafter, the focusing method according to the embodiments of the present application will be described in detail with reference to specific examples, by way of example and not by way of limitation.
Since in the embodiment of the present application, the second camera that does not support the PDAF function needs to synchronize the focusing behavior of the PDAF-capable first camera, in order to make the solution of the embodiment of the present application easy to understand, a specific focusing process of the PDAF-capable camera is first described below with reference to fig. 2.
Specifically, the method shown in fig. 2 may be performed by a processor (e.g., an image signal processor), and the method shown in fig. 2 includes:
the camera is controlled to move to a first focus position 210.
Specifically, the first focus position is first determined from PD information of an image acquired by the camera, and the camera is controlled to move to the first focus position.
220, a distance between the current position and the second focus position is determined.
Specifically, after the camera has moved to the first focus position, the image signal processor re-acquires the PD information of the image, determines the second focus position from the newly acquired PD information, and then determines the difference between the current position (i.e., the first focus position) and the second focus position.
It should be understood that in practical applications, the object captured by the camera may move during focusing, or the electronic device itself may move; moreover, before the camera moves in step 210, PDAF accuracy may be poor when the lens is far from the in-focus position. The current position in step 220 (i.e., the lens position after performing step 210) may therefore differ from the second focus position (i.e., the focus position determined from the PD information of the image acquired at the current position), so a certain distance may remain between the current position and the second focus position.
230, it is determined whether the distance is greater than a first threshold.
Specifically, when the distance is greater than the first threshold, the camera is controlled to move to the second focus position; it is then determined whether the new distance, acquired by the same method after the camera has moved, is greater than the first threshold. The process of 210 to 230 is repeated until the most recently acquired distance is less than or equal to the first threshold.
In case the distance is less than or equal to the first threshold, or in case the above-mentioned repetition process reaches the repetition threshold, step 240 is performed.
It should be understood that, in the embodiment of the present application, the first threshold may be a preset value, and the first threshold may be determined according to an actual situation, and the value of the first threshold is not limited in the embodiment of the present application.
And 240, judging whether the PD self-convergence condition is met.
Specifically, in the embodiment of the present application, when the distance is less than or equal to the first threshold, it is considered that the PD self-convergence condition is satisfied.
Alternatively, as another embodiment, when the distance in 230 is less than or equal to the first threshold, step 210 may still be performed and the above process repeated until the distance is less than or equal to the first threshold in n consecutive repetitions (e.g., 2 times, 3 times, etc.), or until the number of repetitions reaches the repetition threshold, and then step 240 is performed. In this case, step 240 may be modified as follows: if the distances in n consecutive repetitions are all less than or equal to the first threshold, the PD self-convergence condition is determined to be satisfied.
It should be understood that when the distance in 230 is less than or equal to the first threshold, the above process may be repeated without moving the camera: the current focus position may be determined again directly from the PD information of the image acquired by the camera at the current position, followed by the subsequent comparison and judgment.
If the PD self-convergence condition is satisfied, step 260 is executed and focusing ends.
In case the PD self-convergence condition is not satisfied, step 250 is performed.
250, the in-focus position is searched for using small steps.
For example, the focus position is determined by using a CDAF small step search mode, and the camera is controlled to move to the focus position.
And 260, finishing focusing.
It should be understood that the step 240 may be an optional step, and in the case that the distance is smaller than or equal to the first threshold in 230, or in the case that the distance is greater than the first threshold in 230, and the process of repeating 210 to 230 reaches the repetition threshold, the step 240 may not be executed, and the step 260 or 250 may be directly executed, which is not limited in this application.
It should also be understood that in the method 200, before step 250 is executed, a CDAF large-step search may first be performed to find the focus position, and step 250 may then be executed when the focus position determined by the large-step search differs from the current position by less than a second threshold; the embodiment of the present application is not limited thereto. It should be understood that the step size of the CDAF large-step search is greater than that of the small-step search used to determine the in-focus position.
It should be understood that in the embodiment of the present application, the second threshold may be a preset value determined according to the actual situation, and the value of the second threshold is not limited in the embodiment of the present application.
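Steps 210 to 260 can be condensed into the following loop. It is a schematic reading of the flow in fig. 2 with hypothetical helper names (phase_focus_position, move_to, cdaf_small_step_search), including the optional n-consecutive confirmation from the variant of step 240.

```python
def pdaf_focus(camera, first_threshold: float,
               max_repeats: int = 5, n_confirm: int = 2) -> None:
    """Schematic of fig. 2: move to the PD-derived focus position, re-measure,
    and repeat until the residual distance stays within the first threshold
    for n consecutive checks (PD self-convergence) or the budget runs out."""
    consecutive_ok = 0
    for _ in range(max_repeats):
        target = camera.phase_focus_position()          # 210: from PD information
        camera.move_to(target)
        distance = abs(camera.phase_focus_position() - camera.position)  # 220
        if distance <= first_threshold:                 # 230
            consecutive_ok += 1
            if consecutive_ok >= n_confirm:             # 240: self-convergence met
                return                                  # 260: focusing ends
        else:
            consecutive_ok = 0
    camera.cdaf_small_step_search()                     # 250: CDAF small-step search
```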
The focusing method of the embodiment of the present application is described below with reference to fig. 3. Fig. 3 shows a schematic flow chart of a focusing method according to an embodiment of the present application. The method shown in fig. 3 may be applied to the above-mentioned electronic device including at least two cameras, a first camera of the at least two cameras supporting the phase detection autofocus PDAF function, the first camera being used for detecting phase information of an image; the second camera does not support the PDAF function, and the second camera is used to detect contrast information of an image. The method 300 shown in fig. 3 may be performed by a processor of the electronic device. Specifically, in the method shown in fig. 3, the processor may control the first camera to perform focusing according to a method similar to that shown in fig. 2, and the processor controls the second camera in parallel to synchronize the focusing behavior of the first camera, so as to implement fast focusing of the second camera that does not support the PDAF function, and reduce the overall focusing time.
It should be understood that in the embodiment of the present application, each camera is capable of acquiring an image and may include a lens group and a photosensitive element. Optionally, each camera may further include its own image signal processing module; alternatively, the cameras may omit the image signal processing module and the processor may perform image signal processing in a unified manner. The embodiment of the present application is not limited thereto.
The method 300 shown in FIG. 3 includes:
310, during the phase detection autofocus of the first camera, at least one phase focus position of the second camera is determined according to one or more phase focus positions of the first camera, and the second camera is controlled to move to the at least one phase focus position.
It should be understood that, in the embodiment of the present application, the phase focus position is a focus position calculated based on phase information of an image.
Specifically, according to the method described in fig. 2, during the phase detection auto-focusing process of the first camera, at least one phase-focusing position of the first camera may be determined according to the phase detection PD information of the image acquired by the first camera. In an embodiment of the present application, at least one phase focus position of the second camera may be determined according to one or more of the at least one phase focus position of the first camera.
It should be understood that the "in-focus position" in the embodiments of the present application may include a moving direction and/or a moving distance, where the moving direction indicates a direction in which a movable lens, a lens group or a lens in the camera needs to be moved in order to obtain a clear image; the movement distance indicates a distance that a movable lens, a lens group, or a lens in the camera needs to be moved in the movement direction in order to obtain a clear image. In other words, the in-focus position may indicate a position that a movable lens, lens group or lens in the camera needs to be in order to obtain a sharp image.
A method of determining a phase-in-focus position of a second camera from a phase-in-focus position of a first camera in an embodiment of the present application is described below.
For example, in one possible implementation: the phase focus position of the first camera may be converted into depth information or object distance of the object to be photographed, and then the phase focus position of the second camera may be calculated from the depth information or object distance.
For example, as shown in FIG. 4,
the imaging principle of the lens is as follows: 1/f is 1/u + 1/v.
Here u is called the image distance (corresponding to the distance between the imaging surface and the optical center of the lens group in the camera); v is called the object distance, i.e., the distance of the object from the optical center; f is the focal length of the lens group, and for a given lens group f is a constant.
The focusing process in the embodiment of the present application is a process of adjusting the image distance u.
When the position of the first camera is determined, i.e., the image distance u1 of the first camera is determined (corresponding to the phase focus position of the first camera), the current object distance v (the object distance at which the first camera's current image distance focuses clearly) can be calculated from the lens imaging principle and the focal length f1 of the first camera. The second camera synchronizing the position of the first camera means that the second camera is focused to the same object distance as the first camera, so the image distance u2 of the second camera can be calculated from the object distance v and the focal length f2 of the second camera; u2 is then the synchronized position of the second camera (corresponding to the phase focus position of the second camera). A numeric sketch of this conversion is given below.
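A minimal numeric sketch of this conversion, assuming illustrative focal lengths and image distances (the numbers are invented for the example, not taken from the patent):

```python
def sync_image_distance(u1: float, f1: float, f2: float) -> float:
    """Given the first camera's image distance u1 and focal length f1, return
    the image distance u2 at which the second camera (focal length f2) focuses
    the same object distance v, using 1/f = 1/u + 1/v for both lens groups."""
    v = 1.0 / (1.0 / f1 - 1.0 / u1)    # object distance the first camera is focused on
    return 1.0 / (1.0 / f2 - 1.0 / v)  # image distance the second camera needs

# Hypothetical values in millimetres:
# first camera: f1 = 4.0, focused at u1 = 4.1  ->  v = 164.0
# second camera: f2 = 6.0                      ->  u2 ~= 6.23
print(sync_image_distance(4.1, 4.0, 6.0))
```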
As another example, in another possible implementation: since the positions of the first camera and the second camera in the electronic device are fixed, in practical applications u1 and u2 generally have a certain mapping relationship, and the embodiments of the present application can determine the phase focus position of the second camera directly from the phase focus position of the first camera through that mapping, as in the lookup sketch below.
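The fixed mapping could equally be realized as a calibrated lookup table between the two lens positions, interpolated at run time. The table values below are invented for illustration.

```python
import bisect

# Hypothetical factory-calibrated pairs of lens positions (motor codes):
U1_CODES = [100.0, 200.0, 300.0, 400.0, 500.0]   # first camera
U2_CODES = [130.0, 245.0, 355.0, 470.0, 580.0]   # second camera

def map_first_to_second(u1_code: float) -> float:
    """Linearly interpolate the second camera's lens position from the first
    camera's, using the calibrated mapping; clamp outside the table."""
    i = bisect.bisect_left(U1_CODES, u1_code)
    if i == 0:
        return U2_CODES[0]
    if i == len(U1_CODES):
        return U2_CODES[-1]
    x0, x1 = U1_CODES[i - 1], U1_CODES[i]
    y0, y1 = U2_CODES[i - 1], U2_CODES[i]
    return y0 + (y1 - y0) * (u1_code - x0) / (x1 - x0)
```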
It should be understood that in the embodiments of the present application, "during the phase detection autofocus by the first camera" may refer to all or part of the period from when the first camera starts phase detection focusing until phase detection focusing ends. In other words, it may refer to the period before the first camera starts CDAF focusing; the embodiments of the present application are not limited thereto.
Specifically, the processor may determine at least one phase focus position of the second camera according to one or more phase focus positions of the first camera during the phase detection autofocus of the first camera, and control the second camera to move to the at least one phase focus position. The processor controls the focusing of the two cameras in parallel, so that the second camera, which does not support the PDAF function, synchronously follows the focusing behavior of the first camera, which does support the PDAF function. This realizes fast focusing of the second camera, reduces the overall focusing time, and improves user experience.
It should be understood that in the embodiment of the present application, the processor controls the second camera to move to the at least one phase focus position during the phase detection autofocus of the first camera; it can also be said that the processor controls the first camera and the second camera in parallel. Controlling the two cameras in parallel may be understood as controlling them simultaneously, but this is not limited to strictly identical timing; for example, a certain time interval is permitted between the movements of the two cameras, and the embodiments of the present application are not limited thereto.
For example, after the focusing position of the first camera is determined, the first camera may be controlled to move, then during the moving process of the first camera, the focusing position of the second camera is determined according to the focusing position of the first camera, and then the second camera is controlled to move. For another example, after the focusing position of the first camera is determined, the focusing position of the second camera may be determined according to the focusing position of the first camera, and then the processor controls the first camera and the second camera to move to the corresponding focusing positions respectively in parallel.
In particular, the processor may move the camera to the corresponding focus position by controlling the lens motor driver.
Therefore, in the embodiment of the application, the processor can control at least two cameras to focus in parallel, so that the camera which does not support the PDAF function synchronously follows the focusing behavior of the camera which does support the PDAF function. This realizes fast focusing of the camera that does not support PDAF, reduces the overall focusing time, and improves user experience.
It should be understood that in the embodiments of the present application, the one or more phase focus positions of the first camera may include one or a plurality of consecutive phase focus positions of the first camera during the phase detection autofocus. The one or more phase-focus positions may include a first phase-focus position, one or more intermediate phase-focus positions in a phase-focus process, or a last phase-focus position, and the embodiments of the present application are not limited thereto.
And 320, determining the contrast focusing position of the second camera according to the contrast information of the image detected by the second camera at one or more phase focusing positions in the at least one phase focusing position, wherein the contrast focusing position of the second camera comprises the contrast focusing moving direction of the second camera.
It should be understood that, in the embodiment of the present application, the contrast focus position is a focus position calculated based on contrast information of an image.
It should be understood that the embodiment of the present application may determine the contrast focusing moving direction according to the recorded contrast information of the images; for example, when the recorded contrast values are improving, the contrast focusing moving direction may be determined to be the direction of that improving trend.
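A simple way to read a moving direction out of the recorded contrast samples is to compare the most recent values against the direction of the last lens move; the sketch below is one assumed realization, not an algorithm specified by the patent.

```python
from typing import Optional

def contrast_trend_direction(positions: list[float],
                             contrasts: list[float]) -> Optional[int]:
    """Infer the contrast focusing moving direction from recorded
    (lens position, contrast) samples: if contrast improved over the last
    move, keep going the same way; if it worsened, the in-focus position
    lies the other way. Returns +1 or -1 along the lens axis, or None."""
    if len(contrasts) < 2:
        return None
    moved = positions[-1] - positions[-2]
    if moved == 0 or contrasts[-1] == contrasts[-2]:
        return None  # no move or no contrast change: direction undecidable
    step_sign = 1 if moved > 0 else -1
    return step_sign if contrasts[-1] > contrasts[-2] else -step_sign
```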
Specifically, in the embodiment of the present application, while the second camera synchronizes the first camera, the contrast focusing position of the second camera may be determined according to the contrast information of the images detected by the second camera at one or more of the at least one phase focusing position. That is to say, while the second camera synchronizes the first camera, the contrast information of the images detected by the second camera may be recorded, and the contrast focusing position of the second camera may be determined according to that contrast information. Merely synchronizing the first camera may leave the focusing moving direction of the second camera inaccurate, or may require the second camera to focus in a contrast focusing manner after the focusing of the first camera is finished. The embodiment of the application can ensure the accuracy of the focusing moving direction through the recorded contrast information, or directly determine the contrast focusing position through the recorded contrast information, thereby avoiding determining the in-focus position of the second camera again according to the focusing position of the first camera or in the CDAF manner, reducing the focusing time, and improving the user experience.
Optionally, as another embodiment, the method further includes:
controlling whether the second camera detects contrast information of an image or not according to the moving state of the electronic equipment and/or the image stable state of the second camera;
wherein the second camera is controlled to detect contrast information of the image in one or more of the at least one phase focus positions.
For example, when the moving state of the electronic equipment and/or the image stable state of the second camera satisfies the detection condition, the processor controls the second camera to detect the contrast information of the image; otherwise, when the moving state and/or the image stable state does not satisfy the detection condition, the processor does not control the second camera to detect the contrast information of the image.
For example, the detection condition is considered to be satisfied when the moving state of the electronic device is relatively stable or slow (for example, the gyroscope or the accelerometer of the electronic device detects that the motion of the electronic device is less than a preset movement threshold), and/or when the image is stable in the sense that its contrast changes little (for example, the contrast of the images of the second camera at the current focusing position and the previous focusing position changes by less than a preset contrast change threshold).
It should be understood that, when the detection condition is satisfied, the embodiment of the present application may also consider that the captured picture is still; when the detection condition is not satisfied, the captured picture may be considered to have changed. The embodiment of the present application is not limited thereto.
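A sketch of one possible form of this detection condition follows; the threshold values and the way the motion magnitude is read from the gyroscope or accelerometer are assumptions for illustration.

```python
MOVE_THRESHOLD = 0.05             # hypothetical motion-magnitude threshold
CONTRAST_CHANGE_THRESHOLD = 20    # hypothetical contrast-change threshold

def scene_is_still(motion_magnitude, prev_contrast, cur_contrast):
    """Detection condition: device steady and/or image stable."""
    device_steady = motion_magnitude < MOVE_THRESHOLD
    image_stable = abs(cur_contrast - prev_contrast) < CONTRAST_CHANGE_THRESHOLD
    # The text allows "and/or"; this sketch demands both for robustness.
    return device_steady and image_stable
```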
Specifically, in the embodiment of the present application, while the second camera synchronizes the first camera, whether the second camera detects contrast information of an image at the at least one phase focusing position may further be controlled according to the moving state of the electronic device and/or the image stable state of the second camera. For example, the processor determines the contrast information of the images detected at one or more of the at least one phase focusing position of the second camera, and determines the contrast focusing position of the second camera according to that contrast information.
That is to say, in the embodiment of the present application, while the second camera synchronizes the first camera, the contrast information of the image is detected only when the picture is determined to be still. Because the contrast information acquired from a still picture is relatively reliable, the processor can accurately determine the contrast focusing position from it. The embodiment of the application can thus ensure the accuracy of the focusing moving direction through the recorded contrast information, or directly determine the contrast focusing position through it, thereby avoiding determining the in-focus position of the second camera again according to the focusing position of the first camera or in the CDAF manner, reducing the focusing time, and improving the user experience.
It was described above that the processor controls the second camera to acquire the contrast information of an image, and to determine the contrast focusing position of the second camera, only in the case that the picture is still. Alternatively, the embodiment of the present application may detect contrast information of the image at all phase focusing positions of the second camera, and then extract the valid information from the detected contrast information.
Correspondingly, as another embodiment, the method of the embodiment of the present application further includes:
determining whether contrast information of an image detected by the second camera is valid according to the moving state of the electronic equipment and/or the image stable state of the second camera;
wherein contrast information of images detected by the second camera at one or more of the at least one phase-in-focus positions is valid.
For example, the contrast information of the image detected by the second camera is considered to be valid when the moving state of the electronic device is relatively stable or slow (for example, the gyroscope or the accelerometer of the electronic device detects that the motion of the electronic device is less than a preset movement threshold), and/or when the image is stable in the sense that its contrast changes little (for example, the contrast of the image detected by the second camera at the current focusing position changes by less than a preset contrast change threshold), that is, when the shooting picture is still.
It should be understood that in the embodiments of the present application, the contrast information detected by the second camera is valid at one or more of the at least one phase focusing position, and where it is valid at a plurality of positions, those positions form consecutive valid information; that is, the second camera keeps shooting a still picture over that plurality of phase focusing positions.
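A sketch of this "record everything, keep the valid part" alternative follows, assuming each sample carries a validity flag set from the above check (the data layout and function name are assumptions):

```python
def consecutive_valid(samples):
    """samples: list of (position, contrast, valid). Returns the trailing
    run of consecutive valid samples, i.e. those taken while the picture
    stayed still; an invalid sample (scene change) breaks the run."""
    run = []
    for pos, contrast, valid in samples:
        if valid:
            run.append((pos, contrast))
        else:
            run = []
    return run
```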
That is to say, in the embodiment of the present application, while the second camera synchronizes the first camera, the contrast information of the image is detected, and it is determined to be valid when the picture is still. Because the contrast information acquired from a still picture is relatively reliable, the processor can accurately determine the contrast focusing position from it. Further, the embodiment of the application can ensure the accuracy of the focusing moving direction through the recorded contrast information, or directly determine the contrast focusing position through it, thereby avoiding determining the in-focus position of the second camera again according to the focusing position of the first camera or in the CDAF manner, reducing the focusing time, and improving the user experience.
Optionally, as another embodiment, the contrast in-focus position of the second camera further includes a contrast in-focus movement distance of the second camera, and the method further includes:
controlling the second camera to move to the contrast in-focus position.
It should be understood that, in the case that the recorded curve of the contrast information of the images acquired by the second camera has a peak, the in-focus position of the second camera (i.e. the contrast focusing moving direction and moving distance) can be determined from the recorded contrast information: for example, the in-focus position may be the position corresponding to the peak, or the in-focus position may be obtained by curve fitting of the contrast information.
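As one concrete fitting choice (an assumption, since the embodiment does not mandate a particular fit), a parabola through the peak sample and its two neighbours gives the in-focus position as the vertex:

```python
def fit_in_focus(samples):
    """samples: list of (position, contrast). Returns the estimated
    in-focus position, or None if the peak is not yet bracketed by the
    recorded curve (in which case the search, e.g. CDAF, continues)."""
    contrasts = [c for _, c in samples]
    k = contrasts.index(max(contrasts))
    if k == 0 or k == len(samples) - 1:
        return None  # peak at an end of the record: not bracketed
    (x0, y0), (x1, y1), (x2, y2) = samples[k - 1], samples[k], samples[k + 1]
    # Coefficients of the parabola y = a*x^2 + b*x + c through the 3 points;
    # the vertex -b/(2a) is the fitted in-focus position.
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2**2 * (y0 - y1) + x1**2 * (y2 - y0) + x0**2 * (y1 - y2)) / denom
    return -b / (2 * a)
```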
In addition, when it is determined that the in-focus position of the second camera cannot be calculated from the recorded contrast information of the images acquired by the second camera, the lens of the second camera is controlled to move to an in-focus position determined in the CDAF manner, and how to carry out the CDAF focusing can be decided according to the recorded contrast information. For example, if the recorded contrast values are rising, the lens may continue in the current direction in the CDAF manner to determine the final in-focus position; conversely, if the recorded contrast values are falling, the lens may be moved in the reverse direction in the CDAF manner to determine the final in-focus position.
That is to say, in the embodiment of the present application, while the second camera synchronizes the phase detection autofocus of the first camera, as long as the contrast focusing moving direction and moving distance of the second camera can be determined from the recorded contrast information of the image, the second camera may be directly controlled to move that distance in that direction, completing its focusing, regardless of whether the first camera has finished focusing.
It should be understood that the principle of CDAF focusing is to determine the focusing position from the change in sharpness of the focused object; specifically, after the picture of the photographed object has gone through a "rise and fall" in sharpness, the CDAF algorithm can obtain the most appropriate focusing position. Taking shooting a coin in the CDAF manner as an example, the initial frame is out of focus; as the lens moves, the coin on the screen gradually becomes clear. At a certain position the coin is at its clearest (the in-focus state), but the camera cannot know this at that moment, so the lens continues to move and the coin becomes blurry again. The camera module then recognizes that the lens has overshot and returns it to the position where the picture was clearest, completing one focusing pass. The CDAF manner therefore has a long focusing time and blurry pictures during focusing, giving a poor user experience.
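A minimal CDAF hill-climb in the spirit of the coin example can be sketched as follows, reusing fit_in_focus from the sketch above; the measure_contrast callback, the step length, and the motor code range are assumptions.

```python
def cdaf(measure_contrast, start, direction, step=15, limit=1023):
    """Fixed-step contrast search: walk until sharpness drops (overshoot),
    then fit around the peak. measure_contrast(pos) -> contrast at pos."""
    samples = [(start, measure_contrast(start))]
    pos = start
    while 0 <= pos + direction * step <= limit:
        pos += direction * step
        c = measure_contrast(pos)
        samples.append((pos, c))
        if c < samples[-2][1]:  # sharpness fell: the peak has been passed
            if len(samples) >= 3:
                return fit_in_focus(samples[-3:])
            return samples[-2][0]  # too few samples: return the best seen
    return pos  # reached the mechanical end without a drop
```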
Specifically, in the embodiment of the present application, while the second camera synchronizes the first camera, the contrast focusing moving direction and the contrast focusing moving distance of the second camera may be determined according to the contrast information of the images detected by the second camera at one or more of the at least one phase focusing position. Because the in-focus position of the second camera is then known, the processor can directly control the second camera to move the contrast focusing moving distance in the contrast focusing moving direction, completing the focusing of the second camera without further controlling it to synchronize the first camera, and without controlling it to determine the in-focus position in the CDAF manner, which reduces the focusing time and improves the user experience.
Optionally, as another embodiment, the method further includes:
determining a next phase focusing position of the second camera according to the next phase focusing position of the first camera, wherein the next phase focusing position of the second camera comprises a phase focusing moving direction and a phase focusing moving distance of the second camera;
and when the contrast focusing moving direction is consistent with the phase focusing moving direction, controlling the second camera to move to the next phase focusing position.
Specifically, during the phase detection autofocus of the first camera (i.e., before the phase focusing of the first camera has finished), after the first camera has moved to the current phase focusing position, the first camera is controlled to move to a next phase focusing position if the distance is greater than the first threshold, as described above for step 230. The next phase focusing position of the second camera is determined according to the next phase focusing position of the first camera, and when the contrast focusing moving direction is consistent with the phase focusing moving direction, the second camera is controlled to move to that next phase focusing position. This process is repeated until the phase focusing of the first camera is finished, or until the contrast focusing moving direction and moving distance of the second camera can be determined from the recorded contrast information.
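A sketch of one round of this loop, reusing contrast_direction and POSITION_RATIO from the sketches above (the PD sign convention follows the figure 5 example below; the function name and return convention are assumptions):

```python
def follow_or_break(pd_value, samples, current_pos):
    """pd_value: first camera's PD output (signed distance to its in-focus
    position). Returns ('sync', next_pos) to keep following the first
    camera, or ('cdaf', contrast_dir) when the directions conflict."""
    phase_dir = 1 if pd_value > 0 else -1
    contrast_dir = contrast_direction(samples)
    if contrast_dir is None or contrast_dir == phase_dir:
        next_pos = current_pos + round(pd_value * POSITION_RATIO)
        return ('sync', next_pos)
    return ('cdaf', contrast_dir)  # stop syncing; CDAF in the contrast direction
```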
Optionally, as another embodiment, after the phase detection automatic focusing of the first camera is finished, or when the contrast focusing moving direction is not consistent with the phase focusing moving direction, the second camera is controlled to move a preset distance in the contrast focusing moving direction.
Specifically, when the phase focusing moving direction is not consistent with the contrast focusing moving direction, the embodiment of the present application may stop the behavior of the second camera in synchronization with the first camera, and directly control the second camera to move a preset distance in the contrast focusing moving direction by using a CDAF method, for example, move in the contrast focusing moving direction by using a fixed step length of the CDAF to perform CDAF focusing.
Alternatively, after the phase detection autofocus of the first camera is finished, the second camera is controlled to move a preset distance in the contrast focusing moving direction in the CDAF manner, for example, moving in the contrast focusing moving direction with the fixed step length of the CDAF to perform CDAF focusing.
Therefore, in the embodiment of the application, the contrast focusing moving direction of the second camera is determined from the contrast information of the image while the second camera synchronizes the first camera. When the phase focusing direction of the second camera is wrong, i.e., opposite to the contrast focusing direction, the second camera stops synchronizing the first camera and CDAF is adopted directly in the contrast focusing direction. This avoids unnecessary movement of the second camera and ensures its accurate and fast focusing.
Alternatively, in the embodiment of the application, the contrast focusing moving direction of the second camera is determined from the contrast information of the image while the second camera synchronizes the first camera, so that after the focusing of the first camera is finished, CDAF focusing can be adopted directly in the contrast focusing direction. This prevents the second camera from starting its CDAF movement in a random direction and ensures its accurate and fast focusing.
The following describes the focusing process of the embodiment of the present application when the picture is still, with reference to the specific examples of fig. 5 to fig. 7.
For example, as shown in fig. 5, when the first camera begins autofocus, assume the initial positions of the first camera and the second camera are both 0. It is understood that the position of the first camera may represent the position of the movable lens or lens group of the first camera along its direction of movement (for example, when the electronic device is a mobile phone, this may be the direction perpendicular to the display screen of the mobile phone), and the position of the second camera may likewise represent the position of the movable lens or lens group of the second camera along its direction of movement. It should be understood that although the positions of the first camera and the second camera are both 0, the actual spatial positions of the two cameras are different. It should also be understood that the positions of the first camera and the second camera are both set to 0 for convenience of description; the embodiment of the present application is not limited thereto, and the initial positions of the first camera and the second camera may be different in practical applications.
Assume that the position ratio when synchronizing from the first camera to the second camera is 1:1. It should be understood that in actual use this ratio is fixed, but its value may differ because the first camera and the second camera differ; for example, the ratio may be greater than or less than 1:1, and the embodiment of the present application is not limited thereto. Assume that the contrast value obtained when the first camera is at position 0 is 100, written as [0,100] as shown in fig. 5 (it should be understood that the contrast value of the first camera may be ignored in the embodiment of the present application; it is given only for uniformity of description, and the embodiment of the present application is not limited thereto). The position and contrast of the second camera are [0,300]. With the cameras at position 0, the PD of the first camera gives an in-focus value of 80 (the value given by the PD is the distance and direction from the current position to the in-focus position; here a positive value denotes the positive, rightward direction and a negative value denotes the negative, leftward direction). This is synchronized to the second camera, whose in-focus position is therefore also 80, and both cameras are pushed to 80. At position 80, the contrast value obtained by the first camera is 200 and that obtained by the second camera is 450 (once two or more pairs of position code and contrast value are stored, the contrast focusing moving direction can be judged). Because the contrast value of the second camera is rising, the direction indicated by the contrast (the contrast focusing moving direction) is the positive direction, and the search continues in the current direction; otherwise the direction would be reversed. At this point the PD gives a positive value, i.e. the phase focusing moving direction coincides with the contrast focusing moving direction, so the position of the first camera continues to be synchronized, and the first camera and the second camera move to position 100. At position 100, the contrast of the second camera is 400, which has decreased relative to position 80, so the contrast focusing moving direction is now negative. Since the contrast curve has a peak (around position 80), the in-focus position can be obtained by curve fitting (i.e. obtaining the moving distance in the moving direction), or by CDAF from the current position toward position 80.
It should be understood that the CDAF manner in this embodiment refers to obtaining one contrast value per step with a fixed step length, and obtaining the in-focus position by curve fitting once the curve shows a peak. Suppose CDAF is used to focus from position 100 toward position 80 with the step length set to 15 (the step length differs between algorithms, but is a fixed value): the contrast value [100,400] is saved at position 100, the lens is pushed by 15 to position 85 to obtain the contrast value [85,445], and then pushed by 15 to position 70 to obtain the contrast value [70,x]. If x is smaller than 445, position 85 is the peak, the in-focus position can be obtained by curve fitting, and pushing the motor to that position finishes the focusing. If x is larger than 445, the lens continues to be pushed to the left with the step length of 15 until the contrast value at some position is smaller than the previous one, at which point the in-focus position is obtained by curve fitting.
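Replaying the figure 5 numbers through the sketches above gives a quick sanity check (the exact fitted value depends on the fitting choice, which is an assumption here):

```python
samples = [(0, 300), (80, 450), (100, 400)]
print(contrast_direction(samples))   # -1: contrast fell from 450 to 400
print(round(fit_in_focus(samples)))  # ~61 for this coarse, unevenly spaced
# 3-point fit; a real implementation would refine around code 80 with the
# finer fixed-step CDAF walk described in the text.
```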
Because the in-focus position of the second camera (i.e. the contrast focusing moving direction and moving distance) can be determined from the recorded contrast information, the second camera can be directly controlled to move to the in-focus position in that moving direction, completing the focusing of the second camera.
For another example, as shown in fig. 6, when the first camera is at position 0, the PD gives a value of 80, so the first camera next needs to push to position 80. Setting the contrast of the second camera at position 0 to 300, the second camera synchronizes the position of the first camera and pushes its motor to 80. At position 80, the PD value of the first camera is 20, and the contrast value obtained by the second camera at position 80 is 450. Since the contrast value of the second camera is rising, the direction judged from the contrast is positive, to the right; the PD of the first camera gives 20, also positive to the right, so the direction of synchronizing the first camera coincides with the direction derived from the contrast saved by the second camera, and the synchronization continues. At position 100, suppose the PD value obtained by the first camera is -5 and the contrast value obtained by the second camera is 500. The direction of synchronizing the first camera (i.e. the phase focusing moving direction of the second camera) is now negative, to the left, while the contrast value is still rising, so the direction derived from the contrast (i.e. the contrast focusing moving direction of the second camera) is positive, to the right. Because the two directions are inconsistent, the second camera stops synchronizing the first camera and continues to focus to the right in the CDAF manner.
It should be understood that in the example of fig. 6 the PD direction fails to coincide with the contrast direction at the third step. Alternatively, if the contrast of the second camera at position 80 in the second step were less than 300, i.e. if the PD direction failed to coincide with the contrast direction at the second step, the second camera would start CDAF focusing to the left at the second step.
For another example, as shown in fig. 7, the first steps are similar to those in fig. 6. In the third step, suppose the value given by the PD is 5; the contrast of the second camera is rising, so the contrast direction coincides with the direction given by the PD, positive to the right, and the second camera still synchronizes the behavior of the first camera, pushing its motor to 105 in the third step. At this point the first camera is very close to the in-focus position because the PD value is small (assuming the PD convergence threshold is greater than 5), so its focusing can be finished; the first camera may be pushed to 105 or may hold the current position, depending on the algorithm. When the second camera has been pushed to 105, it can no longer synchronize the first camera because the first camera has finished focusing, and it must judge the subsequent flow by itself. If the contrast value x acquired at 105 is less than 500, position 100 is the contrast peak, and curve fitting can be carried out directly to find the peak point of the second camera, or the second camera can run CDAF from 105 toward 100 with the fixed step length. If x is larger than 500, the peak point is still further to the right, and the second camera will focus to the right in the CDAF manner with the fixed step length from the current position.
It should be appreciated that, as another example, if the first camera motor is already at the in-focus position in the first step, the second camera switches directly to CDAF with the fixed step length, and the direction may be left or right depending on the implementation.
It should be understood that in the embodiment of the present application, after the first camera and the second camera each complete focusing, the processor may acquire images through the first camera and the second camera respectively and synthesize them into a final image. Optionally, where the electronic device has a display screen, the electronic device may also display the final image on the display screen.
It should be understood that, the image processing process after the two cameras finish focusing respectively may refer to an existing image synthesis algorithm of multiple cameras, and this is not limited in the embodiments of the present application.
The specific focusing process of the embodiment of the present application was described above with reference to fig. 2 to fig. 7. The focusing method according to the embodiment of the present application is described in detail below with reference to the specific example of fig. 8. The method of fig. 8 may be applied to the above-mentioned electronic device including at least two cameras, where a first camera of the at least two cameras supports the phase detection autofocus (PDAF) function and a second camera does not support the PDAF function. The method illustrated in fig. 8 may be performed by a processor of the electronic device. It should be understood that fig. 8 describes only an example of focusing with two cameras (dual camera); the embodiments of the present application are not limited thereto, and the focusing process is similar when the electronic device includes three or more cameras, which is not repeated here.
The method 800 shown in FIG. 8 includes:
801, dual-camera focusing is started.
810, a picture change is determined.
Specifically, for the description of determining a picture change or a still picture, reference may be made to the description above, which is not repeated here.
811, the first camera performs phase detection autofocus.
It should be understood that the process of the first camera performing the phase detection auto-focusing can refer to the description in fig. 2, and is not described herein again to avoid repetition.
812, the second camera synchronizes the position of the first camera.
Specifically, the second camera synchronizes the position of the first camera and does not record image contrast information of the second camera.
813, it is judged whether the picture is still.
In case the picture is still, step 814 is performed, otherwise step 810 is performed.
814, it is determined whether the first camera is in focus.
Step 815 and step 816 are performed in case the first camera is in focus, and step 820 is performed in case the first camera is out of focus.
815, the first camera completes focusing.
816, the second camera performs CDAF focusing.
It should be understood that steps 810 to 816 describe the focusing process in case of a picture change, and in particular, in case of a picture change, the second camera synchronizes the behavior of the first camera without recording the image contrast information of the second camera.
Therefore, in the embodiment of the application, the camera which does not support the PDAF function synchronously follows the focusing behavior of the camera which does support the PDAF function, realizing fast focusing of the camera which does not support the PDAF function, reducing the overall focusing time, and improving the user experience.
820, it is determined that the picture is still.
Specifically, for the description of determining a picture change or a still picture, reference may be made to the description above, which is not repeated here.
821, the first camera performs phase detection auto-focus.
It should be understood that the process of the first camera performing the phase detection auto-focusing can refer to the description in fig. 2, and is not described herein again to avoid repetition.
822, the second camera synchronizes the position of the first camera.
Specifically, the second camera synchronizes the position of the first camera and records image contrast information of the second camera.
823, it is determined whether the contrast focusing moving direction can be calculated from the recorded image contrast information.
If the contrast focus moving direction can be calculated, step 824 is executed, and if the contrast focus moving direction cannot be calculated, step 822 is executed.
824, it is determined whether the contrast focus movement distance can be calculated from the recorded image contrast information.
If the contrast in-focus moving distance can be calculated, step 825 is executed, and if the contrast in-focus moving distance cannot be calculated, step 826 is executed.
825, the second camera is pushed to the in-focus position (i.e., moved the contrast focusing moving distance in the contrast focusing moving direction).
826, it is judged whether the phase focusing moving direction of the second camera coincides with the contrast focusing moving direction.
Step 822 is performed in the case of coincidence, and step 829 is performed in the case of non-coincidence.
827, it is determined whether the first camera satisfies the phase focusing self-convergence condition.
Step 828 is executed if the first camera phase focus self-convergence is achieved, and steps 821 and 826 are executed if the first camera does not satisfy the phase focus self-convergence.
Alternatively, as another embodiment, in the method shown in fig. 8, similar to fig. 2, in a case that the number of times that the PD self-convergence is not satisfied reaches a preset repetition threshold, the phase focusing process of the first camera may be stopped, and then the CDAF is used to determine the in-focus position of the first camera.
828, the first camera completes focusing.
829, the second camera is controlled to perform CDAF focusing in the contrast focusing moving direction.
It should be understood that steps 820 to 829 describe the focusing process in the case of a still picture; specifically, in that case the second camera synchronizes the behavior of the first camera and records the image contrast information of the second camera, and the contrast focusing position of the second camera is determined based on the recorded contrast information (the contrast focusing position including the contrast focusing moving direction, or the contrast focusing moving direction and the contrast focusing moving distance).
That is to say, in the embodiment of the present application, while the second camera synchronizes the first camera, the contrast information of the image is detected when the picture is determined to be still. As described above, because the contrast information acquired from a still picture is relatively reliable, the processor can accurately determine the contrast focusing position from it, ensure the accuracy of the focusing moving direction, or directly determine the contrast focusing position, thereby avoiding determining the in-focus position of the second camera again according to the focusing position of the first camera or in the CDAF manner, reducing the focusing time, and improving the user experience.
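Putting the pieces together, the figure 8 flow for the second camera can be sketched as one scheduling decision per frame, reusing the helper sketches above; the action tags and argument layout are assumptions for illustration, not the claimed method.

```python
def dual_focus_step(pd_value, pd_converged, still, samples, pos):
    """Decide the second camera's next action while the first camera runs
    PDAF. samples: recorded (position, contrast) pairs for the second
    camera, cleared whenever the scene changes (steps 810-816)."""
    if not still:
        samples.clear()  # changed scene: recorded contrast is unreliable
        if pd_converged:
            return ('cdaf', None)  # step 816: plain CDAF after PDAF ends
        return ('sync', pos + round(pd_value * POSITION_RATIO))
    peak = fit_in_focus(samples) if len(samples) >= 3 else None
    if peak is not None:
        return ('push', peak)  # steps 824-825: direction and distance known
    action = follow_or_break(pd_value, samples, pos)  # step 826 check
    if pd_converged and action[0] == 'sync':
        return ('cdaf', contrast_direction(samples))  # steps 827-829
    return action
```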
It should be understood that the above examples of fig. 2 to 8 are only for assisting the skilled person in understanding the embodiments of the present invention, and are not intended to limit the embodiments of the present invention to the specific values or specific scenarios illustrated. It will be apparent to those skilled in the art that various equivalent modifications or variations are possible in light of the examples given in figures 2 through 8, and such modifications or variations are also within the scope of the embodiments of the invention.
It should be understood that the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiment of the present invention.
The focusing method according to the embodiment of the present invention has been described in detail with reference to fig. 2 to fig. 8. The processor according to the embodiment of the present invention is described below with reference to fig. 9, the electronic device with reference to fig. 10, and the mobile phone with reference to fig. 11.
FIG. 9 is a schematic block diagram of a processor according to one embodiment of the present application. The processor 900 shown in fig. 9 includes a processing unit 910 and a storage unit 920. The storage unit 920 is configured to store code, and the processing unit 910 is configured to execute the code in the storage unit 920 to perform the methods shown in fig. 2 to fig. 8. For the specific method implemented by the processor, reference may be made to the descriptions of fig. 2 to fig. 8 above, which are not repeated here.
It should be understood that the processor 900 may also be referred to as an image signal processor, an image processing unit, a processing unit or a processing module, etc. The processor 900 may be a CPU of the electronic device, and the image signal processor may also be a separate device different from the CPU, which is not limited to this embodiment of the application.
It should be noted that the processor in the embodiments of the present invention may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM, an EPROM, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
It will be appreciated that the storage unit in embodiments of the invention may also be referred to as a memory, which may be volatile memory or non-volatile memory, or may include both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), SynchLink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
FIG. 10 is a schematic block diagram of an electronic device according to one embodiment of the present application. The electronic device 1000 shown in fig. 10 comprises a processor 1010 and at least two cameras 1020, wherein a first camera 1021 of the at least two cameras supports a phase detection autofocus PDAF function and a second camera 1022 does not support the PDAF function.
It should be understood that the processor 1010, like the processor 900 shown in fig. 9, can implement the functions of the methods shown in fig. 2 to fig. 8, and may include all the modules or units for implementing those methods, which are not repeated here.
It should be understood that the electronic device in the embodiment of the present application may further include other modules, and an example in which the electronic device in the embodiment of the present application is a mobile phone is described below with reference to fig. 11.
Specifically, fig. 11 is a block diagram showing a partial structure of a cellular phone 1100 according to an embodiment of the present invention. Referring to fig. 11, a cell phone 1100 includes, among other components, Radio Frequency (RF) circuitry 1110, memory 1120, other input devices 1130, a display 1140, sensors 1150, audio circuitry 1160, an I/O subsystem 1170, a processor 1180, a power supply 1190, and at least two cameras 11100.
Those skilled in the art will appreciate that the handset configuration shown in fig. 11 is not limiting, and the handset may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. Those skilled in the art will also appreciate that the display 1140 is part of the user interface (UI), and that the cell phone 1100 may include fewer user interface elements than shown, or the same.
The various components of cell phone 1100 will now be described in detail with reference to fig. 11:
The RF circuit 1110 may be used for receiving and transmitting signals during a message transmission or a call; in particular, it receives downlink messages from a base station and passes them to the processor 1180 for processing, and it transmits uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 1110 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 1120 may be used to store software programs and modules, and the processor 1180 may execute various functional applications and data processing of the mobile phone 1100 by operating the software programs and modules stored in the memory 1120. The memory 1120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the stored data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone 1100, and the like. Further, the memory 1120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
Other input devices 1130 may be used to receive entered numeric or character information and generate key signal inputs relating to user settings and function controls of the handset 1100. In particular, other input devices 1130 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or is an extension of a touch-sensitive surface formed by a touch screen), and the like. Other input devices 1130 are connected to other input device controllers 1171 of the I/O subsystem 1170 and interact with the processor 1180 in signals under the control of the other device input controllers 1171.
The display 1140 may be used to display information entered by or provided to the user as well as various menus of the cell phone 1100, and may also accept user input. Specifically, the display 1140 may include a display panel 1141 and a touch panel 1142. The display panel 1141 may be implemented as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel 1142, also referred to as a touch screen or touch-sensitive screen, may collect contact or non-contact operations on or near it (e.g., operations performed by a user on or near the touch panel 1142 using a finger, a stylus, or any other suitable object or accessory, which may also include body-sensing operations, and which include single-point and multi-point control operations), and drive the corresponding connection devices according to a preset program. Optionally, the touch panel 1142 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch direction and gesture of the user, detects the signals produced by the touch operation, and transmits the signals to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into information that the processor can handle, sends it to the processor 1180, and receives and executes commands sent by the processor 1180. In addition, the touch panel 1142 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave, or by any technology developed in the future. Further, the touch panel 1142 may cover the display panel 1141; the user can operate on or near the touch panel 1142 according to the content displayed on the display panel 1141 (the displayed content including, but not limited to, a soft keyboard, a virtual mouse, virtual keys, icons, etc.), the touch panel 1142 detects the operation and transmits it through the I/O subsystem 1170 to the processor 1180 to determine the user input, and the processor 1180 then provides the corresponding visual output on the display panel 1141 through the I/O subsystem 1170 according to the user input. Although in fig. 11 the touch panel 1142 and the display panel 1141 are shown as two separate components implementing the input and output functions of the cell phone 1100, in some embodiments the touch panel 1142 and the display panel 1141 may be integrated to implement those functions.
The cell phone 1100 can also include at least one sensor 1150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 1141 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 1141 and/or the backlight when the mobile phone 1100 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for the other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone 1100, the detailed description is omitted.
The audio circuitry 1160, the speaker 1161, and the microphone 1162 may provide an audio interface between the user and the cell phone 1100. On one hand, the audio circuit 1160 may convert received audio data into a signal and transmit it to the speaker 1161, which converts it into a sound signal for output; on the other hand, the microphone 1162 converts collected sound signals into electrical signals, which the audio circuit 1160 receives and converts into audio data, and the audio data is then output to the RF circuit 1110 for transmission to, for example, another cell phone, or to the memory 1120 for further processing.
The I/O subsystem 1170 is used to control external input and output devices, and may include the other input device controller 1171, a sensor controller 1172, a display controller 1173, and an image signal processor 1174. Optionally, the image signal processor 1174 is configured to control the at least two cameras 11100 to capture objects and to perform the focusing method shown in fig. 2 to fig. 8. The one or more other input device controllers 1171 receive signals from and/or send signals to the other input devices 1130, which may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels, and light mice (a light mouse is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by a touch screen). It is noted that the other input device controllers 1171 may be connected to any one or more of the above devices. The display controller 1173 in the I/O subsystem 1170 receives signals from and/or sends signals to the display screen 1140. After the display screen 1140 detects a user input, the display controller 1173 converts the detected user input into interaction with the user interface objects displayed on the display screen 1140, i.e., realizes human-machine interaction. The sensor controller 1172 may receive signals from and/or send signals to the one or more sensors 1150.
The processor 1180 is the control center of the mobile phone 1100; it connects the various parts of the whole phone using various interfaces and lines, and performs the various functions and data processing of the mobile phone 1100 by running or executing the software programs and/or modules stored in the memory 1120 and calling the data stored in the memory 1120, thereby monitoring the phone as a whole. Optionally, the processor 1180 may include one or more processing units; preferably, the processor 1180 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 1180. Alternatively, the image signal processor may also be integrated in the processor 1180, and the embodiment of the present application is not limited thereto.
The cell phone 1100 also includes a power supply 1190 (e.g., a battery) for providing power to various components, which may be logically coupled to the processor 1180 via a power management system to facilitate managing charging, discharging, and power consumption via the power management system.
Although not shown, the mobile phone 1100 may also include a bluetooth module or the like, which will not be described in detail herein.
Embodiments of the present invention also provide a computer-readable medium, on which a computer program is stored, which, when executed by a computer, implements the method of any of the above-described method embodiments.
The embodiment of the invention also provides a computer program product, and the computer program product realizes the method of any one of the method embodiments when being executed by a computer.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The processes or functions described in accordance with the embodiments of the present invention occur, in whole or in part, when the computer instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored on a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wire (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that includes one or more of the available media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a Digital Video Disk (DVD)), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
It should be understood that the image signal processor may be a chip, the processor may be implemented by hardware or may be implemented by software, and when implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory, which may be integrated in the processor, located external to the processor, or stand-alone.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. An electronic device, comprising:
a processor and at least two cameras;
a first camera of the at least two cameras is used for detecting phase information of an image;
the second camera of the at least two cameras is used for detecting contrast information of the image;
the processor is configured to:
in the process of carrying out phase detection automatic focusing by the first camera, determining at least one phase focusing position of the second camera according to one or more phase focusing positions of the first camera, and controlling the second camera to move to the at least one phase focusing position;
determining a contrast in focus position of the second camera based on contrast information of images detected by the second camera at one or more of the at least one phase in focus position, wherein the contrast in focus position of the second camera comprises a contrast in focus movement direction of the second camera.
2. The electronic device of claim 1, wherein the processor is further configured to:
control, according to a movement state of the electronic device and/or an image stability state of the second camera, whether the second camera detects contrast information of an image,
wherein the processor controls the second camera to detect contrast information of the image at one or more of the at least one phase focusing position.
3. The electronic device of claim 1, wherein the processor is further configured to:
determine, according to a movement state of the electronic device and/or an image stability state of the second camera, whether contrast information of an image detected by the second camera is valid,
wherein contrast information of images detected by the second camera at one or more of the at least one phase focusing position is valid.
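Illustrative note (not claim language): claims 2 and 3 condition the contrast measurement on how steady the device and the preview image are. A rough sketch of such gating follows; the gyroscope input, the stability metric, and both threshold values are invented for the example.

    # Hypothetical gating for claims 2 and 3; thresholds and inputs are invented.

    GYRO_LIMIT = 0.05       # rad/s; above this the device counts as moving
    STABILITY_LIMIT = 0.9   # normalized frame-to-frame similarity

    def should_measure_contrast(gyro_rate, image_stability):
        """Claim 2: decide whether the second camera measures contrast at all."""
        return gyro_rate < GYRO_LIMIT and image_stability > STABILITY_LIMIT

    def contrast_sample_valid(gyro_rate, image_stability):
        """Claim 3: decide whether an already-taken contrast sample is trusted."""
        return should_measure_contrast(gyro_rate, image_stability)

    print(should_measure_contrast(gyro_rate=0.20, image_stability=0.95))  # False: shaky
    print(should_measure_contrast(gyro_rate=0.01, image_stability=0.97))  # True: steady

The difference between the two claims is where the gate sits: claim 2 suppresses the measurement itself, while claim 3 discards measurements after the fact; a real pipeline might need both, since motion can begin mid-exposure.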
4. The electronic device of claim 1, wherein the contrast in-focus position of the second camera further comprises a contrast focusing movement distance of the second camera, and the processor is further configured to:
control the second camera to move to the contrast in-focus position.
5. The electronic device of claim 1, wherein the processor is further configured to:
determine a next phase focusing position of the second camera according to a next phase focusing position of the first camera, wherein the next phase focusing position of the second camera comprises a phase focusing movement direction and a phase focusing movement distance of the second camera; and
when the contrast focusing movement direction is consistent with the phase focusing movement direction, control the second camera to move to the next phase focusing position.
6. The electronic device of claim 5, wherein the processor is further configured to:
when the phase detection autofocus of the first camera is finished, or when the contrast focusing movement direction is inconsistent with the phase focusing movement direction, control the second camera to move a preset distance in the contrast focusing movement direction.
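Illustrative note (not claim language): claims 4 to 6 arbitrate the two cues. If the contrast-derived direction agrees with the next phase-derived move, the second camera follows the phase position; if phase detection has finished or the directions disagree, it falls back to a preset step along the contrast direction (and, per claim 4, ultimately to the contrast in-focus position itself). A minimal sketch of that decision, with the step size and all names invented:

    # Hypothetical arbitration for claims 4-6; PRESET_STEP is an invented value.

    PRESET_STEP = 8  # actuator codes moved when falling back to the contrast cue

    def next_move(contrast_dir, phase_dir, next_phase_pos, current_pos, pdaf_done):
        """Pick the second camera's next target position (directions are +1/-1)."""
        if not pdaf_done and contrast_dir == phase_dir:
            # Claim 5: cues agree -> follow the next phase focusing position.
            return next_phase_pos
        # Claim 6: PDAF finished or cues disagree -> preset step along contrast.
        return current_pos + contrast_dir * PRESET_STEP

    print(next_move(+1, +1, next_phase_pos=55, current_pos=48, pdaf_done=False))  # 55
    print(next_move(-1, +1, next_phase_pos=55, current_pos=48, pdaf_done=False))  # 40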
7. A focusing method, applied to an electronic device comprising at least two cameras, wherein a first camera of the at least two cameras is configured to detect phase information of an image and a second camera of the at least two cameras is configured to detect contrast information of an image, the method comprising:
during phase detection autofocus performed by the first camera, determining at least one phase focusing position of the second camera according to one or more phase focusing positions of the first camera, and controlling the second camera to move to the at least one phase focusing position; and
determining a contrast in-focus position of the second camera based on contrast information of images detected by the second camera at one or more of the at least one phase focusing position, wherein the contrast in-focus position of the second camera comprises a contrast focusing movement direction of the second camera.
8. The method of claim 7, further comprising:
controlling, according to a movement state of the electronic device and/or an image stability state of the second camera, whether the second camera detects contrast information of an image,
wherein the second camera is controlled to detect contrast information of the image at one or more of the at least one phase focusing position.
9. The method of claim 7, further comprising:
determining, according to a movement state of the electronic device and/or an image stability state of the second camera, whether contrast information of an image detected by the second camera is valid,
wherein contrast information of images detected by the second camera at one or more of the at least one phase focusing position is valid.
10. The method of claim 7, wherein the contrast in-focus position of the second camera further comprises a contrast focusing movement distance of the second camera, and the method further comprises:
controlling the second camera to move to the contrast in-focus position.
11. The method of claim 7, further comprising:
determining a next phase focusing position of the second camera according to a next phase focusing position of the first camera, wherein the next phase focusing position of the second camera comprises a phase focusing movement direction and a phase focusing movement distance of the second camera; and
when the contrast focusing movement direction is consistent with the phase focusing movement direction, controlling the second camera to move to the next phase focusing position.
12. The method of claim 11, further comprising:
when the phase detection autofocus of the first camera is finished, or when the contrast focusing movement direction is inconsistent with the phase focusing movement direction, controlling the second camera to move a preset distance in the contrast focusing movement direction.
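Illustrative note (not claim language): method claims 7 to 12 mirror the device claims, so their steps compose into a single loop. The self-contained sketch below traces it end to end; the contrast curve, the mapped positions, and the preset step are all invented for the example.

    # Hypothetical end-to-end trace of claims 7-12; every value is invented.

    PRESET_STEP = 8

    def contrast_at(pos, peak=52.0):
        return 1.0 / (1.0 + (pos - peak) ** 2)  # toy stand-in for camera 2

    def focus_second_camera(mapped_positions, pos):
        samples = []
        for i, target in enumerate(mapped_positions):
            phase_dir = +1 if target >= pos else -1
            pos = target                              # claim 7: follow phase position
            samples.append((pos, contrast_at(pos)))   # claims 8/9: valid contrast samples
            if len(samples) < 2:
                continue
            (pa, ca), (pb, cb) = samples[-2], samples[-1]
            contrast_dir = +1 if (cb - ca) * (pb - pa) > 0 else -1
            pdaf_done = i == len(mapped_positions) - 1
            if pdaf_done or contrast_dir != phase_dir:
                # claims 11/12: fall back to a preset step along the contrast cue
                return pos + contrast_dir * PRESET_STEP
        return pos

    print(focus_second_camera([30, 42, 48, 60], pos=20))  # lands at 52, the toy peak

Overshooting to position 60 flips the contrast direction, and the preset fallback step walks the lens back to 52, the peak of the toy curve: the loop converges without the second camera ever running its own full contrast scan, which is the speed advantage the method targets.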
13. A processor, comprising:
a processing unit and a storage unit, wherein
the storage unit is configured to store code, and the processing unit is configured to execute the code in the storage unit to implement the method of any one of claims 7 to 12.
14. A computer-readable storage medium, comprising a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 7 to 12.
CN201711447201.6A 2017-12-27 2017-12-27 Focusing method and electronic equipment Active CN109981965B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201711447201.6A CN109981965B (en) 2017-12-27 2017-12-27 Focusing method and electronic equipment
PCT/CN2018/123942 WO2019129077A1 (en) 2017-12-27 2018-12-26 Focusing method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711447201.6A CN109981965B (en) 2017-12-27 2017-12-27 Focusing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN109981965A CN109981965A (en) 2019-07-05
CN109981965B true CN109981965B (en) 2021-01-01

Family

ID=67063184

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711447201.6A Active CN109981965B (en) 2017-12-27 2017-12-27 Focusing method and electronic equipment

Country Status (2)

Country Link
CN (1) CN109981965B (en)
WO (1) WO2019129077A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110475071B (en) * 2019-09-19 2021-06-04 厦门美图之家科技有限公司 Phase focusing method, phase focusing device, electronic equipment and machine-readable storage medium
CN110881103B (en) * 2019-09-19 2022-01-28 Oppo广东移动通信有限公司 Focusing control method and device, electronic equipment and computer readable storage medium
WO2021081909A1 (en) * 2019-10-31 2021-05-06 深圳市大疆创新科技有限公司 Focusing method for photographing device, photographing device, system, and storage medium
CN112866551B (en) * 2019-11-12 2022-06-14 Oppo广东移动通信有限公司 Focusing method and device, electronic equipment and computer readable storage medium
CN110933305B (en) * 2019-11-28 2021-07-20 维沃移动通信有限公司 Electronic equipment and focusing method
CN113438407B (en) * 2020-03-23 2022-10-04 华为技术有限公司 Multi-camera module focusing method and device
CN111787231B (en) * 2020-07-31 2022-05-27 广东小天才科技有限公司 Focusing method, terminal equipment and computer readable storage medium
CN112261398A (en) * 2020-11-17 2021-01-22 广东未来科技有限公司 Focusing method of binocular camera based on mobile equipment
CN113556472B (en) * 2021-09-22 2021-12-14 上海豪承信息技术有限公司 Image compensation method, device, medium and front camera
CN116233605B (en) * 2023-05-08 2023-07-25 此芯科技(武汉)有限公司 Focusing implementation method and device, storage medium and image pickup equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1450398A (en) * 2002-04-05 2003-10-22 佳能株式会社 Image pick up apparatus and camera system thereof
CN105376474A (en) * 2014-09-01 2016-03-02 光宝电子(广州)有限公司 Image acquisition device and automatic focusing method thereof
CN106331484A (en) * 2016-08-24 2017-01-11 维沃移动通信有限公司 Focusing method and mobile terminal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007310009A (en) * 2006-05-16 2007-11-29 Olympus Imaging Corp Digital camera and camera system
CN107172410A (en) * 2017-07-14 2017-09-15 闻泰通讯股份有限公司 Dual camera focusing method and device
CN107465881B (en) * 2017-09-30 2020-06-26 努比亚技术有限公司 Dual-camera focusing method, mobile terminal and computer readable storage medium

Also Published As

Publication number Publication date
WO2019129077A1 (en) 2019-07-04
CN109981965A (en) 2019-07-05

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant