CN110062161B - Image processor, image processing method, photographing device, and electronic apparatus - Google Patents
- Publication number: CN110062161B (application CN201910285809.6A)
- Authority: CN (China)
- Prior art keywords: module, processing, algorithm, post, image
- Prior art date
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H—ELECTRICITY > H04—ELECTRIC COMMUNICATION TECHNIQUE > H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION > H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
  - H04N23/50—Constructional details > H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
  - H04N23/80—Camera processing pipelines; Components thereof
  - H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
Landscapes
- Engineering & Computer Science
- Multimedia
- Signal Processing
- Studio Devices
Abstract
The application discloses an image processor, an image processing method, a photographing apparatus, and an electronic device. The image processor comprises a hardware abstraction module and an algorithm post-processing module. The hardware abstraction module is used for transmitting image data and metadata corresponding to the image data. The algorithm post-processing module is connected with the hardware abstraction module and stores at least one image processing algorithm; it uses the image processing algorithm to process the image data according to the metadata, so as to realize post-photographing processing. In the image processor, the image processing method, the photographing apparatus, and the electronic device, the hardware abstraction module does not perform post-photographing processing on the image data; that processing is realized by the algorithm post-processing module. The image processing algorithms for post-photographing processing therefore do not need to cut into the algorithm architecture of the hardware abstraction module itself and only need to be externally compatible, which reduces the design difficulty.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processor, an image processing method, a photographing apparatus, and an electronic device.
Background
At present, post-photographing processes, such as beauty processing, filter processing, rotation processing, watermark processing, blurring processing, High-Dynamic Range (HDR) processing, and multi-frame processing, are all performed in a Hardware Abstraction Layer module (HAL). The hardware abstraction module is usually provided by one manufacturer, while the image processing algorithms that perform the post-photographing processing are provided by another. When a compatible design is made, each image processing algorithm must cut into the algorithm architecture of the hardware abstraction module itself, so the two are coupled to each other, which results in great design difficulty.
Disclosure of Invention
The embodiments of the present application provide an image processor, an image processing method, a photographing apparatus, and an electronic device.
The image processor of the embodiment of the application comprises a hardware abstraction module and an algorithm post-processing module. The hardware abstraction module is used for transmitting image data and metadata corresponding to the image data. The algorithm post-processing module is connected with the hardware abstraction module and stores at least one image processing algorithm; the algorithm post-processing module uses the image processing algorithm to process the image data according to the metadata, so as to realize post-photographing processing.
The image processing method of the embodiment of the application comprises the following steps: the hardware abstraction module transmits image data and metadata corresponding to the image data; and the algorithm post-processing module processes the image data according to the metadata by using an image processing algorithm, so as to realize post-photographing processing.
The photographing apparatus of the embodiment of the application comprises the above image processor and an image sensor, wherein the image sensor is connected with the image processor.
The electronic device of the embodiment of the application comprises the above photographing apparatus and a housing, wherein the photographing apparatus is combined with the housing.
In the image processor, the image processing method, the photographing apparatus, and the electronic device described above, the hardware abstraction module does not perform post-photographing processing on the image data; that processing is realized by the algorithm post-processing module. The image processing algorithms for post-photographing processing therefore do not need to cut into the algorithm architecture of the hardware abstraction module itself and only need to be externally compatible, which reduces the design difficulty.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIGS. 1 and 2 are schematic diagrams of a photographing apparatus according to certain embodiments of the present application;
FIG. 3 is a schematic diagram of an algorithm post-processing module in accordance with certain embodiments of the present application;
FIG. 4 is a schematic diagram of a photographing apparatus according to some embodiments of the present application;
FIGS. 5 and 6 are schematic structural views of an electronic device according to some embodiments of the present application;
FIGS. 7 to 12 are schematic flow charts of image processing methods according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
The following disclosure provides many different embodiments or examples for implementing different configurations of embodiments of the application. In order to simplify the disclosure of the embodiments of the present application, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present application.
Referring to FIG. 1, the photographing apparatus 100 includes an image processor 10 and an image sensor 20. The image processor 10 is connected to the image sensor 20. The image sensor 20 includes an image acquisition unit (sensor) 22 and a RAW image data unit (IFE) 24. The image acquisition unit 22 is configured to receive light to acquire a RAW image, and the RAW image data unit 24 is configured to process the RAW image acquired by the image acquisition unit 22 and output the processed RAW image to the image processor 10.
The image processor 10 includes a hardware abstraction module 12 and an algorithmic post-processing module (APS) 16.
The hardware abstraction module 12 is used to transfer image data and metadata corresponding to the image data. The image data may include a RAW image and/or a YUV image. Specifically, the hardware abstraction module 12 may be configured to receive a RAW image, convert the RAW image into a YUV image, and transmit the RAW image and/or the YUV image. The hardware abstraction module 12 may be connected to the image sensor 20. The hardware abstraction module 12 may include a buffer unit (buffer queue) 122 connected to the image sensor 20, a RAW-to-RGB processing unit (BPS) 124, and a noise reduction and YUV post-processing unit (IPE) 126 connected to the application module 14. The buffer unit 122 is used for buffering the RAW image from the image sensor 20 and transmitting the RAW image to the algorithm post-processing module 16. The RAW-to-RGB processing unit 124 is configured to convert the RAW image from the buffer unit 122 into an RGB image. The noise reduction and YUV post-processing unit 126 is configured to process the RGB image to obtain a YUV image and transmit the YUV image to the algorithm post-processing module 16. The metadata includes 3A information (automatic exposure control AE, automatic focus control AF, automatic white balance control AWB), picture information (e.g., image width and height), exposure parameters (e.g., aperture size, shutter speed, and sensitivity value), and the like; the metadata can assist the post-photographing processing of the RAW image and/or the YUV image (including, for example, at least one of beauty processing, filter processing, rotation processing, watermark processing, blurring processing, HDR processing, and multi-frame processing). In one embodiment, the metadata includes sensitivity (ISO) information, according to which the brightness of the RAW image and/or the YUV image can be adjusted, thereby implementing post-photographing processing related to the adjusted brightness.
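As an illustration of how metadata can guide post-photographing processing, the sketch below models a frame's metadata and a toy ISO-based brightness rule in Python. The names (`FrameMetadata`, `brightness_gain_from_iso`) and the specific gain formula are hypothetical assumptions for illustration only; the embodiment does not prescribe any particular data structure or formula.

```python
from dataclasses import dataclass

@dataclass
class FrameMetadata:
    # 3A information (illustrative fields)
    exposure_time_ms: float     # AE result
    focus_distance_m: float     # AF result
    wb_gains: tuple             # AWB result (r, g, b gains)
    # picture information
    width: int
    height: int
    # exposure parameters
    iso: int                    # sensitivity (ISO)

def brightness_gain_from_iso(meta: FrameMetadata, target_iso: int = 100) -> float:
    """Toy rule: frames captured at higher ISO carry more sensor gain,
    so a post-processing step might scale brightness toward a target."""
    return target_iso / meta.iso

meta = FrameMetadata(exposure_time_ms=33.0, focus_distance_m=1.2,
                     wb_gains=(2.0, 1.0, 1.8), width=4000, height=3000, iso=400)
print(brightness_gain_from_iso(meta))  # 0.25
```

The point of the sketch is only that the metadata travels with the frame and parameterizes the algorithm, which is what lets the algorithm post-processing module 16 work without querying the HAL again.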
Because the hardware abstraction module 12 does not perform post-photographing processing on the image data, the image processing algorithm of post-photographing processing does not need to perform flow truncation on the algorithm framework of the hardware abstraction module 12, and only needs to be externally compatible, so that the design difficulty is reduced.
In the related art, an Application Program Interface (API) establishes the hardware abstraction module as a pipeline. Because establishing a pipeline requires considerable time and memory, all pipelines used by the working modes of a camera need to be established when the camera is started, and implementing a variety of image processing algorithms generally requires many pipelines (for example, more than three), which makes camera startup consume much time and occupy much memory. The hardware abstraction module 12 of the embodiment of the present application does not perform post-photographing processing on the RAW image and/or the YUV image; therefore, the hardware abstraction module 12 only needs to establish a small number of pipelines (for example, one or two) rather than a large number, which saves memory and increases the starting speed of the camera.
The algorithm post-processing module 16 is connected to the hardware abstraction module 12. At least one image processing algorithm (for example, at least one of a beauty processing algorithm, a filter processing algorithm, a rotation processing algorithm, a watermark processing algorithm, a blurring processing algorithm, an HDR processing algorithm, and a multi-frame processing algorithm) is stored in the algorithm post-processing module 16, and the algorithm post-processing module 16 is configured to process the image data (RAW image and/or YUV image) according to the metadata by using an image processing algorithm, so as to implement post-photographing processing. Since the post-photographing processing of the RAW image and/or the YUV image is realized by the algorithm post-processing module 16, there is no need to cut into the algorithm architecture of the hardware abstraction module 12 itself; only external compatibility is required, so the design difficulty is reduced. Moreover, because the post-photographing processing is concentrated in the algorithm post-processing module 16, the function of that module is more single and focused, which brings benefits such as fast porting and simple extension with new image processing algorithms.
When the algorithm post-processing module 16 processes only the RAW image (for example, the image processing algorithm operates on RAW images), the hardware abstraction module 12 may transmit only the RAW image (in this case it may not be necessary to convert the RAW image into a YUV image); when the algorithm post-processing module 16 processes only YUV images, the hardware abstraction module 12 may transmit only YUV images; and when the algorithm post-processing module 16 processes both the RAW image and the YUV image, the hardware abstraction module 12 may transmit both.
The image processor 10 also includes an application program module (APP) 14, which is connected to the hardware abstraction module 12 and the algorithm post-processing module 16, respectively. In the image processor 10 of the embodiment of the present application, the hardware abstraction module 12 transmits the image data and the metadata to the algorithm post-processing module 16 without passing through the application module 14, so the data transmission is more real-time and efficient, and more frames of image data can be transmitted per unit time. As a result, functions that require a higher frame rate, such as image preview and video recording, do not need to cut into the algorithm framework of the hardware abstraction module 12 itself; the algorithm post-processing module 16 can perform the post-photographing processing that implements them, which also facilitates effects such as scene detection and video anti-shake during image preview and video recording.
In some embodiments, the image processor 10 further includes a camera service module 18, and the algorithm post-processing module 16 is connected to the hardware abstraction module 12 through the camera service module 18. The camera service module 18 encapsulates the image data and metadata and transmits the encapsulated image data and metadata to the algorithm post-processing module 16. Encapsulation by the camera service module 18 improves both the efficiency and the security of image transmission. Specifically, two queues may be created in the process of the camera service module 18: an image data queue and a metadata queue, through which the image data and metadata of the same frame are transmitted to the algorithm post-processing module 16. The algorithm post-processing module 16 also has two corresponding queues, created at camera startup, that store the image data and the metadata, respectively. In addition, the application module 14 may establish a receiving queue for receiving the image data processed by the algorithm post-processing module 16.
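The paired image-data and metadata queues described above can be sketched as follows. The frame-id pairing check is an assumption about how same-frame association might be kept; the embodiment only states that the image data and metadata of the same frame travel through the two queues.

```python
import queue

image_q = queue.Queue()  # image data queue (camera service side)
meta_q = queue.Queue()   # metadata queue (camera service side)

def send_frame(frame_id, image, meta):
    # Camera service: enqueue the same frame into both queues.
    image_q.put((frame_id, image))
    meta_q.put((frame_id, meta))

def receive_frame():
    # Algorithm post-processing side: pop one entry from each queue and
    # check that they belong to the same frame before processing.
    fid_i, image = image_q.get()
    fid_m, meta = meta_q.get()
    assert fid_i == fid_m, "image/metadata frame mismatch"
    return fid_i, image, meta
```

A producer-consumer pair like this keeps the HAL-to-APS path free of the application module, which is the real-time benefit the embodiment emphasizes.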
The application module 14 may be connected to the hardware abstraction module 12 through the camera service module 18; that is, the application module 14 may be connected to the camera service module 18. The application module 14 may be used to initiate a data request to the camera service module 18. Specifically, the application module 14 may generate a data request according to the user's input and transmit the data request to the hardware abstraction module 12 through the camera service module 18, and the hardware abstraction module 12 may pass the data request on to the image sensor 20, so as to control the operation of the image sensor 20 accordingly. After the image sensor 20 generates the RAW images and metadata corresponding to the data request, they are transmitted to the hardware abstraction module 12; the hardware abstraction module 12 transmits the corresponding image data (RAW images and/or YUV images) and metadata to the camera service module 18 according to the data request, and the camera service module 18 encapsulates them and transmits the encapsulated image data and metadata to the algorithm post-processing module 16. The application module 14 can run as a 64-bit process, and the static link library (lib) of the image processing algorithms for post-photographing processing can be built as 64-bit to improve operation speed. The camera service module 18 may also be used to feed back the result of the data request to the application module 14.
In some embodiments, the camera service module 18 may send a frame number suggestion to the application module 14 according to the sensitivity information, the shake condition reported by the gyroscope, the AR scene detection result (the detected scene type, such as people, animals, or scenery), and the like. For example, when the shake detected by the gyroscope is large, the frame number suggestion sent by the camera service module 18 to the application module 14 may be to use more frames, to better realize post-photographing processing; when the detected shake is small, the suggestion may be to use fewer frames, to reduce the amount of data transmitted. That is, the number of frames the camera service module 18 suggests to the application module 14 may be positively correlated with the degree of shake detected by the gyroscope. The camera service module 18 may also send an algorithm suggestion to the application module 14 according to the sensitivity information, the shake condition of the gyroscope, the AR scene detection result, and so on. For example, when the detected shake is large, the algorithm suggestion may be multi-frame processing, so as to eliminate the shake; when the detected scene type is a person, the suggestion may be beauty processing, to beautify the person; when the detected scene type is scenery, the suggestion may be HDR processing, to form a high-dynamic-range landscape image.
The application module 14 then issues a data request to the camera service module 18 based on the frame number suggestion and the algorithm suggestion. Of course, the hardware abstraction module 12 may also derive frame number suggestions and/or algorithm suggestions from the sensitivity information, the shake condition of the gyroscope, and the AR scene detection result, and transmit them to the application module 14 through the camera service module 18.
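A minimal sketch of how frame number and algorithm suggestions might be derived from gyroscope shake and scene type is shown below. The thresholds, scene labels, and algorithm names are illustrative assumptions only; the embodiment states just that the suggested frame count is positively correlated with the detected shake and that the algorithm suggestion depends on shake and scene.

```python
def suggest_frame_count(shake_magnitude: float) -> int:
    # More shake -> more frames, so multi-frame fusion has more to work with.
    # The thresholds and counts here are arbitrary illustrative choices.
    if shake_magnitude < 0.2:
        return 3
    if shake_magnitude < 0.6:
        return 5
    return 8

def suggest_algorithms(shake_magnitude: float, scene: str) -> list:
    # Map detected conditions to post-photographing algorithm suggestions.
    suggestions = []
    if shake_magnitude >= 0.6:
        suggestions.append("multi_frame")   # eliminate shake via fusion
    if scene == "person":
        suggestions.append("beauty")        # beautify the person
    elif scene == "landscape":
        suggestions.append("hdr")           # high-dynamic-range scenery
    return suggestions
```

The application module would fold such suggestions into the data request it issues to the camera service module.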
Referring to FIG. 2, the application module 14 includes an algorithm post-processing client 142, and the algorithm post-processing module 16 transmits the processed image data to the algorithm post-processing client 142. The algorithm post-processing client 142 may be configured to perform further post-photographing processing on the image data already processed by the algorithm post-processing module 16. For example, the algorithm post-processing module 16 performs some post-photographing processing (e.g., HDR processing, multi-frame processing), the algorithm post-processing client 142 performs other post-photographing processing (e.g., beauty processing, filter processing, rotation processing, watermark processing, blurring processing), and after the algorithm post-processing client 142 finishes its post-photographing processing, the image data may be transmitted to the display module (SurfaceFlinger) 30 for display. Of course, when the algorithm post-processing client 142 does not need to perform post-photographing processing on the image, it may also directly transmit the image data processed by the algorithm post-processing module 16 to the display module 30 for display. When the image data is a moving image (e.g., video), the display module 30 includes a video encoding unit (media server) 32, which encodes the image data to form a video.
If some post-photographing processing (e.g., HDR processing, multi-frame processing) is performed in the algorithm post-processing module 16 while the algorithm post-processing client 142 performs other post-photographing processing (e.g., beauty processing, filter processing, rotation processing, watermark processing, blurring processing), at least one image processing algorithm (for example, at least one of a beauty processing algorithm, a filter processing algorithm, a rotation processing algorithm, a watermark processing algorithm, a blurring processing algorithm, an HDR processing algorithm, and a multi-frame processing algorithm) may also be stored in the algorithm post-processing client 142, and the client is further configured to process the image data by using that image processing algorithm to implement the post-photographing processing. Since the post-photographing processing of the image data is realized by the algorithm post-processing client 142 and the algorithm post-processing module 16 together, there is no need to cut into the algorithm architecture of the hardware abstraction module 12 itself; only external compatibility is needed, so the design difficulty is likewise greatly reduced.
To facilitate the use of the image processing algorithms in the algorithm post-processing module 16, the algorithm post-processing client 142 may be implemented as a Software Development Kit (SDK) to facilitate third-party development.
The data communicated between the application module 14 and the post-algorithm processing module 16 may include metadata, interactive information, image content, etc., in addition to image data. For example, when a user clicks a certain area of the display interface of the application module 14, the application module 14 may transmit interaction information such as a position of the area and a clicked manner (for example, a click or a double click) to the algorithm post-processing module 16, so that the algorithm post-processing module 16 may operate according to the interaction information. For another example, if the post-algorithm processing module 16 obtains image content by processing the image data (e.g., detecting a human face), then information that a human face exists in the image data may be transmitted to the application module 14, so that the application module 14 may inform the user of the information (e.g., display "human face exists" via the display interface).
In some embodiments, the post-algorithm processing module 16 may directly transmit the processed image data to the display module 30 for display. Specifically, the application module 14 may send a control instruction to the post-algorithm processing module 16, and the post-algorithm processing module 16 selects, according to the control instruction, to directly transmit the processed image data to the display module 30 for display, or to transmit the processed image data to the application module 14. In this way, the algorithm post-processing module 16 has more diversified transmission modes for the image data.
After the image sensor 20 performs one shot (one exposure), the RAW image is transmitted to the hardware abstraction module 12. Once the algorithm post-processing module 16 has received the image data (RAW image and/or YUV image) corresponding to that RAW image, the image sensor 20 can perform the next shot, the image sensor 20 can be turned off, or the application module 14 can exit the application interface. Since the post-photographing processing is implemented by the algorithm post-processing module 16, once the captured data has been transmitted to it, the post-photographing processing can be completed by that module alone; the image sensor 20 and the application module 14 need not participate, so the image sensor 20 can be turned off or take the next shot, and the application module 14 can be closed or exit the application interface. In this way, the photographing apparatus 100 can achieve snapshot (continuous capture), and while the algorithm post-processing module 16 performs the post-photographing processing, the application module 14 can be closed or its interface exited, so other operations unrelated to the photographing apparatus 100 (such as browsing a web page, watching a video, or making a call) can be performed on the electronic device. The user therefore does not need to spend time waiting for the post-photographing processing to complete, which makes the electronic device more convenient to use.
The algorithm post-processing module 16 may include an encoding unit 162 configured to convert a YUV image into a JPG image (or a JPEG image, etc.). Specifically, when the algorithm post-processing module 16 processes a YUV image, the encoding unit 162 may directly encode the YUV image to form a JPG image, thereby increasing the output speed of the image. When the algorithm post-processing module 16 processes a RAW image, it may transmit the RAW image on which post-photographing processing has been performed back to the camera service module 18 through the application module 14, and the camera service module 18 then transmits it back to the hardware abstraction module 12, for example to the RAW-to-RGB processing unit 124. The RAW-to-RGB processing unit 124 converts the returned RAW image into an RGB image, the noise reduction and YUV post-processing unit 126 converts the RGB image into a YUV image, and the YUV image is transmitted to the encoding unit 162 of the algorithm post-processing module 16 to be converted into a JPG image. In some embodiments, the algorithm post-processing module 16 may instead transmit the processed RAW image back to the camera service module 18 through the application module 14, after which the camera service module 18 transmits it back to the buffer unit 122; the returned RAW image is then processed by the RAW-to-RGB processing unit 124 and the noise reduction and YUV post-processing unit 126 to form a YUV image, which is transmitted to the encoding unit 162 to form the JPG image.
In addition, the algorithm post-processing module 16 may also transmit the processed RAW image directly back to the hardware abstraction module 12 through the camera service module 18 (without passing through the application module 14). After the JPG image is formed, the algorithm post-processing module 16 may transfer the JPG image to a memory for storage.
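The two encoding paths described above (direct encoding of a processed YUV image versus routing a processed RAW image back through the HAL's RAW-to-RGB and noise-reduction/YUV units before encoding) can be sketched as a simple routing function. The function and argument names below are hypothetical, and the stand-in converters are placeholders for the BPS 124, IPE 126, and encoding unit 162.

```python
def finish_capture(image, kind, aps_encode, hal_raw_to_yuv):
    """Route a post-processed image to JPG encoding.

    kind == "yuv": the APS processed a YUV image, so the encoding unit
    can encode it directly.
    kind == "raw": the APS processed a RAW image, which must first go
    back through the HAL (RAW -> RGB -> YUV) before encoding.
    """
    if kind == "yuv":
        return aps_encode(image)
    yuv = hal_raw_to_yuv(image)  # stand-in for BPS (124) + IPE (126)
    return aps_encode(yuv)

# Toy stand-ins for the encoding unit and the HAL conversion path.
encode = lambda img: f"JPG({img})"
raw_to_yuv = lambda img: f"YUV({img})"
print(finish_capture("frame", "yuv", encode, raw_to_yuv))  # JPG(frame)
print(finish_capture("frame", "raw", encode, raw_to_yuv))  # JPG(YUV(frame))
```

The design point is that only the RAW path requires a round trip through the hardware abstraction module; the YUV path stays entirely inside the algorithm post-processing module, which is why direct YUV encoding is faster.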
Referring to FIG. 3, the algorithm post-processing module 16 includes a logic processing calling layer 164, an algorithm module interface layer 166, and an algorithm processing layer 168. The logic processing calling layer 164 is used to communicate with the application module 14. The algorithm module interface layer 166 is used to maintain the algorithm interface. The algorithm processing layer 168 includes at least one image processing algorithm. Through the algorithm interface, the algorithm module interface layer 166 performs at least one of registration, unregistration, calling, and callback on the image processing algorithms of the algorithm processing layer 168.
The logic processing calling layer 164 may include a thread queue. After receiving a post-photographing processing task for a RAW image and/or a YUV image, the algorithm post-processing module 16 may cache the task in the thread queue for processing; the thread queue can cache a plurality of post-photographing processing tasks, so a snapshot mechanism (continuous capture) can be implemented by the logic processing calling layer 164. The logic processing calling layer 164 may receive instructions such as initialization (init) or processing (process) transmitted from the application module 14 and store the corresponding instructions and data in the thread queue. It then makes the call of the specific logic (i.e., the specific logic call combination) according to the tasks in the thread queue. The logic processing calling layer 164 may also pass the thumbnail obtained by processing back to the application module 14 for display (i.e., thumbnail display). In the description of the embodiments of the present application, "a plurality" means two or more unless specifically defined otherwise.
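The thread queue that buffers post-photographing tasks, which is what enables the snapshot mechanism, can be sketched with a single worker thread. The task format and the stand-in "processing" step below are illustrative assumptions; the embodiment only specifies that multiple tasks can be queued and drained.

```python
import queue
import threading

task_q = queue.Queue()   # the thread queue of the logic processing calling layer
results = []             # processed outputs (stand-in for finished images)

def worker():
    # Drain post-photographing tasks one by one; None is a stop sentinel.
    while True:
        task = task_q.get()
        if task is None:
            break
        name, data = task
        results.append((name, data.upper()))  # stand-in for real post-processing
        task_q.task_done()

t = threading.Thread(target=worker)
t.start()

# Snapshot: enqueue several capture tasks back-to-back without waiting
# for earlier ones to finish.
for i in range(3):
    task_q.put((f"shot{i}", "raw"))
task_q.put(None)
t.join()
```

Because enqueuing returns immediately, the capture side (image sensor and application module) is decoupled from the processing side, which is exactly what lets the sensor take the next shot while earlier frames are still being processed.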
The algorithm module interface layer 166 is used for calling the algorithm interface; the calling command may also be stored in the thread queue, and when the algorithm processing layer 168 receives a calling command from the thread queue it can parse the parameters of the command to determine which image processing algorithm to call. When the algorithm module interface layer 166 registers an image processing algorithm, a new image processing algorithm is added to the algorithm processing layer 168; when it unregisters an image processing algorithm, one of the image processing algorithms in the algorithm processing layer 168 is deleted; when it calls an image processing algorithm, one of the image processing algorithms in the algorithm processing layer 168 is invoked; and when it performs a callback, the data and status after algorithm processing are transmitted back to the application module 14. A unified interface can be adopted to realize registration, unregistration, calling, callback, and similar operations. Each image processing algorithm in the algorithm processing layer 168 is independent, so these operations can be conveniently applied to each algorithm.
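The unified register/unregister/call/callback interface of the algorithm module interface layer can be sketched as a small registry. The class and method names are hypothetical, and the callback here stands in for passing processed data and status back to the application module.

```python
class AlgorithmRegistry:
    """Minimal sketch of a unified algorithm interface:
    register, unregister, call, and callback."""

    def __init__(self):
        self._algorithms = {}  # name -> processing function

    def register(self, name, fn):
        # Registration: add a new image processing algorithm.
        self._algorithms[name] = fn

    def unregister(self, name):
        # Unregistration: delete one of the stored algorithms.
        self._algorithms.pop(name, None)

    def call(self, name, image, callback=None):
        # Call: invoke one stored algorithm on the image.
        result = self._algorithms[name](image)
        if callback:
            # Callback: pass the processed data back (e.g. to the app module).
            callback(name, result)
        return result

reg = AlgorithmRegistry()
reg.register("rotate", lambda img: img[::-1])  # toy "rotation" algorithm
print(reg.call("rotate", [1, 2, 3]))  # [3, 2, 1]
```

Because each algorithm is an independent entry behind the same interface, adding or removing one never touches the others, which is the independence property the embodiment highlights.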
Referring to fig. 4, in some embodiments, the hardware abstraction module 12 is directly connected to the algorithm post-processing module 16, and the hardware abstraction module 12 directly transmits the image data and the metadata to the algorithm post-processing module 16. The image data and the metadata can thus travel over the shortest transmission path, ensuring real-time and efficient data transmission. In this case, the application module 14 may be directly connected to the hardware abstraction module 12. When the image processor 10 does not include the camera service module 18, the data transmission paths (for image data, metadata, etc.) in the image processor 10 may be adapted accordingly; that is, the hardware abstraction module 12 may communicate directly with the algorithm post-processing module 16, and the hardware abstraction module 12 may also communicate directly with the application module 14. For example, the hardware abstraction module 12 directly transfers the image data and metadata to the algorithm post-processing module 16. As another example, the hardware abstraction module 12 sends frame number suggestions and/or algorithm suggestions directly to the application module 14, and the application module 14 sends data requests directly to the hardware abstraction module 12. As a further example, the hardware abstraction module 12 may feed back the results of a data request directly to the application module 14.
Referring to FIG. 2, the hardware abstraction module 12 and the algorithm post-processing module 16 may be located in a user (vendor) partition, and the application module 14, the camera service module 18, and the display module 30 may be located in a system partition, where the user partition and the system partition refer to partitions of a storage element. Because the algorithm post-processing module 16 is disposed in the user partition, the system partition need not be changed, or need only be changed slightly; when modules in the system partition (such as the application module 14, the camera service module 18, or the display module 30) are upgraded or otherwise changed, the algorithm post-processing module 16 does not affect the normal progress of the upgrade, and the upgraded modules do not interfere with the algorithm post-processing module 16. Moreover, when the hardware abstraction module 12 directly transmits the image data and the metadata to the algorithm post-processing module 16, disposing both modules in the same partition makes communication relatively convenient and can improve data transmission efficiency.
Referring to figs. 5 and 6, an electronic device 1000 includes the photographing device 100 of any of the above embodiments and a housing 200, with the photographing device 100 combined with the housing 200. The housing 200 may serve as a mounting carrier for functional elements of the electronic device 1000 and may protect those functional elements, such as a display screen, the photographing device 100, and a receiver, against dust, drops, and water. In one embodiment, the housing 200 includes a main body 210 and a movable bracket 220; the movable bracket 220 can move relative to the main body 210 under the driving of a driving device, for example sliding into the main body 210 (the state of fig. 5) or out of the main body 210 (the state of fig. 6). Some functional elements may be mounted on the main body 210, while others (e.g., the photographing device 100) may be mounted on the movable bracket 220, so that movement of the movable bracket 220 retracts those elements into the main body 210 or extends them out of it. In another embodiment, the housing 200 has a collection window, and the photographing device 100 is aligned with the collection window so that it can receive external light through the window to form an image.
In the description of the embodiments of the present application, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be interpreted broadly: for example, as fixedly connected, detachably connected, or integrally connected; as mechanically connected, electrically connected, or in communication with each other; and as directly connected or indirectly connected through an intermediate medium, whether as an internal communication between two elements or an interaction between two elements. Those of ordinary skill in the art can understand the specific meanings of the above terms in the embodiments of the present application according to the specific situation.
Referring to fig. 1 and 7, the image processing method includes:
01: the hardware abstraction module 12 transmits image data and metadata corresponding to the image data; and
02: the post-algorithm processing module 16 processes the image data according to the metadata using an image processing algorithm to achieve post-photographing processing.
The image processing method of the embodiments of the present application may be used in, or implemented by, the image processor 10 of the embodiments of the present application, wherein step 01 may be implemented by the hardware abstraction module 12 and step 02 may be implemented by the algorithm post-processing module 16.
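Steps 01 and 02 can be sketched as follows. The function names, the `scene` metadata key, and the placeholder algorithm bodies are hypothetical illustrations, not the patent's implementation; the point shown is only that the metadata travels with the image data and selects the algorithm applied:

```python
# Placeholder algorithms; real ones would perform denoising, HDR merging, etc.
def denoise(image: bytes) -> bytes:
    return b"denoised:" + image

def hdr_merge(image: bytes) -> bytes:
    return b"hdr:" + image

ALGORITHMS = {"night": denoise, "hdr": hdr_merge}

def hardware_abstraction_transmit(image: bytes, metadata: dict):
    """Step 01: transmit image data together with its corresponding metadata."""
    return image, metadata

def post_process(image: bytes, metadata: dict) -> bytes:
    """Step 02: choose the algorithm indicated by the metadata and apply it."""
    algorithm = ALGORITHMS[metadata["scene"]]
    return algorithm(image)

image, metadata = hardware_abstraction_transmit(b"raw-frame", {"scene": "hdr"})
processed = post_process(image, metadata)  # b"hdr:raw-frame"
```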
Referring to fig. 1 and 8, the image processing method further includes:
03: the camera service module 18 encapsulates the image data and metadata and transmits the encapsulated image data and metadata to the post-algorithm processing module 16.
Wherein step 03 may be implemented by the camera service module 18.
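Step 03, encapsulating the image data and metadata into a single package before forwarding them, might be modeled as below. The names (`EncapsulatedFrame`, `CameraService`, `forward`, `receive`) are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Any, Dict, List

@dataclass(frozen=True)
class EncapsulatedFrame:
    """One package carrying the image data together with its metadata."""
    image_data: bytes
    metadata: Dict[str, Any]

class CameraService:
    """Stand-in for the camera service module: encapsulates and forwards."""
    def __init__(self, post_processing_module) -> None:
        self._post = post_processing_module  # downstream consumer

    def forward(self, image_data: bytes, metadata: Dict[str, Any]) -> None:
        package = EncapsulatedFrame(image_data, dict(metadata))  # encapsulate
        self._post.receive(package)                              # transmit

class RecordingPostProcessor:
    """Stand-in for the algorithm post-processing module."""
    def __init__(self) -> None:
        self.received: List[EncapsulatedFrame] = []

    def receive(self, package: EncapsulatedFrame) -> None:
        self.received.append(package)

post = RecordingPostProcessor()
service = CameraService(post)
service.forward(b"yuv-frame", {"exposure_us": 8000})
```

Bundling the two items into one package means the post-processing module always receives image data and the metadata that describes it in a single transmission, rather than having to correlate them.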
Referring to fig. 1 and 9, the image processing method further includes:
04: the application module 14 initiates a data request to the camera service module 18;
032: the camera service module 18 encapsulates the image data and metadata according to the data request and transmits the encapsulated image data and metadata to the post-algorithm processing module 16.
Wherein step 04 may be implemented by the application module 14, and step 032 may be implemented by the camera service module 18.
Referring to fig. 1 and 10, the image processing method further includes:
05: the post-algorithm processing module 16 transmits the processed image data to the post-algorithm processing client 142;
06: the post-algorithm processing client 142 processes the processed image data.
Wherein step 05 may be implemented by the algorithm post-processing module 16, and step 06 may be implemented by the algorithm post-processing client 142.
Referring to fig. 2 and 11, the image processing method further includes:
05: the post-algorithm processing module 16 transmits the processed image data to the post-algorithm processing client 142;
07: the post-algorithm processing client 142 transmits the processed image data to the display module 30 for display.
Wherein step 05 may be implemented by the algorithm post-processing module 16, and step 07 may be implemented by the algorithm post-processing client 142.
Referring to fig. 2 and 12, the image processing method further includes:
08: the post-algorithm processing module 16 transmits the processed image data to the display module 30 for display.
Wherein step 08 can be implemented by the algorithm post-processing module 16.
The explanation of the image processor 10 in the above embodiments also applies to the image processing method of the embodiments of the present application, and is not repeated here.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application also includes implementations in which functions are executed out of the order shown or discussed, including substantially concurrently or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, may be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processing module, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection having one or more wires, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium may even be paper or another suitable medium upon which the program is printed, as the program can be captured electronically, for instance by optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the embodiments of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one of or a combination of the following techniques known in the art may be used: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having appropriate combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and the like.
It will be understood by those skilled in the art that all or part of the steps of the above method embodiments may be implemented by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, performs one of or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a standalone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.
In the description herein, reference to "certain embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present application. In this specification, schematic uses of such terms do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics described may be combined in any suitable manner in any one or more embodiments.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.
Claims (11)
1. An image processor, characterized in that the image processor comprises:
a hardware abstraction module to transmit image data and metadata corresponding to the image data; and
the post-algorithm processing module is connected with the hardware abstraction module, at least one image processing algorithm is stored in the post-algorithm processing module, and the post-algorithm processing module is used for processing the image data by adopting the image processing algorithm according to the metadata so as to realize post-photographing processing;
the image processor also comprises an application program module which is used for being respectively connected with the hardware abstraction module and the algorithm post-processing module; when the hardware abstraction module transmits the image data and the metadata to the algorithm post-processing module, the application program module is not passed through;
the image processor also comprises a camera service module, the algorithm post-processing module is connected with the hardware abstraction module through the camera service module, and the camera service module is used for packaging the image data and the metadata and transmitting the packaged image data and the packaged metadata to the algorithm post-processing module.
2. The image processor of claim 1, wherein the application module is coupled to the camera service module, the application module configured to initiate a data request to the camera service module, and the camera service module configured to encapsulate the image data and the metadata according to the data request and transmit the encapsulated image data and metadata to the post-algorithm processing module.
3. The image processor of claim 1, wherein the application module comprises an algorithm post-processing client, and the algorithm post-processing module is configured to transmit the processed image data to the algorithm post-processing client;
the algorithm post-processing client is configured to process the processed image data, or the algorithm post-processing client is configured to transmit the processed image data to a display module for display.
4. The image processor of claim 1, wherein the hardware abstraction module and the post-algorithm processing module are disposed in a user partition.
5. The image processor of claim 1, wherein the post-algorithm processing module is configured to transmit the processed image data to a display module for display.
6. An image processing method, characterized in that the image processing method comprises:
the hardware abstraction module transmits image data and metadata corresponding to the image data; and
the algorithm post-processing module adopts an image processing algorithm and processes the image data according to the metadata to realize post-photographing processing; when the hardware abstraction module transmits the image data and the metadata to the algorithm post-processing module, the image data and the metadata do not pass through an application program module;
the image processing method further includes:
and the camera service module encapsulates the image data and the metadata and transmits the encapsulated image data and the encapsulated metadata to an algorithm post-processing module.
7. The image processing method according to claim 6, characterized in that the image processing method further comprises:
an application program module initiates a data request to the camera service module;
the camera service module encapsulates the image data and the metadata and transmits the encapsulated image data and metadata to an algorithm post-processing module, including:
and the camera service module encapsulates the image data and the metadata according to the data request and transmits the encapsulated image data and metadata to the algorithm post-processing module.
8. The image processing method according to claim 6, characterized in that the image processing method further comprises:
the algorithm post-processing module transmits the processed image data to an algorithm post-processing client;
and the post-algorithm processing client processes the processed image data, or transmits the processed image data to a display module for display.
9. The image processing method according to claim 6, characterized in that the image processing method further comprises:
and the algorithm post-processing module transmits the processed image data to a display module for display.
10. A photographing apparatus, characterized by comprising:
the image processor of any one of claims 1 to 5; and
an image sensor connected with the image processor.
11. An electronic device, characterized in that the electronic device comprises:
the photographing apparatus of claim 10; and
a housing, the photographing device being combined with the housing.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910285809.6A CN110062161B (en) | 2019-04-10 | 2019-04-10 | Image processor, image processing method, photographing device, and electronic apparatus |
PCT/CN2020/079327 WO2020207192A1 (en) | 2019-04-10 | 2020-03-13 | Image processor, image processing method, photography apparatus, and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910285809.6A CN110062161B (en) | 2019-04-10 | 2019-04-10 | Image processor, image processing method, photographing device, and electronic apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110062161A CN110062161A (en) | 2019-07-26 |
CN110062161B true CN110062161B (en) | 2021-06-25 |
Family
ID=67317535
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910285809.6A Active CN110062161B (en) | 2019-04-10 | 2019-04-10 | Image processor, image processing method, photographing device, and electronic apparatus |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110062161B (en) |
WO (1) | WO2020207192A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110062161B (en) * | 2019-04-10 | 2021-06-25 | Oppo广东移动通信有限公司 | Image processor, image processing method, photographing device, and electronic apparatus |
CN109963083B (en) | 2019-04-10 | 2021-09-24 | Oppo广东移动通信有限公司 | Image processor, image processing method, photographing device, and electronic apparatus |
CN110602359B (en) * | 2019-09-02 | 2022-01-18 | Oppo广东移动通信有限公司 | Image processing method, image processor, photographing device and electronic equipment |
CN111314606B (en) * | 2020-02-21 | 2021-06-18 | Oppo广东移动通信有限公司 | Photographing method and device, electronic equipment and storage medium |
CN113645409B (en) * | 2021-08-16 | 2022-08-19 | 展讯通信(上海)有限公司 | Photographing processing method and device, photographing method, device and system and terminal equipment |
CN113840091B (en) * | 2021-10-29 | 2023-07-18 | Oppo广东移动通信有限公司 | Image processing method, apparatus, electronic device, and computer-readable storage medium |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8392497B2 (en) * | 2009-11-25 | 2013-03-05 | Framehawk, LLC | Systems and algorithm for interfacing with a virtualized computing service over a network using a lightweight client |
JP6136086B2 (en) * | 2011-12-28 | 2017-05-31 | ソニー株式会社 | Imaging apparatus and image processing apparatus |
CN102769740A (en) * | 2012-07-02 | 2012-11-07 | 北京百纳威尔科技有限公司 | Method for monitoring mobile terminal and remote video |
WO2015088212A1 (en) * | 2013-12-09 | 2015-06-18 | Samsung Electronics Co., Ltd. | Digital photographing apparatus capable of reconfiguring image signal processor and method of controlling the same |
KR102149448B1 (en) * | 2014-02-21 | 2020-08-28 | 삼성전자주식회사 | Electronic device and method for processing image |
US9591195B2 (en) * | 2014-07-10 | 2017-03-07 | Intel Corporation | Platform architecture for accelerated camera control algorithms |
WO2016149894A1 (en) * | 2015-03-23 | 2016-09-29 | Intel Corporation | Workload scheduler for computing devices with camera |
CN105516423A (en) * | 2015-12-24 | 2016-04-20 | 努比亚技术有限公司 | Mobile terminal, data transmission system and mobile terminal shoot method |
CN105979235A (en) * | 2016-05-30 | 2016-09-28 | 努比亚技术有限公司 | Image processing method and terminal |
KR102688614B1 (en) * | 2016-09-30 | 2024-07-26 | 삼성전자주식회사 | Method for Processing Image and the Electronic Device supporting the same |
CN107066400A (en) * | 2016-12-13 | 2017-08-18 | 深圳众思科技有限公司 | Signal processing method, device and the electronic equipment of electronic equipment |
CN107222686A (en) * | 2017-06-30 | 2017-09-29 | 维沃移动通信有限公司 | A kind for the treatment of method and apparatus of view data |
CN108495043B (en) * | 2018-04-28 | 2020-08-07 | Oppo广东移动通信有限公司 | Image data processing method and related device |
CN108965732B (en) * | 2018-08-22 | 2020-04-14 | Oppo广东移动通信有限公司 | Image processing method, image processing device, computer-readable storage medium and electronic equipment |
CN109040591B (en) * | 2018-08-22 | 2020-08-04 | Oppo广东移动通信有限公司 | Image processing method, image processing device, computer-readable storage medium and electronic equipment |
CN109101352B (en) * | 2018-08-30 | 2021-08-06 | Oppo广东移动通信有限公司 | Image processing algorithm architecture, algorithm calling method, device, storage medium and mobile terminal |
CN109167915A (en) * | 2018-09-29 | 2019-01-08 | 南昌黑鲨科技有限公司 | Image processing method, system and computer readable storage medium |
CN109167916A (en) * | 2018-09-29 | 2019-01-08 | 南昌黑鲨科技有限公司 | intelligent terminal, image processing method and computer readable storage medium |
CN109963083B (en) * | 2019-04-10 | 2021-09-24 | Oppo广东移动通信有限公司 | Image processor, image processing method, photographing device, and electronic apparatus |
CN110062161B (en) * | 2019-04-10 | 2021-06-25 | Oppo广东移动通信有限公司 | Image processor, image processing method, photographing device, and electronic apparatus |
CN110086967B (en) * | 2019-04-10 | 2021-02-05 | Oppo广东移动通信有限公司 | Image processing method, image processor, photographing device and electronic equipment |
CN110290288B (en) * | 2019-06-03 | 2022-01-04 | Oppo广东移动通信有限公司 | Image processor, image processing method, photographing device, and electronic apparatus |
CN110278373A (en) * | 2019-06-26 | 2019-09-24 | Oppo广东移动通信有限公司 | Image processor, image processing method, filming apparatus and electronic equipment |
CN110276718A (en) * | 2019-06-28 | 2019-09-24 | Oppo广东移动通信有限公司 | Image processing method, image processor, filming apparatus and electronic equipment |
CN110177214B (en) * | 2019-06-28 | 2021-09-24 | Oppo广东移动通信有限公司 | Image processor, image processing method, photographing device and electronic equipment |
CN110266951A (en) * | 2019-06-28 | 2019-09-20 | Oppo广东移动通信有限公司 | Image processor, image processing method, filming apparatus and electronic equipment |
CN110121022A (en) * | 2019-06-28 | 2019-08-13 | Oppo广东移动通信有限公司 | Control method, filming apparatus and the electronic equipment of filming apparatus |
CN110300240B (en) * | 2019-06-28 | 2021-08-13 | Oppo广东移动通信有限公司 | Image processor, image processing method, photographing device and electronic equipment |
CN110602359B (en) * | 2019-09-02 | 2022-01-18 | Oppo广东移动通信有限公司 | Image processing method, image processor, photographing device and electronic equipment |
- 2019-04-10: CN application CN201910285809.6A (CN110062161B) — Active
- 2020-03-13: WO application PCT/CN2020/079327 (WO2020207192A1) — Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN110062161A (en) | 2019-07-26 |
WO2020207192A1 (en) | 2020-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109963083B (en) | Image processor, image processing method, photographing device, and electronic apparatus | |
CN110086967B (en) | Image processing method, image processor, photographing device and electronic equipment | |
CN110062161B (en) | Image processor, image processing method, photographing device, and electronic apparatus | |
CN110290288B (en) | Image processor, image processing method, photographing device, and electronic apparatus | |
US11588984B2 (en) | Optimized exposure temporal smoothing for time-lapse mode | |
CN111147695B (en) | Image processing method, image processor, shooting device and electronic equipment | |
CN110177214B (en) | Image processor, image processing method, photographing device and electronic equipment | |
WO2020259250A1 (en) | Image processing method, image processor, photographing apparatus, and electronic device | |
CN110996012B (en) | Continuous shooting processing method, image processor, shooting device and electronic equipment | |
CN110278373A (en) | Image processor, image processing method, filming apparatus and electronic equipment | |
CN110300240B (en) | Image processor, image processing method, photographing device and electronic equipment | |
CN111193866B (en) | Image processing method, image processor, photographing device and electronic equipment | |
WO2013184256A1 (en) | Dynamic camera mode switching | |
CN110753187A (en) | Camera control method and device | |
CN110418061B (en) | Image processing method, image processor, photographing device and electronic equipment | |
CN115526787B (en) | Video processing method and device | |
CN111193867B (en) | Image processing method, image processor, photographing device and electronic equipment | |
CN110121022A (en) | Control method, filming apparatus and the electronic equipment of filming apparatus | |
CN111510629A (en) | Data display method, image processor, photographing device and electronic equipment | |
JP2010016826A (en) | System and method for efficiently performing image processing operations | |
CN111491101B (en) | Image processor, image processing method, photographing device, and electronic apparatus | |
CN110401800B (en) | Image processing method, image processor, photographing device and electronic equipment | |
CN110602359B (en) | Image processing method, image processor, photographing device and electronic equipment | |
JP2013211724A (en) | Imaging apparatus | |
JP2013211715A (en) | Imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||