
WO2009067097A2 - White balance adjustment for scenes with varying illumination - Google Patents

White balance adjustment for scenes with varying illumination

Info

Publication number
WO2009067097A2
WO2009067097A2 (PCT/US2007/024229)
Authority
WO
WIPO (PCT)
Prior art keywords
image
illuminant
white balance
captured image
scene
Prior art date
Application number
PCT/US2007/024229
Other languages
French (fr)
Other versions
WO2009067097A3 (en)
Inventor
Li Hong
Original Assignee
Nikon Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation filed Critical Nikon Corporation
Priority to PCT/US2007/024229 priority Critical patent/WO2009067097A2/en
Publication of WO2009067097A2 publication Critical patent/WO2009067097A2/en
Publication of WO2009067097A3 publication Critical patent/WO2009067097A3/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • H04N1/6027Correction or control of colour gradation or colour contrast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals

Definitions

  • Cameras are commonly used to capture an image of a scene. Most scenes are not illuminated by a 100% pure white light source. For example, sunlight at midday is much closer to white than late-afternoon or morning sunlight, which includes more yellow. The color of light reflected from an object will vary according to the color of the light source that is illuminating the scene. As a result, for example, if the color of the light source includes a lot of yellow, a white object in the scene will not be captured as a white object with a typical film-type camera.
  • With white balance correction, the digital camera attempts to compensate for variations in the colors in the captured image caused by an off-white illuminant, so that the actual colors of the objects in the scene are more accurately represented in the provided image.
  • the present invention is directed to an image apparatus for providing an adjusted image of a scene.
  • the image apparatus includes a capturing system and a control system.
  • the capturing system captures information for a captured image of the scene.
  • the control system evaluates the captured image and determines if the scene is illuminated by more than one illuminant. With this design, the image apparatus is particularly suited to capture scenes that are illuminated by multiple illuminants.
  • the scene can include a first scene region that is illuminated by a first illuminant and a second scene region that is illuminated by a second illuminant.
  • the control system can evaluate the captured image and estimate which portion of the captured image was toned by the first illuminant and which portion of the captured image was toned by the second illuminant.
  • control system can evaluate the captured image to estimate the characteristics of the first illuminant and the second illuminant. Further, the control system can separate the captured image into at least a first image region and a second image region. With this design, the control system can perform a first level of white balance adjustment on the first image region and a second level of white balance adjustment on the second image region. In this embodiment, the second level of white balance adjustment is different than the first level of white balance adjustment. With this design, in certain embodiments, the control system can compensate for the different illuminants in the scene and the characteristics of both illuminants are used to guide the color adjustment. As a result thereof, the adjusted image more closely represents the true colors in the scene.
  • the terms "true colors" or "actual colors" shall mean the colors that are present at the scene when the scene is illuminated with an even, true 100 percent white light.
  • the captured image is defined by a plurality of pixels.
  • the control system evaluates the pixels relative to an illuminant database and evaluates each pixel relative to its neighbors to determine if the scene is illuminated by more than one illuminant and to estimate the characteristics of the one or more illuminants.
  • the present invention is also directed to an image apparatus including an apparatus frame, a capturing system secured to the apparatus frame, and a control system secured to the apparatus frame.
  • the capturing system again captures information for a captured image of the scene.
  • the control system evaluates the captured image to estimate the characteristics of the first illuminant and the second illuminant.
  • the control system performs a first level of white balance adjustment on a first image region of the captured image and performs a second level of white balance adjustment on a second image region of the captured image.
  • the second level of white balance adjustment is different than the first level of white balance adjustment.
  • the present invention is also directed to one or more methods for providing an adjusted image of the scene.
  • the method includes the steps of capturing information for a captured image of the scene with a capturing system, and evaluating the captured image and determining if the scene is illuminated by more than one illuminant with a control system.
  • Figure 1 is a simplified top plan view of a scene and an image apparatus having features of the present invention
  • Figure 2A is a simplified front perspective view of one embodiment of the image apparatus
  • Figure 2B is a simplified rear view of the image apparatus of Figure 2A and an adjusted image of the scene of Figure 1;
  • Figure 3A illustrates the image apparatus, a raw captured image of the scene of Figure 1 divided into image regions;
  • Figure 3B illustrates the image apparatus, the adjusted image, the image regions
  • Figure 4 is a simplified flowchart that details the operation of the image apparatus
  • FIG. 5 is a simplified illustration of another embodiment of an image apparatus having features of the present invention.
  • FIG. 6 is a simplified flowchart that details the operation of another embodiment of the image apparatus.
  • Figure 7 illustrates the image apparatus, a raw captured image of the scene of Figure 1 divided into image regions and segmented into pixel groups.
  • Figure 1 is a simplified top plan illustration of an image apparatus 10 and a scene 12 that is illuminated by more than one illuminant 14.
  • the image apparatus 10 is uniquely designed to (i) capture a captured image 316 (illustrated in Figure 3A), (ii) evaluate the captured image 316 to determine if the scene 12 is illuminated by more than one illuminant 14, and (iii) perform white balance adjustment on the captured image 316 based on the multiple illuminants 14 to provide an adjusted image 218 (illustrated in Figure 2B).
  • As a result, the image apparatus 10 more accurately removes unrealistic color casts from the captured image 316, and the colors in the adjusted image 218 more closely approach the true colors in the scene 12.
  • the type of scene 12 captured by the image apparatus 10 can vary.
  • the scene 12 can include one or more objects 20, e.g. animals, plants, mammals, and/or environments.
  • the scene 12 is illustrated as including seven objects 20.
  • the scene 12 can include more than seven or less than seven objects 20.
  • one of the objects 20 is a wall 20A
  • one of the objects 20 is a first painting 20B attached to the wall 20A
  • one of the objects 20 is a second painting 20C attached to the wall 20A
  • one of the objects 20 is a table 20D
  • the remaining three objects 20 are the illuminants 14.
  • one or all of the illuminants 14 may not be in the scene 12 that is being captured with the image apparatus 10.
  • the number, design and location of the illuminants 14 that illuminate the scene 12 can vary greatly.
  • the scene 12 is illuminated by a first illuminant 14A, a second illuminant 14B and a third illuminant 14C.
  • the scene 12 can be illuminated with more than three or fewer than three illuminants 14.
  • the first illuminant 14A is a fluorescent lamp positioned near the first painting 20B
  • the second illuminant 14B is an incandescent bulb attached to the wall 20A near the second painting 20C
  • the third illuminant 14C is a candle positioned on the table 20D.
  • Non-exclusive examples of other illuminants 14 can include (i) the sun at sunrise with a clear sky, (ii) the sun at sunset with a clear sky, (iii) the sun at midday with a clear sky, (iv) an electronic flash, (v) a flashlight, (vi) the sun with a moderately overcast sky, or (vii) the sun with shade or a heavily overcast sky.
  • the illuminants 14 are different from each other. More specifically, fluorescent light (illustrated as arrows from the first illuminant 14A), the incandescent light (illustrated as arrows from the second illuminant 14B), and candlelight (illustrated as arrows from the third illuminant 14C) each have a different color temperature.
  • candlelight has a color temperature of approximately 1000-2000 K
  • the incandescent light has a color temperature of approximately 2500-3500 K
  • the fluorescent light has a color temperature of approximately 4000-5000 K.
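The color-temperature figures above can be collected into a small lookup table. The ranges are those given in the text; the dictionary and function names below are illustrative, not part of the patent:

```python
# Approximate color-temperature ranges, in kelvin, as stated above.
COLOR_TEMPERATURE_K = {
    "candlelight": (1000, 2000),
    "incandescent": (2500, 3500),
    "fluorescent": (4000, 5000),
}

def is_warmer(a, b):
    """A lower color temperature means a warmer (more yellow/red) cast;
    this compares the midpoints of the two ranges."""
    midpoint = lambda name: sum(COLOR_TEMPERATURE_K[name]) / 2
    return midpoint(a) < midpoint(b)
```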
  • the objects 20 illuminated by each of the illuminants 14 will have a color cast that corresponds to its respective illuminant 14.
  • the scene 12 can be divided into different scene regions 12A (defined by dashed lines) based upon the areas that are illuminated by the different illuminants 14.
  • In Figure 1, because the scene 12 is illuminated by three illuminants 14, it can be divided into three scene regions 12A, namely a first scene region 12B, a second scene region 12C, and a third scene region 12D.
  • the first scene region 12B is illuminated mainly by the first illuminant 14A and includes the fluorescent lamp 14A, the first painting 20B and a portion of the wall 20A;
  • the second scene region 12C is illuminated mainly by the second illuminant 14B and includes the incandescent bulb 14B, the second painting 20C and a portion of the wall 20A;
  • the third scene region 12D is illuminated mainly by the third illuminant 14C and includes the candle 14C and the table 20D.
  • the size and shape of each scene region 12A can vary according to the scene 12 and the illuminants 14.
  • the image apparatus 10 is water resistant and is adapted to capture images of scenes that are partly or fully in a liquid such as water.
  • the image apparatus 10 can be enclosed by an outer shell (not shown) that surrounds and encloses the image apparatus 10 and that provides a watertight barrier around the image apparatus 10.
  • FIG. 2A illustrates a simplified, front perspective view of one, nonexclusive embodiment of the image apparatus 10.
  • the image apparatus 10 is a digital camera, and includes an apparatus frame 222, an optical assembly 224, a capturing system 226 (illustrated as a box in phantom), a power source 228 (illustrated as a box in phantom), a flash system 230, and a control system 232 (illustrated as a box in phantom).
  • the design of these components can be varied to suit the design requirements and type of image apparatus 10.
  • the image apparatus 10 could be designed without one or more of these components.
  • the image apparatus 10 could be designed without the flash system 230.
  • the apparatus frame 222 can be rigid and support at least some of the other components of the image apparatus 10.
  • the apparatus frame 222 includes a generally rectangular shaped hollow body that forms a cavity that receives and retains at least a portion of the capturing system 226.
  • the apparatus frame 222 can include an aperture 234 and a shutter mechanism 236 that work together to control the amount of light that reaches the capturing system 226.
  • the shutter mechanism 236 can be activated by a shutter button 238.
  • the shutter mechanism 236 can include a pair of blinds (sometimes referred to as "blades") that work in conjunction with each other to allow the light to be focused on the capturing system 226 for a certain amount of time.
  • the shutter mechanism 236 can be all electronic and contain no moving parts.
  • an electronic capturing system 226 can have a capture time controlled electronically to emulate the functionality of the blinds.
  • the optical assembly 224 can include a single lens or a combination of lenses that work in conjunction with each other to focus light onto the capturing system 226.
  • the image apparatus 10 includes an autofocus assembly (not shown) including one or more lens movers that move one or more lenses of the optical assembly 224 in or out until the sharpest possible image of the subject is received by the capturing system 226.
  • the capturing system 226 captures information for the captured image 316 (illustrated in Figure 3A).
  • the design of the capturing system 226 can vary according to the type of image apparatus 10.
  • the capturing system 226 includes an image sensor 240 (illustrated in phantom), a filter assembly 242 (illustrated in phantom), and a storage system 244 (illustrated in phantom).
  • the image sensor 240 receives the light that passes through the aperture 234.
  • One common type of image sensor 240 for digital cameras is known as a charge coupled device (CCD). Another common type of image sensor 240 uses complementary metal oxide semiconductor (CMOS) technology.
  • the image sensor 240 by itself, produces a grayscale image as it only keeps track of the total intensity of the light that strikes the surface of the image sensor 240. Accordingly, in order to produce a full color image, the filter assembly 242 is necessary to capture the colors of the image.
  • the control system 232 can selectively compensate the colors in the raw captured image 316.
  • the storage system 244 stores the various captured images before the images are ultimately printed out, deleted, transferred or downloaded to an auxiliary compensation system (not shown in Figure 2A), an auxiliary storage system or a printer.
  • the storage system 244 can be fixedly or removably coupled to the apparatus frame 222.
  • suitable storage systems 244 include flash memory, a floppy disk, a hard disk, or a writeable CD or DVD.
  • the power source 228 provides electrical power to the electrical components of the image apparatus 10.
  • the power source 228 can include one or more chemical batteries, either one-time-use disposable batteries (such as alkaline or zinc-air) or multiple-use rechargeable batteries.
  • the flash system 230 provides a flash of light that can be used to selectively illuminate at least a portion of the scene 12 (illustrated in Figure 1).
  • the control system 232 evaluates the captured image 316 to determine if the scene 12 is illuminated by more than one illuminant 14, and performs white balance adjustment on the captured image 316 based on the multiple illuminants 14 to provide the adjusted image 218.
  • control system 232 is electrically connected to and controls the operation of the electrical components of the image apparatus 10.
  • the control system 232 can include one or more processors and circuits and the control system 232 can be programmed to perform one or more of the functions described herein.
  • the control system 232 is coupled to the apparatus frame 222 and is positioned within the apparatus frame 222.
  • control system 232 includes an illuminant database 245 (illustrated as a box) that stores a separate gamut of all possible colors for each of a plurality of possible illuminants.
  • the illuminant database 245 can store a separate gamut of all possible colors for at least approximately 5, 10, 15, 20, 25, 30, 35, 40, 50, or more different illuminants.
  • Non-exclusive, specific examples for illuminants in the illuminant database 245 can include (i) the gamut of colors possible from a fluorescent light, (ii) the gamut of colors possible from an incandescent light, (iii) the gamut of colors possible from a candlelight, (iv) the gamut of colors possible from the sun at sunrise with a clear sky, (v) the gamut of colors possible from the sun at sunset with a clear sky, (vi) the gamut of colors possible from the sun at midday with a clear sky, (vii) the gamut of colors possible from an electronic flash, (viii) the gamut of colors possible from a flashlight, (ix) the gamut of colors possible from the sun with a moderately overcast sky, and/or (x) the gamut of colors possible from the sun with shade or a heavily overcast sky.
  • the gamut of colors is expressed in the chromatic scale.
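A sketch of such an illuminant database, with each gamut stored as chromaticity coordinates (intensity divided out). The sample colors below are hypothetical placeholders, not measured gamut data from the patent:

```python
def to_chromaticity(rgb):
    """Convert an (R, G, B) tuple to (r, g) chromaticity, discarding intensity."""
    r, g, b = rgb
    total = r + g + b
    if total == 0:
        return (1 / 3, 1 / 3)  # treat pure black as neutral
    return (r / total, g / total)

# Each entry stores the gamut of plausible chromaticities for one illuminant.
# The sample RGB colors are placeholders, not measured data.
ILLUMINANT_GAMUTS = {
    "fluorescent": [to_chromaticity(c) for c in [(200, 220, 210), (190, 210, 220)]],
    "incandescent": [to_chromaticity(c) for c in [(255, 180, 100), (240, 160, 80)]],
    "candlelight": [to_chromaticity(c) for c in [(255, 150, 60), (250, 140, 50)]],
}
```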
  • the control system 232 is discussed in more detail below.
  • the image apparatus 10 can include an image display 246 that displays the adjusted image 218 and/or the raw captured image 316 (illustrated in Figure 3A). With this design, the user can decide which adjusted images 218 and/or captured images 316 should be stored and which adjusted images 218 and/or captured images 316 should be deleted.
  • the image display 246 is fixedly mounted to the apparatus frame 222 on the back side.
  • the image display 246 can be secured to the apparatus frame 222 with a hinge mounting system (not shown) that enables the display to be pivoted away from the apparatus frame 222.
  • One non-exclusive example of an image display 246 includes an LCD screen. Further, the image display 246 can display other information such as the time of day, and the date.
  • the image apparatus 10 can include one or more control switches 248 electrically connected to the control system 232 that allows the user to control the functions of the image apparatus 10. Additionally, one or more of the control switches 248 can be used to selectively switch the image apparatus 10 to the white balance adjustment mode in which one or more of the adjustment features disclosed herein is activated.
  • Figure 3A illustrates a rear view of the image apparatus 10 with the raw captured image 316 displayed on the image display 246.
  • the raw captured image 316 is the image of the scene 12 (illustrated in Figure 1) that is originally captured with information from the capturing system 226 (illustrated in Figure 2A).
  • the captured image 316 has captured an image of the objects 20 (illustrated in Figure 1) as a captured first object 320A (the wall is represented by "W"), a captured second object 320B (the first painting), a captured third object 320C (the second painting), a captured fourth object 320D (the table), a captured fifth object 320E (the lamp), a captured sixth object 320F (the bulb), and a captured seventh object 320G (the candle). Additionally, a few representative pixels 350 (represented as boxes) are illustrated in Figure 3A.
  • the color of light reflected from the objects 20 will vary according to the color of the light from the illuminant 14 that is illuminating the object 20. Accordingly, the color of each of the captured objects 320A-320G will depend upon the characteristics of the illuminant 14 that is illuminating the respective object 20. For example, if the wall 20A (illustrated in Figure 1) is a constant color, for the captured image 316, the portion of the captured wall 320A that is illuminated by the first illuminant 14A will have a different tone than the portion of the captured wall 320A that is illuminated by the second illuminant 14B.
  • the control system 232 includes software that evaluates the pixels 350 in the captured image 316 to determine if the captured image 316 has been toned by more than one illuminant 14. Further, if the control system 232 determines that the captured image 316 is toned by multiple illuminants 14, the control system 232 (i) evaluates the pixels 350 in the captured image 316 to estimate the characteristics of the multiple illuminants 14, (ii) determines which pixels 350 are influenced by which illuminant 14, and (iii) performs white balance adjustment on the captured image 316 based on the characteristics of the multiple illuminants 14 to provide the adjusted image 218 (illustrated in Figure 3B). As a result, the image apparatus 10 more accurately removes unrealistic color casts from the captured image 316, and the colors in the adjusted image 218 more closely approach the true colors in the scene 12.
  • the control system 232 utilizes an energy cost system that evaluates each pixel 350 of the captured image 316, both (i) for consistency in color with a set of plausible illuminants, and (ii) for smoothness cost between neighboring pixels (or regions). Subsequently, the control system 232 uses graph cuts techniques to solve the energy cost system.
  • the control system 232 first determines the RGB (red, green, blue) tonal value for each pixel 350 in the captured image 316. Next, the RGB tonal value for each pixel 350 can be converted to a chromatic scale. Subsequently, the control system 232 can compare the chromatic value of each pixel 350 with the gamut of colors for each of the illuminants in the illuminant database 245 (illustrated in Figure 2A). In one embodiment, for each pixel 350, a weight is given with respect to each illuminant in the illuminant database 245.
  • If the color of a pixel 350 falls within the gamut of colors for an illuminant, the pixel 350 is given a perfect score of zero ("0") for that illuminant.
  • Otherwise, a matching score is measured between the input image pixel color and the closest matching color for the illuminant.
  • the matching score (e.g. weight) increases as the difference between the pixel color and the corresponding color in the illuminant gamut increases. For example, if the difference between the pixel color and the corresponding color in the illuminant gamut is small, the matching score is small.
  • each pixel 350 will have a value for each of the illuminants in the illuminant database 245; for example, with a database of thirty illuminants, each pixel 350 will have thirty values.
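A minimal sketch of this per-pixel weighting, assuming each gamut is stored as a list of (r, g) chromaticity points; the function names here are illustrative, not from the patent:

```python
def matching_score(pixel_rgb, gamut):
    """Data cost of one illuminant for one pixel: zero when the pixel's
    chromaticity coincides with a gamut color, growing with the distance to
    the closest gamut color (squared Euclidean distance assumed here)."""
    r, g, b = pixel_rgb
    total = (r + g + b) or 1
    pr, pg = r / total, g / total  # chromaticity of the pixel
    return min((pr - gr) ** 2 + (pg - gg) ** 2 for gr, gg in gamut)

def pixel_weights(pixel_rgb, illuminant_gamuts):
    """One weight per illuminant in the database; a perfect match scores 0."""
    return {name: matching_score(pixel_rgb, gamut)
            for name, gamut in illuminant_gamuts.items()}
```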
  • the control system 232 evaluates the color of each pixel 350 and compares the color of each pixel 350 to that of its neighbors (the pixels that surround the pixel that is being evaluated).
  • pixels 350 that are spatially clustered together are illuminated by the same illuminant and have illuminant spatial smoothness.
  • neighboring pixels 350 with a similar color tend to lie under the same light;
  • spatially connected regions have high probability of being under the same light; and
  • pixels 350 that are greatly spaced apart are less likely to be under the same light.
  • the control system 232 performs a color analysis of each pixel 350 relative to its neighbors to determine if there is a variation in color between the pixel that is evaluated and its neighbors. If a given pixel and its neighbors are similar in color, there is high probability they are under the same light and that pixel is given a positive weight. Stated in another fashion, adjacent pixels that are similar in color are assumed to be under the same light. Alternatively, if the pixel that is being evaluated is different in color from its neighbor, a smaller non-negative weight is given. If pixels are spaced apart, there is less chance that they are under the same illuminant.
  • the control system 232 uses graph cuts techniques to solve the energy cost system, i.e. to assign to each pixel (or region) of the captured image the illuminant that best accommodates both the color/gamut consistency and the illuminant spatial smoothness.
  • the control system converts the color constancy problem to an energy optimization problem that finds the illuminant partition (or labeling) of each pixel 350 that gives the best consistency between the captured colors and typical illuminant sets.
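The energy described above can be sketched as a per-pixel data term (the gamut matching score) plus a smoothness penalty for 4-connected neighbors with different illuminant labels. The patent solves this with graph cuts techniques; the sketch below substitutes a simple iterated-conditional-modes pass, which only approximates that optimization, and all names are assumptions:

```python
def solve_labels(data_cost, width, height, smoothness=1.0, iterations=5):
    """data_cost[i][label] -> cost of assigning that illuminant label to pixel i.
    Approximately minimizes the sum of data costs plus a Potts smoothness
    penalty over 4-connected neighbors (ICM, not true graph cuts)."""
    n_labels = len(data_cost[0])
    # Start from the per-pixel best label (data term only).
    labels = [min(range(n_labels), key=lambda l: data_cost[i][l])
              for i in range(width * height)]

    def neighbors(i):
        x, y = i % width, i // width
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < width and 0 <= ny < height:
                yield ny * width + nx

    for _ in range(iterations):
        for i in range(width * height):
            def energy(l):
                # Data term plus a penalty for each neighbor with another label.
                return data_cost[i][l] + smoothness * sum(
                    labels[j] != l for j in neighbors(i))
            labels[i] = min(range(n_labels), key=energy)
    return labels
```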
  • the control system 232 evaluates the pixels 350 and determines which pixels 350 have been toned with which illuminant. With this design, the control system 232 can partition the captured image 316 into different image regions 352 (defined by dashed lines) based upon the areas that are toned by the different illuminants 14. In Figure 3A, because the scene 12 is illuminated by three illuminants 14, the captured image 316 can be partitioned into three image regions 352, namely a first image region 352A, a second image region 352B, and a third image region 352C.
  • the first image region 352A includes the pixels 350 that have captured the part of the scene 12 illuminated mainly by the first illuminant 14A and includes the captured image of the fluorescent lamp 320E, the captured image of the first painting 320B and the captured image of a portion of the wall 320A;
  • the second image region 352B includes the pixels 350 that have captured the part of the scene 12 illuminated mainly by the second illuminant 14B and includes the captured image of the incandescent bulb 320F, the captured image of the second painting 320C and the captured image of a portion of the wall 320A;
  • the third image region 352C includes the pixels 350 that have captured the part of the scene 12 illuminated mainly by the third illuminant 14C and includes the captured image of the candle 320G and the captured image of the table 320D.
  • The size and shape of each image region 352 can vary according to the scene 12 and the illuminants 14. Typically, a truly white object that is illuminated with an illuminant having a color temperature in the range of 5,000-5,500 K will appear white. Alternatively, a truly white object that is illuminated with an illuminant having a color temperature below 5,000 K will appear more yellow or red. Still alternatively, a truly white object that is illuminated with an illuminant having a color temperature above 5,500 K will appear tinged with blue.
  • the control system 232 determines which pixels 350 are in which image region 352 and estimates which illuminant has toned which image region 352. Subsequently, the control system 232 can perform selective white balance adjustment for each image region 352 based upon the illuminant.
  • the control system 232 performs a different level of white balance adjustment for each image region 352.
  • the control system 232 can perform a first level of white balance adjustment (illustrated as ".") on the first image region 352A, a second level of white balance adjustment (illustrated as "#") on the second image region 352B, and a third level of white balance adjustment (illustrated as "*") on the third image region 352C.
  • the level of white balance adjustment is based upon the characteristics of the respective illuminant.
  • the difference in white balance adjustment between the first, second and third levels can vary.
  • the tonal values of red, green, and blue can be expressed on a scale of 0 to 255.
  • one or more of the red, green, and blue tonal values can be attenuated or amplified at a first rate
  • one or more of the red, green, and blue tonal values can be attenuated or amplified at a second rate that is different than the first rate
  • the red, green, and blue tonal values can be attenuated or amplified at a third rate that is different than the first rate and the second rate.
  • the amount of white balance adjustment will depend upon the lighting conditions of the portion of the scene captured with the respective image region 352 and the control system 232 compensates the different image regions 352 at different rates.
  • the control system 232 includes software that adjusts the colors in each image region 352 according to the color of the light source that illuminated that region. This can yield more uniform and acceptable white balance in the adjusted image 218.
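One conventional way to realize a per-region level of adjustment is a diagonal (von Kries-style) gain that maps the region's estimated illuminant white to a reference white. This is a hedged sketch under that assumption, not necessarily the patent's exact adjustment; the function names are illustrative:

```python
def white_balance_pixel(pixel_rgb, illuminant_rgb, reference=(255, 255, 255)):
    """Apply diagonal gains so the region's estimated illuminant white maps
    to the reference white; results clip to the 0-255 tonal scale."""
    return tuple(
        min(255, round(p * ref / max(ill, 1)))  # guard against a zero channel
        for p, ill, ref in zip(pixel_rgb, illuminant_rgb, reference))

def adjust_region(pixels, illuminant_rgb):
    """Apply one level of white balance adjustment to every pixel in a region."""
    return [white_balance_pixel(p, illuminant_rgb) for p in pixels]
```

Because each image region carries its own estimated illuminant, calling `adjust_region` with a different `illuminant_rgb` per region attenuates or amplifies the red, green, and blue tonal values at different rates, as described above.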
  • the level of white balance performed at the transitions of the regions can be blended between the different levels of white balance adjustments to avoid drastically changing the colors at the borders of the regions.
  • the level of illuminant scale adjustment along borders of the regions is gradually changed. For example, (i) at the transition between the first image region 352A and the second image region 352B, the level of white balance can be intermediate the first level of white balance adjustment and the second level of white balance adjustment, and (ii) at the transition between the second image region 352B and the third image region 352C, the level of white balance can be intermediate the second level of white balance adjustment and the third level of white balance adjustment.
  • the level of white balance can gradually transition from the first level of white balance adjustment to the second level of white balance adjustment
  • the level of white balance can gradually transition from the second level of white balance adjustment to the third level of white balance adjustment
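The gradual transition described above can be sketched as a linear blend of the two regions' white balance gains across a transition band; the band width and the names below are assumptions, not values from the patent:

```python
def blended_gains(gains_a, gains_b, distance, band=10.0):
    """Interpolate per-channel white balance gains across a transition band.
    distance: signed distance from the region border in pixels (negative =
    inside region A, positive = inside region B); outside the band the pure
    per-region gain applies."""
    t = max(0.0, min(1.0, (distance + band / 2) / band))  # 0 -> A, 1 -> B
    return tuple((1 - t) * a + t * b for a, b in zip(gains_a, gains_b))
```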
  • control system 232 can be designed to perform greater than three or less than three levels of white balance adjustment on the captured image 316.
  • control system 232 evaluates the pixels and automatically performs the appropriate level of white balance adjustment for each image region 352.
  • the user can utilize one or more of the control switches 248 to manually bracket, highlight or otherwise identify one or more of the image regions 352 and subsequently select the desired level of white balance adjustment for each image region 352.
  • This embodiment may be more suited for subsequent adjustment outside of the camera with a post-processing adjustment system (illustrated in Figure 5).
  • control system 232 can also determine that the scene 12 is being illuminated by only one illuminant 14. In this example, the control system 232 will provide a single level of white balance adjustment for the entire captured image 316.
  • control system 232 can limit the number of separate image regions 352 to three or less and limit the number of white balance adjustment levels to three or less.
  • Figure 4 is a simplified flow chart that further illustrates one non-exclusive example of the operation of the image apparatus. It should be noted that one or more of the steps can be omitted or the order of steps can be switched.
  • the image apparatus is aimed toward the scene 410.
  • the user adjusts the zoom so as to adjust the size of the image as desired 412.
  • the user presses lightly on the shutter button to enable the image apparatus to automatically focus on the object(s) 414.
  • the user presses the shutter button all the way, which resets the image sensor, and exposes the image sensor to light 416.
  • the analog-to-digital converter (ADC) measures the charge at each photosite of the image sensor and creates a digital signal that represents the values of the charge at each photosite 418.
  • control system interpolates the data from the different photosites, with assistance from the filtering component, to create the raw captured image 420.
  • the control system evaluates the captured image, determines if the scene includes multiple illuminants, determines the characteristics of the illuminants, and determines which pixels are influenced by which illuminant 422.
  • control system selectively applies white balance compensation to the raw captured image to get the adjusted image, and displays the adjusted image 424.
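Steps 422 and 424 can be sketched end to end: label each pixel with the best-matching illuminant and then apply that illuminant's gains. The spatial-smoothness and graph cuts machinery is omitted here for brevity, and all names are illustrative assumptions:

```python
def adjust_image(pixels, illuminant_whites, gamuts):
    """Sketch of steps 422-424: label each pixel with the illuminant whose
    gamut its chromaticity matches best, then white-balance the pixel with
    that illuminant's diagonal gains (no spatial smoothing)."""
    def chroma(rgb):
        s = sum(rgb) or 1
        return (rgb[0] / s, rgb[1] / s)

    def score(rgb, gamut):
        pr, pg = chroma(rgb)
        return min((pr - r) ** 2 + (pg - g) ** 2 for r, g in gamut)

    out = []
    for p in pixels:
        label = min(gamuts, key=lambda name: score(p, gamuts[name]))
        white = illuminant_whites[label]
        out.append(tuple(min(255, round(c * 255 / max(w, 1)))
                         for c, w in zip(p, white)))
    return out
```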
  • Figure 5 is a simplified illustration of a combination having features of the present invention, including a camera 510, and a post-processing adjustment system 512.
  • the camera 510 captures the captured images (not shown in Figure 5) and the adjustment system 512 provides the different levels of white balance adjustment to the captured image.
  • the adjustment system 512 includes a control system with software that (i) evaluates the captured image to determine if the scene is illuminated by more than one illuminant, (ii) determines the characteristics of the one or more illuminants, and (iii) performs white balance adjustment on the captured image based on the multiple illuminants to provide an adjusted image.
  • the captured image can be transferred to the adjustment system 512 via an electrical connection line (not shown), a wireless connection, or in another fashion.
  • the camera 510 can include a removable storage system (not shown in Figure 5) that is selectively removed from the camera 510 and inserted into a docking port (not shown) of the adjustment system 512.
  • FIG. 6 is a simplified flow chart that illustrates another non-exclusive example of the operation of the image apparatus. This example includes steps 610, 612, 614, 616, 618, 620, 624 that are similar to the corresponding steps described above and illustrated in Figure 4.
  • the input image is preprocessed and segmented at step 621 into a plurality of pixel groups prior to determining if the scene is illuminated by multiple illuminants.
  • pixels that have a similar color and that are in the same neighborhood are grouped together into pixel groups.
  • FIG. 7 illustrates the image apparatus 710, a raw captured image 716 and a plurality of representative pixel groups, namely a first pixel group 760A, a second pixel group 760B, and a third pixel group 760C that have been segmented.
  • the pixels (not shown in Figure 7) in each pixel group 760A, 760B, 760C are similar in color and have been segmented by the control system 732 during preprocessing of the image 716.
  • the control system utilizes an energy cost system to evaluate the pixel groups 622 both (i) for consistency in color to a set of plausible illuminants, and (ii) also for smoothness cost between neighboring pixel groups. Subsequently, the control system uses graph cuts techniques to solve the energy cost system. In one embodiment, a weight is given for each pixel group 760A-760C with respect to each illuminant in the illuminant database.
  • if the color of a pixel group 760A-760C exactly matches a possible color for an illuminant, the pixel group is given a perfect score of zero ("0") for that illuminant.
  • a matching score is measured between the pixel group color and the closest matching color for the illuminant.
  • the matching score (e.g. weight) increases as the difference between the pixel group color and the corresponding color in the illuminant gamut increases.
  • each pixel group 760A-760C will have a value for each of the thirty illuminants.
  • control system 732 evaluates the color of each pixel group 760A-760C and compares the color of each pixel group to that of its neighboring pixel groups (the pixel groups that surround the pixel group being evaluated). The control system 732 then performs a color analysis of each pixel group 760A-760C relative to its neighboring pixel groups to determine if there is a variation in color between the pixel group being evaluated and its neighboring pixel groups. If a given pixel group and its neighboring pixel groups are similar in color, there is a high probability that they are under the same light, and that pixel group is given a positive weight. Stated in another fashion, adjacent pixel groups that are similar in color are assumed to be under the same light. Alternatively, if the pixel group being evaluated is different in color from its neighboring pixel groups, a smaller non-negative weight is given. If pixel groups are spaced apart, there is less chance that they are under the same illuminant.
  • control system 732 uses graph cuts techniques to solve the energy cost system, i.e. assign the illuminant to each pixel group of the captured image that best accommodates both the color/gamut consistency and the illuminant spatial smoothness.
  • the control system 732 converts the color constancy problem to an energy optimization problem that finds the illuminant partition (or labeling) of each pixel group 760A-760C that gives best consistency between the captured colors and typical illuminant sets.
  • the control system 732 evaluates the pixel groups 760A-760C and determines which pixel groups 760A-760C have been toned with which illuminant.
  • the control system 732 can partition the captured image 716 into different image regions 752 (defined by dashed lines) based upon the areas that are toned by the different illuminants.
  • control system selectively applies white balance compensation to the raw captured image to get the adjusted image, and displays the adjusted image 624.
  • the number of computations necessary to determine if the scene was illuminated by multiple illuminants can be reduced.
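The pixel-group segmentation described above (step 621, producing groups such as 760A-760C in Figure 7) could be sketched as a simple flood fill that merges 4-connected pixels of similar color. This is only a plausible sketch: the patent does not specify the segmentation algorithm, and the tolerance value below is an assumption.

```python
import numpy as np
from collections import deque

def segment_pixel_groups(chroma, tol=0.05):
    """Group 4-connected pixels whose chromaticity values differ by less
    than `tol` into pixel groups. Returns a label map and the group count.
    (Minimal flood-fill sketch; the actual segmentation method used in the
    camera is not detailed in the text.)"""
    h, w = chroma.shape[:2]
    labels = -np.ones((h, w), dtype=int)  # -1 means "not yet grouped"
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx] != -1:
                continue
            labels[sy, sx] = next_label
            queue = deque([(sy, sx)])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1:
                        # merge the neighbor if its color is close enough
                        if np.linalg.norm(chroma[ny, nx] - chroma[y, x]) < tol:
                            labels[ny, nx] = next_label
                            queue.append((ny, nx))
            next_label += 1
    return labels, next_label
```

Grouping in this way is what lets the later illuminant analysis run on a handful of groups instead of every pixel, which is the computational saving the bullet above refers to.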

Abstract

An image apparatus (10) for providing an adjusted image (218) of a scene (12) includes a capturing system (226) and a control system (232). The capturing system (226) captures a captured image (316) of the scene (12). The control system (232) evaluates the captured image (316) and determines if the scene (12) is illuminated by more than one illuminant (14). With this design, the image apparatus (10) is particularly suited to capture scenes (12) that are illuminated by multiple illuminants (14). The control system (232) can evaluate the captured image (316) and estimate which portion of the captured image (316) was toned by a first illuminant (14A) and which portion of the captured image (316) was toned by the second illuminant (14B). Further, the control system (232) can evaluate the captured image (316) to estimate the characteristics of the first illuminant (14A) and the second illuminant (14B).

Description

WHITE BALANCE ADJUSTMENT FOR
SCENES WITH VARYING ILLUMINATION for Li Hong
BACKGROUND
Cameras are commonly used to capture an image of a scene. Most scenes are not illuminated by a 100% pure white light source. For example, sunlight at midday is much closer to white than the late afternoon or morning sunlight which includes more yellow. The color of light reflected from an object will vary according to the color of the light source that is illuminating the scene. As a result thereof, for example, if the color of the light source includes a lot of yellow, a white object in the scene will not be captured as a white object with a typical film type camera.
Recently, some digital cameras have been designed to include a program that adjusts all of the colors in a captured image according to the color of the illuminant. This is commonly referred to as white balance correction. With white balance correction, the digital camera attempts to compensate for variations in the colors in the captured image caused by an off-white illuminant, and the actual colors of the objects in the scene are more accurately represented in the provided image.
Unfortunately, existing white balance correction has not been able to accurately adjust all of the colors in the captured image in all situations and thus is not completely satisfactory.
SUMMARY
The present invention is directed to an image apparatus for providing an adjusted image of a scene. The image apparatus includes a capturing system and a control system. The capturing system captures information for a captured image of the scene. In one embodiment, the control system evaluates the captured image and determines if the scene is illuminated by more than one illuminant. With this design, the image apparatus is particularly suited to capture scenes that are illuminated by multiple illuminants.
For example, the scene can include a first scene region that is illuminated by a first illuminant and a second scene region that is illuminated by a second illuminant. In this example, the control system can evaluate the captured image and estimate which portion of the captured image was toned by the first illuminant and which portion of the captured image was toned by the second illuminant.
Additionally, the control system can evaluate the captured image to estimate the characteristics of the first illuminant and the second illuminant. Further, the control system can separate the captured image into at least a first image region and a second image region. With this design, the control system can perform a first level of white balance adjustment on the first image region and a second level of white balance adjustment on the second image region. In this embodiment, the second level of white balance adjustment is different than the first level of white balance adjustment. With this design, in certain embodiments, the control system can compensate for the different illuminants in the scene and the characteristics of both illuminants are used to guide the color adjustment. As a result thereof, the adjusted image more closely represents the true colors in the scene.
As utilized herein, the terms "true colors" or "actual colors" shall mean the colors that are present at the scene when the scene is illuminated with an even, true 100 percent white light. Typically, the captured image is defined by a plurality of pixels. In one embodiment, the control system evaluates the pixels relative to an illuminant database and evaluates each pixel relative to its neighbors to determine if the scene is illuminated by more than one illuminant and to estimate the characteristics of the one or more illuminants.
The present invention is also directed to an image apparatus including an apparatus frame, a capturing system secured to the apparatus frame, and a control system secured to the apparatus frame. In this embodiment, the capturing system again captures information for a captured image of the scene. Further, in this embodiment, the control system evaluates the captured image to estimate the characteristics of the first illuminant and the second illuminant. Moreover, the control system performs a first level of white balance adjustment on a first image region of the captured image and performs a second level of white balance adjustment on a second image region of the captured image. In this embodiment, the second level of white balance adjustment is different than the first level of white balance adjustment.
Moreover, the present invention is also directed to one or more methods for providing an adjusted image of the scene. In one embodiment, the method includes the steps of capturing information for a captured image of the scene with a capturing system, and evaluating the captured image and determining if the scene is illuminated by more than one illuminant with a control system.
BRIEF DESCRIPTION OF THE DRAWINGS
The novel features of this invention, as well as the invention itself, both as to its structure and its operation, will be best understood from the accompanying drawings, taken in conjunction with the accompanying description, in which similar reference characters refer to similar parts, and in which:
Figure 1 is a simplified top plan view of a scene and an image apparatus having features of the present invention;
Figure 2A is a simplified front perspective view of one embodiment of the image apparatus;
Figure 2B is a simplified rear view of the image apparatus of Figure 2A and an adjusted image of the scene of Figure 1;
Figure 3A illustrates the image apparatus and a raw captured image of the scene of Figure 1 divided into image regions;
Figure 3B illustrates the image apparatus, the adjusted image, and the image regions;
Figure 4 is a simplified flowchart that details the operation of the image apparatus;
Figure 5 is a simplified illustration of another embodiment of an image apparatus having features of the present invention;
Figure 6 is a simplified flowchart that details the operation of another embodiment of the image apparatus; and
Figure 7 illustrates the image apparatus, a raw captured image of the scene of Figure 1 divided into image regions and segmented into pixel groups.
DESCRIPTION
Figure 1 is a simplified top plan illustration of an image apparatus 10 and a scene 12 that is illuminated by more than one illuminant 14. As an overview, in certain embodiments, the image apparatus 10 is uniquely designed to (i) capture a captured image 316 (illustrated in Figure 3A), (ii) evaluate the captured image 316 to determine if the scene 12 is illuminated by more than one illuminant 14, and (iii) perform white balance adjustment on the captured image 316 based on the multiple illuminants 14 to provide an adjusted image 218 (illustrated in Figure 2B). As a result thereof, the image apparatus 10 more accurately removes unrealistic color casts from the captured image 316 so that the colors in the adjusted image 218 more accurately approach the true colors in the scene 12.
The type of scene 12 captured by the image apparatus 10 can vary. For example, the scene 12 can include one or more objects 20, e.g. animals, plants, mammals, and/or environments. For simplicity, in Figure 1 , the scene 12 is illustrated as including seven objects 20. Alternatively, the scene 12 can include more than seven or less than seven objects 20.
In Figure 1, one of the objects 20 is a wall 20A, one of the objects 20 is a first painting 20B attached to the wall 20A, one of the objects 20 is a second painting 20C attached to the wall 20A, one of the objects 20 is a table 20D, and the remaining three objects 20 are the illuminants 14. Alternatively, one or all of the illuminants 14 may not be in the scene 12 that is being captured with the image apparatus 10.
The number, design and location of the illuminants 14 that illuminate the scene 12 can vary greatly. In Figure 1, the scene 12 is illuminated by a first illuminant 14A, a second illuminant 14B and a third illuminant 14C. Alternatively, the scene 12 can be illuminated with more than three or fewer than three illuminants 14. In Figure 1, the first illuminant 14A is a fluorescent lamp positioned near the first painting 20B, the second illuminant 14B is an incandescent bulb attached to the wall 20A near the second painting 20C, and the third illuminant 14C is a candle positioned on the table 20D. Non-exclusive examples of other illuminants 14 can include (i) the sun at sunrise with a clear sky, (ii) the sun at sunset with a clear sky, (iii) the sun at midday with a clear sky, (iv) an electronic flash, (v) a flashlight, (vi) the sun with a moderately overcast sky, or (vii) the sun with shade or a heavily overcast sky.
It should be noted that in Figure 1 , the illuminants 14 are different from each other. More specifically, fluorescent light (illustrated as arrows from the first illuminant 14A), the incandescent light (illustrated as arrows from the second illuminant 14B), and candlelight (illustrated as arrows from the third illuminant 14C) each have a different color temperature. For example, candlelight has a color temperature of between approximately 1000-2000K, the incandescent light has a color temperature of between approximately 2500-3500K, while the fluorescent light has a color temperature of between approximately 4000-5000K. As a result thereof, the objects 20 illuminated by each of the illuminants 14 will have a color cast that corresponds to its respective illuminant 14.
For discussion, the scene 12 can be divided into different scene regions 12A (defined by dashed lines) based upon the areas that are illuminated by the different illuminants 14. In Figure 1, because the scene 12 is illuminated by three illuminants 14, it can be divided into three scene regions 12A, namely a first scene region 12B, a second scene region 12C, and a third scene region 12D. In this example, (i) the first scene region 12B is illuminated mainly by the first illuminant 14A and includes the fluorescent lamp 14A, the first painting 20B and a portion of the wall 20A; (ii) the second scene region 12C is illuminated mainly by the second illuminant 14B and includes the incandescent bulb 14B, the second painting 20C and a portion of the wall 20A; and (iii) the third scene region 12D is illuminated mainly by the third illuminant 14C and includes the candle 14C and the table 20D. The size and shape of each scene region 12A can vary according to the scene 12 and the illuminants 14.
In certain embodiments, the image apparatus 10 is water resistant and is adapted to capture images of scenes that are partly or fully in a liquid such as water. Alternatively, the image apparatus 10 can be enclosed by an outer shell (not shown) that surrounds and encloses the image apparatus 10 and that provides a watertight barrier around the image apparatus 10.
Figure 2A illustrates a simplified, front perspective view of one, nonexclusive embodiment of the image apparatus 10. In this embodiment, the image apparatus 10 is a digital camera, and includes an apparatus frame 222, an optical assembly 224, a capturing system 226 (illustrated as a box in phantom), a power source 228 (illustrated as a box in phantom), a flash system 230, and a control system 232 (illustrated as a box in phantom). The design of these components can be varied to suit the design requirements and type of image apparatus 10. Further, the image apparatus 10 could be designed without one or more of these components. For example, the image apparatus 10 could be designed without the flash system 230.
The apparatus frame 222 can be rigid and support at least some of the other components of the image apparatus 10. In one embodiment, the apparatus frame 222 includes a generally rectangular shaped hollow body that forms a cavity that receives and retains at least a portion of the capturing system 226.
The apparatus frame 222 can include an aperture 234 and a shutter mechanism 236 that work together to control the amount of light that reaches the capturing system 226. The shutter mechanism 236 can be activated by a shutter button 238. The shutter mechanism 236 can include a pair of blinds (sometimes referred to as "blades") that work in conjunction with each other to allow the light to be focused on the capturing system 226 for a certain amount of time. Alternatively, for example, the shutter mechanism 236 can be all electronic and contain no moving parts. For example, an electronic capturing system 226 can have a capture time controlled electronically to emulate the functionality of the blinds.
The optical assembly 224 can include a single lens or a combination of lenses that work in conjunction with each other to focus light onto the capturing system 226. In one embodiment, the image apparatus 10 includes an autofocus assembly (not shown) including one or more lens movers that move one or more lenses of the optical assembly 224 in or out until the sharpest possible image of the subject is received by the capturing system 226.
The capturing system 226 captures information for the captured image 316 (illustrated in Figure 3A). The design of the capturing system 226 can vary according to the type of image apparatus 10. For a digital type camera, the capturing system 226 includes an image sensor 240 (illustrated in phantom), a filter assembly 242 (illustrated in phantom), and a storage system 244 (illustrated in phantom).
The image sensor 240 receives the light that passes through the aperture 234 and converts the light into electricity. One non-exclusive example of an image sensor 240 for digital cameras is known as a charge coupled device ("CCD"). An alternative image sensor 240 that may be employed in digital cameras uses complementary metal oxide semiconductor ("CMOS") technology.
The image sensor 240, by itself, produces a grayscale image as it only keeps track of the total intensity of the light that strikes the surface of the image sensor 240. Accordingly, in order to produce a full color image, the filter assembly 242 is necessary to capture the colors of the image.
It should be noted that other designs for the capturing system 226 can be utilized. It should also be noted, as discussed in more detail below, that with information from the capturing system 226, the control system 232 can selectively compensate the colors in the raw captured image 316.
The storage system 244 stores the various captured images before the images are ultimately printed out, deleted, transferred or downloaded to an auxiliary compensation system (not shown in Figure 2A), an auxiliary storage system or a printer. The storage system 244 can be fixedly or removably coupled to the apparatus frame 222. Non-exclusive examples of suitable storage systems 244 include flash memory, a floppy disk, a hard disk, or a writeable CD or DVD.
The power source 228 provides electrical power to the electrical components of the image apparatus 10. For example, the power source 228 can include one or more chemical batteries, either the one time use disposable batteries (such as alkaline, zinc-air), or the multiple use rechargeable batteries (such as nickel-cadmium, nickel-metal-hydride, lead-acid, lithium-ion).
The flash system 230 provides a flash of light that can be used to selectively illuminate at least a portion of the scene 12 (illustrated in Figure 1).
The control system 232 evaluates the captured image 316 to determine if the scene 12 is illumined by more than one illuminant 14, and performs white balance adjustment on the captured image 316 based on the multiple illuminants 14 to provide the adjusted image 218.
In one embodiment, the control system 232 is electrically connected to and controls the operation of the electrical components of the image apparatus 10. The control system 232 can include one or more processors and circuits and the control system 232 can be programmed to perform one or more of the functions described herein. In Figure 2A, the control system 232 is coupled to the apparatus frame 222 and is positioned within the apparatus frame 222.
In one embodiment, the control system 232 includes an illuminant database 245 (illustrated as a box) that stores a separate gamut of all possible colors for each of a plurality of possible illuminants. For example, the illuminant database 245 can store a separate gamut of all possible colors for at least approximately 5, 10, 15, 20, 25, 30, 35, 40, 50, or more different illuminants. Non-exclusive, specific examples for illuminants in the illuminant database 245 can include (i) the gamut of colors possible from a fluorescent light, (ii) the gamut of colors possible from an incandescent light, (iii) the gamut of colors possible from a candlelight, (iv) the gamut of colors possible from the sun at sunrise with a clear sky, (v) the gamut of colors possible from the sun at sunset with a clear sky, (vi) the gamut of colors possible from the sun at midday with a clear sky, (vii) the gamut of colors possible from an electronic flash, (viii) the gamut of colors possible from a flashlight, (ix) the gamut of colors possible from the sun with a moderately overcast sky, and/or (x) the gamut of colors possible from the sun with shade or a heavily overcast sky. In one embodiment, the gamut of colors is expressed in the chromatic scale. The control system 232 is discussed in more detail below.
Referring to Figure 2B, the image apparatus 10 can include an image display 246 that displays the adjusted image 218 and/or the raw captured image 316 (illustrated in Figure 3A). With this design, the user can decide which adjusted images 218 and/or captured images 316 should be stored and which adjusted images 218 and/or captured images 316 should be deleted. In Figure 2B, the image display 246 is fixedly mounted to the apparatus frame 222 on the back side. Alternatively, the image display 246 can be secured to the apparatus frame 222 with a hinge mounting system (not shown) that enables the display to be pivoted away from the apparatus frame 222.
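A minimal sketch of such an illuminant database might store each illuminant's gamut as a set of sample points on a chromatic (r, g) scale and rank illuminants by how closely a pixel color matches each gamut. The illuminant names follow the examples above, but the numeric values are purely illustrative, not measured gamut data.

```python
import numpy as np

# Hypothetical illuminant database: each entry lists sample (r, g)
# chromaticity points from that illuminant's gamut. A real database
# would hold measured gamuts for dozens of illuminants.
ILLUMINANT_DB = {
    "fluorescent":  np.array([[0.33, 0.36], [0.35, 0.37], [0.34, 0.38]]),
    "incandescent": np.array([[0.45, 0.41], [0.47, 0.40], [0.44, 0.39]]),
    "candlelight":  np.array([[0.53, 0.41], [0.55, 0.40], [0.52, 0.42]]),
}

def closest_illuminants(pixel_chroma):
    """Rank database illuminants by the distance from a pixel's chromaticity
    to the nearest sample point in each gamut (smaller is a better match)."""
    scores = {name: float(np.linalg.norm(gamut - pixel_chroma, axis=1).min())
              for name, gamut in ILLUMINANT_DB.items()}
    return sorted(scores, key=scores.get)
```

For example, a pixel whose chromaticity falls inside the candlelight samples would rank "candlelight" first.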
One non-exclusive example of an image display 246 includes an LCD screen. Further, the image display 246 can display other information such as the time of day, and the date.
Moreover, the image apparatus 10 can include one or more control switches 248 electrically connected to the control system 232 that allows the user to control the functions of the image apparatus 10. Additionally, one or more of the control switches 248 can be used to selectively switch the image apparatus 10 to the white balance adjustment mode in which one or more of the adjustment features disclosed herein is activated.
Figure 3A illustrates a rear view of the image apparatus 10 with the raw captured image 316 displayed on the image display 246. The raw captured image 316 is the image of the scene 12 (illustrated in Figure 1) that is originally captured with information from the capturing system 226 (illustrated in Figure 2A). In Figure 3A, the captured image 316 has captured an image of the objects 20 (illustrated in Figure 1) as a captured first object 320A (the wall is represented by "W"), a captured second object 320B (the first painting), a captured third object 320C (the second painting), a fourth captured object 320D (the table), a fifth captured object 320E (the lamp), and a sixth captured object 320F (the bulb), and a seventh captured object 320G (the candle). Additionally, a few representative pixels 350 (represented as boxes) are illustrated in Figure 3A.
As is known, the color of light reflected from the objects 20 (illustrated in Figure 1) will vary according to the color of the light from the illuminant 14 that is illuminating the object 20. Accordingly, the color of each of the captured objects 320A-320G will depend upon the characteristics of the illuminant 14 that is illuminating the respective object 20. For example, if the wall 20A (illustrated in Figure 1) is a constant color, for the captured image 316, the portion of the captured wall 320A that is illuminated by the first illuminant 14A will have a different tone than the portion of the captured wall 320A that is illuminated by the second illuminant 14B.
As provided herein, the control system 232 includes software that evaluates the pixels 350 in the captured image 316 to determine if the captured image 316 has been toned by more than one illuminant 14. Further, if the control system 232 determines that the captured image 316 is toned by multiple illuminants 14, the control system 232 (i) evaluates the pixels 350 in the captured image 316 to estimate the characteristics of the multiple illuminants 14, (ii) determines which pixels 350 are influenced by which illuminant 14, and (iii) performs white balance adjustment on the captured image 316 based on the characteristics of the multiple illuminants 14 to provide the adjusted image 218 (illustrated in Figure 3B). As a result thereof, the image apparatus 10 more accurately removes unrealistic color casts from the captured image 316 so that the colors in the adjusted image 218 more accurately approach the true colors in the scene 12.
As an overview, in one embodiment, the control system 232 utilizes an energy cost system that evaluates each pixel 350 of the captured image 316, both (i) for consistency in color to a set of plausible illuminants, and (ii) also for smoothness cost between neighboring pixels (or region). Subsequently, the control system 232 uses graph cuts techniques to solve the energy cost system.
In certain embodiments, the control system 232 first determines the RGB (red, green, blue) tonal value for each pixel 350 in the captured image 316. Next, the RGB tonal value for each pixel 350 can be converted to a chromatic scale. Subsequently, the control system 232 can compare the chromatic value of each pixel 350 with the gamut of colors for each of the illuminants in the illuminant database 245 (illustrated in Figure 2A). In one embodiment, for each pixel 350, a weight is given with respect to each illuminant in the illuminant database 245. For example, if the chromatic value of the pixel 350 exactly matches a possible chromatic value for an illuminant, the pixel 350 is given a perfect score of zero ("0") for that illuminant. Alternatively, if the chromatic value of the pixel 350 does not match a possible chromatic value for an illuminant, a matching score is measured between the input image pixel color and the closest matching color for the illuminant. The matching score (e.g. weight) increases as the difference between the pixel color and the corresponding color in the illuminant gamut increases. For example, if the difference between the pixel color and the corresponding color in the illuminant gamut is small, the matching score is small. Alternatively, if the difference between the pixel color and the corresponding color in the illuminant gamut is large, the matching score is large. Thus, for example, if the illuminant database 245 includes the color gamut for thirty illuminants, each pixel 350 will have a value for each of the thirty illuminants.
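The two steps above, converting RGB tonal values to a chromatic scale and scoring a pixel against an illuminant gamut, might be sketched as follows. The patent names neither the exact color space nor the distance measure, so rg chromaticity and Euclidean distance are assumptions here.

```python
import numpy as np

def rgb_to_chromaticity(r, g, b):
    """Convert 0-255 RGB tonal values to rg chromaticity, which discards
    intensity and keeps only color (one plausible 'chromatic scale'; the
    text does not specify the exact conversion)."""
    total = r + g + b
    if total == 0:
        return (1 / 3, 1 / 3)  # treat pure black as neutral
    return (r / total, g / total)

def gamut_matching_score(pixel_chroma, illuminant_gamut):
    """Weight of a pixel with respect to one illuminant: zero on an exact
    gamut match, growing with the distance to the closest gamut color."""
    diffs = np.linalg.norm(
        np.asarray(illuminant_gamut) - np.asarray(pixel_chroma), axis=1)
    return float(diffs.min())
```

Running both functions over every pixel and every database illuminant yields the per-pixel, per-illuminant weight table described above (thirty scores per pixel for a thirty-illuminant database).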
Next, the control system 232 evaluates the color of each pixel 350 and compares the color of each pixel 350 to that of its neighbors (the pixels that surround the pixel that is being evaluated). Typically, pixels 350 that are spatially clustered together are illuminated by the same illuminant and have illuminant spatial smoothness. Stated in another fashion, (i) neighboring pixels 350 with a similar color tend to lie under the same light; (ii) spatially connected regions have high probability of being under the same light; and (iii) pixels 350 that are greatly spaced apart are less likely to be under the same light.
The control system 232 performs a color analysis of each pixel 350 relative to its neighbors to determine if there is a variation in color between the pixel that is evaluated and its neighbors. If a given pixel and its neighbors are similar in color, there is a high probability that they are under the same light, and that pixel is given a positive weight. Stated in another fashion, adjacent pixels that are similar in color are assumed to be under the same light. Alternatively, if the pixel that is being evaluated is different in color from its neighbor, a smaller non-negative weight is given. If pixels are spaced apart, there is less chance that they are under the same illuminant.
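One way to realize such a neighbor weight is a Potts-style smoothness term that decays with color difference: similar neighbors get a large penalty for receiving different illuminant labels, dissimilar neighbors a small but non-negative one. The exact weighting function is not given in the text; the Gaussian form and parameter values below are assumptions.

```python
import numpy as np

def smoothness_weight(color_a, color_b, w_max=1.0, sigma=0.1):
    """Penalty for assigning two adjacent pixels (or pixel groups) different
    illuminant labels. Large when the colors are similar (likely under the
    same light), small but non-negative when they differ. (Assumed Gaussian
    decay; the patent only requires this qualitative behavior.)"""
    dist = np.linalg.norm(np.asarray(color_a) - np.asarray(color_b))
    return w_max * float(np.exp(-(dist ** 2) / (2 * sigma ** 2)))
```

With this choice, identical neighbor colors yield the maximum weight `w_max`, and the weight falls smoothly toward zero as the color difference grows, which matches the similar-colors-same-light assumption above.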
Next, the control system 232 uses graph cuts techniques to solve the energy cost system, i.e. assign the illuminant to each pixel (region) of the captured image that best accommodates both the color/gamut consistency and the illuminant spatial smoothness. Thus, the control system converts the color constancy problem to an energy optimization problem that finds the illuminant partition (or labeling) of each pixel 350 that gives best consistency between the captured colors and typical illuminant sets.
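The energy being minimized, per-pixel data cost plus a smoothness penalty wherever neighboring pixels receive different illuminant labels, can be written out directly. A real implementation would minimize it with a graph cuts solver (e.g. alpha-expansion), but the brute-force minimizer below, feasible only for toy inputs, illustrates the objective; the cost values in the usage are illustrative.

```python
import itertools

def total_energy(labels, data_cost, neighbors, smooth_weight):
    """Energy of one labeling: sum of each pixel's data cost under its
    assigned illuminant, plus, for each neighboring pair with different
    labels, that pair's smoothness weight."""
    e = sum(data_cost[p][lab] for p, lab in enumerate(labels))
    e += sum(smooth_weight[(a, b)]
             for a, b in neighbors if labels[a] != labels[b])
    return e

def brute_force_labeling(data_cost, neighbors, smooth_weight, n_labels):
    """Exhaustive minimizer over all labelings; stands in for graph cuts
    on tiny examples only."""
    n = len(data_cost)
    best = min(itertools.product(range(n_labels), repeat=n),
               key=lambda lab: total_energy(lab, data_cost, neighbors,
                                            smooth_weight))
    return list(best)
```

For three pixels and two illuminants, a pixel pair strongly tied by a large smoothness weight ends up under one illuminant even if one member's data cost mildly prefers the other, which is exactly the trade-off between gamut consistency and illuminant spatial smoothness described above.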
Accordingly, in one embodiment, for a scene with multiple illuminants, the control system 232 evaluates the pixels 350 and determines which pixels 350 have been toned with which illuminant. With this design, the control system 232 can partition the captured image 316 into different image regions 352 (defined by dashed lines) based upon the areas that are toned by the different illuminants 14. In Figure 3A, because the scene 12 is illuminated by three illuminants 14, the captured image 316 can be partitioned into three image regions 352, namely a first image region 352A, a second image region 352B, and a third image region 352C. In this example, (i) the first image region 352A includes the pixels 350 that have captured the part of the scene 12 illuminated mainly by the first illuminant 14A and includes the captured image of the fluorescent lamp 320E, the captured image of the first painting 320B and the captured image of a portion of the wall 320A; (ii) the second image region 352B includes the pixels 350 that have captured the part of the scene 12 illuminated mainly by the second illuminant 14B and includes the captured image of the incandescent bulb 320F, the captured image of the second painting 320C and the captured image of a portion of the wall 320A; and (iii) the third image region 352C includes the pixels 350 that have captured the part of the scene 12 illuminated mainly by the third illuminant 14C and includes the captured image of the candle 320G and the captured image of the table 320D. The size and shape of each image region 352 can vary according to the scene 12 and the illuminants 14.
Typically, a truly white object that is illuminated with an illuminant having a color temperature in the range of 5,000-5,500 K will appear white. Alternatively, a truly white object that is illuminated with an illuminant having a color temperature below 5,000 K will appear more yellow or red.
Still alternatively, a truly white object that is illuminated with an illuminant having a color temperature above 5,500 K will appear tinged with blue. As a result thereof, (i) because the first image region 352A captured the portion of the scene 12 illuminated with the fluorescent light (4,000-5,000 K), the colors in this region are tinted yellow or red; (ii) because the second image region 352B captured the portion of the scene 12 illuminated with the bulb (2,500-3,500 K), the colors in this region are tinted with more yellow or red; and (iii) because the third image region 352C captured the portion of the scene 12 illuminated with the candle (1,000-2,000 K), the colors in this region are even more tinted with yellow or red.
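The color temperature ranges recited above can be summarized in a small helper (Python; the function name is hypothetical, and this is a sketch of the classification only):

```python
def illuminant_cast(color_temperature_k):
    """Return the color cast a truly white object takes on under an
    illuminant of the given color temperature, per the ranges above."""
    if color_temperature_k < 5000:
        return "yellow/red"   # e.g. incandescent bulb or candle
    if color_temperature_k > 5500:
        return "blue"         # cooler illuminants tinge white with blue
    return "white"            # 5,000-5,500 K renders a white object white
```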
First, the control system 232 determines which pixels 350 are in which image region 352 and estimates which illuminant has toned each image region 352. Subsequently, the control system 232 can perform selective white balance adjustment for each image region 352 based upon the illuminant.
Referring to Figure 3B, in the present example, because each of the image regions 352 is under a different light condition, the control system 232 performs a different level of white balance adjustment for each image region 352. For example, the control system 232 can perform a first level of white balance adjustment (illustrated as ".") on the first image region 352A, a second level of white balance adjustment (illustrated as "#") on the second image region 352B, and a third level of white balance adjustment (illustrated as "*") on the third image region 352C. Further, the level of white balance adjustment is based upon the characteristics of the illuminant.
The difference in white balance adjustment between the first, second and third levels can vary. As provided above, the tonal values of red, green, and blue can be expressed on a scale of 0 to 255. In one embodiment, (i) in the first level of white balance adjustment, one or more of the red, green, and blue tonal values can be attenuated or amplified at a first rate, (ii) in the second level of white balance adjustment, one or more of the red, green, and blue tonal values can be attenuated or amplified at a second rate that is different than the first rate, and (iii) in the third level of white balance adjustment, one or more of the red, green, and blue tonal values can be attenuated or amplified at a third rate that is different than the first rate and the second rate.
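The per-region attenuation or amplification of tonal values can be sketched as follows (Python; the gain values and function name are hypothetical, and production pipelines typically apply such gains to linear sensor data rather than 0-255 tonal values):

```python
def white_balance_region(pixels, gains):
    """Apply one level of white balance adjustment to an image region:
    each red, green, and blue tonal value (0-255) is attenuated or
    amplified by its channel gain, then clamped to the valid range."""
    r_gain, g_gain, b_gain = gains
    return [(min(255, round(r * r_gain)),
             min(255, round(g * g_gain)),
             min(255, round(b * b_gain)))
            for (r, g, b) in pixels]
```

For example, a region toned warm by an incandescent bulb might get its red channel attenuated and its blue channel amplified, while a region under cooler light would use a different set of gains.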
Thus, the amount of white balance adjustment will depend upon the lighting conditions of the portion of the scene captured with the respective image region 352 and the control system 232 compensates the different image regions 352 at different rates. Stated in another fashion, the control system 232 includes software that adjusts the colors in each image region 352 according to the color of the light source that illuminated that region. This can yield more uniform and acceptable white balance in the adjusted image 218.
In certain embodiments, when it is determined that the scene 12 is illuminated by multiple illuminants 14, the level of white balance performed at the transitions of the regions can be blended between the different levels of white balance adjustments to avoid drastically changing the colors at the borders of the regions. Stated in another fashion, the level of illuminant scale adjustment along borders of the regions is gradually changed. For example, (i) at the transition between the first image region 352A and the second image region 352B, the level of white balance can be intermediate the first level of white balance adjustment and the second level of white balance adjustment, and (ii) at the transition between the second image region 352B and the third image region 352C, the level of white balance can be intermediate the second level of white balance adjustment and the third level of white balance adjustment. In another example, (i) at the transition between the first image region 352A and the second image region 352B, the level of white balance can gradually transition from the first level of white balance adjustment to the second level of white balance adjustment, and (ii) at the transition between the second image region 352B and the third image region 352C, the level of white balance can gradually transition from the second level of white balance adjustment to the third level of white balance adjustment.
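The gradual transition between adjustment levels at a region border can be modeled as an interpolation of the two regions' white balance gains (Python sketch; the linear blend profile and names are assumptions, as the document does not specify the blending function):

```python
def blend_gains(gains_a, gains_b, t):
    """Interpolate two sets of per-channel white balance gains across a
    region border. t=0.0 gives region A's gains, t=1.0 gives region B's,
    and intermediate t values yield an intermediate adjustment level."""
    return tuple(a + t * (b - a) for a, b in zip(gains_a, gains_b))
```

A pixel halfway across the border between two regions would then receive gains midway between the two regions' adjustment levels, avoiding a visible seam.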
It should be noted that depending upon the scene 12, the control system 232 can be designed to perform greater than three or less than three levels of white balance adjustment on the captured image 316.
In the embodiment illustrated in Figures 3A and 3B, the control system 232 evaluates the pixels and automatically performs the appropriate level of white balance adjustment for each image region 352. In another embodiment, the user can utilize one or more of the control switches 248 to manually bracket, highlight or otherwise identify one or more of the image regions 352 and subsequently select the desired level of white balance adjustment for each image region 352. This embodiment may be more suited for subsequent adjustment outside of the camera with a post-processing adjustment system (illustrated in Figure 5).
It should be noted that depending upon the scene 12, the control system 232 can also determine that the scene 12 is being illuminated by only one illuminant 14. In this example, the control system 232 will provide a single level of white balance adjustment for the entire captured image 316.
Further, in certain embodiments, in order to simplify calculations, the control system 232 can limit the number of separate image regions 352 to three or less and limit the number of white balance adjustment levels to three or less.
Figure 4 is a simplified flow chart that further illustrates one non-exclusive example of the operation of the image apparatus. It should be noted that one or more of the steps can be omitted or the order of steps can be switched. First, the image apparatus is aimed toward the scene 410. Second, the user adjusts the zoom so as to adjust the size of the image as desired 412. Next, the user presses lightly on the shutter button to enable the image apparatus to automatically focus on the object(s) 414. Subsequently, the user presses the shutter button all the way, which resets the image sensor, and exposes the image sensor to light 416. Next, the ADC measures the charge at each photosite of the image sensor and creates a digital signal that represents the values of the charge at each photosite 418. Subsequently, the control system interpolates the data from the different photosites, with assistance from the filtering component, to create the raw captured image 420. Next, the control system evaluates the captured image, determines if the scene includes multiple illuminants, determines the characteristics of the illuminants and determines which pixels are influenced by which illuminant 422. Finally, the control system selectively applies white balance compensation to the raw captured image to get the adjusted image, and displays the adjusted image 424.
Figure 5 is a simplified illustration of a combination having features of the present invention, including a camera 510, and a post-processing adjustment system 512. In this embodiment, the camera 510 captures the captured images (not shown in Figure 5) and the adjustment system 512 provides the different levels of white balance adjustment to the captured image. In this embodiment, the adjustment system 512 includes a control system with software that (i) evaluates the captured image to determine if the scene is illuminated by more than one illuminant, (ii) determines the characteristics of the one or more illuminants, and (iii) performs white balance adjustment on the captured image based on the multiple illuminants to provide an adjusted image.
In Figure 5, the captured image can be transferred to the adjustment system 512 via an electrical connection line (not shown), a wireless connection, or in another fashion. For example, the camera 510 can include a removable storage system (not shown in Figure 5) that is selectively removed from the camera 510 and inserted into a docking port (not shown) of the adjustment system 512.
The design of the adjustment system 512 can be varied. For example, the adjustment system 512 can be a personal computer. Figure 6 is a simplified flow chart that illustrates another non-exclusive example of the operation of the image apparatus. This example includes steps 610, 612, 614, 616, 618, 620, 624 that are similar to the corresponding steps described above and illustrated in Figure 4. However, in this embodiment, the input image is preprocessed and segmented at step 621 into a plurality of pixel groups prior to determining if the scene is illuminated by multiple illuminants. In this embodiment, during preprocessing, pixels that have a similar color and that are in the same neighborhood are grouped together into pixel groups. The pixels in a particular pixel group are similar in color and are assumed to be illuminated by the same illuminant. Figure 7 illustrates the image apparatus 710, a raw captured image 716 and a plurality of representative pixel groups, namely a first pixel group 760A, a second pixel group 760B, and a third pixel group 760C that have been segmented. In this example, the pixels (not shown in Figure 7) in each pixel group 760A, 760B, 760C are similar in color and have been segmented by the control system 732 during preprocessing of the image 716.
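The preprocessing step that merges similar-colored neighboring pixels into pixel groups can be sketched on a single scanline (Python; a real segmentation works in two dimensions, and the function name and tolerance value are hypothetical):

```python
def segment_pixel_groups(row, tolerance=10):
    """Group adjacent pixels of a scanline whose tonal values differ by
    no more than `tolerance`, mirroring the preprocessing step that
    merges similar-colored neighbors into pixel groups. Returns lists
    of pixel indices, one list per group."""
    groups = [[0]]
    for i in range(1, len(row)):
        if abs(row[i] - row[i - 1]) <= tolerance:
            groups[-1].append(i)   # similar neighbor: extend current group
        else:
            groups.append([i])     # color jump: start a new group
    return groups
```

Because each group is then treated as a single unit when evaluating illuminants, the number of computations in the later steps is reduced.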
Referring back to Figure 6, after segmenting the input image, at step 622, the control system utilizes an energy cost system to evaluate the pixel groups 622 both (i) for consistency in color to a set of plausible illuminants, and (ii) also for smoothness cost between neighboring pixel groups. Subsequently, the control system uses graph cuts techniques to solve the energy cost system. In one embodiment, a weight is given for each pixel group 760A-760C with respect to each illuminant in the illuminant database. For example, if the chromatic value of the pixel group 760A-760C exactly matches a possible chromatic value for an illuminant, the pixel group 760A-760C is given a perfect score of zero ("0") for that illuminant. Alternatively, if the chromatic value of the pixel group 760A-760C does not match a possible chromatic value for an illuminant, a matching score is measured between the pixel group color and the closest matching color for the illuminant. The matching score (i.e., the weight) increases as the difference between the pixel group color and the corresponding color in the illuminant gamut increases. For example, if the difference between the pixel group color and the corresponding color in the illuminant gamut is small, the matching score is small. Alternatively, if the difference between the pixel group color and the corresponding color in the illuminant gamut is large, the matching score is large. Thus, for example, if the illuminant database includes the color gamut for thirty illuminants, each pixel group 760A-760C will have a value for each of the thirty illuminants.
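The matching score described above, which is zero for an exact gamut match and grows with the distance to the closest gamut color, can be sketched as follows (Python; the Euclidean distance and names are assumptions, as the document does not specify the distance measure):

```python
def gamut_matching_score(pixel_group_color, illuminant_gamut):
    """Score a pixel group against one illuminant's color gamut:
    0 for an exact match, otherwise the distance to the closest
    color in the gamut (larger difference -> larger score)."""
    return min(
        sum((a - b) ** 2 for a, b in zip(pixel_group_color, c)) ** 0.5
        for c in illuminant_gamut
    )
```

With a database of thirty illuminant gamuts, calling this once per illuminant gives each pixel group its thirty data-cost values for the energy system.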
Next, the control system 732 evaluates the color of each pixel group 760A- 760C and compares the color of each pixel group to that of its neighboring pixel groups (the pixel groups that surround the pixel group that is being evaluated). The control system 732 then performs a color analysis of each pixel group
760A-760C relative to its neighboring pixel groups 760A-760C to determine if there is a variation in color between the pixel group that is evaluated and its neighboring pixel groups 760A-760C. If a given pixel group and its neighboring pixel groups 760A-760C are similar in color, there is a high probability that they are under the same light and that pixel group is given a positive weight. Stated in another fashion, adjacent pixels that are similar in color are assumed to be under the same light. Alternatively, if the pixel group that is being evaluated is different in color from its neighboring pixel groups 760A-760C, a smaller non-negative weight is given. If pixel groups 760A-760C are spaced apart, there is less chance that they are under the same illuminant.
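The smoothness weighting just described, a large positive weight for similar-colored neighbors and a smaller non-negative weight otherwise, might be realized with a decaying function of color distance (Python sketch; the exponential form and the scale constant are assumptions, not the document's specification):

```python
import math

def smoothness_weight(color_a, color_b, scale=100.0):
    """Weight for the smoothness term between two neighboring pixel
    groups: similar colors yield a large positive weight (likely the
    same illuminant), dissimilar colors a smaller non-negative one."""
    distance = sum((a - b) ** 2 for a, b in zip(color_a, color_b)) ** 0.5
    return scale * math.exp(-distance / scale)
```

Identical neighbors receive the maximum weight, and the weight decays smoothly toward zero as the color difference grows, so it is always non-negative.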
Next, the control system 732 uses graph cuts techniques to solve the energy cost system, i.e., assign the illuminant to each pixel group of the captured image that best accommodates both the color/gamut consistency and the illuminant spatial smoothness. Thus, the control system 732 converts the color constancy problem to an energy optimization problem that finds the illuminant partition (or labeling) of each pixel group 760A-760C that gives the best consistency between the captured colors and typical illuminant sets.
Accordingly, in this embodiment, for a scene with multiple illuminants, the control system 732 evaluates the pixel groups 760A-760C and determines which pixel groups 760A-760C have been toned with which illuminant. With this design, the control system 732 can partition the captured image 716 into different image regions 752 (defined by dashed lines) based upon the areas that are toned by the different illuminants.
Referring back to Figure 6, after determining that the scene was illuminated by multiple illuminants, the control system selectively applies white balance compensation to the raw captured image to get the adjusted image, and displays the adjusted image 624.
In this embodiment, because the input image is segmented into pixel groups, the number of computations necessary to determine if the scene was illuminated by multiple illuminants can be reduced.
While the current invention is disclosed in detail herein, it is to be understood that it is merely illustrative of the presently preferred embodiments of the invention and that no limitations are intended to the details of construction or design herein shown other than as described in the appended claims.

Claims

What is claimed is:
1. An image apparatus for providing an adjusted image of a scene, the image apparatus comprising: a capturing system that captures information for a captured image of the scene; and a control system that evaluates the captured image and determines if the scene is illuminated by more than one illuminant.
2. The image apparatus of claim 1 wherein the scene includes a first scene region that is illuminated by a first illuminant and a second scene region that is illuminated by a second illuminant; and wherein the control system evaluates the captured image and estimates which portion of the captured image was toned by the first illuminant and which portion of the captured image was toned by the second illuminant.
3. The image apparatus of claim 2 wherein the control system evaluates the captured image to estimate the characteristics of the first illuminant and the second illuminant.
4. The image apparatus of claim 3 wherein the control system separates the captured image into at least a first image region and a second image region, and wherein the control system performs a first level of white balance adjustment on the first image region and a second level of white balance adjustment on the second image region, the second level of white balance adjustment being different than the first level of white balance adjustment.
5. The image apparatus of claim 1 wherein the scene includes a first scene region that is illuminated by a first illuminant and a second scene region that is illuminated by a second illuminant; and wherein the control system evaluates the captured image to estimate the characteristics of the first illuminant and the second illuminant.
6. The image apparatus of claim 1 wherein the control system separates the captured image into at least a first image region and a second image region, and wherein the control system performs a first level of white balance adjustment on the first image region, and a second level of white balance adjustment on the second image region, the second level of white balance adjustment being different than the first level of white balance adjustment.
7. The image apparatus of claim 6 wherein the control system also separates the captured image into a third image region, and wherein the control system performs a third level of white balance adjustment on the third image region, the third level of white balance adjustment being different than the first level of white balance adjustment and the second level of white balance adjustment.
8. The image apparatus of claim 1 wherein the captured image is defined by a plurality of pixels, and wherein the control system evaluates the pixels relative to an illuminant database and evaluates each pixel relative to its neighbors to determine if the scene is illuminated by more than one illuminant.
9. The image apparatus of claim 1 wherein the captured image is defined by a plurality of pixels, and wherein the control system evaluates the captured image to divide the captured image into a plurality of pixel groups, and the control system evaluates the pixel groups relative to an illuminant database and evaluates each pixel group relative to its neighboring pixel groups to determine if the scene is illuminated by more than one illuminant.
10. The image apparatus of claim 1 further comprising a rigid apparatus frame and wherein the capturing system and the control system are fixedly secured to the apparatus frame.
11. An image apparatus for providing an adjusted image of a scene that is illuminated by a first illuminant and a second illuminant, the image apparatus comprising: an apparatus frame; a capturing system secured to the apparatus frame, the capturing system capturing information for a captured image of the scene; and a control system coupled to the apparatus frame, the control system evaluating the captured image to estimate the characteristics of the first illuminant and the second illuminant, and the control system performing a first level of white balance adjustment on a first image region of the captured image and performing a second level of white balance adjustment on a second image region of the captured image, the second level of white balance adjustment being different than the first level of white balance adjustment.
12. The image apparatus of claim 11 wherein the control system evaluates the captured image and determines which portion of the captured image was toned by the first illuminant and which portion of the captured image was toned by the second illuminant.
13. The image apparatus of claim 11 wherein the scene is also illuminated by a third illuminant, and wherein the control system evaluates the captured image to estimate the characteristics of the third illuminant.
14. The image apparatus of claim 13 wherein the control system performs a third level of white balance adjustment on a third image region of the captured image, the third level of white balance adjustment being different than the first level of white balance adjustment and the second level of white balance adjustment.
15. The image apparatus of claim 11 wherein the control system performs a third level of white balance adjustment on a third image region of the captured image, the third level of white balance adjustment being different than the first level of white balance adjustment and the second level of white balance adjustment.
16. The image apparatus of claim 11 wherein the captured image is defined by a plurality of pixels, and wherein the control system evaluates the pixels relative to an illuminant database and evaluates each pixel relative to its neighbors to estimate if the scene is illuminated by more than one illuminant.
17. The image apparatus of claim 11 wherein the captured image is defined by a plurality of pixels, and wherein the control system evaluates the captured image to divide the captured image into a plurality of pixel groups, and the control system evaluates the pixel groups relative to an illuminant database and evaluates each pixel group relative to its neighboring pixel groups to determine if the scene is illuminated by more than one illuminant.
18. A method for providing an adjusted image of a scene, the method comprising the steps of: capturing information for a captured image of the scene with a capturing system; and evaluating the information and determining if the scene is illuminated by more than one illuminant with a control system.
19. The method of claim 18 wherein the step of evaluating the captured image includes the step of estimating which portion of the captured image was toned by a first illuminant and which portion of the captured image was toned by a second illuminant.
20. The method of claim 18 wherein the scene includes a first scene region that is illuminated by a first illuminant and a second scene region that is illuminated by a second illuminant; and wherein the step of evaluating includes the step of estimating the characteristics of the first illuminant and the second illuminant.
21. The method of claim 18 further comprising the step of performing a first level of white balance adjustment on a first image region of the captured image, and performing a second level of white balance adjustment on a second image region of the captured image, the second level of white balance adjustment being different than the first level of white balance adjustment.
22. The method of claim 20 further comprising the step of performing a third level of white balance adjustment on a third image region of the captured image, the third level of white balance adjustment being different than the first level of white balance adjustment and the second level of white balance adjustment.
23. The method of claim 18 wherein the captured image is defined by a plurality of pixels, and wherein the step of evaluating the captured image includes evaluating the pixels relative to an illuminant database and evaluating each pixel relative to its neighbors to determine if the scene is illuminated by more than one illuminant.
PCT/US2007/024229 2007-11-20 2007-11-20 White balance adjustment for scenes with varying illumination WO2009067097A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2007/024229 WO2009067097A2 (en) 2007-11-20 2007-11-20 White balance adjustment for scenes with varying illumination

Publications (2)

Publication Number Publication Date
WO2009067097A2 true WO2009067097A2 (en) 2009-05-28
WO2009067097A3 WO2009067097A3 (en) 2009-09-17

Family ID: 40668042

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/024229 WO2009067097A2 (en) 2007-11-20 2007-11-20 White balance adjustment for scenes with varying illumination

Country Status (1)

Country Link
WO (1) WO2009067097A2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106204500B (en) * 2016-07-28 2018-10-16 电子科技大学 A method of realizing that different cameral shooting Same Scene color of image remains unchanged

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7146041B2 (en) * 2001-11-08 2006-12-05 Fuji Photo Film Co., Ltd. Method and apparatus for correcting white balance, method for correcting density and recording medium on which program for carrying out the methods is recorded
US20070230940A1 (en) * 2006-03-28 2007-10-04 Nikon Corporation Image apparatus with selective image compensation
US7286703B2 (en) * 2004-09-30 2007-10-23 Fujifilm Corporation Image correction apparatus, method and program

Also Published As

Publication number Publication date
WO2009067097A3 (en) 2009-09-17

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07862145

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07862145

Country of ref document: EP

Kind code of ref document: A2