Article

Tone Mapping Operator for High Dynamic Range Images Based on Modified iCAM06

1 National Professional Laboratory of Color Science and Engineering, School of Optoelectronics, Beijing Institute of Technology, Beijing 100081, China
2 Department of Chemical and Materials Engineering, University of Alberta, Edmonton, AB T6G 2V4, Canada
* Authors to whom correspondence should be addressed.
Sensors 2023, 23(5), 2516; https://doi.org/10.3390/s23052516
Submission received: 29 January 2023 / Revised: 21 February 2023 / Accepted: 22 February 2023 / Published: 24 February 2023
(This article belongs to the Special Issue Digital Image Processing and Sensing Technologies)

Abstract

This study addresses the difficulty of displaying high dynamic range (HDR) images on conventional standard display devices by proposing a modified tone-mapping operator (TMO) based on the image color appearance model iCAM06. The proposed model, called iCAM06-m, combines iCAM06 with a multi-scale enhancement algorithm to correct the chroma of images by compensating for saturation and hue drift. A subjective evaluation experiment was then conducted to assess iCAM06-m against three other TMOs by rating the tone mapped images. Finally, the objective and subjective evaluation results were compared and analyzed. The results confirmed the better performance of the proposed iCAM06-m. Furthermore, the chroma compensation effectively alleviated the saturation reduction and hue drift that iCAM06 introduces during HDR image tone mapping, and the multi-scale decomposition enhanced image details and sharpness. Thus, the proposed algorithm overcomes the shortcomings of the other algorithms and is a good candidate for a general-purpose TMO.

1. Introduction

High dynamic range (HDR) images have wide application prospects in the fields of medical imaging, aerospace remote sensing, and cross-media color reproduction because of their higher dynamic range, wider color gamut, and richer details [1,2,3,4,5]. The dynamic range of HDR images must be mapped to the range of display devices, a process called tone mapping [2]. An HDR image generally has a dynamic range, defined as the ratio between the highest and lowest luminance, of more than three or four log10 units [3,4]. However, the dynamic range reproducible on traditional display devices is generally much lower, which limits the faithful display of HDR images and results in a dynamic range mismatch. Therefore, tone mapping operators (TMOs) must be developed to faithfully display HDR images on conventional devices, a problem that has attracted extensive attention and research [4,5,6].
Currently, HDR image TMOs mainly consist of global and local algorithms, both of which primarily address the compression of image luminance. In contrast, algorithms inspired by human color vision, such as the image color appearance model, focus on both luminance compression and accurate color reproduction [7]. Conventional global algorithms mainly include adaptive log transform compression, linear-gradient compression, histogram adjustment compression, and photographic reproduction [8,9,10]. Newer algorithms built on these traditional approaches have also been studied. Lee et al. [6] proposed a global tone-mapping operator based on a new asymmetric sigmoid curve to enhance global contrast. Based on the luminance histogram, Yang et al. and Khan et al. [9,11] presented efficient methods, using a gamma function and a lookup table (LUT), to enhance visual details. Jung et al. [12] proposed naturalness-preserved tone mapping by applying perceptual quantization (PQ). Global methods offer advantages such as processing all pixel intensities on the same scale and simpler computation; however, some details are generally lost in the tone mapped images.
Local tone-mapping algorithms consider the different perceptions of luminance in different image regions, so local image contrast and detail information can be better preserved and enhanced [13]. Typical local algorithms include the retinex theory [13], the bilateral filter [14], the guided filter [15], and multi-scale edge-preserving decompositions [16]. Based on multi-scale retinex, Lu et al. [17] introduced guided filtering instead of Gaussian filtering to effectively preserve image details. Gu et al. [18] proposed a novel filter for local edge-preserving decomposition, based on the multi-scale edge-preserving decompositions of ref. [16]. The decomposition yields one base layer, which contains local means everywhere while preserving salient local edges, and three detail layers. However, because different regions are processed to different degrees, local algorithms can exhibit poor gradient continuity and "halo" artifacts in the tone mapped images [18].
Global and local tone-mapping algorithms mainly focus on the mapping of luminance and the preservation of image details, whereas the accurate reproduction of color is neglected. The image color appearance model, however, considers both the nonlinear compression of image luminance (also known as luminance adaptation) and the accurate reproduction of image color [19,20,21]. In 2002, Fairchild and Johnson proposed iCAM [19,20] and later applied and extended it to HDR image compression [21]. iCAM is an image appearance model that attempts to determine the perceptual response to spatially complex stimuli and was extended to tone mapping of HDR images [14]. In 2007, Kuang and Fairchild revised the iCAM model, referred to as iCAM06, to render HDR images [7]. iCAM06 incorporates a fast bilateral filter [14,18], which decomposes the image into base and detail layers to maintain edge details. However, certain problems remain. In 2012, Chae et al. proposed a compensation method using a corrected channel gain function to ameliorate the white-point shift; it aimed to correct the hue shift in iCAM06 and achieved better performance [2]. In 2013, Kwon et al. proposed a new method for finding global illuminant information to reduce the desaturation effect of iCAM06 in the HDR-image rendering process [22]. In 2019, Kwon et al. [23] proposed a global chromatic adaptation, based on the color appearance model (CAM02), to improve the desaturation effect in iCAM06. This method is a chromatic adaptation (CA)–tone compression (TC) decoupling method that reduces the interference between CA and TC.
Based on the image color appearance model, HDR image tone-mapping operators consider the color interaction of adjacent pixels and the influence of surrounding light to match a real human visual attribute, which is more suitable for tone-mapping reproduction of HDR images. However, existing tone-mapping operators considering the image color appearance model, such as iCAM06, are plagued by problems such as hue drift, desaturation, and detail loss [2,22,23]. Therefore, this study proposed an improved algorithm based on iCAM06, which ameliorates the above shortcomings and ensures the display of pleasant tone-mapped images on traditional display devices.

2. Algorithm Method

2.1. Method Procedure

A modified algorithm, iCAM06-m, combining iCAM06 with the multi-scale local detail-preserving decomposition (MSD) method was proposed. It can compensate for image saturation and correct hue drift.
The procedure for iCAM06-m is shown in Figure 1. First, the image was decomposed into detail and base layers using fast bilateral filtering (FBF). Chromatic adaptation and nonlinear compression were performed on the base layer, followed by chroma compensation. The detail layer, including the luminance and chrominance components, preserved the image details. Further, in the improved algorithm, the MSD method [18,24] was applied to the detail layer to enhance sharpness. Thereafter, the base and detail layers were combined and transferred into the IPT color space to adjust for the surround illuminance and colorfulness. Finally, the compressed HDR image was obtained by inverse calculation and displayed on conventional monitors.
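To make the overall flow concrete, the following is a minimal, illustrative sketch of the pipeline on a single luminance channel. It is not the authors' implementation: a Gaussian blur stands in for the fast bilateral filter, a simple power law stands in for the cone-response tone compression, and the chroma compensation and MSD steps described below are omitted; all function names and parameter values are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def tone_map_sketch(Y, gamma=0.7, eps=1e-6):
    """Illustrative base/detail tone-mapping flow on an HDR luminance image Y.

    A Gaussian blur stands in for the fast bilateral filter (FBF), and a simple
    power law stands in for the iCAM06 cone-response compression."""
    log_Y = np.log10(Y + eps)                  # work in the log domain, as in Figure 1
    sigma = 0.02 * max(Y.shape)                # spatial scale ~2% of the image size
    base = gaussian_filter(log_Y, sigma)       # base layer (stand-in for the FBF)
    detail = log_Y - base                      # detail layer, kept unchanged here
    base_compressed = gamma * base             # stand-in for the nonlinear compression
    out = 10.0 ** (base_compressed + detail)   # recombine and leave the log domain
    return out / out.max()                     # normalize to the display range

# Toy usage on a synthetic HDR luminance image.
Y = np.exp(2.0 * np.random.randn(64, 64))
print(tone_map_sketch(Y).max())
```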

2.2. iCAM06

The iCAM06 model [7] was proposed based on iCAM [20] for HDR-image tone mapping and yields superior results owing to its decomposition of images into base and detail layers. The base layer was obtained using an FBF: each image pixel was weighted by the product of a Gaussian filter in the spatial domain and another Gaussian filter in the intensity domain. The detail layer was obtained by subtracting the base image from the original image in the log domain. Subsequently, both layers were converted to XYZ space for the following processing.
The base layer can be obtained using the following equations. The filtering result at a central pixel q can be expressed as Equation (1):

$$ J_{out} = \frac{1}{W} \sum_{p \in S} f(\|p - q\|)\, g(I_p - I_q)\, I_p \qquad (1) $$

with

$$ W = \sum_{p \in S} f(\|p - q\|)\, g(I_p - I_q) \qquad (2) $$
where $p$ and $q$ denote the locations of a neighbouring pixel and the central pixel, respectively; $S$ represents all pixel positions in the filtering window; $I$ is the subsampled version of the input image; $W$ normalizes the sum of the weights; $f(\cdot)$ is the space-domain Gaussian filter, with its kernel scale set to a practical value of 2% of the image size; $g(\cdot)$ is the intensity-domain filter, with its scale set to a constant value of 0.35; and $J$ is the filtered image.
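A direct (unoptimized) implementation of Equations (1) and (2) is sketched below; iCAM06 itself uses a fast approximation on a subsampled image, so this brute-force loop is only meant to show what the base layer computes, and the parameter names are illustrative.

```python
import numpy as np

def bilateral_base_layer(I, sigma_s, sigma_r=0.35, radius=None):
    """Base layer of Equations (1)-(2): each output pixel is a normalized sum of
    neighbours weighted by a spatial Gaussian f and an intensity Gaussian g."""
    radius = radius if radius is not None else int(3 * sigma_s)
    H, W = I.shape
    J = np.zeros_like(I, dtype=float)
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - radius), min(H, y + radius + 1)
            x0, x1 = max(0, x - radius), min(W, x + radius + 1)
            patch = I[y0:y1, x0:x1]
            yy, xx = np.mgrid[y0:y1, x0:x1]
            f = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma_s ** 2))  # spatial kernel
            g = np.exp(-((patch - I[y, x]) ** 2) / (2 * sigma_r ** 2))         # intensity kernel
            w = f * g
            J[y, x] = (w * patch).sum() / w.sum()   # Equation (1) with normalization W
    return J
```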
In the base layer, chromatic adaptation was performed in the cone response space RGB to adapt the color component. Subsequently, tone compression (TC) was performed in the physiological cone response space $R'G'B'$. The compression process is as follows:
$$ \begin{bmatrix} R' \\ G' \\ B' \end{bmatrix} = M_{HPE}\, M_{CAT02}^{-1} \begin{bmatrix} R_c \\ G_c \\ B_c \end{bmatrix} $$

$$ R_a' = \mathrm{sign}(R')\, \frac{400\,(F_L R'/Y_{Low})^{p}}{27.13 + (F_L R'/Y_{Low})^{p}} + 0.1 $$

$$ G_a' = \mathrm{sign}(G')\, \frac{400\,(F_L G'/Y_{Low})^{p}}{27.13 + (F_L G'/Y_{Low})^{p}} + 0.1 $$

$$ B_a' = \mathrm{sign}(B')\, \frac{400\,(F_L B'/Y_{Low})^{p}}{27.13 + (F_L B'/Y_{Low})^{p}} + 0.1 $$

$$ \begin{bmatrix} R_{TC} \\ G_{TC} \\ B_{TC} \end{bmatrix} = \begin{bmatrix} R_a' \\ G_a' \\ B_a' \end{bmatrix} + A_s $$
where $R_c G_c B_c$ are the chromatically adapted values obtained from XYZ; $M_{CAT02}^{-1}$ is the inverse transformation matrix of the cone response space; $M_{HPE}$ is the transformation matrix of the physiological cone response; $F_L$ is the luminance-level adaptation factor, a function of the luminance channel of the reference white; $Y_{Low}$ is the relative luminance image, also called the surround luminance; $p$ is the compression index, ranging from 0.6 to 0.85; $R_{TC} G_{TC} B_{TC}$ is the final tone-compressed image; and $R_a' G_a' B_a'$ and $A_s$ are the tone-compressed physiological cone response and rod response, respectively. The full algorithm is given in the iCAM06 model [7].
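As a hedged illustration of the photoreceptor compression above, the sketch below applies the same sigmoid-like nonlinearity to one cone channel; the rod response $A_s$ and the surrounding iCAM06 machinery are omitted, and the default exponent is only a placeholder within the stated 0.6-0.85 range.

```python
import numpy as np

def cone_compression(channel, F_L, Y_low, p=0.7):
    """Cone-response tone compression of one HPE channel (R', G' or B').

    Follows the equations above: a sign-preserving sigmoid of (F_L * channel / Y_low)^p.
    The rod-response term A_s is omitted; see the iCAM06 model [7]."""
    ratio = (F_L * np.abs(channel) / np.maximum(Y_low, 1e-6)) ** p
    return np.sign(channel) * 400.0 * ratio / (27.13 + ratio) + 0.1
```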
In the base layer, color distortion occurs after CA because each XYZ channel has a different intensity value [23]. Further, the detail layer of iCAM06 includes both the chrominance and luminance components, so recombining the base and detail layers aggravates this color distortion, which includes desaturation and hue drift. Furthermore, certain details of the base layer are lost after CA and TC. Therefore, in the improved algorithm iCAM06-m, a chroma compensation method that increases saturation and corrects hue drift was introduced. In addition, multi-scale local detail-preserving decomposition was applied to the detail layer, and the detail layer retained only the luminance component.

2.3. Multi-Scale Enhancement

The multi-scale local edge-preserving decomposition (MSD) was applied to the detail layer of iCAM06 to enhance details [18,24]. Because low-pass filtering always causes significant halos, the improved MSD decomposes the image into one base layer and three detail layers to avoid artificial halos [18]. The improved MSD is based on the following three assumptions:
(1) The base layer preserves the local mean in each local window;
(2) Salient details at all scales correspond to relatively large gradients in every local window;
(3) The gradient information in the detail layers is non-zero everywhere.
Based on assumptions (1) and (2), the filtered base layer contained smooth local information and salient details, obtained progressively by calculating the local approximate means, instead of Gaussian filtering. The constraint conditions are as follows:
$$ \sum_{i \in \varphi} \left[ (I_i - B_i)^2 + \alpha\, |\nabla I_i|^{\beta}\, |\nabla B_i|^2 \right] \le \varepsilon \qquad (7) $$

$$ B_i = a_{\varphi} I_i + b_{\varphi}, \qquad i \in \varphi \qquad (8) $$

$$ B_{\varphi} = \frac{1}{N} \sum_{i \in \varphi} B_i \approx \bar{a}_{\varphi} I_j + \bar{b}_{\varphi} \qquad (9) $$
In the first part of Equation (7), $B_i$ represents the filtered pixels, and the second part constrains the gradient in each local window. $\alpha$ and $\beta$ maintain a balance between the two terms and keep the filtered $B_i$ as close to $I$ as possible. $B_i$ can be considered a linear function of $I_i$. When the cost function reaches its minimum $\varepsilon$, the values of $a_{\varphi}$ and $b_{\varphi}$ are the optimal solutions of Equation (8). In each local window, $B_{\varphi}$ is equal to the mean of the $B_i$. If $I_j$ denotes the central pixel, $B_{\varphi}$ can be approximated using $I_j$ as in Equation (9).
Based on assumption (3), the detail layers were obtained as the differences between successive filtered base layers. The salient edges depend on the size of the filtering window. The detail layers are computed as follows:
$$ B_{n-1} = \mathrm{MSD}(B_n), \qquad n = m, m-1, \ldots, 2, \quad m = 3, \quad B_m = I \qquad (10) $$

$$ D_n = B_n - B_{n-1} \qquad (11) $$
The decomposition function is abbreviated as MSD. After decomposition, the image can be described by Equation (12). The base layer $B_0$ was obtained as the mean of $B_1$; it is a smooth, uniform image with no gradient and is discarded.
$$ I = B_0 + D_1 + D_2 + D_3 \qquad (12) $$

$$ D_i' = \frac{2}{\pi} \arctan(20 \times D_i) \qquad (13) $$

$$ Out_p = \left( \frac{In_p}{I} \right)^{\alpha} \sum_{i=1}^{3} \beta_i D_i', \qquad p = r, g, b \qquad (14) $$
The output accumulates the three detail layers and can be obtained using Equation (14). $\alpha$ adjusts the image saturation and lies in the range 0.5–0.9. $\beta_i$ is the coefficient of each detail layer, with $\beta_1 = 0.5$ and $\beta_2 = \beta_3 = 1$; these coefficients adjust the degree of salient detail in the images.
The three detail layers at different scales contain the non-zero-gradient salient edges, and their energy is remapped by Equation (13) to enhance minor deviations around zero and compress large ones. Therefore, the accumulated result of the three detail layers contains more detail than the original image.
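The following sketch illustrates Equations (10)-(14) on a single detail-layer image. A plain box filter (local mean) stands in for the edge-preserving local-mean filter of Gu et al. [18], and the window sizes are assumptions; only the arctangent remapping and the β weights follow the text.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def msd_enhanced_detail(I, window_sizes=(3, 7, 15), betas=(1.0, 1.0, 0.5)):
    """Multi-scale decomposition and detail remapping, Equations (10)-(14).

    A box filter stands in for the edge-preserving local-mean filter; the final,
    nearly gradient-free base layer B0 is discarded, as described in the text.
    betas weight the detail layers from fine to coarse, so the last (coarsest)
    layer corresponds to D1 with beta_1 = 0.5."""
    B = I.astype(float)
    details = []
    for w in window_sizes:                  # B_{n-1} = filter(B_n), D_n = B_n - B_{n-1}
        B_next = uniform_filter(B, size=w)
        details.append(B - B_next)
        B = B_next
    enhanced = np.zeros_like(B)
    for beta, D in zip(betas, details):     # remap each detail layer, Equation (13)
        enhanced += beta * (2.0 / np.pi) * np.arctan(20.0 * D)
    return enhanced                         # weighted sum of the remapped detail layers
```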

2.4. Chroma Compensation

A chroma correction method was proposed and applied to the base layer. The CIELAB color space, rather than IPT, was used to determine whether it is suitable for HDR image correction and display; the two spaces can be interconverted as $L = 100 \times I$, $a = 150 \times P$, $b = 150 \times T$ [25]. First, the chroma and hue angle of the original and tone mapped images were calculated and denoted "before" and "after", respectively. The chroma and hue angle of the tone mapped image can be calculated using Equations (15) and (16):
$$ C_{after} = \sqrt{a_{after}^2 + b_{after}^2} \qquad (15) $$

$$ \theta_{after} = \arctan\!\left( \frac{b_{after}}{a_{after}} \right) \qquad (16) $$
where $C_{after}$ and $\theta_{after}$ denote the chroma and hue angle of the tone mapped image, and $a$ and $b$ represent the chromaticity coordinates in the CIELAB color space. Assuming that the chroma of the original image is $C_{before}$, the chroma compensation factor $\beta$ and the hue-angle compensation value $\Delta\theta$ can be calculated as follows:
$$ \beta = \frac{C_{before}}{C_{after}} \qquad (17) $$

$$ \theta_{before} = \arctan\!\left( \frac{b_{before}}{a_{before}} \right) \qquad (18) $$

$$ \Delta\theta = \theta_{before} - \theta_{after} \qquad (19) $$
Finally, the compensated chroma $C_{comp}$ and the corrected hue angle $\theta_{comp}$ were calculated, and the chromaticity coordinates $a$ and $b$ were obtained from $C_{comp}$ and $\theta_{comp}$. In Equations (20) and (21), the coefficients $\alpha = \lambda \times \beta$ and $\Delta\theta_c = \Delta\theta + \varepsilon$, where $\lambda$ and $\varepsilon$ adjust the chroma balance for optimal visual effect:
$$ C_{comp} = \alpha\, C_{after} \qquad (20) $$

$$ \theta_{comp} = \theta_{after} + \Delta\theta_c \qquad (21) $$

$$ a_{comp} = C_{comp} \cos(\theta_{comp}) \qquad (22) $$

$$ b_{comp} = C_{comp} \sin(\theta_{comp}) \qquad (23) $$
The chroma compensation and hue correction were performed after tone mapping in the base layer; after the detail and base layers were combined, the chroma was further corrected during the surround and illuminance adjustment in the IPT color space. The combination of the two corrections produces a better compensation effect.
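A compact sketch of the chroma compensation of Equations (15)-(23) is given below, operating on the CIELAB a/b planes of the original ("before") and tone mapped ("after") images. The balance coefficients λ and ε are kept as arguments because their tuned values are not reported here, and arctan2 is used as a numerically robust form of the arctangent in Equations (16) and (18).

```python
import numpy as np

def chroma_compensation(a_before, b_before, a_after, b_after, lam=1.0, eps_hue=0.0):
    """Chroma/hue compensation of Equations (15)-(23) in the CIELAB a-b plane."""
    C_after = np.hypot(a_after, b_after)                    # Equation (15)
    theta_after = np.arctan2(b_after, a_after)              # Equation (16)
    C_before = np.hypot(a_before, b_before)
    beta = C_before / np.maximum(C_after, 1e-6)             # Equation (17)
    theta_before = np.arctan2(b_before, a_before)           # Equation (18)
    d_theta = theta_before - theta_after                    # Equation (19)
    alpha = lam * beta                                      # alpha = lambda * beta
    d_theta_c = d_theta + eps_hue                           # corrected hue-angle offset
    C_comp = alpha * C_after                                # Equation (20)
    theta_comp = theta_after + d_theta_c                    # Equation (21)
    a_comp = C_comp * np.cos(theta_comp)                    # Equation (22)
    b_comp = C_comp * np.sin(theta_comp)                    # Equation (23)
    return a_comp, b_comp
```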

3. Experimentation

In the subjective evaluation experiment, the images processed by iCAM06-m were compared with those processed by three other tone-mapping operators (TMOs): iCAM06 [7], guided filtering (GF) [15,26], and the MSD algorithm [16,18]. Psychophysical evaluation methods are divided into reference and no-reference comparative methods [27,28]. The former compares against high-quality images or realistic scenes, whereas the latter compares against the memory image. In this study, the no-reference comparative and categorical judgment methods [28,29] were adopted to quantify and evaluate the tone mapped images displayed on the screen.

3.1. Stimuli and Apparatus

In the experiment, the tone mapped HDR images were presented on an HP liquid crystal monitor (HP24MQ, 59983704055) with 2560 × 1440 pixels, a D65 white point, and a peak luminance of 300 cd/m², which is a traditional display device. The display was calibrated and turned on for 30 min before the experiment [27,28], satisfying the equipment requirements for psychophysical tests.
Ledda et al. [30] and Cadik et al. [31] compared various TMOs, including iCAM06 and others, and their studies showed contradictions in ranking caused by the selection of image attributes and scenarios [32]. Therefore, the HDR images were selected from six different scenes representing typical environments, including indoor and outdoor scenery, landscapes, and specific objects. These are all representative images used to evaluate TMOs for tone mapped images [28,32]. Their format was Radiance RGBE. The images were Paul Bunyan, Great lounge, Doll, Snowman, Yosemite, and Tinterna, with dynamic ranges of 5.3, 8.3, 5.4, 7.23, 6.4, and 3.3, respectively, according to the formula $DR = \log_{10}(\max/\min)$ [3,4].
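For reference, the dynamic range figures above follow the cited formula DR = log10(max/min); a small sketch is shown below. The Rec. 709 luminance weights are an assumption, since the text does not state how luminance was computed from the RGBE data.

```python
import numpy as np

def dynamic_range(rgb_linear, eps=1e-8):
    """DR = log10(max/min) of the luminance of a linear HDR image [3,4]."""
    Y = (0.2126 * rgb_linear[..., 0] +
         0.7152 * rgb_linear[..., 1] +
         0.0722 * rgb_linear[..., 2])      # Rec. 709 luminance (assumed weighting)
    Y = Y[Y > eps]                         # ignore zero or near-zero pixels
    return float(np.log10(Y.max() / Y.min()))
```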

3.2. Experimental Scheme of the Subjective Evaluation

The experiment was conducted in a dark room, where the illuminance was 0.4 lux, measured with an SPIC 300 illumination photometer (Everfine Corp., Hangzhou, Zhejiang, China). Five gender-balanced, color-normal observers (average age of 27) evaluated the images [33]. They were all naive to the imaging experiment and its purpose [32], complied with the ITU-R subjective evaluation standards [27,28], and were informed of the evaluation content and criteria to avoid the influence of personal factors on the experimental results.
Among previous psychophysical evaluation studies of HDR tone mapped images [27,28,29,30,31,32,33,34], Kuang et al. [29] and Drago et al. [33] used no-reference comparison with pairwise comparative judgement, while Luo et al. [27] adopted a psychophysical categorical judgment method. In this study, no-reference HDR scenes and categorical judgment were used [28,33]. The experimental setup and procedure for the subjective evaluation are shown in Figure 2. The evaluation levels were divided into seven categories, from −3 to 3, according to the degree to which the observer liked the compressed image (called the index of image preference). During evaluation, the tone mapped image attributes of definition, saturation, contrast, detail in shadows and highlights, and global appearance were considered. Each observer performed three rounds of evaluation; the index of image preference was thus evaluated 72 times by each observer, for a total of 360 evaluations.

4. Results and Discussion

The evaluation results of the proposed algorithm for the compression performance of HDR images were analyzed from both subjective and objective aspects. All the tone mapped images are shown in Figure 3.
Because the proposed method combines iCAM06 and MSD, and previous studies that compared iCAM06 with six other methods and MSD with seven other methods [29,33] found both to perform best, the proposed TMO was mainly compared with these two methods, which is simple and effective. The images in group (a1–a6) in Figure 3 were obtained using the linear normalization mapping method, while those in groups (b1–b6), (c1–c6), (d1–d6), and (e1–e6) were obtained by guided filtering (GF), the multi-scale local decomposition algorithm (MSD), iCAM06, and iCAM06-m, respectively.
It is evident that the image tone in group (b1–b6), processed by GF, is quite dark, and the shadow details are not clearly displayed. Moreover, color distortion is a serious issue. In contrast, the tone of the image mapped in group (c1–c6) is appropriate. The details in the highlights and shadows are better displayed. However, the images exhibit poor natural fidelity and “halo” artifacts, particularly at the edge of the blue sky in the red enlarged area in Figure 4c. The image tone in group (d1–d6) processed by iCAM06 is slightly dimmed, and the color is distorted in saturation and hue. Compared with the images in groups (c1–c6) and (d1–d6), in group (e1–e6), image saturation and hue are appropriate. In addition, the images have no “halo” artifacts and exhibit better continuity of gradient and natural fidelity, which retains the advantage of iCAM06. Thus, among the four TMOs, iCAM06-m exhibits the best performance for tone mapping and chroma correction.
From the above analysis, it can be concluded that the global appearance of tone mapped images depends on an appropriate image tone as well as on other local image attributes, including the reproduction of color, "halo" artifacts, and details. This deduction is consistent with the results of previous studies [27,28,31]. Ledda et al. compared six TMOs, and the results showed that the performance of iCAM06 was the best [30]. Gu et al. compared MSD with seven other methods, and the results indicated that MSD performed best [18]. Furthermore, the performance of the proposed TMO was better than that of iCAM06 and MSD. The performance of the proposed TMO is analyzed from subjective and objective aspects as follows.

4.1. Subjective Evaluation

In the subjective experiment, owing to the different scoring benchmarks of each observer, all evaluation scores were converted to z-scores, placing them on a unified scale for comparison [35]. The subjective evaluation results for image preference are shown in Figure 5; the preference scores are relative values. Figure 5a shows the evaluation results of the four TMOs as a box plot, and Figure 5b shows the preference evaluation results of each image for the four algorithms.
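A minimal sketch of the per-observer z-score normalization is shown below; the (observers x ratings) array layout is an assumption about how the raw scores are stored.

```python
import numpy as np

def z_score_per_observer(scores):
    """Normalize each observer's ratings to zero mean and unit variance,
    removing individual scoring biases before pooling across observers."""
    scores = np.asarray(scores, dtype=float)
    mean = scores.mean(axis=1, keepdims=True)
    std = scores.std(axis=1, ddof=1, keepdims=True)
    return (scores - mean) / std

# Toy usage: 5 observers x 72 ratings on the -3..3 categorical scale.
raw = np.random.randint(-3, 4, size=(5, 72))
print(z_score_per_observer(raw).mean(axis=1))   # ~0 for every observer
```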
In Figure 5a, the difference between the evaluation results of the proposed algorithm and the other TMOs was tested using a t-test, and statistically significant differences were observed, indicating that the results of the subjective evaluation are reliable. Figure 5a shows that the proposed iCAM06-m achieves the highest image-preference score, followed by iCAM06 and MSD (which are relatively close), with GF performing worst. Figure 3(b1–b6) shows the poor performance of GF, particularly in the dim parts of the images. Although the results of MSD and iCAM06 are relatively good, their tone mapped images cannot reproduce the image color appearance accurately. Compared with iCAM06, the images compressed by iCAM06-m exhibit improved chroma, hue angle, and details.
In Figure 5b, the statistical significance of the preference difference between iCAM06-m and the other TMOs is calculated for each image. Almost every image exhibits significant differences between the proposed and the other algorithms, confirming the reliability of the subjective assessment and the discrepancies in tone mapping performance among the TMOs. Moreover, the GF algorithm performs worst, and the MSD algorithm performs well on some images but not on others. In general, the compression performances of iCAM06 and iCAM06-m are better, and iCAM06-m is more stable. Furthermore, as seen in Figure 3(e1–e6) and Figure 5b, iCAM06-m performs best in terms of image preference on the images 'Paul', 'Yosemite', and 'Tinterna', which have high saturation and luminance.

4.2. Objective Evaluation

Objective image quality assessment indices (IQAIs) are simple and efficient for predicting the real perception of images by human vision. This study adopted a typical tone-mapping index, the tone-mapped quality index (TMQI) [36], and universal IQAIs, namely image information entropy (IE) [37,38], image sharpness (IS) [39], and image chroma (IC) [40], to evaluate the tone mapped images [27]. The IQAIs can be calculated as follows:
$$ \mathrm{TMQI} = w\, S^{\alpha} + (1 - w)\, N^{\beta}, \qquad 0 < w < 1 \qquad (24) $$

$$ \mathrm{IE} = -\sum_{i=0}^{255} p_i \log(p_i) \qquad (25) $$

$$ \mathrm{IS} = \frac{1}{(R-1)(C-1)} \sum_{i=1}^{R-1} \sum_{j=1}^{C-1} \sqrt{ \frac{(x_{i,j} - x_{i+1,j})^2 + (x_{i,j} - x_{i,j+1})^2}{2} } \qquad (26) $$

$$ \mathrm{IC} = \frac{1}{(R-1)(C-1)} \sum_{i,j} \sqrt{a_{i,j}^2 + b_{i,j}^2} \qquad (27) $$
where $w$ represents the relative weight of the structural fidelity $S$ and the naturalness $N$; $\alpha$ and $\beta$ are the adjustment exponents of $S$ and $N$, set to 0.304 and 0.708, respectively; the calculation of $S$ and $N$ is described in [36]. In Equation (25), $i$ denotes a gray value of the tone mapped image and $p_i$ represents the probability of gray value $i$. In Equation (26), $x_{i,j}$ denotes the gray value of each pixel, and $R$ and $C$ are the numbers of image rows and columns. In Equation (27), $a_{i,j}$ and $b_{i,j}$ represent the chromaticity coordinates of the tone mapped image in CIELAB space.
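The three universal indices are straightforward to compute; a sketch following Equations (25)-(27) is given below. TMQI additionally needs the original HDR reference and the published structural-fidelity/naturalness code [36], so it is not reproduced here. The logarithm base in Equation (25) is not stated, so base 2 is assumed.

```python
import numpy as np

def image_entropy(gray_u8):
    """IE, Equation (25): entropy of the 8-bit gray-level histogram (base-2 log assumed)."""
    hist = np.bincount(gray_u8.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def image_sharpness(gray):
    """IS, Equation (26): RMS of horizontal and vertical gray-level differences."""
    gray = gray.astype(float)
    dx = gray[:-1, :-1] - gray[:-1, 1:]    # x_{i,j} - x_{i,j+1}
    dy = gray[:-1, :-1] - gray[1:, :-1]    # x_{i,j} - x_{i+1,j}
    return float(np.sqrt((dx ** 2 + dy ** 2) / 2.0).mean())

def image_chroma(a_lab, b_lab):
    """IC, Equation (27): mean CIELAB chroma of the tone mapped image."""
    return float(np.sqrt(a_lab ** 2 + b_lab ** 2).mean())
```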
The IQAIs of the tone mapped images for the four TMOs are listed in Table 1, and the results for each image, together with the mean over all images, are shown in Figure 6. The IE distribution of each image processed by the four TMOs is shown in Figure 6a, where the histogram represents the average IE over all images for each TMO. Figure 6b shows the IS distribution of each image, with the histogram giving the mean IS over all images. Figure 6c,d show the IC and TMQI distributions, respectively.
The tone-mapped quality index (TMQI) [36] assesses the ability of tone mapped images to maintain the original structure and natural fidelity. TMQI, TMQI-S, and TMQI-N represent the comprehensive index, the structural fidelity, and the naturalness, respectively; the larger the indices, the higher the structural and natural fidelity. As evident in Table 1, the TMQI of iCAM06-m was higher than that of the other TMOs and close to that of iCAM06, which can also be observed in Figure 6d. MSD can introduce artificial 'halo' artifacts that degrade the structural fidelity of the image. Thus, the proposed algorithm retains the ability of iCAM06 to maintain the original image structure and natural fidelity.
Image entropy (IE) [37,38] reflects the gray-level distribution information contained in the image: the greater the IE, the more uniform the gray distribution. The IE values show that the performances of iCAM06-m and MSD are similar. The MSD method has a good ability to adjust the gray distribution, but in iCAM06-m it is applied only to the detail layer, so the IE is slightly reduced when the detail and base layers are combined. Figure 6a shows that the IE of each image is relatively consistent between iCAM06-m and MSD, indicating that the gray distribution of iCAM06-m, with MSD added, is significantly improved compared with that of iCAM06.
Image sharpness (IS) [39] describes the ability to present minute details in images. The IS values in Table 1 show that iCAM06-m performed best in preserving image details, followed by MSD, iCAM06, and GF, which can also be observed in Figure 6b. These results indicate that iCAM06-m has a better ability to enhance details than iCAM06. In colorimetry [40], the larger the chroma of an image, the higher its colorfulness and the more pleasant the visual perception. As shown in Table 1, the chroma values of iCAM06 and iCAM06-m are higher than those of MSD and GF, indicating better chroma prediction than the TMOs that do not consider a color vision model. Moreover, the image chroma produced by iCAM06-m is the highest, indicating that the chroma correction is effective.
Overall, iCAM06-m performs best on the four IQAIs by combining the advantages of the iCAM06 and MSD methods and compensating the image chroma, especially when processing images with high saturation, high luminance, and rich details, such as 'Paul', 'Yosemite', 'Tinterna', and 'Lounge' (see Figure 3(e1–e6) and Figure 6). The entropy (IE) and sharpness (IS) performances of iCAM06-m are similar to those of MSD, indicating that introducing MSD into iCAM06 improved its detail rendition. Regarding chroma and TMQI, the performances of iCAM06-m and iCAM06 are similarly high; thus, it can be concluded that iCAM06-m reproduces the image color appearance more accurately than iCAM06, and the chroma compensation method corrects the hue drift and improves the saturation of the tone mapped images.

5. Conclusions

The modified algorithm, iCAM06-m, combines iCAM06 and MSD, and compensates the image chroma by correcting saturation and hue drift. The performance of iCAM06-m in rendering tone mapped images was evaluated from both objective and subjective aspects. The subjective scores and the objective IQAIs were consistent and showed that iCAM06-m performs well and produces satisfactory tone mapped images.
In summary, iCAM06-m remedies the defects of iCAM06, including the loss of image detail, desaturation, and hue drift, by combining the advantages of both iCAM06 and MSD. (1) The proposed chroma compensation algorithm in iCAM06-m improves saturation and corrects hue shift. (2) It inherits the stability of iCAM06 for tone mapping and retains the image color appearance, structure, and natural fidelity. (3) It preserves and enhances image details and sharpness through MSD. Moreover, being based on iCAM06, iCAM06-m offers more advantages than other TMOs in predicting the real image color appearance. Therefore, the tone mapped images obtained by iCAM06-m accurately reproduce the image color appearance and provide more image detail, indicating that iCAM06-m is a good candidate for a general-purpose TMO.

Author Contributions

Conceptualization, Y.L. (Yumei Li), N.L. and C.L.; Methodology, Y.L. (Yumei Li), N.L., W.W., C.D., Y.L. (Yasheng Li), Q.F. and C.L.; Software, Y.L. (Yumei Li), N.L., W.W., C.D., Y.L. (Yasheng Li), Q.F. and C.L.; Validation, Y.L. (Yumei Li), N.L., W.W., C.D., Y.L. (Yasheng Li), Q.F. and C.L.; Formal Analysis, Y.L. (Yumei Li), N.L., C.D., Y.L. (Yasheng Li) and C.L.; Investigation, Y.L. (Yumei Li); Resources, N.L., W.W., Q.F. and C.L.; Data Curation, Y.L. (Yumei Li), N.L., C.D., Y.L. (Yasheng Li) and C.L.; Writing—Original Draft Preparation, Y.L. (Yumei Li); Writing—Review and Editing, Y.L. (Yumei Li), N.L., W.W. and C.L.; Visualization, N.L.; Supervision, N.L., W.W., Q.F. and C.L.; Project Administration, Y.L. (Yumei Li) and N.L.; Funding Acquisition, N.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Shenzhen Science and Technology Innovation program of China (Grant No. JSGG201602121151381818) and the National Natural Science Foundation of China (Grant No. 61975012).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of Beijing Institute of Technology (protocol code 21/2020, date of approval 19 June 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data will be made available on request.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ok, J.; Lee, C. HDR tone mapping algorithm based on difference compression with adaptive reference values. J. Vis. Commun. Image Represent. 2017, 43, 61–76.
2. Chae, S.-M.; Lee, S.-H.; Kwon, H.-J.; Sohng, K.-I. A tone compression model for the compensation of white point shift generated from HDR rendering. IEICE Trans. Fundam. Electron. Commun. Comput. Sci. 2012, 95, 1297–1301.
3. Choudhury, A.; Wanat, R.; Pytlarz, J.; Daly, S. Image quality evaluation for high dynamic range and wide color gamut applications using visual spatial processing of color differences. Color Res. Appl. 2021, 46, 46–64.
4. An, G.H.; Ahn, Y.D.; Lee, S.; Kang, S.-J. Perceptual brightness-based inverse tone mapping for high dynamic range imaging. Displays 2018, 54, 1–8.
5. Patle, M.K.; Chourasia, B.; Kurmi, Y. High dynamic range image analysis through various tone mapping techniques. Int. J. Comput. Appl. 2016, 153, 14–17.
6. Lee, D.-H.; Fan, M.; Kim, S.-W.; Kang, M.-C.; Ko, S.-J. High dynamic range image tone mapping based on asymmetric model of retinal adaptation. Signal Process. Image Commun. 2018, 68, 120–128.
7. Kuang, J.; Johnson, G.M.; Fairchild, M. iCAM06: A refined image appearance model for HDR image rendering. J. Vis. Commun. Image Represent. 2007, 18, 406–414.
8. Shan, Q.; Jia, J.; Brown, M.S. Globally optimized linear windowed tone mapping. IEEE Trans. Vis. Comput. Graph. 2009, 16, 663–675.
9. Khan, I.R.; Rahardja, S.; Khan, M.M.; Movania, M.M.; Abed, F. A tone-mapping technique based on histogram using a sensitivity model of the human visual system. IEEE Trans. Ind. Electron. 2017, 65, 3469–3479.
10. Reinhard, E.; Stark, M.; Shirley, P.; Ferwerda, J. Photographic tone reproduction for digital images. ACM Trans. Graph. 2002, 21, 267–276.
11. Yang, K.-F.; Li, H.; Kuang, H.; Li, C.-Y.; Li, Y.-J. An adaptive method for image dynamic range adjustment. IEEE Trans. Circuits Syst. Video Technol. 2018, 29, 640–652.
12. Jung, C.; Xu, K. Naturalness-preserved tone mapping in images based on perceptual quantization. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 2403–2407.
13. Meylan, L.; Susstrunk, S. High dynamic range image rendering with a retinex-based adaptive filter. IEEE Trans. Image Process. 2006, 15, 2820–2830.
14. Durand, F.; Dorsey, J. Fast bilateral filtering for the display of high-dynamic-range images. ACM Trans. Graph. 2002, 21, 257–266.
15. He, K.; Sun, J.; Tang, X. Guided image filtering. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 35, 1397–1409.
16. Farbman, Z.; Fattal, R.; Lischinski, D.; Szeliski, R. Edge-preserving decompositions for multi-scale tone and detail manipulation. ACM Trans. Graph. 2008, 27, 1–10.
17. Lu, B.; Chen, J.; Wang, J.L.; Zheng, Y.M. An improved multi-scale Retinex method for tone mapping. Comput. Eng. Sci. 2017, 39, 951.
18. Gu, B.; Li, W.; Zhu, M.; Wang, M. Local edge-preserving multiscale decomposition for high dynamic range image tone mapping. IEEE Trans. Image Process. 2013, 22, 70–79.
19. Fairchild, M.; Johnson, G.M. Meet iCAM: An image color appearance model. 2002, pp. 33–38. Available online: https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=34b7f37f09bcc52897e604dca310f3a0e5fef296 (accessed on 28 January 2023).
20. Fairchild, M.D.; Johnson, G.M. iCAM framework for image appearance, differences, and quality. J. Electron. Imaging 2004, 13, 126–138.
21. Johnson, G.M.; Fairchild, M. Rendering HDR images. In Proceedings of the Color and Imaging Conference. Available online: https://scholarworks.rit.edu/cgi/viewcontent.cgi?article=1154&context=other (accessed on 28 January 2023).
22. Kwon, H.-J.; Lee, S.-H.; Bae, T.-W.; Sohng, K.-I. Compensation of de-saturation effect in HDR imaging using a real scene adaptation model. J. Vis. Commun. Image Represent. 2013, 24, 678–685.
23. Kwon, H.J.; Lee, S.H. CAM-based HDR image reproduction using CA–TC decoupled JCh decomposition. Signal Process. Image Commun. 2019, 70, 1–13.
24. Subr, K.; Soler, C.; Durand, F. Edge-preserving multiscale image decomposition based on local extrema. ACM Trans. Graph. 2009, 28, 1–9.
25. Ebner, F.; Fairchild, M.D. Development and testing of a color space (IPT) with improved hue uniformity. In Proceedings of the Sixth Color Imaging Conference: Color Science, Systems, and Applications, Scottsdale, AZ, USA, 17–20 November 1998; pp. 8–13.
26. Lu, B.; Chen, J.; Zheng, Y.; Wang, J. Tone mapping algorithm of iCAM06 based on guide filtering. Opt. Tech. 2016, 42, 130–135.
27. Mehmood, I.; Liu, X.; Khan, M.U.; Luo, M.R. Method for developing and using high quality reference images to evaluate tone mapping operators. J. Opt. Soc. Am. A 2022, 39, B11–B20.
28. Cerdá-Company, X.; Párraga, C.A.; Otazu, X. Which tone-mapping operator is the best? A comparative study of perceptual quality. J. Opt. Soc. Am. A 2018, 35, 626–638.
29. Kuang, J.; Yamaguchi, H.; Johnson, G.M.; Fairchild, M.D. Testing HDR image rendering algorithms. Color Imaging Conf. 2004, 2004, 315–320.
30. Ledda, P.; Chalmers, A.; Troscianko, T.; Seetzen, H. Evaluation of tone mapping operators using a high dynamic range display. ACM Trans. Graph. 2005, 24, 640–648.
31. Cadik, M.; Wimmer, M.; Neumann, L.; Artusi, A. Image attributes and quality for evaluation of tone mapping operators. Proc. Pac. Graph. 2006, 2006, 35–44.
32. Barkowsky, M.; Le Callet, P. On the perceptual similarity of realistic looking tone mapped high dynamic range images. In Proceedings of the 2010 IEEE International Conference on Image Processing, Hong Kong, China, 26–29 September 2010; pp. 3245–3248.
33. Drago, F.; Martens, W.L.; Myszkowski, K.; Seidel, H.-P. Perceptual evaluation of tone mapping operators. In Proceedings of the ACM SIGGRAPH 2003 Sketches & Applications, San Diego, CA, USA, 27–31 July 2003; p. 1.
34. Kuang, J.; Yamaguchi, H.; Liu, C.; Johnson, G.M.; Fairchild, M.D. Evaluating HDR rendering algorithms. ACM Trans. Appl. Percept. 2007, 4, 9.
35. Lo, M.; Luo, M.R.; Rhodes, P.A. Evaluating colour models' performance between monitor and print images. Color Res. Appl. 2015, 21, 277–291.
36. Yeganeh, H.; Wang, Z. Objective quality assessment of tone-mapped images. IEEE Trans. Image Process. 2012, 22, 657–667.
37. Tsai, D.Y.; Lee, Y.; Matsuyama, E. Information entropy measure for evaluation of image quality. J. Digit. Imaging 2008, 21, 338–347.
38. Wasson, V.; Kaur, B. Image quality assessment: Edge based entropy features estimation using soft computing techniques. Mater. Today Proc. 2022, 56, 3261–3271.
39. Lu, Q.; Zhou, W.; Li, H. A no-reference image sharpness metric based on structural information using sparse representation. Inf. Sci. 2016, 369, 334–346.
40. Liao, N. Advanced Colorimetry; Beijing Polytechnic University Press: Beijing, China, 2019.
Figure 1. Flowchart of the modified algorithm for HDR image tone mapping. The FBF was performed in the log domain, and the images were then converted to XYZ space. All the images displayed are in RGB space. "White" is the adapted image, an extremely blurred image, and "Y-sur" is its luminance channel, denoting the surround luminance. In MSD, B0, which has no gradient, was discarded, and the enhanced detail layer is the sum of the three detail layers D1, D2, and D3. The calculation is described in the preceding sections.
Figure 2. Diagram of the experimental procedure for subjective visual evaluation. (a) Tone mapped images and the neutral grey background (20% grey). The evaluation content is shown in the upper right corner of the screen. The observation distance was 50 cm, and the field of view was 5°. (b) Each trial included 30 s of adaptation [27], after which the images were evaluated by pressing numbers on the keyboard.
Figure 3. Tone mapped images: (a1–a6) images produced by the linear normalization mapping method; (b1–b6) images processed by GF; (c1–c6) images processed using MSD; (d1–d6) images processed by iCAM06; and (e1–e6) images processed by the proposed algorithm, iCAM06-m.
Figure 4. Tone mapped images. (a) Image produced by the linear normalization mapping method. (b) Image processed by GF. (c) Image processed by MSD. (d) Image processed by iCAM06. (e) Image processed by iCAM06-m. The red enlarged area is used to check artifacts, and the blue enlarged area is used to check details.
Figure 5. Subjective evaluation results. (a) Results of the four algorithms for compression performance in terms of image preference. A t-test was conducted on the difference between the proposed algorithm and the other three TMOs; * indicates p < 0.05, ** indicates p < 0.01, *** indicates p < 0.001. (b) Results of the four algorithms for each image in terms of image preference. The error bars denote 1 standard deviation. The significant difference between the proposed algorithm and the other three TMOs is calculated for each image.
Figure 6. Assessment results of each image from 4 IQAIs. (a) The IE distribution of each image processed by 4 TMOs and the mean entropy of 4 TMOs. (b) The IS distribution of each image processed by 4 TMOs. (c) The IC distribution. (d) The TMQI distribution.
Table 1. Objective evaluation results of the images a.

Mean        TMQI     TMQI-S   TMQI-N   IE       IS       IC
GF          0.793    0.7188   0.267    1.882    2.981    12.095
MSD         0.845    0.731    0.524    7.585    12.721   15.454
iCAM06      0.874    0.789    0.543    4.428    10.168   16.527
iCAM06-m    0.880    0.774    0.634    7.494    14.338   17.965
a Note: Because of the instability of MSD for chroma prediction, as shown in Figure 6c, the mean chroma does not include images 1 and 3.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
