
  1. (College of Electronics and Information Engineering, Ankang University, Ankang, China)



Keywords: Image dehazing evaluation, Inverse degradation, SSIM, Blue channel prior

1. Introduction

Owing to air pollution and the increasing number of hazy days, outdoor imaging systems suffer from poor visibility in captured images, resulting in color shifts and other degradations [1]. This seriously affects the clarity and usability of the images [2, 3]. To obtain satisfactory images on hazy days and enable further processing of image information, researchers have proposed numerous image dehazing algorithms. Evaluating image quality after dehazing has therefore become a crucial issue, as it helps select the best algorithm or guide further iterative improvement.

The assessment of image quality can be categorized into three groups: full-reference quality evaluation methods [4-6], semi-reference (reduced-reference) quality evaluation methods [7], and no-reference quality evaluation methods [8-10]. Owing to the lack of a reference image and the ever-changing content of images, no-reference evaluation is the most challenging. Full-reference evaluation methods are increasingly mature and widely used; among them, the structural similarity index (SSIM) is a relatively complete model.

Because SSIM integrates the brightness, contrast, and structural information of the image being evaluated with those of the reference image, it provides a thorough, objective evaluation result. Due to its effectiveness and objectivity, it is widely adopted. However, standard SSIM evaluation requires a high-quality clear image as the reference. In the field of weak image signal restoration, such a reference image cannot be easily obtained, so general evaluation methods frequently use the input hazy image directly as the reference. The potential issue is that even if the restoration result improves, the structural similarity index may decrease: improving the restoration effect necessarily enhances contrast and corrects color deviation, which inevitably reduces the reliability of the SSIM-based evaluation. To compensate for this flaw, we inversely degrade the dehazed image and use it as the reference when assessing the dehazing effect with the SSIM indicator.

The contributions of the proposed scheme can be summarized as follows.

(1) In order to obtain a more reliable SSIM index for image dehazing evaluation, the dehazed images are inversely degraded based on the blue channel prior and then used as the reference image.

(2) The blue channel prior is used to estimate the transmission for adding fog back onto the dehazed image; the magnitude of the blue channel progressively increases with depth due to haze accumulation.

(3) The quad-tree search concept enables a more precise estimation of the atmospheric light within a confined region of the image. Additionally, the brightness and texture properties are taken into account.

The remainder of this paper is organized as follows. Section II briefly outlines related works. Section III describes the principle of SSIM. Section IV describes the proposed approach to inversely degrade the dehazed image. Section V discusses the experimental results. Finally, Section VI concludes the paper.

2. Related Work

Guo et al. built a comprehensive evaluation system for image defogging that objectively assesses dehazing performance based on human visual perception [11]. Chen et al., on the other hand, introduced a set of quality ranking methods utilizing a support vector machine [12] to compare the performance of various image enhancement algorithms. However, this method needs to extract 521 features from each image for both training and classification. Additionally, Gibson et al. proposed contrast enhancement indicators specifically for marine images based on the AdaBoost algorithm [13]. While these two evaluation algorithms are relatively novel, they are also notably complex.

Wang et al. utilized the SSIM between images to assess the restoration effect of different restoration algorithms [4]. A higher structural similarity between the restored image and the input image indicates a superior restoration effect. However, the classical SSIM evaluation algorithm necessitates the use of a high-quality clear image as the reference image. In practical applications involving hazy image signal restoration, the original input hazy image is frequently chosen as the reference. Consequently, a larger SSIM value does not necessarily mean a better restoration effect. For instance, the SSIM between two identical weak image signals would inevitably be greater than the SSIM value between the enhanced image and the original input weak image signal. It is conceivable that a more effective image restoration could result in a lower SSIM value, as the process of removing haze or dust may change the structure of the image. Therefore, the direct application of traditional SSIM to the evaluation of weak image signal restoration poses certain limitations.

3. SSIM Evaluation Principle

SSIM is based on the assumption that the human eye extracts structured information from an image when viewing it. From the perspective of image composition, the structure information is defined as the property of the object's structure in the scene, independent of brightness and contrast. The distortion is modeled as the combination of three different factors: brightness, contrast, and structure. The mean serves as an estimate of brightness, the standard deviation serves as an estimate of contrast, and the covariance serves as a measure of structural similarity. The schematic for the SSIM indicator can be seen in Fig. 1. A comprehensive evaluation index is obtained by jointly considering the differences in brightness, contrast, and structure between the reference image and the evaluated image.

Fig. 1. Schematic for SSIM indicator.


Assuming that the reference image is R and the evaluation image is E, the structural similarity index (SSIM) is calculated as follows:

Step 1: Prior to proceeding, the two images must be precisely aligned.

Step 2: Estimate the brightness LRE using the mean values, as shown in formula (1), where uR and uE are the mean values of the reference image R and evaluation image E, respectively, and C1 is a positive constant.

(1)
$ L_{RE} = \frac{2u_R u_E + C_1}{u_R^2 + u_E^2 + C_1}. $

Step 3: Estimate the contrast CRE using the standard deviations, as shown in formula (2), where σR and σE are the standard deviations of the reference image R and evaluation image E, respectively, and C2 is a positive constant.

(2)
$ C_{RE} = \frac{2\sigma_R \sigma_E + C_2}{\sigma_R^2 + \sigma_E^2 + C_2}. $

Step 4: Estimate the structural similarity SRE using the covariance, as shown in formula (3), where σRE represents the covariance of the reference image R and evaluation image E, and C3 is a positive constant.

(3)
$ S_{RE} = \frac{\sigma_{RE} + C_3}{\sigma_R \sigma_E + C_3}. $

Step 5: Use formula (4) to calculate the structural similarity index between the reference image and the image to be evaluated.

(4)
$ SSIM_{RE} = L_{RE}^\alpha \times C_{RE}^\beta \times S_{RE}^\gamma. $

In general, the default setting is α = β = γ = 1 with C3 = C2/2. Therefore, the SSIM index is as shown in formula (5).

(5)
$ SSIM = \frac{(2u_R u_E + C_1) \cdot (2\sigma_{RE} + C_2)}{(u_R^2 + u_E^2 + C_1) \cdot (\sigma_R^2 + \sigma_E^2 + C_2)}. $
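As a concrete illustration, the steps above can be sketched in a few lines of NumPy. This is a minimal global (single-window) variant, assuming grayscale images scaled to [0, 1]; the constants C1 = 0.01² and C2 = 0.03² follow the common defaults of Wang et al. [4], and in practice SSIM is usually computed over local windows and averaged.

```python
import numpy as np

def ssim_global(R, E, C1=0.01**2, C2=0.03**2):
    """Global (single-window) SSIM between reference R and evaluation E."""
    R = np.asarray(R, dtype=np.float64)
    E = np.asarray(E, dtype=np.float64)
    uR, uE = R.mean(), E.mean()              # brightness estimates (Eq. (1))
    sR, sE = R.std(), E.std()                # contrast estimates (Eq. (2))
    sRE = ((R - uR) * (E - uE)).mean()       # covariance, structure (Eq. (3))
    # Eq. (5): combined index with alpha = beta = gamma = 1 and C3 = C2 / 2
    num = (2 * uR * uE + C1) * (2 * sRE + C2)
    den = (uR**2 + uE**2 + C1) * (sR**2 + sE**2 + C2)
    return num / den
```

Identical images yield an index of 1, and any brightness, contrast, or structural difference drives the index below 1.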

4. Inverse Degradation of Dehazed Image

In order for the reference image to be meaningful when calculating the SSIM index, the restored haze-free image is inversely degraded by adding fog. Using this as the reference image, the SSIM value is calculated. A larger SSIM value indicates smaller distortion and a better dehazing effect.

The specific implementation process is as follows:

Firstly, we obtain the dehazing results of various image dehazing algorithms.

Then, based on the atmospheric scattering model, we perform inverse degradation and add fog to these images. During the inverse degradation process, we utilize a quad-tree method to search for the atmospheric light estimation region and estimate the transmittance based on the blue channel prior.

Finally, we calculate the structural similarity index (SSIM) between the original foggy image and the inverse-degraded image, and evaluate the quality of the dehazing algorithms by comparing the SSIM values.
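The three steps above can be sketched as a small driver loop. Everything here is a hypothetical scaffold: `dehaze_fns`, `inverse_degrade`, and `ssim` are placeholder callables standing in for the concrete dehazing algorithms, the inverse degradation of Section 4, and the SSIM of Section 3.

```python
def evaluate_dehazing(I_hazy, dehaze_fns, inverse_degrade, ssim):
    """Return {algorithm name: SSIM(original hazy input, inverse-degraded result)}."""
    scores = {}
    for name, dehaze in dehaze_fns.items():
        J = dehaze(I_hazy)                  # step 1: dehazed result
        I_hat = inverse_degrade(J)          # step 2: add fog back via Eq. (6)
        scores[name] = ssim(I_hazy, I_hat)  # step 3: compare with the input
    return scores
```

A larger score then indicates that re-fogging the dehazed result reproduces the input more faithfully.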

4.1. Atmospheric Scattering Model for Hazy Image

According to the imaging model described by McCartney [14], the mathematical representation of weak image signal degradation on hazy days can be expressed as shown in Eq. (6).

(6)
$ I_c(x, y) = J_c(x, y) \cdot t(x, y) + A_c \cdot (1 - t(x, y)), $

where Ic(x, y) is the weak image signal captured in fog, and t(x, y) represents the transmission, reflecting the ability of light to penetrate the fog layer. When the depth of field d(x, y) = 0, t(x, y) = 1, i.e., the reflected light of the scene completely passes through the atmosphere and reaches the imaging sensor without any loss. When the depth of field d(x, y) → ∞, t(x, y) = 0, meaning the penetrability of the reflected light is zero. That is, the closer a scene point is to the imaging sensor, the less the light is attenuated by scattering and thus the greater the penetration capacity, and vice versa. Jc(x, y) represents the reflected light of the scene itself, i.e., the expected restored image, and Ac represents the ambient light.

The restored haze-free image is inversely degraded according to formula (6), where Jc(x, y) is the restored image, computed by the various dehazing algorithms, and Ic(x, y) is the expected inversely degraded result, obtained by solving formula (6). The ambient light Ac and the transmission t(x, y) need to be estimated.
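The re-fogging step under Eq. (6) can be sketched directly, assuming float images in [0, 1] with the transmission map broadcast over the three color channels:

```python
import numpy as np

def add_fog(J, t, A):
    """Inverse degradation I = J * t + A * (1 - t) of Eq. (6).

    J: dehazed image, H x W x 3; t: transmission map, H x W;
    A: ambient light, one value per RGB channel.
    """
    t3 = t[..., None]                        # broadcast t over the channels
    A = np.asarray(A, dtype=np.float64)
    return J * t3 + A * (1.0 - t3)
```

With full transmission (t = 1) the dehazed image is returned unchanged, while zero transmission yields the ambient light everywhere, matching the two limiting cases discussed above.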

4.2. Estimation of Ambient Light

Generally, the area with the densest fog is identified as the region for estimating the ambient light [15], which often corresponds to the farthest or brightest sky area in the image. Because the depth there is infinite, the transmission equals 0, i.e., t(x) = exp(−βd(x)) = 0, where d(x) represents the scene depth. Combined with the physical model of fog degradation in formula (6), it follows that I(x) = Ac in this case, so the assumption behind this atmospheric light estimation is reasonable. To more reliably identify the region for atmospheric light estimation, a quad-tree search method is proposed that fully considers the brightness, texture, and spatial-location characteristics of the atmospheric light to be estimated.

(1) Limitation of the search space area

Under normal circumstances, the area experiencing the thickest fog is usually at the farthest distance. In accordance with the conventional understanding of image content, distant and uncovered scenes such as the sky usually appear in the top half of the image. Therefore, according to this spatial-location characteristic, this paper limits the search area for atmospheric light to the top L lines of the image. This region limit enhances the reliability of the search target and reduces the search workload.

(2) Determination of search methods and criteria

The quad-tree idea is employed for an iterative segmentation search. To exclude cloud areas in the sky, a region characterized by high brightness and relative smoothness is chosen; consequently, these two attributes serve as the criteria for identifying the target region.

Based on the dark channel prior hypothesis, the dark channel of a foggy image can serve as an indicator of fog concentration to a certain degree. To identify the area with the highest fog density, this paper conducts the search within the top L lines of the dark channel map Imin(x) of the foggy image. For convenience and without loss of generality, the dark channel is computed pixel by pixel here, that is, Imin(x) = minc∈{r,g,b}(Ic(x)).

Divide the top-L-line region into four rectangular subregions, as illustrated in Fig. 2(b), and select the region with the highest brightness and the smoothest texture as the target area for further quarter segmentation. Repeat these segmentation steps, continuing to choose the target area for the next segmentation based on the same criteria, until the size of the segmented area falls below the preset threshold sH × sW. The target region finally found in Imin(x) corresponds to a region ΩA(x) in the original input foggy image I(x); within this region, the pixel with the smallest difference between its RGB components and the highest brightness is selected as the estimate of the atmospheric light, that is, the final value Ac is:

(7)
$ A_c = I_c(x^*), \quad x^* = \arg\max_{x \in \Omega_A(x)} \left( \sum_{c \in \{R,G,B\}} I_c(x) - \operatorname{std}_{c \in \{r,g,b\}} I_c(x) \right), $

where std(·) denotes the standard deviation operator. Fig. 2 demonstrates the process and outcome of searching for the atmospheric light using the proposed method. For the foggy image presented in Fig. 2(a), following the quad-tree iterative search approach, the final region selected for atmospheric light estimation is the one highlighted in red in Fig. 2(c). Notably, the highlighted region is precisely positioned within the sky area, and the resulting estimated atmospheric light is Ac = [168, 179, 190]. According to the method of He et al. [16], the brightest region is designated as the estimation region for atmospheric light, specifically the area highlighted in red in Fig. 2(b). However, it is evident that the marked region is incorrectly positioned on a building, and consequently the estimated atmospheric light Ac = [253, 250, 239] is inaccurate.
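The quad-tree search itself can be sketched as follows. The input is the top-L-line portion of the dark channel map, and each iteration keeps the quadrant with the highest mean brightness and lowest texture (standard deviation); the exact brightness-minus-std score and the stopping size are assumptions in the spirit of the criteria described above.

```python
import numpy as np

def quadtree_region(dark_top, min_h=8, min_w=8):
    """Return (r0, r1, c0, c1) of the selected region in the top-L dark channel."""
    r0, r1 = 0, dark_top.shape[0]
    c0, c1 = 0, dark_top.shape[1]

    def score(q):
        patch = dark_top[q[0]:q[1], q[2]:q[3]]
        return patch.mean() - patch.std()   # bright and smooth is preferred

    # iteratively quarter the region until it falls below the size threshold
    while (r1 - r0) > min_h and (c1 - c0) > min_w:
        rm, cm = (r0 + r1) // 2, (c0 + c1) // 2
        quads = [(r0, rm, c0, cm), (r0, rm, cm, c1),
                 (rm, r1, c0, cm), (rm, r1, cm, c1)]
        r0, r1, c0, c1 = max(quads, key=score)
    return r0, r1, c0, c1
```

The atmospheric light is then read from the corresponding region ΩA(x) of the original image as in Eq. (7).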

Fig. 2. Quadtree search of atmospheric light. (a) Original image with fog. (b) The thickest fog area searched by the method of [16]. (c) The thickest fog area searched by the proposed method.


4.3. Estimation of Transmission

Fattal [17] showed that the surface shading and transmission functions are locally statistically uncorrelated. Wang et al. [18] observe that in typical photographs of natural landscapes, surface shading and transmission are largely separated by wavelength into the red, green, and blue channels of the image, respectively. This observation is evident in Fig. 3, where the red channel predominantly contains shading information, whereas the blue channel exhibits almost no shading information. The underlying reason is that most natural landscapes predominantly reflect light in the red and green spectral channels, with significantly less reflection in the blue channel.

Fig. 3. Shading and transmission are locally uncorrelated.


In the image, the blue channel is primarily influenced by atmospheric airlight and the illumination from the background sky. Consequently, the magnitude of the blue channel in the images tends to progressively increase with depth, due to the accumulation of haze. This phenomenon can be referred to as the “blue channel prior.”

Based on the blue channel prior, it can be concluded that haze accumulation typically follows an exponential pattern with increasing depth [18]. Thus, the logarithmic relationship between the normalized blue channel and the depth of the image can be formulated as follows.

(8)
$ d(x) = -\log(1 - B(x)), $

where d(x) represents the image depth of field and B(x) represents the normalized blue channel of the image, with range [0, 1]. Because the scene varies continuously, the depth of field is locally smooth, so a guided filter or bilateral filter is used for edge-preserving denoising. The texture details of d(x) are thus smoothed, yielding d1(x). According to the relationship between transmission and depth of field, the transmission t(x) can be expressed as follows.

(9)
$ t(x) = 1 - \beta d_1(x). $

Here β is the scattering coefficient, which is set to 0.95 in the experiments of this paper. Then, the dehazed image J(x), t(x), and Ac are substituted into model (6) to calculate the inversely degraded image Î(x). The inversely degraded image Î(x) and the original input weak image signal I(x) are then used to calculate the SSIM index. In general, a higher SSIM index indicates less distortion introduced by the algorithm and a more favorable recovery effect.
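The transmission estimate of Eqs. (8)-(9) can be sketched as below. A simple box blur stands in for the guided or bilateral filter (an assumption kept only to stay self-contained), and `eps` guards the logarithm when B approaches 1.

```python
import numpy as np

def transmission_from_blue(B, beta=0.95, eps=1e-3, k=5):
    """Estimate transmission from the normalized blue channel B in [0, 1]."""
    d = -np.log(np.clip(1.0 - B, eps, 1.0))        # Eq. (8): depth from blue
    # box-filter smoothing as a stand-in for edge-preserving filtering
    pad = k // 2
    dp = np.pad(d, pad, mode='edge')
    d1 = np.zeros_like(d)
    for i in range(k):
        for j in range(k):
            d1 += dp[i:i + d.shape[0], j:j + d.shape[1]]
    d1 /= k * k
    t = 1.0 - beta * d1                            # Eq. (9)
    return np.clip(t, 0.0, 1.0)
```

A weak blue channel (little airlight, small depth) yields a transmission near 1, while a strong blue channel drives the transmission toward 0, consistent with the blue channel prior.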

Fig. 4. Comparison of transmission estimation. (a) Input image. (b) Result of He et al. [16]. (c) Result of the proposed method. (d) The residual between (b) and (c).


4.4. Analysis of the Inverse Degradation

The scattering model presented in Eq. (6) is widely recognized and adopted by recent dehazing methodologies, and our inverse degradation process is based on this model. Within this framework, the precise estimation of the atmospheric light and transmission is of paramount importance. To evaluate the performance, the parameters estimated by our proposed inverse degradation method were compared with those obtained by the state-of-the-art algorithm of He et al. [16], which is based on the dark channel prior.

In Fig. 2, it can be observed that the proposed method accurately locates the atmospheric light in the sky region, whereas the algorithm proposed by He et al. [16] incorrectly locates it on a building.

Fig. 4 shows the transmission comparison results between our method and He et al.’s [16]. As can be seen, the two transmission maps are highly comparable. The residual map is presented in Fig. 4(d). The only notable difference lies in the fact that our results capture more intricate details on foreground objects, such as stones or man-made structures like roofs. However, a significant advantage of our approach is that transmission values can be estimated in a much simpler and more easily implementable manner. In summary, the proposed algorithm is grounded in a widely recognized model, and the estimated parameters are reasonable. Consequently, reliable inverse degradation processing can be performed to obtain hazy images.

5. Experimental Results and Analysis

In order to test the effectiveness of the proposed evaluation algorithm, we conducted an evaluation of a series of classical dehazing algorithms, including Fattal's algorithm [17], Xiao and Gan's algorithm [20], He et al.'s dark channel prior algorithm [16], Kopf et al.'s algorithm [19], Tan's algorithm [15], Tarel et al.'s algorithm [21], and Yu et al.'s algorithm [22]. We selected three representative images for display, named 'ny12', 'ny17', and 'y01'. Firstly, we obtained the dehazed images using the aforementioned methods, as depicted in Figs. 5 through 7. Subsequently, we generated the inverse degradation images corresponding to Figs. 5 through 7, which are presented in Figs. 8 through 10.

Fig. 5. Dehazing experiment result comparison on image “ny12”. (a) Input hazy image “ny12”. (b) Kopf et al.’s algorithm [19]. (c) Xiao and Gan’s algorithm [20]. (d) Tan’s algorithm [15]. (e) Tarel et al.’s algorithm [21]. (f) Fattal’s algorithm [17]. (g) He et al.’s algorithm [16]. (h) Yu et al.’s algorithm [22].


As seen in Fig. 5, which shows a fog image taken at high altitude with a complex scene, the algorithm of [19] in Fig. 5(b) and the algorithm of [16] in Fig. 5(g) are slightly inadequate for the building clusters. In contrast, the restoration results of the algorithm of [15] shown in Fig. 5(d) and the algorithm of [17] shown in Fig. 5(f) appear significantly over-enhanced. The method of [20] shown in Fig. 5(c) and the method of [21] shown in Fig. 5(e) are similar to the algorithm of [22] shown in Fig. 5(h), all providing a relatively ideal fog removal effect.

Fig. 6. Dehazing experiment result comparison on image “ny17”. (a) Input hazy image “ny17”. (b) Kopf et al.’s algorithm [19]. (c) Xiao and Gan’s algorithm [20]. (d) Tan’s algorithm [15]. (e) Tarel et al.’s algorithm [21]. (f) Fattal’s algorithm [17]. (g) He et al.’s algorithm [16]. (h) Yu et al.’s algorithm [22].


As shown in Fig. 6, this is also a fog image captured at high altitude with a complex scene. Unlike Fig. 5, this image also includes a river area similar to the sky. The method of [19] shown in Fig. 6(b), the method of [20] shown in Fig. 6(c), the method of [15] shown in Fig. 6(d), the method of [21] shown in Fig. 6(e), and the method of [16] shown in Fig. 6(g) all exhibit significant deviations in the treatment of the sky and river areas. In contrast, the method of [17] shown in Fig. 6(f) and the algorithm of [22] shown in Fig. 6(h) provide a relatively ideal fog removal effect.

Fig. 7. Dehazing experiment result comparison for image “y01”. (a) Input hazy image “y01”. (b) Kopf et al.’s algorithm [19]. (c) Xiao and Gan’s algorithm [20]. (d) Tan’s algorithm [15]. (e) Tarel et al.’s algorithm [21]. (f) Fattal’s algorithm [17]. (g) He et al.’s algorithm [16]. (h) Yu et al.’s algorithm [22].


Fig. 7 displays the restoration results for an outdoor landscape photo with fog. The method of [16] shown in Fig. 7(g) and the method of [22] shown in Fig. 7(h) can restore better cloud profiles and sky colors in the distant regions, while simultaneously achieving superior detail in closer areas such as the rock and green grass. Based on subjective impression, both methods outperform the other algorithms in terms of detail and color restoration.

Fig. 8. Manual degraded results for the restored results in Fig. 5 for image “ny12”. (a) Input hazy image “ny12”. (b) Kopf et al.’s algorithm [19]. (c) Xiao and Gan’s algorithm [20]. (d) Tan’s algorithm [15]. (e) Tarel et al.’s algorithm [21]. (f) Fattal’s algorithm [17]. (g) He et al.’s algorithm [16]. (h) Yu et al.’s algorithm [22].


Fig. 9. Manual degraded results for the restored results in Fig. 6 for image “ny17”. (a) Input hazy image “ny17”. (b) Kopf et al.’s algorithm [19]. (c) Xiao and Gan’s algorithm [20]. (d) Tan’s algorithm [15]. (e) Tarel et al.’s algorithm [21]. (f) Fattal’s algorithm [17]. (g) He et al.’s algorithm [16]. (h) Yu et al.’s algorithm [22].


Fig. 10. Manual degraded results for the restored results in Fig. 7 for image “y01”. (a) Input hazy image “y01”. (b) Kopf et al.’s algorithm [19]. (c) Xiao and Gan’s algorithm [20]. (d) Tan’s algorithm [15]. (e) Tarel et al.’s algorithm [21]. (f) Fattal’s algorithm [17]. (g) He et al.’s algorithm [16]. (h) Yu et al.’s algorithm [22].


The results of the inverse degradation process for the restored foggy images “ny12” in Fig. 5, “ny17” in Fig. 6, and “y01” in Fig. 7 are presented in Figs. 8-10, respectively. Table 1 lists the computed SSIM values between the original input images and their corresponding inversely degraded versions. Additionally, Fig. 11 depicts a bar chart that visually represents these SSIM indices. The computed indicators show that the SSIM values for algorithms [16] and [22] rank among the highest. The subjective visual assessments align closely with the objective evaluations calculated by the system presented in this paper.

Table 1. SSIM indicators of manual degraded results for “ny12”, “ny17”, and “y01” from Figs. 8-10.

SSIM | Kopf et al. [19] | Xiao and Gan [20] | Tan [15] | Tarel et al. [21] | Fattal [17] | He et al. [16] | Yu et al. [22]
ny12 | 0.9046 | 0.8893 | 0.8958 | 0.9318 | 0.9013 | 0.9264 | 0.9432
ny17 | 0.9199 | 0.9417 | 0.9371 | 0.9604 | 0.9517 | 0.9476 | 0.9758
y01 | 0.9340 | 0.9537 | 0.9112 | 0.9142 | 0.9593 | 0.9770 | 0.9785

Fig. 11. Bar chart of the SSIM indicators for the methods in Table 1.


6. Conclusion

This work introduces an inverse degradation method to tackle the challenge of lacking suitable reference images for evaluating image dehazing algorithms. The structural similarity between the dehazed images and their inversely degraded counterparts is assessed using the Structural Similarity Index (SSIM). During the inverse degradation process, the transmission map is estimated utilizing the blue channel prior, while the airlight is obtained through a quad-tree search approach. Subsequently, the atmospheric scattering model is employed to generate the inversely degraded image. In this study, seven traditional dehazing algorithms are examined and tested. The results indicate that, in most cases, the subjective visual assessments align with the objective evaluation indices derived from the proposed methodology. However, a limitation of the algorithm is that the blue channel prior is primarily applicable to outdoor natural scene images; it faces specific constraints when dealing with indoor or man-made scenes, as well as nighttime fog scenes. Future work will explore a more comprehensive estimation technique to address these limitations.

Using the SSIM index alone as an evaluation criterion has certain limitations. In future research, we will build upon the inverse degradation reference image generation method proposed in this paper by incorporating more evaluation indices to assess the effectiveness of dehazing algorithms.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Funding

This article is funded by the Natural Science Foundation of China (No. 61801005); the Science Research Program of the Department of Education of Shaanxi Province (No. 25JS002); and the Educational Science Planning Project of Shaanxi Province (No. AK25-25).

References

[1] Wei H., Yun W., 2023, Single image dehazing via color balancing and quad-decomposition atmospheric light estimation, Optik, Vol. 275.
[2] Van Nguyen T., Vien A. G., Lee C., 2022, Real-time image and video dehazing based on multiscale guided filtering, Multimedia Tools and Applications, Vol. 81, No. 25, pp. 36567-36584.
[3] Ting C., Mengni L., Tao G., 2022, A fusion-based defogging algorithm, Remote Sensing, Vol. 14, No. 2.
[4] Wang Z., Bovik A. C., Sheikh H. R., Simoncelli E. P., 2004, Image quality assessment: From error visibility to structural similarity, IEEE Transactions on Image Processing, Vol. 13, No. 4, pp. 600-612.
[5] Min X., Zhai G., Gu K., Yang X., 2019, Quality evaluation of image dehazing methods using synthetic hazy images, IEEE Transactions on Multimedia, Vol. 21, No. 9, pp. 2319-2333.
[6] Zhao S., Zhang L., Huang S., Jiang T., Tian Y., 2020, Dehazing evaluation: Real-world benchmark datasets, criteria, and baselines, IEEE Transactions on Image Processing, Vol. 29, pp. 6947-6962.
[7] Carnec M., Le Callet P., Barba D., 2008, Objective quality assessment of color images based on a generic perceptual reduced reference, Signal Processing: Image Communication, Vol. 23, No. 4, pp. 239-256.
[8] Sheikh H. R., Bovik A. C., Cormack L., 2005, No-reference quality assessment using natural scene statistics: JPEG2000, IEEE Transactions on Image Processing, Vol. 14, No. 11, pp. 1918-1927.
[9] Li Y., Wu G., Zhang X., 2021, Image dehazing method based on dual channel and image quality evaluation model, Journal of Optoelectronics and Laser, Vol. 32, No. 7, pp. 703-710.
[10] Kim W., Yim C., 2022, No-reference image contrast quality assessment based on the degree of uniformity in probability distribution, IEIE Transactions on Smart Processing and Computing, Vol. 11, No. 2, pp. 85-91.
[11] Guo F., Tang J., Cai Z.-X., 2014, Objective measurement for image defogging algorithms, Journal of Central South University, Vol. 21, pp. 272-286.
[12] Chen Z., Jiang T., Tian Y., 2014, Quality assessment for comparing image enhancement algorithms, Proc. of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 3003-3010.
[13] Gibson K. B., Nguyen T. Q., 2013, A no-reference perceptual-based contrast enhancement metric for ocean scenes in fog, IEEE Transactions on Image Processing, Vol. 22, No. 10, pp. 3982-3993.
[14] McCartney E. J., 1976, Scattering by molecules and particles, Optics of the Atmosphere, pp. 177-178.
[15] Tan R. T., 2008, Visibility in bad weather from a single image, Proc. of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1-8.
[16] He K., Sun J., Tang X., 2011, Single image haze removal using dark channel prior, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 33, No. 12, pp. 2341-2353.
[17] Fattal R., 2015, Dehazing using color-lines, ACM Transactions on Graphics, Vol. 34, No. 1.
[18] Ghosh P. W., Rerendering D. B. A., 2014, Rerendering landscape photographs, Proc. of the European Conference on Visual Media Production, pp. 13-14.
[19] Kopf J., Neubert B., Chen B., Cohen M., Cohen-Or D., Deussen O., Lischinski D., 2008, Deep photo: Model-based photograph enhancement and viewing, ACM Transactions on Graphics, Vol. 27, No. 5.
[20] Xiao C., Gan J., 2012, Fast image dehazing using guided joint bilateral filter, The Visual Computer, Vol. 28, No. 6-8, pp. 713-721.
[21] Tarel J.-P., Hautière N., Caraffa L., Cord A., Halmaoui H., Gruyer D., 2012, Vision enhancement in homogeneous and heterogeneous fog, IEEE Intelligent Transportation Systems Magazine, Vol. 4, No. 2, pp. 6-20.
[22] Yu S., Zhu H., Fu Z., Wang Q., 2016, Single image dehazing using multiple transmission layer fusion, Journal of Modern Optics, Vol. 63, No. 6, pp. 519-535.
Shun-yuan YU

Shun-yuan YU is an Associate Professor at the Faculty of Electronics and Information Engineering, Ankang University. She earned her master of science degree in materials science and engineering from Xi’an Jiao Tong University (XJTU), China, in 2008, and subsequently obtained her Ph.D. degree from the Faculty of Automation and Information Engineering at Xi’an University of Technology, China, in 2017.