
  1. Department of Computer Software, Korean Bible University, Seoul, Korea (wkim@bible.ac.kr)
  2. Department of Computer Science and Engineering, Konkuk University, Seoul, Korea (cyim@konkuk.ac.kr)



Keywords: Contrast, Entropy, Image quality assessment, Image dehazing

1. Introduction

The contrast of an image greatly affects its overall visual quality. Fundamentally, any high-quality image must have little distortion and appropriate contrast [1]. Compared to the importance placed on distortion, however, the role of contrast in image quality assessment (IQA) has been relatively neglected. Previous research focused on measuring various distortion-related artifacts, such as noise and blur [2-4]. To measure the amount of contrast change produced by image contrast enhancement, the peak signal-to-noise ratio (PSNR) and the absolute mean brightness error (AMBE) have been widely used as full-reference (FR) IQA metrics [5-8]. FR-IQA methods such as PSNR and AMBE evaluate the degree of distortion and have the limitation that the original image is required as a reference.

Typical approaches to assessing image quality rely on the assumption that visual quality can be estimated from error signals and that the degree of distortion in structural information is closely related to perceptual quality [9]. The structural similarity (SSIM) index [16], one of the best-known FR-IQA methods, measures the degradation of image quality and uses the standard deviation of pixel intensities as an estimate of image contrast, since image contrast is commonly regarded as the variation in pixel intensities.

In this paper, we propose a new no-reference IQA method for assessing image contrast quality without the need for a reference image, referred to as image contrast quality assessment based on the degree of uniformity in probability distribution (ICQA-DUPD). The approach is based on the fundamental assumption that an image whose pixel intensity values follow a uniform probability distribution has better contrast than an image whose distribution is more concentrated. Unlike typical approaches, our method does not rely on error signals or structural information for no-reference IQA.

The organization of this paper is as follows. Section 2 reviews related work and summarizes our contributions. Section 3 describes the proposed ICQA-DUPD method. Section 4 presents experimental results, and Section 5 concludes the paper.

2. Related Work and Contributions

Contrast-based no-reference image quality assessment (NR-IQA) remains a challenging task. In general, entropy has been widely used as an NR-IQA metric for image contrast assessment [2-8].

Entropy measures the amount of information and describes the uncertainty of an information source [10]. In image processing, discrete entropy $H$ quantifies the average number of bits per pixel required to encode the image losslessly. Entropy reaches its maximum value (or score) for a histogram with a uniform distribution, and a higher entropy value indicates that richer details can be expressed in the image. Let $L$ be the number of bits per pixel for an image, and let $p_{i}$ be the probability of intensity level $i$ in the image for $i=0,1,\cdots ,2^{L}-1$. Discrete entropy $H$ is defined as

(1)
$ H=\sum _{i=0}^{2^{L}-1}p_{i}\log _{2}\frac{1}{p_{i}} $

Discrete entropy $H$ in (1) has the maximum value (score) of $L$ for an image with $L$ bits per pixel. We define normalized entropy, $H_{N}$, which has the maximum value of 1, as follows:

(2)
$ H_{N}=\frac{1}{L}\sum _{i=0}^{2^{L}-1}p_{i}\log _{2}\frac{1}{p_{i}} $
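
For concreteness, a minimal NumPy sketch of Eqs. (1) and (2) might look as follows (the function name and the 8-bit default are our own choices; terms with $p_{i}=0$ are taken as contributing zero):

import numpy as np

def normalized_entropy(hist, L=8):
    # Normalized entropy H_N of a histogram with 2**L bins, Eqs. (1)-(2).
    p = np.asarray(hist, dtype=np.float64)
    p = p / p.sum()                   # probabilities p_i
    p = p[p > 0]                      # terms with p_i = 0 contribute nothing
    H = np.sum(p * np.log2(1.0 / p))  # Eq. (1): discrete entropy in bits per pixel
    return H / L                      # Eq. (2): divide by L so the maximum is 1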

In [11], the authors presented an entropy-based no-reference image quality assessment (ENIQA) method that evaluates the quality of a distorted image using entropy in the spatial domain and the statistical characteristics of entropy in the frequency domain. In [12], the natural image quality evaluator (NIQE) was presented, which assesses the quality of an image using a quality-aware collection of natural scene statistics (NSS) derived from patches of undistorted images. In [13], the authors presented a perception-based image quality evaluator (PIQUE) that quantifies the distortion of an image by extracting local features from a fine-grained block-level distortion map. In [14], a no-reference IQA method was presented that evaluates the quality of a contrast-distorted image by extracting unnatural characteristics from NSS models based on moment and entropy features. In [15], the blind/referenceless image spatial quality evaluator (BRISQUE) was presented, which estimates the quality of an image from a number of NSS-based statistical features.

The previous NR-IQA methods have some limitations in assessing image contrast quality. For example, they give inconsistent quality values for images with various degrees of contrast distortion. One important application of NR-IQA would be quality assessment of hazy and dehazed images, yet the previous NR-IQA methods do not adequately capture the relative quality improvement of a dehazed image over the original hazy image. To overcome these limitations, we propose a new NR-IQA method for estimating contrast-based image quality.

The main contributions of this paper are summarized as follows.

· We propose a new NR-IQA method based on the degree of uniformity in probability distribution.

· We propose a symmetric, hierarchical, and recursive calculation over a binary tree of intensity ranges for contrast measurement.

· The proposed method can consistently assess the quality of images with various degrees of contrast distortion as well as the relative improvement in a dehazed image.

· The proposed method can assess the contrast quality of an image with lower computational complexity compared to the previous methods for real-time applications.

3. Proposed Method

To quantify the degree of uniformity in a probability distribution for image contrast assessment, we recursively divide the intensity range into left and right ranges with an equal number of intensity levels at each depth, forming a binary tree. We then calculate quality scores from the probability distribution of the nodes at each depth.

Let $I_{d,2j}$ and $I_{d,2j+1}$, respectively, be the index sets of the left and right intensity ranges of the $j$th node for $j=0,1,\cdots ,2^{d}-1$ at depth $d$, each with an equal number of intensity levels:

(3)
$ I_{d,2j}=\left\{2^{L-d}\cdot j,\cdots ,2^{L-d}\cdot j+2^{L-d-1}-1\right\} \\ $
(4)
$ I_{d,2j+1}=\left\{2^{L-d}\cdot j+2^{L-d-1},\cdots ,2^{L-d}\cdot j+2^{L-d}-1\right\} $

Let $I_{d}$ be the full intensity range of the node, i.e., the union of $I_{d,2j}$ and $I_{d,2j+1}$:

(5)
$ I_{d}=I_{d,2j}\cup I_{d,2j+1} $
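
As an illustration, the index sets in Eqs. (3)-(5) can be generated directly from $d$, $j$, and $L$. The following helper (our own naming) is a sketch that is reused in the consolidated implementation given at the end of this section:

def index_sets(d, j, L=8):
    # Left and right index sets I_{d,2j} and I_{d,2j+1} of the jth node at depth d, Eqs. (3)-(4).
    start = 2 ** (L - d) * j          # first intensity level covered by the node
    half = 2 ** (L - d - 1)           # half the number of levels in the node range
    left = list(range(start, start + half))
    right = list(range(start + half, start + 2 * half))
    return left, right                # their union is the node range I_d of Eq. (5)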

Let $r_{d,2j}$ and $r_{d,2j+1}$, respectively, be the probability ratios of the left and right intensity ranges of the $j$th node at depth $d$, defined as

(6)
$ r_{d,2j}=\frac{\sum _{i\in {I_{d,2j}}}p_{i}}{\sum _{i\in {I_{d}}}p_{i}} \\ $
(7)
$ r_{d,2j+1}=\frac{\sum _{i\in {I_{d,2j+1}}}p_{i}}{\sum _{i\in {I_{d}}}p_{i}} $

Let $\alpha _{d,j}$ represent the smaller probability ratio of the left and right probability ratios of the $j$th node at depth $d$ in the binary tree for $j=0,1,\cdots ,2^{d}-1$:

(8)
$ \alpha _{d,j}=\min \left(r_{d,2j},r_{d,2j+1}\right) $
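
A corresponding sketch of Eqs. (6)-(8) for a single node is given below; the case where the node range carries zero probability, which Eqs. (6) and (7) leave undefined, is assumed here to contribute nothing:

def split_ratios(p, left, right):
    # Probability ratios r_{d,2j}, r_{d,2j+1} and the smaller ratio alpha_{d,j}, Eqs. (6)-(8).
    p_left = sum(p[i] for i in left)
    p_right = sum(p[i] for i in right)
    total = p_left + p_right          # probability mass of the whole node range I_d
    if total == 0:
        return 0.0, 0.0, 0.0          # assumption: empty ranges contribute nothing
    return p_left / total, p_right / total, min(p_left, p_right) / total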

The node probability weights, $w_{d,2j}$ and $w_{d,2j+1}$ at depth $d$, are obtained from probability weight $w_{d-1,j}$ of the parent node at depth $d-1$. The probability weight of the root node, $w_{0,0}$, is set to 1. Each parent node in the binary tree has two child nodes (left and right), whose probability weights $w_{d,2j}$ and $w_{d,2j+1}$ are calculated as

(9)
$ w_{d,2j}=w_{d-1,j}\cdot r_{d-1,2j} \\ $
(10)
$ w_{d,2j+1}=w_{d-1,j}\cdot r_{d-1,2j+1} $

The node quality score, $S_{d,j}$, which is the $j$th score at depth $d$, is calculated from the smaller probability ratio, $\alpha _{d,j}$, and node probability weight $w_{d,j}$ as follows:

(11)
$ S_{d,j}=2^{-d}\cdot w_{d,j}\cdot \alpha _{d,j} $

Quality score $S_{d}$ at depth $d$ is calculated as the sum of node scores at depth $d$:

(12)
$ S_{d}=\sum _{j=0}^{2^{d}-1}S_{d,j} $

The proposed ICQA-DUPD score, $V$, is the sum of node scores $S_{d}$ for depths $d=0,1,\cdots ,L-1$, calculated as follows:

(13)
$ V=\sum _{d=0}^{L-1}S_{d}+2^{-L} $

In (13), $2^{-L}$ is added so that the maximum score for quality assessment is 1.
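
Putting Eqs. (3)-(13) together, a minimal sketch of the full score computation, reusing the helpers above, could be written as follows (again, the function name is our own, and an 8-bit grayscale histogram is assumed):

def icqa_dupd(hist, L=8):
    # ICQA-DUPD score V of a histogram with 2**L bins, Eqs. (9)-(13).
    p = np.asarray(hist, dtype=np.float64)
    p = p / p.sum()                                    # probabilities p_i
    V = 0.0
    weights = [1.0]                                    # w_{0,0} = 1 (root node)
    for d in range(L):                                 # depths d = 0, ..., L-1
        next_weights = []
        for j, w in enumerate(weights):                # nodes j = 0, ..., 2**d - 1
            left, right = index_sets(d, j, L)
            r_left, r_right, alpha = split_ratios(p, left, right)
            V += (2.0 ** -d) * w * alpha               # Eqs. (11) and (12), accumulated directly
            next_weights += [w * r_left, w * r_right]  # Eqs. (9)-(10): child weights
        weights = next_weights
    return V + 2.0 ** -L                               # Eq. (13): offset so the maximum is 1

For a perfectly uniform histogram, every $\alpha _{d,j}$ equals 0.5, so the depth scores sum to $1-2^{-L}$ and the final score reaches 1, consistent with the statement below.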

The perfectly uniform probability distribution achieves the maximum value of 1.0 for the normalized entropy and the proposed ICQA-DUPD.

Fig. 1 shows histograms that have $2^{l}$ intensity levels with non-zero probability ($p_{i}>0$) for $l$ = 7, $\cdots $, 2. In the histograms, the vertical axis represents the number of pixels, which is proportional to probability. The horizontal axis represents the intensity levels, which are proportional to brightness. Table 1 shows the entropy and the proposed ICQA-DUPD values for the cases in Fig. 1.

The probability distributions in Figs. 1(a)-(c) are concentrated at the left (dark) end of the intensity range, which means the corresponding images would be dark with low contrast quality. For example, Fig. 1(c) has non-zero probability only within a very narrow intensity range at the left end, corresponding to a very dark image with low contrast; it would be much darker than the images corresponding to Figs. 1(a) and (b). The proposed ICQA-DUPD values in these cases are as low as $2^{-\left(L-l\right)}$, i.e., 0.5, 0.25, and 0.125 for $l$ = 7, 6, and 5, respectively. On the other hand, the normalized entropy values are relatively high (more than half the maximum score) at $l/L$, i.e., 0.875, 0.75, and 0.625 for $l$ = 7, 6, and 5, respectively. Since the non-zero probabilities are concentrated within a narrow part of the intensity range in these cases, the contrast quality would be poor.

The probability distributions in Figs. 1(d)-(f) are uniform over the intensity levels with non-zero probability. The corresponding images would have relatively good contrast, and their proposed ICQA-DUPD values are relatively large. On the other hand, the normalized entropy values are as low as $l/L$, i.e., 0.5, 0.375, and 0.25 for $l$ = 4, 3, and 2, respectively. The proposed ICQA-DUPD metric gives a larger value for Fig. 1(d) than for Figs. 1(e) and (f) because Fig. 1(d) has a larger number of intensity levels with non-zero probability. These examples show that (normalized) entropy is not suitable for indicating image contrast quality, whereas the proposed ICQA-DUPD is.
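
As a quick numerical check with the sketches above, and assuming the histograms of Figs. 1(a)-(c) are uniform over the leftmost $2^{l}$ levels while those of Figs. 1(d)-(f) spread their non-zero levels evenly across the full 8-bit range, the values in Table 1 can be reproduced:

# Figs. 1(a)-(c): uniform probability over the leftmost 2**l levels, l = 7, 6, 5
for l in (7, 6, 5):
    h = np.zeros(256)
    h[:2 ** l] = 1
    print(round(normalized_entropy(h), 3), round(icqa_dupd(h), 3))
# prints 0.875 0.5, then 0.75 0.25, then 0.625 0.125

# Figs. 1(d)-(f): 16, 8, and 4 non-zero levels spread evenly across the range
for n in (16, 8, 4):
    h = np.zeros(256)
    h[::256 // n] = 1
    print(round(normalized_entropy(h), 3), round(icqa_dupd(h), 3))
# prints 0.5 0.941, then 0.375 0.879, then 0.25 0.754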

Fig. 1. Parts (a)-(c) are histograms with concentrated probability distributions over intensity levels $i$ = 0, $\cdots$, $2^{l}-1$ for $l$ = 7, 6, and 5, respectively, with non-zero probability. Parts (d)-(f) are histograms with uniform probability distributions of 16, 8, and 4 intensity levels, respectively, with non-zero probability. The vertical axis represents the number of pixels, which is proportional to probability; the horizontal axis represents the intensity levels, which are proportional to brightness.
../../Resources/ieie/IEIESPC.2022.11.2.85/fig1.png
Table 1. Comparison of normalized entropy and proposed ICQA-DUPD values for the histograms in Fig. 1.

              (a)      (b)      (c)      (d)      (e)      (f)
H$_{N}$       0.875    0.750    0.625    0.5      0.375    0.25
Proposed      0.5      0.25     0.125    0.941    0.879    0.754

4. Experimental Results

To validate the performance of the proposed ICQA-DUPD method, we compared the proposed method with state-of-the-art and classic NR-IQA methods, including entropy-based no-reference image quality assessment (ENIQA) [11], natural image quality evaluator (NIQE) [12], perception-based image quality evaluator (PIQUE) [13], natural scene statistics (NSS) [14], blind/referenceless image spatial quality evaluator (BRISQUE) [15], and normalized entropy. ENIQA is compared only for color images.

To quantify how the distorted images are transformed from the ground-truth image, we use the values given by widely adopted FR-IQA methods that require the ground-truth image as a reference, including SSIM [16] and PSNR.

In the first experiment, we evaluated contrast-distorted images with different levels of global contrast decrement. For each NR-IQA method, we calculated average ranking values on a scale from 1 to 5 (5 being the best quality) over all contrast-distorted images at each level of global contrast decrement. Fig. 2 shows contrast-distorted images generated from a ground-truth image with different levels of global contrast decrement from the categorical subjective image quality (CSIQ) database [17], along with the corresponding histograms. As the level of global contrast decrement increases, the probability distributions of the contrast-distorted images in Fig. 2 shrink and concentrate into a narrower intensity range, degrading image contrast.

The average ranking values for normalized entropy and the proposed ICQA-DUPD are mostly consistent with the levels of global contrast decrement. However, other NR-IQA methods are not consistent with the levels of image contrast decrement, as shown in Table 2. In particular, as the image contrast quality becomes worse, the ranking value for NSS becomes higher.

In the subsequent experiments, an original image with low contrast and its contrast-enhanced version were evaluated to compare the proposed method with the other NR-IQA methods. Note that higher values (scores) given by normalized entropy and the proposed ICQA-DUPD imply better image quality, whereas for the other NR-IQA methods (ENIQA, NIQE, PIQUE, NSS, BRISQUE) lower values imply better image quality. Since the low-quality original image is used as the reference image for FR-IQA, the SSIM and PSNR values become relatively low for the enhanced image in these cases.

In Fig. 3(a), the original image is relatively dark, and as such, the corresponding probability distribution is concentrated in the left (low) intensity range. In this case, the contrast quality of the original image is poor, and the proposed ICQA-DUPD value is low, at 0.422. However, the normalized entropy value is high at 0.854 because most intensity levels have non-zero probability. In the probability distribution of the contrast-enhanced image in Fig. 3(b), the number of intensity levels with non-zero probability is higher on the right side of the intensity range. On the other hand, the number of pixels represented on the left side of the intensity range is higher. Hence, there is more symmetry between the left and right sides of the intensity range in the probability distribution of the contrast-enhanced image than in the probability distribution of the original image. In other words, the intensity probability is distributed more uniformly between the left and right intensity ranges in the contrast-enhanced image compared to the original image. As a result, the contrast-enhanced image has good contrast quality, and the value given by the proposed ICQA-DUPD for the enhanced image (0.956) is much higher than for the original image (0.422). The normalized entropy value of the enhanced image (0.810) is slightly lower than for the original image (0.854), although the image contrast has been greatly enhanced by histogram equalization.

Histogram equalization for image contrast enhancement actually reduces the number of intensity levels with non-zero probability in Fig. 3, which is why the normalized entropy value becomes lower. Nevertheless, Fig. 3 shows that histogram equalization greatly enhances image contrast quality. In this case, the proposed method gives a much higher value, while normalized entropy gives a lower value and the other NR-IQA methods give inconsistent values.

This shows that entropy may not be suitable as a quality metric for measuring the degree of image contrast enhancement. In this case, the SSIM value for the enhanced image is relatively low at 0.425, and the PSNR value for the enhanced image is also low at 9.867 dB, as shown in Table 3. This is because there are large intensity and structural changes in the enhanced image relative to the low-quality original image. Nevertheless, the values given by the other NR-IQA methods show relatively small differences between the original and contrast-enhanced images. Moreover, these methods incorrectly give lower values (indicating better quality) to the original image than to the enhanced image.
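
As an illustration of this comparison (a sketch only: the image file name is hypothetical, OpenCV is assumed to be available, and the exact scores depend on the image), the Fig. 3 workflow of equalizing a dark image and re-evaluating $H_{N}$ and ICQA-DUPD could be reproduced with the functions sketched in Section 3:

import cv2  # assumed available; any histogram-equalization routine would do

img = cv2.imread("dark_image.png", cv2.IMREAD_GRAYSCALE)  # hypothetical low-contrast image
enhanced = cv2.equalizeHist(img)                           # histogram equalization (HE)

for name, im in (("original", img), ("enhanced", enhanced)):
    hist, _ = np.histogram(im, bins=256, range=(0, 256))
    print(name, round(normalized_entropy(hist), 3), round(icqa_dupd(hist), 3))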

In Fig. 4(a), the original image is contrast-distorted due to thick haze, and the corresponding probability distribution is concentrated in the middle of the intensity range. This hazy image was obtained from the real-world task-driven testing set (RTTS) [19]. The contrast quality of this original image is poor: the proposed ICQA-DUPD value is low (0.508), while the normalized entropy is high (0.802). The probability distribution of the dehazed image in Fig. 4(b) has a wider intensity range with non-zero probability, and as such, the contrast quality of the dehazed image is greatly enhanced compared to the original image. The proposed ICQA-DUPD value reflects this change, increasing significantly to 0.851. However, the other NR-IQA methods, including NIQE, PIQUE, NSS, and BRISQUE, incorrectly give lower values (indicating better quality) to the original image. This shows that the proposed ICQA-DUPD metric can be used effectively for NR-IQA of hazy and dehazed images without the need for a ground-truth image.

To compare overall computational complexity, we ran the proposed ICQA-DUPD and the other NR-IQA methods on 30 ground-truth images (512 $\times $ 512 $\times $ 3) from the CSIQ image quality database [17]. The hardware environment was a tenth-generation Intel Core i5-10210U processor with 8 GB of RAM, and the software platform was Matlab R2020b. Table 5 compares the average running times of all the NR-IQA methods.

The proposed ICQA-DUPD is clearly superior to the other NR-IQA methods, except normalized entropy, in terms of computational complexity. This shows that the proposed ICQA-DUPD is well suited for real-time image applications.

Fig. 2. Contrast-distorted images generated from the ground-truth image with different levels of global contrast decrement from CSIQ [17] and their corresponding histograms: (a) the ground-truth image (Level 0), compared to contrast-distorted images at (b) Level 1; (c) Level 2; (d) Level 3; (e) Level 4.
../../Resources/ieie/IEIESPC.2022.11.2.85/fig2.png
Fig. 3. Original (relatively dark) image and contrast-enhanced image from histogram equalization (HE) and the corresponding histograms.
../../Resources/ieie/IEIESPC.2022.11.2.85/fig3.png
Fig. 4. Original hazy image and the dehazed image obtained using the method in [18], with corresponding histograms: (a) the original image from RTTS [19]; (b) the dehazed image.
../../Resources/ieie/IEIESPC.2022.11.2.85/fig4.png
Table 2. Evaluation on contrast-distorted images from CSIQ [17]. The NR-IQA rows contain average ranking values, and the FR-IQA rows (SSIM, PSNR) contain average values.

              Level 0    Level 1    Level 2    Level 3    Level 4
SSIM          N/A        0.933      0.857      0.648      0.513
PSNR          N/A        30.005     22.808     17.341     15.793
ENIQA         3.800      3.867      3.433      2.467      1.433
NIQE          2.267      3.067      3.233      3.667      2.767
PIQUE         2.267      3.233      3.600      3.100      2.400
NSS           1.600      1.900      2.700      4.000      4.800
BRISQUE       3.400      3.567      3.333      2.733      1.967
H$_{N}$       5.000      3.933      3.067      2.000      1.000
Proposed      4.867      4.000      3.133      2.000      1.000

Table 3. Comparison of IQA values for Fig. 3.

              (a)        (b)
SSIM          N/A        0.425
PSNR          N/A        9.867
NIQE          8.892      9.564
PIQUE         38.207     44.426
NSS           2.517      2.584
BRISQUE       27.571     35.179
H$_{N}$       0.854      0.810
Proposed      0.422      0.956

Table 4. Comparison of IQA values for Fig. 4.

              (a)        (b)
SSIM          N/A        0.682
PSNR          N/A        15.732
ENIQA         0.215      0.198
NIQE          2.900      3.643
PIQUE         36.887     46.458
NSS           2.513      3.442
BRISQUE       29.381     41.834
H$_{N}$       0.802      0.974
Proposed      0.508      0.851

Table 5. Comparison of average running times for NR-IQA methods.

              Time (in seconds)
ENIQA         4.7342
NIQE          0.0334
PIQUE         0.0599
NSS           0.0323
BRISQUE       0.0431
H$_{N}$       0.0005
Proposed      0.0020

5. Conclusion

In this paper, we proposed a novel NR-IQA method that quantifies the contrast of an image according to the degree of symmetric uniformity in its probability distribution. The proposed ICQA-DUPD method does not require a reference ground-truth image and has much lower computational complexity than other NR-IQA methods. We evaluated the proposed ICQA-DUPD method using contrast-distorted and hazy images from the CSIQ and RTTS databases. Experimental results showed that the proposed ICQA-DUPD method is better suited for quantifying image contrast quality than the other NR-IQA methods.

Matlab and Python source code are available at https://github.com/wkim1986/ICQA-DUPD

ACKNOWLEDGMENTS

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (NRF-2021R1F1A1047396).

References

1
Gu K., Zhai G., Lin W., Liu M., Jan. 2016, The analysis of image contrast: from quality assessment to automatic enhancement, IEEE Trans. Cybernetics, Vol. 46, No. 1, pp. 284-297.
2
Wang S., Ma K., Yeganeh H., Wang Z., Lin W., Dec. 2015, A patch-structure representation method for quality assessment of contrast changed images, IEEE Signal Process. Lett., Vol. 22, No. 12, pp. 2387-2390.
3
Abdoli M., Nasiri F., Brault P., Ghanbari M., 2019, Quality assessment tool for performance measurement of image contrast enhancement methods, IET Image Process., Vol. 13, No. 5, pp. 833-842.
4
Ye Z., Mohamadian H., Pang S.-S., Iyengar S., 2007, Image contrast enhancement and quantitative measuring of information flow, in 6th WSEAS Int. Conf. Information Security and Privacy, Tenerife, Spain, pp. 172-177.
5
Chen S.-D., Apr. 2012, A new image quality measure for assessment of histogram equalization-based contrast enhancement techniques, Digit. Signal Process., Vol. 22, pp. 640-647.
6
Mahmood A., Khan S. A., Hussain S., Almaghayreh E. M., Nov. 2019, An adaptive image contrast enhancement technique for low-contrast images, IEEE Access, Vol. 7, pp. 161584-161593.
7
Mello Román J., Vázquez Noguera J., Legal-Ayala H., Pinto-Roa D., Gomez-Guerrero S., García Torres M., Mar. 2019, Entropy and contrast enhancement of infrared thermal images using the multiscale top-hat transform, Entropy, Vol. 21, No. 3.
8
Shin J., Park R.-H., Sep. 2015, Histogram-based locality-preserving contrast enhancement, IEEE Signal Process. Lett., Vol. 22, No. 9, pp. 1293-1296.
9
Ahar A., Barri A., Schelkens P., Feb. 2018, From sparse coding significance to perceptual quality: a new approach for image quality assessment, IEEE Trans. Image Process., Vol. 27, No. 2, pp. 879-893.
10
Shannon C. E., 1948, A mathematical theory of communication, Bell System Technical Journal, Vol. 27, pp. 379-423.
11
Chen X., Zhang Q., Lin M., Yang G., He C., Sep. 2019, No-reference color image quality assessment: from entropy to perceptual quality, EURASIP J. Image Video Process., No. 77, pp. 1-14.
12
Mittal A., Soundararajan R., Bovik A. C., Mar. 2013, Making a completely blind image quality analyzer, IEEE Signal Process. Lett., Vol. 20, No. 3, pp. 209-212.
13
Venkatanath N., Praneeth D., Chandrasekhar Bh. M., Channappayya S. S., Medasani S. S., 2015, Blind image quality evaluation using perception based features, in Proc. National Conference on Communications (NCC).
14
Fang Y., Ma K., Wang Z., Lin W., Fang Z., Zhai G., Jul. 2015, No-reference quality assessment of contrast-distorted images based on natural scene statistics, IEEE Signal Process. Lett., Vol. 22, No. 7, pp. 838-842.
15
Mittal A., Moorthy A. K., Bovik A. C., Dec. 2012, No-reference image quality assessment in the spatial domain, IEEE Trans. Image Process., Vol. 21, No. 12, pp. 4695-4708.
16
Wang Z., Bovik A. C., Sheikh H. R., Simoncelli E. P., Apr. 2004, Image quality assessment: from error visibility to structural similarity, IEEE Trans. Image Process., Vol. 13, No. 4, pp. 600-612.
17
Larson E. C., Chandler D. M., Mar. 2010, Most apparent distortion: full-reference image quality assessment and the role of strategy, Journal of Electronic Imaging, Vol. 19, No. 1.
18
Kim S. E., Park T. H., Eom I. K., 2020, Fast single image dehazing using saturation-based transmission map estimation, IEEE Trans. Image Process., Vol. 29, pp. 1985-1998.
19
Li B., Ren W., Fu D., Tao D., Feng D., Zeng W., Wang Z., Jan. 2019, Benchmarking single-image dehazing and beyond, IEEE Trans. Image Process., Vol. 28, No. 1, pp. 492-505.

Authors

Wonvin Kim
../../Resources/ieie/IEIESPC.2022.11.2.85/au1.png

Wonvin Kim received his BE in Computer Software from Korean Bible University, Seoul, Korea, in 2009. He obtained his ME in Electrical and Computer Engineering from the University of Seoul, Korea, in 2011, and his PhD in Internet and Multimedia Engineering from Konkuk University, Seoul, Korea, in 2021. From 2014 to 2018, he was a Software Engineer at BD, Seoul, Korea. Since 2020, he has been a faculty member and is currently an Assistant Professor in the Department of Computer Software, Korean Bible University, Seoul, Korea. His research interests include digital image processing and deep learning.

Changhoon Yim
../../Resources/ieie/IEIESPC.2022.11.2.85/au2.png

Changhoon Yim received his B.S. from the Department of Control and Instrumentation Engineering, Seoul National University, Korea, in 1986, an M.S. in Electrical and Electronics Engineering from the Korea Advanced Institute of Science and Technology in 1988, and his Ph.D. in Electrical and Computer Engineering from the University of Texas at Austin in 1996. He was a research engineer working on HDTV at the Korean Broadcasting System from 1988 to 1991. From 1996 to 1999, he was a Member of the Technical Staff in the HDTV and Multimedia Division, Sarnoff Corporation, New Jersey, USA. From 1999 to 2000, he worked at Bell Labs, Lucent Technologies, New Jersey, USA. From 2000 to 2002, he was a Software Engineer at KLA-Tencor Corporation, California, USA. From 2002 to 2003, he was a Principal Engineer at Samsung Electronics, Suwon, Korea. Since 2003, he has been a faculty member and is currently a Professor in the Department of Computer Science and Engineering, Konkuk University, Seoul, Korea. His research interests include digital image processing, video processing, computer vision, and deep learning.