Jun Mao^{1,2*}, Uthai Phommasak^{2}, Shinya Watanabe^{2} and Hiroyuki Shioya^{2}  
^{1}College of Computer Science and Technology, Henan Polytechnic University, 2001, Century Avenue, Jiaozuo (454003), Henan, P.R. China  
^{2}Department of Information and Electronic Engineering, Muroran Institute of Technology, Mizumoto, Muroran 0508585, Japan  
Corresponding Author :  Jun Mao Department of Information and Electronic Engineering Muroran Institute of Technology Mizumoto, Muroran 0508585,Japan Email: [email protected] 
Received September 06, 2014; Accepted September 25, 2014; Published September 28, 2014  
Citation: Mao J, Phommasak U, Watanabe S, Shioya H (2014) Detecting Foggy Images and Estimating the Haze Degree Factor. J Comput Sci Syst Biol 7:226228.doi:10.4172/jcsb.1000161  
Copyright: 2014 Mao J, et al. This is an openaccess article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.  
Related article at Pubmed Scholar Google 
Visit for more related articles at Journal of Computer Science & Systems Biology
Limited visibility in haze weather strongly influences the accuracy and the general functions of almost outdoor video surveillance or driver assistance systems. Actual weather condition is valuable information to invoke corresponding approaches. Based on the atmospheric scattering model analysis and the statistics of various outdoor images, for most foggy images, we find that the lowest and highest value in color channels tends to be the same value of atmospheric light. A function for estimating the haze degree is developed for the automatic detection of the foggy image with different haze degrees. Experimental results show that our haze classification method achieves high performance.
Keywords 
Dehazing; Atmospheric scattering model; Haze degree 
Introduction 
At present, most outdoor videosurveillance, driverassistance and optical remote sensing systems have been designed to work under good visibility and weather conditions. Poor visibility often occurs in foggy or hazy weather conditions and can strongly influence the accuracy or even the general functionality of such vision systems. Consequently, it is important to import actual weathercondition data to the appropriate processing mode. Recently, significant progress has been made in haze removal from a single image [1,2]. Based on the hazy weather classification, specialized approaches, such as a dehazing process, can be employed to improve recognition. Figure 1 shows a sample processing flow of our dehazing program. 
Despite its remarkable value, determining weather information from a single image has not been thoroughly studied. Traditional algorithms are designed for specific applications or require human intervention. Weatherrecognition systems for vehicles which depend on vehiclespecific priors have been proposed [36]. Another proposed system [7] can automatically label images with high confidence by assigning weather labels, such as sunny, or cloudy; however, manual input constraints are required. 
Against this background, the main aim of the current study is to develop a set of stable algorithms for the detecting foggy images and labeling the haze degree of images by using a factor with universal applications. In this paper, we propose a haze degree estimation function to automatically distinguish foggy images and label images with their corresponding haze degrees. We relied on the atmospheric scattering model analysis and statistics derived from various outdoor images in order to develop the estimation function. 
Atmospheric Scattering Model Analysis 
By default, we discuss the case of image that uses the RGB color model. A hazy image can be modeled as shown in [1,8] (Figure 2): 
(1) 
where x denotes the pixel location, I(x) is the observed haze image, and J(x) is the hazefree image. For n ∈ {r, g, b}, In (x) is one of three color channels of I(x), and Jn(x) is one of three color channels of Jn (x). A is the global atmosphere light and is generally a fixed element A0 in all three color channels, An=A0. t(x) is the medium transmission and it is supposed to be the same in all three color channels at one pixel location. When the atmosphere is homogeneous, t(x)=exp(−β · dep(x)). Here, β is the scattering coefficient of the atmosphere, and dep(x) is the scene depth. To determine the haze degree of an image rapidly and reliably, we define the following: 
(2) 
(3) 
(4) 
(5) 
(6) 
where d(x) is the minimum value of three channels, and b(x) is the maximum value. d and c, which are the average values of dI (x) and cI (x), are referred to as the dark and contrast values, respectively. Here, we assume that the size of image I is . Take the minimum and maximum of the three channels on both sides of Equation 1: 
(8) 
(9) 
(10) 
By Equation 8Equation 7, we get the following 
(11) 
For most hazefree outdoor images, dJ (x)<A0 even dJ (x) ≪ A0, Equation 9 and Equation 10 show that the smaller t(x), cI(x) and A0 −dI(x) are closer to 0. As above, the values d and c may be correlated with the overall haze degree of an image. In the next section, we use a statistical method to evaluate the relationships. 
Note that, for simplicity, we estimate A as follows. Clearly is established, and then, A0 can be expressed as 
(14) 
Here, we set λ=1/3. 
Haze Degree Estimation Function 
We selected 300 outdoor images that use the RGB color model (component values are stored as integer numbers in the range 0255) and manually divided them into six groups according to the standards in Table 1. Figure 3 shows the sample images from the six groups. A grade from 0 to 5 representing the haze degree is assigned to each group; the higher the grade, the hazier the image. Figure 4 shows the values of (A0−d) (horizontal axis) and c (vertical axis) of images from six groups and the haze degrees of 
all selected images. The point colors represent the groups, and the point size indicates the haze degree, i.e., the larger points indicate greater haziness. It is evident that for most images from groups 3 to 5, (A0−d) is less than 75 and c is less than 50. For most hazefree images (group 0, the smallest blue points), (A0−d) is greater than 100 and c is larger than that in other groups 
To limit ω ∈ (0, 1), we introduce the following to estimate the haze factor ω: 
(15) 
Obviously, ln (ω) is a linear function of x1, x2 and σ. Each hazedegree was assigned anω in Table 1. Using multiple linear regression analysis on our data set we can get raw μ, ν and σ. Because μ, ν, σ are experience constants, we recommend μ=5.1, ν=2.9, σ=0.2461. 
The main process of our algorithm is as follows 
1. Input processing image I(x). 
2. Obtain bI (x), dI(x), cI(x) from I(x). 
3. Calculate d, b, c, estimate the air light A 
4. Get haze factor ω by using Equation 12. 
Experimental Results 
We used the Foggy Road Image Database (FRIDA) [9,10] to test the haze factor estimation function Equation 12. FRIDA is comprised of 90 synthetic images of 18 urban road scenes. Each image is 640×480 pixels. (Mean execution time is 230 ms on an Intel Core I7 CPU). Each image without fog (Lima set) is associated with four foggy images. Different types of fog are added to each of the four associated images: uniform fog (U080), heterogeneous fog (K080), cloudy fog (L080), and cloudy heterogeneous fog (M080), as an example Figure 5 shows the No.1 and No.10 images from five sets of FRIDA. 
As Figure 6 shows, the lowest haze degree (between 0.4 to 0.6) is found for images of Lima. Note that the sky area of images from the Lima set is different from a real situation. Real fogfree image in our experimental results, have a degree of below 0.3. The four associated foggy images have regularity values (between 0.6 and 1) that correspond to different types of fogs. Images from U080 set always get the highest degree and M080 always get the lowest. For different images in the same set, for instance, No.1 and No. 10, No. 10 gets higher degree than No. 1, which is consistent with the actual situation. The experimental results show that Equation 12 can accurately distinguish haze degree. 
In addition, we also randomly collected 48 real images to test our method dividing them by artificially picking three groups: hazefree, haze, and thickhaze. Figure 7 shows the haze factors of three groups and provides some sample images and the factor value. About 94% hazefree images get a hazefactor value below 0.4, 88% haze images get a value between 0.4 and 0.6, and, 85% thick images get a value between 0.7 and 1. Note that photo in the red circle, has hazefactor below 0.1, despite being a thickhaze airport night view obviously, this is because this picture has a monochrome light source, which means that A was not the same in three channels which make our model fail. 
Conclusion 
We introduced a numerical foggy image detecting method by using the atmospheric scattering model analysis and statistics of various outdoor images, which can estimate the hazefactor from single image by using an adjustable empirical function without manual input constraint. Because its complexity is linear, it can be applied as an initial classification step of dehazing processing and does not exhaust processing resources. Experimental results show that the method can be applied to usual weather conditions in videosurveillance, driver assistance and optical remote sensing system with high accuracy 
Our proposed prior is inspired by the atmospheric scattering mode, and supposes that the air light is equal in all three channels which might not always be true. Moreover, our method cannot be applied to the case of a monochrome light source. Thus we leave these problems for further research. 
References 

Table 1 
Figure 1  Figure 2  Figure 3  Figure 4 
Figure 5  Figure 6  Figure 7 