
[ Research Article ]
Asian Journal of Atmospheric Environment - Vol. 5, No. 1
Abbreviation: Asian J. Atmos. Environ.
ISSN: 1976-6912 (Print) 2287-1160 (Online)
Print publication date 31 Mar 2011
Received 12 Aug 2010 Accepted 02 Dec 2010
DOI: https://doi.org/10.5572/ajae.2011.5.1.008

Algorithm Development of a Visibility Monitoring Technique Using Digital Image Analysis
Rajib Pokhrel ; Heekwan Lee*
Department of Civil and Environmental Engineering, University of Incheon 12-1, Songdo-Dong, Yeonsu-Gu, Incheon 406-840, Korea

Correspondence to : * Tel: +82-32-835-8468, E-mail: airgroup@incheon.ac.kr


Abstract

Atmospheric visibility is one of the indicators used to evaluate the status of air quality. Based on a conceptual definition of visibility as the maximum distance at which the outline of a selected target can be recognized, an image analysis technique is introduced here and an algorithm is developed for visibility monitoring. Although various measurement techniques exist, ranging from bulky, precise instruments to naked-eye observation, each has its own limitations.

In this study, a series of image analysis techniques were introduced and examined for in-situ application. An imaging system was built using a digital camera and installed at study sites in Incheon and Seoul in turn. The visual range was simultaneously monitored with a dual-technology visibility sensor in Incheon and a transmissometer in Seoul.

The Sobel mask filter was applied to detect the edge lines of objects by extracting the high-frequency components from the digital image. The root mean square (RMS) index of variation among the pixels in the image was substantially correlated with the visual range, with correlations of R2=0.88 in Incheon and R2=0.71 in Seoul. The regression equations between the visual range and the RMS index were VR=2.36e^(0.46×RMS) in Incheon and VR=3.18e^(0.15×RMS) in Seoul. It was also confirmed that fine particles (PM2.5) have a greater impact on visibility impairment than coarse particles.


Keywords: Ambient air quality, Atmospheric visibility, Digital image analysis, Image filter, RMS index

1. INTRODUCTION

The issue of air quality has been evaluated by means of various indices. Atmospheric visibility is frequently used as an important indicator of air quality in addition to the concentrations of other air pollutants. Visibility has been monitored for many decades in the U.S. for airport safety and, since at least the 1970s, for aesthetic reasons. Atmospheric visibility is monitored using various instruments and techniques. For example, Horvath (1995) used a telephotometer for visibility monitoring in Europe and evaluated visibility in terms of the spectral extinction coefficient (Agarwala et al., 2004). The scattering extinction by aerosols has been evaluated using an integrating nephelometer, and the corresponding visual range estimated in terms of the scattering coefficient (Yan, 2007; Jayaraman et al., 2006). Jayaraman et al. (2006) used both a nephelometer and an aethalometer to evaluate the total extinction coefficient, defined as the sum of the scattering and absorption coefficients. Transmissometers have been used to estimate the visual range more accurately, based on the total extinction coefficient (Kim, 2004; Malm, 1999).

Human-eye observation is a visibility monitoring technique based on perceptual visibility (Luo et al., 2005; Ministry for the environment of New Zealand, 2001). Luo et al. (2005) used the distant contrast measurement method to evaluate urban visibility. Kim (2004) studied the color difference method for visibility monitoring and reported that haze did not significantly affect the hue of the sky but predominantly reduced its saturation. In contrast, Kim et al. (2005) reported that cloud affects the color of the sky, based on images extracted from the WinHaze software. Because the extracted images were already processed and the noise was inserted artificially, they separated the sky view from the images when calculating the contrast difference.

Although several techniques are available for visibility measurement, there are several limitations: (i) transmissometer measurements cannot avoid part of the forward light scattered by illuminated aerosols (Gazzi et al., 2001; Gumprecht et al., 1953), (ii) an integrating nephelometer loses part of the forward- and backward-scattered light (Ensor and Waggoner, 1970), (iii) the contrast reduction law does not hold in fog, because the target luminance depends on the droplet size distribution (Gazzi et al., 2001), and (iv) the variation in human-eye observation can be high owing to differing personal characteristics (Malm, 1999).

In addition to these technical difficulties and accuracy issues, expense and bulkiness also lead environmental engineers and scientists to seek alternative techniques for in-situ applications. In this study, we introduce an efficient approach to perceptual visibility monitoring using digital image analysis that could be useful and applicable for estimating atmospheric visibility. A visibility monitoring system was built and tested, and then validated against an existing visibility monitoring technique.

Table 1. 
Common techniques for visibility monitoring.
Type of instrument Principle of visibility monitoring
Transmissometer Total extinction coefficient (both scattering and absorption coefficient)
Dual technology visibility sensor Total extinction coefficient (both scattering and absorption coefficient)
Nephelometer Extinction due to scattering
Aethalometer Extinction due to absorption
Image analysis Contrast reduction law
Human-eye observation Personal judgment based on natural scene


2. METHODOLOGY
2. 1 Digital Image Processing

Image processing is a technique to enhance the quality of images and to extract information from them (Petrou and Bosdogianni, 1999). It has recently been applied to visibility monitoring (Luo et al., 2005; Kim, 2004), lightness determination (Horn, 1974), image editing (Agarwala et al., 2004; Perez et al., 2003), image matting (Sun et al., 2004), and color-to-gray mapping (Gooch et al., 2005).

The contrast and luminance values of pixels vary with the color and illumination properties of a target; a clear image has higher contrast and luminance than a polluted one (Scott, 2005; Peter et al., 2002). A digital image is displayed in the form of a matrix as in eq. (1), where M and N are the numbers of rows and columns, respectively (Petrou and Bosdogianni, 1999).

f(x, y) = \begin{bmatrix} f_{1,1} & f_{1,2} & \cdots & f_{1,N} \\ f_{2,1} & & & \vdots \\ f_{M,1} & \cdots & & f_{M,N} \end{bmatrix}    (1)

A gray-level image is represented by one component, brightness, while a color image is represented in terms of red, green and blue components. A color image can be converted into a gray-level image with brightness L as in eq. (2), where LR, LG and LB are the brightness values of the red, green and blue color components, respectively (Luo et al., 2005). The gray-level value of an 8-bit digital image varies from 0 to 255, where 0 represents perfect black and 255 perfect white.

L = \frac{L_R + L_G + L_B}{3}    (2)
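As an illustrative sketch (assuming Python with NumPy; the toy pixel values are hypothetical), eq. (2) amounts to averaging the three color channels of each pixel:

```python
import numpy as np

def rgb_to_gray(image):
    """Convert an RGB image (H x W x 3, values 0-255) to a gray-level
    image by averaging the R, G and B brightness values, as in eq. (2)."""
    image = np.asarray(image, dtype=np.float64)
    return image.mean(axis=2)

# A 1x2 toy image: one pure-white pixel and one pure-red pixel.
toy = np.array([[[255, 255, 255], [255, 0, 0]]])
gray = rgb_to_gray(toy)
print(gray)  # [[255.  85.]]
```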

In image processing, the Sobel mask, Prewitt mask, Laplacian mask, Fourier transform, etc. are well known and frequently applied for high-frequency extraction or edge-line detection in images. The Prewitt and Sobel masks are similar, both of 3×3 size, but differ in their coefficients. According to Scott (2005), the Sobel mask emphasizes the pixels close to the mask center, which makes it more suitable than the Prewitt mask in most applications. Laplacian masks are symmetric, and the sign of the result (positive or negative) from two adjacent pixels indicates which is the brighter side. Similarly, the Fast Fourier Transform (FFT) is useful for frequency filtering, extracting or removing frequencies from the original image based on a configured block size (Luo et al., 2005; Scott, 2005). Fig. 1 demonstrates the original image and the images filtered with the different image filters.


Fig. 1. 
Original image and processed images; (a) original color image, (b) gray-level image, (c) filtered image using Sobel mask, (d) filtered image using Prewitt mask, (e) filtered image using Laplacian mask and (f) filtered image using FFT with a block size of 8.

G_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix}    (3)

G_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix}    (4)

G(x, y) = \begin{bmatrix} G_{1,1} & \cdots & G_{1,N} \\ \vdots & & \vdots \\ G_{M,1} & \cdots & G_{M,N} \end{bmatrix}    (5)

Based on the clarity of objects in the processed images shown in Fig. 1, the Sobel mask filter was selected for image processing in this study. The original color image was converted to a gray-level image and convolved with both the horizontal and vertical masks of eqs. (3) and (4), where Gx is the horizontal Sobel mask and Gy the vertical Sobel mask. These masks estimate the gradients by approximating the first-order derivative in each direction (Scott, 2005). The output image is the sum of the absolute values of the horizontally and vertically convolved images; it is a matrix of the same size as the original image, as in eq. (5). In this way the high-frequency components are extracted and the outlines of the objects in the image are detected.
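The masking and absolute-sum steps described above can be sketched as follows (a minimal Python/NumPy version with zero-padded borders; the step-edge test image is hypothetical):

```python
import numpy as np

GX = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=np.float64)   # horizontal mask, eq. (3)
GY = np.array([[-1, -2, -1],
               [ 0,  0,  0],
               [ 1,  2,  1]], dtype=np.float64)  # vertical mask, eq. (4)

def sobel_filter(gray):
    """Apply both Sobel masks to a gray-level image and return the
    sum of the absolute responses, as in eq. (5); borders are
    zero-padded so the output keeps the original size."""
    gray = np.asarray(gray, dtype=np.float64)
    padded = np.pad(gray, 1, mode="constant")
    out = np.zeros_like(gray)
    for i in range(gray.shape[0]):
        for j in range(gray.shape[1]):
            window = padded[i:i + 3, j:j + 3]
            out[i, j] = abs((window * GX).sum()) + abs((window * GY).sum())
    return out

# A vertical step edge: the filter responds along the boundary only.
step = np.zeros((5, 5))
step[:, 3:] = 255.0
edges = sobel_filter(step)
```

Because only the absolute responses are summed, applying the masks by sliding-window correlation (as here) gives the same magnitude as a strict convolution with flipped kernels.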

2. 2 Post-Processing of Visibility Image

For further analysis of the processed image, statistical indices, namely the Mean and RMS indices, are introduced in this study. The Mean index is the average luminance, representing the average condition of the visual information; it is appropriate for constant, low-visibility conditions. The RMS index measures the variation in luminance values among the pixels: the wider the variation, the larger the index value. The RMS index is therefore of interest for highly varied visual conditions. The mean and root mean square of the luminance of an image are calculated using eqs. (6) and (7).

L_{Mean} = \frac{\sum L(x, y)}{N}    (6)

L_{RMS} = \sqrt{\frac{\sum L^2(x, y)}{N} - L_{Mean}^2}    (7)

where,
L(x, y) = luminance value of a pixel in the image
N = total number of pixels in the digital image
L_Mean = mean luminance (Mean index)
L_RMS = root mean square of luminance (RMS index)
x, y = pixel indices of row and column, respectively
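A minimal sketch of eqs. (6) and (7) in Python/NumPy (the toy luminance arrays are hypothetical):

```python
import numpy as np

def mean_index(lum):
    """Mean luminance over all pixels, eq. (6)."""
    return np.asarray(lum, dtype=np.float64).mean()

def rms_index(lum):
    """Root mean square deviation of luminance about the mean, eq. (7):
    sqrt( sum(L^2)/N - L_mean^2 )."""
    lum = np.asarray(lum, dtype=np.float64)
    return np.sqrt((lum ** 2).mean() - lum.mean() ** 2)

flat = np.full((4, 4), 100.0)                    # no luminance variation
varied = np.array([[0.0, 200.0], [0.0, 200.0]])  # wide variation
print(rms_index(flat))    # 0.0
print(rms_index(varied))  # 100.0
```

As the two toy images show, a flat (hazy-like) image yields a small RMS index and a high-contrast image a large one, which is exactly why the RMS index tracks visibility.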

2. 3 Model Study for Visibility Image

A recent study modeled sky-view images based on the characteristics of aerosols; its results are employed here to validate the image-processing algorithm. WinHaze (version 2.9.6, Air Resource Specialist) is a computer model that simulates visual air-quality differences for numerous urban and suburban scenes and allows analysts to view visual air-quality scenarios. It offers options to model images by considering different optical parameters, aerosol species, extinction properties and visual ranges, and to simulate their effects on the images (ARM, 2007). There are noticeable differences in land-coverage type among rural, suburban and urban areas: rural and suburban areas are mostly covered with natural scenes, whereas artificial objects such as buildings, roads, etc. dominate the scene in an urban area. With this variation in visible land-coverage properties, the color of objects, pollution level, etc. also vary strongly between urban and suburban areas, which in turn varies the contrast among the objects. For a typical case study, sample images of a suburban area (Durango) and an urban area (Denver) were extracted from the WinHaze model to identify the relation between the index values and the corresponding visual ranges of 3 km to 30 km.

The application of image analysis was then validated using the images from WinHaze. Fig. 2 displays the original and filtered images of Denver, and Fig. 3 the original and filtered images of Durango. The image of Durango has more, but smaller, artificial structures than the image of Denver. Images of both the urban and suburban areas were modeled and convolved with the Sobel mask, which minimizes the noise and enhances the edge lines of the covered objects.


Fig. 2. 
Images of urban area of Denver; left hand side images are original images for visual ranges 30 km, 20 km and 10 km and right hand side images are corresponding filtered images.


Fig. 3. 
Images of suburban area of Durango; left hand side images are original images for visual ranges 30 km, 20 km and 10 km and right hand side images are corresponding filtered images.

The RMS indices of the filtered images were then calculated, and a correlation analysis was conducted between the visual ranges calculated by WinHaze and the computed Mean and RMS indices. The visual ranges and RMS indices were highly correlated at both sites, with correlations of R2=0.91 in Durango and R2=0.99 in Denver, as plotted in Fig. 4. This also confirms that the visual range has an exponential relation with the RMS index, although the coefficients vary from place to place. Durango, being suburban, is covered with similar types of objects, which decreases the index value of the filtered image even when the corresponding visual range is the same as for Denver. The slope of the equation for the urban image is lower than for the suburban image, meaning that the index value is more sensitive to the visual range for an image covering a variety of objects than for one covering similar objects. Hence, digital images covering multiple types of objects are more sensitive for the determination of atmospheric visibility. Subsequently, the image-processing algorithm was applied in the field.


Fig. 4. 
Model test between RMS index and visual range of digital image.
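The exponential fits of Fig. 4 can be reproduced in form by linear least squares on ln(VR). The sketch below (Python/NumPy) shows the fitting step; the data are synthetic, generated from Incheon-style coefficients rather than the actual WinHaze indices:

```python
import numpy as np

def fit_exponential(rms, vr):
    """Fit VR = a * exp(b * RMS) by linear least squares on ln(VR),
    the same functional form the study reports for both sites."""
    b, ln_a = np.polyfit(np.asarray(rms, dtype=float), np.log(vr), 1)
    return np.exp(ln_a), b

# Synthetic check: data generated from a known exponential is recovered.
rms = np.linspace(0.5, 5.0, 20)
vr = 2.36 * np.exp(0.46 * rms)
a, b = fit_exponential(rms, vr)
print(round(a, 2), round(b, 2))  # 2.36 0.46
```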

2. 4 Field Test Setup

To capture images for visibility analysis, an imaging system was first built using a digital camera (IMI:TECH:IMC_FF:1700018) and a personal computer. Preliminary tests of the system were carried out under lab conditions. Meanwhile, study sites in the coastal area of Incheon and in the urban, inland area of Seoul were selected. Fig. 5 displays the study sites where the imaging system was set up and the study was carried out. Fig. 5(b) shows the Incheon area, situated on the west coast of the Korean peninsula. Apartments, schools and small hills covered with green trees make up the surrounding area. The diurnal wind flow (sea/land breeze) allows fresh air to move from sea to land and vice versa. Air pollutants generated by vehicles, marine vessels, power plants, residents, etc. are the major air pollution sources in the Incheon area, and the high level of air pollutants aggravates the visibility. The dual-technology visibility sensor, SPMS (Suspended Particle Monitoring System) and imaging system, summarized in Table 2 and Fig. 6, were set up on the roof of Incheon City Hall, located approximately 10 km from the nearest coastline. The instrumentation height was approximately 20 m above the ground surface.


Fig. 5. 
(A) Geographic location of the study sites on the Korean Peninsula, (B) aerial view of the instrumentation site in Incheon, and (C) aerial view of the instrumentation site in Seoul.

Table 2. 
Instrument types and frequency of data monitoring.
Data type Instruments used Time interval
Particle concentration (PM2.5 & PM10) Suspended particulate matter monitoring (SPM 612) 1 hour
Visual range in Incheon Dual technology visibility sensor (8364-E) 1 hour
Visual range in Seoul Transmissometer (LPV-2) 1 hour
Digital image Digital camera (IMI:TECH:IMC_FF:1700018) 1 hour


Fig. 6. 
Major instruments used for visual range, air-quality and visual air-quality monitoring in this study.

Fig. 5(c) displays the second study site, located in the downtown, inland area of Seoul. The area surrounding this site is covered with commercial high-rise buildings and residential areas. The major sources of air pollution in the Seoul area are vehicle emissions and the daily activities of occupants. A transmissometer and an SPMS had already been installed on the roof of the building, approximately 12 m above the ground surface, for visibility and suspended-particle monitoring, respectively. The same imaging system used in Incheon was relocated and installed beside the existing instruments (transmissometer transmitter and SPMS) for visible-image monitoring.

A temporary shelter was placed over the camera to protect the lens from direct sunlight, rainfall, etc., as shown in Fig. 6. Generally, direct sunlight has less effect if the camera lens is pointed north. Geographically, the north-east direction was selected for capturing the image at the Incheon site, so that the image covers both natural and artificial backgrounds up to the horizon. The average distance to the target site was approximately 3 km from the observation point. Data monitoring at the Incheon site was conducted from May 4 to 23, 2006. To overcome the obstruction of the far view by buildings, the camera at the Seoul site was pointed south-east, toward the location of the transmissometer transmitter. Data monitoring in Seoul was conducted from Oct. 20 to Nov. 10, 2006.

Although the camera has zoom and brightness options, the default settings were used to capture the natural scene without further adjustment. The camera was set up with a horizontal line of sight covering parts of both land and sky. Captured images were in 8-bit digital color at a resolution of 1,024×768 pixels. Images were taken continuously at one-hour intervals and saved to the computer automatically, where the image processing was then conducted. The visual range and particle (PM2.5 and PM10) concentrations were also monitored simultaneously, 24 hours a day at one-hour intervals, as in Table 2.


3. RESULTS AND DISCUSSION
3. 1 Analysis of Air Quality

Particle concentrations of PM2.5 and PM10 were monitored using the SPM612. In this study, particles smaller than 2.5 μm are denoted fine particles, and particles of 2.5-10 μm coarse particles. The roles of the fine particles (PM2.5) and coarse particles (PM10-PM2.5) in visibility impairment were analyzed using correlation analysis. Fig. 7(a) shows the relationship between the visual range and the fine particles, which were substantially correlated with a correlation of R2=0.66. Fig. 7(b) displays the relationship between the visual range and the coarse particles, which had a low correlation of R2=0.33. This reveals that fine particles have a stronger influence on visibility than coarse particles in the Incheon area. Yuan et al. (2006) reported that visibility impairment is caused by haze, which is mostly due to the scattering of light by gases and tiny particles. Kim (2004) and many others have reported that haze increases with increasing PM2.5. Similarly, Chung et al. (2003) reported that visibility was influenced mostly by PM2.5 rather than PM10. The hourly and 24-hour average concentrations of PM2.5 reached up to 95 μg/m3 and 52.5 μg/m3, respectively, and the hourly and 24-hour average concentrations of PM10 reached up to 170 μg/m3 and 118.90 μg/m3, respectively. The average PM10 and PM2.5 concentrations during the study period were about 69 μg/m3 and 32 μg/m3, respectively.


Fig. 7. 
(a) Atmospheric visibility vs. fine particles (PM2.5), (b) atmospheric visibility vs. coarse particles (PM10-PM2.5) in Incheon; the visual range is expressed in km and the particle concentration in μg/m3.

Fig. 8(a) demonstrates the relationship between the visual range and the fine particles, and Fig. 8(b) that between the visual range and the coarse particles, in Seoul. The average concentrations of PM10 and PM2.5 during the study period were 54.3 μg/m3 and 35.3 μg/m3, respectively. Regression analysis shows that the airborne fine particles were substantially correlated with atmospheric visibility, with a correlation of R2=0.79, while the coarse particles produced a correlation of R2=0.38. From this result, we conclude that fine particles were the dominant factor in visibility impairment in Seoul as well. As evaluated from Fig. 8(a) and (b), the concentration of fine particles was higher than that of coarse particles in Seoul; the main source of air pollution in the Seoul area is vehicular emission, which generates mostly fine rather than coarse particles. In contrast, the study period in Incheon fell before the monsoon, when long-distance transport by sand storms is frequent, although no noticeable sand-storm day was observed during the data monitoring. Fugitive dust caused by a sand storm increases the PM10 level, and particles emitted from power plants and industries, sea salt blown inland by the sea breeze, etc. contributed additionally to the PM10 concentration in the Incheon area.


Fig. 8. 
(a) Atmospheric visibility vs. fine particles (PM2.5), (b) atmospheric visibility vs. coarse particles (PM10-PM2.5) in Seoul; the visual range is expressed in km and the particle concentration in μg/m3.
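The R2 values quoted above are coefficients of determination; for a simple linear relation they equal the squared Pearson correlation. A sketch in Python/NumPy, using hypothetical particle and visibility data rather than the measured series:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination R^2 for a simple linear relation,
    computed as the squared Pearson correlation of x and y."""
    r = np.corrcoef(x, y)[0, 1]
    return r * r

# Hypothetical toy data: visibility falling as the fine-particle load rises.
pm25 = np.array([10.0, 20.0, 35.0, 50.0, 80.0])   # ug/m3
vr_km = np.array([25.0, 18.0, 12.0, 8.0, 4.0])    # km
r2 = r_squared(pm25, vr_km)
print(round(r2, 2))  # 0.9
```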

3. 2 Analysis of Visibility Monitoring

The visual range in Incheon varied from 2 km to 30 km, with an average of approximately 12 km, during the test period (May 4-22, 2006). The minimum and maximum visual ranges were observed during and after a rainy day, respectively. Fig. 9 shows natural digital images at visual ranges of 30 km, 19 km and 9 km, with the Sobel mask applied to extract the high-frequency components from the original digital images. The right-hand-side images are the filtered images, which contain only the high-frequency components, i.e., the detected edge lines of the objects in the original images.


Fig. 9. 
Original images with the corresponding filtered images in Incheon for different visibility conditions (Note: LHS: original images, RHS: filtered images).

Similarly, the visual range in Seoul varied from 2 km to 11.5 km during the test period (Oct. 20-Nov. 10, 2006). The average visual range was 6.5 km, about half that of Incheon. The captured images were mostly covered by artificial structures, as in Fig. 10, and were highly illuminated during the daytime. Likewise, the right-hand-side images in Fig. 10 are the filtered images, in which the outlines of the objects are illuminated differently from the filtered images of Incheon.


Fig. 10. 
Original images with the corresponding filtered images at inland area in Seoul for different visibility conditions (Note: LHS: original images, RHS: filtered images).

The statistical indices, RMS and Mean luminance, were quantified to evaluate the changes under different visibility conditions. The values of these indices were higher in the clear images than in the hazy images. The RMS index varied from 0.2 to 5.5 over visual ranges of 2 km to 30 km in Incheon, as shown in Fig. 11(a); the RMS index and the visual range are substantially correlated, with a correlation of R2=0.88. Fig. 11(b) displays the Mean index and the visual range, with a correlation of R2=0.65. Although both indices have reasonable correlations with the visual range, the RMS index produces the more reliable correlation, as expressed in eq. (8).


Fig. 11. 
(a) RMS index vs. visual range (km) of the original digital image and (b) Mean index vs. visual range (km) of the natural digital image in Incheon.

VR = 2.36 e^{0.46 \times RMS}    (8)
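Given a fitted regression, estimating the visual range from a new image's RMS index is a one-line evaluation. A sketch in Python using the coefficients reported for the two sites (eq. (8) for Incheon and the Seoul equation given in the abstract); the sample RMS values are hypothetical:

```python
import math

# Site-specific regression coefficients (a, b) for VR = a * exp(b * RMS),
# as reported in the study.
COEFFS = {"Incheon": (2.36, 0.46),   # eq. (8)
          "Seoul":   (3.18, 0.15)}

def estimate_visual_range(rms_index, site):
    """Estimate the visual range (km) from the RMS index of a Sobel-filtered
    image, using the site's fitted exponential regression."""
    a, b = COEFFS[site]
    return a * math.exp(b * rms_index)

# The maximum RMS index observed in Incheon (5.5) maps close to the
# maximum observed visual range of about 30 km.
print(round(estimate_visual_range(5.5, "Incheon"), 1))  # 29.6
```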

The luminance value of an image depends on the color of the observed objects, their distance from the observation point, the air quality, etc. The luminance values of the images in Seoul were higher than those in Incheon, despite the visual range in Seoul being shorter. The RMS index varied from 0.3 to 10 over the 2 km to 11.5 km visual range in Seoul; the RMS index and the visual range are substantially correlated, with a correlation of R2=0.71. The Mean index varied from 0.9 to 10 and correlates with the visual range at R2=0.56. The relations between the visual range and the calculated indices are shown in Fig. 12(a) and (b). Both indices produce reliable correlations with the visual range, but comparing the results, the RMS index again correlates more reliably with the visual range than the Mean index.


Fig. 12. 
(a) RMS index vs. visual range (km) of the natural digital image and (b) Mean index vs. visual range (km) of the original digital image in Seoul.


4. CONCLUSIONS

A visibility monitoring system was designed and tested for perceptual visibility monitoring by introducing a digital image processing technique. In particular, the mathematical algorithm employing the Sobel mask filter works properly for image processing, extracting the required information from the images. Correlation results between the measured visual ranges and the statistical indices confirm that the system is reliable for in-situ visibility monitoring and is also less costly than conventional systems.

The contrast property of an image is valuable information for visibility analysis and has an exponential relation with the corresponding visual range. It was also found that fine particles (PM2.5) are more responsible for visibility impairment than coarse particles (PM10-PM2.5) at both study sites. During the test period, the visual range in Incheon, located in a coastal area, was generally longer than that in Seoul, located inland.


Acknowledgments

The authors are thankful to the Research Institutes of Public Health & Environment in Seoul and Incheon, especially for their assistance with the field monitoring work and data sharing. The authors are also grateful to Kristi A. Gebhart, National Park Service Air Resource Division, CIRA-Foothills Campus, Colorado State University, for reviewing the manuscript and for invaluable suggestions.


REFERENCES
1. Agarwala, A., Dontcheva, M., Agrawala, M., Drucker, S., Colburn, A., Curless, B., Salesin, D., Cohen, M., (2004), Interactive digital photomontage, ACM Transactions on Graphics, 23(3), p294-302.
2. Air Resource Managements (ARM), (2007), WinHaze Model, http://webcam.srs.fs.fed.us/winhaze.htm, (19 Sep 2007).
3. Chung, Y.S., Kim, H.S., Dulam, J., Harris, J., (2003), On heavy dust fall observed with explosive sandstorms in Chongwon-Chongju, Korea in 2002, Atmospheric Environment, 37(24), p3425-3433.
4. Ensor, D.S., Waggoner, A.P., (1970), Angular truncation error in the integrating nephelometer, Atmospheric Environment, 4(5), p481-487.
5. Gazzi, M., Georgiadis, T., Vicentini, V., (2001), Distant contrast measurement through fog and thick haze, Atmospheric Environment, 35(30), p5143-5149.
6. Gooch, A.A., Olsen, S.C., Tumblin, J., Gooch, B., (2005), Color2gray: Salience-preserving color removal, ACM Transactions on Graphics, Proceeding of SIGGRAPH, 24(3).
7. Gumprecht, R.O., Sliepcevich, C.M., (1953), Scattering of light by large spherical particles, Journal of Chemical Physics, 57, p90-95.
8. Horn, B., (1974), Determining lightness from an image, Computer Graphics and Image Processing, 3(1), p277-299.
9. Horvath, H., (1995), Estimation of the average visibility in central Europe, Atmospheric Environment, 29(2), p241-246.
10. Jayaraman, A., Gadhavi, H., Ganguly, D., Misra, A., Ramachandran, S., Rajesh, T.A., (2006), Spatial variation in aerosol characteristics and regional radiative forcing over India: measurement and modeling of 2004 road campaign experiment, Atmospheric Environment, 40(34), p6504-6515.
11. Kim, B.K., Kang, B.W., Lee, K.S., Choi, J.C., (2005), Development of Real-time visibility monitoring system using image contrast, Journal of The Korean Meteorological Society, 41(3), p449-459.
12. Kim, K.W., (2004), Physico-Chemical Characteristics of Visibility Impairment in an Urban Area & Development of a Remote Digital Visibility Monitoring, Ph.D. Dissertation, Department of Environmental Science and Engineering. Kwangju Institute of Science and Technology, p65-71.
13. Luo, C.H., Wen, C.H., Yuan, C.S., Liaw, J.J., Lo, C.C., Chiu, S.H., (2005), Investigation of urban atmospheric visibility by high-frequency extraction: Model development and field test, Atmospheric Environment, 39(14), p2545-2552.
14. Malm, W.C., (1999), Introduction to visibility. Cooperative Institute for Research in the Atmosphere, Colorado State University, p3-29.
15. Ministry for the environment, (2001), Good practice guide for monitoring and management of visibility in New Zealand, http://www.mfe.gov.nz.
16. Perez, P., Gangnet, M., Blake, A., (2003), Poisson image editing, ACM Transactions on Graphics, 22(3), p313-318.
17. Peter, J.B., Walter, M., (2002), Spatial frequency, phase, and the contrast of natural image, Journal of Optical Society of America, 19, p6.
18. Petrou, M., Bosdogianni, P., (1999), Imaging Processing: the Fundamentals, Wiley, England, p22-86.
19. Scott, E.U., (2005), Computer Imaging-Digital Image Analysis and Processing, Taylor & Francis, p212-224.
20. Sun, J., Jia, J., Tang, C.K., Shum, H.Y., (2004), Poisson matting, ACM Transactions on Graphics, 23(3), p315-321.
21. Yan, H., (2007), Aerosol scattering properties in north China, Atmospheric Environment, 41(32), p6916-6922.
22. Yuan, C.S., Lee, C.G., Liu, S.H., Chang, J.C., Yuan, C., Yang, H.Y., (2006), Correlation of atmospheric visibility with chemical composition of Kaohsiung aerosols, Atmospheric Research, 82(3-4), p663-679.