Underwater images suffer from color distortion and low contrast because light is attenuated as it propagates through water. The attenuation under water varies with wavelength, unlike in terrestrial images, where attenuation is assumed to be spectrally uniform. The attenuation depends on both the water body and the 3D structure of the scene, making color restoration difficult. Unlike existing single underwater image enhancement techniques, our method takes into account multiple spectral profiles of different water types. By estimating just two additional global parameters, the attenuation ratios of the blue-red and blue-green color channels, the problem is reduced to single image dehazing, where all color channels have the same attenuation coefficients. Since the water type is unknown, we evaluate different parameters out of an existing library of water types. Each type leads to a different restored image, and the best result is automatically chosen based on color distribution. We collected a dataset of images taken in different locations with varying water properties, showing color charts in the scenes. Moreover, to obtain ground truth, the 3D structure of the scene was calculated based on stereo imaging. This dataset enables a quantitative evaluation of restoration algorithms on natural images and shows the advantage of our method.
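For reference, a minimal sketch of this reduction, assuming the standard underwater image formation model, where I_c is the acquired image, J_c the scene radiance, A_c the veiling light, \beta_c the attenuation coefficient of channel c, z the scene distance, and t_c the transmission:

\[
I_c(x) = t_c(x)\, J_c(x) + \bigl(1 - t_c(x)\bigr) A_c, \qquad t_c(x) = e^{-\beta_c z(x)}, \qquad c \in \{R, G, B\}.
\]

Given the blue-red and blue-green attenuation ratios \beta_B/\beta_R and \beta_B/\beta_G (tabulated per water type), the red and green transmissions follow from the blue one,

\[
t_R = t_B^{\beta_R/\beta_B}, \qquad t_G = t_B^{\beta_G/\beta_B},
\]

so a single unknown transmission map remains, which is exactly the single image dehazing setting.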
The paper is available on arXiv.
If you use this dataset, please cite it as SQUID [Ref].
[Ref] Berman, Dana, Deborah Levy, Shai Avidan, and Tali Treibitz. "Underwater Single Image Color Restoration Using Haze-Lines and a New Quantitative Dataset." IEEE Transactions on Pattern Analysis and Machine Intelligence (2020).
Bibtex entry:
@article{berman2020underwater,
  title={Underwater Single Image Color Restoration Using Haze-Lines and a New Quantitative Dataset},
  author={Berman, Dana and Levy, Deborah and Avidan, Shai and Treibitz, Tali},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2020}
}
The code is available on GitHub: https://github.com/danaberman/underwater-hl.
The dataset includes RAW images, TIF files, camera calibration files, and distance maps (a minimal loading sketch is given below, after the README link).
The database contains 57 stereo pairs from four different sites in Israel: two in the Red Sea (representing tropical water) and two in the Mediterranean Sea (temperate water).
In the Red Sea, the sites were a coral reef ('Katzaa'), 10-15 meters deep (15 pairs), and a shipwreck ('Satil'), 20-30 meters deep (8 pairs).
In the Mediterranean Sea, both sites were rocky reef environments separated by 30 km: Nachsholim, at 3-6 meters depth (13 pairs), and Mikhmoret, at 10-12 meters depth (21 pairs).
For convenience, the dataset is divided into the four dive sites.
Several examples are displayed below each link.
README file.
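A minimal Python sketch for loading one image and its distance map; the file names below are hypothetical placeholders, so consult the README for the actual directory layout and formats:

import numpy as np
import tifffile  # third-party package for reading TIF files

# Hypothetical file names; the real naming convention is documented in the README.
img = tifffile.imread('Katzaa/example_left.tif').astype(np.float64)             # linear TIF image
dist = tifffile.imread('Katzaa/example_left_distance.tif').astype(np.float64)   # distance map in meters (assumed)

# Pixels where stereo matching gave no valid distance are assumed to be zeros or NaNs.
valid = np.isfinite(dist) & (dist > 0)
print(img.shape, dist.shape, 'valid distance pixels:', valid.mean())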
If you use this data, please cite the paper.
To evaluate your own results, please use this evaluation code.
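The exact protocol is defined in that evaluation code; purely as an illustration of the idea (assuming the quality measure is the average angular deviation of the chart's gray-scale patches from the achromatic axis in RGB, lower being better), a minimal Python sketch:

import numpy as np

def mean_gray_patch_angle(patch_rgbs):
    # patch_rgbs: (N, 3) array holding the mean RGB value of each gray-scale chart
    # patch in the restored image. Returns the mean angle, in degrees, between each
    # patch color and the achromatic axis [1, 1, 1]; 0 means perfectly neutral gray.
    gray_axis = np.ones(3) / np.sqrt(3.0)
    unit = patch_rgbs / np.linalg.norm(patch_rgbs, axis=1, keepdims=True)
    cosines = np.clip(unit @ gray_axis, -1.0, 1.0)
    return float(np.degrees(np.arccos(cosines)).mean())

# Hypothetical example: a perfectly gray patch and a slightly blue-tinted one.
print(mean_gray_patch_angle(np.array([[0.5, 0.5, 0.5], [0.42, 0.47, 0.55]])))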
The transmission maps are displayed along with the images. They are color-mapped: warm colors indicate high values, while cold colors indicate low values (a display sketch is given below these notes).
Please note that the buttons on the left switch both the image and the transmission map.
For the contrast-enhancement results, as well as for the methods by Ancuti et al., no transmission map is estimated, so none is shown.
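The color mapping described above can be reproduced with a standard warm-to-cold colormap; a minimal Python sketch using matplotlib's 'jet' (high values appear red, low values blue):

import numpy as np
import matplotlib.pyplot as plt

t = np.random.rand(480, 640)  # placeholder transmission map with values in [0, 1]
plt.imshow(t, cmap='jet', vmin=0.0, vmax=1.0)  # warm colors = high values, cold colors = low values
plt.colorbar(label='transmission')
plt.axis('off')
plt.show()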
Results gallery (nine examples). Each example has buttons for the different restoration methods and shows: Input Image, Output Image, True Distance (based on stereo), and Output Transmission Map.
[Drews et al. 2013] P. Drews, E. Nascimento, F. Moraes, S. Botelho, and M. Campos. Transmission estimation in underwater single images. In Proc. IEEE ICCV Underwater Vision Workshop, pages 825–830, 2013.
[Peng et al. 2015] Y.-T. Peng, X. Zhao, and P. C. Cosman. Single underwater image enhancement using depth estimation based on blurriness. In Proc. IEEE ICIP, 2015.
[Ancuti et al. 2016] C. Ancuti, C. O. Ancuti, C. De Vleeschouwer, R. Garcia, and A. C. Bovik. Multi-scale underwater descattering. In Proc. ICPR, 2016.
[Ancuti et al. 2017] C. O. Ancuti, C. Ancuti, C. De Vleeschouwer, L. Neumann, and R. Garcia. Color transfer for underwater dehazing and depth estimation. In Proc. IEEE ICIP, 2017. (All color transfers were done with a single image.)
[Emberton et al. 2017] S. Emberton, L. Chittka, and A. Cavallaro. Underwater image and video dehazing with pure haze region segmentation. Computer Vision and Image Understanding, 2017.
[Ancuti et al. 2018] C. O. Ancuti, C. Ancuti, C. De Vleeschouwer, and P. Bekaert. Color balance and fusion for underwater image enhancement. IEEE Transactions on Image Processing, 27(1):379–393, 2018.