SubmergeStyleGAN: Synthetic Underwater Data Generation with Style Transfer for Domain Adaptation
E. Fathy, Mohamed; Ahmed, Samer A.; I. Awad, Mohammed; E. Abd El Munim, Hossam (2024)
Underwater computer vision applications are challenged by limited access to annotated underwater datasets. Additionally, convolutional neural networks (CNNs) trained on in-air datasets do not perform well underwater due to the high domain variance caused by the degradation impact of the water column. This paper proposes an air-to-water dataset generator to create visually plausible underwater scenes out of existing in-air datasets. SubmergeStyleGAN, a generative adversarial network (GAN) designed to model attenuation, backscattering, and absorption, utilizes depth maps to apply range-dependent attenuation style transfer. In this work, the generated attenuated images and their corresponding original pairs are used to train an underwater image enhancement CNN. Real underwater datasets were used to validate the proposed approach by assessing various image quality metrics, including UCIQE, UIQM and CCF, as well as disparity estimation accuracy before and after enhancement. SubmergeStyleGAN exhibits a faster and more robust training procedure compared to existing methods in the literature.
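The degradations the abstract names (range-dependent attenuation and backscatter driven by a depth map) are conventionally described by the simplified underwater image formation model, in which each color channel c of the in-air image J is attenuated with scene range d and mixed with a water-colored backscatter term B: I_c = J_c · e^(−β_c·d) + B_c · (1 − e^(−β_c·d)). The sketch below is a minimal NumPy illustration of that physical model, not the paper's GAN; the coefficient values `beta` and the backscatter color `backscatter` are illustrative assumptions (red light attenuates fastest in water).

```python
import numpy as np

def attenuate(image, depth, beta=(0.6, 0.2, 0.05), backscatter=(0.05, 0.2, 0.3)):
    """Apply a simplified range-dependent underwater degradation.

    image       : float array (H, W, 3), in-air RGB in [0, 1]
    depth       : float array (H, W), per-pixel scene range in meters
    beta        : illustrative per-channel attenuation coefficients (R, G, B)
    backscatter : illustrative water color added with range
    """
    beta = np.asarray(beta, dtype=float)
    b = np.asarray(backscatter, dtype=float)
    # Per-channel transmission: broadcasts (H, W, 1) * (3,) -> (H, W, 3)
    t = np.exp(-beta * depth[..., None])
    # Direct signal decays with range; backscatter fills in as t -> 0
    return image * t + b * (1.0 - t)
```

At zero range the function returns the input image unchanged, and at large range every pixel converges to the backscatter color, which is the qualitative behavior the GAN is trained to reproduce in a learned, scene-dependent way.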
| Title | SubmergeStyleGAN: Synthetic Underwater Data Generation with Style Transfer for Domain Adaptation |
|---|---|
| Authors | E. Fathy, Mohamed; Ahmed, Samer A.; I. Awad, Mohammed; E. Abd El Munim, Hossam |
| Publication | IEEE, 2024 |
| Media type | academicJournal |
| ISBN | 979-8-3503-8221-1 (print) |
| DOI | 10.1109/DICTA60407.2023.00081 |