Shadow Correction in UAV Imagery Using a Deep Convolutional U-Net Neural Network

Document Type : Original Article

Authors
1 K. N. Toosi University of Technology
2 Tarbiat Modares University
3 Imam Hossein Comprehensive University
4 Hekmat Higher Educational Institution
5 Tehran University
Abstract
Shadows in unmanned aerial vehicle (UAV) imagery introduce significant visual and radiometric distortions that adversely affect feature extraction, classification, and three-dimensional reconstruction accuracy. To mitigate these effects, this study proposes a deep learning–based shadow correction framework built upon a customized U-Net architecture enhanced with a VGG19-based encoder. The model was trained on paired UAV images (shadowed and shadow-free) normalized to the [0,1] intensity range. Network parameters were optimized using the Adam optimizer and a hybrid loss function combining mean absolute error (MAE) and mean squared error (MSE) to balance pixel-level accuracy and structural preservation.
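The hybrid loss described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the equal 0.5/0.5 weighting (`alpha`) is an assumption, since the abstract does not state the mixing coefficients.

```python
import numpy as np

def hybrid_loss(pred, target, alpha=0.5):
    """Hybrid MAE + MSE loss on images normalized to [0, 1].

    MAE preserves pixel-level accuracy; MSE penalizes large
    deviations more heavily, encouraging structural preservation.
    `alpha` (assumed, not stated in the paper) blends the two terms.
    """
    diff = pred - target
    mae = np.mean(np.abs(diff))       # mean absolute error
    mse = np.mean(diff ** 2)          # mean squared error
    return alpha * mae + (1.0 - alpha) * mse
```

In a training framework such as TensorFlow or PyTorch, the same expression would be written with that framework's tensor ops so gradients flow through it.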

Quantitative evaluation using Root Mean Square Error (RMSE) and Peak Signal-to-Noise Ratio (PSNR) demonstrated that the proposed model achieved an RMSE of 0.0404 in the normalized [0,1] range (equivalent to 10.31 in the 8-bit [0–255] scale) and a PSNR of 27.87 dB, indicating accurate reconstruction of shadowed regions while preserving fine structural and textural details. Qualitative assessments further confirmed stable performance across high-resolution short-range imagery without introducing noticeable artifacts.
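The two reported metrics follow the standard definitions, sketched below with NumPy for images in the normalized [0, 1] range. Note the internal consistency of the reported figures: 20·log10(1 / 0.0404) ≈ 27.87 dB, and 0.0404 × 255 ≈ 10.3 on the 8-bit scale.

```python
import numpy as np

def rmse(pred, target):
    """Root mean square error between two images."""
    return float(np.sqrt(np.mean((pred - target) ** 2)))

def psnr(pred, target, max_val=1.0):
    """Peak signal-to-noise ratio in dB: 20 * log10(MAX / RMSE).

    max_val is 1.0 for [0, 1]-normalized images, 255 for 8-bit.
    """
    return float(20.0 * np.log10(max_val / rmse(pred, target)))
```

With `max_val=1.0`, an RMSE of 0.0404 yields the reported PSNR of about 27.87 dB.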

The results suggest that enhanced convolutional architectures remain highly effective for shadow correction in high-detail UAV scenarios, providing a favorable balance between reconstruction accuracy and computational efficiency. Integrating such deep convolutional frameworks into UAV image preprocessing pipelines can significantly improve radiometric consistency and the analytical reliability of photogrammetric datasets.



Articles in Press, Accepted Manuscript
Available Online from 02 May 2026