Journal of Xidian University ›› 2021, Vol. 48 ›› Issue (5): 23-29. doi: 10.19665/j.issn1001-2400.2021.05.004


Cloud removal method for the remote sensing image based on the GAN

WANG Junjun, SUN Yue, LI Ying

  1. State Key Laboratory of Integrated Service Networks, Xidian University, Xi’an 710071, China
  • Received: 2021-05-13  Online: 2021-10-20  Published: 2021-11-09
  • Corresponding author: SUN Yue
  • About the authors: WANG Junjun (1996—), male, M.S. student at Xidian University, E-mail: 2287435181@qq.com | LI Ying (1973—), female, professor, Ph.D., E-mail: yli@mail.xidian.edu.cn
  • Supported by:
    The National Natural Science Foundation of China (61971333)

Abstract:

Remote sensing images are inevitably affected by weather conditions during acquisition, so the acquired images may contain clouds, which greatly limits their subsequent use. Cloud removal methods based on deep learning can remove clouds well, but existing methods suffer from long training times, incomplete cloud removal, and color distortion. To address these problems, a cloud removal method based on an end-to-end generative adversarial network (GAN) is proposed to recover clear images from cloud-contaminated remote sensing images. First, a U-Net is used as the main structure of the generator, and a continuous memory residual module is inserted between the encoder and the decoder to mine deep features of the input. Then, a convolutional neural network is adopted as the discriminator to judge whether its input is real or generated. Finally, the loss function combines an adversarial term with an L1 (absolute) term, measuring the quality of the model's predictions by computing the gap between the network output and the ground-truth data. Experimental results show that the proposed method outperforms existing cloud removal methods in both quantitative metrics (peak signal-to-noise ratio and structural similarity) and running time. With the same number of parameters, the proposed method has the lowest computation cost (GFLOPs) and thus lower algorithm complexity. In addition, remote sensing images restored by the proposed method retain richer detail, show almost no color distortion, and have better subjective visual quality.
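
To make the combined objective described above concrete, the following is a minimal PyTorch sketch of how an adversarial term and an L1 term can be combined when training such a generator. It is an illustration only: the discriminator interface and the weight lambda_l1 are assumptions for the sketch and are not taken from the paper.

import torch
import torch.nn as nn

adv_criterion = nn.BCEWithLogitsLoss()  # adversarial (real/fake) criterion
l1_criterion = nn.L1Loss()              # absolute-error (L1) criterion
lambda_l1 = 100.0                       # assumed weight of the L1 term

def generator_loss(discriminator, fake_clear, real_clear):
    # Adversarial term: the generator tries to make the discriminator
    # label its output as real.
    pred_fake = discriminator(fake_clear)
    adv = adv_criterion(pred_fake, torch.ones_like(pred_fake))
    # L1 term: pixel-wise gap between the restored image and the
    # cloud-free ground truth.
    rec = l1_criterion(fake_clear, real_clear)
    return adv + lambda_l1 * rec

def discriminator_loss(discriminator, fake_clear, real_clear):
    # Standard real/fake cross-entropy on ground-truth and generated samples.
    pred_real = discriminator(real_clear)
    pred_fake = discriminator(fake_clear.detach())
    return 0.5 * (adv_criterion(pred_real, torch.ones_like(pred_real))
                  + adv_criterion(pred_fake, torch.zeros_like(pred_fake)))

In a training step one would compute fake_clear = generator(cloudy_image), update the generator with generator_loss, and update the discriminator with discriminator_loss.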

Key words: remote sensing image, image cloud removal, generative adversarial network, continuous memory residual

CLC number:

  • TP391.41