Journal of Xidian University ›› 2025, Vol. 52 ›› Issue (1): 1-13. doi: 10.19665/j.issn1001-2400.20241012

• Information and Communications Engineering •

Decoder-side enhanced image compression network under distributed strategy

ZHANG Jing, WU Huixue, ZHANG Shaobo, LI Yunsong

  1. State Key Laboratory of Integrated Services Networks, Xidian University, Xi'an 710071, China
  • Received: 2024-06-21  Online: 2024-11-05  Published: 2024-11-05

Abstract:

With the rapid development of multimedia, large-scale image data puts great pressure on network bandwidth and storage. At present, deep-learning-based image compression methods still suffer from problems such as compression artifacts in the reconstructed image and slow training. To address these problems, we propose a decoder-side enhanced image compression network under a distributed strategy, which reduces compression artifacts in the reconstructed image and improves the training speed. On the one hand, the original information aggregation subnetwork cannot effectively utilize the output of the hyperprior decoder, which inevitably introduces compression artifacts into the reconstructed image and degrades its visual quality. We therefore use a decoder-side enhancement module to predict the high-frequency components of the reconstructed image and reduce compression artifacts. A feature enhancement module is then introduced to further strengthen the nonlinear capability of the network and improve the quality of the reconstructed image. On the other hand, distributed training is introduced to address the slow training of traditional single-node networks, effectively shortening the training time. However, gradient synchronization during distributed training incurs a large communication overhead, so we add a gradient sparsification algorithm to distributed training: each node probabilistically passes only the important gradients to the master node for updating, which further improves the training speed. Experimental results show that distributed training accelerates training while preserving the quality of the reconstructed image.
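The probabilistic gradient sparsification described in the abstract can be illustrated with a minimal NumPy sketch. This is an illustrative assumption about one common variant of the idea, not the paper's actual implementation: each gradient entry is transmitted with a probability proportional to its magnitude (so "important" gradients are more likely to be sent), and the untransmitted portion is kept locally as a residual to be added to the next step's gradient. The function name and selection rule are hypothetical.

```python
import numpy as np

def sparsify_gradient(grad, ratio=0.01, rng=None):
    """Probabilistically keep roughly `ratio` of the gradient entries,
    favouring large magnitudes (illustrative sketch, not the paper's
    exact algorithm).

    Returns (sparse, residual): `sparse` is sent to the master node,
    `residual` stays on the worker and is accumulated into the next
    local gradient so no information is permanently lost.
    """
    rng = rng or np.random.default_rng(0)
    flat = grad.ravel()
    mag = np.abs(flat)
    # Selection probability proportional to magnitude, scaled so that
    # the expected number of kept entries is about ratio * size.
    probs = np.minimum(1.0, ratio * flat.size * mag / (mag.sum() + 1e-12))
    mask = rng.random(flat.size) < probs
    sparse = np.where(mask, flat, 0.0)
    residual = flat - sparse  # dropped part, kept for error feedback
    return sparse.reshape(grad.shape), residual.reshape(grad.shape)
```

By construction, `sparse + residual` reconstructs the original gradient exactly, so the compression is lossless over time; only the per-step communication volume is reduced.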

Key words: distributed training, decoder-side enhancement, deep learning, image compression

CLC Number: 

  • TP391.4