Journal of Xidian University ›› 2023, Vol. 50 ›› Issue (4): 54-64. doi: 10.19665/j.issn1001-2400.2023.04.006

• Special Issue on Cyberspace Security •

Efficient deep learning scheme with adaptive differential privacy

WANG Yuhua, GAO Sheng, ZHU Jianming, HUANG Chen

  1. School of Information, Central University of Finance and Economics, Beijing 102206, China
  • Received: 2023-01-12 Online: 2023-08-20 Published: 2023-10-17
  • Contact: Jianming ZHU


While deep learning has achieved great success in many fields, it has also gradually exposed a series of serious privacy and security issues. As a lightweight privacy protection technology, differential privacy makes the output insensitive to any single record in the dataset by adding noise to the model, which makes it well suited to protecting the privacy of individual users in practice. To address the problems of the number of iterations depending on the privacy budget, low data utility, and slow model convergence in most existing differentially private deep learning schemes, an efficient deep learning scheme based on adaptive differential privacy is proposed. First, an adaptive differential privacy mechanism is designed based on the Shapley additive explanations (SHAP) model. By adding noise to the sample features, the number of iterations becomes independent of the privacy budget; the loss function is then perturbed via the functional mechanism, achieving dual protection of the original samples and labels while enhancing data utility. Second, the adaptive moment estimation (Adam) algorithm is used to adjust the learning rate and accelerate model convergence. In addition, zero-concentrated differential privacy (zCDP) is introduced as the accounting mechanism for privacy loss, which reduces the risk of privacy leakage caused by the accumulated loss exceeding the privacy budget. Finally, a theoretical privacy analysis is given, and the effectiveness of the proposed scheme is verified by comparative experiments on the MNIST and Fashion-MNIST datasets.
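Two of the building blocks named above, importance-weighted feature perturbation and zCDP-based privacy-loss accounting, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the budget-splitting rule, the function names, and the use of a generic per-feature importance vector (standing in for SHAP values) are all assumptions.

```python
import numpy as np

def adaptive_feature_noise(X, importance, epsilon, sensitivity=1.0, seed=0):
    """Laplace-perturb each feature, splitting the privacy budget so that
    more important features get a larger share of epsilon (hence less noise).
    `importance` stands in for per-feature SHAP values (assumed positive)."""
    rng = np.random.default_rng(seed)
    shares = importance / importance.sum()      # budget share per feature
    scale = sensitivity / (epsilon * shares)    # Laplace scale b = Δf / ε_j
    return X + rng.laplace(0.0, scale, size=X.shape)

def zcdp_to_eps(rho, delta):
    """Standard zCDP -> (ε, δ) conversion: ε = ρ + 2·sqrt(ρ·ln(1/δ)).
    Under zCDP, the ρ values of sequentially composed mechanisms simply add."""
    return rho + 2.0 * np.sqrt(rho * np.log(1.0 / delta))
```

For example, composing T training steps that each satisfy ρ-zCDP yields T·ρ in total, which `zcdp_to_eps` converts to an (ε, δ) guarantee; tracking this running total lets training stop before the budget is exceeded.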

Key words: deep learning, differential privacy, self-adaptation, privacy loss, model convergence

CLC Number: TP309