Journal of Xidian University ›› 2024, Vol. 51 ›› Issue (1): 41-51. DOI: 10.19665/j.issn1001-2400.20230402

• Information and Communications Engineering •

Attention autocorrelation mechanism-based residual clutter suppression method

SHEN Lu1, SU Hongtao1, WANG Jin1,2, MAO Zhi1, JING Xinchen1, LI Ze1

  1. National Key Laboratory of Radar Signal Processing, Xidian University, Xi'an 710071, China
    2. Nanjing Research Institute of Electronics Technology, Nanjing 210039, China
  • Received: 2022-10-25 Online: 2023-05-17 Published: 2023-05-17
  • Contact: SU Hongtao E-mail: imlbtr@163.com; suht@xidian.edu.cn; wjkbf1926@sina.com; zmao@stu.xidian.edu.cn; 18292814263@163.com; himmelize@163.com

Abstract:

Radar systems operate in a complex, ever-changing environment that produces non-uniform, time-varying clutter. Unsuppressed residual clutter can generate a large number of false alarms, degrading target tracking performance, creating spurious trajectories, or saturating the data processing system, which in turn reduces the detection capability of the radar system. Conventional residual clutter suppression algorithms typically require hand-crafted feature extraction and classifier construction, steps that can lead to poor generalization, difficulty in combining features, and stringent requirements on the classifier. To address these issues, and inspired by self-attention mechanisms and domain knowledge, this paper proposes a data- and knowledge-driven attention autocorrelation mechanism that effectively extracts deep features of the radar echo to distinguish targets from clutter. On this basis, a residual clutter suppression method built on the attention autocorrelation mechanism is constructed, which makes full use of radar echo features and thereby improves the residual clutter suppression capability. Simulation results and measured data demonstrate that the method offers significant performance and generalization advantages for residual clutter suppression, while its parallel computing structure improves the algorithm's operational efficiency.
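The abstract does not spell out the mechanism's exact formulation, so the following NumPy sketch is only an illustration of the general idea it describes: a self-attention-style block whose attention scores come from the autocorrelation of the echo pulses. All function names, shapes, and design choices below are assumptions for illustration, not the authors' implementation.

    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax over the chosen axis
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def attention_autocorrelation(echo):
        """Hypothetical attention-autocorrelation block (illustrative only).

        echo : (n_pulses, n_samples) complex radar data matrix.
        Attention scores between pulses i and j are the magnitude of their
        inner product (an autocorrelation), so returns that are correlated
        across pulses (targets) attend to each other more strongly than
        non-uniform residual clutter.
        """
        scores = np.abs(echo @ echo.conj().T)   # pulse-to-pulse correlation matrix
        scores /= np.sqrt(echo.shape[-1])       # scaling, as in dot-product attention
        weights = softmax(scores, axis=-1)      # row-normalized attention weights
        return weights @ echo                   # attention-weighted deep features

    # Toy usage: 64 pulses of 128 complex samples each
    rng = np.random.default_rng(0)
    echo = rng.standard_normal((64, 128)) + 1j * rng.standard_normal((64, 128))
    features = attention_autocorrelation(echo)  # shape (64, 128)

Because every row of the score matrix can be computed independently, a block of this kind maps naturally onto the parallel computing structure that the abstract credits for the method's efficiency.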

Key words: residual clutter suppression, attention autocorrelation mechanism, deep features, neural networks

CLC Number: TN957