Electronic Science and Technology ›› 2023, Vol. 36 ›› Issue (9): 86-92. doi: 10.16180/j.cnki.issn1007-7820.2023.09.013


Transformer-Based Maneuvering Target Tracking

DANG Xiaofang, CAI Xingyu

  1. Xi'an Electronic Engineering Research Institute, Xi'an 710100, China
  • Received: 2022-03-03  Online: 2023-09-15  Published: 2023-09-18
  • About the authors: DANG Xiaofang (1987-), male, PhD, senior engineer. Research interests: air defense intelligence radar. | CAI Xingyu (1967-), male, research fellow. Research interests: radar systems.
  • Supported by:
    RT9 Low Altitude Surveillance Radar Foundation of the State Administration for Science, Technology and Industry for National Defense (AS216)


Abstract:

When Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) models are used to track maneuvering targets, long input sequences are prone to vanishing and exploding gradients, which degrades tracking performance after the target maneuvers. To address this problem, a Transformer-Based Network (TBN) is proposed. TBN uses an encoder built on the attention mechanism to extract historical track features from the target sequence, improving its ability to capture target maneuvers, and a decoder built on convolutional layers to output the final track sequence, improving maneuvering target tracking performance. A Center-Max (CM) normalization method subtracts the initial value from every sequence, which reduces the complexity of network learning and enhances the generalization ability of the network. Experimental results show that, on a large-scale track data set containing maneuvers, the combination of CM normalization and TBN improves position accuracy by 11.2% and velocity accuracy by 41.9% compared with the LSTM network. The proposed method can still track the target correctly even when observations are missing.
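
A rough sketch of the encoder-decoder split described above is given below in PyTorch: a Transformer (attention) encoder reads the historical track, and a 1-D convolutional decoder maps the encoded features back to a track sequence. The class name TBNSketch, the layer sizes, the four-dimensional state vector (position and velocity), and the exact decoder layout are illustrative assumptions; the abstract specifies only an attention-based encoder and a convolution-based decoder, not the paper's full architecture.

    import torch
    import torch.nn as nn

    class TBNSketch(nn.Module):
        """Illustrative encoder-decoder sketch: attention encoder over the
        historical track, 1-D convolutional decoder producing the output
        track. Sizes and layout are assumptions, not the paper's design."""

        def __init__(self, state_dim=4, d_model=64, nhead=4, num_layers=2):
            super().__init__()
            self.input_proj = nn.Linear(state_dim, d_model)  # embed (x, y, vx, vy)
            enc_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(enc_layer, num_layers=num_layers)
            # Convolution over the time axis maps features to one state per step.
            self.decoder = nn.Sequential(
                nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv1d(d_model, state_dim, kernel_size=3, padding=1),
            )

        def forward(self, x):                          # x: (batch, T, state_dim)
            h = self.encoder(self.input_proj(x))       # (batch, T, d_model)
            h = h.transpose(1, 2)                      # (batch, d_model, T) for Conv1d
            return self.decoder(h).transpose(1, 2)     # (batch, T, state_dim)

    # Usage: one batch of 8 normalized tracks, 20 time steps each.
    tracks = torch.randn(8, 20, 4)
    print(TBNSketch()(tracks).shape)                   # torch.Size([8, 20, 4])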

Key words: maneuvering target tracking, attention mechanism, Transformer network, recurrent neural network, long short-term memory network, normalization, state space model, neural network
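
The CM normalization step is described in the abstract only as subtracting each sequence's initial value so that every track becomes relative to its starting point. A minimal NumPy sketch of that idea, under that assumption, follows; the function name cm_normalize and the (time steps x state dimensions) array layout are illustrative, and any scaling the paper's Center-Max method may additionally apply is not reproduced here.

    import numpy as np

    def cm_normalize(track):
        """Subtract the first observation from every time step so the track
        is expressed relative to its starting point (assumed CM-style step).

        track: array of shape (T, D), T time steps and D state dimensions;
        the layout is an assumption made for this illustration."""
        track = np.asarray(track, dtype=np.float64)
        return track - track[0]

    # Usage: a track far from the origin becomes a zero-anchored relative
    # track, which the abstract argues reduces the complexity of learning.
    raw = np.array([[1000.0, 2000.0], [1010.0, 2005.0], [1020.0, 2010.0]])
    print(cm_normalize(raw))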

CLC Number:

  • TN953