Electronic Science and Technology ›› 2023, Vol. 36 ›› Issue (9): 86-92. DOI: 10.16180/j.cnki.issn1007-7820.2023.09.013


Transformer-Based Maneuvering Target Tracking

DANG Xiaofang, CAI Xingyu

  1. Xi'an Electronic Engineering Research Institute, Xi'an 710100, China
  • Received:2022-03-03 Online:2023-09-15 Published:2023-09-18
  • Supported by:
    RT9 Low Altitude Surveillance Radar Foundation of the State Administration for Science, Technology and Industry for National Defense (AS216)

Abstract:

Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) trackers suffer from vanishing and exploding gradients on long sequences, which degrades tracking performance after the target maneuvers. To address this problem, a Transformer-Based Network (TBN) is proposed. TBN uses an encoder designed around the attention mechanism to extract features from the target's historical track sequence, improving its ability to capture the target's maneuvering behavior. A Center Max (CM) normalization method subtracts the initial value from every sequence, which reduces the complexity of network learning and enhances the generalization of the network. The results show that, on a large track data set with maneuvering conditions, the combination of CM normalization and TBN improves position accuracy by 11.2% and velocity accuracy by 41.9% compared with the LSTM network. The proposed method can also track the target correctly when observations are missing.
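The abstract describes two ingredients: CM normalization, which subtracts each sequence's initial value, and an attention-based encoder over the historical track. The PyTorch sketch below is an illustration of how such a pipeline might look under those assumptions only; the class name TransformerTrackEncoder, the state dimension, the layer sizes, and the linear prediction head are hypothetical and are not taken from the paper.

    import torch
    import torch.nn as nn


    def center_max_normalize(track):
        # Subtract each sequence's initial value so every track starts at the
        # origin (an assumption based on the abstract's description of CM
        # normalization; the paper may include an additional scaling step).
        return track - track[:, :1, :]


    class TransformerTrackEncoder(nn.Module):
        # Hypothetical TBN-style model: an attention-based encoder over the
        # historical track sequence, followed by a linear head that regresses
        # the next state (position and velocity).
        def __init__(self, state_dim=4, d_model=64, nhead=4, num_layers=2):
            super().__init__()
            self.input_proj = nn.Linear(state_dim, d_model)
            layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers)
            self.head = nn.Linear(d_model, state_dim)

        def forward(self, track):
            # track: (batch, seq_len, state_dim) of past observations
            x = self.input_proj(center_max_normalize(track))  # CM-normalized input
            h = self.encoder(x)                                # attention features
            # Predict from the last time step, then shift back to absolute coordinates
            return self.head(h[:, -1, :]) + track[:, 0, :]


    # Example: predict the next state from 20 past (x, y, vx, vy) observations
    past = torch.randn(8, 20, 4)
    model = TransformerTrackEncoder()
    next_state = model(past)    # shape: (8, 4)

Because the encoder operates on the whole CM-normalized sequence through self-attention rather than through a recurrent state, gradients do not have to propagate step by step across the long track, which is the motivation the abstract gives for replacing RNN/LSTM trackers.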

Key words: maneuvering target tracking, attention mechanism, Transformer network, recurrent neural network, long short-term memory, normalization, state space model, neural network

CLC Number: 

  • TN953