Journal of Xidian University ›› 2020, Vol. 47 ›› Issue (3): 97-104.doi: 10.19665/j.issn1001-2400.2020.03.014


Model of abstractive text summarization for topic-aware communicating agents

ZHANG Zheming, REN Shuxia, GUO Kaijie

  1. Department of Computer Science and Technology, Tiangong University, Tianjin 300387, China
  • Received: 2019-11-26  Online: 2020-06-20  Published: 2020-06-19
  • Contact: Shuxia REN  E-mail: t_rsx@126.com

Abstract:

To overcome the inability of traditional automatic text summarization models to generate high-quality summaries of long texts, which stems from the limited effective length of the RNN (Recurrent Neural Network), a model of abstractive text summarization for topic-aware communicating agents is proposed. First, the encoder is divided into multiple collaborating agents, so that the LSTM (Long Short-Term Memory) no longer receives an input sequence too long to retain prior information when generating the summary. Then, a joint attention mechanism is added to the model to provide topic information and to improve the correlation between the generated summary and the source text. Finally, a hybrid training method with reinforcement learning is employed to alleviate exposure bias and to optimize the evaluation metric directly. Experimental results show that the model not only generates long-text summaries with prominent themes but also achieves higher scores than state-of-the-art models, indicating that, with the help of topic information, the communicating-agents model can be expected to generate better summaries of long texts.
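Two of the steps above can be illustrated in miniature. The following is a minimal sketch, not the paper's implementation: `split_for_agents` shows the idea of dividing a long source sequence into contiguous chunks, one per encoder agent, and `mixed_loss` shows a common form of hybrid maximum-likelihood/reinforcement-learning objective (the function names and the weight `gamma` are illustrative assumptions, not taken from the paper).

```python
from typing import List


def split_for_agents(tokens: List[str], num_agents: int) -> List[List[str]]:
    """Divide a long token sequence into contiguous chunks, one per encoder
    agent, so that no single recurrent encoder must read the full document."""
    # Ceiling division: chunk size large enough to cover all tokens.
    chunk = -(-len(tokens) // num_agents)
    return [tokens[i:i + chunk] for i in range(0, len(tokens), chunk)]


def mixed_loss(loss_ml: float, loss_rl: float, gamma: float = 0.98) -> float:
    """Hybrid training objective: interpolate the maximum-likelihood loss and
    the reinforcement-learning loss, letting the RL term push the model to
    optimize the evaluation metric directly while ML keeps outputs fluent."""
    return gamma * loss_rl + (1.0 - gamma) * loss_ml


if __name__ == "__main__":
    parts = split_for_agents(list("abcdefgh"), num_agents=3)
    print(parts)                      # three chunks covering all 8 tokens
    print(mixed_loss(2.0, 1.0, 0.5))  # equal-weight interpolation
```

In a full model, each chunk would feed a separate LSTM agent whose final states are exchanged with the other agents, and `loss_rl` would typically be a self-critical policy-gradient term rewarded by the evaluation metric.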

Key words: abstractive text summarization, communicating agents, topic awareness, joint attention, reinforcement learning

CLC Number: 

  • TP183