Electronic Science and Technology ›› 2020, Vol. 33 ›› Issue (9): 10-15.doi: 10.16180/j.cnki.issn1007-7820.2020.09.002


Sentiment Analysis Method of Financial Text Based on Transformer Encoder

LI Fupeng, FU Dongxiang

  1. School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China
  • Received:2019-07-04 Online:2020-09-15 Published:2020-09-12
  • Supported by:
    National Natural Science Foundation of China (61703277); National Natural Science Foundation of China (61605114)

Abstract:

Sentiment analysis plays an important role in many fields, but most existing studies focus on commodity reviews and microblogs, and sentiment analysis of financial texts is lacking. To address this problem, a financial text sentiment analysis method based on the Transformer encoder is presented in this paper. The Transformer encoder is a feature extraction unit based on the self-attention mechanism. When the Transformer encoder processes sequential text information, it can relate any two words in a sentence without distance restrictions, which overcomes the problem of long-term dependencies. The multi-head attention mechanism computes attention over the same sentence several times, capturing more of the semantic features implied in the context. Experiments are carried out on a balanced corpus of financial news. The experimental results show that the method based on the Transformer encoder achieves the best performance in financial text sentiment analysis compared with models based on convolutional neural networks and recurrent neural networks.
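
For reference, a minimal NumPy sketch of the scaled dot-product and multi-head self-attention mechanisms described above. This is an illustrative toy implementation, not the paper's model; the shapes, weight names, and dimensions are assumptions chosen for clarity.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V:
    # every token attends to every other token regardless of distance.
    d_k = Q.shape[-1]
    scores = Q @ np.swapaxes(K, -1, -2) / np.sqrt(d_k)
    return softmax(scores) @ V

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, num_heads):
    # Split the model dimension into num_heads subspaces, attend in each,
    # then concatenate and project back, so each head can capture a
    # different semantic aspect of the context.
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    def split(W):  # (seq, d_model) -> (heads, seq, d_head)
        return (X @ W).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    Q, K, V = split(Wq), split(Wk), split(Wv)
    heads = scaled_dot_product_attention(Q, K, V)          # (heads, seq, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo                                      # (seq, d_model)

# Toy usage: 6 "words", model dimension 8, 2 attention heads.
rng = np.random.default_rng(0)
seq_len, d_model, num_heads = 6, 8, 2
X = rng.normal(size=(seq_len, d_model))                     # word + position embeddings
Wq, Wk, Wv, Wo = (rng.normal(size=(d_model, d_model)) for _ in range(4))
out = multi_head_self_attention(X, Wq, Wk, Wv, Wo, num_heads)
print(out.shape)  # (6, 8): one context-aware vector per word

In a full Transformer encoder layer, this output would pass through residual connections, layer normalization, and a position-wise feed-forward network; the sketch shows only the attention computation itself.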

Key words: sentiment analysis, finance, self-attention mechanism, transformer encoder, scaled dot-product attention, multi-head attention

CLC Number: 

  • TP391