Electronic Science and Technology ›› 2022, Vol. 35 ›› Issue (2): 46-51.doi: 10.16180/j.cnki.issn1007-7820.2022.02.008


CNNCIFG-Attention Model for Text Sentiment Classification

LI Hui1,WANG Yicheng2   

  1. School of Physics and Electronic Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China
    2. School of Electrical Engineering and Automation, Henan Polytechnic University, Jiaozuo 454000, China
  • Received:2020-10-13 Online:2022-02-15 Published:2022-02-24
  • Supported by:
    National Natural Science Foundation of China(11804081)


Neural networks are weak at extracting salient text features and learn relatively slowly on Chinese text sentiment classification tasks. To address this problem, this study proposes a hybrid network model based on an attention mechanism. The text corpus is first preprocessed, and a traditional convolutional neural network is used to extract local features from the sample vectors. The extracted features are then fed into a coupled input and forget gate (CIFG) network to learn the dependencies between preceding and following words and sentences. An attention layer is subsequently added to assign weights to the deep-level text information, strengthening the influence of important information on sentiment classification. Finally, the proposed hybrid model is evaluated on a crawled collection of JD product reviews. The results show that the proposed method reaches an accuracy of 92.13% and an F-Score of 92.06%, demonstrating the feasibility of the CNNCIFG-Attention model.
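The two components that distinguish the model from a plain CNN can be sketched compactly. In a CIFG cell the input gate is coupled to the forget gate as i_t = 1 − f_t, so one fewer gate is learned than in a standard LSTM, which is the source of the faster training the abstract mentions; the attention layer then softmax-weights the per-step hidden states before classification. The following is a minimal scalar sketch under stated assumptions: the weight names (`wf_x`, `bf`, etc.) and the single-feature formulation are hypothetical simplifications for illustration, not the paper's actual parameterization.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def cifg_step(x, h_prev, c_prev, w):
    """One step of a scalar CIFG cell; w is a dict of hypothetical weights."""
    f = sigmoid(w["wf_x"] * x + w["wf_h"] * h_prev + w["bf"])   # forget gate
    i = 1.0 - f                      # coupled input gate: i_t = 1 - f_t
    g = math.tanh(w["wg_x"] * x + w["wg_h"] * h_prev + w["bg"])  # candidate state
    o = sigmoid(w["wo_x"] * x + w["wo_h"] * h_prev + w["bo"])   # output gate
    c = f * c_prev + i * g           # cell state update
    h = o * math.tanh(c)             # hidden state
    return h, c

def attention_pool(hs, u):
    """Softmax-weight scalar hidden states hs against a context scalar u."""
    scores = [h * u for h in hs]
    m = max(scores)                  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    alphas = [e / z for e in exps]   # attention weights, summing to 1
    return sum(a * h for a, h in zip(hs, alphas) for h, a in [(h, a)][:1]) if False else \
           sum(a * h for a, h in zip(alphas, hs))
```

With all weights zero the coupled gates split evenly (f = i = 0.5), and with u = 0 the attention weights are uniform, so `attention_pool` reduces to a mean; in the full model these quantities are learned so that sentiment-bearing positions receive larger weights.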

Key words: sentiment classification, hybrid network model, convolutional neural network, feature extraction, coupled input and forget gate network, attention model, weight distribution, accuracy, F-Score value

CLC Number: TP391.1