[1] Hinton G E, Osindero S, Teh Y W. A fast learning algorithm for deep belief nets[J]. Neural Computation, 2006, 18(7):1527-1554.
doi: 10.1162/neco.2006.18.7.1527
pmid: 16764513
[2] Hochreiter S, Schmidhuber J. Long short-term memory[J]. Neural Computation, 1997, 9(8):1735-1780.
doi: 10.1162/neco.1997.9.8.1735
pmid: 9377276
[3] 李彦冬, 郝宗波, 雷航. 卷积神经网络研究综述[J]. 计算机应用, 2016, 36(9):2508-2515.
Li Yandong, Hao Zongbo, Lei Hang. Survey of convolutional neural network[J]. Journal of Computer Applications, 2016, 36(9):2508-2515.
doi: 10.11772/j.issn.1001-9081.2016.09.2508
[4] Qiu X, Sun T, Xu Y, et al. Pre-trained models for natural language processing: A survey[J]. Science China Technological Sciences, 2020, 63(10):1872-1897.
doi: 10.1007/s11431-020-1647-3
[5] Peters M E, Neumann M, Iyyer M, et al. Deep contextualized word representations[C]. New Orleans: Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2018:651-660.
[6] Radford A, Narasimhan K, Salimans T, et al. Improving language understanding by generative pre-training[J/OL]. (2018-06-16)[2022-09-10]. https://www.cs.ubc.ca/~amuham01/LING530/papers/radford2018improving.pdf.
[7] Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding[C]. Minneapolis: Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019:89-97.
[8] 吴博, 梁循, 张树森, 等. 图神经网络前沿进展与应用[J]. 计算机学报, 2022, 45(1):35-68.
Wu Bo, Liang Xun, Zhang Shusen, et al. Advances and applications in graph neural network[J]. Chinese Journal of Computers, 2022, 45(1):35-68.
[9] Peng H, Li J, He Y, et al. Large-scale hierarchical text classification with recursively regularized deep graph-CNN[C]. Lyon: Proceedings of the World Wide Web Conference, 2018:866-879.
[10] Yao L, Mao C, Luo Y. Graph convolutional networks for text classification[C]. Honolulu: Proceedings of the AAAI Conference on Artificial Intelligence, 2019:671-680.
[11] Zhang Y, Yu X, Cui Z, et al. Every document owns its structure: Inductive text classification via graph neural networks[C]. Seattle: Proceedings of the Fifty-eighth Annual Meeting of the Association for Computational Linguistics, 2020:9-17.
[12] 丁锋, 孙晓. 基于注意力机制和BiLSTM-CRF的消极情绪意见目标抽取[J]. 计算机科学, 2021, 49(2):223-230.
Ding Feng, Sun Xiao. Negative-emotion opinion target extraction based on attention and BiLSTM-CRF[J]. Computer Science, 2021, 49(2):223-230.
[13] Hu D, Wei L, Huai X. DialogueCRN: Contextual reasoning networks for emotion recognition in conversations[J]. Computation and Language, 2021, 28(9):1-11.
[14] Huan J L, Sekh A A, Quek C, et al. Emotionally charged text classification with deep learning and sentiment semantic[J]. Neural Computing and Applications, 2022, 34(3):2341-2351.
doi: 10.1007/s00521-021-06542-1
[15] Luo W, Zhang L. Question text classification method of tourism based on deep learning model[J]. Wireless Communications and Mobile Computing, 2022, 21(1):1-9.
[16] Wang X, Kim H C. Text categorization with improved deep learning methods[J]. Journal of Information and Communication Convergence Engineering, 2018, 16(2):106-113.
[17] Liu Y, Li P, Hu X. Combining context-relevant features with multi-stage attention network for short text classification[J]. Computer Speech & Language, 2022, 71(7):1-14.
[18] Soni S, Chouhan S S, Rathore S S. TextConvoNet: A convolutional neural network based architecture for text classification[J]. Applied Intelligence, 2022, 53(10):14249-14268.
doi: 10.1007/s10489-022-04221-9
[19] Li R, Chen H, Feng F, et al. Dual graph convolutional networks for aspect-based sentiment analysis[C]. Online: Proceedings of the Fifty-ninth Annual Meeting of the Association for Computational Linguistics and the Eleventh International Joint Conference on Natural Language Processing, 2021:156-163.
[20] Shen W, Wu S, Yang Y, et al. Directed acyclic graph network for conversational emotion recognition[C]. Online: Proceedings of the Fifty-ninth Annual Meeting of the Association for Computational Linguistics and the Eleventh International Joint Conference on Natural Language Processing, 2021:1551-1560.
[21] Yan H, Gui L, Pergola G, et al. Position bias mitigation: A knowledge-aware graph model for emotion cause extraction[C]. Online: Proceedings of the Fifty-ninth Annual Meeting of the Association for Computational Linguistics and the Eleventh International Joint Conference on Natural Language Processing, 2021:3364-3375.
[22] Mittal V, Gangodkar D, Pant B. Deep graph-long short-term memory: A deep learning based approach for text classification[J]. Wireless Personal Communications, 2021, 119(3):2287-2301.
doi: 10.1007/s11277-021-08331-4
[23] Prabhakar S K, Won D O. Medical text classification using hybrid deep learning models with multi-head attention[J]. Computational Intelligence and Neuroscience, 2021(3):1-16.
[24] 邓维斌, 朱坤, 李云波, 等. FMNN:融合多神经网络的文本分类模型[J]. 计算机科学, 2022, 49(3):281-287.
Deng Weibin, Zhu Kun, Li Yunbo, et al. FMNN: Text classification model fused with multiple neural networks[J]. Computer Science, 2022, 49(3):281-287.
doi: 10.11896/jsjkx.210200090
[25] Yu S, Su J, Luo D. Improving BERT-based text classification with auxiliary sentence and domain knowledge[J]. IEEE Access, 2019(7):176600-176612.
[26] Hu Y, Ding J, Dou Z, et al. Short-text classification detector: A BERT-based mental approach[J]. Computational Intelligence and Neuroscience, 2022(4):1-11.