Journal of Xidian University ›› 2024, Vol. 51 ›› Issue (6): 117-131. doi:10.19665/j.issn1001-2400.20240311
ZHAO Congjian†, JIAO Yiyuan†, LI Yanni
Corresponding author: LI Yanni (1962—), female, professor. E-mail: yannili@mail.xidian.edu.cn
About the authors: ZHAO Congjian† (1997—), male, M.S. E-mail: zhaocj951@gmail.com; † denotes co-first authors
Supported by:
Received: 2023-12-20
Online: 2024-10-25
Published: 2024-10-25
Abstract: Sentence-level entity relation extraction (hereafter, relation extraction) refers to extracting the semantic relation between a pair of entities from a given sentence. It is an important foundation for applications in artificial intelligence such as knowledge graph construction, natural language processing, intelligent question answering, and Web search, and it remains one of the frontier fundamental research challenges in AI. With the successful application of deep neural networks in many fields, a variety of relation extraction algorithms based on deep neural network models have emerged. In recent years, driven by the need to continually process and understand text, deep continual relation extraction algorithms that combine relation extraction with continual learning have begun to appear. Such algorithms enable a model to perform relation extraction over a sequence of tasks continually and efficiently without forgetting the knowledge of previously learned tasks. This paper presents an in-depth analysis and survey of existing representative deep relation extraction and continual relation extraction methods in terms of their deep network models, algorithmic frameworks, and performance characteristics, and points out research trends, with the aim of stimulating further research on relation extraction.
CLC number:
ZHAO Congjian, JIAO Yiyuan, LI Yanni. Overview of deep sentence-level entity relation extraction[J]. Journal of Xidian University, 2024, 51(6): 117-131.
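As the abstract notes, sentence-level relation extraction reduces to classifying the relation between a marked entity pair in one sentence. The snippet below is a minimal sketch of this formulation in the spirit of the marker-based pre-trained-language-model (PTL) methods surveyed here (e.g., R-BERT); the model name, marker tokens, pooling scheme, and the 19-class label space (the size used by SemEval-2010 Task 8) are illustrative assumptions, not the exact design of any one surveyed algorithm.

```python
# Minimal sketch: sentence-level RE as classification over a marked entity
# pair. Model name, marker tokens, and pooling are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class MarkerREClassifier(nn.Module):
    def __init__(self, model_name="bert-base-uncased", num_relations=19):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Classify from the concatenated entity start-marker representations.
        self.classifier = nn.Linear(2 * hidden, num_relations)

    def forward(self, input_ids, attention_mask, e1_pos, e2_pos):
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        idx = torch.arange(h.size(0), device=h.device)
        e1, e2 = h[idx, e1_pos], h[idx, e2_pos]   # (B, H) marker states
        return self.classifier(torch.cat([e1, e2], dim=-1))

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
tok.add_special_tokens({"additional_special_tokens":
                        ["<e1>", "</e1>", "<e2>", "</e2>"]})
model = MarkerREClassifier()
model.encoder.resize_token_embeddings(len(tok))  # account for new markers

batch = tok("<e1> Bill Gates </e1> founded <e2> Microsoft </e2> .",
            return_tensors="pt")
e1_pos = (batch.input_ids[0] == tok.convert_tokens_to_ids("<e1>")).nonzero()[0]
e2_pos = (batch.input_ids[0] == tok.convert_tokens_to_ids("<e2>")).nonzero()[0]
logits = model(batch.input_ids, batch.attention_mask, e1_pos, e2_pos)  # (1, 19)
```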
Table 1  Characteristics of representative RE methods

| Category | Algorithm | Backbone model | Loss function | Classification performance |
| --- | --- | --- | --- | --- |
| CNN/RNN-based | DepLCNN | CNN | Cross-entropy loss | Low |
| CNN/RNN-based | BRCNN | CNN | Bidirectional cross-entropy loss | Low |
| CNN/RNN-based | ATBLSTM | LSTM | Cross-entropy loss | Low |
| CNN/RNN-based | REC | BiLSTM | Cross-entropy loss | Medium |
| GCN/GNN-based | GraphCache | BERT | Cross-entropy loss | Medium |
| GCN/GNN-based | ERECKG | BERT | Dual-objective cross-entropy and triplet loss | Medium |
| GCN/GNN-based | RE2 | BERT | Cross-entropy loss | High |
| GCN/GNN-based | TS-GCN | BERT | Cross-entropy loss | High |
| PTL-based | BERTEM | BERT | Optimized cross-entropy loss | High |
| PTL-based | R-BERT | BERT | Cross-entropy loss | High |
| PTL-based | CP | BERT | Dual-objective triplet and masked language modeling loss | High |
| Knowledge-based | Student-R | BiLSTM | Dual-objective top-score-margin and knowledge distillation loss | High |
| Knowledge-based | RECON | BiLSTM+CNN | Cross-entropy loss | High |
| Knowledge-based | KnowPrompt | BERT | Dual-objective implicit structured-constraint and mask loss | High |
| Knowledge-based | PTCAS | BERT | Dual-objective knowledge structured-constraint and masked language modeling loss | High |
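Several entries in Table 1 (e.g., ERECKG and CP) optimize a dual objective that adds a metric-learning term to the usual cross-entropy. The sketch below shows the generic shape of such a loss, assuming a triplet margin term; the weighting factor `alpha` and the margin are illustrative, not values from any cited paper.

```python
# Generic "dual-objective" loss: cross-entropy on relation logits plus a
# triplet term over sentence embeddings. alpha and margin are assumptions.
import torch
import torch.nn.functional as F

def dual_objective_loss(logits, labels, anchor, positive, negative,
                        margin=1.0, alpha=0.5):
    ce = F.cross_entropy(logits, labels)            # relation classification
    tri = F.triplet_margin_loss(anchor, positive, negative,
                                margin=margin)      # pull same-relation pairs closer
    return ce + alpha * tri

# Toy usage with random tensors (B sentences, C relations, H-dim embeddings).
B, C, H = 4, 19, 256
loss = dual_objective_loss(torch.randn(B, C), torch.randint(0, C, (B,)),
                           torch.randn(B, H), torch.randn(B, H),
                           torch.randn(B, H))
```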
Table 2  Characteristics of existing CRE methods

| Algorithm | Backbone model | Main mechanism for resisting catastrophic forgetting (CF) in continual learning (CL) | Forgetting resistance | Classification performance |
| --- | --- | --- | --- | --- |
| EA-EMR | BiLSTM | Alignment of the embedding/representation spaces of different tasks | Low | Low |
| MLLRE | BiLSTM | Meta-learning mechanism | Low | Low |
| EMAR | BiLSTM | Relation prototypes + replay | Medium | Low |
| ELRE | BiLSTM | Regularization + replay | Medium | Low |
| RP-CRE | BERT | Refining prototype sample representations + replay | Medium | Medium |
| CML | BiLSTM | Curriculum learning + meta-learning | Medium | Medium |
| CEMR | BiLSTM | Effective sample selection and replacement + replay | High | Medium |
| CRL | BERT | Contrastive learning + knowledge distillation | High | High |
| CRECL | BERT | Classification network + prototypical contrastive network | High | High |
| KIP-Frame | Prompt | Attention-based refinement of prototypes + replay | High | High |
| ACA | BERT | Data augmentation + replay | High | High |
| MRMR | BiLSTM | Three-stage learning mechanism + replay | High | High |
| CRL-protoAug | BERT | Consistent representation learning and prototype augmentation + replay | High | High |
| ConPL | BERT | Consistent sample and prototype distributions + replay | High | High |
| SCKD | BERT | Serial knowledge distillation and contrastive learning + replay | High | High |
| CDec | BERT | Classifier decomposition + replay | High | High |
| CEAR | BERT | Memory-insensitive prototypes and memory augmentation + replay | High | High |
| FDCRE | BERT | Class-wise regularization + replay | High | High |
| BFP | BERT | Backward projection + long short-term memory + replay | High | High |
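Almost every method in Table 2 combats catastrophic forgetting with some combination of replay and relation prototypes. The following sketch illustrates that common skeleton: a small episodic memory keeps a few exemplar embeddings per old relation, replays them alongside new-task batches, and exposes class prototypes (mean embeddings). The exemplar-selection policy here (first-K) is a deliberate simplification of the selection strategies used by the surveyed methods.

```python
# Minimal sketch of episodic-memory replay with relation prototypes for CRE.
# Memory size and first-K selection are simplifying assumptions.
import torch

class EpisodicMemory:
    def __init__(self, per_relation=5):
        self.per_relation = per_relation
        self.store = {}            # relation id -> list of exemplar embeddings

    def add_task(self, feats, labels):
        # Keep the first K exemplars of each newly learned relation.
        for f, y in zip(feats, labels.tolist()):
            self.store.setdefault(y, [])
            if len(self.store[y]) < self.per_relation:
                self.store[y].append(f.detach())

    def replay_batch(self):
        # All stored exemplars as one batch, to mix into new-task training.
        if not self.store:
            return None, None
        feats = torch.stack([f for v in self.store.values() for f in v])
        labels = torch.tensor([y for y, v in self.store.items() for _ in v])
        return feats, labels

    def prototypes(self):
        # Class prototype = mean exemplar embedding per relation.
        return {y: torch.stack(v).mean(0) for y, v in self.store.items()}

mem = EpisodicMemory(per_relation=2)
mem.add_task(torch.randn(6, 8), torch.tensor([0, 0, 0, 1, 1, 1]))
feats, labels = mem.replay_batch()   # (4, 8) exemplars for rehearsal
protos = mem.prototypes()            # {0: (8,), 1: (8,)} mean embeddings
```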
[1] SHI H, WEI J, CAI X, et al. Research on Threat Intelligence Extraction and Knowledge Graph Construction Technology[J]. Journal of Xidian University, 2023, 50(4):65-75. (in Chinese)
[2] XIE J, LIU R, CHEN X. A Review of Research on the Trends and Prospects of Artificial Intelligence Ethics Based on Knowledge Graphs[J]. Journal of Xidian University(Social Science Edition), 2023, 33(1):53-73. (in Chinese)
[3] ZHANG X, GAO X, TIAN C. Complex Text Detection Based on Polygonal Feature Pooling and Fusion[J]. Journal of Xidian University, 2024, 51(3):113-123. (in Chinese)
[4] ZENG D, LIU K, LAI S, et al. Relation Classification via Convolutional Deep Neural Network[C]// Proceedings of the 25th International Conference on Computational Linguistics(COLING 2014). Dublin:ACL, 2014:2335-2344.
[5] WANG H, XIONG W, YU M, et al. Sentence Embedding Alignment for Lifelong Relation Extraction[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics(NAACL 2019). Minneapolis:ACL, 2019:1-11.
[6] THRUN S, MITCHELL T M. Lifelong Robot Learning[J]. Robotics and Autonomous Systems, 1995, 15(1-2):25-46.
[7] CHEN Z, LIU B. Lifelong Machine Learning[M]. Morgan & Claypool Publishers, 2018.
[8] WANG H, LU G, YIN J, et al. Relation Extraction:A Brief Survey on Deep Neural Network Based Methods[C]// Proceedings of the 4th International Conference on Software Engineering and Information Management(ICSIM 2021). New York:ACM, 2021:220-228.
[9] PAWAR S, PALSHIKAR G K, BHATTACHARYYA P. Relation Extraction:A Survey(2017)[J/OL]. [2017-12-14]. https://arxiv.org/abs/1712.05191.
[10] MCCLOSKEY M, COHEN N J. Catastrophic Interference in Connectionist Networks:The Sequential Learning Problem[J]. Psychology of Learning and Motivation, 1989, 24:109-165.
[11] RATCLIFF R. Connectionist Models of Recognition Memory:Constraints Imposed by Learning and Forgetting Functions[J]. Psychological Review, 1990, 97(2):285.
[12] MASANA M, LIU X, TWARDOWSKI B, et al. Class-Incremental Learning:Survey and Performance Evaluation on Image Classification[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 45(5):5513-5533.
[13] DE LANGE M, ALJUNDI R, MASANA M, et al. A Continual Learning Survey:Defying Forgetting in Classification Tasks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(7):3366-3385.
[14] WANG L, ZHANG X, SU H, et al. A Comprehensive Survey of Continual Learning:Theory, Method and Application[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2024, 46(8):5362-5383.
[15] LI Z, HOIEM D. Learning without Forgetting[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, 40(12):2935-2947.
[16] QIN Q, HU W, PENG H, et al. BNS:Building Network Structures Dynamically for Continual Learning[C]// Advances in Neural Information Processing Systems(NeurIPS 2021). Virtual:Curran Associates, Inc., 2021:20608-20620.
[17] KIRKPATRICK J, PASCANU R, RABINOWITZ N, et al. Overcoming Catastrophic Forgetting in Neural Networks[J]. Proceedings of the National Academy of Sciences, 2017, 114(13):3521-3526.
[18] AHN H, CHA S, LEE D, et al. Uncertainty Based Continual Learning with Adaptive Regularization[C]// Advances in Neural Information Processing Systems(NeurIPS 2019). Vancouver:Curran Associates, Inc., 2019:4394-4404.
[19] REBUFFI S, KOLESNIKOV A, SPERL G, et al. iCaRL:Incremental Classifier and Representation Learning[C]// Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition(CVPR 2017). Honolulu:IEEE Computer Society, 2017:5533-5542.
[20] LOPEZ-PAZ D, RANZATO M. Gradient Episodic Memory for Continual Learning[C]// Advances in Neural Information Processing Systems(NeurIPS 2017). Long Beach:Curran Associates, Inc., 2017:6467-6476.
[21] ZENG G, CHEN Y, CUI B, et al. Continual Learning of Context-Dependent Processing in Neural Networks[J]. Nature Machine Intelligence, 2019, 1(8):364-372.
[22] SAHA G, GARG I, ROY K. Gradient Projection Memory for Continual Learning[C]// Proceedings of the 9th International Conference on Learning Representations(ICLR 2021). Virtual Event:OpenReview.net, 2021:1-18.
[23] LIU H, LIU H. Continual Learning with Recursive Gradient Optimization[C]// Proceedings of the 10th International Conference on Learning Representations(ICLR 2022). Virtual Event:OpenReview.net, 2022:1-18.
[24] KE Z, LIU B, HUANG X. Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks[C]// Advances in Neural Information Processing Systems(NeurIPS 2020). Virtual:Curran Associates, Inc., 2020:18493-18504.
[25] WORTSMAN M, RAMANUJAN V, LIU R, et al. Supermasks in Superposition[C]// Advances in Neural Information Processing Systems(NeurIPS 2020). Virtual:Curran Associates, Inc., 2020:15173-15184.
[26] SERRA J, SURIS D, MIRON M, et al. Overcoming Catastrophic Forgetting with Hard Attention to the Task[C]// Proceedings of the 35th International Conference on Machine Learning(ICML 2018). Stockholm:PMLR, 2018:4548-4557.
[27] WANG L, ZHANG X, LI Q, et al. Incorporating Neuro-Inspired Adaptability for Continual Learning in Artificial Intelligence[J]. Nature Machine Intelligence, 2023, 5(12):1356-1368.
[28] WANG L, ZHANG X, LI Q, et al. CoSCL:Cooperation of Small Continual Learners is Stronger than a Big One[C]// European Conference on Computer Vision(ECCV 2022). Tel Aviv:Springer, 2022:254-271.
[29] KE Z, LIU B, MA N, et al. Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning[C]// Advances in Neural Information Processing Systems(NeurIPS 2021). Virtual:Curran Associates, Inc., 2021:22443-22456.
[30] LIN S, YANG L, FAN D, et al. TRGP:Trust Region Gradient Projection for Continual Learning[C]// The Tenth International Conference on Learning Representations(ICLR 2022). Virtual Event:OpenReview.net, 2022:1-15.
[31] DOS SANTOS C, XIANG B, ZHOU B. Classifying Relations by Ranking with Convolutional Neural Networks[C]// Joint Conference of the Annual Meeting of the Association for Computational Linguistics and the International Joint Conference on Natural Language Processing(ACL 2015). Beijing:ACL, 2015:626-634.
[32] WANG L, CAO Z, DE MELO G, et al. Relation Classification via Multi-Level Attention CNNs[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics(ACL 2016). Berlin:ACL, 2016:1298-1307.
[33] XIAO M, LIU C. Semantic Relation Classification via Hierarchical Recurrent Neural Network with Attention[C]// Proceedings of the 26th International Conference on Computational Linguistics(COLING 2016). Osaka:ACL, 2016:1254-1263.
[34] ZHOU P, SHI W, TIAN J, et al. Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics(ACL 2016). Berlin:ACL, 2016:207-212.
[35] XU K, FENG Y, HUANG S, et al. Semantic Relation Classification via Convolutional Neural Networks with Simple Negative Sampling[C]// Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing(EMNLP 2015). Lisbon:ACL, 2015:536-540.
[36] CAI R, ZHANG X, WANG H. Bidirectional Recurrent Convolutional Neural Network for Relation Classification[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics(ACL 2016). Berlin:ACL, 2016:756-765.
[37] ZHANG R, MENG F, ZHOU Y, et al. Relation Classification via Recurrent Neural Network with Attention and Tensor Layers[J]. Big Data Mining and Analytics, 2018, 1(3):234-244.
[38] LIU H, WANG P, WU F, et al. REET:Joint Relation Extraction and Entity Typing via Multi-Task Learning[C]// Natural Language Processing and Chinese Computing:8th CCF International Conference(NLPCC 2019). Dunhuang:Springer, 2019:327-339.
[39] HUANG Z, XU W, YU K. Bidirectional LSTM-CRF Models for Sequence Tagging(2015)[J/OL]. [2015-08-09]. https://arxiv.org/abs/1508.01991.
[40] ZHANG Y. Relation Extraction in Chinese Using Attention-Based Bidirectional Long Short-Term Memory Networks[J]. PeerJ Computer Science, 2023, 9:e1509.
[41] LIU D, ZHANG Y, LI Z. A Survey of Graph Neural Network Methods for Relation Extraction[C]// 2022 IEEE 10th Joint International Information Technology and Artificial Intelligence Conference(ITAIC). Chongqing:IEEE, 2022:2209-2223.
[42] ZHU H, LIN Y, LIU Z, et al. Graph Neural Networks with Generated Parameters for Relation Extraction[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics(ACL 2019). Florence:ACL, 2019:1331-1339.
[43] ZHAO Y, WAN H, GAO J, et al. Improving Relation Classification by Entity Pair Graph[C]// Asian Conference on Machine Learning(ACML 2019). Nagoya:PMLR, 2019:1156-1171.
[44] MANDYA A, BOLLEGALA D, COENEN F. Contextualized Graph Attention for Improved Relation Extraction(2020)[J/OL]. [2020-04-22]. https://arxiv.org/abs/2004.10624.
[45] SUN Z, ZHU Y, TANG J, et al. Improve Relation Extraction with Dual Attention-Guided Graph Convolutional Networks[J]. Neural Computing & Applications, 2021, 33(6):1773-1784.
[46] MAHENDRAN D, TANG C, MCINNES B T. Graph Convolutional Networks for Chemical Relation Extraction[C]// Companion Proceedings of the Web Conference(WWW 2022). Virtual Event:ACM, 2022:833-842.
[47] WANG Y, CHEN M, ZHOU W, et al. GraphCache:Message Passing as Caching for Sentence-Level Relation Extraction[C]// Findings of the Association for Computational Linguistics(NAACL 2022). Seattle:ACL, 2022:1698-1708.
[48] ZHANG Y, QI P, MANNING C D. Graph Convolution over Pruned Dependency Trees Improves Relation Extraction[C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing(EMNLP 2018). Brussels:ACL, 2018:2205-2215.
[49] DEVLIN J, CHANG M W, LEE K, et al. BERT:Pre-Training of Deep Bidirectional Transformers for Language Understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics(NAACL-HLT 2019). Minneapolis:ACL, 2019:4171-4186.
[50] ZHANG D, LIU Z, JIA W, et al. Dual Attention Graph Convolutional Network for Relation Extraction[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 36(2):530-543.
[51] DONG Y, XU X. Weighted-Dependency with Attention-Based Graph Convolutional Network for Relation Extraction[J]. Neural Processing Letters, 2023, 55(9):12121-12142.
[52] HU X, HONG Z, ZHANG C, et al. Think Rationally about What You See:Continuous Rationale Extraction for Relation Extraction[C]// Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval(SIGIR 2023). Taipei:ACM, 2023:2436-2440.
[53] WANG Z, YANG Y, MA J. Two-Stage Graph Convolutional Networks for Relation Extraction[C]// International Conference on Neural Information Processing(ICONIP 2023). Changsha:Springer, 2023:483-494.
[54] HU D, JIAO Y, LI Y. Novel and Efficient Algorithm for Entity Relation Extraction with the Corpus Knowledge Graph[J]. Journal of Xidian University, 2021, 48(6):75-83. (in Chinese)
[55] SOARES L B, FITZGERALD N, LING J, et al. Matching the Blanks:Distributional Similarity for Relation Learning[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics(ACL 2019). Florence:ACL, 2019:2895-2905.
[56] WU S, HE Y. Enriching Pre-Trained Language Model with Entity Information for Relation Classification[C]// Proceedings of the 28th ACM International Conference on Information and Knowledge Management(CIKM 2019). Beijing:ACM, 2019:2361-2364.
[57] PENG H, GAO T, HAN X, et al. Learning from Context or Names? An Empirical Study on Neural Relation Extraction[C]// Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing(EMNLP 2020). Online:ACL, 2020:3661-3672.
[58] MTUMBUKA F, SCHOCKAERT S. Entity or Relation Embeddings? An Analysis of Encoding Strategies for Relation Extraction(2023)[J/OL]. https://arxiv.org/abs/2312.11062.
[59] WEI Z, SU J, WANG Y, et al. A Novel Cascade Binary Tagging Framework for Relational Triple Extraction[C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics(ACL 2020). Online:ACL, 2020:1476-1488.
[60] ZHANG Z, SHU X, YU B, et al. Distilling Knowledge from Well-Informed Soft Labels for Neural Relation Extraction[C]// Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence(AAAI 2020). New York:AAAI Press, 2020:9620-9627.
[61] NADGERI A, BASTOS A, SINGH K, et al. KGPool:Dynamic Knowledge Graph Context Selection for Relation Extraction[C]// Findings of the Association for Computational Linguistics(ACL 2021). Online Event:ACL, 2021:535-548.
[62] BASTOS A, NADGERI A, SINGH K, et al. RECON:Relation Extraction Using Knowledge Graph Context in a Graph Neural Network[C]// Proceedings of the ACM Web Conference(WWW 2021). Virtual Event:ACM, 2021:1673-1685.
[63] CHEN X, ZHANG N, XIE X, et al. KnowPrompt:Knowledge-Aware Prompt-Tuning with Synergistic Optimization for Relation Extraction[C]// Proceedings of the ACM Web Conference(WWW 2022). Virtual Event:ACM, 2022:2778-2788.
[64] WANG K, CHEN Y, WEN K, et al. Cue Prompt Adapting Model for Relation Extraction[J]. Connection Science, 2023, 35(1):2161478.
[65] CHEN Y, SHI B, XU K. PTCAS:Prompt Tuning with Continuous Answer Search for Relation Extraction[J]. Information Sciences, 2024, 659:120060.
[66] DE MASSON D'AUTUME C, RUDER S, KONG L, et al. Episodic Memory in Lifelong Language Learning[C]// Advances in Neural Information Processing Systems(NeurIPS 2019). Vancouver:Curran Associates, Inc., 2019:32.
[67] OBAMUYIDE A, VLACHOS A. Meta-Learning Improves Lifelong Relation Extraction[C]// Proceedings of the 4th Workshop on Representation Learning for NLP(RepL4NLP 2019). Florence:ACL, 2019:224-229.
[68] HAN X, DAI Y, GAO T, et al. Continual Relation Learning via Episodic Memory Activation and Reconsolidation[C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics(ACL 2020). Online:ACL, 2020:6429-6440.
[69] SHEN H, JU S, SUN J, et al. Efficient Lifelong Relation Extraction with Dynamic Regularization[C]// Natural Language Processing and Chinese Computing:9th CCF International Conference(NLPCC 2020). Zhengzhou:Springer, 2020:181-192.
[70] CUI L, YANG D, YU J, et al. Refining Sample Embeddings with Relation Prototypes to Enhance Continual Relation Extraction[C]// Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing(ACL-IJCNLP 2021). Virtual Event:ACL, 2021:232-243.
[71] WU T, LI X, LI Y, et al. Curriculum-Meta Learning for Order-Robust Continual Relation Extraction[C]// Proceedings of the AAAI Conference on Artificial Intelligence(AAAI 2021). Virtual Event:AAAI Press, 2021:10363-10369.
[72] CHEN Y, WEN Y, ZHANG H. Cost-Effective Memory Replay for Continual Relation Extraction[C]// Web Information Systems and Applications:18th International Conference(WISA 2021). Kaifeng:Springer, 2021:335-346.
[73] ZHAO K, XU H, YANG J, et al. Consistent Representation Learning for Continual Relation Extraction[C]// Findings of the Association for Computational Linguistics(ACL 2022). Dublin:ACL, 2022:3402-3411.
[74] HU C, YANG D, JIN H, et al. Improving Continual Relation Extraction through Prototypical Contrastive Learning[C]// Proceedings of the 29th International Conference on Computational Linguistics(COLING 2022). Gyeongju:ACL, 2022:1885-1895.
[75] ZHANG H, LIANG B, YANG M, et al. Prompt-Based Prototypical Framework for Continual Relation Extraction[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2022, 30:2801-2813.
[76] EFEOGLU S. A Continual Relation Extraction Approach for Knowledge Graph Completeness(2024)[J/OL]. [2024-04-20]. https://arxiv.org/abs/2404.17593.
[77] HE K, MAO R, GONG T, et al. JCBIE:A Joint Continual Learning Neural Network for Biomedical Information Extraction[J]. BMC Bioinformatics, 2022, 23(1):1-20.
[78] SUN Y, WANG S, LI Y, et al. ERNIE 2.0:A Continual Pre-Training Framework for Language Understanding[C]// Proceedings of the AAAI Conference on Artificial Intelligence(AAAI 2020). New York:AAAI Press, 2020:8968-8975.
[79] KIM G, XIAO C, KONISHI T, et al. Learnability and Algorithm for Continual Learning[C]// International Conference on Machine Learning(ICML 2023). Honolulu:PMLR, 2023:16877-16896.
[80] CHEN Q, SUN J, PALADE V, et al. Continual Relation Extraction via Linear Mode Connectivity and Interval Cross Training[J]. Knowledge-Based Systems, 2023:110288.
[81] WANG P, SONG Y, LIU T, et al. Learning Robust Representations for Continual Relation Extraction via Adversarial Class Augmentation[C]// Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing(EMNLP 2022). Abu Dhabi:ACL, 2022:6264-6278.
[82] THI Q T P, PHAM A C, NGO N H, et al. Memory-Based Method Using Prototype Augmentation for Continual Relation Extraction[C]// RIVF International Conference on Computing and Communication Technologies(RIVF 2022). Ho Chi Minh City:IEEE, 2022:1-6.
[83] CHEN X, WU H, SHI X. Consistent Prototype Learning for Few-Shot Continual Relation Extraction[C]// Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics(ACL 2023). Toronto:ACL, 2023:7409-7422.
[84] WANG X, WANG Z, HU W. Serial Contrastive Knowledge Distillation for Continual Few-Shot Relation Extraction[C]// Findings of the Association for Computational Linguistics(ACL 2023). Toronto:ACL, 2023:12693-12706.
[85] XIA H, WANG P, LIU T, et al. Enhancing Continual Relation Extraction via Classifier Decomposition[C]// Findings of the Association for Computational Linguistics(ACL 2023). Toronto:ACL, 2023:10053-10062.
[86] ZHAO W, CUI Y, HU W. Improving Continual Relation Extraction by Distinguishing Analogous Semantics[C]// Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics(ACL 2023). Toronto:ACL, 2023:1162-1175.
[87] NGUYEN H, NGUYEN C, NGO L, et al. A Spectral Viewpoint on Continual Relation Extraction[C]// Findings of the Association for Computational Linguistics(EMNLP 2023). Singapore:ACL, 2023:9621-9629.
[88] LUO J, KONG W, CHEN L, et al. Improving Continual Relation Extraction with LSTM and Back Forward Projection[C]// 20th International Computer Conference on Wavelet Active Media Technology and Information Processing(ICCWAMTIP 2023). Chengdu:IEEE, 2023:1-5.
[89] LIN H, SHAO Y, QIAN W, et al. Class Incremental Learning via Likelihood Ratio Based Task Prediction[C]// The Twelfth International Conference on Learning Representations(ICLR 2024). Vienna:OpenReview.net, 2024:1-40.