[1] Sri S H B, Dutta S R. A survey on automatic text summarization techniques[C]. Kancheepuram: International Conference on Physics and Energy, 2021:121-135.
[2] 李金鹏, 张闯, 陈小军, 等. 自动文本摘要研究综述[J]. 计算机研究与发展, 2021, 58(1):1-21.
Li Jinpeng, Zhang Chuang, Chen Xiaojun, et al. Survey on automatic text summarization[J]. Journal of Computer Research and Development, 2021, 58(1):1-21.
[3] Luhn H P. The automatic creation of literature abstracts[J]. IBM Journal of Research and Development, 1958, 2(2):159-165. doi:10.1147/rd.22.0159
[4] Zhang J, Zhao Y, Saleh M, et al. PEGASUS: Pre-training with extracted gap-sentences for abstractive summarization[C]. Vienna: International Conference on Machine Learning, 2020:11328-11339.
[5] Puspitaningrum D. A survey of recent abstractive summarization techniques[C]. Singapore: The Sixth International Congress on Information and Communication Technology, 2022:783-801.
[6] Goodwin T R, Savery M E, Demner-Fushman D. Flight of the PEGASUS? Comparing transformers on few-shot and zero-shot multi-document abstractive summarization[C]. Barcelona: International Conference on Computational Linguistics, 2020:5640-5645.
[7] Yang T H, Lu C C, Hsu W L. More than extracting "Important" sentences: The application of PEGASUS[C]. Taichung: International Conference on Technologies and Applications of Artificial Intelligence, 2021:131-134.
[8] Yadav D, Lalit N, Kaushik R, et al. Qualitative analysis of text summarization techniques and its applications in health domain[J]. Computational Intelligence and Neuroscience, 2022, 20(2):1-14.
[9] Mathur A, Suchithra M. Application of abstractive summarization in multiple choice question generation[C]. Greater Noida: International Conference on Computational Intelligence and Sustainable Engineering Solutions, 2022:409-413.
[10] 李岱峰, 林凯欣, 李栩婷. 基于提示学习与T5 PEGASUS的图书宣传自动摘要生成器[J]. 数据分析与知识发现, 2023, 7(3):121-130.
Li Daifeng, Lin Kaixin, Li Xuting. A book promotion abstractive summarization method based on prompt learning and T5 PEGASUS[J]. Data Analysis and Knowledge Discovery, 2023, 7(3):121-130.
[11] Raffel C, Shazeer N, Roberts A, et al. Exploring the limits of transfer learning with a unified text-to-text transformer[J]. Journal of Machine Learning Research, 2020, 21(4):1-67.
[12] Xue L, Constant N, Roberts A, et al. mT5: A massively multilingual pre-trained text-to-text transformer[C]. Online: Conference of the North American Chapter of the Association for Computational Linguistics-Human Language Technologies, 2021:483-498.
[13] 施旭涛. 基于堆叠BiLSTM的中文自动文本摘要研究[D]. 昆明: 云南大学, 2019:1-21.
Shi Xutao. Research on automatic Chinese text summarization based on stacked BiLSTM[D]. Kunming: Yunnan University, 2019:1-21.
[14] 李辉, 王一丞. 基于CNNCIFG-Attention模型的文本情感分类[J]. 电子科技, 2022, 35(2):46-51.
Li Hui, Wang Yicheng. CNNCIFG-Attention model for text sentiment classification[J]. Electronic Science and Technology, 2022, 35(2):46-51.
[15] Hu B, Chen Q, Zhu F. LCSTS: A large scale Chinese short text summarization dataset[C]. Lisbon: Conference on Empirical Methods in Natural Language Processing, 2015:1967-1972.
[16] Lin C Y. ROUGE: A package for automatic evaluation of summaries[C]. Barcelona: The Workshop on Text Summarization Branches Out, 2004:74-81.
[17] 韩肖赟. 舆情分析的混合主题模型研究与应用[D]. 西安: 陕西科技大学, 2020:17-18.
Han Xiaoyun. Research and application of hybrid topic model for public opinion analysis[D]. Xi'an: Shaanxi University of Science & Technology, 2020:17-18.
[18] 李福鹏, 付东翔. 基于Transformer编码器的金融文本情感分析方法[J]. 电子科技, 2020, 33(9):10-15.
Li Fupeng, Fu Dongxiang. Sentiment analysis method of financial text based on transformer encoder[J]. Electronic Science and Technology, 2020, 33(9):10-15.