[1]张琳,刘明童,张玉洁,等.探索低资源的迭代式复述生成增强方法[J].智能系统学报,2022,17(4):680-687.[doi:10.11992/tis.202106032]
 ZHANG Lin,LIU Mingtong,ZHANG Yujie,et al.Explore the low-resource iterative paraphrase generation enhancement method[J].CAAI Transactions on Intelligent Systems,2022,17(4):680-687.[doi:10.11992/tis.202106032]

探索低资源的迭代式复述生成增强方法
Explore the low-resource iterative paraphrase generation enhancement method

参考文献/References:
[1] BARZILAY R, MCKEOWN K R. Extracting paraphrases from a parallel corpus[C]//ACL ’01: Proceedings of the 39th Annual Meeting on Association for Computational Linguistics. New York: ACM, 2001: 50–57.
[2] CHO K, VAN MERRIENBOER B, GULCEHRE C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2014: 1724–1734.
[3] YIN Jun, JIANG Xin, LU Zhengdong, et al. Neural generative question answering[C]//IJCAI’16: Proceedings of the 25th International Joint Conference on Artificial Intelligence. New York: ACM, 2016: 2972–2978.
[4] ZHANG Chi, SAH S, NGUYEN T, et al. Semantic sentence embeddings for paraphrasing and text summarization[C]//2017 IEEE Global Conference on Signal and Information Processing. Montreal: IEEE, 2017: 705–709.
[5] GUPTA R, ORĂSAN C, ZAMPIERI M, et al. Improving translation memory matching and retrieval using paraphrases[J]. Machine translation, 2016, 30(1/2): 19–40.
[6] SAHAY S, OKUR E, HAKIM N, et al. Semi-supervised interactive intent labeling[C]//Proceedings of the Second Workshop on Data Science with Human in the Loop: Language Advances. Stroudsburg: Association for Computational Linguistics, 2021: 31–40.
[7] WEI J, ZOU Kai. EDA: easy data augmentation techniques for boosting performance on text classification tasks[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2019: 6382–6388.
[8] SHAKEEL M H, KARIM A, KHAN I. A multi-cascaded model with data augmentation for enhanced paraphrase detection in short texts[J]. Information processing & management, 2020, 57(3): 102204.
[9] MCKEOWN K. Paraphrasing questions using given and new information[J]. American journal of computational linguistics, 1983, 9(1): 1–10.
[10] 刘圆圆, 王忠建. 基于模板的对几种特殊结构句子的语句改写[J]. 现代电子技术, 2009, 32(3): 157–159,166
LIU Yuanyuan, WANG Zhongjian. Paraphrasing of several special sentence structure based on templates[J]. Modern electronics technique, 2009, 32(3): 157–159,166
[11] 胡金铭, 史晓东, 苏劲松, 等. 引入复述技术的统计机器翻译研究综述[J]. 智能系统学报, 2013, 8(3): 199–207
HU Jinming, SHI Xiaodong, SU Jinsong, et al. A survey of statistical machine translation using paraphrasing technology[J]. CAAI transactions on intelligent systems, 2013, 8(3): 199–207
[12] SERAJ R. Paraphrases for statistical machine translation[D]. Burnaby Campus: Simon Fraser University, 2015.
[13] KUMAR A, BHATTAMISHRA S, BHANDARI M, et al. Submodular optimization-based diverse paraphrasing and its effectiveness in data augmentation[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2019: 3609–3619.
[14] KOBAYASHI S. Contextual augmentation: data augmentation by words with paradigmatic relations[EB/OL]. New York: arXiv, 2018. (2018-05-16)[2021-06-19]. https://arxiv.org/abs/1805.06201.
[15] WIETING J, GIMPEL K. ParaNMT-50M: pushing the limits of paraphrastic sentence embeddings with millions of machine translations[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2018: 451–462.
[16] IYYER M, WIETING J, GIMPEL K, et al. Adversarial example generation with syntactically controlled paraphrase networks[EB/OL]. New York: arXiv, 2018. (2018-04-17)[2021-06-19]. https://arxiv.org/abs/1804.06059.
[17] CHENG Yong, JIANG Lu, MACHEREY W, et al. AdvAug: robust adversarial augmentation for neural machine translation[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2020: 5961–5970.
[18] ABDULMUMIN I, GALADANCI B S, ALIYU G. Tag-less back-translation[J]. Machine translation, 2021, 35(4): 519–549.
[19] 薛佳奇, 杨凡. 基于交叉熵与困惑度的LDA-SVM主题研究[J]. 智能计算机与应用, 2019, 9(4): 45–50
XUE Jiaqi, YANG Fan. Research on LDA-SVM subject based on cross entropy and perplexity[J]. Intelligent computer and applications, 2019, 9(4): 45–50
[20] REIMERS N, GUREVYCH I. Sentence-BERT: sentence embeddings using siamese BERT-networks[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2019: 671–688.
[21] PAPINENI K, ROUKOS S, WARD T, et al. BLEU: a method for automatic evaluation of machine translation[C]//ACL ’02: Proceedings of the 40th Annual Meeting on Association for Computational Linguistics. New York: ACM, 2002: 311–318.
[22] MENG F, LU Z, LI H, et al. Interactive attention for neural machine translation[C]//Proceedings of the 26th International Conference on Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2016: 2174–2185.
[23] GU Jiatao, LU Zhengdong, LI Hang, et al. Incorporating copying mechanism in sequence-to-sequence learning[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2016: 1631–1640.
[24] TU Zhaopeng, LU Zhengdong, LIU Yang, et al. Modeling coverage for neural machine translation[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2016: 76–85.
[25] PRAKASH A, HASAN S, LEE K, et al. Neural paraphrase generation with stacked residual LSTM networks[C]//Proceedings of the 26th International Conference on Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2016: 2923–2934.
[26] LI Zichao, JIANG Xin, SHANG Lifeng, et al. Decomposable neural paraphrase generation[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2019: 3403–3414.
[27] CHEN Wenqing, TIAN Jidong, XIAO Liqiang, et al. A semantically consistent and syntactically variational encoder-decoder framework for paraphrase generation[C]//Proceedings of the 28th International Conference on Computational Linguistics. Stroudsburg: International Committee on Computational Linguistics, 2020: 1186–1198.
[28] LIN C Y. ROUGE: a package for automatic evaluation of summaries[C]//Proceedings of the Workshop on Text Summarization Branches Out. Barcelona: ACL, 2004: 74–81.
[29] LAVIE A, AGARWAL A. Meteor: an automatic metric for MT evaluation with high levels of correlation with human judgments[C]//StatMT ’07: Proceedings of the Second Workshop on Statistical Machine Translation. New York: ACM, 2007: 228–231.

备注/Memo

Received: 2021-06-23.
Foundation item: National Natural Science Foundation of China (61876198, 61976015, 61976016).
About the authors: ZHANG Lin, master's student; main research interests include paraphrase generation and machine translation. LIU Mingtong, Ph.D.; main research interests include dependency parsing, sentence matching, paraphrase generation, machine translation, and natural language processing. ZHANG Yujie, professor; main research interests include machine translation, multilingual information processing, syntactic parsing, and natural language processing; author of more than 30 academic papers.
Corresponding author: ZHANG Yujie. E-mail: yjzhang@bjtu.edu.cn.
