[1] YU Hao, ZHANG Jie, WU Minghui, et al. A framework for rapid construction and application of domain knowledge graphs[J]. CAAI Transactions on Intelligent Systems, 2021, 16(5): 871-884. [doi:10.11992/tis.202103024]

A framework for rapid construction and application of domain knowledge graphs

References:
[1] WU Minghui, WU Xindong. On big wisdom[C]//Proceedings of the IEEE International Conference on Data Mining. Singapore, Singapore, 2018:1-2.
[2] NADEAU D, SEKINE S. A survey of named entity recognition and classification[J]. Lingvisticae investigationes, 2007, 30(1): 3-26.
[3] GRISHMAN R, SUNDHEIM B. Message understanding conference-6: a brief history[C]//Proceedings of the 16th Conference on Computational Linguistics. Copenhagen, Denmark, 1996:466-471.
[4] RABINER L R. A tutorial on hidden Markov models and selected applications in speech recognition[J]. Proceedings of the IEEE, 1989, 77(2): 257-286.
[5] SEKINE S, GRISHMAN R, SHINNOU H. A decision tree method for finding and classifying names in Japanese texts[C]//Proceedings of the 6th Workshop on Very Large Corpora. Montreal, Quebec, Canada, 1998: 171-178.
[6] BORTHWICK A, STERLING J, AGICHTEIN E, et al. NYU: description of the MENE named entity system as used in MUC-7[C]//Proceedings of the 7th Message Understanding Conference. Washington, DC, USA, 1998.
[7] ASAHARA M, MATSUMOTO Y. Japanese named entity extraction with redundant morphological analysis[C]//Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology. Edmonton, Canada, 2003:8-15.
[8] LAFFERTY J D, MCCALLUM A, PEREIRA F C N. Conditional random fields: probabilistic models for segmenting and labeling sequence data[C]//Proceedings of the Eighteenth International Conference on Machine Learning. San Francisco, CA, United States, 2001: 282-289.
[9] LAMPLE G, BALLESTEROS M, SUBRAMANIAN S, et al. Neural architectures for named entity recognition[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. San Diego, California, USA, 2016: 260-270.
[10] ZHANG Yue, YANG Jie. Chinese NER using lattice LSTM[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne, Australia, 2018: 1554-1564.
[11] DEVLIN J, CHANG Mingwei, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis, Minnesota, USA, 2019: 4171-4186.
[12] ZHANG Zhengyan, HAN Xu, LIU Zhiyuan, et al. ERNIE: enhanced language representation with informative entities[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence, Italy, 2019: 1441-1451.
[13] LI X, YAN H, QIU X, et al. FLAT: Chinese NER using flat-lattice transformer[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Seattle, Washington, USA, 2020: 6836-6842.
[14] YE Z X, LING Z H. Distant supervision relation extraction with intra-bag and inter-bag attentions[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis, Minnesota, USA, 2019: 2810-2819.
[15] QIN Pengda, XU Weiran, WANG W Y. Robust distant supervision relation extraction via deep reinforcement learning[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne, Australia, 2018: 2137-2147.
[16] ALT C, HÜBNER M, HENNIG L. Fine-tuning pre-trained transformer language models to distantly supervised relation extraction[J]. arXiv preprint arXiv:1906.08646, 2019.
[17] YE Zhixiu, LING Zhenhua. Multi-level matching and aggregation network for few-shot relation classification[J]. arXiv preprint arXiv:1906.06678, 2019.
[18] SNELL J, SWERSKY K, ZEMEL R. Prototypical networks for few-shot learning[C]//Proceedings of the 31st Conference on Neural Information Processing Systems. Long Beach, CA, USA, 2017: 4077-4087.
[19] SHEN Wei, WANG Jianyong, HAN Jiawei. Entity linking with a knowledge base: issues, techniques, and solutions[J]. IEEE transactions on knowledge and data engineering, 2015, 27(2): 443-460.
[20] SEVGILI O, SHELMANOV A, ARKHIPOV M, et al. Neural entity linking: a survey of models based on deep learning[J]. arXiv preprint arXiv:2006.00575, 2021.
[21] GU Yingjie, QU Xiaoye, WANG Zhefeng, et al. Read, retrospect, select: an MRC framework to short text entity linking[J]. arXiv preprint arXiv:2101.02394, 2021.
[22] WU L, PETRONI F, JOSIFOSKI M, et al. Scalable zero-shot entity linking with dense entity retrieval[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. Punta Cana, Dominican Republic, 2020: 6397-6407.
[23] LIN Yankai, LIU Zhiyuan, SUN Maosong, et al. Learning entity and relation embeddings for knowledge graph completion[C]//Proceedings of the 29th AAAI Conference on Artificial Intelligence. Austin, Texas, 2015: 2181-2187.
[24] SOCHER R, CHEN Danqi, MANNING C D, et al. Reasoning with neural tensor networks for knowledge base completion[C]//Proceedings of the 26th International Conference on Neural Information Processing Systems. Lake Tahoe, Nevada, 2013: 926-934.
[25] WANG Zhen, ZHANG Jianwen, FENG Jianlin, et al. Knowledge graph embedding by translating on hyperplanes[C]//Proceedings of the 28th AAAI Conference on Artificial Intelligence. Quebec, Canada, 2014: 1112-1119.
[26] SHI B X, WENINGER T. Open-world knowledge graph completion[C]//Proceedings of the 32nd AAAI Conference on Artificial Intelligence. New Orleans, Louisiana, USA, 2018: 1957-1964.
[27] LAO Ni, COHEN W W. Relational retrieval using a combination of path-constrained random walks[J]. Machine learning, 2010, 81(1): 53-67.
[28] GARDNER M, TALUKDAR P, KRISHNAMURTHY J, et al. Incorporating vector space similarity in random walk inference over knowledge bases[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, Qatar, 2014: 397-406.
[29] XIONG Wenhan, HOANG T, WANG W Y. DeepPath: a reinforcement learning method for knowledge graph reasoning[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen, Denmark, 2017: 564-573.
[30] XIONG Wenhan, YU Mo, CHANG Shiyu, et al. One-shot relational learning for knowledge graphs[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, Belgium, 2018: 1980-1990.
[31] YIH W T, HE Xiaodong, MEEK C. Semantic parsing for single-relation question answering[C]//Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore, Maryland, 2014: 643-648.
[32] REDDY S, LAPATA M, STEEDMAN M. Large-scale semantic parsing without question-answer pairs[J]. Transactions of the association for computational linguistics, 2014, 2: 377-392.
[33] XU Kun, WU Lingfei, WANG Zhiguo, et al. Exploiting rich syntactic information for semantic parsing with graph-to-sequence model[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, Belgium, 2018: 918-924.
[34] BORDES A, USUNIER N, CHOPRA S, et al. Large-scale simple question answering with memory networks[J]. arXiv preprint arXiv:1506.02075, 2015.
[35] DONG Li, WEI Furu, ZHOU Ming, et al. Question answering over freebase with multi-column convolutional neural networks[C]//Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. Beijing, China, 2015: 260-269.
[36] ZHANG Yuanzhe, LIU Kang, HE Shizhu, et al. Question answering over knowledge base with neural attention combining global knowledge information[J]. arXiv preprint arXiv:1606.00979, 2016.
[37] ZHANG Yuyu, DAI Hanjun, KOZAREVA Z, et al. Variational reasoning for question answering with knowledge graph[C]//Proceedings of the 32nd AAAI Conference on Artificial Intelligence. New Orleans, Louisiana, USA, 2018: 1-13.
[38] LIN X V, SOCHER R, XIONG Caiming. Multi-hop knowledge graph reasoning with reward shaping[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, Belgium, 2018: 3243-3253.

Memo

Received date: 2021-03-15.
Foundation items: National Key Research and Development Program of China (2016YFB1000901); National Natural Science Foundation of China (91746209); Innovative Research Team Program of the Ministry of Education (IRT17R3); National Development and Reform Commission Project (20190404165100454).
Biographies: YU Hao, Ph.D. candidate; his research interests include machine learning, knowledge graphs, and natural language processing. ZHANG Jie, Ph.D. candidate; his research interests include knowledge engineering and natural language processing. WU Xindong, professor and doctoral supervisor; his research interests include data mining, big data analytics, and knowledge engineering. He is Chief Scientist of Mininglamp Technology and Dean of the Mininglamp Academy of Sciences, Director of the Key Laboratory of Knowledge Engineering with Big Data (Hefei University of Technology) of the Ministry of Education, head of the National New Generation Artificial Intelligence Open Innovation Platform for Marketing Intelligence, and chief scientist of the National Key Research and Development Program project "Basic Theory and Applications of Big Data Knowledge Engineering".
Corresponding author: WU Xindong. E-mail: wuxindong@mininglamp.com

Copyright © Editorial Office of CAAI Transactions on Intelligent Systems
Address: Building 145-1, Nantong Street, Nangang District, Harbin 150001, Heilongjiang Province, China. Tel: 0451-82534001, 82518134. E-mail: tis@vip.sina.com