[1]陈培,景丽萍.融合语义信息的矩阵分解词向量学习模型[J].智能系统学报,2017,12(5):661-667.[doi:10.11992/tis.201706012]
 CHEN Pei,JING Liping.Word representation learning model using matrix factorization to incorporate semantic information[J].CAAI Transactions on Intelligent Systems,2017,12(5):661-667.[doi:10.11992/tis.201706012]

融合语义信息的矩阵分解词向量学习模型
Word representation learning model using matrix factorization to incorporate semantic information

参考文献/References:
[1] TURIAN J, RATINOV L, BENGIO Y. Word representations:a simple and general method for semi-supervised learning[C]//Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics. Uppsala, Sweden, 2010:384-394.
[2] LIU Y, LIU Z, CHUA T S, et al. Topical word embeddings[C]//Proceedings of the 29th AAAI Conference on Artificial Intelligence. Austin, Texas, USA, 2015:2418-2424.
[3] MAAS A L, DALY R E, PHAM P T, et al. Learning word vectors for sentiment analysis[C]//Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics. Portland, Oregon, USA, 2011:142-150.
[4] DHILLON P, FOSTER D P, UNGAR L H. Multi-view learning of word embeddings via CCA[C]//Advances in Neural Information Processing Systems. Granada, Spain, 2011:199-207.
[5] BANSAL M, GIMPEL K, LIVESCU K. Tailoring continuous word representations for dependency parsing[C]//Meeting of the Association for Computational Linguistics. Baltimore, Maryland, USA, 2014:809-815.
[6] HUANG E H, SOCHER R, MANNING C D, et al. Improving word representations via global context and multiple word prototypes[C]//Meeting of the Association for Computational Linguistics. Jeju Island, Korea, 2012:873-882.
[7] MNIH A, HINTON G. Three new graphical models for statistical language modelling[C]//Proceedings of the 24th International Conference on Machine Learning. New York, USA, 2007:641-648.
[8] MNIH A, HINTON G. A scalable hierarchical distributed language model[C]//Advances in Neural Information Processing Systems. Vancouver, Canada, 2008:1081-1088.
[9] BENGIO Y, DUCHARME R, VINCENT P, et al. A neural probabilistic language model[J]. Journal of machine learning research, 2003, 3(02):1137-1155.
[10] COLLOBERT R, WESTON J, BOTTOU L, et al. Natural language processing (almost) from scratch[J]. Journal of machine learning research, 2011, 12(8):2493-2537.
[11] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[C]//International Conference on Learning Representations. Scottsdale, USA, 2013.
[12] BIAN J, GAO B, LIU T Y. Knowledge-powered deep learning for word embedding[C]//Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Berlin, Germany, 2014:132-148.
[13] LI Y, XU L, TIAN F, et al. Word embedding revisited:a new representation learning and explicit matrix factorization perspective[C]//International Joint Conference on Artificial Intelligence. Buenos Aires, Argentina, 2015:3650-3656.
[14] LEVY O, GOLDBERG Y. Neural word embedding as implicit matrix factorization[C]//Advances in Neural Information Processing Systems. Montreal, Quebec, Canada, 2014:2177-2185.
[15] PENNINGTON J, SOCHER R, MANNING C. GloVe:global vectors for word representation[C]//Conference on Empirical Methods in Natural Language Processing. Doha, Qatar, 2014:1532-1543.
[16] BIAN J, GAO B, LIU T Y. Knowledge-powered deep learning for word embedding[C]//Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Berlin, Germany, 2014:132-148.
[17] XU C, BAI Y, BIAN J, et al. RC-NET:a general framework for incorporating knowledge into word representations[C]//Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management. Shanghai, China, 2014:1219-1228.
[18] YU M, DREDZE M. Improving lexical embeddings with semantic knowledge[C]//Meeting of the Association for Computational Linguistics. Baltimore, Maryland, USA, 2014:545-550.
[19] LIU Q, JIANG H, WEI S, et al. Learning semantic word embeddings based on ordinal knowledge constraints[C]//The 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference of the Asian Federation of Natural Language Processing. Beijing, China, 2015:1501-1511.
[20] FARUQUI M, DODGE J, JAUHAR S K, et al. Retrofitting word vectors to semantic lexicons[C]//The 2015 Conference of the North American Chapter of the Association for Computational Linguistics. Colorado, USA, 2015:1606-1615.
[21] LEE D D, SEUNG H S. Algorithms for non-negative matrix factorization[C]//Advances in Neural Information Processing Systems. Vancouver, Canada, 2001:556-562.
[22] MNIH A, SALAKHUTDINOV R. Probabilistic matrix factorization[C]//Advances in Neural Information Processing Systems. Vancouver, Canada, 2008:1257-1264.
[23] SREBRO N, RENNIE J D M, JAAKKOLA T. Maximum-margin matrix factorization[C]//Advances in Neural Information Processing Systems. Vancouver, Canada, 2004:1329-1336.
[24] LUONG T, SOCHER R, MANNING C D. Better word representations with recursive neural networks for morphology[C]//Seventeenth Conference on Computational Natural Language Learning. Sofia, Bulgaria,2013:104-113.
[25] FINKELSTEIN L, GABRILOVICH E, MATIAS Y, et al. Placing search in context:the concept revisited[J]. ACM transactions on information systems, 2002, 20(1):116-131.
相似文献/References:
[1]李 蕾,周延泉,钟义信.基于语用的自然语言处理研究与应用初探[J].智能系统学报,2006,1(2):1.
 LI Lei,ZHOU Yan-quan,ZHONG Yi-xin.Pragmatic Information Based NLP Research and Application[J].CAAI Transactions on Intelligent Systems,2006,1(2):1.
[2]李德毅.AI——人类社会发展的加速器[J].智能系统学报,2017,12(5):583.[doi:10.11992/tis.201710016]
 LI Deyi.Artificial intelligence:an accelerator for the development of human society[J].CAAI Transactions on Intelligent Systems,2017,12(5):583.[doi:10.11992/tis.201710016]
[3]张森,张晨,林培光,等.基于用户查询日志的网络搜索主题分析[J].智能系统学报,2017,12(5):668.[doi:10.11992/tis.201706096]
 ZHANG Sen,ZHANG Chen,LIN Peiguang,et al.Web search topic analysis based on user search query logs[J].CAAI Transactions on Intelligent Systems,2017,12(5):668.[doi:10.11992/tis.201706096]
[4]曲昭伟,吴春叶,王晓茹.半监督自训练的方面提取[J].智能系统学报,2019,14(4):635.[doi:10.11992/tis.201806006]
 QU Zhaowei,WU Chunye,WANG Xiaoru.Aspects extraction based on semi-supervised self-training[J].CAAI Transactions on Intelligent Systems,2019,14(4):635.[doi:10.11992/tis.201806006]
[5]张潇鲲,刘琰,陈静.引入外部词向量的文本信息网络表示学习[J].智能系统学报,2019,14(5):1056.[doi:10.11992/tis.201809037]
 ZHANG Xiaokun,LIU Yan,CHEN Jing.Representation learning using network embedding based on external word vectors[J].CAAI Transactions on Intelligent Systems,2019,14(5):1056.[doi:10.11992/tis.201809037]
[6]王一成,万福成,马宁.融合多层次特征的中文语义角色标注[J].智能系统学报,2020,15(1):107.[doi:10.11992/tis.201910012]
 WANG Yicheng,WAN Fucheng,MA Ning.Chinese semantic role labeling with multi-level linguistic features[J].CAAI Transactions on Intelligent Systems,2020,15(1):107.[doi:10.11992/tis.201910012]
[7]曾碧卿,韩旭丽,王盛玉,等.层次化双注意力神经网络模型的情感分析研究[J].智能系统学报,2020,15(3):460.[doi:10.11992/tis.201812017]
 ZENG Biqing,HAN Xuli,WANG Shengyu,et al.Hierarchical double-attention neural networks for sentiment classification[J].CAAI Transactions on Intelligent Systems,2020,15(3):460.[doi:10.11992/tis.201812017]
[8]毛明毅,吴晨,钟义信,等.加入自注意力机制的BERT命名实体识别模型[J].智能系统学报,2020,15(4):772.[doi:10.11992/tis.202003003]
 MAO Mingyi,WU Chen,ZHONG Yixin,et al.BERT named entity recognition model with self-attention mechanism[J].CAAI Transactions on Intelligent Systems,2020,15(4):772.[doi:10.11992/tis.202003003]
[9]胡康,何思宇,左敏,等.基于CNN-BLSTM的化妆品违法违规行为分类模型[J].智能系统学报,2021,16(6):1151.[doi:10.11992/tis.202104001]
 HU Kang,HE Siyu,ZUO Min,et al.Classification model for judging illegal and irregular behavior for cosmetics based on CNN-BLSTM[J].CAAI Transactions on Intelligent Systems,2021,16(6):1151.[doi:10.11992/tis.202104001]
[10]喻波,王志海,孙亚东,等.非结构化文档敏感数据识别与异常行为分析[J].智能系统学报,2021,16(5):932.[doi:10.11992/tis.202104028]
 YU Bo,WANG Zhihai,SUN Yadong,et al.Unstructured document sensitive data identification and abnormal behavior analysis[J].CAAI Transactions on Intelligent Systems,2021,16(5):932.[doi:10.11992/tis.202104028]

备注/Memo

Received: 2017-06-06.
Foundation item: National Natural Science Foundation of China (61370129, 61375062, 61632004).
About the authors: CHEN Pei, female, born in 1990, master's student. Her main research interests are natural language processing and sentiment analysis. JING Liping, female, born in 1978, professor, PhD. Her main research interests are data mining, text mining, bioinformatics, and enterprise intelligence.
Corresponding author: JING Liping. E-mail: lpjing@bjtu.edu.cn.

更新日期/Last Update: 2017-10-25
Copyright © Editorial Office of CAAI Transactions on Intelligent Systems (《智能系统学报》)
Address: Building 145-1, Nantong Street, Nangang District, Harbin 150001, Heilongjiang Province, China. Tel: 0451-82534001, 82518134