XIAO Yuhan, LIN Huiping, WANG Quanbin, et al. An algorithm for aspect-based sentiment analysis based on dual features attention-over-attention[J]. CAAI Transactions on Intelligent Systems, 2021, 16(1): 142-151. [doi: 10.11992/tis.202012024]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785 / CN 23-1538/TP]
Volume: 16
Issue: 2021, No. 1
Pages: 142-151
Column: Wu Wenjun Artificial Intelligence Science and Technology Award Forum
Publication date: 2021-01-05
- Title: An algorithm for aspect-based sentiment analysis based on dual features attention-over-attention
- Author(s): XIAO Yuhan1, LIN Huiping1, WANG Quanbin2, TAN Ying2
  1. School of Software and Microelectronics, Peking University, Beijing 102600, China;
  2. School of Electronics Engineering and Computer Science, Peking University, Beijing 100871, China
- Keywords: sentiment analysis; aspect; attention-over-attention; BERT pretrained model; global feature; local feature; deep learning; machine learning
- CLC number: TP391
- DOI: 10.11992/tis.202012024
- Abstract: Aspect-based sentiment analysis is of great significance for making full use of product reviews to analyze potential user needs. Current methods, however, tend to ignore the importance of local features centered on the aspect term and fail to effectively reduce the negative noise introduced by interfering sentiment terms. To address these problems, this article proposes a dual features attention-over-attention model with BERT (bi-directional encoder representations from transformers), named DFAOA-BERT. For the first time, the AOA (attention-over-attention) mechanism is combined with the BERT pretrained model, and global and local feature extractors are designed to fully capture the effective semantic association between the aspect term and its context. Experimental results show that DFAOA-BERT performs well on three public datasets: the restaurant and laptop review datasets from SemEval 2014 Task 4 and the ACL-14 Twitter social review dataset. Submodule effectiveness experiments further confirm that each component of DFAOA-BERT contributes substantially to its overall performance.
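A minimal sketch of the attention-over-attention (AOA) computation summarized in the abstract, written with PyTorch. The tensor shapes, the function name attention_over_attention, and the random tensors standing in for real BERT hidden states are illustrative assumptions only; they do not reproduce the paper's full global and local feature extractors.

    import torch


    def attention_over_attention(context: torch.Tensor, aspect: torch.Tensor) -> torch.Tensor:
        """context: (n, d) hidden states of the sentence; aspect: (m, d) hidden states
        of the aspect term. Returns a (d,) sentence representation weighted by AOA."""
        # Pairwise interaction between every context token and every aspect token.
        interaction = context @ aspect.T              # (n, m)
        # Column-wise softmax: attention of context tokens to each aspect token.
        alpha = torch.softmax(interaction, dim=0)     # (n, m)
        # Row-wise softmax: attention of each context token over the aspect tokens.
        beta = torch.softmax(interaction, dim=1)      # (n, m)
        # Average the aspect-level attention over context positions.
        beta_mean = beta.mean(dim=0, keepdim=True)    # (1, m)
        # Attention-over-attention: weight alpha by the averaged beta.
        gamma = alpha @ beta_mean.T                   # (n, 1)
        # Final representation: attention-weighted sum of context hidden states.
        return (context * gamma).sum(dim=0)           # (d,)


    if __name__ == "__main__":
        torch.manual_seed(0)
        n_ctx, n_asp, hidden = 12, 2, 768             # 768 = BERT-base hidden size
        sent = attention_over_attention(torch.randn(n_ctx, hidden),
                                        torch.randn(n_asp, hidden))
        print(sent.shape)                             # torch.Size([768])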
Memo
Received: 2020-12-15.
Foundation items: National Key Research and Development Program of China (2018AAA0102301, 2018AAA0100302, 2018YFB1702900); National Natural Science Foundation of China (62076010).
About the authors: XIAO Yuhan, master's student, whose main research interests are deep learning, data mining, and natural language processing. LIN Huiping, PhD, associate professor, whose main research interests are big data analytics and enterprise information services; has led and participated in several projects under the National 863 Program, the National Natural Science Foundation of China, and the National Key Research and Development Program, and has published more than 20 academic papers. TAN Ying, professor and doctoral supervisor, whose main research interests are intelligence science, computational and swarm intelligence, machine learning, artificial neural networks, swarm robotics, and big data mining; inventor of the Fireworks Algorithm and recipient of the third prize (innovation category) of the Wu Wenjun Artificial Intelligence Science and Technology Award; has published more than 330 academic papers and 12 academic monographs.
Corresponding author: TAN Ying. E-mail: ytan@pku.edu.cn
Last update: 2021-02-25