CHENG Yan,HU Jiansheng,ZHAO Songhua,et al.Aspect-level sentiment classification model combining Transformer and interactive attention network[J].CAAI Transactions on Intelligent Systems,2024,19(3):728-737.[doi:10.11992/tis.202303016]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785 / CN 23-1538/TP]
Volume: 19
Issue: 2024, No. 3
Pages: 728-737
Section: Academic Papers - Natural Language Processing and Understanding
Publication date: 2024-05-05
- Title:
Aspect-level sentiment classification model combining Transformer and interactive attention network
- Author(s):
CHENG Yan1,2, HU Jiansheng1, ZHAO Songhua1, LUO Pin1, ZOU Haifeng1, ZHAN Yongxin1, FU Yan3, LIU Chunlei4
1. School of Computer Information Engineering, Jiangxi Normal University, Nanchang 330022, China;
2. Key Laboratory of Intelligent Information Processing and Emotional Computing in Jiangxi Province, Nanchang 330022, China;
3. Jiangxi Ruanyun Technology Co., Ltd., Nanchang 330200, China;
4. Jiangxi Heyi Technology Co., Ltd., Nanchang 330200, China
- Keywords:
aspect term; sentiment classification; recurrent neural network; Transformer; interactive attention network; BERT; local feature; deep learning
- CLC number:
TP391
- DOI:
10.11992/tis.202303016
- Online publication date:
2023-11-13
- Abstract:
Most current research combines recurrent neural networks with attention mechanisms for aspect-level sentiment classification. However, recurrent neural networks cannot be computed in parallel, and such models suffer from truncated backpropagation, vanishing gradients, and exploding gradients during training. Moreover, traditional attention mechanisms may assign low attention weights to important sentiment words in a sentence. To address these problems, this paper proposes an aspect-level sentiment classification model that combines a Transformer with an interactive attention network. First, the pretrained BERT (bidirectional encoder representations from Transformers) model is used to construct word embedding vectors. Then, Transformer encoders encode the input sentences in parallel. Next, contextual dynamic masking and contextual dynamic weighting mechanisms focus on the local context information that is semantically most relevant to a specific aspect term. Finally, experimental results on five English datasets and four Chinese review datasets show that the proposed model achieves the best accuracy and F1 score.
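The abstract describes the contextual dynamic masking and weighting mechanisms only at a high level. Below is a minimal PyTorch sketch of how such local-context focusing is commonly implemented over the Transformer encoder output: hard masking zeros out states far from the aspect term, while dynamic weighting decays them with distance. The function names, the semantic-relative-distance threshold srd=3, and the linear decay schedule are illustrative assumptions, not details taken from the paper.

import torch

def relative_distance(seq_len: int, aspect_start: int, aspect_end: int) -> torch.Tensor:
    # Token-level distance to the aspect span [aspect_start, aspect_end);
    # positions inside the span get distance 0.
    pos = torch.arange(seq_len)
    left = (aspect_start - pos).clamp(min=0)       # tokens left of the aspect
    right = (pos - (aspect_end - 1)).clamp(min=0)  # tokens right of the aspect
    return (left + right).float()

def contextual_dynamic_mask(hidden, aspect_start, aspect_end, srd=3):
    # Hard variant: zero out encoder states whose distance to the aspect
    # exceeds `srd`, keeping only the local context of the aspect term.
    d = relative_distance(hidden.size(1), aspect_start, aspect_end)
    keep = (d <= srd).float().view(1, -1, 1)       # (1, seq_len, 1), broadcasts
    return hidden * keep

def contextual_dynamic_weight(hidden, aspect_start, aspect_end, srd=3):
    # Soft variant: instead of masking, linearly down-weight states
    # whose distance to the aspect exceeds `srd`.
    seq_len = hidden.size(1)
    d = relative_distance(seq_len, aspect_start, aspect_end)
    w = torch.where(d <= srd, torch.ones_like(d), 1.0 - (d - srd) / seq_len)
    return hidden * w.view(1, -1, 1)

# Toy usage: one sentence of 10 tokens encoded into 768-dim states
# (e.g., the output of a BERT/Transformer encoder), aspect at positions 4-5.
hidden = torch.randn(1, 10, 768)
local_masked = contextual_dynamic_mask(hidden, 4, 6)
local_weighted = contextual_dynamic_weight(hidden, 4, 6)
print(local_masked.shape, local_weighted.shape)

In the pipeline described by the abstract, the resulting locally focused states would then feed the interactive attention network; the sketch stops at the local-context step.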
Memo
Received: 2023-03-08.
Foundation items: National Natural Science Foundation of China (62167006, 61967011); Jiangxi Science and Technology Innovation Base Project: Key Laboratory of Intelligent Information Processing and Emotional Computing in Jiangxi Province (formerly the Jiangxi Provincial Key Laboratory of Intelligent Education) (20212BCD42001); Jiangxi Province 03 Special Project and 5G Project (20212ABC03A22); Jiangxi Province Training Program for Academic and Technical Leaders of Major Disciplines: Leading Talent Project (20213BCJL22047); Natural Science Foundation of Jiangxi Province (20212BAB202017).
Biographies: CHENG Yan, professor, doctoral supervisor; main research interests: artificial intelligence, intelligent information processing, sentiment analysis; author of more than 70 academic papers and 2 monographs. E-mail: chyan88888@jxnu.edu.cn. HU Jiansheng, master's student; main research interests: sentiment analysis, deep learning. E-mail: 1875560437@qq.com. ZHAO Songhua, master's student; main research interests: knowledge tracing, educational data mining. E-mail: 3140851935@qq.com.
Corresponding author: CHENG Yan. E-mail: chyan88888@jxnu.edu.cn