ZHANG Jingyu, XU Xinying, XIE Gang, et al. Continuous classification of garbage based on the elastic weight consolidation and knowledge distillation[J]. CAAI Transactions on Intelligent Systems, 2023, 18(4): 878-885. [doi:10.11992/tis.202211023]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785/CN 23-1538/TP]
Volume:
18
Issue:
2023, No. 4
Pages:
878-885
Column:
Wu Wenjun Artificial Intelligence Science and Technology Award Forum
Publication date:
2023-07-15
- Title:
-
Continuous classification of garbage based on the elastic weight consolidation and knowledge distillation
- Author(s):
-
ZHANG Jingyu1, XU Xinying1, XIE Gang1, LIU Huaping2
-
1. College of Electrical and Power Engineering, Taiyuan University of Technology, Taiyuan 030024, China;
2. Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China
-
- Keywords:
-
domestic garbage; image classification; continuous learning; deep learning; knowledge distillation; regularization; temperature coefficient; generalization capability
- CLC number:
-
TP391
- DOI:
-
10.11992/tis.202211023
- Abstract:
-
Current garbage classification methods focus on classifying common domestic garbage into a fixed set of classes and therefore cannot meet the dynamic, continuous classification requirements that arise as the number of garbage classes grows. To solve this problem, this paper proposes a continuous garbage classification method based on elastic weight consolidation and knowledge distillation (EWC-KD). The method enhances the memory ability of the model through an EWC regularization loss function and a distillation loss function: the EWC regularization loss limits the update range of parameters that are important to previous tasks, and the distillation loss with a temperature coefficient enhances the generalization ability of the model by preserving the class information carried in the class labels. Experiments on five garbage classification tasks show that the proposed method outperforms the comparison methods; it maintains high classification accuracy and low backward transfer values on all tasks and thus strengthens the continuous classification ability of a garbage classification system.
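The abstract names two standard ingredients: a Fisher-weighted EWC quadratic penalty that restricts updates to important parameters, and a temperature-scaled distillation loss. The paper defines the exact formulation; the sketch below only illustrates the two terms in their commonly used textbook forms (function names and the `lam`/`T` hyperparameters are illustrative, not taken from the paper).

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam):
    """EWC regularization: Fisher-weighted quadratic penalty that
    discourages changing parameters important for previous tasks."""
    return 0.5 * lam * float(np.sum(fisher * (params - old_params) ** 2))

def softmax(z):
    # Numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T):
    """Temperature-scaled distillation: cross-entropy between the softened
    teacher and student distributions, scaled by T^2 so gradient magnitudes
    stay comparable across temperatures."""
    p = softmax(teacher_logits / T)  # soft targets from the old model
    q = softmax(student_logits / T)  # soft predictions of the new model
    return float(-(p * np.log(q + 1e-12)).sum(axis=-1).mean() * T * T)
```

In a continual-learning loop these two terms would typically be added to the ordinary cross-entropy loss on the new task's data; a larger `T` softens the teacher distribution and exposes more inter-class information, which is the generalization effect the abstract refers to.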
Memo
Received: 2022-11-16.
Funding: Scientific Research Project for Returned Overseas Scholars of Shanxi Province (2021-046); Natural Science Foundation of Shanxi Province (202103021224056); Shanxi Province Special Project for Science and Technology Cooperation and Exchange (202104041101030).
Author biographies: ZHANG Jingyu, master's student; his main research interests are deep learning and image processing. XU Xinying, professor, visiting scholar at San Jose State University, USA; standing member of the Science Popularization Working Committee and member of the Cognitive Systems and Information Processing Committee of the Chinese Association for Artificial Intelligence (CAAI); his main research interests are artificial intelligence, visual perception, and intelligent control; he has led more than 20 major national, provincial/ministerial, and enterprise projects and published more than 80 academic papers. LIU Huaping, professor, doctoral supervisor, national-level talent, director of CAAI and deputy director of its Cognitive Systems and Information Processing Committee, recipient of the National Science Fund for Distinguished Young Scholars; his main research interests are robot perception, learning, and control; he received the first prize of the Wu Wenjun Award for Scientific and Technological Progress, has led two key projects of the National Natural Science Foundation of China, and has published more than 300 academic papers.
Corresponding author: LIU Huaping. E-mail: hpliu@tsinghua.edu.cn