[1]JIANG Yunliang,YIN Zezong,ZHANG Xiongtao,et al.TSK fuzzy distillation classifier with negative Euclidean probability and High-order fuzzy dark knowledge transfer and its application on EEG signals classification[J].CAAI Transactions on Intelligent Systems,2024,19(6):1419-1427.[doi:10.11992/tis.202307029]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785/CN 23-1538/TP]
Volume: 19
Issue: 2024(6)
Pages: 1419-1427
Column: Academic Papers - Machine Perception and Pattern Recognition
Publication date: 2024-12-05
- Title:
TSK fuzzy distillation classifier with negative Euclidean probability and high-order fuzzy dark knowledge transfer and its application on EEG signals classification
- Author(s):
JIANG Yunliang 1,2,3; YIN Zezong 1,2; ZHANG Xiongtao 1,2; SHEN Qing 1,2; LI Hua 2,4
1. School of Information Engineering, Huzhou University, Huzhou 313000, China;
2. Zhejiang Province Key Laboratory of Smart Management and Application of Modern Agricultural Resources, Huzhou 313000, China;
3. School of Computer Science and Technology
-
- Keywords:
TSK fuzzy classifier; knowledge distillation; high-order fuzzy dark knowledge; electroencephalogram (EEG); least learning machine; epilepsy; motor imagery; fuzzy system
- CLC:
TP181
- DOI:
10.11992/tis.202307029
- Abstract:
In the classification and detection of electroencephalogram (EEG) signals, the low-order Takagi-Sugeno-Kang (TSK) fuzzy classifier runs faster but performs poorly, while the high-order TSK fuzzy classifier demonstrates strong prediction performance. However, the extremely complex fuzzy rules in the consequent part notably slow the model down. Therefore, this study proposes a novel TSK fuzzy distillation classifier, STSK-LLM-KD, based on negative Euclidean probability and high-order fuzzy dark knowledge transfer. First, the least learning machine based on knowledge distillation (LLM-KD) is used to quickly solve the consequent parameters of the teacher model and obtain the corresponding negative Euclidean probabilities, which generate soft labels. Then, the high-order fuzzy dark knowledge of the teacher model is extracted by calculating the Kullback-Leibler divergence between the soft labels of the teacher and student models, and is transferred to the low-order student model. This approach raises the performance of the model beyond that of the high-order TSK fuzzy classifier while maintaining a faster training speed. Experimental results on the motor imagery EEG dataset and the Hauz Khas (New Delhi) epilepsy EEG dataset fully verify the advantages of the proposed STSK-LLM-KD. Compared with other fuzzy classifiers, STSK-LLM-KD performs better; compared with deep knowledge distillation models, STSK-LLM-KD more effectively improves the performance of the student model.
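The two ingredients named in the abstract, soft labels from negative Euclidean "probabilities" and dark-knowledge transfer via Kullback-Leibler divergence, can be illustrated with a minimal NumPy sketch. This is not the paper's STSK-LLM-KD implementation: the class centers, temperature `T`, and function names below are illustrative assumptions, and the actual method solves TSK consequent parameters with LLM-KD.

```python
import numpy as np

def negative_euclidean_soft_labels(outputs, class_centers, T=2.0):
    """Soft labels via softmax over negative Euclidean distances.

    Illustrative sketch: each sample's logit for class c is the negative
    distance to an assumed class center, divided by a temperature T.
    """
    # dists[i, c] = ||outputs[i] - class_centers[c]||_2  (via broadcasting)
    dists = np.linalg.norm(outputs[:, None, :] - class_centers[None, :, :], axis=-1)
    logits = -dists / T                              # negative Euclidean logits
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    exp = np.exp(logits)
    return exp / exp.sum(axis=1, keepdims=True)      # rows sum to 1

def kl_distillation_loss(teacher_soft, student_soft, eps=1e-12):
    """Mean KL(teacher || student): the 'dark knowledge' transfer term."""
    t = np.clip(teacher_soft, eps, 1.0)
    s = np.clip(student_soft, eps, 1.0)
    return float(np.mean(np.sum(t * (np.log(t) - np.log(s)), axis=1)))
```

Minimizing this KL term pulls the low-order student's soft labels toward the high-order teacher's, so the student inherits the teacher's inter-class similarity structure rather than only the hard labels.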