[1] LIN Sunqi, XU Jiameng, ZHENG Yujie, et al. An asymmetric bimodal fusion method for lightweight palm print and palm vein recognition network[J]. CAAI Transactions on Intelligent Systems, 2024, 19(5): 1190-1198. [doi:10.11992/tis.202212031]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785 / CN 23-1538/TP]
Volume: 19
Issue: 2024(5)
Pages: 1190-1198
Column: Academic Papers - Machine Perception and Pattern Recognition
Publication date: 2024-09-05
- Title:
-
An asymmetric bimodal fusion method for lightweight palm print and palm vein recognition network
- Author(s):
-
LIN Sunqi1, XU Jiameng2, ZHENG Yujie1, WANG Chong1,2, WANG Jun2
-
1. Faculty of Electrical Engineering and Computer Science, Ningbo University, Ningbo 315211, China;
2. School of Information and Control Engineering, China University of Mining and Technology, Xuzhou 221116, China
-
- Keywords:
-
deep learning; biometrics; palm print and vein recognition; multimodal network; knowledge distillation; model compression; convolutional neural network; class activation map
- CLC:
-
TP30
- DOI:
-
10.11992/tis.202212031
- Abstract:
-
Deep learning has been widely used in palm print and palm vein recognition. However, as application scenarios increasingly shift toward miniaturized terminal devices, current deep-learning models are often difficult to deploy on edge devices with limited computational power and memory. In this study, we propose a lightweight palm print and palm vein recognition network based on knowledge distillation. First, we select different network depths for the palm print and palm vein modalities according to the complexity of their feature extraction. We then introduce a novel modality feature loss function into the traditional knowledge distillation method to strengthen the teacher model's guidance of feature extraction for each modality. The experimental results demonstrate that this method effectively balances model size against performance and offers a viable solution for biometric recognition in edge computing environments.
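The abstract describes combining classic knowledge distillation with an added per-modality feature loss. The sketch below illustrates that general idea only; it is not the paper's implementation. The temperature `T`, the mean-squared-error form of the feature term, and the weights `alpha` and `beta` are all assumptions for illustration, using Hinton-style soft-target distillation with NumPy:

```python
import numpy as np

def softmax(z, T=1.0):
    # temperature-scaled softmax, numerically stabilized
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Hinton-style distillation: KL divergence between the teacher's and
    # student's temperature-softened class distributions
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)))

def modality_feature_loss(student_feat, teacher_feat):
    # hypothetical feature-guidance term: mean squared error between the
    # teacher's and student's feature maps for one modality (print or vein)
    s = np.asarray(student_feat, dtype=float)
    t = np.asarray(teacher_feat, dtype=float)
    return np.mean((s - t) ** 2)

def total_loss(s_logits, t_logits, s_feats, t_feats, alpha=0.5, beta=0.1):
    # combined objective: soft-target distillation plus a feature loss
    # summed over the two modalities; alpha and beta are assumed weights
    kd = kd_loss(s_logits, t_logits)
    feat = sum(modality_feature_loss(sf, tf)
               for sf, tf in zip(s_feats, t_feats))
    return alpha * kd + beta * feat
```

When the student matches the teacher exactly, both terms vanish; otherwise each term pulls the student's outputs and per-modality features toward the teacher's.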