[1]LIU Wanjun,ZHAO Siqi,QU Haicheng,et al.Combining foreground feature reinforcement and region mask self-attention for fine-grained image classification[J].CAAI Transactions on Intelligent Systems,2022,17(6):1134-1144.[doi:10.11992/tis.202109029]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785/CN 23-1538/TP]
Volume: 17
Issue: 2022(6)
Pages: 1134-1144
Column: Academic Papers - Machine Perception and Pattern Recognition
Publication date: 2022-11-05
- Title:
Combining foreground feature reinforcement and region mask self-attention for fine-grained image classification
- Author(s):
LIU Wanjun; ZHAO Siqi; QU Haicheng; WANG Yuping
- Affiliation:
School of Software, Liaoning Technical University, Huludao 125105, China
- Keywords:
fine-grained image classification; object localization; region-based mask; self-attention; diverse feature; feature reinforcement; residual network; deep learning
- CLC:
TP391.4
- DOI:
10.11992/tis.202109029
- Abstract:
This study presents a foreground feature reinforcement and region mask self-attention method for fine-grained image classification, addressing the difficulty of extracting the subtle features that distinguish subordinate classes under interference from irrelevant background noise. ResNet50 is first used to extract global features of the input image. A foreground feature reinforcement module then predicts the position coordinates of the foreground object in the input image and enhances its features while eliminating background interference, effectively highlighting the foreground object. Finally, a region mask self-attention network guides the feature-enhanced foreground object to learn rich and diverse fine-grained information that distinguishes it from other subclasses. A multi-branch loss function constrains the network's feature learning throughout the process. Comprehensive experiments show that the approach outperforms other mainstream methods on the CUB-200-2011, Stanford Cars, and FGVC-Aircraft datasets, achieving classification accuracies of 88.0%, 95.3%, and 93.6%, respectively.
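The abstract names a "region mask self-attention" component that restricts attention to the predicted foreground region. The paper's actual implementation is not given here, so the following is only a minimal, hypothetical pure-Python sketch of the general idea: single-head dot-product self-attention over spatial positions in which background positions (mask value 0) are excluded from the attention pool. All function names and the pass-through convention for background positions are illustrative assumptions, not the authors' design.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def masked_self_attention(features, mask):
    """Illustrative region-masked self-attention (not the paper's code).

    features: list of N equal-length feature vectors (one per spatial position)
    mask:     list of N ints, 1 = foreground position, 0 = background
    Foreground positions attend only to other foreground positions;
    background positions are passed through unchanged (one possible choice).
    """
    n = len(features)
    dim = len(features[0])
    scale = math.sqrt(dim)  # standard dot-product attention scaling
    out = []
    for i in range(n):
        if mask[i] == 0:
            out.append(list(features[i]))  # background: identity
            continue
        # Scaled dot-product scores against every foreground position
        scores, idxs = [], []
        for j in range(n):
            if mask[j] == 1:
                s = sum(a * b for a, b in zip(features[i], features[j])) / scale
                scores.append(s)
                idxs.append(j)
        weights = softmax(scores)
        # Attended output: convex combination of foreground features
        attended = [0.0] * dim
        for w, j in zip(weights, idxs):
            for d in range(dim):
                attended[d] += w * features[j][d]
        out.append(attended)
    return out
```

Because the attention weights form a convex combination, each foreground output stays inside the convex hull of the foreground feature vectors, while masked background positions contribute nothing to any attended output.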