[1] LI Jianyu, ZHAN Zhihui. A self-supervised data-driven particle swarm optimization approach for large-scale feature selection[J]. CAAI Transactions on Intelligent Systems, 2023, 18(1): 194–206. [doi: 10.11992/tis.202206008]

A self-supervised data-driven particle swarm optimization approach for large-scale feature selection

References:
[1] XUE Bing, ZHANG Mengjie, BROWNE W N, et al. A survey on evolutionary computation approaches to feature selection[J]. IEEE transactions on evolutionary computation, 2016, 20(4): 606–626.
[2] ZHAN Zhihui, SHI Lin, TAN K C, et al. A survey on evolutionary computation for complex continuous optimization[J]. Artificial intelligence review, 2022, 55(1): 59–110.
[3] WANG Peng, XUE Bing, LIANG Jing, et al. Differential evolution based feature selection: a niching-based multi-objective approach[J]. IEEE transactions on evolutionary computation, PP(99): 1.
[4] ZHAN Zhihui, LI Jianyu, ZHANG Jun. Evolutionary deep learning: a survey[J]. Neurocomputing, 2022, 483: 42–58.
[5] CHENG Fan, CUI Junjie, WANG Qijun, et al. A variable granularity search based multi-objective feature selection algorithm for high-dimensional data classification[EB/OL].(2022–03–18)[2022–06–01].https://ieeexplore.ieee.org/abstract/document/9737335.
[6] LI Jianyu, ZHAN Zhihui, XU Jin, et al. Surrogate-assisted hybrid-model estimation of distribution algorithm for mixed-variable hyperparameters optimization in convolutional neural networks[EB/OL]. (2021–09–20)[2022–06–01].https://ieeexplore.ieee.org/document/9540902.
[7] SONG Xianfang, ZHANG Yong, GONG Dunwei, et al. Surrogate sample-assisted particle swarm optimization for feature selection on high-dimensional data[EB/OL]. (2022–05–18)[2022–06–01].https://ieeexplore.ieee.org/abstract/document/9775183.
[8] 李永豪, 胡亮, 高万夫. 基于稀疏系数矩阵重构的多标记特征选择[J]. 计算机学报, 2022, 45(9): 1827–1841.
LI Yonghao, HU Liang, GAO Wanfu. Multi-label feature selection based on sparse coefficient matrix reconstruction[J]. Chinese journal of computers, 2022, 45(9): 1827–1841.
[9] LI Junyu, CHEN Jiazhou, QI Fei, et al. Two-dimensional unsupervised feature selection via sparse feature filter[EB/OL]. (2022–04–11)[2022–06–01].https://ieeexplore.ieee.org/abstract/document/9754711.
[10] WANG Peng, XUE Bing, LIANG Jing, et al. Multiobjective differential evolution for feature selection in classification[EB/OL]. (2021–12–07) [2022–06–01].https://pubmed.ncbi.nlm.nih.gov/34874881.
[11] 陈彤, 陈秀宏. 特征自表达和图正则化的鲁棒无监督特征选择[J]. 智能系统学报, 2022, 17(2): 286–294.
CHEN Tong, CHEN Xiuhong. Feature self-representation and graph regularization for robust unsupervised feature selection[J]. CAAI transactions on intelligent systems, 2022, 17(2): 286–294.
[12] LI Xiaoping, WANG Yadi, RUIZ R. A survey on sparse learning models for feature selection[J]. IEEE transactions on cybernetics, 2022, 52(3): 1642–1660.
[13] 李顺勇, 王改变. 一种新的最大相关最小冗余特征选择算法[J]. 智能系统学报, 2021, 16(4): 649–661.
LI Shunyong, WANG Gaibian. New MRMR feature selection algorithm[J]. CAAI transactions on intelligent systems, 2021, 16(4): 649–661.
[14] LIU Shulei, WANG Handing, PENG Wei, et al. A surrogate-assisted evolutionary feature selection algorithm with parallel random grouping for high-dimensional classification[J]. IEEE transactions on evolutionary computation, 2022, 26(5): 1087–1101.
[15] 曾毓菁, 姜勇. 一种融入注意力和预测的特征选择SLAM算法[J]. 智能系统学报, 2021, 16(6): 1039–1044.
ZENG Yujing, JIANG Yong. Feature selection simultaneous localization and mapping algorithm incorporating attention and anticipation[J]. CAAI transactions on intelligent systems, 2021, 16(6): 1039–1044.
[16] ZHANG N, GUPTA A, CHEN Zefeng, et al. Evolutionary machine learning with minions: a case study in feature selection[J]. IEEE transactions on evolutionary computation, 2022, 26(1): 130–144.
[17] YANG Jiaquan, CHEN Chunhua, LI Jianyu, et al. Compressed-encoding particle swarm optimization with fuzzy learning for large-scale feature selection[J]. Symmetry, 2022, 14(6): 1142.
[18] CHEN Ke, XUE Bing, ZHANG Mengjie, et al. Correlation-guided updating strategy for feature selection in classification with surrogate-assisted particle swarm optimisation[EB/OL]. (2021–12–13)[2022–06–01].https://ieeexplore.ieee.org/abstract/document/9647020.
[19] WANG Zijia, JIAN Junrong, ZHAN Zhihui, et al. Gene targeting differential evolution: a simple and efficient method for large scale optimization[EB/OL]. (2022–06–23)[2022–06–23].https://ieeexplore.ieee.org/abstract/document/9804806.
[20] ZHANG Yong, WANG Yanhu, GONG Dunwei, et al. Clustering-guided particle swarm feature selection algorithm for high-dimensional imbalanced data with missing values[J]. IEEE transactions on evolutionary computation, 2022, 26(4): 616–630.
[21] WANG Yequn, LI Jianyu, CHEN Chunhua, et al. Scale adaptive fitness evaluation-based particle swarm optimisation for hyperparameter and architecture optimisation in neural networks and deep learning[EB/OL]. (2022–06–02)[2022–06–02].https://ietresearch.onlinelibrary.wiley.com/doi/full/10.1049/cit2.12106.
[22] HE Chunlin, ZHANG Yong, GONG Dunwei, et al. A multi-task bee colony band selection algorithm with variable-size clustering for hyperspectral images[EB/OL]. (2022–03–14) [2022–06–02].https://ieeexplore.ieee.org/abstract/document/9733922.
[23] 陈宗淦, 詹志辉. 面向多峰优化问题的双层协同差分进化算法[J]. 计算机学报, 2021, 44(9): 1806–1823.
CHEN Zonggan, ZHAN Zhihui. Two-layer collaborative differential evolution algorithm for multimodal optimization problems[J]. Chinese journal of computers, 2021, 44(9): 1806–1823.
[24] KENNEDY J, EBERHART R. Particle swarm optimization[C]//Proceedings of ICNN’95-International Conference on Neural Networks. Perth: IEEE, 1995: 1942–1948.
[25] KENNEDY J, EBERHART R C. A discrete binary version of the particle swarm algorithm[C]//1997 IEEE International Conference on Systems, Man, and Cybernetics. Computational Cybernetics and Simulation. Orlando: IEEE, 1997: 4104–4108.
[26] GU Shenkai, CHENG Ran, JIN Yaochu. Feature selection for high-dimensional classification using a competitive swarm optimizer[J]. Soft computing, 2018, 22(3): 811–822.
[27] TRAN Binh, XUE Bing, ZHANG Mengjie. Variable-length particle swarm optimization for feature selection on high-dimensional classification[J]. IEEE transactions on evolutionary computation, 2019, 23(3): 473–487.
[28] XUE Yu, TANG Tao, PANG Wei, et al. Self-adaptive parameter and strategy based particle swarm optimization for large-scale feature selection problems with multiple classifiers[J]. Applied soft computing, 2020, 88: 106031.
[29] LUO Chuan, WANG Sizhao, LI Tianrui, et al. Large-scale meta-heuristic feature selection based on BPSO assisted rough hypercuboid approach[EB/OL]. (2022–05–12) [2022–06–02].https://ieeexplore.ieee.org/abstract/document/9773310.
[30] JIAN Junrong, CHEN Zonggan, ZHAN Zhihui, et al. Region encoding helps evolutionary computation evolve faster: a new solution encoding scheme in particle swarm for large-scale optimization[J]. IEEE transactions on evolutionary computation, 2021, 25(4): 779–793.
[31] HE Xiaofei, YAN Shuicheng, HU Yuxiao, et al. Face recognition using Laplacianfaces[J]. IEEE transactions on pattern analysis and machine intelligence, 2005, 27(3): 328–340.
[32] CAI Deng, HE Xiaofei, HAN Jiawei, et al. Orthogonal Laplacianfaces for face recognition[J]. IEEE transactions on image processing, 2006, 15(11): 3608–3614.
[33] CAI Deng, HE Xiaofei, HAN Jiawei, et al. Graph regularized nonnegative matrix factorization for data representation[J]. IEEE transactions on pattern analysis and machine intelligence, 2011, 33(8): 1548–1560.
[34] CHEN Ke, XUE Bing, ZHANG Mengjie, et al. An evolutionary multitasking-based feature selection method for high-dimensional classification[J]. IEEE transactions on cybernetics, 2022, 52(7): 7172–7186.
[35] JING Longlong, TIAN Yingli. Self-supervised visual feature learning with deep neural networks: a survey[J]. IEEE transactions on pattern analysis and machine intelligence, 2021, 43(11): 4037–4058.
[36] SARKAR P, ETEMAD A. Self-supervised ECG representation learning for emotion recognition[J]. IEEE transactions on affective computing, 2022, 13(3): 1541–1554.
[37] HSU W N, BOLTE B, TSAI Y H H, et al. HuBERT: self-supervised speech representation learning by masked prediction of hidden units[J]. IEEE/ACM transactions on audio, speech, and language processing, 2021, 29: 3451–3460.
[38] YUAN Aihong, YOU Mengbo, HE Dongjian, et al. Convex non-negative matrix factorization with adaptive graph for unsupervised feature selection[J]. IEEE transactions on cybernetics, 2022, 52(6): 5522–5534.
[39] YOU Mengbo, YUAN Aihong, ZOU Min, et al. Robust unsupervised feature selection via multi-group adaptive graph representation[EB/OL]. (2021–11–08)[2022–06–01].https://ieeexplore.ieee.org/abstract/document/9606609.
[40] ZHANG Rui, ZHANG Yunxing, LI Xuelong. Unsupervised feature selection via adaptive graph learning and constraint[J]. IEEE transactions on neural networks and learning systems, 2022, 33(3): 1355–1362.
[41] CHANG Heng, GUO Jun, ZHU Wenwu. Rethinking embedded unsupervised feature selection: a simple joint approach[EB/OL]. (2022–05–30)[2022–06–01].https://ieeexplore.ieee.org/abstract/document/9784919.
[42] SOLORIO-FERNÁNDEZ S, CARRASCO-OCHOA J A, MARTÍNEZ-TRINIDAD J F. A review of unsupervised feature selection methods[J]. Artificial intelligence review, 2020, 53(2): 907–948.
[43] LI Jundong, CHENG Kewei, WANG Suhang, et al. Feature selection: a data perspective[J]. ACM computing surveys, 2018, 50(6): 94.
