ZHAI Junhai, LIU Bo, ZHANG Sufang. A feature selection approach based on rough set relative classification information entropy and particle swarm optimization[J]. CAAI Transactions on Intelligent Systems, 2017, 12(3): 397-404. [doi:10.11992/tis.201705004]

A feature selection approach based on rough set relative classification information entropy and particle swarm optimization

References:
[1] GUYON I, GUNN S, NIKRAVESH M, et al. Feature extraction: foundations and applications[M]. Berlin: Springer, 2006.
[2] DASH M, LIU H. Feature selection for classification[J]. Intelligent data analysis, 1997, 1: 131-151.
[3] PAWLAK Z. Rough sets[J]. International journal of computer and information sciences, 1982, 11: 341-356.
[4] 苗夺谦, 李道国. 粗糙集理论、算法与应用[M]. 北京: 清华大学出版社, 2008. MIAO Duoqian, LI Daoguo. Rough set theory, algorithms and applications[M]. Beijing: Tsinghua University Press, 2008.
[5] SWINIARSKI R W, SKOWRON A. Rough set methods in feature selection and recognition[J]. Pattern recognition letters, 2003, 24(6): 833-849.
[6] JENSEN R, SHEN Q. Fuzzy-rough sets for descriptive dimensionality reduction[C]//IEEE International Conference on Fuzzy Systems (FUZZ-IEEE). 2002: 29-34.
[7] BHATT R B, GOPAL M. On fuzzy-rough sets approach to feature selection[J]. Pattern recognition letters, 2005, 26(7): 965-975.
[8] JENSEN R, PARTHALÁIN N M. Towards scalable fuzzy rough feature selection[J]. Information sciences, 2015, 323: 1-15.
[9] QIAN Y H, LIANG J, PEDRYCZ W, et al. Positive approximation: an accelerator for attribute reduction in rough set theory[J]. Artificial intelligence, 2010, 174(9/10): 597-618.
[10] HU Q H, YU D R, LIU J F, et al. Neighborhood rough set based heterogeneous feature subset selection[J]. Information sciences, 2008, 178(18): 3577-3594.
[11] ALMUALLIM H, DIETTERICH T G. Learning boolean concepts in the presence of many irrelevant features[J]. Artificial intelligence, 1994, 69 (1/2): 279-305.
[12] DASH M, LIU H. Consistency-based search in feature selection[J]. Artificial intelligence, 2003, 151: 155-176.
[13] BATTITI R. Using mutual information for selecting features in supervised neural net learning[J]. IEEE transactions on neural networks, 1994, 5(4): 537-549.
[14] KWAK N, CHOI C H. Input feature selection by mutual information based on Parzen window[J]. IEEE transactions on pattern analysis and machine intelligence, 2002, 24(12): 1667-1671.
[15] ESTEVEZ P A, TESMER M, PEREZ C A, et al. Normalized mutual information feature selection[J]. IEEE transactions on neural networks, 2009, 20(2): 189-201.
[16] SONG L, SMOLA A, GRETTON A, et al. Feature selection via dependence maximization[J]. Journal of machine learning research, 2012, 13: 1393-1434.
[17] HU Q H, ZHU Pengfei, LIU Jinfu, et al. Feature selection via maximizing fuzzy dependency[J]. Fundamenta informaticae, 2010, 98: 167-181.
[18] KOHAVI R, JOHN G. Wrappers for feature subset selection[J]. Artificial intelligence, 1997, 97(1/2): 273-324.
[19] SINDHWANI V, RAKSHIT S, DEODHARE D, et al. Feature selection in MLPs and SVMs based on maximum output information[J]. IEEE transactions on neural networks, 2004, 15(4): 937-947.
[20] YANG Jianbo, SHEN Kaiquan, ONG Chongjin, et al. Feature selection for MLP neural network: the use of random permutation of probabilistic outputs[J]. IEEE transactions on neural networks, 2009, 20(12): 1911-1922.
[21] QUINLAN J R. Induction of decision trees[J]. Machine learning, 1986, 1: 81-106.
[22] BREIMAN L, FRIEDMAN J H, OLSHEN R A, et al. Classification and regression trees[M]. Belmont, CA: Wadsworth International Group, 1984.
[23] SETIONO R, LIU H. Neural-network feature selector[J]. IEEE transactions on neural networks, 1997, 8(3): 654-662.
[24] SHEN Kaiquan, ONG Chongjin, LI Xiaoping, et al. Feature selection via sensitivity analysis of SVM probabilistic outputs[J]. Machine learning, 2008, 70: 1-20.
[25] PERKINS S, LACKER K, THEILER J. Grafting: fast, incremental feature selection by gradient descent in function space[J]. Journal of machine learning research, 2003, 3: 1333-1356.
[26] KENNEDY J, EBERHART R. Particle swarm optimization[C]//IEEE International Conference on Neural Networks. Perth, Australia, 1995, 4: 1942-1948.
[27] KENNEDY J, EBERHART R C, SHI Y H. Swarm intelligence[M]. San Francisco: Morgan Kaufmann, 2001.
[28] KENNEDY J, EBERHART R C. A discrete binary version of the particle swarm algorithm[C]//IEEE International Conference on Systems, Man, and Cybernetics. 1997, 5: 4104-4109.
[29] CHUANG L Y, CHANG H W, TU C J, et al. Improved binary PSO for feature selection using gene expression data[J]. Computational biology & chemistry, 2008, 32(1): 29-37.
[30] CHUANG L Y, TSAI S W, YANG C H. Improved binary particle swarm optimization using catfish effect for feature selection[J]. Expert systems with applications, 2011, 38(10): 12699-12707.
[31] WANG Xiangyang, YANG Jie, TENG Xiaolong, et al. Feature selection based on rough sets and particle swarm optimization[J]. Pattern recognition letters, 2007, 28(4): 459-471.
[32] CERVANTE L, XUE B, ZHANG M, et al. Binary particle swarm optimisation for feature selection: a filter based approach[C]//IEEE Congress on Evolutionary Computation. 2012: 1-8.
[33] LIU Quanjin, ZHAO Zhimin, LI Yingxin. Ensemble feature selection method based on neighborhood information and PSO algorithm[J]. Acta electronica sinica, 2016, 44(4): 995-1002.
[34] FONG S, WONG R, VASILAKOS A. Accelerated PSO swarm search feature selection for data stream mining big data[J]. IEEE transactions on services computing, 2016, 9(1): 33-45.
[35] 翟俊海, 刘博, 张素芳. 基于相对分类信息熵的进化特征选择算法[J]. 模式识别与人工智能, 2016, 29(8): 682-690. ZHAI Junhai, LIU Bo, ZHANG Sufang. Feature selection via evolutionary computation based on relative classification information entropy[J]. Pattern recognition and artificial intelligence, 2016, 29(8): 682-690.
[36] SHI Y H, EBERHART R. A modified particle swarm optimizer[C]//IEEE World Congress on Computational Intelligence. 1998: 69-73.

Last Update: 2017-06-25

Copyright © CAAI Transactions on Intelligent Systems