[1]YANG Wenyuan.Unsupervised dimensionality reduction of multi-label learning via autoencoder networks[J].CAAI Transactions on Intelligent Systems,2018,13(5):808-817.[doi:10.11992/tis.201804051]

Unsupervised dimensionality reduction of multi-label learning via autoencoder networks

References:
[1] ZHANG Minling, ZHOU Zhihua. A review on multi-label learning algorithms[J]. IEEE transactions on knowledge and data engineering, 2014, 26(8):1819-1837.
[2] TSOUMAKAS G, KATAKIS I. Multi-label classification:an overview[J]. International journal of data warehousing and mining, 2007, 3(3):1-13.
[3] WU Fei, WANG Zhuhao, ZHANG Zhongfei, et al. Weakly semi-supervised deep learning for multi-label image annotation[J]. IEEE transactions on big data, 2015, 1(3):109-122.
[4] LI Feng, MIAO Duoqian, PEDRYCZ W. Granular multi-label feature selection based on mutual information[J]. Pattern recognition, 2017, 67:410-423.
[5] ZHANG Yin, ZHOU Zhihua. Multilabel dimensionality reduction via dependence maximization[J]. ACM transactions on knowledge discovery from data, 2010, 4(3):1-21.
[6] 郭雨萌, 李国正. 一种多标记数据的过滤式特征选择框架[J]. 智能系统学报, 2014, 9(3):292-297. GUO Yumeng, LI Guozheng. A filtering framework for the multi-label feature selection[J]. CAAI transactions on intelligent systems, 2014, 9(3):292-297.
[7] JINDAL P, KUMAR D. A review on dimensionality reduction techniques[J]. International journal of computer applications, 2017, 173(2):42-46.
[8] OMPRAKASH S, SUMIT S. A review on dimension reduction techniques in data mining[J]. Computer engineering and intelligent systems, 2018, 9(1):7-14.
[9] YU Tingzhao, ZHANG Wensheng. Semisupervised multilabel learning with joint dimensionality reduction[J]. IEEE signal processing letters, 2016, 23(6):795-799.
[10] YU Yanming, WANG Jun, TAN Qiaoyu, et al. Semi-supervised multi-label dimensionality reduction based on dependence maximization[J]. IEEE access, 2017, 5:21927-21940.
[11] ZHANG Minling, ZHANG Kun. Multi-label learning by exploiting label dependency[C]//Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Washington, DC, USA, 2010:999-1008.
[12] ZHANG Minling, ZHOU Zhihua. ML-KNN:a lazy learning approach to multi-label learning[J]. Pattern recognition, 2007, 40(7):2038-2048.
[13] BALDI P. Autoencoders, unsupervised learning and deep architectures[C]//Proceedings of the 2011 International Conference on Unsupervised and Transfer Learning Workshop. Washington, USA, 2011:37-50.
[14] BOURLARD H, KAMP Y. Auto-association by multilayer perceptrons and singular value decomposition[J]. Biological cybernetics, 1988, 59(4/5):291-294.
[15] 刘帅师, 程曦, 郭文燕, 等. 深度学习方法研究新进展[J]. 智能系统学报, 2016, 11(5):567-577. LIU Shuaishi, CHENG Xi, GUO Wenyan, et al. Progress report on new research in deep learning[J]. CAAI transactions on intelligent systems, 2016, 11(5):567-577.
[16] VINCENT P, LAROCHELLE H, LAJOIE I, et al. Stacked denoising autoencoders:learning useful representations in a deep network with a local denoising criterion[J]. Journal of machine learning research, 2010, 11(12):3371-3408.
[17] YU Ying, WANG Yinglong. Feature selection for multi-label learning using mutual information and GA[M]//MIAO D, PEDRYCZ W, SLEZAK D, et al. Rough Sets and Knowledge Technology. Cham:Springer, 2014:454-463.
[18] 余鹰. 多标记学习研究综述[J]. 计算机工程与应用, 2015, 51(17):20-27. YU Ying. Survey on multi-label learning[J]. Computer engineering and applications, 2015, 51(17):20-27.
[19] 段洁, 胡清华, 张灵均, 等. 基于邻域粗糙集的多标记分类特征选择算法[J]. 计算机研究与发展, 2015, 52(1):56-65. DUAN Jie, HU Qinghua, ZHANG Lingjun, et al. Feature selection for multi-label classification based on neighborhood rough sets[J]. Journal of computer research and development, 2015, 52(1):56-65.
[20] DOQUIRE G, VERLEYSEN M. Feature selection for multi-label classification problems[C]//Proceedings of the 11th International Conference on Artificial Neural Networks Conference on Advances in Computational Intelligence. Torremolinos-Málaga, Spain, 2011:9-16.
[21] LAMDA. Data & Code[EB/OL]. Nanjing:LAMDA, 2016[2018-03-20]. http://lamda.nju.edu.cn/Data.ashx.
[22] WOLD S, ESBENSEN K, GELADI P. Principal component analysis[J]. Chemometrics and intelligent laboratory systems, 1987, 2(1):27-52.
[23] HE Xiaofei. Locality preserving projections[M]. IL, USA:University of Chicago, 2005:186-197.
[24] BELKIN M, NIYOGI P. Laplacian eigenmaps for dimensionality reduction and data representation[J]. Neural computation, 2003, 15(6):1373-1396.
[25] SCHAPIRE R E, SINGER Y. BoosTexter:a boosting-based system for text categorization[J]. Machine learning, 2000, 39(2/3):135-168.
[26] TSOUMAKAS G, VLAHAVAS I. Random k-labelsets:an ensemble method for multilabel classification[M]//KOK J N, KORONACKI J, MANTARAS R L, et al. Machine Learning:ECML 2007. Berlin Heidelberg:Springer, 2007:406-417.

Last Update: 2018-10-25

Copyright © CAAI Transactions on Intelligent Systems