[1]QIAO Junfei,LI Fanjun,YANG Cuili.Review and prospect on neural networks with random weights[J].CAAI Transactions on Intelligent Systems,2016,11(6):758-767.[doi:10.11992/tis.201612015]

Review and prospect on neural networks with random weights

References:
[1] QI Min, ZHANG G P. Trend time series modeling and forecasting with neural networks[J]. IEEE transactions on neural networks, 2008, 19(5):808-816.
[2] 赵洪伟, 谢永芳, 蒋朝辉, 等. 基于泡沫图像特征的浮选槽液位智能优化设定方法[J]. 自动化学报, 2014, 40(6):1086-1097. ZHAO Hongwei, XIE Yongfang, JIANG Chaohui, et al. An intelligent optimal setting approach based on froth features for level of flotation cells[J]. Acta automatica sinica, 2014, 40(6):1086-1097.
[3] 乔俊飞, 薄迎春, 韩广. 基于ESN的多指标DHP控制策略在污水处理过程中的应用[J]. 自动化学报, 2013, 39(7):1146-1151. QIAO Junfei, BO Yingchun, HAN Guang. Application of ESN-based multi-indices dual heuristic dynamic programming on wastewater treatment process[J]. Acta automatica sinica, 2013, 39(7):1146-1151.
[4] CHENG M H, HWANG K S, JENG J H, et al. Classification-based video super-resolution using artificial neural networks[J]. Signal processing, 2013, 93(9):2612-2625.
[5] SCARDAPANE S, WANG Dianhui, PANELLA M. A decentralized training algorithm for echo state networks in distributed big data applications[J]. Neural networks, 2015, 78:65-74.
[6] TENG T H, TAN A H, ZURADA J M. Self-organizing neural networks integrating domain knowledge and reinforcement learning[J]. IEEE transactions on neural networks and learning systems, 2015, 26(5):889-902.
[7] ALHAMDOOSH M, WANG Dianhui. Fast decorrelated neural network ensembles with random weights[J]. Information sciences, 2014, 264:104-117.
[8] PASCANU R, MIKOLOV T, BENGIO Y. On the difficulty of training recurrent neural networks[C]//Proceedings of the 30th International Conference on Machine Learning. Atlanta, USA, 2013:1310-1318.
[9] RIGOTTI M, BARAK O, WARDEN M R, et al. The importance of mixed selectivity in complex cognitive tasks[J]. Nature, 2013, 497(7451):585-590.
[10] BARAK O, RIGOTTI M, FUSI S. The sparseness of mixed selectivity neurons controls the generalization-discrimination trade-off[J]. The journal of neuroscience, 2013, 33(9):3844-3856.
[11] HUANG Gao, HUANG Guangbin, SONG Shiji, et al. Trends in extreme learning machines:a review[J]. Neural networks, 2015, 61:32-48.
[12] DENG Chenwei, HUANG Guangbin, XU Jia, et al. Extreme learning machines:new trends and applications[J]. Science China information sciences, 2015, 58(2):1-16.
[13] PAO Y H, TAKEFUJI Y. Functional-link net computing:theory, system architecture, and functionalities[J]. Computer, 1992, 25(5):76-79.
[14] HUANG Guangbin, ZHU Qinyu, SIEW C K. Extreme learning machine:theory and applications[J]. Neurocomputing, 2006, 70(1/2/3):489-501.
[15] RAHIMI A, RECHT B. Weighted sums of random kitchen sinks:replacing minimization with randomization in learning[C]//Proceedings of the 21st International Conference on Neural Information Processing Systems. Vancouver, British Columbia, Canada:Curran Associates Inc., 2008:1313-1320.
[16] WIDROW B, GREENBLATT A, KIM Y, et al. The No-Prop algorithm:a new learning algorithm for multilayer neural networks[J]. Neural networks, 2013, 37:182-188.
[17] KASUN L L C, ZHOU Hongming, HUANG Guangbin, et al. Representational learning with ELMs for big data[J]. IEEE intelligent systems, 2013, 28(6):31-34.
[18] TANG Jiexiong, DENG Chenwei, HUANG Guangbin. Extreme learning machine for multilayer perceptron[J]. IEEE transactions on neural networks and learning systems, 2016, 27(4):809-821.
[19] QIAO Junfei, LI Fanjun, HAN Honggui, et al. Constructive algorithm for fully connected cascade feedforward neural networks[J]. Neurocomputing, 2016, 182:154-164.
[20] WAN Yihe, SONG Shiji, HUANG Gao. Incremental extreme learning machine based on cascade neural networks[C]//IEEE International Conference on Systems, Man, and Cybernetics. Kowloon:IEEE, 2015:1889-1894.
[21] STEIL J J. Memory in backpropagation-decorrelation O(N) efficient online recurrent learning[C]//Proceedings of the 15th International Conference on Artificial Neural Networks:Formal Models and Their Applications. Berlin Heidelberg:Springer, 2005:750-750.
[22] JAEGER H. The "echo state" approach to analysing and training recurrent neural networks with an erratum note[R]. Bonn, Germany:German National Research Center for Information Technology, 2001.
[23] MAASS W. Liquid state machines:motivation, theory, and applications[J]. Computability in context, 2010:275-296. DOI:10.1142/9781848162778_0008.
[24] LIANG Nanying, SARATCHANDRAN P, HUANG Guangbin, et al. Classification of mental tasks from EEG signals using extreme learning machine[J]. International journal of neural systems, 2006, 16(1):29-38.
[25] SKOWRONSKI M D, HARRIS J G. Automatic speech recognition using a predictive echo state network classifier[J]. Neural networks, 2007, 20(3):414-423.
[26] LI Decai, HAN Min, WANG Jun. Chaotic time series prediction based on a novel robust echo state network[J]. IEEE transactions on neural networks and learning systems, 2012, 23(5):787-799.
[27] SCHMIDT W F, KRAAIJVELD M A, DUIN R P W. Feedforward neural networks with random weights[C]//Proceedings of 11th International Conference on Pattern Recognition Methodology and Systems. The Hague, Netherlands:IEEE, 1992:1-4.
[28] IGELNIK B, PAO Y H. Stochastic choice of basis functions in adaptive function approximation and the functional-link net[J]. IEEE transactions on neural networks, 1995, 6(6):1320-1329.
[29] HUANG Guangbin. An insight into extreme learning machines:random neurons, random features and kernels[J]. Cognitive computation, 2014, 6(3):376-390.
[30] HUANG Guangbin, ZHU Qinyu, SIEW C K. Extreme learning machine:theory and applications[J]. Neurocomputing, 2006, 70(1/2/3):489-501.
[31] HUANG Guangbin, CHEN Lei, SIEW C K. Universal approximation using incremental constructive feedforward networks with random hidden nodes[J]. IEEE transactions on neural networks, 2006, 17(4):879-892.
[32] LIU Xia, LIN Shaobo, FANG Jian, et al. Is extreme learning machine feasible? A theoretical assessment (Part I)[J]. IEEE transactions on neural networks and learning systems, 2015, 26(1):7-20.
[33] DEHURI S, CHO S B. A comprehensive survey on functional link neural networks and an adaptive PSO-BP learning for CFLNN[J]. Neural computing and applications, 2010, 19(2):187-205.
[34] ZHANG Le, SUGANTHAN P N. A comprehensive evaluation of random vector functional link networks[J]. Information sciences, 2015, 367/368:1094-1105.
[35] LIANG Nanying, HUANG Guangbin, SARATCHANDRAN P, et al. A fast and accurate online sequential learning algorithm for feedforward networks[J]. IEEE transactions on neural networks, 2006, 17(6):1411-1423.
[36] SCARDAPANE S, WANG Dianhui, PANELLA M, et al. Distributed learning for random vector functional-link networks[J]. Information sciences, 2015, 301:271-284.
[37] ALHAMDOOSH M, WANG Dianhui. Fast decorrelated neural network ensembles with random weights[J]. Information sciences, 2014, 264:104-117.
[38] LI Ying. Orthogonal incremental extreme learning machine for regression and multiclass classification[J]. Neural computing and applications, 2014, 27(1):111-120.
[39] 李凡军, 乔俊飞, 韩红桂. 网络结构增长的极端学习机算法[J]. 控制理论与应用, 2014, 31(5):638-643. LI Fanjun, QIAO Junfei, HAN Honggui. Incremental constructive extreme learning machine[J]. Control theory & applications, 2014, 31(5):638-643.
[40] 李凡军, 韩红桂, 乔俊飞. 基于灵敏度分析法的ELM剪枝算法[J]. 控制与决策, 2014, 29(6):1003-1008. LI Fanjun, HAN Honggui, QIAO Junfei. Pruning algorithm for extreme learning machine based on sensitivity analysis[J]. Control and decision, 2014, 29(6):1003-1008.
[41] FENG Guorui, HUANG Guangbin, LIN Qingping, et al. Error minimized extreme learning machine with growth of hidden nodes and incremental learning[J]. IEEE transactions on neural networks, 2009, 20(8):1352-1357.
[42] MICHE Y, SORJAMAA A, BAS P, et al. OP-ELM:optimally pruned extreme learning machine[J]. IEEE transactions on neural networks, 2010, 21(1):158-162.
[43] 韩敏, 李德才. 基于替代函数及贝叶斯框架的1范数ELM算法[J]. 自动化学报, 2011, 37(11):1344-1350. HAN Min, LI Decai. A 1-norm regularization term ELM algorithm based on surrogate function and Bayesian framework[J]. Acta automatica sinica, 2011, 37(11):1344-1350.
[44] MICHE Y, VAN HEESWIJK M, BAS P, et al. TROP-ELM:a double-regularized ELM using LARS and Tikhonov regularization[J]. Neurocomputing, 2011, 74(16):2413-2421.
[45] MICHE Y, AKUSOK A, VEGANZONES D, et al. SOM-ELM:self-organized clustering using ELM[J]. Neurocomputing, 2015, 165:238-254.
[46] HUANG Gao, LIU Tianchi, YANG Yan, et al. Discriminative clustering via extreme learning machine[J]. Neural networks, 2015, 70:1-8.
[47] ISLAM M M, SATTAR M A, AMIN M F, et al. A new constructive algorithm for architectural and functional adaptation of artificial neural networks[J]. IEEE transactions on systems, man, and cybernetics, part B (cybernetics), 2009, 39(6):1590-1605.
[48] SCHMIDHUBER J. Deep learning in neural networks:an overview[J]. Neural networks, 2015, 61:85-117.
[49] HAN Honggui, WANG Lidan, QIAO Junfei. Hierarchical extreme learning machine for feedforward neural network[J]. Neurocomputing, 2014, 128:128-135.
[50] QU B Y, LANG B F, LIANG J J, et al. Two-hidden-layer extreme learning machine for regression and classification[J]. Neurocomputing, 2016, 175:826-834.
[51] ZHANG Jian, DING Shifei, ZHANG Nan, et al. Incremental extreme learning machine based on deep feature embedded[J]. International journal of machine learning and cybernetics, 2016, 7(1):111-120.
[52] QU B Y, LANG B F, LIANG J J, et al. Two-hidden-layer extreme learning machine for regression and classification[J]. Neurocomputing, 2016, 175:826-834.
[53] ZHANG Nan, DING Shifei, ZHANG Jian. Multi layer ELM-RBF for multi-label learning[J]. Applied soft computing, 2016, 43:535-545.
[54] HUNTER D, YU Hao, PUKISH M S, et al. Selection of proper neural network sizes and architectures-a comparative study[J]. IEEE transactions on industrial informatics, 2012, 8(2):228-240.
[55] WILAMOWSKI B M. Challenges in applications of computational intelligence in industrial electronics[C]//Proceedings of 2010 IEEE International Symposium on Industrial Electronics. Bari:IEEE, 2010:15-22.
[56] WILAMOWSKI B M, COTTON N J, KAYNAK O, et al. Computing gradient vector and Jacobian matrix in arbitrarily connected neural networks[J]. IEEE transactions on industrial electronics, 2008, 55(10):3784-3790.
[57] OAKDEN A. Cascade networks and extreme learning machines[D]. Canberra:Australian National University, 2014.
[58] LI Fanjun, QIAO Junfei, HAN Honggui, et al. A self-organizing cascade neural network with random weights for nonlinear system modeling[J]. Applied soft computing, 2016, 42:184-193.
[59] JAEGER H, HAAS H. Harnessing nonlinearity:predicting chaotic systems and saving energy in wireless communication[J]. Science, 2004, 304(5667):78-80.
[60] SKOWRONSKI M D, HARRIS J G. Automatic speech recognition using a predictive echo state network classifier[J]. Neural networks, 2007, 20(3):414-423.
[61] XIA Yili, JELFS B, VAN HULLE M M, et al. An augmented echo state network for nonlinear adaptive filtering of complex noncircular signals[J]. IEEE transactions on neural networks, 2011, 22(1):74-83.
[62] LUKOSEVICIUS M, JAEGER H. Reservoir computing approaches to recurrent neural network training[J]. Computer science review, 2009, 3(3):127-149.
[63] 彭宇, 王建民, 彭喜元. 储备池计算概述[J]. 电子学报, 2011, 39(10):2387-2396. PENG Yu, WANG Jianmin, PENG Xiyuan. Survey on reservoir computing[J]. Acta electronica sinica, 2011, 39(10):2387-2396.
[64] QIAO Junfei, LI Fanjun, HAN Honggui, et al. Growing echo-state network with multiple subreservoirs[J]. IEEE transactions on neural networks and learning systems, 2016, 99:1-14.
[65] DENG Zhidong, ZHANG Yi. Collective behavior of a small-world recurrent neural system with scale-free distribution[J]. IEEE transactions on neural networks, 2007, 18(5):1364-1375.
[66] STRAUSS T, WUSTLICH W, LABAHN R. Design strategies for weight matrices of echo state networks[J]. Neural computation, 2012, 24(12):3246-3276.
[67] XUE Yanbo, YANG Le, HAYKIN S. Decoupled echo state networks with lateral inhibition[J]. Neural networks, 2007, 20(3):365-376.
[68] NAJIBI E, ROSTAMI H. SCESN, SPESN, SWESN:three recurrent neural echo state networks with clustered reservoirs for prediction of nonlinear and chaotic time series[J]. Applied intelligence, 2015, 43(2):460-472.
[69] 薄迎春, 乔俊飞, 张昭昭. 一种具有small world特性的ESN结构分析与设计[J]. 控制与决策, 2012, 27(3):383-388. BO Yingchun, QIAO Junfei, ZHANG Zhaozhao. Analysis and design on structure of small world property ESN[J]. Control and decision, 2012, 27(3):383-388.
[70] 李凡军, 乔俊飞. 一种增量式模块化回声状态网络[J]. 控制与决策, 2016, 31(8):1481-1486. LI Fanjun, QIAO Junfei. An incremental modular echo state network[J]. Control and decision, 2016, 31(8):1481-1486.
[71] SCHRAUWEN B, VERSTRAETEN D, VAN CAMPENHOUT J M. An overview of reservoir computing:Theory, applications and implementations[C]//Proceedings of the 15th European Symposium on Artificial Neural Networks. Bruges, Belgium, 2007:471-482.
[72] MAASS W, NATSCHL?GER T, MARKRAM H. Real-time computing without stable states:a new framework for neural computation based on perturbations[J]. Neural computation, 2002, 14(11):2531-2560.
[73] LEGENSTEIN R, MAASS W. Edge of chaos and prediction of computational performance for neural circuit models[J]. Neural networks, 2007, 20(3):323-334.
[74] BURGSTEINER H, KRÖLL M, LEOPOLD A, et al. Movement prediction from real-world images using a liquid state machine[J]. Applied intelligence, 2007, 26(2):99-109.
[75] 张冠元, 王斌. 一种基于液体状态机的音乐和弦序列识别方法[J]. 模式识别与人工智能, 2013, 26(7):643-647. ZHANG Guanyuan, WANG Bin. Liquid state machine based music chord sequence recognition algorithm[J]. Pattern recognition and artificial intelligence, 2013, 26(7):643-647.
[76] HUANG Guangbin, CHEN Lei. Enhanced random search based incremental extreme learning machine[J]. Neurocomputing, 2008, 71(16/17/18):3460-3468.
[77] WANG Yuguang, CAO Feilong, YUAN Yubo. A study on effectiveness of extreme learning machine[J]. Neurocomputing, 2011, 74(16):2483-2490.
[78] CUI H, LIU X, LI L. The architecture of dynamic reservoir in the echo state network[J]. Chaos, 2012, 22(3):033127.
[79] OZTURK M C, XU Dongming, PRINCIPE J C. Analysis and design of echo state networks[J]. Neural computation, 2007, 19(1):111-138.
[80] DUAN Haibin, WANG Xiaohua. Echo state networks with orthogonal pigeon-inspired optimization for image restoration[J]. IEEE transactions on neural networks and learning systems, 2016, 27(11):2413-2425.
[81] BOCCATO L, ATTUX R, VON ZUBEN F J. Self-organization and lateral interaction in echo state network reservoirs[J]. Neurocomputing, 2014, 138:297-309.
[82] KORYAKIN D, LOHMANN J, BUTZ M V. Balanced echo state networks[J]. Neural networks, 2012, 36:35-45.
[83] YUENYONG S, NISHIHARA A. Evolutionary pre-training for CRJ-type reservoir of echo state networks[J]. Neurocomputing, 2015, 149:1324-1329.
[84] LI Xiumin, ZHONG Ling, XUE Fangzheng, et al. A priori data-driven multi-clustered reservoir generation algorithm for echo state network[J]. PLoS one, 2015, 10(4):e0120750.
[85] PALANGI H, DENG Li, WARD R K. Learning input and recurrent weight matrices in echo state networks[C]//Advances in Neural Information Processing Systems 22:Conference on Neural Information Processing Systems. Vancouver, British Columbia, Canada, 2009.