TY - GEN
T1 - A Rank Reduced Matrix Method in Extreme Learning Machine
AU - LU, Shuxia
AU - ZHANG, Guiqiang
AU - WANG, Xizhao
N1 - This research is supported in part by the National Natural Science Foundation of China (No. 61170040), the Natural Science Foundation of Hebei Province (No. F2011201063, F2010000323), and the Plan of the Natural Science Foundation of Hebei University (doctor project) (No. Y2008122).
PY - 2012
Y1 - 2012
N2 - Extreme learning machine (ELM) is a learning algorithm for single-hidden layer feedforward neural networks (SLFNs) that randomly chooses hidden nodes and analytically determines the output weights of SLFNs. However, when dealing with large datasets, more hidden nodes are needed to improve training and testing accuracy; in this case, the algorithm can no longer achieve high speed, and sometimes its training cannot be executed because the bias matrix runs out of memory. We focus on this issue and use the rank reduced matrix (MMR) method to calculate the hidden layer output matrix. The results show that this method not only achieves much higher speed but also improves the generalization performance, regardless of whether the number of hidden nodes is large.
AB - Extreme learning machine (ELM) is a learning algorithm for single-hidden layer feedforward neural networks (SLFNs) that randomly chooses hidden nodes and analytically determines the output weights of SLFNs. However, when dealing with large datasets, more hidden nodes are needed to improve training and testing accuracy; in this case, the algorithm can no longer achieve high speed, and sometimes its training cannot be executed because the bias matrix runs out of memory. We focus on this issue and use the rank reduced matrix (MMR) method to calculate the hidden layer output matrix. The results show that this method not only achieves much higher speed but also improves the generalization performance, regardless of whether the number of hidden nodes is large.
KW - Extreme learning machine
KW - Rank reduced matrix
KW - Singular value decomposition
UR - http://www.scopus.com/inward/record.url?scp=84865118876&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-31346-2_9
DO - 10.1007/978-3-642-31346-2_9
M3 - Conference paper (refereed)
AN - SCOPUS:84865118876
SN - 9783642313455
T3 - Lecture Notes in Computer Science
SP - 72
EP - 79
BT - Advances in Neural Networks, ISNN 2012: 9th International Symposium on Neural Networks, Proceedings
A2 - WANG, Jun
A2 - YEN, Gary G.
A2 - POLYCARPOU, Marios M.
PB - Springer Berlin
T2 - 9th International Symposium on Neural Networks, ISNN 2012
Y2 - 11 July 2012 through 14 July 2012
ER -