The incremental extreme learning machine (I-ELM) algorithm provides a low-complexity training mechanism for single-hidden-layer feedforward networks (SLFNs). However, the original I-ELM algorithm does not consider node noise, which may greatly degrade the performance of a trained SLFN. This paper presents a generalized node noise resistant I-ELM (GNNR-I-ELM) for SLFNs. We first define a noise resistant training objective function for SLFNs. We then develop the GNNR-I-ELM algorithm, which adds τ nodes to the network at each iteration. The GNNR-I-ELM algorithm estimates the output weights of the newly added nodes and leaves all previously trained output weights unchanged. Its noise tolerance is much better than that of the original I-ELM. Moreover, we prove that the GNNR-I-ELM algorithm converges in terms of the training set mean squared error of noisy networks.
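To make the incremental construction concrete, the following is a minimal sketch of the classic I-ELM procedure that the paper builds on (not the GNNR variant itself, whose noise-resistant objective and τ-node batches are described only in the full text). It illustrates the key property stated in the abstract: each new hidden node's output weight is fitted to the current residual, while previously trained output weights are left unchanged. The sigmoid activation and uniform random initialization are assumptions for illustration.

```python
import numpy as np

def i_elm(X, y, max_nodes=50, rng=None):
    """Minimal incremental ELM sketch for an SLFN.

    At each iteration one hidden node with random input weights is
    added, and only its output weight beta is fitted (least squares
    against the current residual); earlier betas are never revised.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    e = y.astype(float).copy()                   # current residual error
    nodes = []                                   # list of (w, b, beta)
    for _ in range(max_nodes):
        w = rng.uniform(-1, 1, size=d)           # random input weights
        b = rng.uniform(-1, 1)                   # random bias
        h = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # hidden node activations
        beta = (e @ h) / (h @ h)                 # least-squares output weight
        e = e - beta * h                         # shrink residual only
        nodes.append((w, b, beta))
    return nodes

def predict(nodes, X):
    """Sum the contributions of all trained hidden nodes."""
    out = np.zeros(len(X))
    for w, b, beta in nodes:
        out += beta / (1.0 + np.exp(-(X @ w + b)))
    return out
```

Because each beta minimizes the residual norm at its step, the training error is non-increasing as nodes are added, which is the kind of monotone behavior underlying the convergence result the abstract claims for the noisy-network objective.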
|Title of host publication||Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)|
|Publication status||Published - Nov 2017|
Bibliographical note: The work was supported by a research grant from the Government of the Hong Kong Special Administrative Region (CityU 11259516).
- Additive noise
- Feedforward networks
- Multiplicative noise