Abstract
The extreme learning machine (ELM) concept provides effective training algorithms for constructing single hidden layer feedforward networks (SHLFNs). However, conventional ELM algorithms were designed for the noiseless situation only, in which the outputs of the hidden nodes are not contaminated by noise. This paper presents two noise-resistant training algorithms, namely noise-resistant incremental ELM (NRI-ELM) and noise-resistant convex incremental ELM (NRCI-ELM). The noise-resistant ability of NRI-ELM is better than that of the conventional incremental ELM algorithms. To further enhance the noise-resistant ability, the NRCI-ELM algorithm is proposed. The convergence properties of the two proposed noise-resistant algorithms are also presented.
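For context, the conventional incremental ELM that the paper builds on adds one random hidden node at a time and fits only that node's output weight to the current residual. The following is a minimal sketch of that baseline (not the paper's noise-resistant variants, whose update rules are given in the paper itself); the function names and the choice of a tanh activation are illustrative assumptions.

```python
import numpy as np

def i_elm(X, y, max_nodes=50, rng=None):
    """Sketch of incremental ELM: hidden nodes are generated randomly,
    and each node's output weight is the least-squares fit to the
    residual error left by the previously added nodes."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    residual = y.astype(float).copy()
    nodes = []                          # (input weights, bias, output weight)
    for _ in range(max_nodes):
        w = rng.standard_normal(d)      # random input weights (not trained)
        b = rng.standard_normal()       # random bias (not trained)
        h = np.tanh(X @ w + b)          # hidden-node activations
        beta = (h @ residual) / (h @ h) # optimal output weight for this node
        residual -= beta * h            # shrink the training residual
        nodes.append((w, b, beta))
    return nodes

def predict(nodes, X):
    """Network output is the weighted sum of all hidden-node activations."""
    return sum(beta * np.tanh(X @ w + b) for w, b, beta in nodes)
```

Because each output weight is the least-squares solution against the current residual, the training error is non-increasing as nodes are added; the paper's NRI-ELM and NRCI-ELM modify this update so that the learned network also tolerates noise at the hidden-node outputs.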
Original language | English |
---|---|
Title of host publication | Advances in Neural Networks: ISNN 2017: 14th International Symposium, ISNN 2017, Sapporo, Hakodate, and Muroran, Hokkaido, Japan, June 21–26, 2017, Proceedings, Part II |
Editors | Fengyu CONG, Andrew LEUNG, Qinglai WEI |
Publisher | Springer, Cham |
Pages | 257-265 |
Number of pages | 9 |
ISBN (Electronic) | 9783319590813 |
ISBN (Print) | 9783319590806 |
DOIs | |
Publication status | Published - 2017 |
Externally published | Yes |
Event | 14th International Symposium on Neural Networks (ISNN 2017) - Hokkaido University, Sapporo, Japan Duration: 21 Jun 2017 → 26 Jun 2017 |
Conference
Conference | 14th International Symposium on Neural Networks (ISNN 2017) |
---|---|
Country/Territory | Japan |
City | Sapporo |
Period | 21/06/17 → 26/06/17 |
Funding
The work was supported by a research grant from the Government of the Hong Kong Special Administrative Region (CityU 11259516).
Keywords
- Extreme learning machines
- Incremental algorithm
- Node noise