Noise resistant training for extreme learning machine

Yik Lam LUI, Hiu Tung WONG, Chi-Sing LEUNG, Sam KWONG

Research output: Book Chapters | Papers in Conference Proceedings › Conference paper (refereed) › peer-reviewed

1 Citation (Scopus)


The extreme learning machine (ELM) concept provides effective training algorithms for constructing single hidden layer feedforward networks (SHLFNs). However, the conventional ELM algorithms were designed for the noiseless situation only, in which the outputs of the hidden nodes are not contaminated by noise. This paper presents two noise-resistant training algorithms, namely noise-resistant incremental ELM (NRI-ELM) and noise-resistant convex incremental ELM (NRCI-ELM). The noise-resistant ability of NRI-ELM is better than that of the conventional incremental ELM algorithms. To further enhance the noise-resistant ability, the NRCI-ELM algorithm is proposed. The convergence properties of the two proposed noise-resistant algorithms are also presented.
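For context, the conventional incremental ELM (I-ELM) that the paper's noise-resistant variants build on can be sketched as follows. This is a minimal illustration of the baseline algorithm only, not of NRI-ELM or NRCI-ELM; the function names, activation choice (tanh), and hyperparameters are illustrative assumptions. Each step adds one random hidden node and fits its output weight analytically against the current residual.

```python
import numpy as np

rng = np.random.default_rng(0)

def i_elm(X, y, max_nodes=100, rtol=1e-6):
    """Conventional incremental ELM (I-ELM) sketch: hidden nodes are
    added one at a time with random input weights, and each output
    weight is the least-squares fit to the current residual."""
    n, d = X.shape
    e = y.astype(float).copy()        # residual error, starts at the target
    W, b, beta = [], [], []
    for _ in range(max_nodes):
        w = rng.standard_normal(d)    # random input weights (not trained)
        bias = rng.standard_normal()  # random bias (not trained)
        h = np.tanh(X @ w + bias)     # new hidden node's output vector
        b_k = (e @ h) / (h @ h)       # closed-form output weight
        e = e - b_k * h               # residual shrinks monotonically
        W.append(w); b.append(bias); beta.append(b_k)
        if np.linalg.norm(e) < rtol:
            break
    return np.array(W), np.array(b), np.array(beta)

def predict(X, W, b, beta):
    H = np.tanh(X @ W.T + b)          # hidden-layer output matrix
    return H @ beta

# Toy 1-D regression: the training error should drop as nodes are added.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])
W, b, beta = i_elm(X, y)
mse = np.mean((predict(X, W, b, beta) - y) ** 2)
```

The noise-resistant algorithms in the paper modify this training objective to account for noise contaminating the hidden-node outputs, which the plain least-squares step above does not consider.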
Original language: English
Title of host publication: Advances in Neural Networks: ISNN 2017: 14th International Symposium, ISNN 2017, Sapporo, Hakodate, and Muroran, Hokkaido, Japan, June 21–26, 2017, Proceedings, Part II
Editors: Fengyu CONG, Andrew LEUNG, Qinglai WEI
Publisher: Springer, Cham
Number of pages: 9
ISBN (Electronic): 9783319590813
ISBN (Print): 9783319590806
Publication status: Published - 2017
Externally published: Yes
Event: 14th International Symposium on Neural Networks (ISNN 2017) - Hokkaido University, Sapporo, Japan
Duration: 21 Jun 2017 – 26 Jun 2017


Conference: 14th International Symposium on Neural Networks (ISNN 2017)

Bibliographical note

The work was supported by a research grant from the Government of the Hong Kong Special Administrative Region (CityU 11259516).


Keywords

  • Extreme learning machines
  • Incremental algorithm
  • Node noise


