A generalized I-ELM algorithm for handling node noise in single-hidden layer feedforward networks

Hiu Tung WONG, Chi-Sing LEUNG, Sam KWONG

Research output: Book Chapters / Papers in Conference Proceedings — Conference paper (refereed), peer-reviewed

1 Citation (Scopus)


The incremental extreme learning machine (I-ELM) algorithm provides a low-complexity training mechanism for single-hidden layer feedforward networks (SLFNs). However, the original I-ELM algorithm does not consider the node noise situation, and node noise may greatly degrade the performance of a trained SLFN. This paper presents a generalized node noise resistant I-ELM (GNNR-I-ELM) for SLFNs. We first define a noise resistant training objective function for SLFNs. Afterwards, we develop the GNNR-I-ELM algorithm, which adds τ nodes into the network at each iteration. The GNNR-I-ELM algorithm estimates the output weights of the newly added nodes only and leaves all previously trained output weights unchanged. Its noise tolerance is much better than that of the original I-ELM. Moreover, we prove that the GNNR-I-ELM algorithm converges in terms of the training set mean squared error of noisy networks.
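The incremental mechanism described above can be sketched in code. The sketch below is a minimal, hypothetical illustration of the generic I-ELM-style loop only: at each iteration it appends τ random sigmoid hidden nodes and fits only their output weights to the current residual by least squares, never revisiting earlier weights. The paper's noise resistant objective function is not reproduced here, and all function and variable names are illustrative, not from the paper.

```python
import numpy as np

def incremental_elm_sketch(X, y, n_iters=40, tau=3, seed=None):
    """Generic incremental-ELM sketch (not the paper's GNNR objective):
    each iteration adds tau random hidden nodes and solves a small
    least-squares problem for their output weights alone, so previously
    trained output weights are never changed."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    residual = y.astype(float).copy()   # current training error signal
    W, b, beta = [], [], []             # per-iteration node parameters
    for _ in range(n_iters):
        Wk = rng.uniform(-1.0, 1.0, size=(tau, d))  # random input weights
        bk = rng.uniform(-1.0, 1.0, size=tau)       # random biases
        H = 1.0 / (1.0 + np.exp(-(X @ Wk.T + bk)))  # (n, tau) activations
        # Output weights of the newly added nodes only (least squares);
        # this projection cannot increase the training MSE.
        beta_k, *_ = np.linalg.lstsq(H, residual, rcond=None)
        residual = residual - H @ beta_k
        W.append(Wk); b.append(bk); beta.append(beta_k)
    return W, b, beta, residual
```

Because each iteration projects the residual onto the span of the new nodes' activations, the training MSE is non-increasing, which mirrors the convergence property the paper proves for its noise-resistant objective.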
Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publication status: Published - Nov 2017
Externally published: Yes

Bibliographical note

The work was supported by a research grant from the Government of the Hong Kong Special Administrative Region (CityU 11259516).


Keywords

  • Additive noise
  • Feedforward networks
  • Multiplicative noise


