Abstract
Stochastic Configuration Networks (SCNs) are an incremental variant of randomly weighted neural networks whose key feature is the supervisory constraint imposed when adding hidden nodes. Recent studies reveal two weaknesses of SCNs: redundancy among the added hidden nodes, and a lack of interpretability in the design of the constraints. To overcome these weaknesses, this paper proposes a new model, Evolving SCN, which increases the interpretability of the constraint design from an evolutionary viewpoint via a sampling mechanism, and promotes model compactness by optimizing the random weights within the constraint-parameter space. Although an evolutionary method is used, both effectiveness and efficiency improve significantly, in running time as well as prediction accuracy, compared with existing versions of SCNs. This work makes a first attempt to enhance the interpretability of incrementally adding nodes while reducing hidden-node redundancy in SCNs, offering new insights into model compactness from the perspective of Occam's razor and, further, into the nature of incremental learning.
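The constructive mechanism the abstract refers to can be illustrated with a minimal sketch of a basic SCN-style loop: at each step, several random hidden nodes are sampled, the one best satisfying a supervisory inequality constraint is kept, and the output weights are recomputed by least squares. This is an illustrative sketch of the generic incremental procedure, not the authors' evolving variant; all function names, parameters (`r`, `candidates`, the relaxing sequence `mu`), and the tanh activation are assumptions for the example.

```python
import numpy as np

def scn_fit(X, y, max_nodes=50, candidates=100, r=0.99, tol=1e-3, seed=None):
    """Sketch of a basic SCN-style constructive loop (illustrative only).

    Each step samples `candidates` random hidden nodes, keeps the one that
    best satisfies the supervisory constraint, then recomputes the output
    weights over all accepted nodes by global least squares.
    """
    rng = np.random.default_rng(seed)
    H = np.empty((X.shape[0], 0))      # hidden-layer output matrix
    e = y.copy()                       # current training residual
    nodes, beta = [], np.zeros(0)
    for L in range(max_nodes):
        mu = (1 - r) / (L + 1)         # relaxing sequence, mu_L -> 0
        best, best_xi = None, -np.inf
        for _ in range(candidates):
            w = rng.uniform(-1, 1, X.shape[1])
            b = rng.uniform(-1, 1)
            h = np.tanh(X @ w + b)
            # supervisory constraint: xi must be positive to accept the node
            xi = (e @ h) ** 2 / (h @ h) - (1 - r - mu) * (e @ e)
            if xi > best_xi:
                best_xi, best = xi, (w, b, h)
        if best_xi <= 0:               # no candidate met the constraint
            break
        w, b, h = best
        nodes.append((w, b))
        H = np.column_stack([H, h])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights
        e = y - H @ beta
        if np.linalg.norm(e) < tol:
            break
    return nodes, beta

# usage: fit a simple 1-D regression target
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X[:, 0])
nodes, beta = scn_fit(X, y, seed=0)
```

The redundancy criticized in the abstract arises because each accepted node only needs to satisfy the inequality, not to be globally useful; the paper's evolving variant instead searches the constraint-parameter space so that fewer, better-placed nodes are added.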
Original language | English |
---|---|
Article number | 119006 |
Number of pages | 15 |
Journal | Information Sciences |
Volume | 639 |
Early online date | 26 Apr 2023 |
DOIs | |
Publication status | Published - Aug 2023 |
Externally published | Yes |
Bibliographical note
This work was supported in part by the National Natural Science Foundation of China under Grant 61976141, and in part by the State Key Laboratory of Mechanical Behavior and System Safety of Traffic Engineering Structures of Shijiazhuang Tiedao University under Grant KF2022-10.
Keywords
- Evolutionary algorithm
- Feed-forward neural network
- Incremental learning
- Model compactness
- SCN (stochastic configuration network)