This paper presents a new algorithm, called the adaptive merging and growing algorithm (AMGA), for designing artificial neural networks (ANNs). The algorithm merges and adds hidden neurons during the training of ANNs. The merge operation introduced in AMGA is a mixed-mode operation, equivalent to pruning two neurons and adding one. Unlike most previous studies, AMGA emphasizes autonomous functioning in the ANN design process; this is the main reason why AMGA uses an adaptive strategy, rather than a predefined fixed one, in designing ANNs. The adaptive strategy merges or adds hidden neurons based on the learning ability of the hidden neurons or the training progress of the ANN. To reduce the amount of retraining after modifying ANN architectures, AMGA prunes hidden neurons by merging correlated hidden neurons and adds hidden neurons by splitting existing ones. AMGA has been tested on a number of benchmark problems in machine learning and ANNs, including the breast cancer, Australian credit card assessment, diabetes, gene, glass, heart, iris, and thyroid problems. The experimental results show that AMGA can design compact ANN architectures with good generalization ability compared to other algorithms. © 2009 IEEE.
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Early online date: 13 Feb 2009
Published: Jun 2009
Bibliographical note: This work was supported in part by the Japanese Society for Promotion of Science (JSPS), by the Yazaki Memorial Foundation for Science and Technology, and by the University of Fukui through grants given to K. Murase. The work of Md. M. Islam was supported by the JSPS through a fellowship. This paper was recommended by Associate Editor S. Hu.
- Adding neurons
- Artificial neural network (ANN) design
- Generalization ability
- Merging neurons