Abstract
Random feature mapping (RFM) is the core operation in a random weight neural network (RWNN), and its quality has a significant impact on the performance of an RWNN model. However, there has been no effective way to evaluate the quality of an RFM. In this paper, we introduce a new concept, the dispersion degree of matrix information distribution (DDMID), which can be used to measure the quality of an RFM. We used DDMID in our experiments to explain the relationship between the rank of the input data and the performance of the RWNN model and obtained several interesting results: (1) once the rank of the input data reaches a certain threshold, the model's performance increases as the rank increases; (2) the impact of the rank on model performance is insensitive to the type of activation function and the number of hidden nodes; (3) if the DDMID of an RFM matrix is very small, the first k singular values of the RFM matrix carry too much of its information, which usually has a negative impact on the final closed-form solution of the RWNN model. In addition, we used DDMID to verify the improvement that the intrinsic plasticity (IP) algorithm brings to RFM. The experimental results showed that DDMID allows researchers to evaluate the mapping quality of data features before model training, and thus to predict the effect of data preprocessing or network initialization without training a model. We believe that our findings provide useful guidance for constructing and analyzing RWNN models.
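The abstract does not give the DDMID formula, but it ties the measure to how evenly information is spread across the singular values of the RFM matrix. As an illustrative sketch only, the following assumes one plausible dispersion measure, the normalized entropy of the singular-value distribution: values near 1 mean information is spread across many singular directions, while values near 0 mean the leading singular values dominate, the situation the abstract associates with a poor closed-form solution. The function name `ddmid` and the entropy formulation are this sketch's assumptions, not the paper's definition.

```python
import numpy as np

def ddmid(H, eps=1e-12):
    """Hypothetical dispersion measure for a random-feature matrix H:
    normalized Shannon entropy of its singular-value distribution.
    (Illustrative proxy only; not the paper's DDMID definition.)"""
    s = np.linalg.svd(H, compute_uv=False)   # singular values, descending
    p = s / (s.sum() + eps)                  # normalize to a distribution
    h = -np.sum(p * np.log(p + eps))         # Shannon entropy of the spectrum
    return h / np.log(len(s))                # scale into [0, 1]

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))           # input data
W = rng.standard_normal((30, 100))           # random hidden-layer weights
H_full = np.tanh(X @ W)                      # mapping from full-rank input
H_low = np.tanh(np.outer(X[:, 0], W[0]))     # mapping from a rank-1 input
print(ddmid(H_full), ddmid(H_low))           # low-rank input -> lower dispersion
```

Comparing the two mappings before any training mirrors the abstract's point: the dispersion score can flag a degenerate feature mapping (here caused by low-rank input) without fitting the RWNN's output weights at all.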
| Original language | English |
|---|---|
| Pages (from-to) | 12685-12696 |
| Number of pages | 12 |
| Journal | Neural Computing and Applications |
| Volume | 32 |
| Issue number | 16 |
| Early online date | 22 Jan 2020 |
| DOIs | |
| Publication status | Published - Aug 2020 |
| Externally published | Yes |
Bibliographical note
This work was supported in part by the National Natural Science Foundation of China (Grant 61672358 and Grant 61836005) and the Guangdong Science and Technology Department (Grant 2018B010107004).

Keywords
- Extreme learning machine
- Random vector functional link network
- Random weight neural network