Abstract
Deep Neural Networks (DNNs) demonstrate great performance in pattern classification problems. Several activation functions are available for DNNs, among which the Sigmoid and the Tanh functions are the most widely used choices. In this work, we propose the Broad Autoencoder Features (BAF) to better utilize the advantages of different activation functions. The BAF consists of four parallel-connected Stacked AutoEncoders (SAEs) with different activation functions: the Sigmoid, the Tanh, the ReLU, and the Softplus. With this broad setting, the final learned features merge features learned through diversified nonlinear mappings of the original input features, so that more information is mined from the original input. Experimental results show that the BAF yields better learned features in comparison with merging four SAEs that use the same activation function.
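The abstract describes the BAF as four parallel SAEs whose learned codes are merged. The sketch below illustrates that idea in PyTorch; the layer sizes, network depth, input dimensionality, and training procedure are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of the Broad Autoencoder Features (BAF) idea: four parallel
# stacked autoencoders, each with a different activation (Sigmoid, Tanh, ReLU,
# Softplus), whose encoded features are concatenated into one broad feature vector.
import torch
import torch.nn as nn


class SAE(nn.Module):
    """A small stacked autoencoder with a configurable activation function."""

    def __init__(self, in_dim, hidden_dims, activation):
        super().__init__()
        enc, dims = [], [in_dim] + hidden_dims
        for d_in, d_out in zip(dims[:-1], dims[1:]):
            enc += [nn.Linear(d_in, d_out), activation()]
        self.encoder = nn.Sequential(*enc)
        dec, rev = [], list(reversed(dims))
        for d_in, d_out in zip(rev[:-1], rev[1:]):
            dec += [nn.Linear(d_in, d_out), activation()]
        self.decoder = nn.Sequential(*dec)

    def forward(self, x):
        z = self.encoder(x)          # learned code for this branch
        return z, self.decoder(z)    # code and reconstruction


class BAF(nn.Module):
    """Merge the codes of four SAEs that use different activations."""

    def __init__(self, in_dim, hidden_dims=(128, 64)):  # assumed layer sizes
        super().__init__()
        activations = [nn.Sigmoid, nn.Tanh, nn.ReLU, nn.Softplus]
        self.branches = nn.ModuleList(
            SAE(in_dim, list(hidden_dims), act) for act in activations
        )

    def forward(self, x):
        codes, recons = zip(*(branch(x) for branch in self.branches))
        return torch.cat(codes, dim=1), recons  # broad (merged) features


# Usage with an assumed 784-dimensional input (e.g. flattened 28x28 images):
model = BAF(in_dim=784)
x = torch.randn(32, 784)
features, reconstructions = model(x)
print(features.shape)  # torch.Size([32, 256]): 4 branches x 64-dim codes
```

In this reading, the "broad" setting refers to widening the feature learner with parallel branches rather than deepening a single one, and the merged code is what would feed a downstream classifier.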
Original language | English |
---|---|
Title of host publication | Proceedings of the 18th International Conference on Cognitive Informatics and Cognitive Computing, ICCI*CC 2019 |
Publisher | IEEE |
Pages | 130-135 |
Number of pages | 6 |
ISBN (Electronic) | 9781728114194 |
ISBN (Print) | 9781728104966 |
DOIs | |
Publication status | Published - Jul 2019 |
Externally published | Yes |
Event | 18th IEEE International Conference on Cognitive Informatics and Cognitive Computing, ICCI*CC 2019 - Politecnico di Milano University, Milan, Italy. Duration: 23 Jul 2019 → 25 Jul 2019. https://www.iccicc19.polimi.it/
Conference
Conference | 18th IEEE International Conference on Cognitive Informatics and Cognitive Computing, ICCI*CC 2019 |
---|---|
Country/Territory | Italy |
City | Milan |
Period | 23/07/19 → 25/07/19 |
Internet address | https://www.iccicc19.polimi.it/
Keywords
- feature learning
- pattern classification
- stacked autoencoder