
2018/2019

SinP[N]: A fast convergence activation function for convolutional neural networks^

Proceedings - 11th IEEE/ACM International Conference on Utility and Cloud Computing Companion, UCC Companion 2018, Zurich, Switzerland, IEEE & ACM*, 2019: 359-364

Authors: Ka-hou Chan,
Sio-Kei Im,
Wei Ke,
Ngan-Lin Lei
Abstract
Convolutional Neural Networks (CNNs) are currently the most advanced machine learning architecture for visual data classification. The choice of activation function has a significant impact on the performance of a training task. To overcome the vanishing gradient problem, we propose a new activation function for the classification system. The activation function exploits a property of periodic functions: the derivative of a periodic function is also periodic. Furthermore, a linear term is introduced to prevent the derivative from becoming zero. We validate this novel activation function through empirical analysis, comparing it with currently known activation functions. Experimental results show that our activation function, SinP[N](x) = sin(x) + Nx, leads to very fast convergence even without a normalization layer. As a result, this new activation function significantly enhances training accuracy and can be easily deployed in existing systems built upon the standard CNN architecture.
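The formula given in the abstract can be sketched directly; the following is a minimal NumPy illustration (not the authors' implementation) of SinP[N] and its derivative, showing why the gradient stays bounded away from zero once N ≥ 1:

```python
import numpy as np

def sinp(x, n=1):
    # SinP[N](x) = sin(x) + N*x, as stated in the abstract.
    return np.sin(x) + n * x

def sinp_grad(x, n=1):
    # Derivative: cos(x) + N. The cos(x) part is periodic; the constant N
    # shifts it so the gradient cannot vanish over an interval, which is the
    # property the abstract credits for avoiding the vanishing gradient problem.
    return np.cos(x) + n

x = np.linspace(-10.0, 10.0, 1001)
print(sinp_grad(x, n=1).min())  # near 0 only at isolated points where cos(x) = -1
print(sinp_grad(x, n=2).min())  # at least 1 everywhere, since cos(x) >= -1
```

For N = 1 the gradient ranges over [0, 2] and touches zero only at isolated points; for N ≥ 2 it is bounded below by N − 1 > 0 everywhere.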
 


* It is also listed in SCOPUS

^ Article also listed in multiple indexes
