
2019/2020

Self-adaptive layer: An application of function approximation theory to enhance convergence efficiency in neural networks*

2020 International Conference on Information Networking (ICOIN), IEEE, Barcelona, 2020: 447-452

Author(s): Ka-Hou Chan, Sio-Kei Im, Wei Ke
Summary

Neural networks provide a general architecture for modeling complex nonlinear systems, but the source data are often mixed with substantial noise and interfering information. One common way to smooth over this issue during training is to increase the neuron count or layer size. In this paper, a new self-adaptive layer is developed to overcome these problems, achieving faster convergence and avoiding local minima. We incorporate function approximation theory into the arrangement of the layer elements, so that the training process and the approximation properties of the network can be investigated via linear algebra, with the precision of adaptation controlled by the order of the polynomials used. Experimental results show that the proposed layer converges significantly faster and, as a result, greatly enhances training accuracy. Moreover, the design and implementation can be easily deployed in most current systems.
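The paper itself does not reproduce the layer's formulation here, but the core idea it summarizes — that a polynomial basis turns approximation into a linear-algebra problem whose precision is set by the polynomial order — can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the basis construction, target function, and least-squares "training" step are all assumptions chosen to make the order/precision trade-off concrete.

```python
import numpy as np

def poly_features(x, order):
    """Map scalar inputs to the polynomial basis [1, x, x^2, ..., x^order].

    Hypothetical stand-in for the paper's layer-element arrangement:
    with this basis, fitting the output weights is a linear problem.
    """
    return np.vander(x, N=order + 1, increasing=True)

# Noisy samples of a smooth nonlinear target (assumed example data).
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(np.pi * x) + 0.05 * rng.standard_normal(x.shape)

# Solving the linear system plays the role of training the layer's
# weights; the polynomial order controls the adaptation precision.
errors = {}
for order in (1, 3, 7):
    Phi = poly_features(x, order)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    errors[order] = float(np.mean((Phi @ w - y) ** 2))

print(errors)  # residual error shrinks as the order increases
```

On this smooth target, raising the order drives the residual toward the noise floor, which mirrors the abstract's claim that the order of the polynomials governs how precisely the layer adapts.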

 


* Also listed in EI.
