
2020/2021

CARU: A content-adaptive recurrent unit for the transition of hidden state in NLP

27th International Conference on Neural Information Processing (ICONIP 2020), Bangkok, Thailand, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) *, 2020(12532): 693-703

Author(s): Ka-Hou Chan, Wei Ke, Sio-Kei Im
Summary

This article introduces a novel RNN unit inspired by the GRU, namely the Content-Adaptive Recurrent Unit (CARU). CARU retains all the features of the GRU while requiring fewer training parameters. We use the concept of weights in our design to analyze the transition of hidden states, and we describe how the content-adaptive gate handles incoming words and alleviates the long-term dependency problem. As a result, the unit improves experimental accuracy: CARU not only performs better than the GRU but also trains faster. Moreover, the proposed unit is general and can be applied to all RNN-related neural network models.
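To make the idea concrete, the following is a minimal sketch of a GRU-like recurrent cell with an extra content-adaptive weight derived from the current input. The variable names, weight shapes, and exact gating equations here are illustrative assumptions for exposition, not the paper's precise CARU formulation; the key point shown is that the update gate is modulated by a per-word signal before blending the old hidden state with the candidate state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ContentAdaptiveCell:
    """Illustrative content-adaptive recurrent cell (GRU-like; assumed equations)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Linear projection of the current word/input feature
        self.Wx = rng.uniform(-s, s, (hidden_size, input_size))
        self.bx = np.zeros(hidden_size)
        # Recurrent weights for the candidate state and the update gate
        self.Wn = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Wz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, input_size))

    def step(self, x, h):
        xp = self.Wx @ x + self.bx               # projected input feature
        n = np.tanh(xp + self.Wn @ h)            # candidate hidden state
        z = sigmoid(self.Uz @ x + self.Wz @ h)   # GRU-style update gate
        l = sigmoid(xp)                          # content-adaptive weight from the word itself
        g = z * l                                # combined content-adaptive gate
        # Hidden-state transition: convex blend of old state and candidate
        return (1.0 - g) * h + g * n
```

Because the gate `g` is the product of the usual update gate and a word-dependent weight, uninformative inputs can push `g` toward zero and leave the hidden state nearly untouched, which is one intuition for how such a gate eases long-term dependency issues.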


* Also listed in EI.
