R. Zakovskis, A. Draguns, E. Gaile, E. Ozolins, K. Freivalds. Gates Are Not What You Need in RNNs. The 22nd International Conference on Artificial Intelligence and Soft Computing. Lecture Notes in Artificial Intelligence, vol. 14125, Springer Cham, 2023.

BibTeX citation:
@inproceedings{14795_2023,
author = {R. Zakovskis and A. Draguns and E. Gaile and E. Ozolins and K. Freivalds},
title = {Gates Are Not What You Need in RNNs},
booktitle = {The 22nd International Conference on Artificial Intelligence and Soft Computing. Lecture Notes in Artificial Intelligence},
volume = {14125},
publisher = {Springer Cham},
year = {2023}
}

Abstract: Recurrent neural networks have flourished in many areas. Consequently, we can see new RNN cells being developed continuously, usually by creating or using gates in a new, original way. But what if we told you that gates in RNNs are redundant? In this paper, we propose a new recurrent cell called Residual Recurrent Unit (RRU) which beats traditional cells and does not employ a single gate. It is based on the residual shortcut connection together with linear transformations, ReLU, and normalization. To evaluate our cell's effectiveness, we compare its performance against the widely used GRU and LSTM cells and the recently proposed Mogrifier LSTM on several tasks, including polyphonic music modeling, language modeling, and sentiment analysis. Our experiments show that RRU outperforms the traditional gated units on most of these tasks. Also, it has better robustness to parameter selection, allowing immediate application in new tasks without much tuning. We have implemented the RRU in TensorFlow, and the code is made available at https://github.com/LUMII-Syslab/RRU.
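A minimal sketch of what a gate-free recurrent step in this spirit might look like, based only on the abstract's description (residual shortcut, linear transformation, ReLU, normalization). The function name, shapes, and the specific normalization are assumptions for illustration, not the paper's actual RRU; see the linked repository for the real TensorFlow implementation.

```python
import numpy as np

def rru_like_step(x, h, W, b, eps=1e-6):
    """One hypothetical gate-free recurrent step (sketch, not the paper's RRU):
    concatenate input and state, apply a linear transformation and ReLU,
    normalize, then add the result back to the state via a residual shortcut."""
    z = np.concatenate([x, h])                # joint input/state vector
    a = np.maximum(W @ z + b, 0.0)            # linear transformation + ReLU
    a = (a - a.mean()) / (a.std() + eps)      # normalization (assumed: standardize)
    return h + a                              # residual shortcut connection

# Usage sketch: input size 3, hidden size 4
x = np.ones(3)
h = np.ones(4)
W = np.zeros((4, 3 + 4))                      # hidden x (input + hidden) weights
b = np.zeros(4)
h_next = rru_like_step(x, h, W, b)            # with zero weights, state is unchanged
```

Note that, unlike a GRU or LSTM update, nothing here multiplicatively gates the previous state; the residual connection alone carries it forward.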

URL: https://arxiv.org/abs/2108.00527

Journal quartile: Q3

Full text: RRU___ICAISC_2023
