The gated recurrent unit (GRU) simplifies the LSTM.[3] Compared to the LSTM, the GRU has just two gates: a reset gate and an update gate. The GRU also merges the cell state and the hidden state into a single hidden state. The reset gate roughly corresponds to the forget gate, and the update gate roughly corresponds to the input gate; the output gate is removed.
There are several variants of the GRU. One particular variant is given by the following equations.[4]
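In a commonly used fully gated formulation (conventions differ between sources, for example over whether the update gate weights the previous state or the candidate state), the gates and state update can be written as

    z_t = \sigma(W_z x_t + U_z h_{t-1} + b_z)                      (update gate)
    r_t = \sigma(W_r x_t + U_r h_{t-1} + b_r)                      (reset gate)
    \tilde{h}_t = \tanh(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h)   (candidate state)
    h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t          (new hidden state)

where \sigma is the logistic sigmoid, \odot denotes element-wise multiplication, x_t is the input at time t, and h_{t-1} is the previous hidden state.

As a concrete illustration, the sketch below steps a GRU cell through a short input sequence in NumPy, following the formulation above. The function name gru_cell, the parameter dictionary, and the W_*/U_*/b_* names are placeholders chosen for this example, not an API from the cited references.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, p):
    """One step of a fully gated GRU cell (illustrative sketch, not a library API)."""
    z = sigmoid(p["W_z"] @ x_t + p["U_z"] @ h_prev + p["b_z"])              # update gate
    r = sigmoid(p["W_r"] @ x_t + p["U_r"] @ h_prev + p["b_r"])              # reset gate
    h_cand = np.tanh(p["W_h"] @ x_t + p["U_h"] @ (r * h_prev) + p["b_h"])   # candidate state
    return (1.0 - z) * h_prev + z * h_cand                                  # single merged hidden state

# Tiny usage example with random weights: 3 input features, 4 hidden units.
rng = np.random.default_rng(0)
d_in, d_h = 3, 4
shapes = {"W": (d_h, d_in), "U": (d_h, d_h), "b": (d_h,)}
p = {f"{kind}_{gate}": rng.standard_normal(shape)
     for gate in ("z", "r", "h")
     for kind, shape in shapes.items()}
h = np.zeros(d_h)
for x_t in rng.standard_normal((5, d_in)):   # run five time steps
    h = gru_cell(x_t, h, p)
print(h.shape)  # (4,)
```

Note that a single state vector h plays the role of both the LSTM's cell state and hidden state, and only the two gates z and r appear.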
Zhang, Aston; Lipton, Zachary; Li, Mu; Smola, Alexander J. (2024). "10.1. Long Short-Term Memory (LSTM)". Dive into Deep Learning. Cambridge University Press. ISBN 978-1-009-38943-3.
Cho, Kyunghyun; van Merrienboer, Bart; Bahdanau, Dzmitry; Bougares, Fethi; Schwenk, Holger; Bengio, Yoshua (2014). "Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation". Association for Computational Linguistics. arXiv:1406.1078.
Zhang, Aston; Lipton, Zachary; Li, Mu; Smola, Alexander J. (2024). "10.2. Gated Recurrent Units (GRU)". Dive into Deep Learning. Cambridge University Press. ISBN 978-1-009-38943-3.
Hua, Weizhe; Zhou, Yuan; De Sa, Christopher M.; Zhang, Zhiru; Suh, G. Edward (2019). "Channel Gating Neural Networks". Advances in Neural Information Processing Systems. 32. Curran Associates, Inc. arXiv:1805.12549.