# Gated Recurrent Unit
**Note:** A gated recurrent unit is often abbreviated as a GRU. Not to be confused with Gru from Despicable Me!
## What are GRUs?
The GRU is a special kind of recurrent layer. At each timestep, its gates decide how much of the previous hidden state passes through unchanged and how much is replaced by a new candidate state computed from the current input. The mechanism is heavily inspired by LSTMs.
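The gating idea can be sketched as a minimal scalar GRU cell in plain Python. This is an illustration, not a library implementation; the weight names (`Wz`, `Uz`, etc.) are hypothetical, and real layers use matrices and vectors rather than scalars.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_cell(x, h_prev, Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh):
    """One GRU step for scalar input x and scalar hidden state h_prev.

    Weight names are hypothetical placeholders for illustration:
    W* act on the input, U* on the previous hidden state, b* are biases.
    """
    z = sigmoid(Wz * x + Uz * h_prev + bz)                 # update gate: how much to overwrite
    r = sigmoid(Wr * x + Ur * h_prev + br)                 # reset gate: how much history to use
    h_tilde = math.tanh(Wh * x + Uh * (r * h_prev) + bh)   # candidate new state
    return (1 - z) * h_prev + z * h_tilde                  # blend old state and candidate

# With all weights at zero, both gates sit at 0.5, the candidate is 0,
# so the new state is simply half the previous one.
print(gru_cell(1.0, 2.0, 0, 0, 0, 0, 0, 0, 0, 0, 0))
```

Note how the update gate `z` interpolates between keeping the old hidden state (`z` near 0) and adopting the candidate (`z` near 1) — this is what lets some information "pass the gate" untouched.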
## When to use GRUs?
GRUs behave a lot like LSTMs but are much less complicated: they have no separate cell state and use three gated transformations instead of four, so they carry fewer parameters. They are therefore often used as a cheaper replacement for LSTMs; their performance is usually comparable, and they train noticeably faster than similarly sized LSTM networks.