Note that all we have done so far is update the cell state. We still need to generate the activation for the current timestep, h[t]. This is done using an output gate, which is calculated as follows:
The input at timestep t is multiplied by a new set of weights, W_o, with the dimensions (n_h, n_x). The activation from the previous timestep, h[t-1], is multiplied by another new set of weights, U_o, with the dimensions (n_h, n_h). Note that both multiplications are matrix multiplications. The two terms are then added and passed through a sigmoid function to squash the output gate, o[t], into the range [0, 1]. The result has the same dimensions as the activation vector h[t], namely (n_h, 1).
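A minimal NumPy sketch of this gate computation (the names x_t, h_prev, W_o, U_o, and sigmoid are illustrative, and a bias term is omitted because the description above does not mention one):

import numpy as np

n_x, n_h = 4, 3                    # input and hidden dimensions (example values)
x_t = np.random.randn(n_x, 1)      # input at timestep t, shape (n_x, 1)
h_prev = np.random.randn(n_h, 1)   # activation from timestep t-1, shape (n_h, 1)
W_o = np.random.randn(n_h, n_x)    # output-gate weights applied to the input
U_o = np.random.randn(n_h, n_h)    # output-gate weights applied to the previous activation

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# o_t lies in [0, 1] elementwise and has shape (n_h, 1)
o_t = sigmoid(W_o @ x_t + U_o @ h_prev)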
The output gate is responsible for regulating how much the current cell state is allowed to affect the activation value at the current timestep. In our example sentence, it is worth propagating the...
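In the standard LSTM formulation, this regulation is an elementwise product between the output gate and the squashed cell state. A sketch continuing from the snippet above, where c_t stands for the cell state produced by the earlier update step and is assumed to have shape (n_h, 1):

c_t = np.random.randn(n_h, 1)   # placeholder for the updated cell state from the previous step

# the output gate scales how much of the (tanh-squashed) cell state reaches the activation
h_t = o_t * np.tanh(c_t)        # activation for timestep t, shape (n_h, 1)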