• Gated Recurrent Unit - Cho et al. 2014.

    This is an RNN layer consisting of one GRUCell. However, unlike the underlying GRUCell, the apply method of GRU operates on a sequence of inputs. The shape of the input (not including the first, batch dimension) needs to be at least 2-D, with the first dimension being time steps. For example:

    const rnn = tf.layers.gru({units: 8, returnSequences: true});

    // Create an input with 10 time steps and 20 features per step.
    const input = tf.input({shape: [10, 20]});
    const output = rnn.apply(input);

    console.log(JSON.stringify(output.shape));
    // [null, 10, 8]: 1st dimension is unknown batch size; 2nd dimension is the
    // same as the sequence length of `input`, due to `returnSequences`: `true`;
    // 3rd dimension is the `GRUCell`'s number of units.
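    At each time step, the underlying GRUCell applies the update-gate and reset-gate equations from Cho et al. 2014. As a rough illustration only (this is not the library's implementation; the function names, weight values, and the single-unit simplification here are all hypothetical), a GRU step can be sketched in plain TypeScript:

    ```typescript
    // Hypothetical single-unit GRU, sketching the per-step math a GRUCell
    // performs (Keras-style formulation: new state blends the previous
    // state and a candidate state via the update gate).

    interface GRUWeights {
      wz: number; uz: number; // update-gate weights (input, recurrent)
      wr: number; ur: number; // reset-gate weights (input, recurrent)
      wh: number; uh: number; // candidate-state weights (input, recurrent)
    }

    function sigmoid(x: number): number {
      return 1 / (1 + Math.exp(-x));
    }

    // One GRU step: given input x and previous state hPrev, return the
    // next hidden state.
    function gruStep(x: number, hPrev: number, w: GRUWeights): number {
      const z = sigmoid(w.wz * x + w.uz * hPrev);             // update gate
      const r = sigmoid(w.wr * x + w.ur * hPrev);             // reset gate
      const hCand = Math.tanh(w.wh * x + w.uh * (r * hPrev)); // candidate
      return z * hPrev + (1 - z) * hCand;                     // blended state
    }

    // Run the cell over a sequence, collecting the state at every step --
    // the analogue of `returnSequences: true`.
    function gruSequence(xs: number[], w: GRUWeights): number[] {
      let h = 0; // initial hidden state
      return xs.map((x) => (h = gruStep(x, h, w)));
    }
    ```

    Because each new state is a convex combination of the previous state and a tanh-bounded candidate, the hidden state stays within (-1, 1); the real layer does the same arithmetic with weight matrices over `units` dimensions.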

    @doc {heading: 'Layers', subheading: 'Recurrent', namespace: 'layers'}

    Parameters

    • args: GRULayerArgs

    Returns GRU

Generated using TypeDoc