• Parameterized version of a leaky rectified linear unit.

    It follows f(x) = alpha * x for x < 0 and f(x) = x for x >= 0, where alpha is a trainable weight.

    Input shape: Arbitrary. Use the inputShape configuration option when using this layer as the first layer in a model.

    Output shape: Same shape as the input.
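The activation itself is easy to state directly. Below is a minimal sketch in plain TypeScript of the element-wise function described above; the helper name `prelu` is hypothetical and is not part of the library API, and alpha is a plain number here rather than the trainable weight the layer actually learns.

```typescript
// PReLU activation: f(x) = x for x >= 0, f(x) = alpha * x for x < 0.
// In the layer, alpha is a trainable weight; here it is a fixed number
// purely for illustration (hypothetical standalone helper, not the library API).
function prelu(xs: number[], alpha: number): number[] {
  return xs.map((x) => (x >= 0 ? x : alpha * x));
}

console.log(prelu([-2, -1, 0, 3], 0.25)); // [ -0.5, -0.25, 0, 3 ]
```

Because alpha is learned per input unit during training, PReLU can recover ordinary ReLU (alpha = 0) or a fixed leaky ReLU (constant alpha) as special cases.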

    Parameters

    • args (optional): PReLULayerArgs — configuration for the layer, such as the initializer, regularizer, and constraint for alpha.

    Returns PReLU
