Applies Alpha Dropout to the input.

As it is a regularization layer, it is only active at training time.

Alpha Dropout is a variant of Dropout that keeps the mean and variance of
the inputs at their original values, in order to preserve the
self-normalizing property even after this dropout is applied. Alpha Dropout
pairs well with Scaled Exponential Linear Units (SELU) because it randomly
sets activations to the negative saturation value instead of zero.
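Example (illustrative):
The following is a minimal sketch of the transform itself, written against
the TensorFlow.js ops API; it illustrates the idea and is not the library's
implementation (the helper name alphaDropoutSketch is hypothetical).
Dropped units are set to SELU's negative saturation value, and an affine
correction (a, b) then restores the input's mean and variance.

    const tf = require('@tensorflow/tfjs');

    // Illustrative, training-time-only alpha-dropout transform. The
    // constants are the standard SELU alpha and scale; alphaP is the
    // negative saturation value (about -1.7581).
    function alphaDropoutSketch(x, rate) {
      const alphaP = -1.6732632423543772 * 1.0507009873554805;

      // Bernoulli keep mask: 1 with probability (1 - rate), else 0.
      const kept = tf.randomUniform(x.shape).greaterEqual(rate).cast('float32');

      // Kept units pass through; dropped units take the saturation value.
      const mixed = x.mul(kept).add(tf.onesLike(kept).sub(kept).mul(alphaP));

      // Affine correction chosen so the output keeps the input's
      // mean and variance.
      const a = Math.pow((1 - rate) * (1 + rate * alphaP * alphaP), -0.5);
      const b = -a * alphaP * rate;
      return mixed.mul(a).add(b);
    }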
Arguments:
rate: float, drop probability (as with Dropout). The multiplicative
noise will have standard deviation sqrt(rate / (1 - rate)); for example,
rate = 0.25 gives sqrt(0.25 / 0.75), roughly 0.577.
noise_shape: A 1-D Tensor of type int32, representing the shape of the
randomly generated keep/drop flags.
Input shape:
Arbitrary. Use the keyword argument inputShape
(tuple of integers, does not include the samples axis)
when using this layer as the first layer in a model.
Output shape:
Same shape as input.
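Example (usage):
A short usage sketch, assuming the tf.layers.alphaDropout factory exposed
by TensorFlow.js; the layer sizes and the 'leCunNormal' initializer are
illustrative choices for a SELU network, not requirements.

    const tf = require('@tensorflow/tfjs');

    const model = tf.sequential();
    model.add(tf.layers.dense({
      units: 64,
      activation: 'selu',
      kernelInitializer: 'leCunNormal',  // usual pairing with SELU
      inputShape: [20],
    }));
    model.add(tf.layers.alphaDropout({rate: 0.1}));
    model.add(tf.layers.dense({units: 10, activation: 'softmax'}));
    // The alphaDropout layer's output shape matches its input, and the
    // layer is inactive at inference time.
    model.summary();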
References:
- Self-Normalizing Neural Networks (Klambauer et al., 2017):
  https://arxiv.org/abs/1706.02515