Layer-normalization layer (Ba et al., 2016).

Normalizes the activations of the previous layer for each given example in a
batch independently, rather than across the batch as in batchNormalization.
In other words, this layer applies a transformation that keeps the mean
activation within each example close to 0 and the activation variance close
to 1.

Input shape:
Arbitrary. Use the argument inputShape when using this layer as the first
layer in a model.

Output shape:
Same as input.
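To make the per-example behavior concrete, here is a minimal sketch (not from
the original docs) of the computation the layer performs with its initial
parameters. It assumes normalization over the last axis and an epsilon of
1e-3 (taken to match the layer's default), and omits the learned scale and
offset, which start at 1 and 0.

    import * as tf from '@tensorflow/tfjs';

    // Two examples with very different scales.
    const x = tf.tensor2d([[1, 2, 3, 4],
                           [10, 20, 30, 40]]);

    // Per-example mean and variance over the feature (last) axis.
    const {mean, variance} = tf.moments(x, -1, /* keepDims */ true);

    // Normalize each example independently; epsilon guards against
    // division by zero (1e-3 is assumed to match the layer's default).
    const epsilon = 1e-3;
    const normalized = x.sub(mean).div(variance.add(epsilon).sqrt());

    normalized.print();  // Each row has mean ~0 and variance ~1.

Note that the second row is normalized using its own statistics, so both rows
end up on the same scale even though their raw magnitudes differ tenfold.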
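A short usage sketch (an illustration, not from the original text): applying
the layer directly to a tensor, and supplying inputShape when the layer is
the first one in a model.

    import * as tf from '@tensorflow/tfjs';

    // Apply the layer directly to a batch of activations.
    const layer = tf.layers.layerNormalization({axis: -1});
    const x = tf.randomNormal([2, 4]);       // 2 examples, 4 features
    const y = layer.apply(x) as tf.Tensor;   // output shape: same as input
    y.print();

    // As the first layer in a model, supply inputShape.
    const model = tf.sequential();
    model.add(tf.layers.layerNormalization({inputShape: [4]}));
    model.add(tf.layers.dense({units: 1}));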
References:
Ba, J. L., Kiros, J. R., & Hinton, G. E. (2016). Layer Normalization.
arXiv:1607.06450. https://arxiv.org/abs/1607.06450