mlpack::ann::LayerNorm< InputDataType, OutputDataType > Class Template Reference

Declaration of the Layer Normalization class.

#include <layer_norm.hpp>

Public Member Functions

 LayerNorm ()
 Create the LayerNorm object.
 
 LayerNorm (const size_t size, const double eps=1e-8)
 Create the LayerNorm object for a specified number of input units.
 
void Reset ()
 Reset the layer parameters.
 
template<typename eT >
void Forward (const arma::Mat< eT > &input, arma::Mat< eT > &output)
 Forward pass of Layer Normalization.
 
template<typename eT >
void Backward (const arma::Mat< eT > &input, const arma::Mat< eT > &gy, arma::Mat< eT > &g)
 Backward pass through the layer.
 
template<typename eT >
void Gradient (const arma::Mat< eT > &input, const arma::Mat< eT > &error, arma::Mat< eT > &gradient)
 Calculate the gradient using the output delta and the input activations.
 
OutputDataType const & Parameters () const
 Get the parameters.
 
OutputDataType & Parameters ()
 Modify the parameters.
 
OutputDataType const & OutputParameter () const
 Get the output parameter.
 
OutputDataType & OutputParameter ()
 Modify the output parameter.
 
OutputDataType const & Delta () const
 Get the delta.
 
OutputDataType & Delta ()
 Modify the delta.
 
OutputDataType const & Gradient () const
 Get the gradient.
 
OutputDataType & Gradient ()
 Modify the gradient.
 
OutputDataType Mean ()
 Get the mean computed across a single training instance.
 
OutputDataType Variance ()
 Get the variance computed across a single training instance.
 
size_t InSize () const
 Get the number of input units.
 
double Epsilon () const
 Get the value of epsilon.
 
size_t InputShape () const
 Get the shape of the input.
 
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
 Serialize the layer.
 

Detailed Description

template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
class mlpack::ann::LayerNorm< InputDataType, OutputDataType >

Declaration of the Layer Normalization class.

The layer transforms the input data to zero mean and unit variance, and then scales and shifts it by the parameters gamma and beta, respectively, computed over a single training instance. These parameters are learned by the network. Layer Normalization differs from Batch Normalization in that normalization is performed for each training case individually, with the mean and standard deviation computed across the layer's dimensions rather than across the batch.
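
Concretely, for an input vector x with H units (using the notation of the Ba et al. paper cited below; these are mathematical symbols, not identifiers from this header), the transformation is

    \mu = \frac{1}{H} \sum_{i=1}^{H} x_i, \qquad
    \sigma^2 = \frac{1}{H} \sum_{i=1}^{H} (x_i - \mu)^2, \qquad
    y = \gamma \odot \frac{x - \mu}{\sqrt{\sigma^2 + \epsilon}} + \beta

where gamma and beta are the learned per-unit scale and shift vectors, \odot denotes element-wise multiplication, and epsilon is the stability constant passed to the constructor.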

For more information, refer to the following papers:

@article{Ba16,
  author  = {Jimmy Lei Ba and Jamie Ryan Kiros and Geoffrey E. Hinton},
  title   = {Layer Normalization},
  journal = {CoRR},
  volume  = {abs/1607.06450},
  year    = {2016},
  url     = {http://arxiv.org/abs/1607.06450},
  eprint  = {1607.06450},
}

@article{Ioffe15,
  author  = {Sergey Ioffe and Christian Szegedy},
  title   = {Batch Normalization: Accelerating Deep Network Training by
             Reducing Internal Covariate Shift},
  journal = {CoRR},
  volume  = {abs/1502.03167},
  year    = {2015},
  url     = {http://arxiv.org/abs/1502.03167},
  eprint  = {1502.03167},
}
Template Parameters
  InputDataType   Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
  OutputDataType  Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
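
Both template parameters default to arma::mat, so the following two declarations are equivalent. A minimal sketch; it assumes layer_norm.hpp is included and the mlpack::ann namespace is in scope, and the layer size 16 is illustrative:

    LayerNorm<> layerA(16);                      // uses the arma::mat defaults
    LayerNorm<arma::mat, arma::mat> layerB(16);  // spelled out explicitly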

Constructor & Destructor Documentation

◆ LayerNorm() [1/2]

template<typename InputDataType , typename OutputDataType >
mlpack::ann::LayerNorm< InputDataType, OutputDataType >::LayerNorm ( )

Create the LayerNorm object.


◆ LayerNorm() [2/2]

template<typename InputDataType , typename OutputDataType >
mlpack::ann::LayerNorm< InputDataType, OutputDataType >::LayerNorm (const size_t size, const double eps = 1e-8)

Create the LayerNorm object for a specified number of input units.

Parameters
  size  The number of input units.
  eps   The epsilon added to the variance to ensure numerical stability.
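
In practice the layer is usually added to a network rather than constructed by hand. A minimal sketch, assuming the mlpack 3.x FFN API; the layer sizes are illustrative:

    #include <mlpack/methods/ann/layer/layer.hpp>
    #include <mlpack/methods/ann/ffn.hpp>

    using namespace mlpack::ann;

    // A small classifier whose 20-unit hidden layer is layer-normalized.
    FFN<> model;                  // defaults: NegativeLogLikelihood<>, RandomInitialization
    model.Add<Linear<>>(10, 20);  // 10 inputs -> 20 hidden units
    model.Add<LayerNorm<>>(20);   // size = 20; eps keeps its 1e-8 default
    model.Add<ReLULayer<>>();
    model.Add<Linear<>>(20, 2);
    model.Add<LogSoftMax<>>();

Placing the normalization between the linear transform and the nonlinearity follows the arrangement studied in the Layer Normalization paper.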

Member Function Documentation

◆ Backward()

template<typename InputDataType , typename OutputDataType >
template<typename eT >
void mlpack::ann::LayerNorm< InputDataType, OutputDataType >::Backward (const arma::Mat< eT > & input, const arma::Mat< eT > & gy, arma::Mat< eT > & g)

Backward pass through the layer.

Parameters
  input  The input activations.
  gy     The backpropagated error.
  g      The calculated gradient.
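
A minimal sketch of calling Backward() directly; it assumes a prior Forward() call, since the backward pass reuses the mean and variance stored during the forward pass (the sizes and fill values here are illustrative):

    arma::mat input(10, 4, arma::fill::randn), output;
    LayerNorm<> layer(10);
    layer.Reset();
    layer.Forward(input, output);

    arma::mat gy(10, 4, arma::fill::ones);  // error arriving from the next layer
    arma::mat g;                            // error w.r.t. this layer's input
    layer.Backward(input, gy, g);           // g has the same shape as input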

◆ Forward()

template<typename InputDataType , typename OutputDataType >
template<typename eT >
void mlpack::ann::LayerNorm< InputDataType, OutputDataType >::Forward (const arma::Mat< eT > & input, arma::Mat< eT > & output)

Forward pass of Layer Normalization.

Transforms the input data into zero mean and unit variance, scales the data by a factor gamma and shifts it by beta.

Parameters
  input   Input data for the layer.
  output  Resulting output activations.
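
A minimal sketch of a standalone forward pass, assuming the layer is used directly rather than inside an FFN (the 10-unit input and batch of 4 columns are illustrative):

    #include <mlpack/methods/ann/layer/layer_norm.hpp>

    using namespace mlpack::ann;

    arma::mat input(10, 4, arma::fill::randn);  // 10 units, batch of 4
    arma::mat output;

    LayerNorm<> layer(10);
    layer.Reset();                 // initializes gamma to ones, beta to zeros
    layer.Forward(input, output);
    // Each column of output now has approximately zero mean and unit
    // variance, then is scaled by gamma and shifted by beta.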

◆ Gradient()

template<typename InputDataType , typename OutputDataType >
template<typename eT >
void mlpack::ann::LayerNorm< InputDataType, OutputDataType >::Gradient (const arma::Mat< eT > & input, const arma::Mat< eT > & error, arma::Mat< eT > & gradient)

Calculate the gradient using the output delta and the input activations.

Parameters
  input     The input activations.
  error     The calculated error.
  gradient  The calculated gradient.
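
A hedged sketch of accumulating the parameter gradients; it assumes a completed forward pass and the gamma-then-beta parameter layout implied by Reset() (sizes and fill values are illustrative):

    arma::mat input(10, 4, arma::fill::randn), output;
    LayerNorm<> layer(10);
    layer.Reset();
    layer.Forward(input, output);

    arma::mat error(10, 4, arma::fill::ones);  // delta from the next layer
    arma::mat gradient;
    layer.Gradient(input, error, gradient);
    // gradient stacks d(loss)/d(gamma) over d(loss)/d(beta), matching the
    // 2 * size layout of Parameters().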
