mlpack
mlpack::ann::BatchNorm< InputDataType, OutputDataType > Class Template Reference

Declaration of the Batch Normalization layer class.

#include <batch_norm.hpp>

Public Member Functions

 BatchNorm ()
 Create the BatchNorm object.
 
 BatchNorm (const size_t size, const double eps=1e-8, const bool average=true, const double momentum=0.1)
 Create the BatchNorm layer object for a specified number of input units.
 
void Reset ()
 Reset the layer parameters.
 
template<typename eT >
void Forward (const arma::Mat< eT > &input, arma::Mat< eT > &output)
 Forward pass of the Batch Normalization layer.
 
template<typename eT >
void Backward (const arma::Mat< eT > &input, const arma::Mat< eT > &gy, arma::Mat< eT > &g)
 Backward pass through the layer.
 
template<typename eT >
void Gradient (const arma::Mat< eT > &input, const arma::Mat< eT > &error, arma::Mat< eT > &gradient)
 Calculate the gradient using the output delta and the input activations.
 
OutputDataType const & Parameters () const
 Get the parameters.
 
OutputDataType & Parameters ()
 Modify the parameters.
 
OutputDataType const & OutputParameter () const
 Get the output parameter.
 
OutputDataType & OutputParameter ()
 Modify the output parameter.
 
OutputDataType const & Delta () const
 Get the delta.
 
OutputDataType & Delta ()
 Modify the delta.
 
OutputDataType const & Gradient () const
 Get the gradient.
 
OutputDataType & Gradient ()
 Modify the gradient.
 
bool Deterministic () const
 Get the value of the deterministic parameter.
 
bool & Deterministic ()
 Modify the value of the deterministic parameter.
 
OutputDataType const & TrainingMean () const
 Get the mean over the training data.
 
OutputDataType & TrainingMean ()
 Modify the mean over the training data.
 
OutputDataType const & TrainingVariance () const
 Get the variance over the training data.
 
OutputDataType & TrainingVariance ()
 Modify the variance over the training data.
 
size_t InputSize () const
 Get the number of input units / channels.
 
double Epsilon () const
 Get the epsilon value.
 
double Momentum () const
 Get the momentum value.
 
bool Average () const
 Get the average parameter.
 
size_t WeightSize () const
 Get size of weights.
 
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
 Serialize the layer.
 

Detailed Description

template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
class mlpack::ann::BatchNorm< InputDataType, OutputDataType >

Declaration of the Batch Normalization layer class.

The layer transforms the input data to zero mean and unit variance, then scales and shifts it by the parameters gamma and beta, respectively. These parameters are learnt by the network.

If deterministic is false (training), the mean and variance over the batch are calculated and the data is normalized with them. If it is set to true (testing), the mean and variance accrued over the training set are used instead.
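As a sketch of what the training-mode transform computes, the following standalone C++ (no mlpack dependency; `BatchNormForward` is a name invented here for illustration) normalizes a batch of activations for a single unit and then applies gamma and beta:

```cpp
#include <cmath>
#include <vector>

// Illustrative sketch of the training-mode batch norm transform:
//   y = gamma * (x - mean) / sqrt(var + eps) + beta,
// using the biased (1/n) batch variance, as is conventional for batch norm.
std::vector<double> BatchNormForward(const std::vector<double>& x,
                                     const double gamma,
                                     const double beta,
                                     const double eps = 1e-8)
{
  const size_t n = x.size();

  double mean = 0.0;
  for (double v : x) mean += v;
  mean /= n;

  double var = 0.0;
  for (double v : x) var += (v - mean) * (v - mean);
  var /= n;

  std::vector<double> y(n);
  for (size_t i = 0; i < n; ++i)
    y[i] = gamma * (x[i] - mean) / std::sqrt(var + eps) + beta;
  return y;
}
```

With gamma = 1 and beta = 0 the output has (approximately) zero mean and unit variance; nonzero beta and non-unit gamma then shift and rescale it.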

For more information, refer to the following paper:

@article{Ioffe15,
author = {Sergey Ioffe and
Christian Szegedy},
title = {Batch Normalization: Accelerating Deep Network Training by
Reducing Internal Covariate Shift},
journal = {CoRR},
volume = {abs/1502.03167},
year = {2015},
url = {http://arxiv.org/abs/1502.03167},
eprint = {1502.03167},
}
Template Parameters
InputDataType: Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
OutputDataType: Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Constructor & Destructor Documentation

◆ BatchNorm() [1/2]

template<typename InputDataType , typename OutputDataType >
mlpack::ann::BatchNorm< InputDataType, OutputDataType >::BatchNorm ( )

Create the BatchNorm object.


◆ BatchNorm() [2/2]

template<typename InputDataType , typename OutputDataType >
mlpack::ann::BatchNorm< InputDataType, OutputDataType >::BatchNorm ( const size_t  size,
const double  eps = 1e-8,
const bool  average = true,
const double  momentum = 0.1 
)

Create the BatchNorm layer object for a specified number of input units.

Parameters
size: The number of input units / channels.
eps: The epsilon added to the variance to ensure numerical stability.
average: Whether the cumulative average (true) or the momentum update (false) is used to update the running parameters.
momentum: Parameter used to update the running mean and variance.

Member Function Documentation

◆ Backward()

template<typename InputDataType , typename OutputDataType >
template<typename eT >
void mlpack::ann::BatchNorm< InputDataType, OutputDataType >::Backward ( const arma::Mat< eT > &  input,
const arma::Mat< eT > &  gy,
arma::Mat< eT > &  g 
)

Backward pass through the layer.

Parameters
input: The input activations.
gy: The backpropagated error.
g: The calculated gradient.

◆ Forward()

template<typename InputDataType , typename OutputDataType >
template<typename eT >
void mlpack::ann::BatchNorm< InputDataType, OutputDataType >::Forward ( const arma::Mat< eT > &  input,
arma::Mat< eT > &  output 
)

Forward pass of the Batch Normalization layer.

Transforms the input data into zero mean and unit variance, scales the data by a factor gamma and shifts it by beta.

Parameters
input: Input data for the layer.
output: Resulting output activations.

◆ Gradient()

template<typename InputDataType , typename OutputDataType >
template<typename eT >
void mlpack::ann::BatchNorm< InputDataType, OutputDataType >::Gradient ( const arma::Mat< eT > &  input,
const arma::Mat< eT > &  error,
arma::Mat< eT > &  gradient 
)

Calculate the gradient using the output delta and the input activations.

Parameters
input: The input activations.
error: The calculated error.
gradient: The calculated gradient.
