mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers > Class Template Reference

Declaration of the WeightNorm layer class.

#include <weight_norm.hpp>

Public Member Functions

 WeightNorm (LayerTypes< CustomLayers... > layer=LayerTypes< CustomLayers... >())
 Create the WeightNorm layer object.
 
 ~WeightNorm ()
 Destructor to release allocated memory.
 
void Reset ()
 Reset the layer parameters.
 
template<typename eT >
void Forward (const arma::Mat< eT > &input, arma::Mat< eT > &output)
 Forward pass of the WeightNorm layer.
 
template<typename eT >
void Backward (const arma::Mat< eT > &input, const arma::Mat< eT > &gy, arma::Mat< eT > &g)
 Backward pass through the layer.
 
template<typename eT >
void Gradient (const arma::Mat< eT > &input, const arma::Mat< eT > &error, arma::Mat< eT > &gradient)
 Calculate the gradient using the output delta, input activations and the weights of the wrapped layer.
 
OutputDataType const & Delta () const
 Get the delta.
 
OutputDataType & Delta ()
 Modify the delta.
 
OutputDataType const & Gradient () const
 Get the gradient.
 
OutputDataType & Gradient ()
 Modify the gradient.
 
OutputDataType const & OutputParameter () const
 Get the output parameter.
 
OutputDataType & OutputParameter ()
 Modify the output parameter.
 
OutputDataType const & Parameters () const
 Get the parameters.
 
OutputDataType & Parameters ()
 Modify the parameters.
 
LayerTypes< CustomLayers... > const & Layer ()
 Get the wrapped layer.
 
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
 Serialize the layer.
 

Detailed Description

template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat, typename... CustomLayers>
class mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers >

Declaration of the WeightNorm layer class.

The layer reparameterizes the weight vectors in a neural network, decoupling the length of those weight vectors from their direction. This reparameterization does not introduce any dependencies between the examples in a mini-batch.
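
Concretely, following Salimans & Kingma (2016), every weight vector w of the wrapped layer is expressed in terms of a parameter vector v and a scalar parameter g as

    w = (g / ||v||) * v,

so that the length ||w|| = g is learned independently of the direction v / ||v||, and optimization proceeds over g and v rather than over w directly.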

This class acts as a wrapper around an existing layer; it modifies only the calculation and update of that layer's weights.

For more information, refer to the following paper:

@inproceedings{Salimans2016WeightNorm,
  title     = {Weight Normalization: A Simple Reparameterization to Accelerate
               Training of Deep Neural Networks},
  author    = {Tim Salimans and Diederik P. Kingma},
  booktitle = {Neural Information Processing Systems 2016},
  year      = {2016},
  url       = {https://arxiv.org/abs/1602.07868},
}
Template Parameters
    InputDataType     Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
    OutputDataType    Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
    CustomLayers      Additional custom layers that can be added.

Constructor & Destructor Documentation

◆ WeightNorm()

template<typename InputDataType , typename OutputDataType , typename... CustomLayers>
mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers >::WeightNorm ( LayerTypes< CustomLayers... >  layer = LayerTypes<CustomLayers...>())

Create the WeightNorm layer object.

Parameters
    layer    The layer whose weights should be normalized.
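
As a usage illustration, here is a minimal sketch of wrapping a layer inside a network (assuming the mlpack 3.x ANN API; the layer sizes and loss function are arbitrary choices for the example):

#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>
#include <mlpack/methods/ann/init_rules/random_init.hpp>

using namespace mlpack::ann;

int main()
{
  // A small feed-forward classifier in which the Linear layer's
  // weights are reparameterized by WeightNorm.
  FFN<NegativeLogLikelihood<>, RandomInitialization> model;

  // WeightNorm takes ownership of the wrapped layer pointer and
  // frees it in its destructor (see ~WeightNorm() above).
  model.Add<WeightNorm<>>(new Linear<>(10, 5));
  model.Add<LogSoftMax<>>();
}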

Member Function Documentation

◆ Backward()

template<typename InputDataType , typename OutputDataType , typename... CustomLayers>
template<typename eT >
void mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers >::Backward ( const arma::Mat< eT > &  input,
const arma::Mat< eT > &  gy,
arma::Mat< eT > &  g 
)

Backward pass through the layer.

This function calls the Backward() function of the wrapped layer.

Parameters
    input    The input activations.
    gy       The backpropagated error.
    g        The calculated gradient.

◆ Forward()

template<typename InputDataType , typename OutputDataType , typename... CustomLayers>
template<typename eT >
void mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers >::Forward ( const arma::Mat< eT > &  input,
arma::Mat< eT > &  output 
)

Forward pass of the WeightNorm layer.

Calculates the weights of the wrapped layer from the parameter vector v and the scalar parameter g. It then calculates the output of the wrapped layer using the computed weights.

Parameters
    input     Input data for the layer.
    output    Resulting output activations.
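
To make the weight computation concrete, here is a small standalone Armadillo sketch of the reparameterization step (an illustration only, not the exact mlpack implementation; the variable names are hypothetical):

#include <armadillo>

int main()
{
  arma::vec v = {0.5, -1.0, 2.0};  // Direction parameters for one weight vector.
  double g = 1.5;                  // Scalar length parameter.

  // Weight normalization: w = (g / ||v||) * v, so that ||w|| == g.
  arma::vec w = (g / arma::norm(v, 2)) * v;

  w.print("normalized weight vector w:");
}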

◆ Gradient()

template<typename InputDataType , typename OutputDataType , typename... CustomLayers>
template<typename eT >
void mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers >::Gradient ( const arma::Mat< eT > &  input,
const arma::Mat< eT > &  error,
arma::Mat< eT > &  gradient 
)

Calculate the gradient using the output delta, input activations and the weights of the wrapped layer.

Parameters
    input       The input activations.
    error       The calculated error.
    gradient    The calculated gradient.
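
For reference, the paper derives the gradients of the loss L with respect to g and v from the gradient grad_w of L with respect to the wrapped layer's weights:

    grad_g = (grad_w . v) / ||v||
    grad_v = (g / ||v||) * grad_w - (g * grad_g / ||v||^2) * v

Here grad_w is obtained from the wrapped layer's own gradient computation, and "." denotes the dot product.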
