Declaration of the WeightNorm layer class.
#include <weight_norm.hpp>
Public Member Functions

WeightNorm (LayerTypes< CustomLayers... > layer=LayerTypes< CustomLayers... >())
    Create the WeightNorm layer object.
~WeightNorm ()
    Destructor to release allocated memory.
void Reset ()
    Reset the layer parameters.
template<typename eT >
void Forward (const arma::Mat< eT > &input, arma::Mat< eT > &output)
    Forward pass of the WeightNorm layer.
template<typename eT >
void Backward (const arma::Mat< eT > &input, const arma::Mat< eT > &gy, arma::Mat< eT > &g)
    Backward pass through the layer.
template<typename eT >
void Gradient (const arma::Mat< eT > &input, const arma::Mat< eT > &error, arma::Mat< eT > &gradient)
    Calculate the gradient using the output delta, input activations, and the weights of the wrapped layer.
OutputDataType const & Delta () const
    Get the delta.
OutputDataType & Delta ()
    Modify the delta.
OutputDataType const & Gradient () const
    Get the gradient.
OutputDataType & Gradient ()
    Modify the gradient.
OutputDataType const & OutputParameter () const
    Get the output parameter.
OutputDataType & OutputParameter ()
    Modify the output parameter.
OutputDataType const & Parameters () const
    Get the parameters.
OutputDataType & Parameters ()
    Modify the parameters.
LayerTypes< CustomLayers... > const & Layer ()
    Get the wrapped layer.
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
    Serialize the layer.
Detailed Description
The layer reparameterizes the weight vectors of a wrapped layer, decoupling the length of each weight vector from its direction. This reparameterization introduces no dependencies between the examples in a mini-batch.
This class is a wrapper around an existing layer: it only changes how that layer's weights are computed and updated.
For more information, refer to the following paper:

Salimans, T., and Kingma, D. P., "Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks," 2016. arXiv:1602.07868.
| InputDataType | Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube). |
| OutputDataType | Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube). |
| CustomLayers | Additional custom layers that can be added. |
mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers >::WeightNorm ( LayerTypes< CustomLayers... > layer = LayerTypes< CustomLayers... >() )
Create the WeightNorm layer object.
| layer | The layer whose weights are to be normalized. |
void mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers >::Backward ( const arma::Mat< eT > & input, const arma::Mat< eT > & gy, arma::Mat< eT > & g )
Backward pass through the layer.
This function calls the Backward() function of the wrapped layer.
| input | The input activations. |
| gy | The backpropagated error. |
| g | The calculated gradient. |
void mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers >::Forward ( const arma::Mat< eT > & input, arma::Mat< eT > & output )
Forward pass of the WeightNorm layer.
Calculates the weights of the wrapped layer from the parameter vector v and the scalar parameter g. It then calculates the output of the wrapped layer using the computed weights.
| input | Input data for the layer. |
| output | Resulting output activations. |
void mlpack::ann::WeightNorm< InputDataType, OutputDataType, CustomLayers >::Gradient ( const arma::Mat< eT > & input, const arma::Mat< eT > & error, arma::Mat< eT > & gradient )
Calculate the gradient using the output delta, input activations and the weights of the wrapped layer.
| input | The input activations. |
| error | The calculated error. |
| gradient | The calculated gradient. |