mlpack
mlpack::ann::Dropout< InputDataType, OutputDataType > Class Template Reference

The dropout layer is a regularizer that randomly sets input values to zero with probability 'ratio' and scales the remaining elements by a factor of 1 / (1 - ratio) during training, rather than scaling at test time, so that the expected sum of the activations stays the same. More...

#include <dropout.hpp>

Public Member Functions

 Dropout (const double ratio=0.5)
 Create the Dropout object using the specified ratio parameter. More...
 
 Dropout (const Dropout &layer)
 Copy Constructor.
 
 Dropout (const Dropout &&)
 Move Constructor.
 
Dropout & operator= (const Dropout &layer)
 Copy assignment operator.
 
Dropout & operator= (Dropout &&layer)
 Move assignment operator.
 
template<typename eT >
void Forward (const arma::Mat< eT > &input, arma::Mat< eT > &output)
 Ordinary feed forward pass of the dropout layer. More...
 
template<typename eT >
void Backward (const arma::Mat< eT > &, const arma::Mat< eT > &gy, arma::Mat< eT > &g)
 Ordinary feed backward pass of the dropout layer. More...
 
OutputDataType const & OutputParameter () const
 Get the output parameter.
 
OutputDataType & OutputParameter ()
 Modify the output parameter.
 
OutputDataType const & Delta () const
 Get the delta.
 
OutputDataType & Delta ()
 Modify the delta.
 
bool Deterministic () const
 Get the value of the deterministic parameter.
 
bool & Deterministic ()
 Modify the value of the deterministic parameter.
 
double Ratio () const
 The probability of setting a value to zero.
 
void Ratio (const double r)
 Modify the probability of setting a value to zero.
 
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
 Serialize the layer.
 

Detailed Description

template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
class mlpack::ann::Dropout< InputDataType, OutputDataType >

The dropout layer is a regularizer that randomly sets input values to zero with probability 'ratio' and scales the remaining elements by a factor of 1 / (1 - ratio) during training, rather than scaling at test time, so that the expected sum of the activations stays the same.

In the deterministic mode (during testing), there is no change in the input.

Note: During training you should set deterministic to false and during testing you should set deterministic to true.
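The training/testing behavior described above (inverted dropout) can be sketched in plain C++. This is a minimal illustration of the rule, not mlpack's actual implementation: mlpack operates on Armadillo matrices, draws from its own RNG, and stores the mask internally for the backward pass.

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// Sketch of the inverted-dropout forward rule: during training, each
// element is zeroed with probability `ratio` and the survivors are scaled
// by 1 / (1 - ratio); in deterministic (test) mode the input passes
// through unchanged, so no rescaling is needed at test time.
std::vector<double> DropoutForward(const std::vector<double>& input,
                                   const double ratio,
                                   const bool deterministic)
{
  std::vector<double> output(input.size());
  const double scale = 1.0 / (1.0 - ratio);
  for (std::size_t i = 0; i < input.size(); ++i)
  {
    if (deterministic)
      output[i] = input[i];                              // test mode: identity
    else if ((double) std::rand() / RAND_MAX < ratio)
      output[i] = 0.0;                                   // dropped element
    else
      output[i] = input[i] * scale;                      // surviving element
  }
  return output;
}
```

Because survivors are scaled by 1 / (1 - ratio), the expected value of each output element equals the corresponding input element, which is why the deterministic (test-time) path can simply pass the input through.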

For more information, see the following.

@article{Hinton2012,
  author  = {Geoffrey E. Hinton and Nitish Srivastava and Alex Krizhevsky and
             Ilya Sutskever and Ruslan Salakhutdinov},
  title   = {Improving neural networks by preventing co-adaptation of
             feature detectors},
  journal = {CoRR},
  volume  = {abs/1207.0580},
  year    = {2012},
  url     = {https://arxiv.org/abs/1207.0580}
}
Template Parameters
InputDataType	Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
OutputDataType	Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Constructor & Destructor Documentation

◆ Dropout()

template<typename InputDataType , typename OutputDataType >
mlpack::ann::Dropout< InputDataType, OutputDataType >::Dropout ( const double  ratio = 0.5)

Create the Dropout object using the specified ratio parameter.

Parameters
ratio	The probability of setting a value to zero.

Member Function Documentation

◆ Backward()

template<typename InputDataType , typename OutputDataType >
template<typename eT >
void mlpack::ann::Dropout< InputDataType, OutputDataType >::Backward ( const arma::Mat< eT > &  ,
const arma::Mat< eT > &  gy,
arma::Mat< eT > &  g 
)

Ordinary feed backward pass of the dropout layer.

Parameters
(input)	The propagated input activation (the parameter is unnamed in the signature above).
gy	The backpropagated error.
g	The calculated gradient.
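The backward rule can be sketched as follows. This is a hedged illustration, not mlpack's implementation: mlpack retains the dropout mask from the forward pass internally, whereas here it is passed in explicitly.

```cpp
#include <cassert>
#include <vector>

// Sketch of the dropout backward rule: the backpropagated error `gy` flows
// only through the elements that survived the forward pass (mask[i] == true),
// scaled by the same 1 / (1 - ratio) factor; dropped units get zero gradient.
std::vector<double> DropoutBackward(const std::vector<double>& gy,
                                    const std::vector<bool>& mask,
                                    const double ratio)
{
  const double scale = 1.0 / (1.0 - ratio);
  std::vector<double> g(gy.size());
  for (std::size_t i = 0; i < gy.size(); ++i)
    g[i] = mask[i] ? gy[i] * scale : 0.0;
  return g;
}
```

Note that the gradient uses the same mask and scale as the forward pass, which is why the propagated input activation itself is not needed (and is unnamed) in Backward's signature.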

◆ Forward()

template<typename InputDataType , typename OutputDataType >
template<typename eT >
void mlpack::ann::Dropout< InputDataType, OutputDataType >::Forward ( const arma::Mat< eT > &  input,
arma::Mat< eT > &  output 
)

Ordinary feed forward pass of the dropout layer.

Parameters
input	Input data used for evaluating the specified function.
output	Resulting output activation.

The documentation for this class was generated from the following file: dropout.hpp