mlpack
mlpack::ann::ELU< InputDataType, OutputDataType > Class Template Reference

The ELU activation function, defined below.

#include <elu.hpp>

Public Member Functions

 ELU ()
 Create the ELU object.
 
 ELU (const double alpha)
 Create the ELU object using the specified parameter.
 
template<typename InputType , typename OutputType >
void Forward (const InputType &input, OutputType &output)
 Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.
 
template<typename DataType >
void Backward (const DataType &input, const DataType &gy, DataType &g)
 Ordinary feed backward pass of a neural network, calculating the gradient with respect to the input by propagating the error backwards through f.
 
OutputDataType const & OutputParameter () const
 Get the output parameter.
 
OutputDataType & OutputParameter ()
 Modify the output parameter.
 
OutputDataType const & Delta () const
 Get the delta.
 
OutputDataType & Delta ()
 Modify the delta.
 
double const & Alpha () const
 Get the non-zero gradient parameter alpha.
 
double & Alpha ()
 Modify the non-zero gradient parameter alpha.
 
bool Deterministic () const
 Get the value of the deterministic parameter.
 
bool & Deterministic ()
 Modify the value of the deterministic parameter.
 
double const & Lambda () const
 Get the lambda parameter.
 
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
 Serialize the layer.
 

Detailed Description

template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
class mlpack::ann::ELU< InputDataType, OutputDataType >

The ELU activation function, defined by.

\begin{eqnarray*}
f(x) &=& \left\{
  \begin{array}{lr}
    x & : x > 0 \\
    \alpha(e^x - 1) & : x \le 0
  \end{array}
\right. \\
f'(x) &=& \left\{
  \begin{array}{lr}
    1 & : x > 0 \\
    f(x) + \alpha & : x \le 0
  \end{array}
\right.
\end{eqnarray*}

For more information, read the following paper:

@article{Clevert2015,
author = {Djork{-}Arn{\'{e}} Clevert and Thomas Unterthiner and
Sepp Hochreiter},
title = {Fast and Accurate Deep Network Learning by Exponential Linear
Units (ELUs)},
journal = {CoRR},
year = {2015},
url = {https://arxiv.org/abs/1511.07289}
}

The SELU activation function is defined by

\begin{eqnarray*}
f(x) &=& \left\{
  \begin{array}{lr}
    \lambda * x & : x > 0 \\
    \lambda * \alpha(e^x - 1) & : x \le 0
  \end{array}
\right. \\
f'(x) &=& \left\{
  \begin{array}{lr}
    \lambda & : x > 0 \\
    f(x) + \lambda * \alpha & : x \le 0
  \end{array}
\right.
\end{eqnarray*}

For more information, read the following paper:

@article{Klambauer2017,
author = {Gunter Klambauer and Thomas Unterthiner and
Andreas Mayr},
title = {Self-Normalizing Neural Networks},
journal = {Advances in Neural Information Processing Systems},
year = {2017},
url = {https://arxiv.org/abs/1706.02515}
}

In deterministic mode, the derivative is not computed.

Note
During training, deterministic should be set to false; during testing/inference, it should be set to true.
Make sure to use the SELU activation function with normalized inputs and weights initialized with Lecun normal initialization.
Template Parameters
InputDataType: Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
OutputDataType: Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Constructor & Destructor Documentation

◆ ELU() [1/2]

template<typename InputDataType , typename OutputDataType >
mlpack::ann::ELU< InputDataType, OutputDataType >::ELU ( )

Create the ELU object.

Note: Use this constructor for the SELU activation function.

◆ ELU() [2/2]

template<typename InputDataType , typename OutputDataType >
mlpack::ann::ELU< InputDataType, OutputDataType >::ELU ( const double  alpha)

Create the ELU object using the specified parameter.

The non-zero gradient for negative inputs can be adjusted by specifying the ELU hyperparameter alpha (alpha > 0).

Note
Use this constructor for the ELU activation function.
Parameters
alpha: Scale parameter for the negative factor.

Member Function Documentation

◆ Backward()

template<typename InputDataType , typename OutputDataType >
template<typename DataType >
void mlpack::ann::ELU< InputDataType, OutputDataType >::Backward (const DataType & input, const DataType & gy, DataType & g)

Ordinary feed backward pass of a neural network, calculating the gradient with respect to the input by propagating the error backwards through f, using the results from the feed forward pass.

Parameters
input: The propagated input activation f(x).
gy: The backpropagated error.
g: The calculated gradient.

◆ Forward()

template<typename InputDataType , typename OutputDataType >
template<typename InputType , typename OutputType >
void mlpack::ann::ELU< InputDataType, OutputDataType >::Forward (const InputType & input, OutputType & output)

Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

Parameters
input: Input data used for evaluating the specified function.
output: Resulting output activation.

The documentation for this class was generated from the file elu.hpp.