The FlexibleReLU activation function, defined by f(x) = max(0, x) + alpha.
#include <flexible_relu.hpp>
Public Member Functions

| Member | Description |
| --- | --- |
| FlexibleReLU (const double alpha=0) | Create the FlexibleReLU object using the specified parameters. |
| void Reset () | Reset the layer parameter. |
| template<typename InputType, typename OutputType> void Forward (const InputType &input, OutputType &output) | Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. |
| template<typename DataType> void Backward (const DataType &input, const DataType &gy, DataType &g) | Ordinary feed backward pass of a neural network, calculating the function f(x) by propagating x backwards through f. |
| template<typename eT> void Gradient (const arma::Mat< eT > &input, const arma::Mat< eT > &error, arma::Mat< eT > &gradient) | Calculate the gradient using the output delta and the input activation. |
| OutputDataType const & Parameters () const | Get the parameters. |
| OutputDataType & Parameters () | Modify the parameters. |
| OutputDataType const & OutputParameter () const | Get the output parameter. |
| OutputDataType & OutputParameter () | Modify the output parameter. |
| OutputDataType const & Delta () const | Get the delta. |
| OutputDataType & Delta () | Modify the delta. |
| OutputDataType const & Gradient () const | Get the gradient. |
| OutputDataType & Gradient () | Modify the gradient. |
| double const & Alpha () const | Get the parameter controlling the range of the ReLU function. |
| double & Alpha () | Modify the parameter controlling the range of the ReLU function. |
| template<typename Archive> void serialize (Archive &ar, const uint32_t) | Serialize the layer. |
The FlexibleReLU activation function is defined by

\begin{eqnarray*}
f(x) &=& \max(0, x) + \alpha \\
f'(x) &=& \left\{
  \begin{array}{lr}
    1 & : x > 0 \\
    0 & : x \le 0
  \end{array}
\right.
\end{eqnarray*}
For more information, see the following paper:

Suo Qiu, Xiangmin Xu, and Bolun Cai. "FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks." arXiv:1706.08098.
Template Parameters

| InputDataType | Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube). |
| OutputDataType | Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube). |
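As an illustration of the interface listed above, here is a minimal usage sketch that drives the layer directly with Armadillo matrices. The header paths, the arma::mat template arguments, and calling Reset() by hand (normally handled by the enclosing network) are assumptions based on the mlpack 3.x layer API and may differ between versions.

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/ann/layer/flexible_relu.hpp>
#include <armadillo>

int main()
{
  // Layer with the trainable offset starting at alpha = 0.1.
  mlpack::ann::FlexibleReLU<arma::mat, arma::mat> frelu(0.1);
  frelu.Reset();  // Copy the user-supplied alpha into the trainable parameter.

  arma::mat input = arma::randn<arma::mat>(4, 3);  // 4 features, batch of 3.
  arma::mat output;
  frelu.Forward(input, output);                    // output = max(0, input) + alpha

  arma::mat gy = arma::ones<arma::mat>(4, 3);      // Upstream (backpropagated) error.
  arma::mat g;
  frelu.Backward(input, gy, g);                    // g = gy .* f'(input)

  output.print("f(x)");
  g.print("propagated error");
  return 0;
}
```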
mlpack::ann::FlexibleReLU< InputDataType, OutputDataType >::FlexibleReLU (const double alpha = 0)
Create the FlexibleReLU object using the specified parameters.
The non-zero offset can be adjusted by specifying the parameter alpha, which controls the range of the ReLU function (default alpha = 0). This parameter is trainable.

Parameters

| alpha | Parameter for adjusting the range of the ReLU function. |
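For example, continuing the sketch above (same includes), a layer whose offset starts at 0.2 could be created and inspected as follows; Alpha() is the accessor listed in the member table above.

```cpp
// Sketch: construct the layer with the trainable offset starting at 0.2.
mlpack::ann::FlexibleReLU<arma::mat, arma::mat> frelu(0.2);
frelu.Reset();                         // Copies the user-supplied alpha into the parameter.
const double a = frelu.Alpha();        // Read back the range-controlling parameter.
```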
void mlpack::ann::FlexibleReLU< InputDataType, OutputDataType >::Backward (const DataType &input, const DataType &gy, DataType &g)
Ordinary feed backward pass of a neural network, calculating the function f(x) by propagating x backwards through f, using the results from the feed forward pass.

Parameters

| input | The propagated input activation. |
| gy | The backpropagated error. |
| g | The calculated gradient. |

This computes the first derivative of the FlexibleReLU function.
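Element-wise, the backward rule multiplies the backpropagated error by f'(x) from the definition above. The following is a hand-rolled Armadillo sketch of that rule, illustrative only and not mlpack's actual implementation:

```cpp
#include <armadillo>

int main()
{
  arma::mat input = arma::randn<arma::mat>(4, 3);  // Propagated input activation.
  arma::mat gy    = arma::ones<arma::mat>(4, 3);   // Backpropagated error.

  // f'(x) = 1 for x > 0 and 0 otherwise; (input > 0.0) yields a 0/1 matrix.
  arma::mat derivative = arma::conv_to<arma::mat>::from(input > 0.0);

  // g = gy .* f'(input): the error passes only where the input was positive.
  arma::mat g = gy % derivative;

  g.print("g");
  return 0;
}
```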
void mlpack::ann::FlexibleReLU< InputDataType, OutputDataType >::Forward (const InputType &input, OutputType &output)
Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.
Parameters

| input | Input data used for evaluating the specified function. |
| output | Resulting output activation. |
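A plain Armadillo sketch of the forward rule f(x) = max(0, x) + alpha, again illustrative rather than the library's own code:

```cpp
#include <algorithm>
#include <armadillo>

int main()
{
  const double alpha = 0.1;                        // Illustrative offset value.
  arma::mat input = arma::randn<arma::mat>(4, 3);

  // output = max(0, input) + alpha, applied element-wise.
  arma::mat output = input;
  output.transform([](double x) { return std::max(x, 0.0); });
  output += alpha;

  output.print("f(x)");
  return 0;
}
```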
void mlpack::ann::FlexibleReLU< InputDataType, OutputDataType >::Gradient (const arma::Mat< eT > &input, const arma::Mat< eT > &error, arma::Mat< eT > &gradient)
Calculate the gradient using the output delta and the input activation.
Parameters

| input | The input parameter used for calculating the gradient. |
| error | The calculated error. |
| gradient | The calculated gradient. |
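Since df/dalpha = 1 for every element, the gradient of the loss with respect to the single trainable parameter alpha reduces to a sum over the backpropagated error; whether a batch-size normalisation is also applied is an assumption about the library's convention. A sketch:

```cpp
#include <armadillo>

int main()
{
  arma::mat error = arma::randn<arma::mat>(4, 3);  // Backpropagated error for a batch of 3.

  // df/dalpha = 1 everywhere, so the gradient for alpha is the summed error.
  // (Dividing by error.n_cols to average over the batch is a plausible
  // variation, not something documented on this page.)
  arma::mat gradient(1, 1);
  gradient(0) = arma::accu(error);

  gradient.print("gradient for alpha");
  return 0;
}
```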
void mlpack::ann::FlexibleReLU< InputDataType, OutputDataType >::Reset ()
Reset the layer parameter.
This sets the value of alpha to the one given by the user.
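Conceptually, Reset() copies the constructor argument into the single-element trainable parameter. The sketch below illustrates that idea with hypothetical names (userAlpha, alphaParam), which are not part of the documented API:

```cpp
#include <armadillo>

int main()
{
  // Illustrative only: hypothetical names, not mlpack's documented internals.
  const double userAlpha = 0.2;   // Value passed to the constructor.
  arma::mat alphaParam(1, 1);     // Single-element trainable parameter.
  alphaParam(0) = userAlpha;      // What Reset() conceptually does.
  return 0;
}
```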