mlpack
mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers > Class Template Reference

Implementation of a standard feed forward network.

#include <ffn.hpp>

Public Types

using NetworkType = FFN< OutputLayerType, InitializationRuleType >
 Convenience typedef for the internal model construction.
 

Public Member Functions

 FFN (OutputLayerType outputLayer=OutputLayerType(), InitializationRuleType initializeRule=InitializationRuleType())
 Create the FFN object.
 
 FFN (const FFN &)
 Copy constructor.
 
 FFN (FFN &&)
 Move constructor.
 
FFN & operator= (FFN)
 Copy/move assignment operator.
 
 ~FFN ()
 Destructor to release allocated memory.
 
template<typename OptimizerType >
std::enable_if< HasMaxIterations< OptimizerType, size_t &(OptimizerType::*)()>::value, void >::type WarnMessageMaxIterations (OptimizerType &optimizer, size_t samples) const
 Check whether the optimizer has a MaxIterations() parameter; if it does, check whether its value is less than the number of data points in the dataset.
 
template<typename OptimizerType >
std::enable_if< !HasMaxIterations< OptimizerType, size_t &(OptimizerType::*)()>::value, void >::type WarnMessageMaxIterations (OptimizerType &optimizer, size_t samples) const
 Check whether the optimizer has a MaxIterations() parameter; if it doesn't, simply return from the function.
 
template<typename OptimizerType , typename... CallbackTypes>
double Train (arma::mat predictors, arma::mat responses, OptimizerType &optimizer, CallbackTypes &&... callbacks)
 Train the feedforward network on the given input data using the given optimizer.
 
template<typename OptimizerType = ens::RMSProp, typename... CallbackTypes>
double Train (arma::mat predictors, arma::mat responses, CallbackTypes &&... callbacks)
 Train the feedforward network on the given input data.
 
void Predict (arma::mat predictors, arma::mat &results)
 Predict the responses to a given set of predictors.
 
template<typename PredictorsType , typename ResponsesType >
double Evaluate (const PredictorsType &predictors, const ResponsesType &responses)
 Evaluate the feedforward network with the given predictors and responses.
 
double Evaluate (const arma::mat &parameters)
 Evaluate the feedforward network with the given parameters.
 
double Evaluate (const arma::mat &parameters, const size_t begin, const size_t batchSize, const bool deterministic)
 Evaluate the feedforward network with the given parameters, but using only a number of data points.
 
double Evaluate (const arma::mat &parameters, const size_t begin, const size_t batchSize)
 Evaluate the feedforward network with the given parameters, but using only a number of data points.
 
template<typename GradType >
double EvaluateWithGradient (const arma::mat &parameters, GradType &gradient)
 Evaluate the feedforward network with the given parameters.
 
template<typename GradType >
double EvaluateWithGradient (const arma::mat &parameters, const size_t begin, GradType &gradient, const size_t batchSize)
 Evaluate the feedforward network with the given parameters, but using only a number of data points.
 
void Gradient (const arma::mat &parameters, const size_t begin, arma::mat &gradient, const size_t batchSize)
 Evaluate the gradient of the feedforward network with the given parameters, and with respect to only a number of points in the dataset.
 
void Shuffle ()
 Shuffle the order of function visitation.
 
template<class LayerType , class... Args>
void Add (Args... args)
 
void Add (LayerTypes< CustomLayers... > layer)
 
const std::vector< LayerTypes< CustomLayers... > > & Model () const
 Get the network model.
 
std::vector< LayerTypes< CustomLayers... > > & Model ()
 Modify the network model.
 
size_t NumFunctions () const
 Return the number of separable functions (the number of predictor points).
 
const arma::mat & Parameters () const
 Return the initial point for the optimization.
 
arma::mat & Parameters ()
 Modify the initial point for the optimization.
 
const arma::mat & Responses () const
 Get the matrix of responses to the input data points.
 
arma::mat & Responses ()
 Modify the matrix of responses to the input data points.
 
const arma::mat & Predictors () const
 Get the matrix of data points (predictors).
 
arma::mat & Predictors ()
 Modify the matrix of data points (predictors).
 
void ResetParameters ()
 Reset the module information (weights/parameters).
 
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
 Serialize the model.
 
template<typename PredictorsType , typename ResponsesType >
void Forward (const PredictorsType &inputs, ResponsesType &results)
 Perform the forward pass of the data in real batch mode.
 
template<typename PredictorsType , typename ResponsesType >
void Forward (const PredictorsType &inputs, ResponsesType &results, const size_t begin, const size_t end)
 Perform a partial forward pass of the data.
 
template<typename PredictorsType , typename TargetsType , typename GradientsType >
double Backward (const PredictorsType &inputs, const TargetsType &targets, GradientsType &gradients)
 Perform the backward pass of the data in real batch mode.
 

Friends

template<typename Model , typename InitializerType , typename NoiseType , typename PolicyType >
class GAN
 

Detailed Description

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization, typename... CustomLayers>
class mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >

Implementation of a standard feed forward network.

Template Parameters
OutputLayerType: The output layer type used to evaluate the network.
InitializationRuleType: Rule used to initialize the weight matrix.
CustomLayers: Any set of custom layers that could be a part of the feed forward network.
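
For orientation, here is a minimal sketch of how such a network is typically assembled. It assumes an mlpack 3.x setup; the layer types Linear<>, ReLULayer<>, and LogSoftMax<> come from the separate ann layer headers rather than from this class, and all sizes are purely illustrative.

    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>

    using namespace mlpack::ann;

    // Default template arguments: NegativeLogLikelihood<> output layer and
    // RandomInitialization for the weights.
    FFN<> model;

    // Illustrative sizes: 10-dimensional input, one hidden layer, 3 classes.
    model.Add<Linear<>>(10, 64);
    model.Add<ReLULayer<>>();
    model.Add<Linear<>>(64, 3);
    model.Add<LogSoftMax<>>();

The later sketches on this page reuse this hypothetical model object.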

Constructor & Destructor Documentation

◆ FFN()

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::FFN ( OutputLayerType  outputLayer = OutputLayerType(),
InitializationRuleType  initializeRule = InitializationRuleType() 
)

Create the FFN object.

Optionally, specify which initialization rule and performance function should be used.

If you want to pass in a parameter and discard the original parameter object, be sure to use std::move to avoid an unnecessary copy.

Parameters
outputLayer: Output layer used to evaluate the network.
initializeRule: Optional instantiated InitializationRule object for initializing the network parameter.
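
As a hedged example of the non-default constructor, the sketch below instantiates a regression-style network with a MeanSquaredError<> output layer and a GaussianInitialization rule; both are standard mlpack components, but the constructor arguments shown are only illustrative.

    FFN<MeanSquaredError<>, GaussianInitialization> regressor(
        MeanSquaredError<>(), GaussianInitialization(0.0, 0.01));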

Member Function Documentation

◆ Backward()

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
template<typename PredictorsType , typename TargetsType , typename GradientsType >
double mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Backward ( const PredictorsType &  inputs,
const TargetsType &  targets,
GradientsType &  gradients 
)

Perform the backward pass of the data in real batch mode.

Forward() and Backward() should be used as a pair, and they are designed mainly for advanced users. Users should prefer Predict() and Train() unless those two functions cannot satisfy some special requirement.

Parameters
inputs: Inputs of current pass.
targets: The training target.
gradients: Computed gradients.
Returns
Training error of the current pass.
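
A rough sketch of the Forward()/Backward() pair, reusing the hypothetical model built earlier. The batch, the targets, and their shapes are assumptions, and the targets must be in whatever format the configured output layer expects.

    arma::mat batch(10, 32, arma::fill::randu);   // 32 points, 10 dimensions each
    arma::mat targets(1, 32, arma::fill::ones);   // format required by the output layer
    arma::mat outputs, gradients;

    model.ResetParameters();                      // make sure the weights are initialized
    model.Forward(batch, outputs);                // forward pass through all layers
    const double loss = model.Backward(batch, targets, gradients);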

◆ Evaluate() [1/4]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
template<typename PredictorsType , typename ResponsesType >
double mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Evaluate ( const PredictorsType &  predictors,
const ResponsesType &  responses 
)

Evaluate the feedforward network with the given predictors and responses.

This function is usually used to monitor progress while training.

Parameters
predictors: Input variables.
responses: Target outputs for input variables.
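
For instance, a held-out loss can be computed between training epochs; validationData and validationLabels below are hypothetical matrices shaped like the training data.

    const double validationLoss = model.Evaluate(validationData, validationLabels);
    std::cout << "validation loss: " << validationLoss << std::endl;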

◆ Evaluate() [2/4]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
double mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Evaluate ( const arma::mat &  parameters)

Evaluate the feedforward network with the given parameters.

This function is usually called by the optimizer to train the model.

Parameters
parameters: Matrix of model parameters.

◆ Evaluate() [3/4]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
double mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Evaluate ( const arma::mat &  parameters,
const size_t  begin,
const size_t  batchSize,
const bool  deterministic 
)

Evaluate the feedforward network with the given parameters, but using only a number of data points.

This is useful for optimizers such as SGD, which require a separable objective function.

Parameters
parameters: Matrix of model parameters.
begin: Index of the starting point to use for objective function evaluation.
batchSize: Number of points to be passed at a time to use for objective function evaluation.
deterministic: Whether to run the network in deterministic (testing) mode rather than training mode. Note that some layers behave differently in training and testing modes.

◆ Evaluate() [4/4]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
double mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Evaluate ( const arma::mat &  parameters,
const size_t  begin,
const size_t  batchSize 
)

Evaluate the feedforward network with the given parameters, but using only a number of data points.

This is useful for optimizers such as SGD, which require a separable objective function. This just calls the overload of Evaluate() with deterministic = true.

Parameters
parameters: Matrix of model parameters.
begin: Index of the starting point to use for objective function evaluation.
batchSize: Number of points to be passed at a time to use for objective function evaluation.

◆ EvaluateWithGradient() [1/2]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
template<typename GradType >
double mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::EvaluateWithGradient ( const arma::mat &  parameters,
GradType &  gradient 
)

Evaluate the feedforward network with the given parameters.

This function is usually called by the optimizer to train the model. This just calls the overload of EvaluateWithGradient() with batchSize = 1.

Parameters
parameters: Matrix of model parameters.
gradient: Matrix to output gradient into.

◆ EvaluateWithGradient() [2/2]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
template<typename GradType >
double mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::EvaluateWithGradient ( const arma::mat &  parameters,
const size_t  begin,
GradType &  gradient,
const size_t  batchSize 
)

Evaluate the feedforward network with the given parameters, but using only a number of data points.

This is useful for optimizers such as SGD, which require a separable objective function.

Parameters
parameters: Matrix of model parameters.
begin: Index of the starting point to use for objective function evaluation.
gradient: Matrix to output gradient into.
batchSize: Number of points to be passed at a time to use for objective function evaluation.
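
This overload is normally invoked by ensmallen optimizers, but it can also be called directly. The sketch below is an assumption-heavy fragment: it presumes the network already holds training data (for example after a call to Train()) and that at least 32 data points are present.

    arma::mat gradient;
    const double objective =
        model.EvaluateWithGradient(model.Parameters(), 0, gradient, 32);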

◆ Forward() [1/2]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
template<typename PredictorsType , typename ResponsesType >
void mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Forward ( const PredictorsType &  inputs,
ResponsesType &  results 
)

Perform the forward pass of the data in real batch mode.

Forward() and Backward() should be used as a pair, and they are designed mainly for advanced users. Users should prefer Predict() and Train() unless those two functions cannot satisfy some special requirement.

Parameters
inputs: The input data.
results: The predicted results.

◆ Forward() [2/2]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
template<typename PredictorsType , typename ResponsesType >
void mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Forward ( const PredictorsType &  inputs,
ResponsesType &  results,
const size_t  begin,
const size_t  end 
)

Perform a partial forward pass of the data.

This function is meant for the cases when users require a forward pass only through certain layers and not the entire network.

Parameters
inputs: The input data for the specified first layer.
results: The predicted results from the specified last layer.
begin: The index of the first layer.
end: The index of the last layer.
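
A sketch of a partial pass, reusing the hypothetical model from above. The assumption here is that layer indices follow the order of the Add() calls, so indices 0 and 1 select the first Linear<> layer and the ReLULayer<> that follows it.

    arma::mat input(10, 1, arma::fill::randu);
    arma::mat hidden;
    model.Forward(input, hidden, 0, 1);   // forward only through layers 0 and 1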

◆ Gradient()

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
void mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Gradient ( const arma::mat &  parameters,
const size_t  begin,
arma::mat &  gradient,
const size_t  batchSize 
)

Evaluate the gradient of the feedforward network with the given parameters, and with respect to only a number of points in the dataset.

This is useful for optimizers such as SGD, which require a separable objective function.

Parameters
parameters: Matrix of the model parameters to be optimized.
begin: Index of the starting point to use for objective function gradient evaluation.
gradient: Matrix to output gradient into.
batchSize: Number of points to be processed as a batch for objective function gradient evaluation.

◆ Model()

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization, typename... CustomLayers>
std::vector<LayerTypes<CustomLayers...> >& mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Model ( )
inline

Modify the network model.

Be careful! If you change the structure of the network or parameters for layers, its state may become invalid, so be sure to call ResetParameters() afterwards.
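
A small sketch of the intended pattern; the only calls used are Model(), size(), and ResetParameters():

    // Inspect the layers currently held by the network.
    const size_t numLayers = model.Model().size();

    // ... after any manual change made through Model() ...
    model.ResetParameters();   // rebuild the flat parameter matrix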

◆ Predict()

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
void mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Predict ( arma::mat  predictors,
arma::mat &  results 
)

Predict the responses to a given set of predictors.

The responses will reflect the output of the given output layer as returned by the output layer function.

If you want to pass in a parameter and discard the original parameter object, be sure to use std::move to avoid an unnecessary copy.

Parameters
predictors: Input predictors.
results: Matrix to put output predictions of responses into.
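
A sketch of typical usage with the hypothetical classifier built earlier; the 1-indexed arg-max step at the end assumes a NegativeLogLikelihood<>/LogSoftMax<> setup and is not part of Predict() itself.

    arma::mat testData(10, 100, arma::fill::randu);   // 100 hypothetical test points
    arma::mat predictions;
    model.Predict(testData, predictions);

    // One column of predictions per test point; pick the most likely class.
    arma::urowvec predictedLabels = arma::index_max(predictions, 0) + 1;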

◆ Shuffle()

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
void mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Shuffle ( )

Shuffle the order of function visitation.

This may be called by the optimizer.

◆ Train() [1/2]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
template<typename OptimizerType , typename... CallbackTypes>
double mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Train ( arma::mat  predictors,
arma::mat  responses,
OptimizerType &  optimizer,
CallbackTypes &&...  callbacks 
)

Train the feedforward network on the given input data using the given optimizer.

This will use the existing model parameters as a starting point for the optimization. If this is not what you want, then you should access the parameters vector directly with Parameters() and modify it as desired.

If you want to pass in a parameter and discard the original parameter object, be sure to use std::move to avoid an unnecessary copy.

Template Parameters
OptimizerType: Type of optimizer to use to train the model.
CallbackTypes: Types of callback functions.
Parameters
predictors: Input training variables.
responses: Target outputs for the input training variables.
optimizer: Instantiated optimizer used to train the model.
callbacks: Callback functions for the ensmallen optimizer OptimizerType. See https://www.ensmallen.org/docs.html#callback-documentation.
Returns
The final objective of the trained model (NaN or Inf on error).
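
A sketch using an explicitly constructed ensmallen optimizer and two stock callbacks; trainData and trainLabels are hypothetical matrices, and the ens::Adam arguments (step size, batch size, beta1, beta2, epsilon, maximum iterations, tolerance, shuffle) are illustrative values only.

    ens::Adam optimizer(0.001, 32, 0.9, 0.999, 1e-8, 100000, 1e-8, true);
    const double objective = model.Train(std::move(trainData), std::move(trainLabels),
        optimizer, ens::PrintLoss(), ens::ProgressBar());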

◆ Train() [2/2]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
template<typename OptimizerType , typename... CallbackTypes>
double mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::Train ( arma::mat  predictors,
arma::mat  responses,
CallbackTypes &&...  callbacks 
)

Train the feedforward network on the given input data.

By default, the RMSProp optimization algorithm is used, but others can be specified (such as ens::SGD).

This will use the existing model parameters as a starting point for the optimization. If this is not what you want, then you should access the parameters vector directly with Parameters() and modify it as desired.

If you want to pass in a parameter and discard the original parameter object, be sure to use std::move to avoid an unnecessary copy.

Template Parameters
OptimizerType: Type of optimizer to use to train the model.
CallbackTypes: Types of callback functions.
Parameters
predictors: Input training variables.
responses: Target outputs for the input training variables.
callbacks: Callback functions for the ensmallen optimizer OptimizerType. See https://www.ensmallen.org/docs.html#callback-documentation.
Returns
The final objective of the trained model (NaN or Inf on error).
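
A sketch of the simplest form, which falls back to the default ens::RMSProp optimizer; the callback is optional and trainData/trainLabels are the same hypothetical matrices as above.

    const double objective = model.Train(trainData, trainLabels, ens::ProgressBar());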

◆ WarnMessageMaxIterations() [1/2]

template<typename OutputLayerType , typename InitializationRuleType , typename... CustomLayers>
template<typename OptimizerType >
std::enable_if< HasMaxIterations< OptimizerType, size_t &(OptimizerType::*)()>::value, void >::type mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::WarnMessageMaxIterations ( OptimizerType &  optimizer,
size_t  samples 
) const

Check whether the optimizer has a MaxIterations() parameter; if it does, check whether its value is less than the number of data points in the dataset.

Template Parameters
OptimizerType: Type of optimizer to use to train the model.
Parameters
optimizer: Optimizer used in the training process.
samples: Number of data points in the dataset.

◆ WarnMessageMaxIterations() [2/2]

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization, typename... CustomLayers>
template<typename OptimizerType >
std::enable_if< !HasMaxIterations<OptimizerType, size_t&(OptimizerType::*)()>::value, void>::type mlpack::ann::FFN< OutputLayerType, InitializationRuleType, CustomLayers >::WarnMessageMaxIterations ( OptimizerType &  optimizer,
size_t  samples 
) const

Check whether the optimizer has a MaxIterations() parameter; if it doesn't, simply return from the function.

Template Parameters
OptimizerType: Type of optimizer to use to train the model.
Parameters
optimizer: Optimizer used in the training process.
samples: Number of data points in the dataset.
