Deep Learning Algorithm Implementations 1.0.0
C++ implementations of fundamental deep learning algorithms
dl::layers::Module< T > Class Template Reference

Base class for all neural network modules (a PyTorch-like nn.Module).
#include <layers.hpp>
Public Member Functions

virtual ~Module () = default
virtual Variable< T > forward (const Variable< T > &input) = 0
    Forward pass through the module.
virtual std::vector< Variable< T > * > parameters () = 0
    Get all parameters of this module.
virtual void zero_grad ()
    Zero gradients of all parameters.
virtual void train (bool training = true)
    Set training mode.
virtual void eval ()
    Set evaluation mode.
bool is_training () const
    Check if module is in training mode.
Protected Attributes

bool training_ = true
Detailed Description

Base class for all neural network modules (a PyTorch-like nn.Module).

Definition at line 28 of file layers.hpp.
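Concrete layers derive from this class and override its two pure virtual members, forward() and parameters(); the training/evaluation mode handling (train(), eval(), is_training(), training_) is inherited. Below is a minimal sketch of a composite module, assuming Module is the class template dl::layers::Module< T >, Variable lives in namespace dl, and Linear< T > is constructed from (in_features, out_features); none of these details are stated on this page.

    #include <layers.hpp>

    #include <cstddef>
    #include <vector>

    // Hypothetical composite module built only from the documented interface.
    template <typename T>
    class MLPBlock : public dl::layers::Module<T> {
    public:
        MLPBlock(std::size_t in, std::size_t out) : linear_(in, out) {}

        // forward() is pure virtual in Module<T>, so an override is required.
        dl::Variable<T> forward(const dl::Variable<T>& input) override {
            return relu_.forward(linear_.forward(input));
        }

        // parameters() is also pure virtual: expose the trainable parameters
        // of the sub-modules so optimizers and zero_grad() can reach them.
        std::vector<dl::Variable<T>*> parameters() override {
            std::vector<dl::Variable<T>*> params = linear_.parameters();
            std::vector<dl::Variable<T>*> more = relu_.parameters();  // typically empty
            params.insert(params.end(), more.begin(), more.end());
            return params;
        }

    private:
        dl::layers::Linear<T> linear_;   // assumed (in_features, out_features) constructor
        dl::layers::ReLU<T> relu_;
    };

A real composite module would likely also forward train() and zero_grad() to its sub-modules, as dl::layers::Sequential< T > does (it reimplements both).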
Constructor & Destructor Documentation

~Module()

virtual ~Module () = default
Member Function Documentation

eval()

virtual void eval ()   [inline, virtual]

Set evaluation mode.

Definition at line 65 of file layers.hpp.
forward()

virtual Variable< T > forward (const Variable< T > &input)   [pure virtual]

Forward pass through the module.

Parameters:
    input    Input variable

Implemented in dl::layers::Linear< T >, dl::layers::ReLU< T >, dl::layers::Sigmoid< T >, dl::layers::Tanh< T >, dl::layers::Dropout< T >, and dl::layers::Sequential< T >.
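Because forward() is virtual, a module can be driven through a base-class reference and the concrete override is selected at run time. A small sketch, assuming the dl::layers::Module< T > and dl::Variable< T > spellings used above:

    #include <layers.hpp>

    // Calls whichever forward() override the concrete module provides.
    template <typename T>
    dl::Variable<T> run(dl::layers::Module<T>& module, const dl::Variable<T>& x) {
        return module.forward(x);   // virtual dispatch, e.g. to ReLU<T>::forward
    }

For example, run(relu, x) with a dl::layers::ReLU< float > would dispatch to ReLU's implementation of forward().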
is_training()

bool is_training () const   [inline]

Check if module is in training mode.

Definition at line 72 of file layers.hpp.
parameters()

virtual std::vector< Variable< T > * > parameters ()   [pure virtual]

Get all parameters of this module.

Implemented in dl::layers::Linear< T >, dl::layers::ReLU< T >, dl::layers::Sigmoid< T >, dl::layers::Tanh< T >, dl::layers::Dropout< T >, and dl::layers::Sequential< T >.
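A typical use of parameters() is to gather every parameter pointer from a set of modules, for example to hand them to an optimizer. A sketch using only the documented call (the dl::layers::Module< T > and dl::Variable< T > spellings are assumptions):

    #include <layers.hpp>

    #include <vector>

    // Concatenate the parameter lists of several modules into one vector.
    template <typename T>
    std::vector<dl::Variable<T>*>
    collect_parameters(const std::vector<dl::layers::Module<T>*>& modules) {
        std::vector<dl::Variable<T>*> all;
        for (dl::layers::Module<T>* m : modules) {
            std::vector<dl::Variable<T>*> p = m->parameters();
            all.insert(all.end(), p.begin(), p.end());
        }
        return all;
    }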
train()

virtual void train (bool training = true)   [inline, virtual]

Set training mode.

Parameters:
    training    Whether in training mode

Reimplemented in dl::layers::Sequential< T >.

Definition at line 58 of file layers.hpp.
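train() and eval() matter mostly for layers whose behaviour depends on the mode, such as Dropout. A sketch of toggling the mode around validation, assuming Dropout< T > takes a drop probability in its constructor (not stated on this page):

    #include <layers.hpp>

    void toggle_modes() {
        dl::layers::Dropout<float> drop(0.5f);   // assumed constructor argument

        drop.train();            // training mode (training_ = true, the default)
        // ... training steps: dropout is active ...

        drop.eval();             // same effect as drop.train(false)
        if (!drop.is_training()) {
            // ... validation steps: dropout is disabled ...
        }
    }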
zero_grad()

virtual void zero_grad ()   [inline, virtual]

Zero gradients of all parameters.

Reimplemented in dl::layers::Sequential< T >.

Definition at line 48 of file layers.hpp.
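zero_grad() is typically called at the start of each training step so that gradients from the previous step do not accumulate into the new ones. A sketch of where it fits (the backward and optimizer calls are placeholders, not part of this API page):

    #include <layers.hpp>

    template <typename T>
    void training_step(dl::layers::Module<T>& model, const dl::Variable<T>& batch) {
        model.zero_grad();                        // clear gradients of all parameters
        dl::Variable<T> output = model.forward(batch);
        (void)output;                             // placeholder: feed into a loss
        // loss.backward();                       // hypothetical: compute gradients
        // optimizer.step(model.parameters());    // hypothetical: apply updates
    }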
Member Data Documentation

training_

bool training_ = true   [protected]

Whether the module is in training mode; set by train() and eval(), read by is_training().

Definition at line 75 of file layers.hpp.