Deep Learning Algorithm Implementations 1.0.0
C++ implementations of fundamental deep learning algorithms
dl::optimization Namespace Reference

Classes

class  Adam
 Adam optimizer with autograd support. More...
 
class  AdamW
 AdamW optimizer with autograd support. More...
 
class  AutogradOptimizer
 Base class for autograd-compatible optimizers. More...
 
class  LRScheduler
 Learning rate scheduler base class. More...
 
class  RMSprop
 RMSprop optimizer with autograd support. More...
 
class  SGD
 Stochastic Gradient Descent optimizer with autograd support. More...
 
class  StepLR
 Step learning rate scheduler. Decays the learning rate by gamma every step_size epochs (see the usage sketch after this list). More...
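
Taken together, these classes cover a typical training setup: an optimizer derived from AutogradOptimizer updates the parameters, and an LRScheduler such as StepLR adjusts its learning rate between epochs. The sketch below illustrates that flow; the parameter container type, constructor arguments, and the zero_grad()/step() method names are assumptions made for illustration, not the documented signatures.

#include "optimizers.hpp"
#include <vector>

using namespace dl::optimization;

// Minimal training-loop sketch. The parameter container, constructor
// arguments, and the zero_grad()/step() calls are assumptions for
// illustration; see the individual class pages for the real signatures.
void train_sketch(std::vector<float*>& params) {
    SGD<float> opt(params, /*lr=*/0.01f, /*momentum=*/0.9f);     // assumed ctor
    StepLR<float> sched(opt, /*step_size=*/10, /*gamma=*/0.1f);  // assumed ctor

    for (int epoch = 0; epoch < 100; ++epoch) {
        opt.zero_grad();   // clear accumulated gradients (assumed)
        // ... forward pass, loss computation, backward pass ...
        opt.step();        // apply the SGD update (assumed)
        sched.step();      // multiply lr by gamma once every step_size epochs (assumed)
    }
}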
 

Typedefs

using SGDD = SGD< double >
 
using SGDF = SGD< float >
 
using AdamD = Adam< double >
 
using AdamF = Adam< float >
 
using AdamWD = AdamW< double >
 
using AdamWF = AdamW< float >
 
using RMSpropD = RMSprop< double >
 
using RMSpropF = RMSprop< float >
 
using StepLRD = StepLR< double >
 
using StepLRF = StepLR< float >
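
These aliases are shorthand for the float and double instantiations of the corresponding optimizer templates, so either spelling can be used interchangeably. A minimal check, assuming optimizers.hpp is included directly:

#include <type_traits>
#include "optimizers.hpp"

// Each alias is a plain template instantiation of the corresponding optimizer.
static_assert(std::is_same_v<dl::optimization::SGDF,    dl::optimization::SGD<float>>);
static_assert(std::is_same_v<dl::optimization::AdamD,   dl::optimization::Adam<double>>);
static_assert(std::is_same_v<dl::optimization::StepLRF, dl::optimization::StepLR<float>>);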
 

Typedef Documentation

◆ AdamD

using dl::optimization::AdamD = Adam<double>

Definition at line 337 of file optimizers.hpp.

◆ AdamF

using dl::optimization::AdamF = Adam<float>

Definition at line 338 of file optimizers.hpp.

◆ AdamWD

using dl::optimization::AdamWD = AdamW<double>

Definition at line 339 of file optimizers.hpp.

◆ AdamWF

using dl::optimization::AdamWF = AdamW<float>

Definition at line 340 of file optimizers.hpp.

◆ RMSpropD

using dl::optimization::RMSpropD = RMSprop<double>

Definition at line 341 of file optimizers.hpp.

◆ RMSpropF

using dl::optimization::RMSpropF = RMSprop<float>

Definition at line 342 of file optimizers.hpp.

◆ SGDD

using dl::optimization::SGDD = SGD<double>

Definition at line 335 of file optimizers.hpp.

◆ SGDF

using dl::optimization::SGDF = SGD<float>

Definition at line 336 of file optimizers.hpp.

◆ StepLRD

using dl::optimization::StepLRD = StepLR<double>

Definition at line 343 of file optimizers.hpp.

◆ StepLRF

using dl::optimization::StepLRF = StepLR<float>

Definition at line 344 of file optimizers.hpp.