Deep Learning Algorithm Implementations 1.0.0
C++ implementations of fundamental deep learning algorithms
dl::activation::LeakyReLU Class Reference

Leaky Rectified Linear Unit (Leaky ReLU) activation function.
#include <functions.hpp>
Public Member Functions

 LeakyReLU (double alpha=0.01)
    Constructor with configurable leak parameter.
 double forward (double x)
    Compute Leaky ReLU forward pass.
 double backward (double x)
    Compute Leaky ReLU derivative.

Public Member Functions inherited from ActivationFunction

 virtual ~ActivationFunction ()=default
    Virtual destructor for proper cleanup.
Detailed Description

Leaky Rectified Linear Unit (Leaky ReLU) activation function.
Leaky ReLU addresses the "dying ReLU" problem by allowing a small gradient when the input is negative, preventing neurons from becoming completely inactive.
Mathematical definition:

    f(x) = x          if x > 0
    f(x) = alpha * x  otherwise
Definition at line 170 of file functions.hpp.
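A minimal usage sketch, assuming functions.hpp is on the include path and the class behaves as documented above (expected outputs shown in comments):

    #include <functions.hpp>
    #include <iostream>

    int main() {
        dl::activation::LeakyReLU act(0.01);  // leak coefficient alpha = 0.01

        // Forward pass: positive inputs pass through unchanged,
        // negative inputs are scaled by alpha.
        std::cout << act.forward(2.0) << "\n";   // 2.0
        std::cout << act.forward(-2.0) << "\n";  // -0.02

        // Backward pass: derivative is 1 for positive inputs, alpha otherwise.
        std::cout << act.backward(2.0) << "\n";  // 1
        std::cout << act.backward(-2.0) << "\n"; // 0.01
        return 0;
    }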
dl::activation::LeakyReLU::LeakyReLU (double alpha = 0.01)
Constructor with configurable leak parameter.
Parameters
    alpha    Leak coefficient for negative inputs (default: 0.01)
Definition at line 63 of file functions.cpp.
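For illustration, the leak coefficient can be tuned at construction; the value 0.2 below is an example choice, not a library default:

    // Default leak versus a larger custom leak.
    dl::activation::LeakyReLU standard;        // alpha = 0.01 (default)
    dl::activation::LeakyReLU aggressive(0.2); // alpha = 0.2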
double dl::activation::LeakyReLU::backward (double x)
Compute Leaky ReLU derivative.
Parameters
    x    Input value
Definition at line 73 of file functions.cpp.
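The definition itself lives in functions.cpp and is not reproduced on this page. A sketch consistent with the documented behavior, assuming the leak coefficient is stored in a member named alpha, would be:

    // Sketch only; the actual definition is at line 73 of functions.cpp.
    // Derivative of Leaky ReLU: 1 for positive inputs, alpha otherwise.
    double dl::activation::LeakyReLU::backward(double x) {
        return x > 0.0 ? 1.0 : alpha;
    }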
double dl::activation::LeakyReLU::forward (double x)
Compute Leaky ReLU forward pass.
Parameters
    x    Input value
Definition at line 67 of file functions.cpp.
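As with backward, a sketch of the forward pass consistent with the mathematical definition above, assuming a member named alpha, would be:

    // Sketch only; the actual definition is at line 67 of functions.cpp.
    // Identity for positive inputs, scaled by alpha otherwise.
    double dl::activation::LeakyReLU::forward(double x) {
        return x > 0.0 ? x : alpha * x;
    }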