Deep Learning Algorithm Implementations 1.0.0
C++ implementations of fundamental deep learning algorithms
Rectified Linear Unit (ReLU) activation function.
#include <functions.hpp>
Public Member Functions

  double forward(double x)
      Compute ReLU forward pass.

  double backward(double x)
      Compute ReLU derivative.

Inherited from ActivationFunction

  virtual ~ActivationFunction() = default
      Virtual destructor for proper cleanup.
Rectified Linear Unit (ReLU) activation function.
ReLU is one of the most commonly used activation functions in deep learning. It outputs the input directly if positive, otherwise outputs zero.
Mathematical definition: f(x) = max(0, x)
Definition at line 52 of file functions.hpp.
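The documented interface could be implemented along these lines. This is a minimal sketch: the base-class method signatures are assumed from the member list above, and only the destructor of ActivationFunction is confirmed by the documentation.

```cpp
#include <algorithm>

namespace dl::activation {

// Assumed abstract base class; only the virtual destructor is
// documented, the pure virtual methods are an inference from ReLU's
// public interface.
class ActivationFunction {
public:
    virtual ~ActivationFunction() = default;
    virtual double forward(double x) = 0;
    virtual double backward(double x) = 0;
};

class ReLU : public ActivationFunction {
public:
    // Forward pass: f(x) = max(0, x)
    double forward(double x) override { return std::max(0.0, x); }

    // Derivative: f'(x) = 1 for x > 0, otherwise 0
    // (the subgradient at x == 0 is conventionally taken as 0)
    double backward(double x) override { return x > 0.0 ? 1.0 : 0.0; }
};

} // namespace dl::activation
```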
double dl::activation::ReLU::backward(double x)
Compute ReLU derivative.
Parameters:
  x  Input value
Definition at line 13 of file functions.cpp.
double dl::activation::ReLU::forward(double x)
Compute ReLU forward pass.
Parameters:
  x  Input value
Definition at line 7 of file functions.cpp.
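A common way to sanity-check a forward/backward pair like the one documented here is a finite-difference gradient check. The helper names below (relu_forward, relu_backward, relu_gradient_check) are hypothetical and not part of the documented API; they only illustrate that the derivative rule f'(x) = 1 for x > 0, else 0, matches the slope of the forward pass.

```cpp
#include <cmath>
#include <initializer_list>

// Standalone re-statements of the documented ReLU behavior
// (hypothetical free functions, not the library's class methods).
double relu_forward(double x)  { return x > 0.0 ? x : 0.0; }
double relu_backward(double x) { return x > 0.0 ? 1.0 : 0.0; }

// Returns true when the analytic derivative agrees with a central
// finite difference at every probe point. x == 0 is skipped because
// ReLU is not differentiable there.
bool relu_gradient_check() {
    const double h = 1e-6;
    for (double x : {-2.0, -0.5, 0.5, 3.0}) {
        double numeric =
            (relu_forward(x + h) - relu_forward(x - h)) / (2.0 * h);
        if (std::abs(numeric - relu_backward(x)) > 1e-6) return false;
    }
    return true;
}
```

The probe points are chosen on both sides of the kink at zero, where the two constant-slope regions (0 and 1) must each be reproduced by the analytic derivative.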