Deep Learning Algorithm Implementations 1.0.0
C++ implementations of fundamental deep learning algorithms
dl::activation::ReLU Class Reference

Rectified Linear Unit (ReLU) activation function. More...

#include <functions.hpp>

Inherits dl::activation::ActivationFunction.

Public Member Functions

double forward (double x)
 Compute ReLU forward pass.
 
double backward (double x)
 Compute ReLU derivative.
 
Public Member Functions inherited from dl::activation::ActivationFunction
virtual ~ActivationFunction ()=default
 Virtual destructor for proper cleanup.
 

Detailed Description

Rectified Linear Unit (ReLU) activation function.

ReLU is one of the most commonly used activation functions in deep learning. It outputs the input directly if positive, otherwise outputs zero.

Mathematical definition:

  • Forward: f(x) = max(0, x)
  • Derivative: f'(x) = 1 if x > 0, else 0 (f is not differentiable at x = 0; the subgradient 0 is used there by convention)
Note
ReLU helps mitigate the vanishing gradient problem: its derivative is exactly 1 for positive inputs, so gradients passing through active units are not scaled down.

Definition at line 52 of file functions.hpp.
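
The documented interface implies an implementation along the following lines. This is a minimal sketch consistent with the signatures above, not the actual contents of functions.hpp and functions.cpp:

  #include <algorithm>  // std::max

  namespace dl::activation {

  // Base class as documented above: only a virtual destructor.
  class ActivationFunction {
  public:
      virtual ~ActivationFunction() = default;
  };

  // ReLU: f(x) = max(0, x); derivative is 1 for x > 0, else 0.
  class ReLU : public ActivationFunction {
  public:
      double forward(double x) { return std::max(0.0, x); }
      double backward(double x) { return x > 0.0 ? 1.0 : 0.0; }
  };

  } // namespace dl::activation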

Member Function Documentation

◆ backward()

double dl::activation::ReLU::backward(double x)

Compute ReLU derivative.

Parameters
  x  Input value
Returns
1 if x > 0, else 0

Definition at line 13 of file functions.cpp.
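
In a typical backpropagation step, this local derivative is multiplied by the gradient arriving from the next layer (chain rule). A brief sketch, using hypothetical values:

  #include <functions.hpp>

  int main() {
      dl::activation::ReLU relu;
      double x = -1.5;                              // pre-activation input
      double upstream = 0.8;                        // gradient from the next layer
      double grad_x = relu.backward(x) * upstream;  // chain rule: f'(x) * upstream = 0 * 0.8 = 0
      return 0;
  }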

◆ forward()

double dl::activation::ReLU::forward(double x)

Compute ReLU forward pass.

Parameters
  x  Input value
Returns
max(0, x)

Definition at line 7 of file functions.cpp.
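
A brief usage example, assuming functions.hpp is available on the include path:

  #include <functions.hpp>

  int main() {
      dl::activation::ReLU relu;
      double a = relu.forward(3.2);   // returns 3.2: positive input passes through
      double b = relu.forward(-1.5);  // returns 0.0: negative input is clamped
      return 0;
  }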


The documentation for this class was generated from the following files:
  • functions.hpp
  • functions.cpp