Deep Learning Algorithm Implementations 1.0.0
C++ implementations of fundamental deep learning algorithms
dl::activation::Softmax Class Reference

Softmax activation function.

#include <functions.hpp>

Inheritance diagram for dl::activation::Softmax:
Collaboration diagram for dl::activation::Softmax:

Public Member Functions

MatrixD forward (const MatrixD &x)
 Compute softmax forward pass.
 
MatrixD backward (const MatrixD &x)
 Compute softmax Jacobian matrix.
 
- Public Member Functions inherited from dl::activation::ActivationFunction
virtual ~ActivationFunction ()=default
 Virtual destructor for proper cleanup.
 

Detailed Description

Softmax activation function.

Softmax is commonly used in the output layer of multi-class classification networks. It converts a vector of real numbers into a probability distribution.

Mathematical definition:

  • Forward: f(x_i) = exp(x_i) / Σ(exp(x_j)) for j=1 to n
  • Backward: Jacobian matrix with elements ∂f_i/∂x_j
Note
Output values sum to 1, making them interpretable as probabilities
Warning
Numerically unstable for large input values without proper scaling
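The instability noted above is conventionally handled by subtracting the row maximum before exponentiating, which leaves the result unchanged since f(x_i + c) = f(x_i) for any constant c. The sketch below illustrates this trick; it uses std::vector<double> for a single input row rather than the library's MatrixD type, so it is an assumption-laden illustration, not the actual forward() implementation:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Numerically stable softmax over one row (illustrative sketch;
// the library's forward() operates on MatrixD instead).
std::vector<double> softmax_row(const std::vector<double>& x) {
    // Subtract the maximum so the largest exponent is exp(0) = 1,
    // preventing overflow for large inputs.
    const double m = *std::max_element(x.begin(), x.end());
    std::vector<double> out(x.size());
    double sum = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        out[i] = std::exp(x[i] - m);
        sum += out[i];
    }
    // Normalize so the outputs sum to 1.
    for (double& v : out) v /= sum;
    return out;
}
```

With this shift, inputs such as {1000.0, 1000.0} produce {0.5, 0.5} instead of overflowing to inf/inf.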

Definition at line 140 of file functions.hpp.

Member Function Documentation

◆ backward()

MatrixD dl::activation::Softmax::backward (const MatrixD &x)

Compute softmax Jacobian matrix.

Parameters
x	Input matrix/vector
Returns
Jacobian matrix for backpropagation

Definition at line 55 of file functions.cpp.

◆ forward()

MatrixD dl::activation::Softmax::forward (const MatrixD &x)

Compute softmax forward pass.

Parameters
x	Input matrix/vector
Returns
Probability distribution matrix

Definition at line 48 of file functions.cpp.


The documentation for this class was generated from the following files:
functions.hpp
functions.cpp