Module engram::optimizer

Optimization algorithms.

These optimizers minimize a neural network's loss function during training by adjusting the network's weights based on the gradients of the loss with respect to those weights.
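
For intuition, here is a minimal, crate-agnostic sketch of that update rule (w ← w − lr·∂L/∂w). The function and variable names are illustrative only, not items exported by this module:

```rust
// Plain gradient-descent step: move each weight a small step against its
// gradient. Illustrative only; not the API of engram::optimizer.
fn gradient_step(weights: &mut [f64], gradients: &[f64], learning_rate: f64) {
    for (w, g) in weights.iter_mut().zip(gradients) {
        *w -= learning_rate * g;
    }
}

fn main() {
    let mut weights = vec![0.5, -0.3, 0.8];
    let gradients = vec![0.1, -0.2, 0.05]; // dL/dw, e.g. from backpropagation
    gradient_step(&mut weights, &gradients, 0.01);
    println!("{weights:?}"); // approximately [0.499, -0.298, 0.7995]
}
```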

Structs

  • Adaptive Gradient (Adagrad); see the sketch after this list for how its update differs from SGD's.
  • Stochastic Gradient Descent (SGD).
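
As a rough comparison of the two (a textbook sketch of the standard Adagrad rule, not this crate's types; all identifiers are hypothetical): Adagrad keeps a per-weight running sum of squared gradients and shrinks the effective learning rate for frequently updated weights, whereas SGD applies the same fixed step to every weight.

```rust
// Textbook Adagrad update, for comparison with the plain SGD step above.
// `accum` is the running sum of squared gradients per weight; all names
// here are illustrative, not items from this module.
fn adagrad_step(
    weights: &mut [f64],
    accum: &mut [f64],
    gradients: &[f64],
    learning_rate: f64,
    eps: f64,
) {
    for ((w, a), g) in weights.iter_mut().zip(accum.iter_mut()).zip(gradients) {
        *a += g * g;                                // accumulate squared gradient
        *w -= learning_rate * g / (a.sqrt() + eps); // per-weight adaptive step
    }
}
```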

Traits