GradientDescent

open class GradientDescent(val learningRate: LearningRateSchedule, val entropy: Random = Random.Default) : SinglePassOptimizer

An optimizer that caches intermediate calculations during the forward pass and uses them to compute gradients during the backward pass.

All cases are grouped into a single batch.
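As a rough illustration of the underlying idea (a hypothetical sketch, not this library's implementation), a full-batch gradient-descent step subtracts the learning-rate-scaled gradient from each parameter. Here `DoubleArray` stands in for the library's `Tensor` type:

```kotlin
// Hypothetical sketch of one full-batch gradient-descent step.
// `weights` and `gradients` stand in for the library's Tensor type.
fun gradientStep(weights: DoubleArray, gradients: DoubleArray, learningRate: Double): DoubleArray =
    DoubleArray(weights.size) { i -> weights[i] - learningRate * gradients[i] }
```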

Inheritors

Constructors

constructor(learningRate: LearningRateSchedule, entropy: Random = Random.Default)

Properties

val learningRate: LearningRateSchedule

val entropy: Random

Functions

open override fun batch(cases: List<Exercise>): List<List<Exercise>>

Creates training batches out of the given cases.
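Since all cases are grouped into a single batch, the behavior described above can be sketched as follows (a hypothetical stand-in, not the library's source; the generic `Case` parameter stands in for `Exercise`):

```kotlin
// Sketch: full-batch grouping wraps the entire case list in one batch.
// `Case` is a stand-in for the library's Exercise type.
fun <Case> singleBatch(cases: List<Case>): List<List<Case>> = listOf(cases)
```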

open override fun update(epoch: Int, layer: Layer, weightGradients: Tensor, biasGradients: Tensor)

Updates the given layer's parameters using the weight and bias gradients computed during the backward pass, at a learning rate determined by the schedule for the given epoch.
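The signature suggests the learning rate is looked up per epoch and then applied to both gradient tensors. A minimal sketch of that flow, assuming a schedule interface with a per-epoch rate and using `DoubleArray` as a stand-in for `Tensor` (all names here are hypothetical, not the library's API):

```kotlin
// Hypothetical schedule: maps an epoch index to a learning rate.
fun interface Schedule { fun rateAt(epoch: Int): Double }

// Stand-in for a layer's trainable parameters (DoubleArray in place of Tensor).
class Params(val weights: DoubleArray, val biases: DoubleArray)

// Sketch of an update step: query the schedule, then subtract scaled gradients in place.
fun update(epoch: Int, params: Params, weightGradients: DoubleArray,
           biasGradients: DoubleArray, schedule: Schedule) {
    val lr = schedule.rateAt(epoch)
    for (i in params.weights.indices) params.weights[i] -= lr * weightGradients[i]
    for (i in params.biases.indices) params.biases[i] -= lr * biasGradients[i]
}
```

A decaying schedule would simply return a smaller rate as `epoch` grows; the update logic itself is unchanged.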