StochasticGradientDescent

class StochasticGradientDescent(val batchSize: Int, val learningRate: LearningRateSchedule, val entropy: Random = Random.Default, val discardExtras: Boolean = false) : GradientDescent

Stochastic Gradient Descent (SGD) optimizer with adjustable learning rate.

SGD operates on a subset of the training data (a single sample or a mini-batch) randomly selected from the entire dataset.
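The mini-batch idea above can be sketched on a toy problem. This is an illustrative example, not the library's implementation: it fits a 1-D least-squares slope, estimating the gradient from a random mini-batch at each step (the function name `fitSlope` and all constants are hypothetical).

```kotlin
import kotlin.random.Random
import kotlin.math.abs

// Hypothetical illustration of SGD on a 1-D least-squares problem:
// each step estimates the gradient from a random mini-batch instead
// of the full dataset.
fun fitSlope(batchSize: Int, steps: Int, lr: Double, entropy: Random): Double {
    val xs = listOf(1.0, 2.0, 3.0, 4.0, 5.0, 6.0)
    val ys = xs.map { 2.0 * it }                 // true slope is 2
    var w = 0.0                                  // model: y ≈ w * x
    repeat(steps) {
        val batch = xs.indices.shuffled(entropy).take(batchSize)
        // gradient of 0.5 * (w*x - y)^2 w.r.t. w, averaged over the batch
        val grad = batch.map { (w * xs[it] - ys[it]) * xs[it] }.average()
        w -= lr * grad
    }
    return w
}

fun main() {
    // Converges toward the true slope of 2 despite noisy batch gradients.
    println(fitSlope(batchSize = 2, steps = 100, lr = 0.05, entropy = Random(0)))
}
```

Because each batch sees only part of the data, individual steps are noisy, but the averaged updates still converge toward the full-dataset minimum.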

Parameters

batchSize

The number of cases in each training batch.

learningRate

The learning rate schedule for the optimizer.

entropy

The source of randomness used when selecting cases for each batch.

discardExtras

Whether to discard leftover cases when the dataset does not divide evenly into batches.

Constructors

constructor(batchSize: Int, learningRate: LearningRateSchedule, entropy: Random = Random.Default, discardExtras: Boolean = false)

Properties

val batchSize: Int

val discardExtras: Boolean = false

val entropy: Random

val learningRate: LearningRateSchedule

Functions

open override fun batch(cases: List<Exercise>): List<List<Exercise>>

Creates training batches out of the given cases.
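One plausible reading of how `batch` could behave, given the constructor parameters, is sketched below. The semantics of `discardExtras` (dropping a trailing partial batch) and the use of `entropy` for shuffling are assumptions, not confirmed behavior of the library; `makeBatches` is a hypothetical stand-in name.

```kotlin
import kotlin.random.Random

// Sketch of batch creation: shuffle with the provided randomness source,
// split into chunks of batchSize, and (assumed semantics) drop a trailing
// partial batch when discardExtras is true.
fun <T> makeBatches(
    cases: List<T>,
    batchSize: Int,
    entropy: Random,
    discardExtras: Boolean
): List<List<T>> {
    val batches = cases.shuffled(entropy).chunked(batchSize)
    return if (discardExtras && batches.isNotEmpty() && batches.last().size < batchSize)
        batches.dropLast(1)
    else
        batches
}

fun main() {
    val cases = (1..10).toList()
    // 10 cases in batches of 3 leave one extra case.
    println(makeBatches(cases, 3, Random(1), discardExtras = false).map { it.size }) // [3, 3, 3, 1]
    println(makeBatches(cases, 3, Random(1), discardExtras = true).map { it.size })  // [3, 3, 3]
}
```

Discarding the partial batch keeps every update based on the same number of cases, at the cost of skipping a few samples each epoch.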

open override fun update(epoch: Int, layer: Layer, weightGradients: Tensor, biasGradients: Tensor)

Updates the parameters of the given layer using the weight and bias gradients computed during the backward pass.
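A minimal sketch of the vanilla SGD update rule this function applies. The `Schedule` interface and plain `DoubleArray` tensors are simplifications standing in for the library's `LearningRateSchedule` and `Tensor` types, and `sgdUpdate` is a hypothetical name:

```kotlin
// Simplified stand-in for LearningRateSchedule: maps an epoch to a step size.
fun interface Schedule {
    fun rate(epoch: Int): Double
}

// Vanilla SGD step: parameter -= learningRate(epoch) * gradient.
fun sgdUpdate(params: DoubleArray, grads: DoubleArray, schedule: Schedule, epoch: Int) {
    val lr = schedule.rate(epoch)
    for (i in params.indices) params[i] -= lr * grads[i]
}

fun main() {
    val weights = doubleArrayOf(1.0, -2.0)
    val grads = doubleArrayOf(0.5, -0.5)
    // A decaying schedule: 0.1 at epoch 0, 0.05 at epoch 1, and so on.
    sgdUpdate(weights, grads, { epoch -> 0.1 / (1 + epoch) }, epoch = 0)
    println(weights.toList())  // [0.95, -1.95]
}
```

Passing the epoch to the schedule is what lets the step size decay over training, a common way to trade early progress for late-stage stability.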