object FirstOrderMinimizer extends Serializable
Linear Supertypes: Serializable, Serializable, AnyRef, Any
Type Members
- trait ConvergenceCheck[T] extends AnyRef
- trait ConvergenceReason extends AnyRef
- case class FunctionValuesConverged[T](tolerance: Double, relative: Boolean, historyLength: Int) extends ConvergenceCheck[T] with Product with Serializable
- case class MonitorFunctionValuesCheck[T](f: (T) ⇒ Double, numFailures: Int, improvementRequirement: Double, evalFrequency: Int) extends ConvergenceCheck[T] with SerializableLogging with Product with Serializable
- case class OptParams(batchSize: Int = 512, regularization: Double = 0.0, alpha: Double = 0.5, maxIterations: Int = 1000, useL1: Boolean = false, tolerance: Double = 1E-5, useStochastic: Boolean = false, randomSeed: Int = 0) extends Product with Serializable

OptParams is a Configuration-compatible case class that can be used to select optimization routines at runtime.

Configurations:
1) useStochastic=false, useL1=false: LBFGS with L2 regularization
2) useStochastic=false, useL1=true: OWLQN with L1 regularization
3) useStochastic=true, useL1=false: AdaptiveGradientDescent with L2 regularization
4) useStochastic=true, useL1=true: AdaptiveGradientDescent with L1 regularization
- batchSize
size of batches to use if useStochastic and you give a BatchDiffFunction
- regularization
regularization constant to use.
- alpha
rate of change to use, only applies to SGD.
- useL1
if true, use L1 regularization. Otherwise, use L2.
- tolerance
convergence tolerance, looking at both average improvement and the norm of the gradient.
- useStochastic
if false, use LBFGS or OWLQN. If true, use some variant of Stochastic Gradient Descent.
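As a sketch of how these flags select a routine, the snippet below builds an OptParams for configuration 2 (OWLQN with L1 regularization) and dispatches on it by hand. The toy objective and the manual dispatch are illustrative assumptions, not the library's own dispatch code; the constructor arguments shown for LBFGS and OWLQN follow their public signatures in breeze.optimize.

```scala
import breeze.linalg.DenseVector
import breeze.optimize.{DiffFunction, FirstOrderMinimizer, LBFGS, OWLQN}

object OptParamsSketch {
  // Toy objective: f(x) = ||x - 3||^2, with gradient 2 * (x - 3).
  val f = new DiffFunction[DenseVector[Double]] {
    def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
      val diff = x - 3.0
      (diff dot diff, diff * 2.0)
    }
  }

  // Configuration 2 above: batch (non-stochastic) optimization with L1.
  val params = FirstOrderMinimizer.OptParams(
    useL1 = true,
    regularization = 0.1,
    maxIterations = 100
  )

  // A hand-rolled sketch of the dispatch that consumers of OptParams perform:
  val optimizer: LBFGS[DenseVector[Double]] =
    if (params.useL1)
      new OWLQN[Int, DenseVector[Double]](params.maxIterations, 10, params.regularization)
    else
      new LBFGS[DenseVector[Double]](maxIter = params.maxIterations, m = 10)

  val result = optimizer.minimize(f, DenseVector.zeros[Double](5))
}
```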
- case class SequenceConvergenceCheck[T](checks: IndexedSeq[ConvergenceCheck[T]]) extends ConvergenceCheck[T] with Product with Serializable
- case class State[+T, +ConvergenceInfo, +History](x: T, value: Double, grad: T, adjustedValue: Double, adjustedGradient: T, iter: Int, initialAdjVal: Double, history: History, convergenceInfo: ConvergenceInfo, searchFailed: Boolean = false, convergenceReason: Option[ConvergenceReason] = None) extends Product with Serializable

Tracks information about the optimizer, including the current point, its value, its gradient, and any optimizer-specific history. Also includes information for checking convergence.
- x
the current point being considered
- value
f(x)
- grad
f.gradientAt(x)
- adjustedValue
f(x) + r(x), where r is any regularization added to the objective. For LBFGS, this is f(x).
- adjustedGradient
f'(x) + r'(x), where r is any regularization added to the objective. For LBFGS, this is f'(x).
- iter
what iteration number we are on.
- initialAdjVal
f(x_0) + r(x_0), used for checking convergence
- history
any information needed by the optimizer to do updates.
- searchFailed
did the line search fail?
- convergenceReason
the reason convergence was declared, if any
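These State objects are what a FirstOrderMinimizer emits step by step: the iterations method on any concrete minimizer (e.g. LBFGS) returns an iterator of them, so the fields above can be inspected mid-run. A minimal sketch, assuming the standard DenseVector instances are in scope:

```scala
import breeze.linalg.{DenseVector, norm}
import breeze.optimize.{DiffFunction, LBFGS}

object StateSketch {
  // Toy objective: f(x) = x . x, minimized at the origin.
  val f = new DiffFunction[DenseVector[Double]] {
    def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) =
      (x dot x, x * 2.0)
  }

  val lbfgs = new LBFGS[DenseVector[Double]](maxIter = 50, m = 7)

  // iterations(...) yields one State per optimizer step; inspect the
  // fields documented above (iter, value, adjustedGradient, ...) as we go.
  val states = lbfgs.iterations(f, DenseVector.fill(3)(1.0)).toSeq
  states.foreach { s =>
    println(s"iter=${s.iter} value=${s.value} |adjGrad|=${norm(s.adjustedGradient)}")
  }

  val last = states.last
}
```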
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##(): Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[java.lang]
- Definition Classes
- AnyRef
- Annotations
- @native() @HotSpotIntrinsicCandidate() @throws( ... )
- def defaultConvergenceCheck[T](maxIter: Int, tolerance: Double, relative: Boolean = true, fvalMemory: Int = 20)(implicit space: NormedModule[T, Double]): ConvergenceCheck[T]
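As a usage sketch, defaultConvergenceCheck bundles the standard stopping criteria (iteration cap, function-value convergence, gradient convergence) into a single ConvergenceCheck; the implicit NormedModule it needs is available for DenseVector[Double] via the usual breeze.linalg imports:

```scala
import breeze.linalg.DenseVector
import breeze.optimize.FirstOrderMinimizer

object CheckSketch {
  // Stop after 100 iterations, or earlier once function values and the
  // gradient norm have converged to within 1e-6 (relative by default).
  val check: FirstOrderMinimizer.ConvergenceCheck[DenseVector[Double]] =
    FirstOrderMinimizer.defaultConvergenceCheck[DenseVector[Double]](
      maxIter = 100,
      tolerance = 1e-6
    )
}
```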
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def functionValuesConverged[T](tolerance: Double = 1E-9, relative: Boolean = true, historyLength: Int = 10): ConvergenceCheck[T]
- final def getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- def gradientConverged[T](tolerance: Double, relative: Boolean = true)(implicit space: NormedModule[T, Double]): ConvergenceCheck[T]
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- def maxIterationsReached[T](maxIter: Int): ConvergenceCheck[T]
- def monitorFunctionValues[T](f: (T) ⇒ Double, numFailures: Int = 5, improvementRequirement: Double = 1E-2, evalFrequency: Int = 10): ConvergenceCheck[T]

Runs the function f and, if it fails to decrease by at least improvementRequirement for numFailures evaluations in a row, aborts the optimization.
- evalFrequency
how often we run the evaluation
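A common use is early stopping on a held-out metric. The sketch below wires a hypothetical validationLoss function (an assumption for illustration, not part of the API) into monitorFunctionValues:

```scala
import breeze.linalg.{DenseVector, norm}
import breeze.optimize.FirstOrderMinimizer

object MonitorSketch {
  // Hypothetical user-supplied validation objective.
  def validationLoss(w: DenseVector[Double]): Double = norm(w - 1.0)

  // Abort the optimization if validationLoss fails to improve by at least
  // 1e-3 on 3 consecutive evaluations, checking every 5 iterations.
  val earlyStop: FirstOrderMinimizer.ConvergenceCheck[DenseVector[Double]] =
    FirstOrderMinimizer.monitorFunctionValues[DenseVector[Double]](
      f = validationLoss _,
      numFailures = 3,
      improvementRequirement = 1e-3,
      evalFrequency = 5
    )
}
```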
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- def searchFailed[T]: ConvergenceCheck[T]
- final def synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @throws( ... )
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- object ConvergenceCheck
- object FunctionValuesConverged extends ConvergenceReason with Product with Serializable
- object GradientConverged extends ConvergenceReason with Product with Serializable
- object MaxIterations extends ConvergenceReason with Product with Serializable
- object MonitorFunctionNotImproving extends ConvergenceReason with Product with Serializable
- object ProjectedStepConverged extends ConvergenceReason with Product with Serializable
- object SearchFailed extends ConvergenceReason with Product with Serializable