class NNLS extends SerializableLogging
NNLS solves nonnegative least squares problems using a modified projected gradient method.
Linear Supertypes: SerializableLogging, Serializable, Serializable, AnyRef, Any
Type Members
- type BDM = DenseMatrix[Double]
- type BDV = DenseVector[Double]
- case class State extends Product with Serializable
Value Members
- final def !=(arg0: Any): Boolean
- final def ##(): Int
- final def ==(arg0: Any): Boolean
- final def asInstanceOf[T0]: T0
- protected def clone(): AnyRef
- final def eq(arg0: AnyRef): Boolean
- def equals(arg0: Any): Boolean
- final def getClass(): Class[_]
- def hashCode(): Int
- def initialize(n: Int): State
Solve a least squares problem, possibly with nonnegativity constraints, by a modified projected gradient method. That is, find x minimising ||Ax - b||_2 given A^T A and A^T b.
We solve the problem min_x 1/2 x^T (A^T A) x - x^T (A^T b) subject to x >= 0.
The method used is similar to one described by Polyak (B. T. Polyak, The conjugate gradient method in extremal problems, Zh. Vychisl. Mat. Mat. Fiz. 9(4) (1969), pp. 94-112) for bound-constrained nonlinear programming. Polyak unconditionally uses a conjugate gradient direction, however, while this method uses a conjugate gradient direction only if the last iteration did not cause a previously inactive constraint to become active.
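A minimal usage sketch. The matrix and vector values here are illustrative, and the no-argument `NNLS` constructor is an assumption; the solver consumes the normal-equation terms A^T A and A^T b rather than A and b directly:

```scala
import breeze.linalg.{DenseMatrix, DenseVector}
import breeze.optimize.linear.NNLS

object NNLSSketch {
  def main(args: Array[String]): Unit = {
    // A small overdetermined problem: minimise ||Ax - b||_2 subject to x >= 0.
    val A = DenseMatrix((1.0, 0.0), (0.0, 2.0), (1.0, 1.0))
    val b = DenseVector(1.0, -2.0, 0.5)

    // The solver works on the Gram matrix and linear term.
    val ata = A.t * A
    val atb = A.t * b

    val solver = new NNLS()
    val x = solver.minimize(ata, atb)
    // Every component of x is nonnegative; components that an unconstrained
    // least-squares solve would drive negative are clamped at zero.
    println(x)
  }
}
```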
- final def isInstanceOf[T0]: Boolean
- protected def logger: LazyLogger
- val maxIters: Int
- def minimize(ata: DenseMatrix[Double], atb: DenseVector[Double], init: State): DenseVector[Double]
- def minimize(ata: DenseMatrix[Double], atb: DenseVector[Double]): DenseVector[Double]
- def minimizeAndReturnState(ata: DenseMatrix[Double], atb: DenseVector[Double]): State
- def minimizeAndReturnState(ata: DenseMatrix[Double], atb: DenseVector[Double], initialState: State, resetState: Boolean = true): State
minimizeAndReturnState allows users to hot start the solver using initialState. If an initialState is provided and resetState is set to false, the optimizer hot starts from the previous state. By default resetState is true, and reset is called on the incoming state each time.
- ata
Gram matrix (A^T A)
- atb
linear term (A^T b)
- initialState
initial state for calling the solver from inner loops
- resetState
whether to reset the incoming state before solving
- returns
converged state
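A sketch of the hot-start pattern described above, reusing one State across repeated small solves (as in an inner loop of an alternating algorithm). The problem data and the no-argument constructor are assumptions for illustration; the solution field on State is not documented here, so the sketch only threads the state through:

```scala
import breeze.linalg.{DenseMatrix, DenseVector}
import breeze.optimize.linear.NNLS

object HotStartSketch {
  def main(args: Array[String]): Unit = {
    val solver = new NNLS()
    // Allocate workspace once for problems of dimension n.
    var state = solver.initialize(2)

    // Two unrelated 2x2 problems, given as (A^T A, A^T b) pairs.
    val problems = Seq(
      (DenseMatrix((2.0, 0.0), (0.0, 2.0)), DenseVector(1.0, 1.0)),
      (DenseMatrix((4.0, 1.0), (1.0, 3.0)), DenseVector(1.0, 2.0))
    )

    for ((ata, atb) <- problems) {
      // resetState = true (the default) calls reset on the incoming state,
      // so the same State object can be recycled for unrelated problems.
      // Passing resetState = false instead would continue from the
      // previous solution, which is useful when successive problems
      // are small perturbations of each other.
      state = solver.minimizeAndReturnState(ata, atb, state, resetState = true)
    }
  }
}
```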
- final def ne(arg0: AnyRef): Boolean
- final def notify(): Unit
- final def notifyAll(): Unit
- def reset(ata: DenseMatrix[Double], atb: DenseVector[Double], state: State): State
- final def synchronized[T0](arg0: ⇒ T0): T0
- def toString(): String
- final def wait(arg0: Long, arg1: Int): Unit
- final def wait(arg0: Long): Unit
- final def wait(): Unit