org.emmalanguage.lib.ml.optimization.error
Sum of Squares Error (SSE)

  loss:     E(w)  = 1/2 * sum{ (w^T x - y)^2 }
  gradient: dE(w) = sum{ (w^T x - y) * x }
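A minimal sketch of these two formulas over a dataset of (features, label) pairs, assuming dense `Array[Double]` feature vectors; the names `sseLoss` and `sseGradient` are illustrative and not the library's actual API:

```scala
object SSE {
  // w^T x for dense vectors
  def dot(w: Array[Double], x: Array[Double]): Double =
    w.zip(x).map { case (a, b) => a * b }.sum

  // E(w) = 1/2 * sum{ (w^T x - y)^2 }
  def sseLoss(w: Array[Double], data: Seq[(Array[Double], Double)]): Double =
    0.5 * data.map { case (x, y) => math.pow(dot(w, x) - y, 2) }.sum

  // dE(w) = sum{ (w^T x - y) * x }
  def sseGradient(w: Array[Double], data: Seq[(Array[Double], Double)]): Array[Double] =
    data.map { case (x, y) =>
      val residual = dot(w, x) - y
      x.map(_ * residual)          // (w^T x - y) * x for one example
    }.reduce((a, b) => a.zip(b).map { case (p, q) => p + q })
}
```

The 1/2 factor in the loss exists only so that it cancels against the exponent when differentiating, which is why the gradient carries no constant in front of the sum.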