ml.shifu.guagua.yarn.example.nn
Class Gradient
java.lang.Object
ml.shifu.guagua.yarn.example.nn.Gradient
public class Gradient extends Object
Gradient is copied from the Encog framework. The reason is that the original Encog Gradient does not expose gradients outside the class, while we need the gradients accumulated into NNMaster to update the NN weights.
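To illustrate why exposing gradients matters, here is a minimal sketch (plain Java, hypothetical names, not the Shifu or Encog API) of the accumulation pattern: each worker produces a gradient array over its data split, and a master sums them and applies one gradient-descent step to the shared weights.

```java
// Minimal sketch of master-side gradient accumulation; all names are
// hypothetical stand-ins, not the actual NNMaster implementation.
public class GradientAccumulationSketch {

    /** Sum per-worker gradient arrays element-wise, as a master would. */
    static double[] accumulate(double[][] workerGradients) {
        double[] total = new double[workerGradients[0].length];
        for (double[] g : workerGradients) {
            for (int i = 0; i < g.length; i++) {
                total[i] += g[i];
            }
        }
        return total;
    }

    /** Apply a plain gradient-descent update: w -= learningRate * gradient. */
    static double[] updateWeights(double[] weights, double[] gradient, double learningRate) {
        double[] updated = weights.clone();
        for (int i = 0; i < weights.length; i++) {
            updated[i] -= learningRate * gradient[i];
        }
        return updated;
    }

    public static void main(String[] args) {
        double[][] workerGradients = { {0.2, -0.1}, {0.4, 0.3} };
        double[] weights = {1.0, 1.0};
        double[] total = accumulate(workerGradients);
        double[] updated = updateWeights(weights, total, 0.5);
        System.out.println(java.util.Arrays.toString(updated));
    }
}
```

If the gradients stayed private to each worker, the master could only see per-worker errors and would have no way to perform this combined update.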
Constructor Summary
Gradient(org.encog.neural.flat.FlatNetwork theNetwork,
org.encog.ml.data.MLDataSet theTraining,
double[] flatSpot,
org.encog.neural.error.ErrorFunction ef)
Construct a gradient worker.
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Gradient
public Gradient(org.encog.neural.flat.FlatNetwork theNetwork,
org.encog.ml.data.MLDataSet theTraining,
double[] flatSpot,
org.encog.neural.error.ErrorFunction ef)
- Construct a gradient worker.
- Parameters:
theNetwork - The network to train.
theTraining - The training data.
flatSpot - The flat spot constants.
ef - The error function to use.
run
public final void run()
- Perform the gradient calculation.
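As a rough illustration of what a gradient calculation computes, here is a stripped-down analogue (plain Java, no Encog, hypothetical names) for a single linear neuron with squared error: the gradient of E = 0.5 * (y - target)^2 with respect to each weight, summed over the training rows.

```java
// Hypothetical, stripped-down analogue of a gradient pass: gradient of
// squared error w.r.t. each weight of a single linear neuron
// y = w0*x0 + w1*x1, summed over a small training set.
public class LinearNeuronGradient {

    /** dE/dw for E = 0.5 * (y - target)^2, summed over all rows. */
    static double[] gradients(double[][] inputs, double[] targets, double[] weights) {
        double[] grad = new double[weights.length];
        for (int row = 0; row < inputs.length; row++) {
            double y = 0.0;
            for (int i = 0; i < weights.length; i++) {
                y += weights[i] * inputs[row][i];
            }
            double delta = y - targets[row]; // error term for a linear output
            for (int i = 0; i < weights.length; i++) {
                grad[i] += delta * inputs[row][i];
            }
        }
        return grad;
    }

    public static void main(String[] args) {
        double[][] inputs = { {1.0, 2.0}, {3.0, 4.0} };
        double[] targets = {1.0, 2.0};
        double[] weights = {0.5, 0.5};
        System.out.println(java.util.Arrays.toString(gradients(inputs, targets, weights)));
    }
}
```

The real run() additionally backpropagates through hidden layers and applies the configured error function and flat-spot correction, which this toy example omits.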
getErrorCalculation
public org.encog.mathutil.error.ErrorCalculation getErrorCalculation()
getGradients
public double[] getGradients()
- Returns:
- the gradients
getError
public double getError()
- Returns:
- the error
getWeights
public double[] getWeights()
- Returns:
- the weights
setWeights
public void setWeights(double[] weights)
- Parameters:
weights - the weights to set
setParams
public void setParams(org.encog.neural.networks.BasicNetwork network)
getNetwork
public org.encog.neural.flat.FlatNetwork getNetwork()
getLayerDelta
public double[] getLayerDelta()
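Taken together, these accessors suggest a per-iteration worker cycle: receive weights from the master, recompute gradients, and report gradients plus error back. The sketch below (plain Java, no Encog; GradientLike is a hypothetical stand-in for this class, with a toy error function) shows that API shape.

```java
// Hedged sketch of the per-iteration worker cycle. GradientLike is a
// hypothetical stand-in mirroring Gradient's setWeights/run/getter shape;
// its "gradient calculation" uses a toy error, not real backpropagation.
public class WorkerIterationSketch {

    static class GradientLike {
        private double[] weights;
        private double[] gradients;
        private double error;

        void setWeights(double[] weights) { this.weights = weights.clone(); }
        double[] getWeights() { return weights.clone(); }

        /** Toy "gradient calculation": gradient of E = 0.5 * sum(w_i^2). */
        void run() {
            gradients = weights.clone(); // dE/dw_i = w_i for this toy error
            error = 0.0;
            for (double w : weights) {
                error += 0.5 * w * w;
            }
        }

        double[] getGradients() { return gradients.clone(); }
        double getError() { return error; }
    }

    public static void main(String[] args) {
        GradientLike worker = new GradientLike();
        worker.setWeights(new double[] {3.0, 4.0}); // weights from the master
        worker.run();                               // recompute gradients
        System.out.println(java.util.Arrays.toString(worker.getGradients()));
        System.out.println(worker.getError()); // 0.5*(9+16) = 12.5
    }
}
```

In the actual Shifu flow, the master would aggregate each worker's getGradients() and getError() results and broadcast updated weights back via setWeights() for the next iteration.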
Copyright © 2014. All Rights Reserved.