org.encog.mathutil.randomize
Class NguyenWidrowRandomizer
java.lang.Object
org.encog.mathutil.randomize.BasicRandomizer
org.encog.mathutil.randomize.RangeRandomizer
org.encog.mathutil.randomize.NguyenWidrowRandomizer
- All Implemented Interfaces:
- Randomizer
public class NguyenWidrowRandomizer
extends RangeRandomizer
implements Randomizer
Implementation of Nguyen-Widrow weight initialization. This is the
default weight initialization used by Encog, as it generally provides the
most trainable neural network.
- Author:
- Stéphan Corriveau
Method Summary

void randomize(BasicNetwork network, int fromLayer)
     Randomize one level of a neural network.

void randomize(MLMethod method)
     Apply the Nguyen-Widrow initialization algorithm (described in the method detail below).

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
NguyenWidrowRandomizer
public NguyenWidrowRandomizer(double min,
double max)
- Construct a Nguyen-Widrow randomizer.
- Parameters:
min - The min of the range.
max - The max of the range.
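Example (illustrative). The sketch below assumes the Encog 3 BasicNetwork/BasicLayer API and a small 2-3-1 topology chosen only for demonstration; the range (-1, 1) is likewise just a common choice, not a value required by this class.

import org.encog.engine.network.activation.ActivationSigmoid;
import org.encog.mathutil.randomize.NguyenWidrowRandomizer;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.layers.BasicLayer;

public class NguyenWidrowUsage {
    public static void main(String[] args) {
        // Build a small 2-3-1 feedforward network (illustrative topology).
        BasicNetwork network = new BasicNetwork();
        network.addLayer(new BasicLayer(null, true, 2));
        network.addLayer(new BasicLayer(new ActivationSigmoid(), true, 3));
        network.addLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
        network.getStructure().finalizeStructure();
        network.reset();

        // Replace the default random weights with Nguyen-Widrow values
        // drawn from the range [-1, 1].
        new NguyenWidrowRandomizer(-1, 1).randomize(network);
    }
}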
randomize
public final void randomize(MLMethod method)
- The Nguyen-Widrow initialization algorithm is the following:
  1. Initialize all weights of the hidden layers with (ranged) random values.
  2. For each hidden layer:
     2.1 Calculate the beta value: 0.7 times the Nth root of the number of
         neurons in the current layer, where N is the number of neurons in
         the input layer.
     2.2 For each synapse:
         2.2.1 For each weight:
         2.2.2 Adjust the weight by dividing it by the norm of the weights
               for its neuron and multiplying it by the beta value.
  An illustrative sketch of these steps follows this entry.
- Specified by:
randomize in interface Randomizer
- Overrides:
randomize in class BasicRandomizer
- Parameters:
method - The network to randomize.
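To make the steps above concrete, here is a self-contained sketch for a single hidden layer. The class and method names (NguyenWidrowSketch, initialize) and the weight layout are hypothetical; this is not Encog's internal implementation.

import java.util.Random;

public final class NguyenWidrowSketch {

    // Hypothetical helper illustrating the steps above for one hidden layer:
    // weights[j][i] is the weight from input neuron i to hidden neuron j.
    public static double[][] initialize(int inputCount, int hiddenCount,
            double min, double max, Random rnd) {
        // Step 1: fill with ranged random values.
        double[][] weights = new double[hiddenCount][inputCount];
        for (int j = 0; j < hiddenCount; j++) {
            for (int i = 0; i < inputCount; i++) {
                weights[j][i] = min + rnd.nextDouble() * (max - min);
            }
        }

        // Step 2.1: beta = 0.7 * (#neurons in this layer)^(1 / #input neurons).
        double beta = 0.7 * Math.pow(hiddenCount, 1.0 / inputCount);

        // Steps 2.2.1-2.2.2: rescale each neuron's weight vector to length beta.
        for (int j = 0; j < hiddenCount; j++) {
            double norm = 0;
            for (int i = 0; i < inputCount; i++) {
                norm += weights[j][i] * weights[j][i];
            }
            norm = Math.sqrt(norm);
            for (int i = 0; i < inputCount; i++) {
                weights[j][i] = beta * weights[j][i] / norm;
            }
        }
        return weights;
    }
}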
randomize
public void randomize(BasicNetwork network,
int fromLayer)
- Randomize one level of a neural network.
- Overrides:
randomize in class BasicRandomizer
- Parameters:
network - The network to randomize.
fromLayer - The layer to start randomizing from.
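A brief illustrative call for this overload; it assumes a finalized BasicNetwork built as in the earlier example, and the starting layer index 1 is only an example.

import org.encog.mathutil.randomize.NguyenWidrowRandomizer;
import org.encog.neural.networks.BasicNetwork;

public class RandomizeOneLevel {
    // Re-initialize a single level of an already-built network.
    // The starting layer index (1) is illustrative; valid values depend on
    // the network's topology.
    public static void reinitialize(BasicNetwork network) {
        new NguyenWidrowRandomizer(-1, 1).randomize(network, 1);
    }
}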