org.encog.neural.networks.training.cross
Class CrossValidationKFold

java.lang.Object
  extended by org.encog.neural.networks.training.BasicTraining
      extended by org.encog.neural.networks.training.cross.CrossTraining
          extended by org.encog.neural.networks.training.cross.CrossValidationKFold
All Implemented Interfaces:
Train

public class CrossValidationKFold
extends CrossTraining

Train using K-fold cross validation. Each iteration will train a number of times equal to the number of folds - 1. Each of these sub-iterations trains on all of the data except the held-out fold, which is used for validation. The reported error therefore reflects data that was not always used as part of training, giving a better estimate of how the network will perform on unseen (validation) data. The cross validation trainer must be provided with some other trainer, such as RPROP, to actually perform the training. The training data must be a FoldedDataSet; the folded dataset can wrap most other training sets.
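The scheme described above can be sketched in plain Java. This is an illustrative stand-in, not Encog's implementation: the `KFoldSketch` class, its trivial mean-predictor "model", and all method names here are hypothetical, chosen only to show how each fold is held out for validation while the remaining folds are used for training, and how the resulting error averages over held-out data only.

```java
import java.util.Arrays;

/**
 * Minimal sketch of the K-fold scheme described above (illustrative only,
 * not Encog's implementation): each pass trains on all folds except one
 * held-out fold, and the reported error comes from the held-out fold.
 */
public class KFoldSketch {

    /** Split n sample indices into k contiguous folds. */
    static int[][] makeFolds(int n, int k) {
        int[][] folds = new int[k][];
        int start = 0;
        for (int f = 0; f < k; f++) {
            int size = n / k + (f < n % k ? 1 : 0); // spread the remainder
            folds[f] = new int[size];
            for (int i = 0; i < size; i++) {
                folds[f][i] = start + i;
            }
            start += size;
        }
        return folds;
    }

    /**
     * One cross-validation pass: for each fold, "train" on the other folds
     * and score on the held-out fold; return the average validation error.
     * The "model" is a trivial mean predictor over the training folds.
     */
    static double crossValidate(double[] targets, int k) {
        int[][] folds = makeFolds(targets.length, k);
        double totalError = 0;
        for (int held = 0; held < k; held++) {
            // "Train": mean of every sample outside the held-out fold.
            double sum = 0;
            int count = 0;
            for (int f = 0; f < k; f++) {
                if (f == held) continue;
                for (int i : folds[f]) { sum += targets[i]; count++; }
            }
            double prediction = sum / count;
            // "Validate": mean squared error on the held-out fold only.
            double mse = 0;
            for (int i : folds[held]) {
                double d = targets[i] - prediction;
                mse += d * d;
            }
            totalError += mse / folds[held].length;
        }
        // Error reflects only data that was not used for training.
        return totalError / k;
    }

    public static void main(String[] args) {
        double[] targets = {1, 1, 1, 1, 5, 5, 5, 5};
        System.out.println(Arrays.deepToString(makeFolds(8, 4)));
        System.out.printf("avg validation error: %.3f%n",
                crossValidate(targets, 4));
    }
}
```

In actual Encog code, the folding and sub-training are handled for you: the FoldedDataSet manages the fold boundaries, and CrossValidationKFold drives the wrapped trainer across them on each call to iteration().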


Constructor Summary
CrossValidationKFold(Train train, int k)
          Construct a cross validation trainer.
 
Method Summary
 void iteration()
          Perform one iteration.
 
Methods inherited from class org.encog.neural.networks.training.cross.CrossTraining
getFolded, getNetwork
 
Methods inherited from class org.encog.neural.networks.training.BasicTraining
addStrategy, finishTraining, getCloud, getError, getIteration, getStrategies, getTraining, isTrainingDone, iteration, postIteration, preIteration, setCloud, setError, setIteration, setTraining
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

CrossValidationKFold

public CrossValidationKFold(Train train,
                            int k)
Construct a cross validation trainer.

Parameters:
train - The underlying trainer to use on each fold.
k - The number of folds.
Method Detail

iteration

public void iteration()
Perform one iteration.



Copyright © 2011. All Rights Reserved.