org.encog.engine.network.activation
Interface ActivationFunction

All Superinterfaces:
Cloneable, Serializable
All Known Implementing Classes:
ActivationBiPolar, ActivationCompetitive, ActivationGaussian, ActivationLinear, ActivationLOG, ActivationRamp, ActivationSigmoid, ActivationSIN, ActivationSoftMax, ActivationStep, ActivationTANH

public interface ActivationFunction
extends Serializable, Cloneable

This interface allows various activation functions to be used with the neural network. Activation functions are applied to the output from each layer of a neural network and scale that output into the desired range. Methods are provided to evaluate both the activation function itself and its derivative. Some training algorithms, particularly backpropagation, require the derivative of the activation function. Not all activation functions support derivatives; if you implement an activation function that has no derivative, the derivativeFunction method implementation should throw an exception. Activation functions without derivatives are perfectly valid; they simply cannot be used with every training algorithm.
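As a minimal sketch of this contract, a sigmoid-style implementation might look like the following. This is a standalone illustration with a hypothetical class name (SigmoidSketch); a real implementation would implement org.encog.engine.network.activation.ActivationFunction and also provide the parameter, clone, and OpenCL methods.

```java
// Standalone sketch mirroring the ActivationFunction contract (hypothetical
// class name; not part of Encog itself).
public class SigmoidSketch {

    // Apply the sigmoid in place to `size` values starting at `start`,
    // matching the in-place semantics of activationFunction(double[], int, int).
    public void activationFunction(double[] d, int start, int size) {
        for (int i = start; i < start + size; i++) {
            d[i] = 1.0 / (1.0 + Math.exp(-d[i]));
        }
    }

    // Per the interface docs, d is assumed to be the *output* previously
    // produced by this activation, so the sigmoid derivative is d * (1 - d).
    public double derivativeFunction(double d) {
        return d * (1.0 - d);
    }

    // The sigmoid is differentiable everywhere, so propagation training
    // can use this function.
    public boolean hasDerivative() {
        return true;
    }
}
```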


Method Summary
 void activationFunction(double[] d, int start, int size)
          Implements the activation function.
 ActivationFunction clone()
          Clone the activation function.
 double derivativeFunction(double d)
          Calculate the derivative of the activation.
 String getOpenCLExpression(boolean derivative)
          Returns the OpenCL expression for this activation function.
 String[] getParamNames()
          Get the names of the parameters for this activation function.
 double[] getParams()
          Get the parameters for this activation function.
 boolean hasDerivative()
          Determine whether this activation function has a derivative.
 void setParam(int index, double value)
          Set one of the params for this activation function.
 

Method Detail

activationFunction

void activationFunction(double[] d,
                        int start,
                        int size)
Implements the activation function. The array is modified according to the activation function being used. See the class description for more specific information on this type of activation function.

Parameters:
d - The input array to the activation function.
start - The starting index.
size - The number of values to calculate.

derivativeFunction

double derivativeFunction(double d)
Calculate the derivative of the activation. The value d passed to this method is assumed to be the output previously produced by this activation function; this avoids recalculating the activation just to obtain the derivative. See the class description for more specific information on this type of activation function. Propagation training requires the derivative. Some activation functions do not support a derivative and will throw an exception.

Parameters:
d - The previous output of the activation function.
Returns:
The derivative.
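The derivative-from-output convention can be illustrated with tanh: if y = tanh(x) was the activation output, the derivative can be recovered from y alone as 1 - y². A hedged standalone sketch (hypothetical class name TanhDerivativeDemo, not an Encog class):

```java
// Demonstrates computing a derivative from the activation's *output*,
// as derivativeFunction expects, rather than from the original input.
public class TanhDerivativeDemo {

    // y is assumed to be tanh(x) for some input x; then d/dx tanh(x) = 1 - y * y.
    public static double derivativeFromOutput(double y) {
        return 1.0 - y * y;
    }
}
```

This is why ActivationSigmoid and ActivationTANH are cheap to train with propagation: the derivative falls out of the forward-pass output with no extra transcendental evaluation.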

hasDerivative

boolean hasDerivative()
Returns:
True if this function has a derivative.

getParams

double[] getParams()
Returns:
The params for this activation function.

setParam

void setParam(int index,
              double value)
Set one of the params for this activation function.

Parameters:
index - The index of the param to set.
value - The value to set.
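The getParams()/setParam(int, double) pair suggests a simple indexed-parameter pattern. A hedged standalone sketch of a parameterized activation (hypothetical class name RampSketch and hypothetical parameter layout; the real ActivationRamp defines its own parameters):

```java
// Hypothetical parameterized activation: a linear scaling whose slope is
// stored in a params array, in the style of getParams()/setParam(int, double).
public class RampSketch {

    // Index 0 holds the slope (an assumed layout for illustration only).
    private final double[] params = {1.0};

    public double[] getParams() {
        return params;
    }

    public void setParam(int index, double value) {
        params[index] = value;
    }

    // Scale `size` values in place starting at `start` by the slope parameter.
    public void activationFunction(double[] d, int start, int size) {
        for (int i = start; i < start + size; i++) {
            d[i] *= params[0];
        }
    }
}
```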

getParamNames

String[] getParamNames()
Returns:
The names of the parameters.

clone

ActivationFunction clone()
Returns:
A cloned copy of this activation function.

getOpenCLExpression

String getOpenCLExpression(boolean derivative)
Returns the OpenCL expression for this activation function.

Parameters:
derivative - True if we want the derivative, false otherwise.
Returns:
The OpenCL expression for this activation function.


Copyright © 2011. All Rights Reserved.