| Interface Summary | |
|---|---|
| ActivationFunction | This interface allows various activation functions to be used with the neural network. |
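A pluggable activation interface typically pairs a forward function with its derivative so gradient-based training can use it. The sketch below is a hypothetical minimal interface of that shape with a sigmoid implementation; the method names `apply` and `derivative` are illustrative assumptions, not this package's actual API.

```java
// Hypothetical minimal activation-function interface; the real interface
// in this package may declare different method names and signatures.
interface SimpleActivation {
    double apply(double x);        // forward pass
    double derivative(double x);   // needed by gradient-based training
}

// Sigmoid implementation of the hypothetical interface.
class SigmoidActivation implements SimpleActivation {
    public double apply(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }
    public double derivative(double x) {
        double s = apply(x);
        return s * (1.0 - s); // d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
    }
}
```

Because the network only depends on the interface, swapping one activation for another (say, tanh for sigmoid) requires no change to the training code.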
| Class Summary | |
|---|---|
| ActivationBiPolar | BiPolar activation function. |
| ActivationCompetitive | An activation function that allows only a specified number of outbound connections, usually one, to win. |
| ActivationGaussian | An activation function based on the Gaussian function. |
| ActivationLinear | A linear activation function that passes its input through unmodified; strictly speaking, it is not an activation function at all. |
| ActivationLOG | An activation function based on the logarithm function. |
| ActivationRamp | A ramp activation function. |
| ActivationSigmoid | The sigmoid activation function, which squashes its input into the range (0, 1). |
| ActivationSIN | An activation function based on the sine function. |
| ActivationSoftMax | The softmax activation function. |
| ActivationStep | A simple step activation function that outputs one of two values, depending on whether the input reaches a threshold. |
| ActivationTANH | The hyperbolic tangent activation function, which squashes its input into the range (-1, 1). |
This package contains all of the classes for activation functions.
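To show the math these classes implement, the sketch below computes a few of the functions listed above (sigmoid, tanh, step, softmax) as plain formulas. It is a self-contained illustration and does not use this package's API; class and method names here are illustrative assumptions.

```java
// Plain-Java sketch of the math behind several activation functions
// listed above. Not this package's API; for illustration only.
public class ActivationSketch {

    // Sigmoid: squashes input into (0, 1).
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Step: 1 at or above the threshold, otherwise 0.
    static double step(double x, double threshold) {
        return x >= threshold ? 1.0 : 0.0;
    }

    // Softmax: normalizes a vector into a probability distribution.
    static double[] softmax(double[] v) {
        double sum = 0.0;
        double[] out = new double[v.length];
        for (int i = 0; i < v.length; i++) {
            out[i] = Math.exp(v[i]);
            sum += out[i];
        }
        for (int i = 0; i < v.length; i++) {
            out[i] /= sum;
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0));      // 0.5
        System.out.println(Math.tanh(0.0));    // 0.0 -- tanh squashes into (-1, 1)
        System.out.println(step(0.3, 0.5));    // 0.0
        double[] p = softmax(new double[] {1.0, 1.0});
        System.out.println(p[0] + " " + p[1]); // 0.5 0.5
    }
}
```

The shared theme is that each function maps a neuron's raw weighted sum to a bounded output, which is what lets the layers above and below it stay numerically stable during training.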