Uses of Class
org.joone.engine.SimpleLayer

Packages that use SimpleLayer
org.joone.engine

Uses of SimpleLayer in org.joone.engine
 

Subclasses of SimpleLayer in org.joone.engine
 class BiasedLinearLayer
          This layer consists of linear neurons, i.e. neurons whose output is the weighted sum of the input values plus a bias value.
 class ContextLayer
          The context layer is similar to the linear layer except that it has an auto-recurrent connection between its output and input.
 class GaussianLayer
          This layer implements the Gaussian Neighborhood SOM strategy.
 class GaussLayer
          The output of a Gauss(ian) layer neuron is the sum of the weighted input values, applied to a Gaussian curve (exp(- x * x)).
 class LinearLayer
          The output of a linear layer neuron is the sum of the weighted input values, scaled by the beta parameter.
 class LogarithmicLayer
          This layer implements a logarithmic transfer function.
 class SigmoidLayer
          The output of a sigmoid layer neuron is the sum of the weighted input values, applied to a sigmoid function.
 class SineLayer
          The output of a sine layer neuron is the sum of the weighted input values, applied to a sine (sin(x)).
 class SoftmaxLayer
          The outputs of the Softmax layer must be interpreted as probabilities.
 class TanhLayer
          Layer that applies the hyperbolic tangent transfer function to its input patterns (see the sketch after this list).
 class WTALayer
          This layer implements the Winner Takes All SOM strategy.
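
The descriptions above name concrete transfer functions. The following is a minimal sketch, in plain Java with no Joone classes, of how those functions map a neuron's weighted input sum to its output; the class name, the beta value, and the sample inputs are illustrative assumptions, not part of the Joone API.

import java.util.Arrays;

/** Illustrative sketch of the transfer functions summarized above. */
public class TransferFunctionSketch {

    // LinearLayer: weighted input sum scaled by the beta parameter.
    static double linear(double x, double beta) { return beta * x; }

    // GaussLayer: weighted input sum applied to a Gaussian curve exp(-x * x).
    static double gauss(double x) { return Math.exp(-x * x); }

    // SigmoidLayer: weighted input sum applied to a sigmoid function.
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // SineLayer: weighted input sum applied to sin(x).
    static double sine(double x) { return Math.sin(x); }

    // TanhLayer: weighted input sum applied to the hyperbolic tangent.
    static double tanh(double x) { return Math.tanh(x); }

    // SoftmaxLayer: outputs sum to 1 and can be read as probabilities.
    static double[] softmax(double[] xs) {
        double[] out = new double[xs.length];
        double sum = 0.0;
        for (int i = 0; i < xs.length; i++) {
            out[i] = Math.exp(xs[i]);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }

    public static void main(String[] args) {
        double x = 0.5;  // an example weighted input sum
        System.out.println("linear : " + linear(x, 1.0));
        System.out.println("gauss  : " + gauss(x));
        System.out.println("sigmoid: " + sigmoid(x));
        System.out.println("sine   : " + sine(x));
        System.out.println("tanh   : " + tanh(x));
        System.out.println("softmax: " + Arrays.toString(softmax(new double[]{0.5, 1.0, -0.5})));
    }
}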
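
Below is a minimal wiring sketch showing how SimpleLayer subclasses such as LinearLayer and SigmoidLayer are typically combined into a network. It is modeled on the standard Joone XOR tutorial; the method names used (setRows, addInputSynapse, addOutputSynapse, NeuralNet.addLayer) are assumed to match that tutorial and are not taken from this page.

import org.joone.engine.FullSynapse;
import org.joone.engine.LinearLayer;
import org.joone.engine.SigmoidLayer;
import org.joone.net.NeuralNet;

public class SimpleLayerWiringSketch {
    public static void main(String[] args) {
        // Input, hidden and output layers built from SimpleLayer subclasses.
        LinearLayer input = new LinearLayer();
        input.setRows(2);
        SigmoidLayer hidden = new SigmoidLayer();
        hidden.setRows(3);
        SigmoidLayer output = new SigmoidLayer();
        output.setRows(1);

        // Fully connected synapses between consecutive layers.
        FullSynapse synapseIH = new FullSynapse();
        FullSynapse synapseHO = new FullSynapse();
        input.addOutputSynapse(synapseIH);
        hidden.addInputSynapse(synapseIH);
        hidden.addOutputSynapse(synapseHO);
        output.addInputSynapse(synapseHO);

        // Assemble the network.
        NeuralNet net = new NeuralNet();
        net.addLayer(input, NeuralNet.INPUT_LAYER);
        net.addLayer(hidden, NeuralNet.HIDDEN_LAYER);
        net.addLayer(output, NeuralNet.OUTPUT_LAYER);
    }
}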
 


