Uses of Class
org.joone.engine.Layer

Packages that use Layer
org.joone.engine   
org.joone.net   
org.joone.structure   
 

Uses of Layer in org.joone.engine
 

Subclasses of Layer in org.joone.engine
 class BiasedLinearLayer
          This layer consists of linear neurons, i.e. the output of each neuron is the weighted sum of its inputs plus a bias.
 class ContextLayer
          The context layer is similar to the linear layer except that it has an auto-recurrent connection between its output and input.
 class DelayLayer
          Delay unit to create temporal windows from time series; it emits the delayed outputs of the input signal up to Yk(t-N).
 class GaussianLayer
          This layer implements the Gaussian Neighborhood SOM strategy.
 class GaussLayer
          The output of a Gauss(ian) layer neuron is the sum of the weighted input values, applied to a Gaussian curve (exp(-x * x)).
 class LinearLayer
          The output of a linear layer neuron is the sum of the weighted input values, scaled by the beta parameter.
 class LogarithmicLayer
          This layer implements a logarithmic transfer function.
 class MemoryLayer
           
 class RbfGaussianLayer
          This class implements the nonlinear layer in Radial Basis Function (RBF) networks using Gaussian functions.
 class RbfLayer
          This is the basis (helper) for radial basis function layers.
 class SigmoidLayer
          The output of a sigmoid layer neuron is the sum of the weighted input values, applied to a sigmoid function.
 class SimpleLayer
          This abstract class represents layers composed of neurons that implement some transfer function.
 class SineLayer
          The output of a sine layer neuron is the sum of the weighted input values, applied to the sine function (sin(x)).
 class SoftmaxLayer
          The outputs of the Softmax layer must be interpreted as probabilities.
 class TanhLayer
          Layer that applies the hyperbolic tangent transfer function to its input patterns.
 class WTALayer
          This layer implements the Winner Takes All SOM strategy.
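
The layer classes listed above differ mainly in the transfer function they apply; each is sized with setRows and wired to its neighbours through synapses. The following is a minimal sketch using the standard org.joone.engine API (LinearLayer, SigmoidLayer, FullSynapse); the layer names and sizes are illustrative only.

import org.joone.engine.FullSynapse;
import org.joone.engine.LinearLayer;
import org.joone.engine.SigmoidLayer;

public class LayerWiringSketch {
    public static void main(String[] args) {
        // A linear layer with 2 neurons feeding a sigmoid layer with 3 neurons.
        LinearLayer input = new LinearLayer();
        input.setLayerName("input");
        input.setRows(2);

        SigmoidLayer hidden = new SigmoidLayer();
        hidden.setLayerName("hidden");
        hidden.setRows(3);

        // A FullSynapse connects every neuron of the source layer
        // to every neuron of the destination layer.
        FullSynapse inputToHidden = new FullSynapse();
        input.addOutputSynapse(inputToHidden);
        hidden.addInputSynapse(inputToHidden);
    }
}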
 

Fields in org.joone.engine declared as Layer
protected  Layer RTRLLearnerFactory.inputLayer
          The input layer
protected  Layer RTRLLearnerFactory.Weight.layer
          The Joone layer used if this weight is a bias
protected  Layer RTRLLearnerFactory.Node.layer
          The layer in which this node is found
protected  Layer RTRLLearnerFactory.outputLayer
          The output layer from which errors are calculated and which is used to determine whether a node is in T
 

Constructors in org.joone.engine with parameters of type Layer
RTRLLearnerFactory.Node(Layer layer, int index)
          Create a new node from a Joone layer
RTRLLearnerFactory.Weight(Layer layer, int i, int K)
          Initialise this weight from a Joone layer
 

Uses of Layer in org.joone.net
 

Subclasses of Layer in org.joone.net
 class NestedNeuralLayer
           
 

Methods in org.joone.net that return Layer
 Layer[] NeuralNet.calculateOrderedLayers()
          This method calculates the order of the layers of the network, from the input to the output.
 Layer NeuralNet.findInputLayer()
          Returns the input layer by searching for it according to the rules described in Layer.isInputLayer.
 Layer NeuralNet.findOutputLayer()
          Returns the output layer by searching for it according to the rules described in Layer.isOutputLayer.
 Layer NeuralNet.getInputLayer()
          Returns the input layer of the network.
 Layer NeuralNet.getLayer(java.lang.String layerName)
           
 Layer[] NeuralNet.getOrderedLayers()
           
 Layer NeuralNet.getOutputLayer()
          Returns the output layer of the network.
 

Methods in org.joone.net with parameters of type Layer
 void NeuralNet.addLayer(Layer layer)
           
 void NeuralNet.addLayer(Layer layer, int tier)
           
 void NeuralNet.removeLayer(Layer layer)
           
 void NeuralNet.setInputLayer(Layer newLayer)
           
 void NeuralNet.setOrderedLayers(Layer[] orderedLayers)
          This method allows a particular traversal order of the Layers to be set externally.
 void NeuralNet.setOutputLayer(Layer newLayer)
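
As a usage note for the assembly methods above: layers are registered with addLayer (optionally giving the tier), and setInputLayer/setOutputLayer record which layers getInputLayer and getOutputLayer should return. A minimal sketch, assuming the NeuralNet tier constants INPUT_LAYER, HIDDEN_LAYER and OUTPUT_LAYER and omitting the synapses between the layers:

import org.joone.engine.LinearLayer;
import org.joone.engine.SigmoidLayer;
import org.joone.net.NeuralNet;

public class NeuralNetAssemblySketch {
    public static void main(String[] args) {
        LinearLayer input = new LinearLayer();
        input.setRows(2);
        SigmoidLayer hidden = new SigmoidLayer();
        hidden.setRows(3);
        SigmoidLayer output = new SigmoidLayer();
        output.setRows(1);

        NeuralNet net = new NeuralNet();
        // addLayer(Layer, int) places each layer in a tier of the network.
        net.addLayer(input, NeuralNet.INPUT_LAYER);
        net.addLayer(hidden, NeuralNet.HIDDEN_LAYER);
        net.addLayer(output, NeuralNet.OUTPUT_LAYER);

        // Record the endpoints so getInputLayer()/getOutputLayer() return them.
        net.setInputLayer(input);
        net.setOutputLayer(output);
    }
}

In a complete network the three layers would also be connected by synapses (addInputSynapse/addOutputSynapse), as in the earlier sketch.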
           
 

Uses of Layer in org.joone.structure
 

Subclasses of Layer in org.joone.structure
 class NetworkLayer
          Wraps an existing Joone network into a single layer.
 

Fields in org.joone.structure declared as Layer
protected  Layer NodesAndWeights.Weight.layer
          The Joone layer used if this weight is a bias
protected  Layer NodesAndWeights.Node.layer
          The layer in which this node is found
 

Methods in org.joone.structure that return Layer
protected  Layer Nakayama.findInputLayer(Synapse aSynapse)
          Finds the input layer of a synapse.
protected  Layer Nakayama.findOutputLayer(Synapse aSynapse)
          Finds the output layer of a synapse.
 

Methods in org.joone.structure with parameters of type Layer
 void Nakayama.addLayer(Layer aLayer)
          Adds layers to this optimizer.
protected  double Nakayama.getSumAbsoluteWeights(Layer aLayer, int aNeuron)
          Sums up all the absolute values of the output weights of a neuron within a layer.
static void NodeFactory.setNodeFunctions(AbstractNode node, Layer layer)
          Sets the transport and derivative functions of a node from the type of layer it is found in.
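
Nakayama.getSumAbsoluteWeights sums the absolute values of a neuron's outgoing weights; a small total suggests the neuron contributes little to the next layer. The helper below is a hypothetical, self-contained illustration of that sum over a plain weight matrix (rows = source neurons, columns = destination neurons); it does not use Joone's internal weight representation.

public final class WeightSumSketch {
    // Sums the absolute values of the output weights of one neuron, given a
    // matrix where weights[source][destination] is the weight from a source
    // neuron to a destination neuron in the next layer.
    static double sumAbsoluteOutputWeights(double[][] weights, int neuron) {
        double sum = 0.0;
        for (double w : weights[neuron]) {
            sum += Math.abs(w);
        }
        return sum;
    }

    public static void main(String[] args) {
        double[][] weights = {
            { 0.5, -1.2 },  // output weights of neuron 0
            { 0.0,  0.1 }   // output weights of neuron 1
        };
        System.out.println(sumAbsoluteOutputWeights(weights, 0)); // 1.7
        System.out.println(sumAbsoluteOutputWeights(weights, 1)); // 0.1
    }
}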
 

Constructors in org.joone.structure with parameters of type Layer
NodesAndWeights.Node(Layer layer, int index, int order)
          Create a new node from a Joone layer and check whether it has a valid initial state
NodesAndWeights.Weight(Layer layer, int i, int I, int J)
          Initialise this weight from a Joone layer
 


