Using the newConx and conx libraries
Here is a summary of some of the key methods that are
part of the neural network libraries.
- n = BackpropNetwork()
Creates a backpropagation network
with no layers.
- n.addLayer('input', 2)
Adds an input layer to the network with 2 units.
- n.addLayer('output', 1)
Adds an output layer to the network with 1 unit.
- n.connect('input', 'output')
Fully connects the given layers.
- n.addLayers(inputSize, hiddenSize, outputSize)
Creates
a three-layer feedforward network with the given layer sizes. The
input layer is fully connected to the hidden layer and the hidden
layer is fully connected to the output.
- n.setEpsilon(0.5)
Sets the learning rate of the network.
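Independent of the library, the learning rate controls how far each weight moves along the error gradient per update. A minimal plain-Python sketch (the name update_weight is illustrative, not part of the conx API):

```python
# Sketch of what the learning rate (epsilon) controls in backprop:
# the step size of each gradient-descent weight update.
# `update_weight` is an illustrative name, not part of the conx API.

def update_weight(weight, gradient, epsilon):
    """Take one gradient-descent step of size epsilon."""
    return weight - epsilon * gradient

w, gradient = 1.0, 0.4                   # gradient of the error w.r.t. w
print(update_weight(w, gradient, 0.5))   # big epsilon: big step
print(update_weight(w, gradient, 0.05))  # small epsilon: small step
```

A larger epsilon learns faster but can overshoot; a smaller one is slower but steadier.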
- n.setTolerance(0.1)
Sets how close an output must be
to the target to be considered correct.
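The rule the tolerance implies can be sketched in plain Python (pattern_correct is an illustrative name, not a conx method):

```python
# A pattern counts as correct when every output unit is within
# the tolerance of its target value. Illustrative, not conx code.

def pattern_correct(outputs, targets, tolerance):
    return all(abs(o - t) < tolerance for o, t in zip(outputs, targets))

print(pattern_correct([0.92], [1.0], 0.1))   # True:  off by 0.08
print(pattern_correct([0.85], [1.0], 0.1))   # False: off by 0.15
```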
- n.loadInputsFromFile(filename)
Sets up the input patterns that will be used to train the network.
- n.loadTargetsFromFile(filename)
Sets up the target
patterns that will be used to train the network. The files used for
inputs and targets should be the same length and have a one-to-one
pairing.
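The pairing requirement can be pictured in plain Python. This sketch assumes one whitespace-separated pattern per line; the library's actual file format may differ, so check its documentation:

```python
# Pair input patterns with target patterns line by line.
# Assumes one whitespace-separated pattern per line (an assumption;
# the real conx file format may differ).

with open('xor-inputs.dat', 'w') as f:
    f.write('0 0\n0 1\n1 0\n1 1\n')
with open('xor-targets.dat', 'w') as f:
    f.write('0\n1\n1\n0\n')

def load_patterns(filename):
    with open(filename) as f:
        return [[float(x) for x in line.split()]
                for line in f if line.strip()]

inputs = load_patterns('xor-inputs.dat')
targets = load_patterns('xor-targets.dat')
assert len(inputs) == len(targets)     # files must pair one-to-one
pairs = list(zip(inputs, targets))     # line i of each file goes together
```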
- n.showPerformance()
Displays how the network responds to
each of the training patterns. Initially it will likely get every
pattern wrong, since the weights are randomly initialized and no
learning has taken place.
- n.printWeights(layer1, layer2)
Displays the network's
current weights between layer1 and layer2. For
simple two-layer networks, the layers are called 'input'
and 'output'. For three-layer networks, the middle layer is
called 'hidden'.
- n.train(epochs)
Repeatedly trains the network on the set
of training patterns. Each pass through all of the patterns is called
an epoch. The epochs parameter is optional. If provided, the
network will be trained for the given number of epochs only. If not
provided, the network will be trained until it has successfully
learned all of the training patterns (for some problems this may be
impossible to achieve). When the network is successfully learning, the
total amount of error per epoch should decrease over time.
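To see why total error per epoch should fall, here is a toy stand-in for the training loop, written in plain Python rather than conx: a single sigmoid unit learning AND with the delta rule (all names here are illustrative, not part of the conx API):

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy stand-in for n.train(): one sigmoid unit learning AND.
patterns = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights = [random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]
bias = random.uniform(-0.5, 0.5)
epsilon = 0.5                          # learning rate

errors = []
for epoch in range(500):
    total_error = 0.0
    for inputs, target in patterns:
        out = sigmoid(sum(w * i for w, i in zip(weights, inputs)) + bias)
        err = target - out
        total_error += err * err       # sum-squared error for this epoch
        # Delta rule: adjust each weight in proportion to the error.
        delta = err * out * (1.0 - out)
        weights = [w + epsilon * delta * i
                   for w, i in zip(weights, inputs)]
        bias += epsilon * delta
    errors.append(total_error)

print(errors[0], errors[-1])           # total error per epoch decreases
```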
- n.reset()
Resets the weights of the network to small random values.
- n.splitData(percentage)
Expects a percentage in the range
0-100. Randomly selects the given percentage of patterns from the
data to be the training set. The remaining patterns are considered to
be the testing set.
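The split can be sketched in plain Python (split_data is an illustrative name, not the conx implementation):

```python
import random

def split_data(patterns, percentage):
    """Illustrative version of splitData: randomly pick `percentage`
    percent of the patterns for training; the rest are for testing."""
    shuffled = patterns[:]
    random.shuffle(shuffled)
    cut = len(shuffled) * percentage // 100
    return shuffled[:cut], shuffled[cut:]

random.seed(1)
train, test = split_data(list(range(10)), 80)
print(len(train), len(test))    # 8 patterns to train on, 2 to test on
```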
- n.swapData()
Swaps the training and testing sets. During training,
periodically swap the data and then
check the network's performance on the test data. Swap back to
continue training. To avoid overfitting, stop training on the training
set when performance on the testing set is no longer improving.
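The stopping rule in the last sentence can be sketched as a plain-Python check (the name should_stop and the patience parameter are illustrative, not part of conx):

```python
# Illustrative early-stopping check: stop once the testing-set error
# has failed to improve for `patience` consecutive checks.

def should_stop(test_errors, patience=3):
    """True when the last `patience` test errors never beat the best
    error seen before them."""
    if len(test_errors) <= patience:
        return False
    best_before = min(test_errors[:-patience])
    return all(e >= best_before for e in test_errors[-patience:])

print(should_stop([0.9, 0.7, 0.5, 0.4]))               # False: improving
print(should_stop([0.9, 0.5, 0.3, 0.31, 0.33, 0.35]))  # True: plateaued
```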