Class: | Ai4r::NeuralNetwork::Backpropagation |
In: | lib/ai4r/neural_network/backpropagation.rb |
Parent: | Object |
This is an implementation of a multilayer perceptron network, using the backpropagation algorithm for learning.
Backpropagation is a supervised learning technique (described by Paul Werbos in 1974, and further developed by David E. Rumelhart, Geoffrey E. Hinton and Ronald J. Williams in 1986)
Use the class method get_parameters_info to obtain details on the algorithm parameters. Use set_parameters to set values for these parameters.
  # Create the network with 4 inputs, 1 hidden layer with 3 neurons,
  # and 2 outputs
  net = Ai4r::NeuralNetwork::Backpropagation.new([4, 3, 2])

  # Train the network
  1000.times do |i|
    net.train(example[i], result[i])
  end

  # Use it: Evaluate data with the trained network
  net.eval([12, 48, 12, 25])
    # => [0.86, 0.01]
Author: | Sergio Fierens |
License: | MPL 1.1 |
Url: | ai4r.rubyforge.org |
activation_nodes | [RW] |
structure | [RW] |
weights | [RW] |
Creates a new network, specifying its architecture. E.g.
  net = Backpropagation.new([4, 3, 2])    # 4 inputs
                                          # 1 hidden layer with 3 neurons,
                                          # 2 outputs
  net = Backpropagation.new([2, 3, 3, 4]) # 2 inputs
                                          # 2 hidden layers with 3 neurons each,
                                          # 4 outputs
  net = Backpropagation.new([2, 1])       # 2 inputs
                                          # No hidden layer
                                          # 1 output
Evaluates the input. E.g.
  net = Backpropagation.new([4, 3, 2])
  net.eval([25, 32.3, 12.8, 1.5])
    # => [0.83, 0.03]
This method trains the network using the backpropagation algorithm.
input: Network input.
output: Expected output for the given input.
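To make the training step concrete, here is an illustrative sketch of one backpropagation update for a single sigmoid neuron. This is a minimal didactic example, not the library's internal code; the variable names (weights, input, expected, rate) are assumptions made for this sketch.

```ruby
# One-neuron backpropagation sketch: forward pass, error delta,
# gradient-descent weight update. Not the Ai4r implementation.

def sigmoid(x)
  1.0 / (1.0 + Math.exp(-x))
end

weights  = [0.1, -0.2, 0.05]  # one weight per input (illustrative values)
input    = [1.0, 0.5, -1.5]
expected = 1.0
rate     = 0.25               # learning rate

outputs = []
2.times do
  # Forward pass: weighted sum, then sigmoid activation
  net_in = weights.each_with_index.sum { |w, i| w * input[i] }
  out    = sigmoid(net_in)
  outputs << out

  # Backward pass: delta = (expected - out) * sigmoid derivative
  delta = (expected - out) * out * (1.0 - out)

  # Gradient-descent update of each weight
  weights = weights.each_with_index.map { |w, i| w + rate * delta * input[i] }
end
```

After the update, the neuron's output moves closer to the expected value, which is the essence of what repeated calls to train accomplish across a whole layered network.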
This method returns the network error:
Calculates the quadratic error for an expected output value:
  Error = 0.5 * sum( (expected_value[i] - output_value[i])**2 )
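The formula above translates directly into Ruby. The helper below is a standalone sketch; the method name quadratic_error is an assumption for illustration, not part of the Ai4r API.

```ruby
# Quadratic error: 0.5 * sum( (expected[i] - output[i])**2 )
# Method name is illustrative, not the library's API.
def quadratic_error(expected, output)
  0.5 * expected.zip(output).sum { |e, o| (e - o)**2 }
end

quadratic_error([1.0, 0.0], [0.86, 0.01])  # ≈ 0.00985
```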
Momentum usage requires knowing how much each weight changed in the previous training iteration. This method initializes the @last_changes structure with 0 values.
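A sketch of how such a structure can be zero-initialized to mirror the weights layout. The nested-array shape shown here is an assumption based on the description above, not the library's exact code.

```ruby
# Hypothetical weights layout: one matrix per layer transition, where
# weights[n][i][j] is the weight from neuron i in layer n to neuron j
# in layer n+1 (shapes chosen for a [4, 3, 2] network).
weights = [
  [[0.2, -0.1, 0.4], [0.7, 0.3, -0.6], [0.1, 0.1, 0.2], [-0.3, 0.5, 0.0]],
  [[0.9, -0.4], [0.2, 0.8], [-0.5, 0.1]]
]

# Mirror the structure with zeros, as @last_changes would start out:
last_changes = weights.map { |layer| layer.map { |row| row.map { 0.0 } } }
```

Keeping the zero structure shape-identical to the weights lets the momentum term index last_changes with the same subscripts it uses for the weights.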