
BackPropagation training data set

PostPosted: Fri Mar 30, 2018 5:23 am
by LuisMezasAguilar
Hello, my name is Luis Mezas and I have the following question... I hope someone can help me out a little bit.

I'm wondering how many input and output values I need to correctly train my neural network (using AForge's framework, of course), like this:

Code:
// initialize input and output values
double[][] input = new double[7][] {
    new double[] { 1, 1, 1, 1, 1, 1, 1 },
    new double[] { 1, 1, 1, 1, 1, 1, 1 },
    new double[] { 1, 1, 1, 1, 1, 1, 1 },
    new double[] { 1, 0, 0, 0, 1, 1, 1 },
    new double[] { 0, 0, 0, 0, 1, 1, 1 },
    new double[] { 0, 0, 0, 0, 1, 1, 1 },
    new double[] { 0, 0, 0, 0, 0, 0, 0 }
};
double[][] output = new double[7][] {
    new double[] { 1 },
    new double[] { 1 },
    new double[] { 1 },
    new double[] { 0 },
    new double[] { 0 },
    new double[] { 0 },
    new double[] { 0 }
};

_Network = new ActivationNetwork(
    _Function,
    7,  // seven inputs in the network
    7,  // seven neurons in the first layer
    1); // one neuron in the second layer

// create teacher
_Teacher = new BackPropagationLearning(_Network);
_Teacher.LearningRate = 0.1;
_Teacher.Momentum = 0;

where _Network is an instance of the ActivationNetwork class and _Function is an instance of the ThresholdFunction class.

I then run the RunEpoch method from the BackPropagationLearning class:
Code:
double error = _Teacher.RunEpoch(input, output);

(In Visual Studio's debugger I can see that the weights of each neuron in the network have changed.)

I can save the network to a file with:
Code:
_Network.Save("Network");


With all this said, I think I can say that the network has "learned".

Unfortunately, when I test the network with a double[] _Inputs to get an output:
Code:
_Output = _Network.Compute(_Inputs);

where _Output and _Inputs are both double[] variables, the desired output is not retrieved in _Output.

Let's say _Inputs is a double[] with values like { 0, 0, 0, 0, 0, 0, 0 }; I would expect _Output to be a single-element double[] { 0 }. But the truth is that no matter how I change the _Inputs values, I always get { 1 } from:
Code:
_Network.Compute(_Inputs);


My first thought is that I need to improve the way the neural network is learning.

I think one way could be to increase the number of inputs and outputs. (I don't know how to do this; I get errors when I instantiate the network with
Code:
_Network = new ActivationNetwork(
    _Function,
    7,  // seven inputs in the network
    7,  // seven neurons in the first layer
    1); // one neuron in the second layer
using a different quantity of double[][] inputs and a different number of inputs in the network.)

Or maybe I can change the structure of the neural network. (I have to say I get errors like "Index was outside the bounds of the array" from RunEpoch when I try to build a network with 7 input neurons and 2 neurons on the hidden layer, instead of just the 1 neuron on the second layer shown in the code above.)

I hope someone can help me out a little; my apologies if this is way too much of a newbie question. I've been trying to learn a lot about neural networks and the AForge framework over the last 7 days. lol. Anyways, greetings from Mexico, btw.

Re: BackPropagation training data set

PostPosted: Fri Mar 30, 2018 2:40 pm
by andrew.kirillov
Hello,

Looking at your data, I would say a single neuron is enough: just one little neuron with 7 inputs. The input you have looks perfectly linearly separable. It is like solving the trivial OR or AND examples.
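To illustrate the point, here is a rough sketch in plain Python (not AForge) of that single neuron: a step-activation unit trained with the classic perceptron rule on the exact data from the post. Because the data is linearly separable, it converges after a few epochs:

```python
# Training data copied from the original post.
inputs = [
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1, 1, 1],
    [0, 0, 0, 0, 1, 1, 1],
    [0, 0, 0, 0, 1, 1, 1],
    [0, 0, 0, 0, 0, 0, 0],
]
targets = [1, 1, 1, 0, 0, 0, 0]

weights = [0.0] * 7
bias = 0.0
rate = 0.1  # same learning rate as in the post

def predict(x):
    # Step (threshold) activation: fire if the weighted sum is non-negative.
    s = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if s >= 0 else 0

# Perceptron rule: adjust weights only on misclassified samples.
for epoch in range(100):
    errors = 0
    for x, t in zip(inputs, targets):
        delta = t - predict(x)
        if delta != 0:
            errors += 1
            bias += rate * delta
            for i in range(7):
                weights[i] += rate * delta * x[i]
    if errors == 0:
        break  # converged: every sample classified correctly

print([predict(x) for x in inputs])  # → [1, 1, 1, 0, 0, 0, 0]
```

This is just the bare learning rule, of course; it is not how BackPropagationLearning works internally, but it shows that one neuron can separate this data.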

Re: BackPropagation training data set

PostPosted: Fri Mar 30, 2018 6:37 pm
by LuisMezasAguilar
Thank you for answering Andrew (I have to say I'm amazed with your framework, thank you for sharing your knowledge).

Do you have any thoughts about why the neural network always returns the same output value { 1 } no matter how I change the inputs to the Compute() method?

Have a good day, Andrew! :D

Re: BackPropagation training data set

PostPosted: Sat Mar 31, 2018 5:08 pm
by andrew.kirillov
LuisMezasAguilar wrote: Do you have any thoughts about why the neural network always returns the same output value { 1 } no matter how I change the inputs to the Compute() method?

Maybe it did not quite learn. Check the weights and see if they change during training. See if the training error goes down during the process.

Re: BackPropagation training data set

PostPosted: Sat Mar 31, 2018 5:54 pm
by LuisMezasAguilar
I'll do that, Andrew. Thank you again for your patience, I'm pretty sure what I asked is a newbie question haha.

Have a nice day!

Re: BackPropagation training data set

PostPosted: Wed Apr 04, 2018 11:52 pm
by LuisMezasAguilar
Just as a comment, I ended up using another activation function for the network. Instead of a ThresholdFunction I used a BipolarSigmoidFunction, with 30 samples (double[30][] inputs and outputs) using values of -1 and 1.

As of April 4th, the network has successfully learned, and the Compute() method now retrieves the different desired output values for the given inputs.
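For anyone finding this later, here is roughly what that change amounts to, sketched in plain Python rather than AForge (the 30-sample data set isn't shown in the thread, so a small hypothetical stand-in is used). A smooth bipolar sigmoid has a nonzero derivative, so gradient descent actually gets a usable error signal, and the per-epoch error shrinks, presumably unlike with a step function, whose derivative is zero almost everywhere:

```python
import math

# Bipolar sigmoid, same shape as AForge's BipolarSigmoidFunction:
# f(x) = 2 / (1 + e^(-alpha*x)) - 1, with outputs in (-1, 1).
ALPHA = 2.0

def bipolar_sigmoid(x):
    return 2.0 / (1.0 + math.exp(-ALPHA * x)) - 1.0

def derivative_from_output(y):
    # df/dx expressed in terms of the output y = f(x).
    return ALPHA * (1.0 - y * y) / 2.0

# Hypothetical stand-in for the 30-sample set: targets are -1/1.
samples = [
    ([1, 1, 1, 1, 1, 1, 1], 1),
    ([1, 0, 0, 0, 1, 1, 1], -1),
    ([0, 0, 0, 0, 1, 1, 1], -1),
    ([0, 0, 0, 0, 0, 0, 0], -1),
]

weights = [0.0] * 7
bias = 0.0
rate = 0.1  # same LearningRate as in the original post

def compute(x):
    return bipolar_sigmoid(bias + sum(w * xi for w, xi in zip(weights, x)))

def run_epoch():
    # One pass of plain online gradient descent (no momentum) for a
    # single neuron; returns the summed squared error, as RunEpoch does.
    global bias
    total_error = 0.0
    for x, target in samples:
        y = compute(x)
        err = target - y
        total_error += err * err / 2.0
        grad = err * derivative_from_output(y)
        bias += rate * grad
        for i in range(7):
            weights[i] += rate * grad * x[i]
    return total_error

errors = [run_epoch() for _ in range(500)]
print(errors[0] > errors[-1])  # the error per epoch shrinks as it learns
```

Watching that per-epoch error fall (instead of staying flat) is exactly the check Andrew suggested above.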

Thank you again Andrew for sharing your knowledge, god bless.