Thursday, 10 March 2011

Developer Journal 70 - On Setting Neural Network Parameters

Seventy developer journals. That's at least seventy days. Whoa.

3:29 PM

Damn. The following bug took me ages to figure out. I was testing the activation level of the first excitatory neuron in the first internal group; I'll call this value Alpha. I called the neural network update function and read off the neuron's activation level. Then I calculated the activation level with a separate piece of code; I'll call that value Bravo. I compared the two and kept getting different numbers.

What I did not realize was that the update function altered the input neuron activations and the synapse weights going into my test neuron. The result was that I was manually calculating Bravo from different values than the ones that had produced Alpha.
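In hindsight, the fix is just to snapshot whatever the update mutates before calling it, then recompute the manual value from the snapshot. A minimal sketch of the idea, with a toy stand-in for the real update function (all names and structures here are my own, not from the journal's code):

```python
# Toy stand-in for the journal's update function: it computes the test
# neuron's activation from the current inputs and weights, then mutates
# both -- which is exactly what broke the manual comparison.
def update(net, i):
    net["act"][i] = sum(a * w for a, w in zip(net["inputs"], net["w"][i]))
    net["inputs"] = [a * 0.5 for a in net["inputs"]]  # inputs change
    net["w"][i] = [w + 0.1 for w in net["w"][i]]      # weights adapt

def checked_update(net, i):
    # Snapshot everything the update mutates BEFORE calling it.
    inputs_before = list(net["inputs"])
    weights_before = list(net["w"][i])
    update(net, i)
    # "Bravo": recompute the activation from the pre-update snapshot,
    # so it is comparable to "Alpha", the value the update produced.
    bravo = sum(a * w for a, w in zip(inputs_before, weights_before))
    alpha = net["act"][i]
    return alpha, bravo

net = {"inputs": [1.0, 0.5], "w": [[0.2, 0.4]], "act": [0.0]}
alpha, bravo = checked_update(net, 0)
```

With the snapshot taken first, Alpha and Bravo agree; comparing against the post-update values is what made them diverge.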

4:13 PM

I'm pretty sure the structure and activation code is working. Now to test the learning code.

4:46 PM

Grr. The neural network learning code is not working like I expected. If the sending neuron activates and the receiving neuron activates, then the synapse ... wait. I got it. For some reason I was setting up a situation where the sending neuron and the receiving neuron did not activate, so the synapse weight should decrease by the decay rate. What was actually happening was that the sending and receiving neurons were both activating, so the synapse weight should increase by the learning rate.

5:15 PM

I fixed that and found another misunderstanding. I was applying the decay rate even when I was applying the learning rate, so my manual calculation differed from the code. I changed the code to apply either the learning rate or the decay rate, never both. I don't know which way is correct, applying one at a time or both, but I'll stick with this for now.
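The rule as it stands after that fix can be sketched in a few lines. This is my reading of the entries above, with hypothetical names and made-up default rates; the clamping to minimum and maximum weights is mentioned later in the journal's to-do list:

```python
# Hedged sketch of the synapse update described above: if both the
# sending and receiving neurons fired, apply the learning rate;
# otherwise apply the decay rate -- one or the other, never both.
def update_weight(weight, pre_fired, post_fired,
                  learning_rate=0.1, decay_rate=0.01,
                  min_weight=0.0, max_weight=1.0):
    if pre_fired and post_fired:
        weight += learning_rate   # strengthen correlated activity
    else:
        weight -= decay_rate      # otherwise let the synapse decay
    # Clamp to the synapse's minimum and maximum weights.
    return max(min_weight, min(max_weight, weight))
```

Applying both rates at once would just shift the effective learning rate by the decay rate, so either choice can be made to behave the same by retuning the constants.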

5:27 PM

I'm pretty sure the neural network structure, activation, and adaptation code is working. Now to set the parameters and get some action happening.

6:43 PM

I wish I knew more maths, physics, biology, neurology, and software engineering.

6:54 PM

I don't like the way the neural network code is working now. There are so many things I want to improve but I have to get a basic model working first.

7:29 PM

There are so many types of neural network connections.

After a bit of experimentation and consideration, I've enabled some and disabled others. I kept the connections where excitatory neurons are balanced by inhibitory neurons. The ones I turned off are those that let excitatory neurons overwhelm the neural network.

Input excitatory to internal excitatory on
Input excitatory to internal inhibitory on
Input excitatory to output excitatory off

Internal excitatory to internal excitatory on
Internal excitatory to internal inhibitory on
Internal inhibitory to internal inhibitory on
Internal inhibitory to internal excitatory on
Internal excitatory to output excitatory on
Internal inhibitory to output excitatory on

Output excitatory to output excitatory off
Output excitatory to internal excitatory on
Output excitatory to internal inhibitory on
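The table above could live in the wiring code as a simple lookup. A sketch of one way to hold it (the structure and names are my assumption, not the journal's actual code):

```python
# Enabled/disabled connection types, transcribed from the table above.
# Keys are (source group, destination group); "exc"/"inh" abbreviate
# excitatory/inhibitory.
CONNECTIONS = {
    ("input_exc",    "internal_exc"): True,
    ("input_exc",    "internal_inh"): True,
    ("input_exc",    "output_exc"):   False,

    ("internal_exc", "internal_exc"): True,
    ("internal_exc", "internal_inh"): True,
    ("internal_inh", "internal_inh"): True,
    ("internal_inh", "internal_exc"): True,
    ("internal_exc", "output_exc"):   True,
    ("internal_inh", "output_exc"):   True,

    ("output_exc",   "output_exc"):   False,
    ("output_exc",   "internal_exc"): True,
    ("output_exc",   "internal_inh"): True,
}

def allowed(src, dst):
    # Anything not listed is treated as disabled.
    return CONNECTIONS.get((src, dst), False)
```

Keeping the table as data rather than scattered conditionals makes it easy to flip connection types on and off while experimenting.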

7:39 PM

Still so many things to think about:
+ neuron activation levels
+ neuron threshold levels
+ neuron activation frequencies
+ synapse decay rates
+ synapse learning rates
+ synapse minimum and maximum weights

+ print excitatory and inhibitory components of neuron summation
+ implement turn right, move back
+ consider adding the other neural network loops

+ limit maximum linear and angular force as well as velocity
+ fix neuron output
+ implement to neuron cursor and print synapse information
+ clear forces when resetting position
+ change from pull neural network to push
+ activate neurons in random order
+ capture neuron inputs over time
+ put threshold levels, activation levels, min and max synapse weights, learning rates, decay rates in genome
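The last item on the list, moving the tuning parameters into the genome, might start as a plain record of fields. A sketch under my own assumptions; the field names and default values are hypothetical, not taken from the journal's code:

```python
from dataclasses import dataclass

# Hypothetical genome record gathering the parameters listed above:
# threshold levels, activation levels, min and max synapse weights,
# learning rates, and decay rates. Defaults are placeholders.
@dataclass
class NeuronGenome:
    threshold: float = 0.5        # activation threshold level
    activation: float = 1.0       # activation level when firing
    min_weight: float = 0.0       # synapse weight lower bound
    max_weight: float = 1.0       # synapse weight upper bound
    learning_rate: float = 0.1    # weight increase on correlated firing
    decay_rate: float = 0.01      # weight decrease otherwise
```

Once these live in one record, mutation and crossover can operate on the record's fields instead of on constants buried in the network code.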

