It's come back to bite me: changing from genome version alpha to genome version bravo.
In version alpha, you retrieve the gene you want using a mathematical formula. There is an order to the sequence of genes. The advantage of this is that retrieving the gene you want is fast. The disadvantage is that adding new genes is hard because you have to modify the mathematical formula.
In version bravo, you retrieve genes by doing a search. There is no order to the sequence of genes. Each gene has a number of tags to uniquely identify it. The advantage of this is that adding new genes is easy. The disadvantage is that retrieving the gene you want is slow.
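The trade-off can be sketched in code. This is only an illustration, assuming a minimal Gene type; none of these names come from the real project:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Minimal stand-in for a gene; the real class is more involved.
struct Gene {
    std::vector<std::string> tags;  // bravo: tags uniquely identify a gene
    float value = 0.0f;
};

// Version alpha: genes sit at positions given by a formula, so lookup is
// a constant-time index calculation. Adding a gene means changing the formula.
int alphaIndex(int group, int genesPerGroup, int offset) {
    return group * genesPerGroup + offset;
}

// Version bravo: genes are unordered, so lookup is a linear search over
// the tags. Slow to retrieve, but trivial to insert new genes.
const Gene* bravoFind(const std::vector<Gene>& genome,
                      const std::vector<std::string>& tags) {
    for (const Gene& g : genome) {
        if (g.tags == tags) return &g;
    }
    return nullptr;
}
```

The linear search is the part that gets paid again and again during brain construction.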
Starting up the simulation with a new population has become very slow. Not only that, when I ran the simulation last night, all of the robots died. Argh! I don't know what it is and it's hard to check because running the simulation takes time.
You have to make your changes small. You have to make sure your changes work. I wish I could have figured out a way to go from version alpha to bravo in smaller steps.
You have to keep an eye on how long things take. You have to time them.
Hopefully the population died because the gender constraint was active.
I added a few timers to look at how long things took to start up. The program takes 36 seconds for 8 robots. 6 seconds is for things like starting up the graphics and physics engine. Each robot takes about 3.75 seconds to start up. That doesn't sound like much, but when you have about 128 robots that's 480 seconds, or 8 minutes!
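The timers can be as simple as a wrapper over std::chrono. This is a sketch of the idea, not the project's actual timing code; startupSeconds just captures the arithmetic above:

```cpp
#include <cassert>
#include <chrono>

// Minimal timer for measuring how long things take.
class Timer {
public:
    Timer() : start_(std::chrono::steady_clock::now()) {}
    double elapsedMs() const {
        auto now = std::chrono::steady_clock::now();
        return std::chrono::duration<double, std::milli>(now - start_).count();
    }
private:
    std::chrono::steady_clock::time_point start_;
};

// The measured model: about 6 s of fixed cost (graphics, physics engine)
// plus about 3.75 s per robot.
double startupSeconds(int numRobots) {
    return 6.0 + 3.75 * numRobots;
}
```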
The majority of the time for creating a robot goes into creating the neural network, and a large part of that is due to searching the genome for the required gene again and again.
I can't fix that now; indeed, I'm not even sure how to fix it other than going back to version alpha, but that has its own problems, namely that inserting new genes is difficult.
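One possible middle ground, purely a sketch and not something the code does: build a hash index over the tags once at start-up, keeping bravo's easy insertion while making the repeated lookups cheap. Every name here is made up, and the tags are flattened into a single string key:

```cpp
#include <cassert>
#include <string>
#include <unordered_map>
#include <vector>

// Simplified gene with its tags flattened into one string key.
struct Gene {
    std::string tags;
    float value;
};

// Builds a one-off index so repeated retrievals during brain construction
// become O(1) on average instead of a linear search each time.
class GeneIndex {
public:
    explicit GeneIndex(const std::vector<Gene>& genome) {
        for (const Gene& g : genome) index_[g.tags] = &g;
    }
    const Gene* find(const std::string& tags) const {
        auto it = index_.find(tags);
        return it == index_.end() ? nullptr : it->second;
    }
private:
    std::unordered_map<std::string, const Gene*> index_;
};
```

The index holds pointers into the genome vector, so it would have to be rebuilt whenever the genome is resized.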
The good thing about this is that I know what's going on, how long things are taking, and that it's not taking too long. This frees me from having to tinker with the genome for now. That means I can tinker with the neural network code to add neurons and synapses on the fly.
The easiest part of a project is the start, when you're daydreaming. The hard, scary part is when you get to it and actually try to finish it.
Adding a neuron group looks easy. Adding a neuron to an existing group looks hard but it's not something I have to work on now. I'll have to work on it when I do things like increase the number of input neurons for vision on the fly.
Potential
On artificial intelligence and more by Binh Nguyen
Thursday, 10 November 2011
Saturday, 5 November 2011
Developer Journal 139
Arg! Another day on converting genome data from version alpha to version bravo.
Labels:
developer journal
Wednesday, 2 November 2011
Developer Journal 138
Today's task: update genome data when the genome configuration file changes.
My plan is to go through one gene at a time in the configuration file, and see whether it is in the genome data. If it is not, then I'll make the appropriate changes. For example, if there are new genes for a new input group in the configuration file, I'll add the genes to the genome and generate what I'm calling the secondary genes.
Each neuron group has a bias gene and a bias learning rate gene.
In addition each neuron group has potential connections to every other group and vice versa. For each potential connection there is a connection density gene, a learning rate gene, and a topological distortion gene.
Whoa, each robot has about a megabyte of data.
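That megabyte makes sense once you count the secondary genes: with N neuron groups there are 2N bias-related genes plus 3 genes for each ordered pair of distinct groups, so growth is quadratic in the number of groups. (Whether self-connections also get genes is an assumption here; the counting function is mine.)

```cpp
#include <cassert>

// Counts the secondary genes for numGroups neuron groups:
// 2 per group (bias, bias learning rate) plus 3 per ordered pair of
// distinct groups (connection density, learning rate, topological distortion).
long secondaryGeneCount(long numGroups) {
    long biasGenes = 2 * numGroups;
    long connectionGenes = 3 * numGroups * (numGroups - 1);
    return biasGenes + connectionGenes;
}
```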
The following error took me a long time to work out because of the serialization.
for( int j = 0; i < numGene; ++j ) // the loop condition tests i instead of j
Labels:
developer journal
Tuesday, 1 November 2011
Developer Journal 137
Spent most of my day writing the crossover and mutation code.
Labels:
developer journal
Monday, 31 October 2011
Developer Journal 136
Just spent a day fixing bugs that caused the simulation to crash when converting genome data from version alpha to bravo. Seems to be good for now. Just went over the completed and next actions list. There are still so many things left to go.
Next up
- fix how robots are not getting visual data
- when genome configuration file changes, update genome
- when genome changes, update neural network
- when connection file changes, update connection
- implement gender touch sensor
- run gender simulations
- balance gender over time
Labels:
developer journal
Saturday, 29 October 2011
Developer Journal 135
Having had a good night's sleep feels so much better. After the fire alarms stopped yesterday, I had a look at the neural network input and output over time. The output was abnormally small which explained why the robots weren't moving. The robots also didn't seem to be receiving visual input. I've got one or two hours to work today before a relaxing weekend.
Aah, I printed out the minimum efficacy, maximum efficacy, and decay rate. The maximum efficacy was the same as the minimum efficacy. Yep that was it. A small bug that freaked me out on a rainy, warm, Friday night with blaring fire alarms.
I was worried that I'd made too many big changes without testing. At the same time I couldn't really make the changes smaller in this case. More and more I'm appreciating the power of a well designed code interface.
Labels:
developer journal
Friday, 28 October 2011
Developer Journal 134
So sleepy. Woke up at 3 with stomach and back pain. Too much sitting and not enough training. Nutrition, getting better. Finally I've got a few days to focus on programming and writing, and I'm tempted to goof off. I can feel myself being less motivated and less intelligent.
I've got a few things to do today. Make sure that genome version 2 works with brain version 1. Then make sure that I'm converting genome version 1 data to genome version 2.
After a day of slow programming, genome version 2 works with brain version 1. It seems to, anyway; I'm going to have to let it run for a while to see what happens.
I'll also have to do some profiling because the constant genome look ups are slowing down the system.
Now to check the differences between genomes that are the result of the normal creation code and the conversion code. I hate working with the serialization code enabled; it takes a lot more time to compile, link, and load.
27 seconds extra to compile. 56 seconds extra to link. 6 minutes 6 seconds extra to load. That might not sound like much but it is.
When the program loads, class Genome converts the data from pre-pre-alpha to version pre-alpha. Then the data gets converted to version alpha.
Pre-pre-alpha is when the data was in class Genome and was a vector of floats. This was how things were originally coded up. Pre-alpha is when the data gets converted to a vector of class Gene. This was when I was feeling my way around.
Version alpha is when the data is moved to class GenomeAlpha. I moved it over here so that I could preserve the old code in class GenomeAlpha, use class Genome as a common interface and try new code in class GenomeBravo.
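The layering and the load-time upgrade chain can be sketched like this. The class names follow the post; the members and function names are assumptions:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Pre-alpha form: a gene object instead of a bare float.
struct Gene { float value = 0.0f; };

// Common interface, so callers don't care which version is behind it.
class Genome {
public:
    virtual ~Genome() = default;
    virtual std::size_t numGenes() const = 0;
};

// Version alpha: the old, ordered-genome code lives here.
class GenomeAlpha : public Genome {
public:
    std::size_t numGenes() const override { return genes.size(); }
    std::vector<Gene> genes;
};

// Load-time upgrade chain: pre-pre-alpha (a vector of floats) is lifted to
// pre-alpha (a vector of Gene), which is then moved into GenomeAlpha.
GenomeAlpha upgradeToAlpha(const std::vector<float>& prePreAlpha) {
    std::vector<Gene> preAlpha;
    for (float v : prePreAlpha) preAlpha.push_back(Gene{v});
    GenomeAlpha alpha;
    alpha.genes = preAlpha;
    return alpha;
}
```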
Doh. I think I broke something. When I loaded up a previous save file, the robots no longer moved. I didn't change anything major, I changed the genome code but that shouldn't affect the architecture of the neural network.
I think the conversion from class Genome to class GenomeAlpha is not quite right.
Grr. Fire alarms have gone off. I'm sure it's a drill but they're so loud and annoying!
Labels:
developer journal
Thursday, 27 October 2011
Developer Journal 133
Slight change to the gene format.
<gene> tag input tag random tag numExc min 1 max 1 </gene>
I removed "value" and "isVariable". I figured that I can let "value" be whatever and I could set a gene to not be variable by making the min and the max the same. I'm still worried that this is going to come back to bite me but the format looks and feels good.
I should look up a good XML parser, but I don't want to involve new code. I'm going to write a little code to read the settings in.
Years ago when I started working on developing AI, I did not think I'd be spending time writing up an XML-like file to initialise a genome structure.
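Reading those lines doesn't need a full XML parser; hand-rolled tokenising is enough. A sketch, assuming whitespace-separated tokens (the struct and function names are mine):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// What one <gene> line boils down to: some tags plus a min/max range.
struct GeneSpec {
    std::vector<std::string> tags;
    double min = 0.0;
    double max = 0.0;
};

// Parses a single line such as:
//   <gene> tag input tag random tag numExc min 1 max 1 </gene>
GeneSpec parseGene(const std::string& line) {
    std::istringstream in(line);
    GeneSpec spec;
    std::string token;
    while (in >> token) {
        if (token == "tag") {
            in >> token;
            spec.tags.push_back(token);
        } else if (token == "min") {
            in >> spec.min;
        } else if (token == "max") {
            in >> spec.max;
        }
        // "<gene>" and "</gene>" fall through and are ignored.
    }
    return spec;
}
```

With the value and isVariable fields gone, min equal to max is what marks a gene as fixed.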
Finished writing the code to create genome version 2 from a configuration file. Reading in the primary genes was easy. Creating the secondary genes that depend on the primary genes took a fair bit longer to do.
Still a fair bit of testing to do. I have to make sure genome version 2 works with brain version 1, then write brain version 2.
The main difference with brain version 2 is that it can grow new neurons and synapses when the genome changes. For example, I'm going to need to add a new input and output group to the genome and then I need the brain to grow the new input and output groups. Yee haw!
Hopefully I can get the version 2 gender experiments done soon and have some time to work on the robot arms and legs a bit more. The new feature of being able to add input and output groups on the fly is going to help a lot. Each joint in the robot body is going to need a brain output group. In addition, I'm thinking of pumping limb data into the brain, so I'll need brain input groups.
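The grow-on-the-fly idea, as a minimal sketch; every name here is assumed rather than taken from the real interface:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct Neuron {
    double activation = 0.0;
};

// A brain that can gain whole neuron groups after construction, instead
// of being rebuilt from scratch when the genome changes.
class Brain {
public:
    // Adds a new input or output group and returns its id.
    std::size_t growGroup(std::size_t numNeurons) {
        groups_.push_back(std::vector<Neuron>(numNeurons));
        return groups_.size() - 1;
    }
    std::size_t numGroups() const { return groups_.size(); }
private:
    std::vector<std::vector<Neuron>> groups_;
};
```

Adding a single neuron inside an existing group is the harder case, because existing synapses and indices have to be preserved.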
Labels:
developer journal
Wednesday, 26 October 2011
Developer Journal 132
Oh XML, XML. Another technology I ignored during my undergraduate studies that's coming back to haunt me.
The settings file for the genome class has lots of different types of data, and so much of it. I've come up with a format that I hope will be extensible and won't come back to bite me later.
<gene> tag input tag random tag numExc min 0 max 1 value 1 isVariable 0 </gene>
My long term goal is to move as much as I can to the genome so that evolutionary processes can determine the values.
Labels:
developer journal
Tuesday, 25 October 2011
Developer Journal 131
I won't be able to do too much development today, I've got a fair amount of marking and tutoring to do.
I'm still working on the code to transition from genome version alpha to bravo. I've been thinking about whether to put the conversion code in class GenomeAlpha, class GenomeBravo, or somewhere else. I've placed the conversion code in class GenomeBravo for now. The conversion function receives a pointer to class GenomeAlpha and converts the data into class GenomeBravo.
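At the signature level, that placement looks something like this; the class names are from the post, the members are assumptions:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Stand-in for the old format.
struct GenomeAlpha {
    std::vector<float> genes;
};

// The conversion lives in the new class: it receives a pointer to the
// old genome and populates itself from it.
class GenomeBravo {
public:
    void convertFrom(const GenomeAlpha* alpha) {
        genes_ = alpha->genes;  // copy first; tagging/reordering would follow
    }
    std::size_t numGenes() const { return genes_.size(); }
private:
    std::vector<float> genes_;
};
```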
I thought I'd be able to get the new genome, neural network, embodiment code done, and run a number of simulations by Wednesday, starting from last Wednesday. The idea was to give the robots a new sensor that would allow them to detect whether nearby robots were male or female. I wanted to avoid hard coding, and to preserve older code so I could step back if I needed to, so the coding is taking longer than I thought it would.
I'm finding more and more that topics I've avoided in the past are coming back to bite me: maths, physics, databases, and networking. Working with simulated robots requires maths and physics. Working with population data over time requires databases. Scaling the simulation requires networking to move to a client-server architecture.
Labels:
developer journal