In this lab you will experiment with an evolutionary computation
method called NEAT (NeuroEvolution of Augmenting Topologies),
developed by Kenneth O. Stanley. NEAT operates on both the weights
and the structure of the neural network, allowing connections and
nodes to be added as potential mutations. We will be using a Python
package that implements NEAT.
You may work with a partner on this lab.
Getting started
Evolving networks that solve XOR
In Lab 1 we used the backpropagation algorithm to learn the weights of
a fixed-architecture neural network to solve XOR. In this lab
we will use NEAT to evolve both the weights and the architecture of a
neural network to solve XOR.
- Go to your cs81/labs/3/xor/ directory and open the file
xor.py in an editor. Notice at the top of this file that it
imports a number of classes from the neat package,
including population, chromosome, genome,
and nn (for neural network). In Lab 1 we used
conx, which was part of pyrobot, to implement neural
networks. For this lab we will be using neat's implementation
instead.
- Next, xor.py loads a configuration file. Open
the file xor_config in the editor as well. This is where all
of the parameters for the evolutionary run are set. If you are
curious about particular parameters, go to the source code to see how
they are used; an illustrative excerpt appears after this list.
- The user must define an appropriate fitness function for the
problem being solved. In this case, fitness is defined in the
eval_fitness function (sketched after this list). For each
chromosome in the population, the chromosome is first translated into
a feed-forward neural network. Then that network is tested on the
four standard XOR patterns, and the total sum-squared error is
calculated. Finally, the error value is used to create a fitness
value that increases as error decreases.
- Go to a terminal window and try solving the XOR problem:
% python xor.py
This will produce a number of messages detailing the progress of the
evolutionary process. For each generation you will see the average
fitness, best fitness, number of species, and information about each
species. At the end you will see a summary of the architecture of the
best-performing individual found and a demonstration of how it
performs on the four patterns.
- Executing xor.py will also generate three files with
a .svg extension. These can be viewed using
inkscape:
- A graph of the average and best fitness over time
(avg_fitness.svg)
- A picture of the best individual network's architecture
(phenotype.svg)
- A depiction of how the speciation changed over time
(speciation.svg)
- Try executing the XOR evolution several times. Is the same neural
network architecture found each time? How variable are the results?
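
For reference, the evolutionary parameters live in an INI-style file.
The section and parameter names below are illustrative assumptions
rather than a copy of xor_config; open the real file to see the
actual keys and values:

[phenotype]
# Starting topology: two inputs, one output, no hidden nodes.
input_nodes  = 2
output_nodes = 1

[genetic]
# Population size and the structural mutation rates that let NEAT
# grow new connections and nodes over the course of evolution.
pop_size     = 150
prob_addconn = 0.05
prob_addnode = 0.03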
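
Below is a minimal sketch of the eval_fitness logic described in the
list above, assuming the API used by early versions of neat-python
(nn.create_ffphenotype and sactivate); if your copy of the package
names things differently, the overall structure still carries over:

import math
from neat import nn

# The four standard XOR patterns and their target outputs.
INPUTS  = [(0, 0), (0, 1), (1, 0), (1, 1)]
TARGETS = [0, 1, 1, 0]

def eval_fitness(population):
    for chromo in population:
        # Translate the chromosome (genotype) into a feed-forward
        # neural network (phenotype).
        net = nn.create_ffphenotype(chromo)
        # Accumulate sum-squared error over the four patterns.
        error = 0.0
        for inputs, target in zip(INPUTS, TARGETS):
            output = net.sactivate(inputs)
            error += (output[0] - target) ** 2
        # Map error to fitness: 1.0 is perfect, and fitness rises
        # as error falls.
        chromo.fitness = 1.0 - math.sqrt(error / len(TARGETS))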
Evolving networks that control robots
The NEAT paper we discussed this week involved a very time-consuming
fitness test intended to create a co-evolutionary arms race. We will
experiment with a simpler scenario in this lab. Rather than a robot
duel, we will focus on a single robot foraging for food. We will
place the robot in a large room with 10 randomly located lights,
representing food, and allow it to explore for 200 steps. We will test
the robot with three different random placements of the food. The
initial network will begin with two input units representing the light
sensor values and two output units representing the motor speeds of
the left and right wheels of a simulated Pioneer robot. The fitness
value will be based on the number of lights consumed by the robot.
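
To make the setup concrete, here is one way such a fitness test could
be structured. This is only a sketch: place_lights, read_light_sensors,
step_simulation, and lights_consumed are hypothetical stand-ins for
whatever the simulator actually provides, and the neat calls assume
the same API as in the XOR sketch above:

from neat import nn

NUM_TRIALS = 3    # three random placements of the food
NUM_STEPS  = 200  # exploration steps per trial
NUM_LIGHTS = 10   # lights (food) per placement

def eval_fitness(population):
    for chromo in population:
        net = nn.create_ffphenotype(chromo)
        total_consumed = 0
        for trial in range(NUM_TRIALS):
            place_lights(NUM_LIGHTS)            # hypothetical helper
            for step in range(NUM_STEPS):
                # Two inputs: the left and right light sensor values.
                sensors = read_light_sensors()  # hypothetical helper
                # Two outputs: the left and right motor speeds.
                left, right = net.sactivate(sensors)
                step_simulation(left, right)    # hypothetical helper
            total_consumed += lights_consumed() # hypothetical helper
        # Fitness is the total number of lights eaten across all trials.
        chromo.fitness = total_consumed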
Evolving solutions to your own robot task
We will be using NEAT for our midterm project. Start brainstorming
about what task you'd like to work on. Store your ideas in the directory
cs81/labs/3/midterm/. If you will need a
new simulator world, create it there. Create a file called
midterm-proposal where you describe your proposed task and
provide a detailed explanation of the fitness function you would use.
Finishing up
- Be sure that all the relevant files are in your cs81/labs/3/
directory.
- Only one member of each team needs to run handin81 to
turn in the completed lab work.