In this lab you will create simple CPPNs, which were introduced by Kenneth O. Stanley in the paper Compositional pattern producing networks: A novel abstraction of development.
A CPPN is similar to an artificial neural network, except that at each node a different arbitrary function may be used in place of the typical sigmoid function. For example, functions such as Gaussian, sine, linear, and absolute value could be used, as shown below.
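One way to picture this is as a small table of candidate node functions. The sketch below assumes a Python representation; the particular functions listed are only the examples named above, not a required set.

import math

# Candidate node functions for a CPPN node (a sketch, not the lab's
# required set). Each one maps a weighted-sum input to an output value.
functions = {
    "gaussian": lambda x: math.exp(-x * x),
    "sine":     lambda x: math.sin(x),
    "linear":   lambda x: x,
    "absolute": lambda x: abs(x),
}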
Stanley argues that CPPNs can serve as a developmental encoding that maps a genotype to a phenotype in an efficient way, without depending on local interaction. In his paper the genotype is a CPPN and the phenotype is an image generated from the CPPN. The inputs to the CPPN are:
- x, the horizontal coordinate of a pixel, measured from the center of the image
- y, the vertical coordinate of a pixel, measured from the center of the image
- d, the distance of the pixel from the center of the image
- a bias
The simplest CPPN possible is one that has no hidden units (as shown below), where w0-w3 are weights in the range [-1,1], and the output of the network is a function of the weighted sum of the inputs.
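Concretely, assuming the weights w0 through w3 are paired with x, y, d, and the bias in that order (the pairing here is an assumption, not something fixed by the description above), the output for a single pixel would be

    output = f(w0*x + w1*y + w2*d + w3*bias)

where f is the function chosen for the output node.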
In this lab we will generate CPPNs with the structure above using random weights and a randomly chosen function, to explore the range and complexity of images that result. Even with this very simple structure, some surprisingly complex images can be produced. Below are some sample images.
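A minimal sketch of how the random pieces of such a network might be chosen in Python is shown below; the weight range [-1, 1] comes from the description above, while the list of function names is just the earlier examples.

import random

weights = [random.uniform(-1, 1) for _ in range(4)]   # w0 through w3
function_name = random.choice(["gaussian", "sine", "linear", "absolute"])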
Run update81 to get the starting point files for this lab.
Execute the drawPic.py file. You will use its display function as a helper in your implementation.
for row in range(dimension):
    y = row - dimension/2
    for col in range(dimension):
        x = col - dimension/2
        d = distance( (x, y), (0, 0) )
        netInput = weighted sum of the inputs x, y, d, and bias
        output = function(netInput)
        if output > 1, set it to 1
        if output < 0, set it to 0
        matrix[row][col] = output
display the matrix with a title that describes all of the relevant parameters
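If it helps to see the whole pipeline in one place, here is a minimal runnable sketch of the pseudocode above. It is not the lab's required solution: the set of node functions, the default dimension and bias, and the pairing of weights with inputs are all assumptions, and it uses matplotlib's imshow in place of the display helper provided in drawPic.py (whose exact signature is not given here).

import math
import random
import matplotlib.pyplot as plt

# Candidate node functions (an illustrative set, not the required one).
FUNCTIONS = {
    "gaussian": lambda x: math.exp(-x * x),
    "sine":     lambda x: math.sin(x),
    "linear":   lambda x: x,
    "absolute": lambda x: abs(x),
}

def distance(p1, p2):
    """Euclidean distance between two points."""
    return math.sqrt((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2)

def make_image(dimension=100, bias=1.0):
    """Fill a dimension x dimension matrix using a random simple CPPN."""
    weights = [random.uniform(-1, 1) for _ in range(4)]   # w0 through w3
    name = random.choice(list(FUNCTIONS))
    function = FUNCTIONS[name]
    matrix = [[0.0] * dimension for _ in range(dimension)]
    for row in range(dimension):
        y = row - dimension / 2
        for col in range(dimension):
            x = col - dimension / 2
            d = distance((x, y), (0, 0))
            netInput = (weights[0] * x + weights[1] * y +
                        weights[2] * d + weights[3] * bias)
            output = function(netInput)
            # Clamp the output to the range [0, 1]
            output = max(0.0, min(1.0, output))
            matrix[row][col] = output
    title = "%s, weights %s" % (name, [round(w, 2) for w in weights])
    return matrix, title

def main():
    matrix, title = make_image()
    plt.imshow(matrix, cmap="gray")
    plt.title(title)
    plt.show()

if __name__ == "__main__":
    main()

Rerunning even this bare-bones version produces a different image each time, which gives a feel for the variety described above.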
Run handin81 before the due date to turn in your lab.