Procedural landscape generators are often used in video games, for example to create new unexplored planets. What would a procedural creature generator look like? Recent experiments demonstrate that a neural network can learn how to control robot arms, if it is given enough opportunity to explore. Your task is to create a 3D editor that allows users to define a new kind of alien creature - legs, claws, sensors, whatever - which then learns how to behave through experimentation with the laws of physics applying to its own body, in a simple environment (furniture, food, etc.).
We were tasked with producing a 3D editor that lets you build an alien creature, which then learns how to interact with its environment. To keep our project scope reasonable, we decided that the behaviour we wanted our aliens to learn was walking. Our plan was for a single application with three main parts: an editor to build aliens; a way of training aliens to walk; and a simulator to view the results.

The editor allowed aliens to be created from simple 3D shapes, with control over many of the joint properties. To train an alien, we used an evolutionary neural network (NEAT: NeuroEvolution of Augmenting Topologies, which modifies both the weights and the structure of the network using a genetic algorithm). First, a generation of aliens exhibits some random behaviour, and we give each alien a score based on how far along the x-axis it walks. The fittest aliens survive and produce offspring, which drives our evolution forwards towards an optimal behaviour. To make training as fast as possible, we simulated 8 aliens concurrently. Aliens were simulated for 20 real-time seconds per generation (sped up), and for a fairly sensible alien we could usually get some sort of walking in about 50 generations. The alien simulator would display the best generation so far, and simulate it for 20 real seconds.
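The generational loop described above can be sketched as follows. This is a minimal, hypothetical illustration: full NEAT also mutates network topology and performs crossover, whereas this sketch only mutates a flat weight vector, and `simulate_walk` stands in for the physics simulation (the real fitness would be x-axis distance after 20 seconds of simulated walking). All names and parameter values here are assumptions, not taken from our actual code.

```python
import random

POP_SIZE = 8        # aliens simulated concurrently per generation
GENOME_LEN = 16     # hypothetical flat weight vector for the controller
MUT_RATE = 0.1      # per-weight mutation probability
GENERATIONS = 50    # roughly where we started seeing walking

def simulate_walk(genome):
    """Stand-in for the physics simulation: should return the distance
    travelled along the x-axis after 20 simulated seconds. Here a toy
    fitness (sum of weights) is used purely to make the loop runnable."""
    return sum(genome)

def mutate(genome):
    """Offspring are copies of a survivor with small Gaussian tweaks."""
    return [w + random.gauss(0, 0.5) if random.random() < MUT_RATE else w
            for w in genome]

def evolve():
    population = [[random.uniform(-1, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    best = None
    for _ in range(GENERATIONS):
        # Score every alien and sort fittest-first.
        scored = sorted(population, key=simulate_walk, reverse=True)
        best = scored[0]
        # The fittest half survives; the rest are mutated offspring.
        survivors = scored[:POP_SIZE // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(POP_SIZE - len(survivors))]
    return best

best_alien = evolve()
```

In the real application the fitness evaluation is by far the expensive step, which is why running the 8 simulations concurrently (and faster than real time) mattered so much for training speed.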