Invent something - Squared!
Since the last blog post, we've installed the NVIDIA Tesla GPGPU (General Purpose Graphics Processing Unit) card in a desktop computer, got everything working, and did the first thing you do with new computational gear: run a few benchmarks. We were simply blown away. We've never seen so much computing power in such a small package. This will let us do more number crunching in much, much less time.
Now, what to do with a Tesla? Simply put: Invent something - Squared!
Let the dots fly!
First off, there's a Particle Swarm Optimizer (PSO) to be written. Take the (overly) simple function "5x + 2y" and suppose you are trying to find its minimum value over some bounded region. Each particle is a candidate input to the function; in this case, an [x, y] coordinate pair. A PSO creates many such particles, evaluates them against the function, and moves the whole swarm roughly (i.e. with built-in error) in the direction of the best solutions in the current swarm - here, the lowest values. In this way, the whole swarm roams over a wide range of possible coordinates, all the while imperfectly attracted toward good solutions already seen by each particle and by the larger swarm. The key is the imperfection in the attraction: it keeps the system from getting stuck at a local optimum. When visualized, the result looks like a swarm of insects, each moving independently yet still moving as one. The massive parallelization possible with the Tesla, thanks to its many processing cores and very large memory, makes swarms of swarms possible.
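As a rough sketch of the idea - plain Python, not our actual Tesla code - here's a minimal PSO, with the search clamped to a [-10, 10] box so that "5x + 2y" actually has a minimum (at the corner [-10, -10], value -70):

```python
import random

def pso(f, bounds, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimize f over a box; returns (best position, best value)."""
    dim = len(bounds)
    # scatter the swarm randomly across the search box
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's personal best
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best

    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia, plus randomized pulls toward the personal and global bests
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
                lo, hi = bounds[d]
                pos[i][d] = max(lo, min(hi, pos[i][d]))  # keep inside the box
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, val = pso(lambda p: 5 * p[0] + 2 * p[1], bounds=[(-10, 10), (-10, 10)])
```

The r1/r2 randomness in each velocity update is the "built-in error" described above: every particle is imperfectly attracted to the best solutions seen so far, which is exactly what keeps the swarm exploring instead of collapsing onto one point.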
Darwin went programming
Second, there's a Linear Genetic Programming engine to be written. Unlike PSO, Genetic Programming (GP) is not a numerical optimization technique, but it uses a similar randomly guided search. GP creates algorithms - short programs of basic building blocks (add, multiply, sine, cosine, if/then, etc.) - and lets them compete against each other. Elements of the best programs are recombined to make new programs, and the competition repeats, over and over. Surprisingly (or maybe not, depending on your faith in random processes), strong algorithms emerge. A GPU is especially well suited to GP execution given the large number of trials and the volume of data involved. GP's building blocks aren't limited to programming primitives, either: GP has been used to produce patented circuits (VLSI design), and NASA even used GP to quickly design a new antenna for a spacecraft.
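In linear GP, a "program" is just a list of register-based instructions that gets executed in order. A toy sketch of the evolve-and-compete loop - the register count, program length, operators, and the target function "x² + x" are all made up for illustration:

```python
import random

OPS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
}
N_REG = 4       # r0 holds the input x and also the final output
PROG_LEN = 8

def random_instr():
    # an instruction is (opcode, destination register, two source registers)
    return (random.choice(list(OPS)), random.randrange(N_REG),
            random.randrange(N_REG), random.randrange(N_REG))

def run(prog, x):
    regs = [x, 1.0, 0.0, 0.0]
    for op, dst, s1, s2 in prog:
        regs[dst] = OPS[op](regs[s1], regs[s2])
    return regs[0]

def fitness(prog, cases):
    # lower is better: squared error over the training cases
    return sum((run(prog, x) - y) ** 2 for x, y in cases)

def evolve(cases, pop_size=200, gens=60):
    pop = [[random_instr() for _ in range(PROG_LEN)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda p: fitness(p, cases))
        survivors = scored[:pop_size // 4]          # the best quarter survives
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, PROG_LEN)     # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:               # occasional point mutation
                child[random.randrange(PROG_LEN)] = random_instr()
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: fitness(p, cases))

cases = [(x, x * x + x) for x in range(-5, 6)]      # target: x^2 + x
best = evolve(cases)
```

For example, the two-instruction program `[("mul", 2, 0, 0), ("add", 0, 0, 2)]` computes r2 = x·x and then r0 = x + x², a perfect solution the evolutionary loop can stumble onto and preserve. Every program in the population is evaluated independently on every case, which is why this maps so naturally onto thousands of GPU threads.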
Once those two are done, we start inventing. We're going to combine GP with PSO in two ways. First, there are a number of tunable parameters in a GP study - program size, mutation rates, etc. - and we'll use a PSO to guide a series of GP studies. Second, an individual program generated by GP may contain a number of constants that can be tuned with a PSO, greatly reducing the search space that GP must consider. This combination is particularly powerful in that it automates and guides a search over a vast search space, and it should really help generate GP solutions faster.
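The second combination - a PSO tuning the constants inside a GP-evolved program - might look something like this. Everything here is hypothetical: the program shape c0·sin(c1·x) + c2 and the sample data are stand-ins for whatever GP actually produces; GP picks the structure, and the swarm only has to fit the numbers:

```python
import math
import random

# A hypothetical GP-evolved program with three free constants left to tune.
def program(consts, x):
    c0, c1, c2 = consts
    return c0 * math.sin(c1 * x) + c2

def mse(consts, cases):
    return sum((program(consts, x) - y) ** 2 for x, y in cases) / len(cases)

def tune_constants(cases, n_particles=40, n_iters=200):
    """A small PSO over the 3 constants; returns (best constants, error)."""
    dim = 3
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [mse(p, cases) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia plus noisy attraction to personal and global bests
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = mse(pos[i], cases)
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# sample data generated from "true" constants (2, 1.5, 0.5)
cases = [(x / 10, program([2.0, 1.5, 0.5], x / 10)) for x in range(-30, 31)]
consts, err = tune_constants(cases)
```

The point of the division of labor: GP's discrete search no longer has to guess numeric constants by recombination, so the space it must explore shrinks dramatically.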
Needles in Pixel-Haystacks
Next, and this is the part that keeps us awake at night, once we've invented the PSO-GP-PSO sandwich, we're going to turn it loose on generating image processing algorithms. We're looking for Apollo's needles in haystacks upon haystacks of pixels: a few shiny glimpses of man-made metal and mylar on an otherwise rocky and dull lunar surface, transmitted by Asimov's HD camera - a task perfect for automated algorithm design.
So, what do we do with a Tesla? We're off to invent a better algorithm inventing machine!