
September 15, 2011 | By:  Nature Education

Genetic Algorithms: Harnessing Natural Selection

A guest post by Eric Sawyer. Also check out Eric's own Scitable blog, Bio 2.0: "Dissecting the next revolution in biology."

We live in a world utterly inundated with living things. You live in a community of your own species, play host to a massive number of individual bacteria inside and out, sustain yourself by feeding on other species' tissues, and breathe oxygen generated by photosynthesis. It's amazing that all of these things are true, and the degree of perfection of the organisms and systems responsible is even more amazing. Birds "figured out" the wing millions of years before human engineers, ants form massively complicated societies of specialized individuals, and living cells are the most extensive and efficient chemical factories we know. We see this level of sophistication today because, over the course of evolution, organisms' traits have slowly been optimized by natural selection. Natural selection, colloquially "survival of the fittest," is a truly powerful optimizer.

Mathematicians and computer scientists often try to optimize some value subject to a set of conditions. For instance, it is worthwhile to optimize the amount of fuel loaded onto an airplane, with the condition that you must carry enough to reach your destination and an alternate airport. Load too much and extra fuel has to be burned just to keep the fuel-heavy craft aloft; load too little and you're not around to tell the tale. This fuel calculation is trivially simple, and it has a single optimum: running out of fuel just as you reach the alternate airport. (Real airlines, for obvious reasons, mandate fueling well beyond this optimum.) In many real-world applications, however, there are many variables to optimize simultaneously, and the solution space contains what are known as local optima. A local optimum is a solution that is better than all of the solutions immediately surrounding it. There is also a global optimum, the best solution out of all possible solutions (though a local optimum may tie the global one). The distinction between these two terms is illustrated in two graphs. In the first graph there is a single optimum (peak). In the second there are five peaks: the central peak is a local optimum and the other four are tied as global optima. An object trapped on this surface (red dot) has to escape its local optimum to find a global optimum, which requires that it traverse a sub-optimal valley.
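The trap described above is easy to demonstrate in code. In this sketch (the fitness function, names, and parameters are illustrative, not from the post), a purely greedy hill climber on a two-peak landscape settles on whichever peak it happens to start nearest, global or not:

```python
import math

def fitness(x):
    # Toy fitness landscape: a local peak (height ~1) near x = 1
    # and the taller global peak (height ~2) near x = 4.
    return math.exp(-(x - 1) ** 2) + 2 * math.exp(-(x - 4) ** 2)

def hill_climb(x, step=0.1, iters=1000):
    # Greedy local search: move to a neighbor only if it is strictly better.
    for _ in range(iters):
        best = max((x - step, x, x + step), key=fitness)
        if fitness(best) <= fitness(x):
            break  # no neighbor improves: we are on a peak
        x = best
    return x

# Starting near the local peak, greedy search gets trapped on it;
# starting inside the global basin, it finds the global peak.
print(round(hill_climb(0.0), 1))   # 1.0 (stuck at the local optimum)
print(round(hill_climb(3.0), 1))   # 4.0 (the global optimum)
```

The climber starting at 0.0 would need to cross the valley between the peaks, and since every step into the valley lowers fitness, a greedy rule can never take it.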

Natural selection operates on an unfathomably large number of variables simultaneously, and it obviously has coped quite well. Evolution also has an inherent random component, called genetic drift, whereby genetic changes in a population can occur without incremental steps of improvement. This allows living organisms to escape the pull of a local optimum and stumble onto a better optimum (local or global). We can mimic natural selection and drift in computer programs designed to solve optimization problems. Programmers can make use of a suite of approaches, all based on the characteristics of evolving living systems.
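The effect of a drift-like random component can be sketched as follows (the landscape, parameter names, and acceptance rule are illustrative assumptions, not the post's own method): a hill climber that always keeps improvements but occasionally keeps a non-improving random jump can cross a sub-optimal valley that a greedy search cannot.

```python
import math
import random

def fitness(x):
    # Two-peak toy landscape: local peak near x = 1, taller global peak near x = 4.
    return math.exp(-(x - 1) ** 2) + 2 * math.exp(-(x - 4) ** 2)

def drift_climb(x, steps=2000, sigma=1.5, accept_worse=0.05, seed=0):
    # Hill climbing plus a drift-like random component: improvements are
    # always kept, and occasionally a *non-improving* jump is kept too,
    # so the search can wander across fitness valleys.
    rng = random.Random(seed)
    best = x
    for _ in range(steps):
        candidate = x + rng.gauss(0, sigma)
        if fitness(candidate) > fitness(x) or rng.random() < accept_worse:
            x = candidate
        if fitness(x) > fitness(best):
            best = x  # remember the best point visited so far
    return best

# Starting beside the local peak, the random jumps let the search
# reach the global basin near x = 4.
print(round(drift_climb(0.0), 1))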

These programs rely on methods called genetic algorithms, because they mimic the genetic processes of organisms. Variables, which can carry a range of values, are treated like genes. Just as alleles compete for a particular position on a chromosome, different values of a variable compete in the simulation. The programmer chooses how many chromosomes to make and which genes to assign to each chromosome, and a candidate solution is represented by an individual with a particular set of alleles arranged on its chromosomes. Every generation, a subset of the population reproduces by mating with other individuals and crossing over chromosomes. The subset is chosen based on how optimal each individual's combination of genes is (analogous to natural selection) and on the strength of the selection pressure. At one extreme, only the single most optimal individual survives; at the other, every individual survives. Of course, the latter case wouldn't allow the population to evolve at all, and you would never reach an optimum. The former seems at first like a good way to speed up the search, but the drawback is that the population loses an enormous amount of genetic diversity. A population with low diversity is more likely to get trapped at a local optimum when the goal is to find the global optimum.
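The generational loop above can be written out as a minimal genetic algorithm. This sketch uses the classic "OneMax" toy problem, where a chromosome is a string of 0/1 alleles and fitness is simply the number of 1s; all names, population sizes, and rates are illustrative choices, not anything prescribed by the post.

```python
import random

rng = random.Random(42)
GENES, POP = 20, 30  # genes per chromosome, individuals per generation

def fitness(chrom):
    # OneMax: fitness is the count of 1-alleles; the optimum is all ones.
    return sum(chrom)

def crossover(a, b):
    # Single-point crossover, like two chromosomes exchanging segments.
    point = rng.randrange(1, GENES)
    return a[:point] + b[point:]

def mutate(chrom, rate=0.02):
    # Each gene has a small chance of flipping to the other allele.
    return [1 - g if rng.random() < rate else g for g in chrom]

def evolve(generations=60):
    pop = [[rng.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
    for _ in range(generations):
        # Moderate selection pressure: the fitter half survives and breeds.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP // 2]
        children = [mutate(crossover(rng.choice(parents), rng.choice(parents)))
                    for _ in range(POP - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

Keeping only the top half each generation sits between the two extremes in the text: strong enough selection to make progress, with enough surviving diversity (plus mutation) to avoid stalling.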

Genetic algorithms are a clever and effective way to solve certain difficult optimization problems, and you can find them put to practical use from time to time. My personal favorite is applying this computer analog of genetics to modeling actual living organisms, which Richard Dawkins introduces in his book The Blind Watchmaker. By assigning a series of parameters to a single-chromosome, asexual reproducer with a defined mutation rate, he was able to generate "biomorphs" that looked like trees, insects, ordinary objects, and other things, all with a surprising degree of resemblance to their real-world counterparts.

By mimicking the power of natural selection, the jostler of species through an unimaginably complex n-dimensional space of peaks and valleys of fitness, we can solve important problems that affect our own lives.

Image Credits: Graphs were generated using Wolfram Mathematica 7.0

References:

Dawkins, R. The Blind Watchmaker. New York: Norton, 1996.

Winston, P. H. Artificial Intelligence. 3rd ed. Reading, MA: Addison-Wesley Publishing Company, 1992.
