Genetic Algorithms: Evolution as Computation
Nature's most powerful optimisation process is evolution. Over billions of years, natural selection has produced organisms exquisitely adapted to their environments - from the aerodynamic shape of a peregrine falcon to the light-harvesting efficiency of photosynthetic pigments. Genetic algorithms (GAs) borrow this logic and apply it to computational problems, turning Darwinian evolution into a general-purpose search and optimisation engine.
The Biological Inspiration
In biological evolution, a population of organisms reproduces with variation. Offspring inherit traits from their parents, sometimes with mutations. Those individuals best suited to the environment survive and reproduce more successfully - a process Darwin called natural selection. Over many generations, the population adapts. No central planner is needed; adaptation emerges from the interplay of variation, inheritance, and selection.
This insight - that complex, well-adapted solutions can arise from simple evolutionary mechanisms - is the foundation of evolutionary computation. John Holland formalised genetic algorithms in the 1970s, showing that populations of candidate solutions can evolve toward optimal or near-optimal answers through analogues of biological processes [1].
How Genetic Algorithms Work
A genetic algorithm maintains a population of candidate solutions, each encoded as a string (often binary, but any representation works). The algorithm repeatedly applies three operators inspired by biology:
Selection - individuals are chosen for reproduction based on their fitness. Fitter individuals are more likely to be selected, mimicking survival of the fittest. Common methods include tournament selection and roulette-wheel (fitness-proportionate) selection.
Crossover - selected parents exchange segments of their genetic material to produce offspring. This recombination allows useful building blocks from different parents to be combined, just as sexual reproduction shuffles alleles in biology [1].
Mutation - small random changes are introduced into offspring, maintaining genetic diversity and preventing the population from converging prematurely on a suboptimal solution. In a binary GA, this might mean flipping a single bit.
Each cycle of selection, crossover, and mutation constitutes one generation. Over successive generations, average fitness tends to increase - the population evolves.
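The loop described above can be sketched in a few dozen lines. The following is a minimal illustration, not a production implementation: it maximises the classic OneMax objective (count of 1-bits), uses tournament selection, one-point crossover, and per-bit mutation, and all parameter values (population size, rates, tournament size) are illustrative choices rather than recommendations.

```python
import random

random.seed(0)

GENOME_LEN = 32
POP_SIZE = 50
P_CROSSOVER = 0.9
P_MUTATION = 1 / GENOME_LEN   # expected one bit flip per genome
GENERATIONS = 60

def fitness(genome):
    # OneMax: count the 1-bits; the optimum is the all-ones string.
    return sum(genome)

def tournament(pop, k=3):
    # Tournament selection: sample k individuals, keep the fittest.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # One-point crossover: swap the tails after a random cut point.
    if random.random() < P_CROSSOVER:
        cut = random.randrange(1, GENOME_LEN)
        return a[:cut] + b[cut:], b[:cut] + a[cut:]
    return a[:], b[:]

def mutate(genome):
    # Flip each bit independently with probability P_MUTATION.
    return [bit ^ 1 if random.random() < P_MUTATION else bit
            for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):
    offspring = []
    while len(offspring) < POP_SIZE:
        child1, child2 = crossover(tournament(population),
                                   tournament(population))
        offspring += [mutate(child1), mutate(child2)]
    population = offspring[:POP_SIZE]   # generational replacement

best = max(population, key=fitness)
print(fitness(best), "/", GENOME_LEN)
```

Note that this sketch uses full generational replacement with no elitism; many practical GAs copy the best individual into the next generation unchanged so that the best-so-far fitness never decreases.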
The Broader Family: Evolutionary Algorithms
Genetic algorithms are part of a larger family called evolutionary algorithms (EAs), which includes evolution strategies, evolutionary programming, and genetic programming. Evolution strategies, developed by Rechenberg and Schwefel in the 1960s, focus on continuous parameter optimisation and use self-adaptive mutation step sizes [2]. Genetic programming, pioneered by Koza, evolves entire computer programs represented as tree structures [3].
All evolutionary algorithms share the same Darwinian loop: a population of varied individuals undergoes selection based on fitness, produces offspring through recombination and mutation, and iterates over generations. The differences lie in representation, operator design, and selection pressure.
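To make the contrast with the binary GA concrete, here is a minimal sketch of a (1+1) evolution strategy on a continuous objective, using Rechenberg's 1/5 success rule to adapt the mutation step size. The objective (the sphere function), the adaptation constants, and the iteration budget are all illustrative assumptions, not values from the original literature.

```python
import random

random.seed(1)

def sphere(x):
    # A standard continuous test objective: minimise the sum of squares.
    return sum(v * v for v in x)

dim = 5
parent = [random.uniform(-5, 5) for _ in range(dim)]
sigma = 1.0        # mutation step size, adapted online
successes = 0

for t in range(1, 2001):
    # Gaussian mutation: perturb every coordinate of the single parent.
    child = [v + sigma * random.gauss(0, 1) for v in parent]
    if sphere(child) <= sphere(parent):
        parent = child
        successes += 1
    # 1/5 success rule, checked every 20 iterations: widen the step
    # if more than 1/5 of recent mutations succeeded, else shrink it.
    # (The factors 1.22 and 0.82 are illustrative constants.)
    if t % 20 == 0:
        sigma *= 1.22 if successes > 4 else 0.82
        successes = 0

print(sphere(parent))
```

The key difference from the GA sketch is visible at a glance: a real-valued representation, Gaussian rather than bit-flip mutation, and a step size that is itself adapted during the run.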
Schema Theory and Building Blocks
Holland's Schema Theorem provides a theoretical underpinning for why GAs work. A schema is a template describing a subset of strings that share certain fixed positions. The theorem shows that short, low-order schemata with above-average fitness receive exponentially increasing representation in the population over time - the building block hypothesis [1]. This means GAs implicitly sample a vast number of candidate building blocks in parallel.
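Two quantities from the theorem, a schema's order (number of fixed positions) and its defining length (distance between the outermost fixed positions), are easy to compute directly. The helper names below are illustrative, but the definitions follow Holland's; a '*' denotes a "don't care" position.

```python
def matches(schema, genome):
    # A schema is a template over {'0', '1', '*'}; '*' matches either bit.
    return all(s == '*' or s == g for s, g in zip(schema, genome))

def order(schema):
    # Order: the number of fixed (non-'*') positions.
    return sum(1 for s in schema if s != '*')

def defining_length(schema):
    # Defining length: distance between the outermost fixed positions.
    fixed = [i for i, s in enumerate(schema) if s != '*']
    return fixed[-1] - fixed[0] if fixed else 0

population = ['10110', '11100', '00111', '10101']
schema = '1*1**'
count = sum(matches(schema, g) for g in population)
print(count, order(schema), defining_length(schema))   # prints: 3 2 2
```

Short defining length and low order matter because they make a schema less likely to be disrupted by crossover and mutation, which is why such schemata are the "building blocks" the theorem favours.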
Real-World Applications
Genetic algorithms have been applied across an extraordinary range of domains. In engineering, they optimise the topology of structures, antenna designs for NASA missions, and turbine blade shapes. In scheduling, they solve timetabling and job-shop problems. In machine learning, they tune hyperparameters and evolve neural network architectures - a technique called neuroevolution [4].
A landmark demonstration was Karl Sims' 1994 work on evolving virtual creatures with morphologies and behaviours simultaneously co-optimised through artificial evolution - showing that complex body plans and locomotion strategies can emerge purely from evolutionary pressure [5].
Why It Matters
Genetic algorithms demonstrate that you do not need to understand the structure of a problem to solve it - you only need to measure how good a candidate solution is. This makes GAs applicable to problems where the search space is vast, discontinuous, or poorly understood. They are inherently parallel, resist premature convergence to local optima through population diversity, and require no gradient information. You can explore this process yourself in the Genetic Algorithm simulation, where a population of strings evolves toward a target phrase through selection, crossover, and mutation.
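For a task like evolving strings toward a target phrase, a common fitness function simply counts positional character matches. The target string and function name below are illustrative assumptions, not the simulation's actual internals.

```python
TARGET = "to be or not to be"   # hypothetical target phrase

def fitness(candidate):
    # Score a candidate by how many characters already match
    # the target at the same position.
    return sum(c == t for c, t in zip(candidate, TARGET))

print(fitness("to be or not to be"))   # perfect score: len(TARGET)
print(fitness("xx be or not to xx"))   # four positions differ
```

This is exactly the "measure how good a candidate is" principle at work: the function says nothing about how to construct the answer, yet it is enough for selection pressure to drive the population toward it.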
References
- Holland, J. H. (1975). Adaptation in Natural and Artificial Systems. University of Michigan Press (reprinted by MIT Press, 1992). doi:10.7551/mitpress/1090.001.0001
- Schwefel, H.-P. (1995). Evolution and Optimum Seeking. Wiley-Interscience. ISBN 978-0-471-57148-3
- Koza, J. R. (1992). Genetic Programming: On the Programming of Computers by Means of Natural Selection. MIT Press. doi:10.7551/mitpress/3108.001.0001
- Stanley, K. O., Clune, J., Lehman, J. & Miikkulainen, R. (2019). Designing neural networks through neuroevolution. Nature Machine Intelligence, 1, 24–35. doi:10.1038/s42256-018-0006-z
- Sims, K. (1994). Evolving virtual creatures. Proceedings of SIGGRAPH '94, 15–22. doi:10.1145/192161.192167