In artificial intelligence, genetic programming (GP) is a technique for evolving programs, starting from a population of unfit (usually random) programs, until they are fit for a particular task, by applying operations analogous to natural genetic processes. It is essentially a heuristic search technique, often described as "hill climbing": searching the space of all programs for an optimal, or at least suitable, program.
The operations are: selection of the fittest programs for reproduction, and the variation operators crossover and mutation, applied according to a predefined fitness measure, usually proficiency at the desired task. Crossover swaps random parts of a selected pair of parents to produce new and different offspring that become part of the next generation of programs. Mutation replaces some random part of a program with another randomly generated part. Some programs not selected for reproduction are copied unchanged from the current generation into the new one. Selection and the other operations are then applied recursively to the new generation of programs.
Typically, members of each new generation are on average fitter than the members of the previous generation, and the best program of a generation is often better than the best programs of earlier generations. The recursion terminates when some individual program reaches a predefined proficiency or fitness level.
A particular run of the algorithm may, and often does, converge prematurely to a local maximum that is not a globally optimal or even a good solution. Multiple runs (dozens to hundreds) are usually necessary to produce a very good result. It may also be necessary to increase the initial population size and the variability of the individuals to avoid such pathologies.
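The generational loop described above can be sketched in a few lines. This is a minimal, illustrative sketch, not a reference implementation: `random_program`, `fitness`, `crossover`, and `mutate` are hypothetical problem-specific helpers, and simple truncation selection stands in for the selection schemes discussed later.

```python
import random

def evolve(random_program, fitness, crossover, mutate,
           pop_size=100, generations=50, target_fitness=0.0):
    """One GP run: evolve a population until an individual is fit enough.

    `fitness` is assumed to return lower values for better programs.
    """
    population = [random_program() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness)      # best first
        best = scored[0]
        if fitness(best) <= target_fitness:           # termination criterion
            return best
        parents = scored[:pop_size // 2]              # truncation selection (for brevity)
        offspring = []
        while len(offspring) < pop_size:
            a, b = random.sample(parents, 2)
            offspring.append(mutate(crossover(a, b))) # breed the next generation
        population = offspring
    return min(population, key=fitness)               # best of the final generation
```

For instance, with programs standing in as plain integers, a fitness of `abs(x - 42)`, averaging as crossover, and a small random perturbation as mutation, repeated generations drive the population toward 42.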
The first record of a proposal to evolve programs is probably that of Alan Turing in 1950. Twenty-five years passed before John Holland's 'Adaptation in Natural and Artificial Systems' laid out the theoretical and empirical foundations of the field. In 1981, Richard Forsyth demonstrated the successful evolution of small programs, represented as trees, to perform classification of crime scene evidence for the UK Home Office.
Although the idea of evolving programs, initially in the computer language Lisp, was current amongst John Holland's students, it was not until they organised the first Genetic Algorithms conference in Pittsburgh that Nichael Cramer published evolved programs in two specially designed languages, including the first statement of modern "tree-based" genetic programming (that is, procedural languages organised in tree-based structures and operated on by suitably defined GA operators). In 1988, John Koza (also a PhD student of John Holland) patented his invention of a GA for program evolution, followed by publication at the International Joint Conference on Artificial Intelligence (IJCAI-89).
Koza followed this with 205 publications on "Genetic Programming" (GP), a name coined by David Goldberg, also a PhD student of John Holland. However, it was the series of four books by Koza, starting in 1992 with accompanying videos, that really established GP. Subsequently, the number of publications expanded enormously, with the Genetic Programming Bibliography surpassing 10,000 entries. In 2010, Koza listed 77 results where genetic programming was human-competitive.
In 1996, Koza started the annual Genetic Programming conference, followed in 1998 by the annual EuroGP conference and the first book in a GP series edited by Koza. 1998 also saw the first GP textbook. GP continued to flourish, leading to the first specialist GP journal, and three years later (2003) the annual Genetic Programming Theory and Practice (GPTP) workshop was established by Rick Riolo. Genetic programming papers continue to be published at a diversity of conferences and in associated journals. Today there are nineteen GP books, including several for students.
Foundational work in GP
Early work that set the stage for current genetic programming research topics and applications is diverse, and includes software synthesis and repair, predictive modeling, data mining, financial modeling, soft sensors, design, and image processing. Applications in some areas, such as design, often make use of intermediate representations, such as Fred Gruau’s cellular encoding. Industrial uptake has been significant in several areas including finance, the chemical industry, bioinformatics and the steel industry.
GP evolves computer programs, traditionally represented in memory as tree structures. Trees can be easily evaluated in a recursive manner. Every tree node has an operator function and every terminal node has an operand, making mathematical expressions easy to evolve and evaluate. Thus traditionally GP favors the use of programming languages that naturally embody tree structures (for example, Lisp; other functional programming languages are also suitable).
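The recursive evaluation of such trees can be sketched briefly. In this illustrative encoding (one of many possible), an internal node is a tuple whose first element is an operator function and whose remaining elements are child trees, while a terminal is either a constant or a variable name looked up in an environment:

```python
import operator

def evaluate(node, env):
    """Recursively evaluate an expression tree.

    A node is either a tuple (function, child, child, ...), a string
    naming a variable in `env`, or a constant value.
    """
    if isinstance(node, tuple):
        fn, *children = node
        return fn(*(evaluate(c, env) for c in children))
    if isinstance(node, str):   # variable terminal
        return env[node]
    return node                 # constant terminal

# The expression x*x + 3 encoded as a tree:
tree = (operator.add, (operator.mul, "x", "x"), 3)
```

Evaluating `tree` with `{"x": 2}` yields 7; the same recursion handles arbitrarily nested expressions, which is why tree-shaped genomes are so convenient for evolving mathematical formulae.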
Non-tree representations have been suggested and successfully implemented, such as linear genetic programming which suits the more traditional imperative languages [see, for example, Banzhaf et al. (1998)]. The commercial GP software Discipulus uses automatic induction of binary machine code ("AIM") to achieve better performance. µGP uses directed multigraphs to generate programs that fully exploit the syntax of a given assembly language. Other program representations on which significant research and development have been conducted include programs for stack-based virtual machines, and sequences of integers that are mapped to arbitrary programming languages via grammars. Cartesian genetic programming is another form of GP, which uses a graph representation instead of the usual tree based representation to encode computer programs.
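To illustrate the flat, non-tree style of representation, a stack-based program can be written as a sequence of tokens interpreted against an operand stack. This is a simplified sketch of the general idea behind stack-based virtual-machine GP, not any particular system's instruction set; the token names are illustrative:

```python
def run_stack_program(program, inputs):
    """Interpret a flat token sequence with an operand stack.

    Tokens are: "add"/"mul" (binary operators), other strings
    (variable references resolved via `inputs`), or numeric constants.
    """
    stack = []
    for token in program:
        if token == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif token == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif isinstance(token, str):   # variable reference
            stack.append(inputs[token])
        else:                          # numeric constant
            stack.append(token)
    return stack[-1] if stack else 0.0

# x*x + 3 in postfix form:
program = ["x", "x", "mul", 3, "add"]
```

Because the genome is a flat sequence rather than a tree, crossover and mutation can operate on contiguous slices of tokens, which is one reason such representations suit imperative and machine-code settings.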
Most representations have structurally noneffective code (introns). Such non-coding genes may seem to be useless because they have no effect on the performance of any one individual. However, they alter the probabilities of generating different offspring under the variation operators, and thus alter the individual's variational properties.
Experiments seem to show faster convergence when using program representations that allow such non-coding genes, compared to program representations that do not have any non-coding genes.
Selection is the process by which certain individuals are chosen from the current generation to serve as parents for the next generation. Individuals are selected probabilistically, so that better-performing individuals have a higher chance of being selected. The most commonly used selection method in GP is tournament selection, although other methods such as fitness-proportionate selection and lexicase selection have been demonstrated to perform better on many GP problems.
Elitism, which involves seeding the next generation with the best individual (or best n individuals) from the current generation, is a technique sometimes employed to avoid regression.
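Tournament selection and elitism can be sketched together in a few lines. This is an illustrative sketch: `breed` is a hypothetical variation operator (crossover plus mutation), and fitness is assumed to be minimised:

```python
import random

def tournament_select(population, fitness, k=3):
    """Pick k random individuals and return the fittest (lowest fitness)."""
    contestants = random.sample(population, k)
    return min(contestants, key=fitness)

def next_generation(population, fitness, breed, elite_n=1):
    """Elitism plus tournament selection.

    The best `elite_n` individuals are copied unchanged into the new
    generation; the rest is filled by breeding tournament winners.
    """
    ranked = sorted(population, key=fitness)
    new_gen = ranked[:elite_n]                      # elites survive as-is
    while len(new_gen) < len(population):
        a = tournament_select(population, fitness)
        b = tournament_select(population, fitness)
        new_gen.append(breed(a, b))
    return new_gen
```

The tournament size `k` tunes selection pressure: larger tournaments make it harder for weak individuals to ever be chosen, while `k = 1` degenerates to uniform random selection.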
Various genetic operators (i.e., crossover and mutation) are applied to the individuals selected in the selection step described above to breed new individuals. The rate at which these operators are applied determines the diversity in the population.
For example, a simple mutation operator may flip one or more bits of an offspring to generate a new child; in tree-based GP, the analogous operation replaces a randomly chosen subtree.
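In tree-based GP, crossover and mutation act on subtrees rather than bits. A minimal sketch follows, assuming (purely for illustration) that trees are nested Python lists of the form `["add", ["mul", "x", "x"], 3]`; depth limits and typing constraints, which practical systems need, are omitted:

```python
import copy
import random

def all_subtrees(tree, path=()):
    """Yield (path, subtree) pairs; a path is a tuple of child indices."""
    yield path, tree
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            yield from all_subtrees(child, path + (i,))

def replace_at(tree, path, new_subtree):
    """Return a copy of `tree` with the subtree at `path` replaced."""
    tree = copy.deepcopy(tree)
    if not path:
        return copy.deepcopy(new_subtree)
    node = tree
    for i in path[:-1]:
        node = node[i]
    node[path[-1]] = copy.deepcopy(new_subtree)
    return tree

def subtree_crossover(parent_a, parent_b):
    """Replace a random subtree of parent_a with one taken from parent_b."""
    path_a, _ = random.choice(list(all_subtrees(parent_a)))
    _, donor = random.choice(list(all_subtrees(parent_b)))
    return replace_at(parent_a, path_a, donor)

def subtree_mutation(parent, random_tree):
    """Replace a random subtree with a freshly generated one.

    `random_tree` is a hypothetical generator of random subtrees.
    """
    path, _ = random.choice(list(all_subtrees(parent)))
    return replace_at(parent, path, random_tree())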
GP has been successfully used as an automatic programming tool, a machine-learning tool, and an automatic problem-solving engine. GP is especially useful in domains where the exact form of the solution is not known in advance, or where an approximate solution is acceptable (possibly because finding the exact solution is very difficult). Applications of GP include curve fitting, data modeling, symbolic regression, feature selection, and classification. John R. Koza mentions 76 instances where genetic programming has been able to produce results that are competitive with human-produced results (called human-competitive results). Since 2004, the annual Genetic and Evolutionary Computation Conference (GECCO) has held the Human-Competitive Awards ("Humies") competition, in which cash awards are presented for human-competitive results produced by any form of genetic and evolutionary computation. GP has won many awards in this competition over the years.
Meta-genetic programming is the proposed meta-learning technique of evolving a genetic programming system using genetic programming itself. It suggests that chromosomes, crossover, and mutation themselves evolved, and therefore, like their real-life counterparts, should be allowed to change on their own rather than being determined by a human programmer. Meta-GP was formally proposed by Jürgen Schmidhuber in 1987. Doug Lenat's Eurisko is an earlier effort that may be the same technique. Meta-GP is a recursive but terminating algorithm, allowing it to avoid infinite recursion. In the "autoconstructive evolution" approach to meta-genetic programming, the methods for producing and varying offspring are encoded within the evolving programs themselves, and programs are executed to produce new programs to be added to the population.
Critics of this idea often argue that the approach is overly broad in scope. However, it might be possible to constrain the fitness criterion to a general class of results, and so obtain an evolved GP that more efficiently produces results for sub-classes. This might take the form of a meta-evolved GP for producing human walking algorithms, which is then used to evolve human running, jumping, and so on. The fitness criterion applied to the meta-GP would simply be one of efficiency.