Elitism
In biological systems, there is no guarantee that the most fit individual in a population will survive long enough to procreate; environmental influences may intervene. For example, the most fit grizzly bear in Alaska could be killed by a hunter. While such examples occur every day in the natural world, as engineers we may not want to lose the best induction motor design in our population. This concern has led to the elitism operator. This operator compares the most fit individual in a new population to the most fit individual in the previous population. If the best fitness in the new population is lower than that in the previous population, the most fit member of the previous population replaces a member of the new population. In this way, the best fitness in the population cannot decrease. While this operator is distinctly nonbiological, it normally improves algorithm performance.
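The comparison just described can be sketched in a few lines. The following is a minimal illustration, not a definitive implementation; it assumes a population is a list of individuals and that `fitness` is a callable, and it makes the (common, but not unique) choice of replacing the least fit member of the new population.

```python
def elitism(prev_pop, new_pop, fitness):
    """If the best fitness dropped, reinsert the previous best individual.

    Replaces the least fit member of the new population (one common
    choice; replacing a randomly chosen member is another).
    """
    prev_best = max(prev_pop, key=fitness)
    new_best = max(new_pop, key=fitness)
    if fitness(new_best) < fitness(prev_best):
        worst_idx = min(range(len(new_pop)), key=lambda i: fitness(new_pop[i]))
        new_pop = list(new_pop)
        new_pop[worst_idx] = prev_best
    return new_pop
```

After this step, the best fitness seen so far is guaranteed to survive into the new population.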
Migration
In biological populations, it sometimes occurs that a population becomes geographically dispersed, and members of these geographically isolated regions then continue to evolve on separate evolutionary tracks. Occasionally, because of storms or other random events, a few members of one region migrate into another. These individuals then have children with members of the region into which they have migrated, often yielding very fit offspring.
It is possible to create GAs where such migration occurs. In such a GA, each individual is associated with a region, and for the most part, only individuals within a given region interact. Each region acts as a separate population. Occasionally, however, a few individuals are transferred between regions.
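One way to sketch the migration operator is shown below. This is an illustrative sketch only: it assumes the regions are kept as separate lists of individuals, and it adopts a simple ring topology (each region donates a few randomly chosen individuals to the next region), which is one choice among many.

```python
import random

def migrate(regions, n_migrants=1, rng=random):
    """Move a few individuals between regional subpopulations.

    Each region donates `n_migrants` randomly chosen individuals to
    the next region in a ring. Total population size is preserved.
    """
    regions = [list(r) for r in regions]
    outgoing = []
    for r in regions:
        # Remove the migrants from their home region.
        migrants = [r.pop(rng.randrange(len(r))) for _ in range(n_migrants)]
        outgoing.append(migrants)
    for i, migrants in enumerate(outgoing):
        # Deposit them in the neighboring region.
        regions[(i + 1) % len(regions)].extend(migrants)
    return regions
```

Between migration events, each regional list would simply be evolved as an independent population.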
At first consideration, one might conclude that incorporating such behavior would have little effect on the algorithm. However, in Sudhoff [7] it is shown that at least in the case of one motor design example, the migration operator had a profound effect on the performance of the algorithm. That paper included a comparison between GA optimization and particle swarm optimization, and it found that choice of algorithm was less important than the use of migration in the GA or a somewhat analogous feature known as neighborhoods in particle swarm optimization.
Death
In the canonical GA, the entire population is replaced by children; those individuals who are selected into the mating pool but do not undergo crossover, segregation, or mutation “survive” to the next generation unchanged. In some GAs, however, the mating pool is made smaller than the population, so that the number of children produced is only a fraction of the population size. The children then replace members of the existing population, yielding a new population that consists of the children plus some members of the previous population. In order to hold the population size constant, it becomes necessary to decide which members of the current population will “die” to make room for the children. The individuals to be replaced can be chosen on criteria other than those used for selection, for example, on diversity (or, more particularly, the lack thereof) or some other attribute. In some sense, such approaches are reminiscent of biological populations, where the boundaries between generations are gradual.
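As a concrete (but hypothetical) illustration, the following sketch implements the simplest version of this idea, in which the least fit members of the current population die to make room for the children; as noted above, a diversity-based criterion could be substituted.

```python
def replace_with_children(population, children, fitness):
    """Steady-state replacement: children displace the least fit
    members of the current population, keeping the size constant.
    """
    n_survivors = len(population) - len(children)
    # Keep the most fit individuals; the rest "die".
    survivors = sorted(population, key=fitness, reverse=True)[:n_survivors]
    return survivors + list(children)
```

Note that with `len(children)` equal to the population size, this reduces to the canonical full-replacement scheme.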
Local Search
After the initial generations of a GA, it is often the case that the most fit individuals are near a solution. In this case, some GAs will initiate local searches around the most fit individual. One way to accomplish this is to create a set of slight mutations of the most fit individual. Let us refer to the most fit individual as individual A, and refer to the population of mutations of this individual as PA. If the most fit individual in PA is more fit than individual A, then the most fit individual of PA replaces individual A in the population.
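This procedure can be sketched as follows. The sketch assumes individuals are lists of real-valued genes and uses small Gaussian perturbations as the “slight mutations”; the mutation distribution, cloud size, and perturbation scale are all illustrative choices.

```python
import random

def local_search(population, fitness, n_trials=20, sigma=0.01, rng=random):
    """Hill-climb around the most fit individual A: sample a cloud PA
    of slight mutations of A, and if the best mutant beats A, it
    replaces A in the population.
    """
    pop = list(population)
    a_idx = max(range(len(pop)), key=lambda i: fitness(pop[i]))
    a = pop[a_idx]
    # PA: small Gaussian perturbations of each gene of A.
    mutants = [[g + rng.gauss(0.0, sigma) for g in a] for _ in range(n_trials)]
    best = max(mutants, key=fitness)
    if fitness(best) > fitness(a):
        pop[a_idx] = best
    return pop
```

Because the replacement only occurs on improvement, the best fitness in the population cannot decrease as a result of this step.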
Deterministic Search
Many classical optimization methods are very effective if they are initialized close to the solution, because many functions are “locally” convex. At the same time, while GAs are often very effective at getting close to a global optimum, they may not converge rapidly from a good approximation of the solution to the exact solution. This suggests combining the two approaches: the best individual of the final population is used to initialize a classical optimization method. To this end, the Nelder–Mead simplex method [1] is particularly attractive because it requires neither gradients nor Hessians. In performing such an optimization in problems with a mixed search space, those genes that represent discrete choices are normally held fixed.
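To make the refinement step concrete, the following is a minimal, simplified Nelder–Mead implementation (reflection, expansion, inside contraction, and shrink; production variants include further cases and convergence tests). Since the method minimizes, the negated fitness of the GA's best individual would be passed as `f`.

```python
def nelder_mead(f, x0, step=0.1, iters=200):
    """Simplified Nelder-Mead simplex minimizer for refining a GA's
    best individual. Continuous genes only; discrete genes held fixed.
    """
    n = len(x0)
    # Initial simplex: x0 plus one point perturbed in each dimension.
    simplex = [list(x0)] + [
        [x0[j] + (step if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        # Centroid of all points except the worst.
        c = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
        refl = [c[j] + (c[j] - worst[j]) for j in range(n)]
        if f(refl) < f(best):
            # Reflection is the new best: try expanding further.
            exp = [c[j] + 2.0 * (c[j] - worst[j]) for j in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            # Contract toward the worst point; shrink if that fails.
            contr = [c[j] + 0.5 * (worst[j] - c[j]) for j in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:
                simplex = [best] + [
                    [best[j] + 0.5 * (p[j] - best[j]) for j in range(n)]
                    for p in simplex[1:]
                ]
    return min(simplex, key=f)
```

In practice one would use a library routine (for example, a standard Nelder–Mead solver) rather than hand-coding the simplex operations; the sketch is only meant to show why no derivative information is needed.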
Enhanced Real‐Coded Genetic Algorithm
Before concluding this section, it is interesting to place all of the aforementioned operators into a single block diagram so that the relationships between them can be made clear. This is illustrated in Figure 1.14. Therein, initialization and the first fitness evaluation are as in the canonical GA diagrammed in Figure 1.8. Next, the scaling algorithm is (optionally) employed. Note that the input and output of the scaling operator are both the population P[k]; the same symbol is used for input and output because the scaling operator does not alter the population itself; it merely adjusts fitness values. The same is true of the diversity control operator, if utilized. Next, the selection operator creates a mating pool M[k]. From the mating pool, the mating, crossover, segregation, and mutation operators yield a population of children C[k].
Figure 1.14 Enhanced real‐coded genetic algorithm.
The death algorithm selects the members of the current population to be replaced by the children, yielding the beginnings of the next generation, P‴[k + 1]. The primes are used because this population will be modified before becoming the next generation. First, the migration operator is applied, which may occasionally move individuals from one region to another. This yields P″[k + 1]. Next, the gene values are decoded and the fitness evaluated. The result is again denoted P″[k + 1] since the population itself does not change. The elitism operator is then applied, which compares the most fit individual of the previous population P[k] to the most fit individual in P″[k + 1] and replaces that individual if appropriate. This yields population P′[k + 1]. Finally, the local search operator is employed to generate P[k + 1].
Once the next generation is formed, the stopping criterion is checked. Often, the stopping criterion is a check to see if a sufficient number of generations have passed. If the stopping criterion has not been met, the process repeats, starting with scaling. If the stopping criterion has been met, then an estimate of the solution x** is selected as the most fit individual of the population. This estimate can then be passed to a deterministic optimization algorithm for further refinement, yielding x*.
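The overall flow of Figure 1.14 can be summarized in a schematic generation loop. The sketch below is deliberately stripped down: it uses simple stand-in operators (binary tournament selection, blend crossover, Gaussian mutation, and elitism) and omits scaling, diversity control, migration, death, and local search, so the operator choices and parameter values here are illustrative assumptions rather than the algorithm of the figure.

```python
import random

def enhanced_ga(fitness, n_genes, pop_size=20, generations=50, rng=None):
    """Schematic generation loop in the spirit of Figure 1.14, with
    simple stand-in operators. Returns the estimate x**, which could
    then be refined by a deterministic method to yield x*.
    """
    rng = rng or random.Random()
    # Initialization: random real-coded individuals in [-1, 1]^n.
    pop = [[rng.uniform(-1, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Binary tournament selection fills the mating pool M[k].
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = pick(), pick()
            # Blend crossover plus Gaussian mutation yields C[k].
            children.append([(g1 + g2) / 2 + rng.gauss(0.0, 0.05)
                             for g1, g2 in zip(p1, p2)])
        prev_best = max(pop, key=fitness)
        pop = children
        # Elitism: the best fitness cannot decrease between generations.
        if fitness(max(pop, key=fitness)) < fitness(prev_best):
            pop[0] = prev_best
        # (The stopping criterion here is simply the generation count.)
    return max(pop, key=fitness)  # estimate x**
```

Each omitted operator (scaling, diversity control, migration, death, local search) would slot into this loop at the point indicated in Figure 1.14.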
The algorithm depicted in Figure 1.14 is probably more involved than is typical, because many optional operators have been used. For example, diversity control does not have to be used. Scaling is not necessary when using tournament selection or when the fitness values are always positive. A separate death algorithm is not necessary if the entire population is replaced by children each generation.