
Micro-Genetic Algorithm

When dealing with high-dimensionality problems, it may be difficult or too time consuming for all the model parameters to converge within a given margin of error. In particular, as the number of model parameters increases, so does the required population size; recall that large population sizes imply large numbers of cost-function evaluations. An alternative is the use of micro-genetic algorithms (Krishnakumar, 1989), which evolve very small populations that are efficient at locating promising areas of the search space. Such small populations cannot maintain diversity for many generations, but the population can be restarted whenever diversity is lost, keeping only the best-fit individuals (usually we keep just the best one, that is, elitism of one individual). Restarting the population several times during the run of the genetic algorithm has the added benefit of preventing premature convergence: a particularly fit individual can otherwise dominate the population, suppress further exploration of the search space, and drive the algorithm toward a local minimum. Also, since we are not evolving large populations, convergence can be achieved more quickly and less memory is required to store the population.
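The restart-with-elitism logic can be sketched as follows. This is a minimal illustration, not the implementation used in this study: the cost function, parameter bounds, population size, and diversity threshold are all illustrative assumptions, and the selection and crossover operators are generic choices.

import numpy as np

def micro_ga(cost, bounds, pop_size=5, n_generations=200,
             diversity_tol=0.05, seed=None):
    """Minimize `cost` over the box `bounds` with a tiny, restarting population."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    ndim = len(lo)

    def random_pop(n):
        # Uniform random models within the parameter bounds.
        return lo + rng.random((n, ndim)) * (hi - lo)

    pop = random_pop(pop_size)
    best, best_cost = None, np.inf

    for _ in range(n_generations):
        costs = np.array([cost(m) for m in pop])

        # Elitism of one individual: remember the best model seen so far.
        i = costs.argmin()
        if costs[i] < best_cost:
            best, best_cost = pop[i].copy(), costs[i]

        # Restart the small population once diversity is lost,
        # keeping only the elite individual.
        spread = np.mean(pop.std(axis=0) / (hi - lo))
        if spread < diversity_tol:
            pop = random_pop(pop_size)
            pop[0] = best
            continue

        # Tournament selection and uniform crossover; mutation is omitted
        # because the restarts reintroduce diversity.
        new_pop = [best.copy()]
        while len(new_pop) < pop_size:
            i1, i2 = rng.choice(pop_size, 2, replace=False)
            j1, j2 = rng.choice(pop_size, 2, replace=False)
            p1 = pop[i1] if costs[i1] < costs[i2] else pop[i2]
            p2 = pop[j1] if costs[j1] < costs[j2] else pop[j2]
            mask = rng.random(ndim) < 0.5
            new_pop.append(np.where(mask, p1, p2))
        pop = np.array(new_pop)

    return best, best_cost

Note that the population here is only five individuals, so each generation costs few function evaluations; the algorithm relies on frequent restarts, rather than a large population, to keep exploring the search space.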


