In general, establishing the convergence of a genetic-algorithm optimization
is not an easy matter, because we cannot be sure whether the algorithm has
converged to a local or to the global minimum. In the first case we would like
the algorithm to continue exploring the search space; in the second case we
would like it to stop. There are several ways in which we can proceed, for
example:
- Set a threshold for the minimum cost function that constitutes an
acceptable solution and stop the algorithm when this value is reached.
The problem with this approach is that in some cases it may be too
difficult to establish a priori what a good threshold should be.
- Set a threshold for the difference between the fitness of the best
individual and the average fitness of the population. This has the advantage
of not requiring a priori knowledge of the actual cost function values.
- Set a threshold for the difference between the best individuals of the
present and the previous generation or generations. This has the advantage
of preventing the algorithm from over-refining the search by slowly crawling
to the bottom of a valley.
- Set a maximum number of generations to be evolved. This is a fail-safe
criterion, which can be useful when comparing the performance of genetic
algorithms with different combinations of evolution parameters.
In most cases a combination of two or more of these and similar criteria is
employed.
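The four criteria above can be sketched as a single stopping test. The
function below is a hypothetical illustration, not taken from the text: the
names (should_stop, history, avg_fitness) and the threshold values are
assumptions chosen for the example, and it assumes a minimization problem
where lower cost is better.

```python
def should_stop(history, avg_fitness, generation,
                cost_threshold=1e-3,     # criterion 1: acceptable cost value
                spread_threshold=1e-4,   # criterion 2: best vs. average fitness
                stall_threshold=1e-6,    # criterion 3: best vs. previous best
                max_generations=500):    # criterion 4: fail-safe limit
    """Return True when any one of the four stopping criteria is met.

    history is the list of best cost values, one entry per generation;
    avg_fitness is the average cost over the current population.
    (All names and default thresholds are illustrative assumptions.)
    """
    best = history[-1]
    if best <= cost_threshold:                       # 1. good-enough solution
        return True
    if abs(avg_fitness - best) <= spread_threshold:  # 2. population has converged
        return True
    if len(history) > 1 and abs(history[-2] - best) <= stall_threshold:
        return True                                  # 3. progress has stalled
    if generation >= max_generations:                # 4. fail-safe cutoff
        return True
    return False
```

In practice such a test would be evaluated once per generation, with the
thresholds tuned to the scale of the particular cost function.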
Stanford Exploration Project
11/11/2002