Genetic Algorithm

Matthew Wall mbwall_NO_SPAM at mit.edu
Sun Mar 21 20:38:55 EST 1999


Masoud,

>I was wondering if your GA code could work more effectively than the
>traditional methods of optimization and is applicable to our problem.  I
>also need to know that if your code is able to work with a system of
>nonlinear error functions rather than a single scalar objective function.
>If you wish I could fax you some of the 3D plots of the error surfaces we
>are dealing with.

In general, GAs work _very_ well at the first 80% of the search
(exploration) but are less effective at the last 20% (exploitation).
However, this varies quite a bit; on some problems we see something more
like 95/5 or even 99/1, depending on the problem formulation and the types
of constraints.  GAs excel at finding families of solutions (particularly
with niching algorithms such as deterministic crowding), and they are often
the _only_ choice for problems with a mix of discrete and continuous
variables (these are the types of problems I'm working with now).

Distinguishing between global and local minima is not a problem for GAs:
as long as your objective function accurately conveys the quality of a
solution, the GA will keep finding better and better solutions rather than
stalling in the first local minimum it reaches.
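On your specific question about a system of nonlinear error functions: a GA
needs a single scalar fitness, so the usual approach is to scalarize the
error vector, e.g. as a sum of squared residuals.  A minimal sketch (the
error functions here are invented for illustration, not taken from your
problem):

```python
import math

# Hypothetical system of nonlinear error functions e_i(x).  Both functions
# are made-up examples; substitute your own residuals.
def errors(x):
    x1, x2 = x
    return [
        math.sin(x1) + x2 ** 2 - 1.0,   # e1(x) = 0 at a solution
        x1 * x2 - 0.5,                  # e2(x) = 0 at a solution
    ]

def objective(x):
    """Scalarize the error vector as a sum of squares; the GA minimizes this."""
    return sum(e * e for e in errors(x))
```

Any scalarization works as long as smaller values consistently mean better
solutions; a weighted sum lets you emphasize some residuals over others.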

30 parameters is nothing.  Using a GA with hundreds of parameters is
commonplace, and we run many optimizations with thousands.  The keys are to:

  1. get the representation right
  2. get the objective function right
  3. use the right operators and algorithm
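To make those three keys concrete, here is a toy sketch in plain Python
(not GAlib's API; the problem and every name are invented for illustration)
of a GA with a mixed discrete/continuous representation and type-aware
operators:

```python
import random

random.seed(1)

# Representation: one discrete gene (mode) and one continuous gene (x).
MODES = [0, 1, 2]

def objective(ind):
    # Hypothetical cost: mode 1 with x near 2.0 is best.
    mode, x = ind
    return (mode - 1) ** 2 + (x - 2.0) ** 2

def random_individual():
    return (random.choice(MODES), random.uniform(-5.0, 5.0))

def mutate(ind):
    mode, x = ind
    if random.random() < 0.1:
        mode = random.choice(MODES)     # discrete gene: resample a category
    if random.random() < 0.5:
        x += random.gauss(0.0, 0.3)     # continuous gene: Gaussian step
    return (mode, x)

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent.
    return (random.choice([a[0], b[0]]), random.choice([a[1], b[1]]))

def evolve(generations=100, pop_size=40):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[: pop_size // 2]   # truncation selection (elitist)
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=objective)

best = evolve()
```

The point of the sketch is that each operator respects the type of the gene
it touches; a real run would use better selection and tuned rates.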

Please read the overview that comes with GAlib for details about GAs in
general and about implementing a GA in particular.

From your description, I would guess that you will find the performance of
a GA quite good (perhaps better than what you get now).  The combination of
a GA with a local hill-climber (perhaps a variation of the methods you use
now) should perform even better.
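A sketch of that hybrid idea, with a broad random search standing in for
the GA's exploration phase and a simple step-halving hill-climber doing the
local polishing (the objective is a made-up 1-D error surface):

```python
import random

random.seed(0)

def objective(x):
    return (x - 1.7) ** 2 + 0.3   # hypothetical 1-D error surface

def global_search(samples=200):
    # Exploration: broad random sampling, like a GA's early generations.
    return min((random.uniform(-10.0, 10.0) for _ in range(samples)),
               key=objective)

def hill_climb(x, step=0.5, iters=100):
    # Exploitation: move to the best neighbor; halve the step when stuck.
    for _ in range(iters):
        best = min((x, x - step, x + step), key=objective)
        if best == x:
            step *= 0.5
        x = best
    return x

rough = global_search()
polished = hill_climb(rough)
```

In a real hybrid you would replace `global_search` with the GA itself and
`hill_climb` with whatever local method you already trust; the division of
labor (GA for the first 80%, local search for the last 20%) is the point.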

Matthew

Matthew Wall
mbwall_NO_SPAM at mit.edu
http://mit.edu/mbwall




