GENETIC ALGORITHM
Genetic Algorithms (GAs) are adaptive heuristic search algorithms that belong to the larger class of evolutionary
algorithms. Genetic algorithms are based on the ideas of natural selection and genetics. They are commonly
used to generate high-quality solutions to optimization and search problems.
Genetic algorithms simulate the process of natural selection: those species that can adapt to
changes in their environment are able to survive, reproduce, and pass on to the next generation. In simple words, they
simulate "survival of the fittest" among the individuals of consecutive generations while solving a problem. Each
generation consists of a population of individuals, and each individual represents a point in the search space and a
possible solution. Each individual is represented as a string of bits. This string is analogous to a chromosome.
Genetic algorithms generate successor hypotheses by repeatedly mutating and recombining parts of the best
currently known hypotheses. At each step, a collection of hypotheses called the current population is updated by
replacing some fraction of the population with the offspring of the most fit current hypotheses. The process forms a
generate-and-test beam search over hypotheses, in which variants of the best current hypotheses are most likely to
be considered next.
The popularity of GAs is motivated by several factors:
1. Evolution is known to be a successful, robust method for adaptation within biological systems.
2. GAs can search spaces of hypotheses containing complex interacting parts, where the impact of each
part on overall hypothesis fitness may be difficult to model.
3. Genetic algorithms are easily parallelized and can take advantage of the decreasing costs of powerful
computer hardware.
Applications:
1. Optimization problems such as graph coloring and the travelling salesman problem.
2. Continuous systems, such as the efficient design of airfoils in aerospace engineering.
3. Multi-objective optimization problems.
4. Financial markets.
Disadvantages:
1. GAs can take a long time to find a near-optimal solution.
2. They are sensitive to their input parameters.
Genetic Algorithms:
The problem addressed by GAs is to search a space of candidate hypotheses to identify the best
hypothesis. In GAs the "best hypothesis" is defined as the one that optimizes a predefined numerical
measure, the hypothesis fitness, which serves as the evaluation function for hypotheses.
The algorithm operates by iteratively updating a pool of hypotheses, called the population.
On each iteration, all members of the population are evaluated according to the fitness function.
A new population is then generated by probabilistically selecting the most fit individuals from the
current population. The probability of selecting hypothesis hi is given by

Pr(hi) = Fitness(hi) / Σj Fitness(hj)

where the sum runs over all members of the current population.
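For example, if a population contains three hypotheses with fitness values 6, 3, and 1, their selection
probabilities are 6/10 = 0.6, 3/10 = 0.3, and 1/10 = 0.1.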
Some of these selected individuals are carried forward into the next generation population intact
(elitism), while others are used as the basis for creating new offspring individuals by applying genetic
operations such as crossover and mutation.
The GA thus performs a randomized, parallel beam search for hypotheses that perform well
according to the fitness function.
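As an illustration, the following Python sketch implements this loop for bit-string hypotheses; the helper
names and the parameter values (population size p, replacement fraction r, mutation rate m) are illustrative
choices, not prescribed by the algorithm, and fitness values are assumed to be positive.

    import random

    def genetic_algorithm(fitness, p=50, r=0.6, m=0.05, length=20, generations=100):
        # p = population size, r = fraction replaced by crossover each
        # generation, m = mutation rate. Fitness values are assumed positive.
        population = [[random.randint(0, 1) for _ in range(length)] for _ in range(p)]
        for _ in range(generations):
            weights = [fitness(h) for h in population]
            # (1 - r) * p members survive intact, chosen with probability
            # proportional to fitness (roulette wheel selection).
            survivors = [h[:] for h in
                         random.choices(population, weights=weights, k=int((1 - r) * p))]
            # The remainder are offspring produced by single-point crossover.
            offspring = []
            while len(survivors) + len(offspring) < p:
                a, b = random.choices(population, weights=weights, k=2)
                point = random.randrange(1, length)
                offspring += [a[:point] + b[point:], b[:point] + a[point:]]
            population = (survivors + offspring)[:p]
            # Mutate m * p randomly chosen members by flipping one random bit.
            for h in random.sample(population, k=int(m * p)):
                i = random.randrange(length)
                h[i] = 1 - h[i]
        return max(population, key=fitness)

    # Toy usage: "one-max", where fitness is simply the number of 1 bits.
    best = genetic_algorithm(fitness=sum)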
Applications of GA:
Optimization − Genetic Algorithms are most commonly used in optimization problems, where we have
to maximize or minimize a given objective function value under a given set of constraints.
Economics − GAs are used to characterize various economic models such as the cobweb model, game-
theoretic equilibrium resolution, and asset pricing.
Neural Networks − GAs are also used to train neural networks, particularly recurrent neural networks.
Parallelization − GAs have very good parallel capabilities, prove to be a very effective means of
solving certain problems, and also provide a good area for research.
Vehicle routing problems − including variants with multiple soft time windows, multiple depots, and a
heterogeneous fleet.
Scheduling applications − GAs are used to solve various scheduling problems as well, particularly the
timetabling problem.
Robot Trajectory Generation − GAs have been used to plan the path which a robot arm takes in moving
from one point to another.
Parametric Design of Aircraft − GAs have been used to design aircraft by varying the parameters and
evolving better solutions.
DNA Analysis − GAs have been used to determine the structure of DNA using spectrometric data about
the sample.
Multimodal Optimization − GAs are well suited to multimodal optimization, in which we have to find
multiple optimal solutions.
Traveling salesman problem and its applications − GAs have been used to solve the TSP, a well-known
combinatorial problem, using novel crossover and packing strategies.
Hypotheses in GAs are often represented by bit strings, so that they can be easily manipulated by genetic
operators such as mutation and crossover. The hypotheses represented by these bit strings can be quite complex.
For example, sets of if-then rules can easily be represented by choosing an encoding of rules that allocates
specific substrings for each rule precondition and postcondition.
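As a concrete sketch of such an encoding (the attribute names and bit layout below are hypothetical, chosen
only for illustration), each attribute is given one bit per possible value, and a 1 means the rule's precondition
allows that value:

    # Hypothetical rule encoding: one bit per attribute value.
    OUTLOOK = ["Sunny", "Overcast", "Rain"]   # bits 0-2 of the string
    WIND = ["Strong", "Weak"]                 # bits 3-4
    # Postcondition PlayTennis in {Yes, No}:  # bits 5-6

    # "IF Wind = Strong THEN PlayTennis = Yes" as a single bit string:
    # Outlook = 111 (any value), Wind = 10, PlayTennis = 10.
    rule = [1, 1, 1, 1, 0, 1, 0]

    def precondition_matches(rule, outlook, wind):
        # The rule fires if the bits for the example's attribute values are set.
        return rule[OUTLOOK.index(outlook)] == 1 and rule[3 + WIND.index(wind)] == 1

    print(precondition_matches(rule, "Rain", "Strong"))   # True

Because every substring has a fixed length and position, crossover and mutation can manipulate such rules
exactly as they would any other bit string.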
Genetic Operators:
The generation of successors in a GA is determined by a set of operators that recombine and mutate selected
members of the current population. The two most common operators are crossover and mutation.
1. Crossover
The crossover operator produces two new offspring from two parent strings, by copying selected bits from each
parent. The bit at position i in each offspring is copied from the bit at position i in one of the two parents. The
choice of which parent contributes the bit for position i is determined by an additional string called the
crossover mask.
In single-point crossover, the crossover mask is always constructed so that it begins with a string containing n
contiguous 1s, followed by the necessary number of 0s to complete the string. This results in offspring in which
the first n bits are contributed by one parent and the remaining bits by the second parent. Each time the single-
point crossover operator is applied, the crossover point n is chosen at random, and the crossover mask is then
created and applied.
In two-point crossover, offspring are created by substituting intermediate segments of one parent into the
middle of the second parent string. Put another way, the crossover mask is a string beginning with n0 zeros,
followed by a contiguous string of n1 ones, followed by the necessary number of zeros to complete the string.
Each time the two-point crossover operator is applied, a mask is generated by randomly choosing the integers
n0 and n1.
Uniform crossover combines bits sampled uniformly from the two parents. In this case the crossover mask is
generated as a random bit string with each bit chosen at random and independent of the others.
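The three operators can be sketched as follows; the mask is built explicitly, and bit i of the first offspring
comes from the first parent wherever the mask bit is 1 (the helper names are illustrative):

    import random

    def apply_mask(a, b, mask):
        # Offspring 1 copies a's bit where mask = 1 and b's bit where mask = 0;
        # offspring 2 is the complementary combination.
        child1 = [x if bit else y for x, y, bit in zip(a, b, mask)]
        child2 = [y if bit else x for x, y, bit in zip(a, b, mask)]
        return child1, child2

    def single_point(a, b):
        n = random.randrange(1, len(a))            # random crossover point
        mask = [1] * n + [0] * (len(a) - n)        # 11...100...0
        return apply_mask(a, b, mask)

    def two_point(a, b):
        L = len(a)
        n0 = random.randrange(0, L)                # number of leading zeros
        n1 = random.randrange(1, L - n0 + 1)       # length of the block of ones
        mask = [0] * n0 + [1] * n1 + [0] * (L - n0 - n1)   # 0..01..10..0
        return apply_mask(a, b, mask)

    def uniform(a, b):
        mask = [random.randint(0, 1) for _ in a]   # each bit chosen independently
        return apply_mask(a, b, mask)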
2. Mutation
The mutation operator produces small random changes to the bit string by choosing a single bit at random
and then flipping its value, in order to maintain genetic diversity from one generation of the population to the next.
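A corresponding sketch of point mutation, which copies the parent and flips one randomly chosen bit:

    import random

    def mutate(h):
        # Flip a single randomly chosen bit, leaving the parent unchanged.
        child = h[:]
        i = random.randrange(len(child))
        child[i] = 1 - child[i]
        return child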
The fitness function defines the criterion for ranking potential hypotheses and for probabilistically selecting
them for inclusion in the next generation population (e.g., the classification accuracy of a rule over a set of
provided training examples).
The fitness function may measure the overall performance of the resulting procedure rather than the performance of
individual rules. The probability that a hypothesis will be selected is given by the ratio of its fitness to the
total fitness of the current population, as in the formula above. This method is sometimes called fitness proportionate
selection, or roulette wheel selection.
In the tournament selection method, two hypotheses are first chosen at random from the current population. With
some predefined probability p the more fit of these two is then selected, and with probability (1 - p) the less fit
hypothesis is selected. Tournament selection often yields a more diverse population than fitness proportionate
selection.
In another method called rank selection, the hypotheses in the current population are first sorted by fitness. The
probability that a hypothesis will be selected is then proportional to its rank in this sorted list, rather than its
fitness.
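The three selection schemes can be sketched side by side; this is a minimal illustration that assumes positive
fitness values and a population of at least two (the function names are illustrative):

    import random

    def roulette(population, fitness):
        # Fitness proportionate: Pr(h) = Fitness(h) / sum of all fitnesses.
        weights = [fitness(h) for h in population]
        return random.choices(population, weights=weights, k=1)[0]

    def tournament(population, fitness, p=0.75):
        # Pick two at random; return the fitter one with probability p.
        a, b = random.sample(population, 2)
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        return winner if random.random() < p else loser

    def rank(population, fitness):
        # Probability proportional to rank in the sorted list (worst = rank 1).
        ranked = sorted(population, key=fitness)
        weights = list(range(1, len(ranked) + 1))
        return random.choices(ranked, weights=weights, k=1)[0]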
Genetic Programming:
Genetic programming (GP) is a form of evolutionary computation in which the individuals in the evolving
population are computer programs rather than bit strings. Given a set of primitive functions and terminals, the
genetic programming algorithm uses an evolutionary search to explore the vast space of programs that can be
described using these primitives.
Programs manipulated by a GP are typically represented by trees corresponding to the parse tree of the program.
Each function call is represented by a node in the tree, and the arguments to the function are given by its
descendant nodes.
Example: the function sin(x) + √(x² + y), for instance, is represented by a tree whose root node is +; one
subtree encodes sin(x), and the other applies the square-root function to a subtree encoding x² + y.
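A minimal sketch of this representation, using nested tuples (function name, children...) as the parse tree and
a small recursive evaluator (the encoding is an illustrative choice, not a fixed GP convention):

    import math

    # sin(x) + sqrt(x*x + y) as a nested-tuple parse tree: each internal
    # node is (function, child, ...); leaves are variable names.
    tree = ("+",
            ("sin", "x"),
            ("sqrt", ("+", ("*", "x", "x"), "y")))

    FUNCS = {"+": lambda a, b: a + b,
             "*": lambda a, b: a * b,
             "sin": math.sin,
             "sqrt": math.sqrt}

    def evaluate(node, env):
        # Leaves look up variable values; internal nodes apply their
        # function to the recursively evaluated children.
        if isinstance(node, str):
            return env[node]
        op, *children = node
        return FUNCS[op](*(evaluate(c, env) for c in children))

    print(evaluate(tree, {"x": 2.0, "y": 5.0}))   # sin(2) + sqrt(9) = 3.909...

Crossover on such trees typically exchanges randomly chosen subtrees between two parents, so offspring are
always syntactically valid programs.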
One interesting question regarding evolutionary systems is: "What is the relationship between learning during
the lifetime of a single individual and the longer time frame, species-level learning afforded by evolution?"
There are two models of evolution and learning:
1. Lamarckian Evolution: Lamarck was a scientist who, in the early nineteenth century, proposed that
evolution over many generations was directly influenced by the experiences of individual organisms during
their lifetime.
In particular, he proposed that the experiences of a single organism directly affected the genetic makeup of
its offspring: if an individual learned during its lifetime to avoid some toxic food, it could pass this
trait on genetically to its offspring, which therefore would not need to learn the trait. This is an attractive
conjecture, because it would allow for more efficient evolutionary progress than a generate-and-test
process (like that of GAs and GPs) that ignores the experience gained during an individual's lifetime.
Despite the attractiveness of this theory, current scientific evidence overwhelmingly contradicts
Lamarck's model. The currently accepted view is that the genetic makeup of an individual is, in fact,
unaffected by the lifetime experience of its biological parents.
Nevertheless, recent computer studies have shown that Lamarckian processes can sometimes improve the
effectiveness of computerized genetic algorithms.
2. Baldwin Effect:
The Baldwin effect (after J. M. Baldwin (1896)) is based on the following observations:
1. If a species is evolving in a changing environment, there will be evolutionary pressure to favor
individuals with the capability to learn during their lifetime. For example, if a new predator appears
in the environment, then individuals capable of learning to avoid the predator will be more
successful than individuals who cannot learn. In effect, the ability to learn allows an individual to
perform a small local search during its lifetime to maximize its fitness. In contrast, nonlearning
individuals whose fitness is fully determined by their genetic makeup will operate at a relative
disadvantage.
2. Those individuals who are able to learn many traits will rely less strongly on their genetic code to
"hard-wire" traits. As a result, these individuals can support a more diverse gene pool, relying on
individual learning to overcome the "missing" or "not quite optimized" traits in the genetic code.
This more diverse gene pool can, in turn, support more rapid evolutionary adaptation. Thus, the
ability of individuals to learn can have an indirect accelerating effect on the rate of evolutionary
adaptation for the entire population.
The Baldwin effect provides an indirect mechanism for individual learning to positively impact the rate
of evolutionary progress. By increasing survivability and genetic diversity of the species, individual
learning supports more rapid evolutionary progress, thereby increasing the chance that the species will
evolve genetic, non-learned traits that better fit the new environment.
There have been several attempts to develop computational models to study the Baldwin effect. For
example, Hinton and Nowlan (1987) experimented with evolving a population of simple neural
networks, in which some network weights were fixed during the individual network "lifetime," while
others were trainable. The genetic makeup of the individual determined which weights were trainable
and which were fixed.