GeneticAlgorithm

Main Task

Optimization is also related to StructuralHealthMonitoring

Exhaustively enumerating the search space is not a practical form of problem solving. On the other hand, any search other than random search imposes some bias in terms of how it looks for better solutions and where it looks in the search space. Genetic algorithms indeed introduce a particular bias in terms of what new points in the space will be sampled. Nevertheless, a genetic algorithm belongs to the class of methods known as weak methods in the Artificial Intelligence community because it makes relatively few assumptions about the problem that is being solved. (Darrell Whitley [1])

The Multi-Objective Optimization Problem can be formalized as follows:

$$ \text{minimize} \quad y = f(x) = (f_1(x), f_2(x), \dots, f_m(x)) $$

$$ \text{subject to} \quad e(x) = (e_1(x), e_2(x), \dots, e_k(x)) \le 0, \quad x \in S $$

involving m (≥ 2) conflicting objective functions f_i : R^n → R that we want to minimize simultaneously, where R^n is the n-dimensional real space.

The decision vectors x = (x1, x2, ..., xn) belong to the feasible region S ⊂ R^n. The feasible region is defined by the constraint functions e(x). We denote the image of the feasible region by Z ⊂ R^m and call it the feasible objective region.

The elements of Z are called objective vectors and they consist of objective (function) values

f(x) = (f1(x), f2(x), ..., fm(x)).

Pareto Sets

A decision vector x1 ∈ S is said to dominate a decision vector x2 ∈ S if both of the following conditions are true:
  1. The decision vector x1 is not worse than x2 in all objectives, i.e., fi(x1) ≤ fi(x2) for all i = 1, ..., m.
  2. The decision vector x1 is strictly better than x2 in at least one objective, i.e., fi(x1) < fi(x2) for at least one i = 1, ..., m.

A decision vector x1 ∈ S is called Pareto-optimal if there does not exist another x2 ∈ S that dominates it. Finally, an objective vector is called Pareto-optimal if the corresponding decision vector is Pareto-optimal.
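The dominance test above can be sketched directly in code; `dominates` is a hypothetical helper name, and objective vectors are plain lists of values, all to be minimized.

```python
def dominates(f1, f2):
    """True if objective vector f1 Pareto-dominates f2 (all objectives minimized)."""
    no_worse = all(a <= b for a, b in zip(f1, f2))        # condition 1
    strictly_better = any(a < b for a, b in zip(f1, f2))  # condition 2
    return no_worse and strictly_better
```

Note that two identical vectors do not dominate each other, since condition 2 fails.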

Why Genetic Algorithms

Genetic algorithms are often described as a global search method that does not use gradient information. Thus:
  • non-differentiable functions
  • functions with multiple local optima

represent classes of problems to which genetic algorithms might be applied.

They are also applied when highly non-linear objective functions are present. Genetic algorithms, as weak methods, are robust but very general. (Darrell Whitley [1])

General Algorithm:
BEGIN
     Step 1: t = 0;
     Step 2: Generate the initial population P[0] and initial archive A[0]
     Step 3: Evaluate P[t]
     Step 4: A[t+1] := Update(P[t], A[t])
     Step 5: P[t+1] := Generate(P[t], A[t])
     Step 6: t = t + 1
     Step 7: Unless a termination criterion is met, goto Step 3
END

where P[t] is the population and A[t] the archive at generation t.
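The loop above can be sketched as a minimal runnable program. Here a toy single-objective problem (minimize x**2) stands in for the real evaluation, and `evaluate`, `update_archive`, and `generate` are placeholder names for the steps, not part of any standard library.

```python
import random

def evaluate(population):
    # stand-in objective: minimize x**2
    return [x * x for x in population]

def update_archive(population, fitness, archive):
    # Step 4 stand-in: keep the best individual of this generation
    best = min(zip(fitness, population))[1]
    return archive + [best]

def generate(population, archive):
    # Step 5 stand-in: jitter each individual (a real GA applies
    # selection, crossover, and mutation here)
    return [x + random.uniform(-0.1, 0.1) for x in population]

random.seed(0)
P = [random.uniform(-1.0, 1.0) for _ in range(10)]  # Step 2: initial P[0]
A = []                                              # Step 2: initial A[0]
for t in range(20):                                 # Step 7: fixed-budget termination
    F = evaluate(P)                                 # Step 3
    A = update_archive(P, F, A)                     # Step 4
    P = generate(P, A)                              # Step 5
best_fitness = min(evaluate(A))
```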

Initialize Population

Each element of the design set represents an "Individual" which, from a coding point of view, is represented by a string of length L. Each string is sometimes referred to as a "genotype" (Holland, 1975) or, alternatively, a "chromosome" (Schaffer, 1987). In most cases the initial population is generated randomly. After creating an initial population, each string is then evaluated and assigned a fitness value [1].

Binary coding of real value variables

A binary string may also be used to represent a real value. The decoding of a binary string to a real value is a two-step process. First, the binary string is decoded into an integer k; next, the integer is mapped from the discrete interval [a, b] to the real interval [L, U] using the formula

x = (k - a) * (U - L) / (b - a) + L

With N bits, k spans the integer interval [0, 2**N - 1], i.e. a = 0 and b = 2**N - 1.
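A minimal sketch of this decoding, assuming a = 0 and b = 2**N - 1 so the formula reduces to x = L + k * (U - L) / (2**N - 1); `decode` is a hypothetical helper name.

```python
def decode(bits, low, up):
    """Map a list of 0/1 bits to a real value in [low, up]."""
    n = len(bits)
    k = int("".join(map(str, bits)), 2)         # step 1: bits -> integer in [0, 2**n - 1]
    return low + k * (up - low) / (2 ** n - 1)  # step 2: integer -> real in [low, up]
```

The two extreme strings map exactly onto the interval endpoints.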

Evaluation and fitness

The evaluation process involves some simulation that assesses the performance of the genotype. In structural optimization this means evaluating the structural response. Within the set of allowable structural responses, the different values of the objective functions allow a fitness to be assigned to each genotype.
The evaluation function, or objective function, provides a measure of performance with respect to a particular set of parameters. The fitness function transforms that measure of performance into an allocation of reproductive opportunities.

For each member (genotype) of the population a mechanical analysis is performed to evaluate the candidate design and determine desired quantities, such as (only as an example):
  1. peak effective stress
  2. factor of safety for a particular set of loads and boundary conditions
  3. robustness
  4. reliability index

These outputs are then utilized to create and assign the values of the objective functions, f(X), and constraint violations, c(X), to the individual. In the case of multiple constraint violations, the violations can be linearly combined into a single number associated with the overall violation.
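As an illustration of that linear combination, assuming constraints of the form c_i(x) ≤ 0 so that only positive values count as violations (`total_violation` is a hypothetical helper name; per-constraint weights could also be added):

```python
def total_violation(c):
    """Combine constraint values c_i(x) <= 0 into a single violation measure:
    satisfied constraints (c_i <= 0) contribute nothing."""
    return sum(max(0.0, ci) for ci in c)
```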

Pareto Dominance

Zitzler and Thiele [2] suggested an elitist multi-criterion EA with the concept of non-domination in their Strength Pareto EA (SPEA). They suggested maintaining an external population at every generation, storing all non-dominated solutions discovered so far starting from the initial population.

This external population participates in genetic operations. At each generation, a combined population with the external and the current population is first constructed. All non-dominated solutions in the combined population are assigned a fitness based on the number of solutions they dominate, and dominated solutions are assigned a fitness worse than the worst fitness of any non-dominated solution. This assignment of fitness makes sure that the search is directed towards the non-dominated solutions. A deterministic clustering technique is used to limit the size of the external population while preserving the diversity of the stored non-dominated solutions.

Performance evaluation might be very expensive, so some authors try to exploit fitness inheritance in a multi-objective optimization algorithm (e.g. NSGA-II) by substituting the expensive real evaluations (calculation of the true structural response) with estimates based on closeness in the design space. The estimates rely on the distance between individuals and on a weighted average of the fitnesses of the closest ones.

The last approach cannot be used when strong discontinuities are present in the objective functions.

A fast non-dominated sorting routine

In multi-objective evolutionary algorithms (MOEAs) with elitism, the data structures for storing and updating archives may have a great impact on the required computational (CPU) time, especially when optimizing higher dimensional problems with large Pareto-sets.

Following the definition by Deb [7], an individual X(n) is said to constraint-dominate an individual X(m) if any of the following conditions is true:

  1. X(n) and X(m) are feasible, with
    1. X(n) no worse than X(m) in all objectives, and
    2. X(n) strictly better than X(m) in at least one objective.
  2. X(n) is feasible while individual X(m) is not.
  3. X(n) and X(m) are both infeasible, but X(n) has a smaller constraint violation.
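Deb's constraint-domination above can be sketched as follows, assuming each individual carries its objective vector and a single aggregated constraint violation (0 meaning feasible); `constrain_dominates` is a hypothetical name.

```python
def constrain_dominates(f1, cv1, f2, cv2):
    """f1, f2: objective vectors (minimized); cv1, cv2: total constraint
    violations, where 0 means feasible."""
    if cv1 == 0 and cv2 == 0:                           # case 1: both feasible
        no_worse = all(a <= b for a, b in zip(f1, f2))
        better = any(a < b for a, b in zip(f1, f2))
        return no_worse and better
    if cv1 == 0:                                        # case 2: only X(n) feasible
        return True
    if cv2 == 0:
        return False
    return cv1 < cv2                                    # case 3: both infeasible
```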

Selection

Selection is performed using probabilities (roulette wheel): the selection probability of an individual is proportional to its fitness. Each random "spin" selects an individual that will become a parent; a higher fitness value results in a higher probability of becoming a parent.
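A minimal sketch of one roulette-wheel spin, assuming non-negative fitness values (`roulette_select` is a hypothetical name):

```python
import random

def roulette_select(fitness, rng=random):
    """Return the index of one parent; probability of picking index i
    is fitness[i] / sum(fitness)."""
    r = rng.uniform(0.0, sum(fitness))  # one spin of the wheel
    acc = 0.0
    for i, f in enumerate(fitness):
        acc += f
        if r <= acc:
            return i
    return len(fitness) - 1  # guard against floating-point round-off
```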

Crowded Comparison Operator (Density Estimation)

One of the goals in developing a genetic algorithm is to ensure population diversity. One way to achieve it is to select individuals that lie in less crowded areas of the objective space. To this aim the Crowding Distance (CD) has been defined (Deb):
For a member of the non-dominated set, CD is calculated by finding the distance between the two nearest solutions on either side of the member along each of the objectives. These distances are normalized by dividing them by the difference between the maximum and minimum values of the corresponding objectives, and the normalized distances are then summed up, giving a CD for the corresponding member. Those members of the non-dominated set which have the maximum or minimum value for any objective are assigned an infinite CD. Finally, the members of the non-dominated set are sorted in monotonically decreasing order according to their CDs, and a desired number of members having the largest CD values are selected.
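The CD computation described above can be sketched as follows for a list of objective vectors forming one non-dominated front (`crowding_distance` is a hypothetical name):

```python
def crowding_distance(front):
    """front: list of objective vectors; returns one CD per member (Deb's scheme)."""
    n, m = len(front), len(front[0])
    cd = [0.0] * n
    for j in range(m):
        order = sorted(range(n), key=lambda i: front[i][j])  # sort along objective j
        fmin, fmax = front[order[0]][j], front[order[-1]][j]
        cd[order[0]] = cd[order[-1]] = float("inf")          # boundary members
        if fmax == fmin:
            continue                                         # degenerate objective
        for pos in range(1, n - 1):
            gap = front[order[pos + 1]][j] - front[order[pos - 1]][j]
            cd[order[pos]] += gap / (fmax - fmin)            # normalized distance
    return cd
```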

Recombination

The first step in creating an offspring population for a multi-objective GA is to construct a mating pool. Once finished, the mating pool will contain M individuals copied from the parent population; these will be utilized to create the offspring population. To build the mating pool, we utilize a crowded tournament selection operator: two solutions from the parent population are selected at random to compete in a tournament. A solution X(n) wins the tournament over a solution X(m) if either

  1. solution X(n) has a better (lower) rank than X(m), or
  2. solutions X(n) and X(m) have the same rank, but solution X(n) has a larger crowding distance.

The winner of the tournament has a copy of itself placed in the mating pool. This process is repeated until each solution in the parent population has competed twice and all the spots in the mating pool have been filled. The resulting mating pool contains more copies of the more desirable solutions in the parent population and fewer copies of the less desirable ones.
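One tournament of the crowded selection described above might look like this, given precomputed ranks and crowding distances (`crowded_tournament` is a hypothetical name):

```python
def crowded_tournament(a, b, rank, cd):
    """Return the winning index: lower rank wins; ties are broken by
    larger crowding distance."""
    if rank[a] != rank[b]:
        return a if rank[a] < rank[b] else b
    return a if cd[a] >= cd[b] else b
```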

Recombination can be performed using a crossover operator. One-point crossover cuts the genotype at one point, recombining two parents to obtain two offspring. The portion of the genotype that is exchanged clearly depends on the cut position.
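A sketch of one-point crossover on two equal-length genotypes (`one_point_crossover` is a hypothetical name):

```python
import random

def one_point_crossover(p1, p2, rng=random):
    """Cut both parents at one random point and swap the tails."""
    cut = rng.randrange(1, len(p1))  # cut strictly inside the string
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
```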

Mutation

The motivation for using mutation, then, is to prevent the permanent loss of any particular bit or allele in the string code.

After several generations it is possible that selection will drive all the bits in some position to a single value: either 0 or 1. If this happens without the genetic algorithm converging to a satisfactory solution, then the algorithm has prematurely converged. This may particularly be a problem if one is working with a small population. Without a mutation operator, there is no possibility of reintroducing the missing bit value.
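A sketch of bit-flip mutation, applied independently to every bit with a small probability (`mutate` is a hypothetical name):

```python
import random

def mutate(bits, p_mut, rng=random):
    """Flip each bit independently with probability p_mut, so a lost
    allele can always be reintroduced."""
    return [1 - b if rng.random() < p_mut else b for b in bits]
```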

Update Population

In the classical model the offspring population forms the population at time t+1. Some improvement can be obtained using elitism: the two populations are mixed, then the best N individuals are chosen to constitute the new population. With the mating pool complete, the task of constructing the offspring population can commence. To generate two offspring solutions, one begins by selecting two individuals from the mating pool at random. Once selected, the two individuals undergo crossover and mutation to create two offspring solutions, which are placed in the offspring population. As an example, simulated binary crossover (SBX) and polynomial mutation operators can be used to create the child solutions [7].

For example, in NSGA-II one first combines both the parent population and the offspring population together to form an intermediate population of size 2M. The entire combined population is then assigned a new non-dominated rank and is sorted in order of increasing rank. Then, each member of the entire population is assigned a new crowding distance, CD, and is sorted within each rank from largest to smallest crowding distance. The final result is a population of size 2M that is sorted first in order of increasing rank, and then, within each rank, in order of decreasing crowding distance. The top M solutions are then selected as the next generation, since these desirable solutions possess the lowest ranks; if the M-th solution falls within a rank, the solutions within that rank with the largest crowding distances are chosen for the next generation. Of all the parent solutions and offspring solutions, only the best M are kept for the next generation.
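Given the non-dominated ranks and crowding distances of the combined 2M individuals, the elitist truncation step described above reduces to one sort (`select_next_generation` is a hypothetical name; a full NSGA-II implementation would also compute the ranks and distances themselves):

```python
def select_next_generation(indices, rank, cd, M):
    """Sort by (rank ascending, crowding distance descending) and keep the top M."""
    order = sorted(indices, key=lambda i: (rank[i], -cd[i]))
    return order[:M]
```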

Test Problem

Some implementation example HERE

References

[1] D. Whitley, "A Genetic Algorithm Tutorial", Computer Science Department, Colorado State University, Fort Collins, CO 80523.

[2] E. Zitzler and L. Thiele, "Multiobjective optimization using evolutionary algorithms - A comparative case study", in A. E. Eiben, T. Bäck, M. Schoenauer, and H.-P. Schwefel (editors), Parallel Problem Solving from Nature V, pages 292-301, Springer, Berlin, Germany, 1998.

[3] K. Deb, S. Agrawal, A. Pratap, and T. Meyarivan, "A Fast Elitist Non-Dominated Sorting Genetic Algorithm for Multi-Objective Optimization: NSGA-II", KanGAL Report No. 200001.

[6] S. Mostaghim and J. Teich, "Quad-trees: A Data Structure for Storing Pareto-sets in Multi-objective Evolutionary Algorithms with Elitism".

[7] K. Deb, "Multi-Objective Optimization Using Evolutionary Algorithms", John Wiley & Sons, Ltd., Chichester, 2001.

[8] S. Kukkonen and K. Deb, "Improved Pruning of Non-Dominated Solutions Based on Crowding Distance for Bi-Objective Optimization Problems", Kanpur Genetic Algorithms Laboratory (KanGAL), Indian Institute of Technology Kanpur, PIN 208 016, India, KanGAL Report Number 2006007.

-- RobertoBernetti - 06 Mar 2009
Topic revision: r2 - 26 Feb 2017, RobertoBernetti
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.