
2009 International Conference on Information and Multimedia Technology

A Two-Stage Evolutionary Algorithm with Variable Mutation Intervals for Solving Optimization Problems

Yunhao Li
School of Information Engineering
Jiangxi University of Science & Technology
Ganzhou, China
[email protected]

Shuting Chen
School of Architecture & Survey Engineering
Jiangxi University of Science & Technology
Ganzhou, China
[email protected]

Abstract—A new evolutionary algorithm called Two-stage Evolutionary Algorithm with Variable Mutation Intervals (TEAVMI) is proposed in this paper. TEAVMI improves the performance of traditional EAs by introducing new evolutionary operators and has several new features: it introduces a multi-parent crossover operator with elite preservation, develops a dynamical mutation operator and a space contraction operator, and adopts a new two-stage algorithm framework. Simulation results on some typical test problems show that TEAVMI obtains more accurate solutions than existing traditional evolutionary algorithms.

Keywords-evolutionary algorithm; function optimization; multi-parent crossover; variable mutation intervals

I. INTRODUCTION

An evolutionary algorithm (EA) [1] is a heuristic that mimics the evolution of natural species in searching for the optimal solution to a problem. It is based on Darwin's theory of the survival of the fittest and Mendel's theory of heredity. It is a search algorithm that locates optimal solutions by processing an initially random population of individuals with artificial mutation, crossover and selection operations, in analogy with the process of natural selection. EAs are commonly divided into four branches: Genetic Algorithms (GA), Evolution Strategies (ES), Evolutionary Programming (EP), and Genetic Programming (GP). EAs have been applied effectively to complicated problems that are almost impossible for traditional optimization algorithms to handle.

Though EAs have shown advantages in solving complex optimization problems in many fields, they still have three main shortcomings that sometimes restrict their use: 1) premature convergence; 2) a large computational burden combined with low solution precision; and 3) the lack of a proper stopping criterion and the difficulty of selecting operation parameters [2]. Because of these defects, EAs often obtain only local optimal solutions when solving complex optimization problems.

Some useful strategies should therefore be adopted to improve the performance of traditional EAs. In the field of EAs, evolutionary operators are seen as one of the key factors that influence the performance of the algorithms. The literature on ES and EP has stressed the effectiveness of mutation operators [3], while researchers in the GA community have emphasized the efficiency of crossover operators, and a lot of work has been done to analyze their performance [4]. Schaffer and Eshelman in Ref. [5] concluded that a single variation operator alone is not always sufficient. It is therefore of great importance to design new operators to improve the performance of algorithms, which is also the focus of our study.

In this paper a new evolutionary algorithm called Two-stage Evolutionary Algorithm with Variable Mutation Intervals (TEAVMI) is proposed. TEAVMI improves the performance of traditional EAs by introducing new evolutionary operators: a multi-parent crossover operator with elite preservation, a dynamical mutation operator with variable mutation intervals, a population re-initialization operator, and a new two-stage algorithm framework. Simulation results on some typical test problems show that TEAVMI outperforms existing traditional evolutionary algorithms in the accuracy of solutions.

The remainder of this paper is organized as follows. Section II discusses the proposed approach in detail. Section III presents the simulation experiments, followed by a brief conclusion in Section IV.

II. THE APPROACH

This section shows in detail how TEAVMI is constructed. TEAVMI is an improved version of the traditional evolutionary algorithm: it introduces or develops a few novel operations and uses a two-stage algorithm framework. In the following subsections, TEAVMI is elaborated in four aspects: 1) a new crossover operator, the multi-parent crossover with elite preservation; 2) a new mutation operator, the dynamical mutation operator with variable mutation intervals; 3) a population re-initialization operator; and 4) a two-stage algorithm framework.

A. Multi-Parent Crossover with Elite Preservation

By multi-parent crossover we mean that more than two parents are involved in the process of generating offspring. Guo et al. in Ref. [6] proposed a highly efficient multi-parent crossover operator, and experimental results show that it performs better than many other crossover operators.
In this paper, a multi-parent crossover operator with an elite-preservation strategy is proposed. That is to say, in the crossover operation the elite individual with the best fitness is always chosen to participate. The new crossover operator can be described as:

X_new = \sum_{i=1}^{k} a_i X_i + \sum_{i=k+1}^{M} a_i X_i    (1)

where X_new is the newly produced individual; M is the number of individuals selected to participate in the crossover operation; X_i (i = 1, 2, ..., k) are the top k elite individuals with relatively better fitness in the current population; X_i (i = k+1, k+2, ..., M) are the other individuals among the selected M individuals; and a_i (i = 1, 2, ..., M) is a random value between 0 and 1.

In TEAVMI, we use Guo's crossover operator with the elite-preservation strategy to make full use of the good information in the current population and to converge quickly. The value of k is determined by M to a certain degree, i.e., the larger M is, the larger k can be. If the newly generated individual Xnew is better than the worst individual Xworst, then Xworst is replaced by Xnew.

In our previous work, Ref. [7], this operator was also introduced to improve the performance of our algorithm.
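As an illustration, formula (1) together with the replace-the-worst rule might be coded as in the minimal Python sketch below. This is not the authors' implementation: the way the M - k non-elite parents are sampled (uniformly at random from the rest of the population) and the minimization convention (lower fitness is better) are assumptions.

import random

def multi_parent_crossover(population, fitness, M, k):
    # Rank individuals by fitness (minimization: lower is better); the top-k
    # elites always take part in the crossover (elite preservation).
    order = sorted(range(len(population)), key=lambda i: fitness[i])
    elites = [population[i] for i in order[:k]]
    # Assumption: the remaining M - k parents are drawn at random from the rest.
    others = random.sample([population[i] for i in order[k:]], M - k)
    parents = elites + others
    # Random coefficients a_i in (0, 1), one per parent, as in formula (1).
    a = [random.random() for _ in range(M)]
    dim = len(parents[0])
    return [sum(a[i] * parents[i][j] for i in range(M)) for j in range(dim)]

def replace_worst_if_better(population, fitness, x_new, f):
    # The offspring replaces the worst individual only when it improves on it.
    worst = max(range(len(population)), key=lambda i: fitness[i])
    f_new = f(x_new)
    if f_new < fitness[worst]:
        population[worst], fitness[worst] = x_new, f_new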
B. Mutation Operator with Variable Mutation Intervals

Traditional EAs usually apply a single type of mutation after the crossover operation, and if the newly generated individual is better than the old one, it replaces the old one. TEAVMI, however, takes a two-stage strategy for the mutation operation: (1) an early evolution stage, and (2) a later evolution stage.

In the early evolution stage, we first select the worst individual Xworst and the best individual Xbest in the current population, and apply the mutation operation to every individual Xi in the population except Xbest according to formula (2), where a is a random number between 0 and 1. If the newly generated individual Xi' is better than the old one Xi, Xi' replaces Xi.

X_i' = X_i + a (X_best - X_worst)    (2)

In the later evolution stage, we use formula (3) to perform the mutation operation:

X_i' = X_i + a (X_best - X_sBest)    (3)

where XsBest is the second best individual in the current population, and the meaning of the other notations in formula (3) is the same as in formula (2).

The two types of mutation operators proposed above can obviously improve the precision of the solutions of the traditional evolutionary algorithm. The term Xbest - Xworst or Xbest - XsBest in the above formulas varies with the generation, i.e., the radius of the mutation varies, so we call it the mutation operation with variable mutation intervals. The critical generation CGen between the early evolution stage and the later evolution stage is determined by the total generation MaxGen to a certain degree; it is one of the parameters set manually. This operator is an improved version of the mutation operator proposed in our previous work, Ref. [7]. Here we use two different operators to do the mutation operation.
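A small Python sketch of the two mutation rules is given below. It follows the description above (mutate every individual except Xbest and keep a mutant only if it improves); treating fitness as a value to be minimized and drawing a fresh random a for each individual are assumptions.

import random

def mutate_population(population, fitness, f, stage):
    # Identify the best, second best and worst individuals (minimization).
    order = sorted(range(len(population)), key=lambda i: fitness[i])
    best, sbest, worst = order[0], order[1], order[-1]
    for i in range(len(population)):
        if i == best:
            continue  # the best individual is left unchanged
        a = random.random()  # random number in (0, 1)
        if stage == "early":
            # Formula (2): X_i' = X_i + a (X_best - X_worst)
            step = [a * (b - w) for b, w in zip(population[best], population[worst])]
        else:
            # Formula (3): X_i' = X_i + a (X_best - X_sBest)
            step = [a * (b - s) for b, s in zip(population[best], population[sbest])]
        candidate = [x + d for x, d in zip(population[i], step)]
        f_cand = f(candidate)
        if f_cand < fitness[i]:  # accept only improving mutants
            population[i], fitness[i] = candidate, f_cand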
C. Population Re-initialization

After the evolution of the EA enters the later period (i.e., the current generation Gen > CGen), the real search space in which the global optimum exists is much smaller than the original search space, so we re-initialize the population. When re-initializing, we keep the top v individuals with the best fitness and regenerate the remaining popsize - v individuals (popsize is the population size). That is to say, if each element Xij of the ith individual Xi belongs to (L_j, U_j), we regenerate the remaining popsize - v individuals with each element belonging to (L_j/2, U_j/2).
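The re-initialization operator might look as follows in Python. The per-dimension bound vectors lower and upper are introduced here only for illustration; the halving of the interval follows the text above.

import random

def reinitialize(population, fitness, v, lower, upper, f):
    # Keep the top-v individuals (minimization: lower fitness is better) ...
    popsize, dim = len(population), len(population[0])
    order = sorted(range(popsize), key=lambda i: fitness[i])
    kept = [population[i] for i in order[:v]]
    # ... and regenerate the remaining popsize - v individuals inside the
    # contracted interval (L_j/2, U_j/2) in every dimension.
    regenerated = [[random.uniform(lower[j] / 2.0, upper[j] / 2.0) for j in range(dim)]
                   for _ in range(popsize - v)]
    population[:] = kept + regenerated
    fitness[:] = [f(x) for x in population]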
D. The Two-Stage Algorithm Framework

In TEAVMI we use a two-stage algorithm framework, which is very similar to the one we proposed in Ref. [8]. Figure 1 gives a short overview of the workflow of the proposed approach. It starts from an initial population, followed by fitness calculation. After the fitness calculation, we select M individuals from the current population, with k individuals ranked in the top k according to their fitness and M - k selected randomly. Then we perform the multi-parent crossover operation with elite preservation according to formula (1) and the mutation operation with variable mutation intervals according to formula (2), as described above. After that, if the current generation Gen is not larger than the critical generation CGen, the algorithm goes back to the fitness calculation; otherwise, it performs the re-initialization operation. It then performs fitness calculation, the new crossover operation, and the new mutation according to formula (3). After that, if Gen is not larger than the maximum generation MaxGen, it goes back to the fitness calculation of the second stage; otherwise, it outputs the solutions.

[Figure 1. Workflow of TEAVMI. The first stage (population initialization, fitness calculation, new crossover, new mutation, looping until Gen > CGen) is followed by the second stage (re-initialization, fitness calculation, new crossover, new mutation, looping until Gen > MaxGen, then printing the solutions).]

E. The Procedure of TEAVMI

Step 1: Initialize the population {x_1, x_2, ..., x_popsize} with popsize individuals at random; set CGen, MaxGen, M, k, v, and Gen := 0.

Step 2: Calculate the fitness of each individual in the population, and perform the multi-parent crossover operation according to formula (1).

Step 3: If Gen < CGen, perform the mutation operation with variable mutation intervals according to formula (2), set Gen := Gen + 1, and go back to Step 2; otherwise, perform the mutation operation according to formula (3), set Gen := Gen + 1, and go to Step 4.

Step 4: If Gen < MaxGen, calculate the fitness of each individual in the population, perform the multi-parent crossover operation and the mutation operation according to formula (3), set Gen := Gen + 1, and go back to Step 4; otherwise, set Gen := Gen + 1 and go to Step 5.

Step 5: Output the solutions.
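Putting the pieces together, the procedure above might be driven by a loop such as the following Python sketch. It reuses the helper functions sketched in Sections II.A-II.C; the exact point at which re-initialization is triggered (immediately after generation CGen, as suggested by Figure 1) and the bound vectors lower and upper are assumptions for illustration.

import random

def teavmi(f, lower, upper, popsize=100, M=8, k=3, v=10, cgen=5000, maxgen=10000):
    # Step 1: random initial population and parameter setting.
    dim = len(lower)
    population = [[random.uniform(lower[j], upper[j]) for j in range(dim)]
                  for _ in range(popsize)]
    fitness = [f(x) for x in population]
    for gen in range(maxgen):
        # Steps 2 and 4: multi-parent crossover with elite preservation, formula (1).
        child = multi_parent_crossover(population, fitness, M, k)
        replace_worst_if_better(population, fitness, child, f)
        # Step 3: variable-interval mutation, formula (2) in the first stage
        # and formula (3) in the second stage.
        stage = "early" if gen < cgen else "late"
        mutate_population(population, fitness, f, stage)
        # Assumption: the second stage begins with a re-initialization
        # (Section II.C, Figure 1).
        if gen == cgen:
            reinitialize(population, fitness, v, lower, upper, f)
    # Step 5: output the best solution found.
    best = min(range(popsize), key=lambda i: fitness[i])
    return population[best], fitness[best]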
III. NUMERICAL EXPERIMENTS

In this section the performance of TEAVMI is tested on some benchmark function optimization problems that are widely used in the literature. These functions are all chosen from Ref. [9].

A. Experiment 1

min f_1(x_1, x_2) = (4 - 2.1 x_1^2 + \tfrac{1}{3} x_1^4) x_1^2 + x_1 x_2 + (-4 + 4 x_2^2) x_2^2

-3 \le x_1 \le 3,  -2 \le x_2 \le 2

To the best of our knowledge, the best solution of f_1 known so far, -1.031628, is obtained at the point (-0.089842, 0.712656) or (0.089842, -0.712656).
B. Experiment 2

min f_2(x, y) = [1 + (x + y + 1)^2 (19 - 14x + 3x^2 - 14y + 6xy + 3y^2)] [30 + (2x - 3y)^2 (18 - 32x + 12x^2 + 48y - 36xy + 27y^2)]

-2 \le x, y \le 2

The function has a global minimum value of 3 at the point (x, y) = (0, -1).

C. Experiment 3

min f_3(x_1, x_2) = 100 (x_1^2 - x_2)^2 + (1 - x_1)^2

-2.048 \le x_1 \le 2.048,  -2.048 \le x_2 \le 2.048

The function has a global minimum value of 0 at the point (x_1, x_2) = (1, 1).
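For convenience, the three benchmark functions (commonly known as the six-hump camel back, Goldstein-Price, and Rosenbrock functions) can be written directly in code; the checks at the end reproduce the optima quoted above. This is an illustrative sketch only; the commented call shows how such a function could be plugged into the driver sketched after Section II.E.

def f1(x1, x2):
    # Experiment 1: six-hump camel back function.
    return (4 - 2.1 * x1**2 + x1**4 / 3.0) * x1**2 + x1 * x2 + (-4 + 4 * x2**2) * x2**2

def f2(x, y):
    # Experiment 2: Goldstein-Price function.
    a = 1 + (x + y + 1)**2 * (19 - 14*x + 3*x**2 - 14*y + 6*x*y + 3*y**2)
    b = 30 + (2*x - 3*y)**2 * (18 - 32*x + 12*x**2 + 48*y - 36*x*y + 27*y**2)
    return a * b

def f3(x1, x2):
    # Experiment 3: Rosenbrock function.
    return 100 * (x1**2 - x2)**2 + (1 - x1)**2

print(round(f1(0.089842, -0.712656), 6))  # approximately -1.031628
print(round(f2(0.0, -1.0), 6))            # 3.0
print(round(f3(1.0, 1.0), 6))             # 0.0

# Example (assumed wiring, not from the paper):
# best_x, best_f = teavmi(lambda x: f3(x[0], x[1]),
#                         lower=[-2.048, -2.048], upper=[2.048, 2.048])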
Table I shows the parameter setting of TEAVMI when solving the three complex function optimization problems.

TABLE I. PARAMETER SETTING

Parameter    Value      Parameter    Value
popsize      100        k            3
M            8          MaxGen       10,000
CGen         5,000      v            10

The comparison with other methods is presented in Table II. It is clear that TEAVMI obtains the optimal solutions of all three test functions, and they are better than those reported in the literature. Our results suggest that the strategies proposed in this paper have a significant impact on the performance of the algorithm.

TABLE II. COMPARISON DATA

Experiment    Method    Best Solution
f1            IQGA      -1.03152343
              NQGA      -1.03158079
              TEAVMI    -1.0316284
f2            IQGA      3.000362450
              NQGA      3.000121210
              TEAVMI    2.9999999999999
f3            IQGA      0.00058494
              NQGA      0.00043370
              TEAVMI    0

IV. CONCLUSIONS

In order to improve the performance of traditional EAs in solving complex function optimization problems, a new evolutionary algorithm, TEAVMI, is proposed. It introduces a multi-parent crossover operator, develops a new mutation operator with variable mutation intervals, and adopts a two-stage algorithm framework. The simulation results on some typical test problems show that the proposed algorithm is better than existing evolutionary algorithms in accuracy.

REFERENCES

[1] Z. Pan, L. Kang, and Y. Chen, Evolutionary Computation. Tsinghua University Press, 1998. (In Chinese)
[2] K. Li, W. Pan, Z. Chen, et al., "A Bionics Evolutionary Algorithm Based on Average Vector Deviation," Proceedings of the 4th International Conference on Fuzzy Systems and Knowledge Discovery (FSKD 07), IEEE Press, Dec. 2007, pp. 386-390.
[3] T. Baeck, F. Hoffmeister, and H. Schwefel, "A Survey of Evolution Strategies," Proceedings of the 4th International Conference on Genetic Algorithms and Their Applications, Morgan Kaufmann, CA, 1991.
[4] W. M. Spears and K. A. De Jong, "On the Virtues of Parameterized Uniform Crossover," Proceedings of the 4th International Conference on Genetic Algorithms and Their Applications, Morgan Kaufmann, CA, 1991.
[5] J. D. Schaffer and L. J. Eshelman, "On Crossover as an Evolutionarily Viable Strategy," Proceedings of the 4th International Conference on Genetic Algorithms and Their Applications, Morgan Kaufmann, CA, 1991.
[6] T. Guo, Evolutionary Algorithms and Optimization. Wuhan University, 1999. (In Chinese)
[7] Yunhao Li and Shuting Chen, "A Multi-stage Evolutionary Algorithm for Solving Complex Optimization Problems," unpublished.
[8] Shuting Chen and Yunhao Li, "A Novel Two-Level Evolutionary Algorithm for Solving Constrained Function Optimization," unpublished.
[9] Gexiang Zhang, Na Li, Weidong Jin, et al., "A Novel Quantum Genetic Algorithm and its Application," Acta Electronica Sinica, 32(3), pp. 476-479.
