Hong Zhang, Masumi Ishikawa
Neural Networks 23 (2010) 510–516
✩ This paper was originally presented at ICANN2008 (Zhang & Ishikawa, 2008); this is a substantially extended version.

1. Introduction

Particle swarm optimization (PSO) is an adaptive, stochastic, population-based optimization method proposed by Kennedy and Eberhart, inspired by the social behavior of flocks of birds and schools of fish (Eberhart & Kennedy, 1995; Kennedy & Eberhart, 1995). Its major features include particle dynamics, information exchange, intrinsic memory, and directional search. Owing to its ease of understanding and implementation and its search ability, PSO has been widely applied to large-scale, highly nonlinear, and complex optimization problems (Eberhart, Shi, & Kennedy, 2001; Poli, Kennedy, & Blackwell, 2007; Reyes-Sierra & Coello, 2006).

Like other optimization methods, an essential issue in PSO is efficiency in solving diverse optimization problems. To this end, we believe that systematic selection of the values of parameters in PSO is fundamental and most important. Generally speaking, the selection process may be regarded as meta-optimization, in the sense that it forms an outer loop around PSO in the inner loop. We previously proposed evolutionary particle swarm optimization (EPSO) by combining PSO and a real-coded genetic algorithm (Zhang & Ishikawa, 2007, 2008).

During the last decade, many variants of the original PSO have been proposed, such as the particle swarm optimizer with inertia (Shi & Eberhart, 1998), the canonical particle swarm optimizer (CPSO) (Clerc & Kennedy, 2002), the fully informed particle swarm (FIPS) (Kennedy & Mendes, 2002), the gregarious particle swarm optimizer (G-PSO) (Pasupuleti & Battiti, 2000), the adaptive hierarchical particle swarm optimizer (H-PSO) (Janson & Middendorf, 2005), and dissipative particle swarm optimization (DPSO) (Xie, Zhang, & Yang, 2002). Although the mechanisms of these methods are simple, with only a few parameters to adjust, they achieve better search performance than other conventional approaches such as machine learning, neural network learning, genetic algorithms, and simulated annealing (Abraham, Guo, & Liu, 2006; Kennedy, 2006).

The CPSO is a constricted version of the original PSO: it decreases the amplitude of oscillation of particles through a constriction coefficient, resulting in quick convergence in search. Since the values of parameters in the CPSO are based on an analysis of the dynamics of the original PSO (Clerc & Kennedy, 2002), the CPSO has become a highly influential method in PSO research.

Despite recent efforts in theoretical analysis (Campana, Fasano, Peri, & Pinto, 2006; Clerc, 2006; Trelea, 2003), the values of parameters in the CPSO still remain empirical, because the behavior of a swarm of particles is highly stochastic and dynamic. A major difficulty of the CPSO is that knowledge of the search space, such as whether it is unimodal or multimodal, is hard to acquire yet is required to guide a swarm of particles.
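Since the constriction mechanism of the CPSO is central to what follows, a minimal sketch of the constricted update of Clerc and Kennedy (2002) is given below. The function names and array shapes are illustrative choices of this edit, not code from the paper; the relation between χ and φ = c1 + c2 (valid for φ > 4) is the standard one from that analysis, and φ = 4.1 recovers the conventional values χ ≈ 0.7298, c1 = c2 = 2.05 referred to in Section 5.4.

import numpy as np

def chi_from_phi(phi):
    # Clerc-Kennedy constriction coefficient, defined for phi = c1 + c2 > 4;
    # phi = 4.1 gives the familiar chi ~ 0.7298.
    return 2.0 / abs(2.0 - phi - np.sqrt(phi * phi - 4.0 * phi))

def cpso_step(x, v, pbest, gbest, chi=0.7298, c1=2.05, c2=2.05, rng=None):
    # One constricted (canonical) PSO step for a swarm of shape (P, N):
    #   x, v   : current positions and velocities
    #   pbest  : each particle's best position found so far
    #   gbest  : best position found by the whole swarm
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    v_new = chi * (v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x))
    return x + v_new, v_new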
Fig. 2 illustrates the basic architecture of the ECPSO, which optimizes the values of parameters in the CPSO by evolutionary computation. It is composed of two loops: an outer loop and an inner loop. The inner loop runs the CPSO, which solves a given real-valued optimization problem with a particular set of parameter values in the CPSO. The outer loop runs RGA/E, which simulates the survival of the fittest among individuals (an individual being a set of parameter values) over generations to find the best parameter values, i.e., χ, c1, and c2, in the CPSO, by genetic operations.

The genetic operations adopted in RGA/E are roulette wheel selection, BLX-α crossover (Eshelman & Schaffer, 1993), random mutation, non-redundant search, ranking operation, and mixing operation, for efficiently obtaining the optimized parameter values in the CPSO for a given optimization problem. The i-th individual, e_i, in a population is represented by an N-dimensional vector of real numbers, e_i = (e_{i1}, e_{i2}, ..., e_{iN}).

Roulette wheel selection: The probability of selecting an individual, e_i, is expressed by

p[\vec{e}_i] = \frac{g(\vec{e}_i)}{\sum_{m=1}^{M} g(\vec{e}_m)},

where g(·) is the fitness value of the individual e_i, and M is the number of individuals in the population.

BLX-α crossover: This reinitializes the values of the offspring e_{ij} with values from an extended range given by the parents (e_i, e_j), where e_k^{ij} is drawn from U(e_{k,min}^{ij} − αI, e_{k,max}^{ij} + αI), with e_{k,min}^{ij} = min(e_k^i, e_k^j), e_{k,max}^{ij} = max(e_k^i, e_k^j), and I = e_{k,max}^{ij} − e_{k,min}^{ij}; U[a, b] denotes the uniform probability distribution on [a, b]. The parameter α is a predetermined constant.

Random mutation: For a randomly chosen chromosome k of an individual e_i, the allele e_k^i is perturbed by a randomly chosen value Δe_k ∼ U[−Δe_b, Δe_b], where Δe_b is the given boundary value for the mutation. The allele e_k^{i'} of the offspring e_{i'} is therefore obtained as e_k^{i'} = e_k^i + Δe_k^i.

Ranking operation: For executing an elitism strategy, the individuals in the current population, P_t, are sorted according to their fitness. Let the sorted population be P''_t.

Non-redundant search: Delete redundant individuals to increase the probability of finding the optimal values of parameters. Let the new population with additional random individuals be P'''_t.

Mixing operation: Let the population at the next generation, P_{t+1}, be formed by merging superior individuals from P''_t and non-inferior individuals from P'''_t. Here, a superior individual means an individual with superior fitness, and an inferior individual means an individual with inferior fitness.

5. Computer experiments

The aims of the experiments are to evaluate the effectiveness of the proposed method, to analyze the characteristics of the optimized CPSO models, and to compare the search performance with other methods, i.e., the original CPSO, OPSO, RGA/E, and EPSO.

5.1. Benchmark problems

To facilitate comparison and analysis of the search performance of the ECPSO, we use the following suite of multidimensional benchmark problems (http://www.ntu.edu.sg/home/epnsugan/index_files/CEC-05/Tech-Report-May-30-05.pdf) for minimization with various complexities.

Sphere function:

f_{Sp}(\vec{x}) = \sum_{d=1}^{N} x_d^2.

Griewank function:

f_{Gr}(\vec{x}) = \frac{1}{4000}\sum_{d=1}^{N} x_d^2 - \prod_{d=1}^{N} \cos\left(\frac{x_d}{\sqrt{d}}\right) + 1.

Rastrigin function:

f_{Ra}(\vec{x}) = \sum_{d=1}^{N} \left( x_d^2 - 10\cos(2\pi x_d) + 10 \right).

Rosenbrock function:

f_{Ro}(\vec{x}) = \sum_{d=1}^{N-1} \left[ 100\left(x_{d+1} - x_d^2\right)^2 + \left(x_d - 1\right)^2 \right].

Schwefel function:

f_{Sw}(\vec{x}) = \sum_{d=1}^{N} \left( \sum_{j=1}^{d} x_j \right)^2.

Hybrid composition function:

f_{Hy}(\vec{x}) = \frac{1}{12} f_{Ra}(\vec{x}) + 2 f_{Sw}(\vec{x}) + \frac{1}{20} f_{Gr}(\vec{x}) + f_{Sp}(\vec{x}).

The following fitness function in the search space, S ∈ (−5.12, 5.12)^N, is used:

g_\omega(\vec{x}) = \frac{1}{f_\omega(\vec{x}) + 1},    (6)

where the subscript ω stands for one of the following: Sp (Sphere), Gr (Griewank), Ra (Rastrigin), Ro (Rosenbrock), Sw (Schwefel), and Hy (Hybrid composition). f_ω(x) is zero at the global best position; hence the largest fitness value of Eq. (6) is 1 for all problems.
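The six benchmark functions and the fitness transformation of Eq. (6) translate directly into code. The following NumPy sketch is purely illustrative (the function names are mine; only the formulas come from the definitions above), with the dimension N inferred from the length of the input vector.

import numpy as np

def f_sphere(x):
    return np.sum(x ** 2)

def f_griewank(x):
    d = np.arange(1, x.size + 1)
    return np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(d))) + 1.0

def f_rastrigin(x):
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def f_rosenbrock(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

def f_schwefel(x):
    # sum over d of (x_1 + ... + x_d)^2
    return np.sum(np.cumsum(x) ** 2)

def f_hybrid(x):
    return f_rastrigin(x) / 12.0 + 2.0 * f_schwefel(x) + f_griewank(x) / 20.0 + f_sphere(x)

def fitness(f, x):
    # Eq. (6): equals 1 at the global optimum, where f(x) = 0.
    return 1.0 / (f(x) + 1.0)

For example, fitness(f_sphere, np.zeros(5)) returns 1.0, the largest attainable value for every problem in the suite.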
Fig. 3 visualizes the fitness functions for the given benchmark problems in two dimensions. The Sphere function is unimodal with axis symmetry, the Rosenbrock function is unimodal with asymmetry, the Schwefel function is unimodal with line symmetry, the Griewank and Rastrigin functions are multimodal with different distribution densities and axis symmetry, and the Hybrid composition function is multimodal with different distribution densities and line symmetry.

Fig. 3. The fitness functions on the two-dimensional plane. (a) Sphere problem, (b) Griewank problem, (c) Rastrigin problem, (d) Rosenbrock problem, (e) Schwefel problem, and (f) Hybrid composition problem.

5.2. Experimental setting

The initial position of particles is uniformly distributed in the search space, S, and the initial velocity is set to zero. Table 1 gives the major parameters in the ECPSO.

Table 1
Major parameters in the ECPSO.
Parameter                                   Value
The number of individuals, M                10
The number of generations, G                20
The number of superior individuals, sn      2
Probability of BLX-2.0 crossover, pc        0.5
Probability of random mutation, pm          0.5
The boundary value, Δe_b                    1.0
The number of iterations, K                 400
The number of particles, P                  10

Note that neither non-redundant search nor roulette wheel selection among the genetic operations has a parameter. Relatively small numbers of individuals, particles, and iterations are chosen in order to strike a balance between accuracy and computation speed. By repeatedly taking the average of the results, the accuracy of optimization is guaranteed.

It is well known that crossover and mutation contribute to retaining the diversity of individuals during evolutionary computation. The probabilities pc and pm of crossover and mutation are usually set to a small value such as 0.05, because a larger probability of crossover and mutation is considered to destroy most of the sound chromosomes and lead to disastrous solutions. However, there is a big difference between a binary-coded GA and a real-coded GA in the genetic operations for optimization with real-valued variables. In a binary-coded GA, the crossover operator makes cuts between genes, which changes the values of the corresponding variables. In a real-coded GA, on the other hand, crossover just interchanges variables between individuals without changing their values. Similarly, the mutation operator merely creates a new value for the corresponding variable in a real-coded GA. Because of this, generally speaking, the probabilities of the genetic operations in a real-coded GA can be set to larger values than those in a binary-coded GA while still achieving good search performance.

To verify this, preliminary experiments were carried out. Fig. 4 illustrates the resulting mean and standard deviation of fitness values for RGA/E with different probabilities, pc and pm, of crossover and mutation. Fig. 4 indicates that the fitness is not much affected by the probabilities of crossover and mutation, with the exception of the Rastrigin and Schwefel problems. In the Rastrigin and Schwefel problems, the fitness value is small and its variance is large for small probabilities of crossover and mutation. Based on these results, the probability of crossover and mutation in RGA/E is set to 0.5 in our experiments.

Fig. 4. The search performance of RGA/E for different settings of the probabilities of crossover and mutation. (a) Sphere problem, (b) Griewank problem, (c) Rastrigin problem, (d) Rosenbrock problem, (e) Schwefel problem, (f) Hybrid composition problem, (g) Rastrigin problem (varying pm with pc = 0.1), (h) Rastrigin problem (varying pm with pc = 0.9).
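To make the preceding argument concrete, here is a small sketch of the two variation operators of RGA/E as described in Section 4, with α = 2.0 and Δe_b = 1.0 as in Table 1. Because BLX-α draws each offspring gene from an interval built around the parents' values, and random mutation perturbs a single gene by a bounded amount, applying them with the comparatively large probabilities pc = pm = 0.5 does not wreck the real-coded representation. The code is an illustrative sketch under these assumptions, not the authors' implementation.

import numpy as np

def blx_alpha(parent_a, parent_b, alpha=2.0, rng=None):
    # BLX-alpha crossover (Eshelman & Schaffer, 1993): each offspring gene is
    # drawn uniformly from [min - alpha*I, max + alpha*I], where I = max - min.
    rng = np.random.default_rng() if rng is None else rng
    lo = np.minimum(parent_a, parent_b)
    hi = np.maximum(parent_a, parent_b)
    spread = hi - lo
    return rng.uniform(lo - alpha * spread, hi + alpha * spread)

def random_mutation(individual, delta_b=1.0, rng=None):
    # Add a value drawn from U[-delta_b, +delta_b] to one randomly chosen gene.
    rng = np.random.default_rng() if rng is None else rng
    child = individual.copy()
    k = rng.integers(child.size)
    child[k] += rng.uniform(-delta_b, delta_b)
    return child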
5.3. Experimental results

Computer experiments for optimizing CPSO models by evolutionary computation are carried out for the suite of 5-, 10-, and 20-dimensional benchmark problems. It is to be noted that the appropriate values of parameters in the CPSO are estimated under a non-negative constraint.

Fig. 5 illustrates the cumulative fitness of the six best individuals for the 5-dimensional Sphere problem. It indicates that the fitness value changes abruptly except for the two best individuals.

Fig. 5. Fitness values of the six best individuals.

The abrupt change of fitness values is considered to be caused mainly by the high probability of crossover and mutation, and by the non-redundant search. On the other hand, the fitness values of the two best individuals increase monotonically under the elitism strategy.

Based on the resulting values of parameters, the optimized PSO models can be divided into two types (a-type and b-type). Table 2 shows the resulting values of parameters in CPSO models and their frequencies in 20 trials. It indicates that the frequency of the b-type is not smaller than that of the a-type. Table 2 provides the following findings.

• There are two types of optimized CPSO model structure. The optimized values of parameters are not unique and are subject to variation.
• The resulting value of parameter c2 never becomes zero for any problem. This suggests that the swarm confidence factor, c2, plays an essential role in the CPSO.
• The resulting value of the constriction coefficient, χ, is always less than 1. This suggests that the ''constriction'' term is effective in improving the convergence of the CPSO.

In addition, the condition φ = c1 + c2 ≯ 4 in Eq. (4) is satisfied in many cases. This suggests that the parameter selection under a conventional model is very significant.

5.4. Performance comparison
Table 2
The resulting values of parameters (χ, c1, c2) in the optimized CPSO models and their frequency (%) for each benchmark problem and dimension (20 trials in each case).
Table 3
The average fitness values and standard deviations for the original CPSO, ECPSO, OPSO, RGA/E, and EPSO for each benchmark problem (mean and standard deviation over 20 trials; the number of iterations, K = 400). The average in bold is the best among the five methods.

Problem     Dim.   Original CPSO      ECPSO              OPSO               RGA/E              EPSO
Sphere       5     1.0000 ± 0.0000    1.0000 ± 0.0000    1.0000 ± 0.0000    0.9990 ± 0.0005    1.0000 ± 0.0000
Sphere      10     0.9997 ± 0.0002    0.9998 ± 0.0002    0.9980 ± 0.0077    0.9957 ± 0.0028    0.9998 ± 0.0003
Sphere      20     0.7357 ± 0.2940    0.9797 ± 0.0233    0.6939 ± 0.3131    0.9207 ± 0.0290    0.9755 ± 0.0406
Griewank     5     0.8688 ± 0.0916    0.9901 ± 0.0050    0.9448 ± 0.0439    0.9452 ± 0.0784    0.9876 ± 0.0104
Griewank    10     0.9814 ± 0.0143    0.9924 ± 0.0068    0.8236 ± 0.1835    0.9136 ± 0.1415    0.9706 ± 0.0243
Griewank    20     0.9688 ± 0.0275    0.9954 ± 0.0046    0.8073 ± 0.1742    0.8816 ± 0.1471    0.9731 ± 0.1856
Rastrigin    5     0.1828 ± 0.1154    0.2710 ± 0.1300    0.2652 ± 0.1185    0.9064 ± 0.2256    1.0000 ± 0.0000
Rastrigin   10     0.0310 ± 0.0090    0.0331 ± 0.0110    0.0321 ± 0.0255    0.6693 ± 0.2061    0.0429 ± 0.0256
Rastrigin   20     0.0105 ± 0.0026    0.0172 ± 0.0051    0.0147 ± 0.0033    0.0844 ± 0.0292    0.0174 ± 0.0064
Rosenbrock   5     0.6206 ± 0.2583    0.7008 ± 0.3557    0.3926 ± 0.1976    0.3898 ± 0.2273    0.7564 ± 0.2723
Rosenbrock  10     0.0788 ± 0.0992    0.1262 ± 0.1508    0.0825 ± 0.0719    0.1243 ± 0.0650    0.1149 ± 0.0780
Rosenbrock  20     0.0114 ± 0.0145    0.0270 ± 0.0267    0.0084 ± 0.0108    0.0108 ± 0.0082    0.0271 ± 0.0154
Schwefel     5     0.9954 ± 0.0103    0.9965 ± 0.0086    0.7677 ± 0.4127    0.9875 ± 0.2145    1.0000 ± 0.0000
Schwefel    10     0.7895 ± 0.2912    0.8528 ± 0.1780    0.4936 ± 0.4470    0.7028 ± 0.1339    0.8592 ± 0.3397
Schwefel    20     0.0339 ± 0.0221    0.1378 ± 0.1172    0.0287 ± 0.0214    0.0923 ± 0.0405    0.1343 ± 0.1294
Hybrid       5     0.1348 ± 0.2084    0.3247 ± 0.3253    0.3061 ± 0.3591    0.1531 ± 0.1339    0.3298 ± 0.3384
Hybrid      10     0.0197 ± 0.0173    0.0318 ± 0.0230    0.0281 ± 0.0168    0.0126 ± 0.0144    0.0293 ± 0.0135
Hybrid      20     0.0032 ± 0.0009    0.0069 ± 0.0022    0.0043 ± 0.0016    0.0028 ± 0.0010    0.0064 ± 0.0024
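For readers who want to reproduce comparisons in the style of Table 3, the two-loop architecture of Section 4 can be sketched compactly: an inner CPSO run scores a candidate parameter set (χ, c1, c2) through the fitness of Eq. (6), and an outer evolutionary loop searches over those parameters. The sketch below reuses the helpers sketched earlier (cpso_step, fitness, f_sphere, blx_alpha, random_mutation), takes its population sizes and iteration counts from Table 1 (M = 10, G = 20, P = 10, K = 400), and deliberately simplifies RGA/E to elitism plus randomly paired BLX-α crossover and mutation; it is an illustration of the idea, not the authors' exact RGA/E, and the initial range for the parameter individuals is an assumption of this edit.

import numpy as np

def run_cpso(params, f, dim=5, particles=10, iters=400, bound=5.12, rng=None):
    # Inner loop: run the CPSO with parameters (chi, c1, c2) on problem f
    # and return the best fitness value reached (Eq. (6)).
    chi, c1, c2 = params
    rng = np.random.default_rng() if rng is None else rng
    x = rng.uniform(-bound, bound, size=(particles, dim))   # uniform initial positions
    v = np.zeros_like(x)                                    # zero initial velocity (Section 5.2)
    pbest = x.copy()
    pbest_fit = np.array([fitness(f, xi) for xi in x])
    gbest = pbest[np.argmax(pbest_fit)].copy()
    for _ in range(iters):
        x, v = cpso_step(x, v, pbest, gbest, chi, c1, c2, rng)
        fit = np.array([fitness(f, xi) for xi in x])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = x[improved], fit[improved]
        gbest = pbest[np.argmax(pbest_fit)].copy()
    return pbest_fit.max()

def ecpso(f, pop=10, gens=20, elites=2, pc=0.5, pm=0.5, rng=None):
    # Outer loop: evolve non-negative (chi, c1, c2) triples scored by the inner run.
    rng = np.random.default_rng() if rng is None else rng
    population = rng.uniform(0.0, 2.5, size=(pop, 3))       # illustrative initial range
    for _ in range(gens):
        scores = np.array([run_cpso(p, f, rng=rng) for p in population])
        parents = population[np.argsort(scores)[::-1]]      # best first
        children = [parents[i].copy() for i in range(elites)]   # elitism
        while len(children) < pop:
            a = parents[rng.integers(pop)]
            b = parents[rng.integers(pop)]
            child = blx_alpha(a, b, rng=rng) if rng.random() < pc else a.copy()
            if rng.random() < pm:
                child = random_mutation(child, rng=rng)
            children.append(np.clip(child, 0.0, None))      # non-negative constraint
        population = np.array(children)
    scores = np.array([run_cpso(p, f, rng=rng) for p in population])
    return population[np.argmax(scores)], scores.max()

Averaging the best fitness returned by independent calls (e.g., 20 trials) and reporting the standard deviation gives numbers in the same format as Table 3, although the exact values there also depend on the full set of genetic operations described in Section 4.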
The experimental results clearly indicate that the parameter values of the original CPSO, i.e., χ = 0.7298 and c1 = c2 = 2.05, are not appropriate for these problems. The Sphere problem with 5 and 10 dimensions is too simple to demonstrate the effectiveness of the ECPSO. The situation is exactly the same in the comparison between the ECPSO and the OPSO.

Table 3 also indicates that the search performance of the ECPSO is superior to that of the OPSO for the given benchmark problems except for the Sphere problem. It suggests that in the optimization of nonlinear functions with many local peaks of different heights, the ECPSO is not as effective as RGA/E. The Hybrid composition function also has many local peaks with different heights, but the ECPSO is superior on the Hybrid composition task. This might be because the above inherent difficulty is ameliorated by mixing with other, less difficult tasks.

Table 3 indicates that there is not much difference in search performance between the ECPSO and the EPSO except for the Rastrigin problem in the 5-dimensional case. This suggests that the ECPSO is almost equivalent to optimized PSO models.

6. Conclusions

In this paper, we have proposed an evolutionary canonical particle swarm optimizer, ECPSO, for effectively solving various optimization problems. The crucial idea is to optimize the values of parameters in the CPSO by a real-coded genetic algorithm, an approach originally proposed by the authors in extending PSO to EPSO.

Applications of the ECPSO to a suite of multidimensional benchmark problems demonstrated the superiority of the ECPSO on average. Our experimental results clearly indicated that (1) the optimized values of parameters in the CPSO are not unique and are subject to variation; and (2) the values of the optimized parameters are different from those often used in the original CPSO. This suggests the importance and necessity of introducing evolutionary computation into the CPSO. It is therefore expected that the performance of evolutionary neural networks (Dehuri & Cho, 2009; Spina, 2006; Yu, Wang, & Xi, 2008) could also be improved by the ECPSO.

Applications of the proposed method to complex real-world problems are left for future study. Although the proposed method cannot completely prevent premature convergence and stagnation in optimization, fruitful results are still expected from combining a family of PSO methods with evolutionary computation.

Acknowledgements

This research was partially supported by a COE (Center of Excellence) program (#J19) granted to Kyushu Institute of Technology from the Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan. It was also supported by Grant-in-Aid for Scientific Research (C) (18500175) from MEXT, Japan.
References

Abraham, A., Guo, H., & Liu, H. (2006). Swarm intelligence: Foundations, perspectives and applications. In Nedjah, N., & Mourelle, L. (Eds.), Swarm intelligent systems, Studies in computational intelligence (SCI) (pp. 3–25). Germany: Springer-Verlag.
Beielstein, T., Parsopoulos, K. E., & Vrahatis, M. N. (2002). Tuning PSO parameters through sensitivity analysis. Technical report of the collaborative research center 531 computational intelligence CI-124/02, University of Dortmund.
Campana, E. F., Fasano, G., Peri, D., & Pinto, A. (2006). Particle swarm optimization: Efficient globally convergent modifications. In Motasoares, C. A., et al. (Eds.), III European conference on computational mechanics solids, structures and coupled problems in engineering (pp. 412–429). Netherlands: Springer. doi:10.1007/1-4020-5370-3.
Carlisle, A., & Dozier, G. (2001). An off-the-shelf PSO. In Proceedings of the workshop on particle swarm optimization (pp. 1–6).
Clerc, M., & Kennedy, J. (2002). The particle swarm — explosion, stability, and convergence in a multidimensional complex space. IEEE Transactions on Evolutionary Computation, 6(1), 58–73.
Clerc, M. (2006). Particle swarm optimization. UK: ISTE Publishing Co.
Dehuri, S., & Cho, S.-B. (2009). A comprehensive survey on functional link neural networks and an adaptive PSO-BP learning for CFLNN. Journal of Neural Computing & Applications, 196, 578–593. doi:10.1007/s00521-009-0288-5. Springer London.
Eberhart, R. C., & Kennedy, J. (1995). A new optimizer using particle swarm theory. In Proceedings of the sixth international symposium on micro machine and human science (pp. 39–43).
Eberhart, R. C., & Shi, Y. (2000). Comparing inertia weights and constriction factors in particle swarm optimization. In Proceedings of the 2000 IEEE congress on evolutionary computation, Vol. 1 (pp. 84–88).
Eberhart, R. C., Shi, Y., & Kennedy, J. (2001). Swarm intelligence. CA, USA: Morgan Kaufmann Publishers.
Eshelman, L. J., & Schaffer, J. D. (1993). Real-coded genetic algorithms and interval-schemata. In Foundations of genetic algorithms: Vol. 2 (pp. 187–202). San Mateo: Morgan Kaufmann Publishers.
Janson, S., & Middendorf, M. (2005). A hierarchical particle swarm optimizer and its adaptive variant. IEEE Transactions on Systems, Man and Cybernetics, Part B, 35(6), 1272–1282.
Kennedy, J., & Eberhart, R. C. (1995). Particle swarm optimization. In Proceedings of the 1995 IEEE international conference on neural networks (pp. 1942–1948).
Kennedy, J., & Mendes, R. (2002). Population structure and particle swarm performance. In Proceedings of the IEEE congress on evolutionary computation (pp. 1671–1676).
Kennedy, J. (2006). In search of the essential particle swarm. In Proceedings of the 2006 IEEE congress on evolutionary computation (pp. 6158–6165). Vancouver, BC, Canada.
Lane, J., Engelbrecht, A., & Gain, J. (2008). Particle swarm optimization with spatially meaningful neighbours. In Proceedings of the swarm intelligence symposium (pp. 1–8).
Meissner, M., Schmuker, M., & Schneider, G. (2006). Optimized particle swarm optimization (OPSO) and its application to artificial neural network training. BMC Bioinformatics, 7(125).
Parsopoulos, K. E., & Vrahatis, M. N. (2002). Recent approaches to global optimization problems through particle swarm optimization. Natural Computing, 1, 235–306.
Pasupuleti, P., & Battiti, R. (2000). The gregarious particle swarm optimizer (G-PSO). In Proceedings of the IEEE congress on evolutionary computation (pp. 84–88).
Poli, R., Kennedy, J., & Blackwell, T. (2007). Particle swarm optimization — An overview. Swarm Intelligence, 1, 33–57.
Reyes-Sierra, M., & Coello, C. A. C. (2006). Multi-objective particle swarm optimizers: A survey of the state-of-the-art. International Journal of Computational Intelligence Research, 2(3), 287–308.
Shi, Y., & Eberhart, R. C. (1998). A modified particle swarm optimiser. In Proceedings of the IEEE international conference on evolutionary computation (pp. 69–73).
Spina, R. (2006). Optimisation of injection moulded parts by using ANN-PSO approach. Journal of Achievements in Materials and Manufacturing Engineering, 15(1–2), 146–152.
Storn, R., & Price, K. (1997). Differential evolution — A simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11, 341–359.
Trelea, I. C. (2003). The particle swarm optimization algorithm: Convergence analysis and parameter selection. Information Processing Letters, 85, 317–325.
Xie, X.-F., Zhang, W.-J., & Yang, Z.-L. (2002). A dissipative particle swarm optimization. In Proceedings of the IEEE congress on evolutionary computation (pp. 1456–1461).
Yu, J.-B., Wang, S.-J., & Xi, L.-F. (2008). Evolving artificial neural networks using an improved PSO and DPSO. Neurocomputing, 71, 1054–1060.
Zhang, H., & Ishikawa, M. (2005). A hybrid real-coded genetic algorithm with local search. In Proceedings of the 12th international conference on neural information processing (pp. 732–737).
Zhang, H., & Ishikawa, M. (2007). Evolutionary particle swarm optimization (EPSO) — Estimation of optimal PSO parameters by GA. In Proceedings of the IAENG international multiconference of engineers and computer scientists, Vol. 1 (pp. 13–18). Hong Kong, China: Newswood Limited.
Zhang, H., & Ishikawa, M. (2008). Designing particle swarm optimization — Performance comparison of two temporally cumulative fitness functions in EPSO. In Proceedings of the 26th IASTED international conference on artificial intelligence and applications (pp. 301–306).
Zhang, H., & Ishikawa, M. (2008). Evolutionary particle swarm optimization — Metaoptimization method with GA for estimating optimal PSO methods. In Castillo, O., et al. (Eds.), Lecture notes in electrical engineering: Vol. 6. Trends in intelligent systems and computer engineering (pp. 75–90). Springer-Verlag.
Zhang, H., & Ishikawa, M. (2008). Evolutionary canonical particle swarm optimizer — A proposal of meta-optimization in model selection. In LNCS: Vol. 5163. Proceedings of the 18th international conference on artificial neural networks, Part 1 (pp. 472–481). Prague, Czech Republic.
http://www.ntu.edu.sg/home/epnsugan/index_files/CEC-05/Tech-Report-May-30-05.pdf.