PSO and Its Variants: Swarm Intelligence Group Peking University

This document discusses particle swarm optimization (PSO) and its variants. It provides an outline covering classical and standard PSO, analyzing PSO approaches in the literature as well as the authors' ideas. The authors analyze PSO's search strategy, including exploitation versus exploration. They explore sampling probability, inertia weight, and topology. The goal is to understand PSO's properties and how to improve its search strategy.
PSO and its variants

Swarm Intelligence Group


Peking University
Outline
 Classical and standard PSO
 PSO on benchmark functions
 Analysis of PSO: state of the art
 Analysis of PSO: our idea
 Variants of PSO: state of the art
 Our variants of PSO
 Applications of PSO
Classical and standard PSO
 A swarm searches better than a single individual
Classical and standard PSO

Russ Eberhart James Kennedy


Classical
V_id = w*V_id + c1*Rand()*(p_id - x_id) + c2*Rand()*(g_d - x_id)   (1)
x_id = x_id + V_id                                                 (2)
 V_id : velocity of particle i in dimension d
 i : particle index
 d : dimension index
 w : inertia weight
 c1, c2 : acceleration constants
 Rand() : uniform random number in [0, 1]
 p_id : best position found so far by particle i (pbest)
 g_d : best position found so far by the swarm (gbest)
 x_id : current position of particle i in dimension d
Classical and standard PSO
Vid  w  Vid  c1  Rand ()  ( pid  xid )  c2  Rand ()  ( g d  xid ) (1)
xid  xid  Vid (2)

xid (t  1)
y Vid (t  1)
g d (t )

Vid (t )
pid (t )
xid (t )
x
Flow chart depicting the General PSO Algorithm:
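The flow chart can also be written out directly. Below is a minimal gbest PSO sketch in Python using update equations (1) and (2); the `sphere` objective, the parameter values, and all function names are illustrative choices, not the authors' implementation:

```python
import random

def sphere(x):
    # Example objective to minimize: sum of squares
    return sum(v * v for v in x)

def pso(f, dim=2, num=20, iters=200, w=0.729, c1=1.49445, c2=1.49445,
        lo=-5.0, hi=5.0, seed=0):
    rng = random.Random(seed)
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(num)]
    v = [[0.0] * dim for _ in range(num)]
    pbest = [list(p) for p in x]
    pbest_f = [f(p) for p in x]
    g = min(range(num), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]
    for _ in range(iters):
        for i in range(num):
            for d in range(dim):
                # Equation (1): inertia + cognitive + social terms
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                # Equation (2): position update
                x[i][d] += v[i][d]
            fx = f(x[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = list(x[i]), fx
                if fx < gbest_f:
                    gbest, gbest_f = list(x[i]), fx
    return gbest, gbest_f
```

On this 2-D sphere, the swarm typically collapses onto the origin well within 200 iterations.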
Simulations 1-8
[Figures: eight snapshots of the swarm moving over a 2-D search space; axes x and y, colour scale from min to max fitness]
Schwefel's function

f(x) = sum_{i=1..n} x_i * sin( sqrt( |x_i| ) )

where
-500 <= x_i <= 500

global maximum (the maximization form, as the search results below show)
f(x) = n * 418.9829;
x_i = 420.9687, i = 1 : n
Evolution - initialization
Evolution - 5 iterations
Evolution - 10 iterations
Evolution - 15 iterations
Evolution - 20 iterations
Evolution - 25 iterations
Evolution - 100 iterations
Evolution - 500 iterations
Search result
Iteration Swarm best
0 416.245599
5 515.748796
10 759.404006
15 793.732019
20 834.813763
100 837.911535
5000 837.965771
Global 837.9658
Standard benchmark functions

1 ) Sphere Function
f(x) = sum_{i=1..n} x_i^2,  x in [-5, 5]^n

2 ) Rosenbrock Function
f(x) = sum_{i=1..n-1} [ 100*(x_{i+1} - x_i^2)^2 + (x_i - 1)^2 ],  x in [-10, 10]^n

3 ) Rastrigin Function
f(x) = sum_{i=1..n} [ x_i^2 - 10*cos(2*pi*x_i) + 10 ]

4 ) Ackley Function
f(x) = 20 + e - 20*exp( -0.2*sqrt( (1/n) * sum_{i=1..n} x_i^2 ) ) - exp( (1/n) * sum_{i=1..n} cos(2*pi*x_i) ),  x in [-32, 32]^n
Composition Function
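The four standard functions above, transcribed directly into Python (function names are ours):

```python
import math

def sphere(x):
    # f(x) = sum x_i^2, minimum 0 at the origin
    return sum(v * v for v in x)

def rosenbrock(x):
    # f(x) = sum 100*(x_{i+1} - x_i^2)^2 + (x_i - 1)^2, minimum 0 at (1,...,1)
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    # f(x) = sum x_i^2 - 10*cos(2*pi*x_i) + 10, minimum 0 at the origin
    return sum(v * v - 10.0 * math.cos(2 * math.pi * v) + 10.0 for v in x)

def ackley(x):
    # f(x) = 20 + e - 20*exp(-0.2*sqrt(mean x_i^2)) - exp(mean cos(2*pi*x_i))
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / n
    return 20.0 + math.e - 20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2)
```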
Analysis of PSO: state of the art
 Stagnation - convergence
 Clerc 2002
 "The particle swarm - explosion, stability, and convergence in a multidimensional complex space", 2002
 Kennedy 2005
 "Dynamic-Probabilistic Particle Swarms", 2005
 Poli 2007
 "Exact Analysis of the Sampling Distribution for the Canonical Particle Swarm Optimiser and its Convergence during Stagnation", 2007
 "On the Moments of the Sampling Distribution of Particle Swarm Optimisers", 2007
 "Markov Chain Models of Bare-Bones Particle Swarm Optimizers", 2007
 Standard PSO
 "Defining a Standard for Particle Swarm Optimization", 2007
Analysis of PSO: state of the art
 Standard PSO: constriction factor - convergence
 Update formula

V_id = chi * ( V_id + c1*Rand()*(p_id - x_id) + c2*Rand()*(g_d - x_id) )
x_id = x_id + V_id

Equivalent (setting w = chi, c1' = chi*c1, c2' = chi*c2) to
V_id = w*V_id + c1'*Rand()*(p_id - x_id) + c2'*Rand()*(g_d - x_id)
x_id = x_id + V_id
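The equivalence is just a renaming: with w = chi and each acceleration constant scaled by chi, the two forms produce identical velocities for the same random draws. A quick numerical check (all values are illustrative; chi = 0.7298 with c1 = c2 = 2.05 are Clerc's classic constriction settings):

```python
chi, c1, c2 = 0.7298, 2.05, 2.05
v, x, p, g = 0.3, 1.0, 1.5, 2.0
r1, r2 = 0.42, 0.77            # fix the two random draws

# Constriction form: V = chi * (V + c1*r1*(p - x) + c2*r2*(g - x))
v_constriction = chi * (v + c1 * r1 * (p - x) + c2 * r2 * (g - x))

# Inertia form with w = chi, c1' = chi*c1, c2' = chi*c2
w, c1p, c2p = chi, chi * c1, chi * c2
v_inertia = w * v + c1p * r1 * (p - x) + c2p * r2 * (g - x)
```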
Analysis of PSO: state of the art
 Standard PSO
 50 particles
 Non-uniform initialization
 No evaluation when a particle is out of the boundary
Analysis of PSO: state of the art
 Standard PSO
 A local ring topology
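In a ring (lbest) topology each particle sees only its two immediate neighbours, so the gbest in the update rule is replaced by the best pbest within that neighbourhood. A sketch of the neighbourhood lookup (helper names are our own):

```python
def ring_neighbors(i, num):
    # Particle i plus its left and right neighbours on the ring
    return [(i - 1) % num, i, (i + 1) % num]

def local_best(i, pbest_f, num):
    # Index of the best (lowest) pbest fitness in i's ring neighbourhood
    return min(ring_neighbors(i, num), key=lambda j: pbest_f[j])
```

Information about a good region then propagates around the ring one hop per iteration, which slows convergence but improves exploration relative to a fully connected swarm.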
Analysis of PSO: state of the art
 How does PSO work?
 Stagnation versus objective function
 Classical PSO versus standard PSO
 Search strategy versus performance
Classical PSO
 Main idea: "Particle swarm optimization", 1995
 Exploit the current best positions
 pbest
 gbest
 Explore the unknown space

[Figure: attraction regions around pbest and gbest]
Classical PSO
 Implementation

[Figure: the new velocity combines the inertia term wV with random attractions toward pbest and gbest]

V_id = w*V_id + c1*Rand()*(p_id - x_id) + c2*Rand()*(g_d - x_id)   (1)
x_id = x_id + V_id                                                 (2)
Analysis of PSO: our idea
 Search strategy of PSO
 Exploitation
 Exploration
Analysis of PSO: our idea
 Hybrid uniform distribution

[Figures: exploitation corresponds to sampling between the pbest and gbest attraction regions; exploration corresponds to sampling carried outward by the inertia term wV]
Analysis of PSO: our idea
The sampling probability density is computable:

x(t+1) = x(t) + w*V(t) + Z

where Z is the random contribution of the pbest and gbest attraction terms.
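From equation (1), Z = c1*Rand()*(p - x) + c2*Rand()*(g - x): the sum of two independent scaled uniforms, whose density is a (generally asymmetric) trapezoid with mean c1*(p - x)/2 + c2*(g - x)/2. A Monte-Carlo check for one dimension (all concrete values are illustrative):

```python
import random

rng = random.Random(1)
c1 = c2 = 1.49445
dp, dg = 1.0, 2.0              # p - x and g - x for one dimension

n = 200_000
# Z is the sum of two independent uniform attraction terms
samples = [c1 * rng.random() * dp + c2 * rng.random() * dg for _ in range(n)]
mean_z = sum(samples) / n
expected = c1 * dp / 2 + c2 * dg / 2   # = 2.241675 here
```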
Analysis of PSO: our idea
 Sampling probability

[Figures: sampling probability distributions with and without the inertia part (wV)]
Analysis of PSO: our idea
 Difference among variants of PSO

[Figure: each variant strikes a different balance between exploitation probability and exploration probability]
Analysis of PSO: our idea
 What is the property of the iteration?
Analysis of PSO: our idea
 Is the search strategy the same, and is PSO adaptive, under
 the same parameters (during the convergence process)
 different parameters
 different dimensions
 different numbers of particles
 different topologies
 different objective functions
 different search phases (gentle or sharp slope, stagnation, etc.)
 What is the change pattern of the search strategy?
Analysis of PSO: our idea
 What would a better PSO search strategy look like?
 Simpler implementation
 Uses one parameter as a tuning knob instead of two as in standard PSO
 Prove the two are equivalent for some parameter setting
 Effective on most objective functions
 Adaptive
Analysis of PSO: our idea
 Markov chain
 State transition matrix
Analysis of PSO: our idea
 Random process
 Gaussian process
 Kernel mapping

[Diagram: covariance matrix -> kernel function -> search strategy; the Gaussian process's mapping ability determines whether it is effective on the objective problem]
Analysis of PSO: our idea
 The object of our analysis
 Search strategy of PSO
 Different parameter sets
 In different dimensions
 Using different numbers of particles
 On different objective functions
 Fitness evaluations
 Different topologies
 Markov or Gaussian process and kernel function
 Direction for PSO
 Knob PSO
Analysis of PSO: our idea

Inputs: x1 = w (inertia weight), x2 = c (acceleration constant), x3 = dim, x4 = Num (number of particles), x5 = Top (topology), x6 = FEs (fitness evaluations), x7 = Fun (objective function)

f( sum_i a_i * x_i + b ) -> P(Exploitation)
Current results
 Variance with convergence
 func_num=1;
 fes_num=5000;
 run_num=10;
 particles_num=50;
 dims_num=30;
Current results
 Variance with dimensions
 func_num=1;
 fes_num=3000;
 run_num=10;
 particles_num=50;
Current results
 Variance with number of particles
 func_num=1;
 fes_num=3000;
 run_num=10;
 dims_num=30;
Current results
 Variance with topology
Current results
 Variance with inertia weight
Current results
 1. Shifted Sphere Function
 2. Shifted Schwefel's Problem 1.2

[Figures: 3-D surface plots of functions 1 and 2 over [-100, 100]^2]
PSO on Benchmark Functions
 3. Shifted Rotated High Conditioned Elliptic Function
 4. Shifted Schwefel's Problem 1.2 with Noise in Fitness

[Figures: 3-D surface plots of functions 3 and 4 over [-100, 100]^2]
Current results
 Variance with objective functions
 Unimodal Functions
 Multimodal Functions
 Expanded Multimodal Functions
 Hybrid Composition Functions
Current results
 Variance with objective functions
 func_num=1,2,3,4;
 fes_num=3000;
 run_num=5;
 particles_num=50;
 dims_num=30;
Variants of PSO: state of the art
 Traditional strategies
 Simulated annealing
 Tabu strategy
 Gradient methods
 Adopted from other fields
 Clonal operation
 Mutation operation
 Heuristic methods
 Advance and retreat
 Structured topology
 Full connection
 Ring topology
Our variants of PSO
 CPSO
 AR-CPSO
 MPSO
 RBH-PSO
 FPSO
Our variants of PSO
 CPSO

Save the global best particle of each generation as the parent for the clone operator in step 2

After n iterations, clone the n saved global best particles

Mutate all cloned particles by random perturbation

Select using a concentration-based diversity-preservation strategy
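The four steps above can be sketched as follows. Everything concrete here is an illustrative assumption, not the authors' implementation: the helper name `cpso_clone_step`, the Gaussian perturbation, the sphere-style fitness used for ranking, and the distance test standing in for the concentration mechanism.

```python
import random

rng = random.Random(0)

def cpso_clone_step(gbest_archive, clones_per_parent=3, sigma=0.1, keep=10):
    # Step 2: clone the n archived global-best particles
    clones = [list(p) for p in gbest_archive for _ in range(clones_per_parent)]
    # Step 3: mutate every clone by random perturbation
    for c in clones:
        for d in range(len(c)):
            c[d] += rng.gauss(0.0, sigma)
    # Step 4: diversity-preserving selection - rank by an illustrative
    # sphere fitness, then keep candidates far from those already chosen
    selected = []
    for c in sorted(clones, key=lambda p: sum(v * v for v in p)):
        if all(sum((a - b) ** 2 for a, b in zip(c, s)) > sigma ** 2
               for s in selected):
            selected.append(c)
        if len(selected) == keep:
            break
    return selected
```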
Our variants of PSO
 MPSO
Our variants of PSO
 AR-CPSO
Our variants of PSO
 FPSO
Applications of PSO
