Particle Swarm Optimization

Mohamed A. El-Sharkawi
Computational Intelligence Applications (CIA) Lab.
Department of EE, Box 352500
University of Washington, Seattle, WA 98195-2500
elsharkawi@ee.washington.edu
http://cialab.ee.washington.edu

Particle Swarm Optimization = Coordination with Direct Communication

M. A. El-Sharkawi, PSO

PSO vs. Single Search Technique (SST)
- Single search: one agent explores the space alone
- PSO: many agents explore in parallel and share what they find

Particle Swarm Optimization
- Inventors: James Kennedy and Russell Eberhart
- An algorithm originally developed to imitate the motion of a flock of birds or insects
- Assumes information exchange (social interaction) among the search agents
- Basic idea: keep track of
  - the global best
  - each particle's self (personal) best
How does it work?
Problem: find x which minimizes f(x)
Particle Swarm:
- Start: a random set of solution vectors
- Experiment: include randomness in the choice of new states
- Remember: encode the information about good solutions
- Improvise: use the experience information to initiate search in new regions

New motion = component in the direction of previous motion (current motion)
           + component in the direction of the personal best (at the previous step)
           + component in the direction of the global best

PSO Modeling
- Each solution vector is modeled as the coordinates of a bird (a particle) in a swarm flying through the search space
- All the particles have a non-zero velocity, so they never stop flying and are always sampling new regions
- Each particle remembers where the global best and its personal (local) best are

PSO Modeling
The search is guided by:
- the collective consciousness of the swarm
- randomness introduced into the dynamics in a controlled manner

Particle Swarm Dynamics
x(k+1) = x(k) + v(k)
v(k+1) = w·v(k) + r(0, a1)·(x_selfbest(k) − x(k)) + r(0, a2)·(x_groupbest(k) − x(k))
- w·v(k): inertia; the non-zero velocity means the particles never stop flying
- r(0, a1)·(x_selfbest(k) − x(k)): the particle's self consciousness, with controlled randomness
- r(0, a2)·(x_groupbest(k) − x(k)): the collective consciousness of the swarm
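The dynamics above can be sketched in a few lines of Python. This is a minimal illustration, not the lecture's implementation: the bound box, swarm size, and parameter defaults are assumptions chosen for the example, and r(0, a) is realized as `random.uniform(0, a)`.

```python
import random

def pso(f, dim, n_particles=30, iters=200, w=0.9, a1=2.0, a2=2.0,
        lo=-5.0, hi=5.0, vmax=1.0):
    """Minimal PSO minimizing f over [lo, hi]^dim (illustrative sketch)."""
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]                      # personal (self) bests
    pbest_f = [f(p) for p in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]       # global (group) best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # v(k+1) = w·v(k) + r(0,a1)·(selfbest − x) + r(0,a2)·(groupbest − x)
                v[i][d] = (w * v[i][d]
                           + random.uniform(0, a1) * (pbest[i][d] - x[i][d])
                           + random.uniform(0, a2) * (gbest[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))  # clamp to v_max
                x[i][d] += v[i][d]                        # x(k+1) = x(k) + v
            fx = f(x[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = x[i][:], fx
    return gbest, gbest_f
```

On a smooth test surface such as the 2-D sphere function, the swarm quickly contracts around the minimum.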
PSO
where:
- x is a solution vector (a particle) and v is the velocity of this particle
- a1 and a2 are two scalars, and w is the inertia weight
- r(0, a) is a uniform random number drawn between 0 and a

Design Parameters
- a1 and a2
- w: should be between 0.9 and 1.2
  - high values of w give a global search
  - low values of w give a local search
- v_max: to be designed according to the nature of the search surface

Example: Boundary Identification (Edge Detector)
- Identify a subset of the search space (the boundary) with a specific value
- Each flock finds one point on that boundary (edge)
- Flocks search sequentially

Border (Edge) Identification
(figure: two regions, Class 1 and Class 2, separated by the boundary to be identified)

The Art of the Fitness Function
- Goal: find points anywhere on the boundary
- Metric: |f(x) − boundary value|
- Techniques: Particle Swarm, Genetic Algorithm
- PSO is faster and more accurate
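The Case-1 metric can be written directly as a fitness function: the fitness is zero exactly on the boundary f(x) = target and grows away from it. The circle example below is an assumption added for illustration; it is not from the lecture.

```python
def boundary_fitness(f, target):
    """Edge-detection fitness: |f(x) - target| is minimized (= 0) exactly
    where f(x) equals the boundary value, so minimizing it drives a
    particle onto the boundary."""
    return lambda x: abs(f(x) - target)

# Example (assumed): the boundary between the two classes is the unit
# circle x^2 + y^2 = 1.
fit = boundary_fitness(lambda p: p[0] ** 2 + p[1] ** 2, target=1.0)
```

Any minimizer (PSO, a genetic algorithm, etc.) can then be pointed at `fit`; each run lands one point somewhere on the circle.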
Results - Case 1

The Art of the Fitness Function
- Goal: distribute points uniformly on the boundary
- Metric: |f(x) − boundary value| − distance to closest neighbor (to penalize proximity to neighbors)

Results - Case 2

The Art of the Fitness Function
- Goal: distribute points uniformly on the boundary, close to the current state
- Metric: |f(x) − boundary value| − distance to closest neighbor + distance to current state
  (penalize proximity to neighbors; penalize distance from the current state)

Results - Case 3

Applications of PSO
PSO is particularly suited for problems with many local minima and difficult global minima (e.g., minima lying along narrow valleys or in small holes).
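The Case-2 metric combines the edge term with a repulsion term: each new flock is rewarded for landing on the boundary but penalized for landing near points already found. A sketch, assuming boundary points are collected in a list as the sequential flock searches complete:

```python
import math

def case2_fitness(f, target, found):
    """Case-2 metric: |f(x) - target| minus the distance to the closest
    already-found boundary point, so new flocks are pushed away from
    their neighbors and spread out along the boundary."""
    def fit(x):
        edge = abs(f(x) - target)
        if not found:
            return edge          # first flock: plain edge detection
        nearest = min(math.dist(x, p) for p in found)
        return edge - nearest    # reward distance from known points
    return fit
```

Adding "+ distance to current state" to this expression gives the Case-3 metric, which additionally keeps new points near the searcher's current position.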
PSO Challenges
- Like any search technique, PSO can fail to distinguish between global and local minima
- A local minimum is easier to find
- If the fitness function cannot amplify the difference between global and local minima, PSO is likely to stay in a local minimum

Modified PSO
- Two-step PSO (a gradient approximation)
- Cluster PSO

Two-Step PSO
- Each particle takes two steps: one short and one long
- It then commits to the step with the steeper negative gradient
(figure: current position x_0 with the short-step candidate x_S and the long-step candidate x_L)

Two-Step PSO
- Better at not overflying narrow valleys
- Problems:
  - It may take the short step more often than the long step, resulting in slower convergence
  - It can still get trapped in local minima

Cluster PSO
- A hierarchical version of PSO: particles are arranged in clusters
- Each cluster contains multiple agents
- Each cluster has a centroid that acts, effectively, as a standard PSO agent
- Each agent within the cluster is attracted to its personal best, the cluster best, and the cluster centroid
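The two-step selection rule can be sketched as follows. The step lengths, and the use of a finite-difference slope as the "negative gradient" test, are assumptions for illustration; the lecture does not fix these details.

```python
def choose_step(f, x, direction, short=0.1, long=1.0):
    """Two-step heuristic: probe a short step x_S and a long step x_L along
    `direction` from x_0, estimate the directional slope of each, and keep
    the candidate with the steeper descent (more negative slope)."""
    x_s = [xi + short * di for xi, di in zip(x, direction)]  # short step
    x_l = [xi + long * di for xi, di in zip(x, direction)]   # long step
    f0 = f(x)
    slope_s = (f(x_s) - f0) / short  # finite-difference slope, short step
    slope_l = (f(x_l) - f0) / long   # finite-difference slope, long step
    return x_s if slope_s < slope_l else x_l
```

Near a narrow valley the long step overshoots and its slope estimate flattens out, so the short step wins; this is exactly why the method tends to take the short step often and converge slowly.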
Cluster PSO
Centroid update (the centroid behaves like a standard PSO agent):
v_c = w_c·v_c + a_c1·rand()·(x_cb − x_cc) + a_c2·rand()·(x_gb − x_cc)
x_cc = x_cc + v_c

Agent update (within a cluster):
v_a = w_a·v_a + a_1·rand()·(x_ab − x_a) + a_2·rand()·(x_cb − x_a) + a_3·rand()·(x_cc − x_a)
x_a = x_a + v_c + v_a

where x_ab is the agent's personal best, x_cb the cluster best, x_cc the cluster centroid, and x_gb the global best.

Cluster PSO
Cluster PSO combines the globally superior ability of standard PSO to avoid local minima with a locally efficient search that can find narrow global minima. MAYBE!
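The two update equations translate directly into one step of the cluster dynamics. This sketch handles a single cluster; the parameter values and the choice to move the centroid before the agents are assumptions, since the slides do not specify an ordering.

```python
import random

def cluster_pso_step(agents, vel, abest, cbest, centroid, vc, gbest,
                     w_a=0.7, w_c=0.7, a=(1.0, 1.0, 1.0), ac=(1.5, 1.5)):
    """One Cluster-PSO update for a single cluster (illustrative sketch).
    agents/vel: agent positions and velocities; abest: per-agent personal
    bests x_ab; cbest: cluster best x_cb; centroid/vc: centroid position
    x_cc and velocity v_c; gbest: global best x_gb."""
    dim = len(centroid)
    for d in range(dim):
        # v_c = w_c·v_c + a_c1·rand()·(x_cb − x_cc) + a_c2·rand()·(x_gb − x_cc)
        vc[d] = (w_c * vc[d]
                 + ac[0] * random.random() * (cbest[d] - centroid[d])
                 + ac[1] * random.random() * (gbest[d] - centroid[d]))
        centroid[d] += vc[d]                      # x_cc = x_cc + v_c
    for x, v, pb in zip(agents, vel, abest):
        for d in range(dim):
            # v_a = w_a·v_a + a1·rand()·(x_ab − x_a)
            #       + a2·rand()·(x_cb − x_a) + a3·rand()·(x_cc − x_a)
            v[d] = (w_a * v[d]
                    + a[0] * random.random() * (pb[d] - x[d])
                    + a[1] * random.random() * (cbest[d] - x[d])
                    + a[2] * random.random() * (centroid[d] - x[d]))
            x[d] += vc[d] + v[d]                  # x_a = x_a + v_c + v_a
    return agents, centroid
```

Because each agent's position inherits the centroid's velocity v_c, the whole cluster drifts together (the global, local-minimum-avoiding behavior) while the agents' own velocities v_a perform the fine local search around it.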