Heuristic Methods For Optimization

A Tutorial on Meta-Heuristics for Optimization
S.-C. Chu, C.-S. Shieh, and J. F. Roddick
• Local updating. A local updating rule is applied whenever an edge from city $r$ to city $s$ is taken:

$\tau(r,s) \leftarrow (1-\rho)\,\tau(r,s) + \rho\,\Delta\tau(r,s)$    (10)

where $\Delta\tau(r,s) = (n \cdot L_{nn})^{-1}$, and $L_{nn}$ is a rough estimate of the circuit length calculated with the nearest-neighbor heuristic.

$gbest \leftarrow \vec{p}_i \quad \text{if } f(\vec{p}_i) > f(gbest)$    (14)

where $f(\cdot)$ is the object function subject to maximization.

Step 5. Termination Checking. The algorithm repeats Steps 2 to 4 until certain termination conditions are met, such as a pre-defined number of iterations or a failure to make progress for a certain number of iterations. Once terminated, the algorithm reports $gbest$ and $f(gbest)$ as its solution.

Program 3 at the end of this chapter is a straightforward implementation of the algorithm above. To experience the power of particle swarm optimization, Program 3 is applied to the following test function, visualized in Figure 8:

$F_4(x,y) = -x\sin(\sqrt{|x|}) - y\sin(\sqrt{|y|}), \quad -500 \le x, y \le 500$    (15)

whose global optimum is $F_4(-420.97, -420.97) = 837.97$.

Figure 8. Object function $F_4$.

The learning factors $c_1$ and $c_2$ are set to a value of 2, and a linearly decreasing inertia weight is used according to the suggestion from Shi and Eberhart (1999). Figure 9 reports the progress of particle swarm optimization on the test function $F_4(x,y)$ for the first 300 iterations. At the end of 1000 iterations, $F_4(-420.97, -420.96) = 837.97$ is located, which is close to the global optimum.

It is worthwhile to look into the dynamics of particle swarm optimization. Figure 10 presents the distribution of particles at different iterations. There is a clear trend that particles start from their initial positions and fly toward the global optimum.

Figure 9. Progress of PSO on object function $F_4$ (best object value versus iteration).

Numerous variants have been introduced since the first particle swarm optimization.
A discrete binary version of the particle swarm optimization algorithm was proposed by Kennedy and Eberhart (1997). Shi and Eberhart (2001) applied fuzzy theory to the particle swarm optimization algorithm, and the concept of co-evolution has been successfully incorporated in solving min-max problems (Shi and Krohling 2002). Chu et al. (2003) have proposed a parallel architecture with communication mechanisms for information exchange among independent groups, in which solution quality can be significantly improved.

Figure 10. The distribution of particles at the 0th, 10th, 100th, and 300th iterations.

5 Discussions and Conclusions

The nonstop evolution process has successfully driven natural species to develop effective solutions to a wide range of problems. Genetic algorithms, ant systems, and particle swarm optimization, all inspired by nature, have also proved themselves to be effective solutions to optimization problems. However, readers should remember that, despite the robustness these approaches are claimed to have, there is no panacea. As discussed in the previous sections, there are control parameters involved in these meta-heuristics, and an adequate setting of these parameters is a key point for success. In general, some kind of trial-and-error tuning is necessary for each particular instance of an optimization problem. In addition, these meta-heuristics should not be considered in isolation. Prospective users should speculate on the possibility of hybrid approaches and the integration of gradient-based methods, which are promising directions deserving further study.

References

Abramson, D. and Abela, J. (1991), "A parallel genetic algorithm for solving the school timetabling problem," Technical Report, Division of Information Technology, CSIRO.

Bull, L.
(2001), "On coevolutionary genetic algorithms," Soft Computing, vol. 5, no. 3, pp. 201-207.

Chu, S. C., Roddick, J. F., and Pan, J. S. (2003), "Parallel particle swarm optimization algorithm with communication strategies," personal communication.

Chu, S. C., Roddick, J. F., Pan, J. S., and Su, C. J. (2003), "Parallel ant colony systems," 14th International Symposium on Methodologies for Intelligent Systems, LNCS, Springer-Verlag (to appear).

Dorigo, M., Maniezzo, V., and Colorni, A. (1996), "The ant system: optimization by a colony of cooperating agents," IEEE Trans. on Systems, Man, and Cybernetics-Part B, vol. 26, no. 1, pp. 29-41.

Dorigo, M. and Gambardella, L. M. (1997), "Ant colony system: a cooperative learning approach to the traveling salesman problem," IEEE Trans. on Evolutionary Computation, vol. 1, no. 1, pp. 53-66.

Fonseca, C. M. and Fleming, P. J. (1993), "Multiobjective genetic algorithms," IEE Colloquium on Genetic Algorithms for Control Systems Engineering, no. 193/130, pp. 6/1-6/5.

Fonseca, C. M. and Fleming, P. J. (1998), "Multiobjective optimization and multiple constraint handling with evolutionary algorithms I: A unified formulation," IEEE Trans. on Systems, Man, and Cybernetics-Part A, vol. 28, no. 1, pp. 26-37.

Gao, Y., Shi, L., and Yao, P. (2000), "Study on multi-objective genetic algorithm," Proceedings of the Third World Congress on Intelligent Control and Automation, pp. 646-650.

Goldberg, D. E. (1989), Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, MA.

Handa, H., Baba, M., Horiuchi, T., and Katai, O. (2002), "A novel hybrid framework of coevolutionary GA and machine learning," International Journal of Computational Intelligence and Applications, vol. 2, no. 1, pp. 33-52.

Holland, J. (1975), Adaptation in Natural and Artificial Systems, University of Michigan Press.

Kennedy, J.
and Eberhart, R. (1995), "Particle swarm optimization," IEEE International Conference on Neural Networks, pp. 1942-1948.

Papadimitriou, C. H. and Steiglitz, K. (1982), Combinatorial Optimization: Algorithms and Complexity, Prentice Hall.

Sareni, B. and Krahenbuhl, L. (1998), "Fitness sharing and niching methods revisited," IEEE Trans. on Evolutionary Computation, vol. 2, no. 3, pp. 97-106.

Shi, Y. and Eberhart, R. C. (2001), "Fuzzy adaptive particle swarm optimization," Proceedings of the 2001 Congress on Evolutionary Computation (CEC'2001), pp. 101-106.

Shi, Y. and Krohling, R. A. (2002), "Co-evolutionary particle swarm optimization to solve min-max problems," Proceedings of the 2002 Congress on Evolutionary Computation (CEC'2002), vol. 2, pp. 1682-1687.

Wang, L. and Wu, Q. (2001), "Ant system algorithm for optimization in continuous space," IEEE International Conference on Control Applications (CCA'2001), pp. 395-400.

Whitley, D. (1993), A genetic algorithm tutorial, Technical Report CS-93-103, Department of Computer Science, Colorado State University, Fort Collins, CO.

Willis, M. J., Hiden, H. G., Marenbach, P., McKay, B., and Montague, G. A. (1997), "Genetic programming: an introduction and survey of applications," Proceedings of the Second International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications (GALESIA'97), pp. 314-319.

Program 1. An implementation of genetic algorithm in C language.
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define MG 50     /* Maximal Number of Generations */
#define N  10     /* Population Size */
#define CL 32     /* Number of bits in each chromosome */
#define SF 2.0    /* Selection Factor */
#define CR 0.5    /* Crossover Rate */
#define MR 0.05   /* Mutation Rate */

/* Macro for random number between 0 and 1 */
#define RAND ((float)rand()/((float)RAND_MAX+1.0))

int   c[N][CL];    /* Population of Chromosomes */
float f[N];        /* Fitness Value of Chromosomes */
int   best_c[CL];  /* Best Chromosome */
float best_f;      /* Best Fitness Value */

/* Decode Chromosome */
void decode(int chromosome[CL], float *x, float *y)
{
    int j;

    /* Decode the lower 16 bits for variable x */
    (*x) = 0.0;
    for (j = 0; j < 16; j++)
        (*x) = (*x)*2.0 + chromosome[j];
    (*x) = (*x)/65535.0*1000.0 - 500.0;

    /* Decode the upper 16 bits for variable y */
    (*y) = 0.0;
    for (j = 16; j < CL; j++)
        (*y) = (*y)*2.0 + chromosome[j];
    (*y) = (*y)/65535.0*1000.0 - 500.0;
}

/* Object function subject to maximization */
float object(float x, float y)
{
    return -x*sin(sqrt(fabs(x))) - y*sin(sqrt(fabs(y)));
}

int main(void)
{
    int   i, j, k, generation, site;
    float x, y, tmpf, fmin, fmax;
    float p[N];         /* Selection probability of each chromosome */
    int   tmpc[N][CL];  /* Offspring population */

    /* Set random seed */
    srand(1);

    /* Random initialization of the population */
    for (i = 0; i < N; i++)
        for (j = 0; j < CL; j++)
            c[i][j] = (RAND < 0.5) ? 0 : 1;
    best_f = -1.0e30;

    for (generation = 0; generation < MG; generation++)
    {
        /* Fitness evaluation and bookkeeping of the best chromosome */
        for (i = 0; i < N; i++)
        {
            decode(c[i], &x, &y);
            f[i] = object(x, y);
            if (f[i] > best_f)
            {
                best_f = f[i];
                for (j = 0; j < CL; j++) best_c[j] = c[i][j];
            }
        }

        /* Selection probabilities, sharpened by the selection factor */
        fmin = fmax = f[0];
        for (i = 1; i < N; i++)
        {
            if (f[i] < fmin) fmin = f[i];
            if (f[i] > fmax) fmax = f[i];
        }
        tmpf = 0.0;
        for (i = 0; i < N; i++)
        {
            p[i] = (fmax > fmin) ? pow((f[i]-fmin)/(fmax-fmin), SF) : 1.0;
            tmpf += p[i];
        }
        for (i = 0; i < N; i++) p[i] /= tmpf;

        /* Roulette-wheel selection */
        for (i = 0; i < N; i++)
        {
            tmpf = RAND;
            k = 0;
            while (k < N-1 && tmpf > p[k]) { tmpf -= p[k]; k++; }
            /* Chromosome k is selected */
            for (j = 0; j < CL; j++) tmpc[i][j] = c[k][j];
        }

        /* Single-point crossover */
        for (i = 0; i < N-1; i += 2)
            if (RAND < CR)
            {
                site = rand()%CL;
                for (j = site; j < CL; j++)
                {
                    k = tmpc[i][j];
                    tmpc[i][j] = tmpc[i+1][j];
                    tmpc[i+1][j] = k;
                }
            }

        /* Mutation */
        for (i = 0; i < N; i++)
            for (j = 0; j < CL; j++)
                if (RAND < MR) tmpc[i][j] = 1 - tmpc[i][j];

        /* Replace the population with the offspring */
        for (i = 0; i < N; i++)
            for (j = 0; j < CL; j++)
                c[i][j] = tmpc[i][j];
    }

    decode(best_c, &x, &y);
    printf("Best solution: f(%f, %f) = %f\n", x, y, best_f);
    return 0;
}

Program 2. An implementation of ant system in C language.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define Map "map.txt"       /* file name of city map */
#define NumberOfCity 12     /* number of cities */
#define NumberOfAnt  10     /* number of ants */
#define alpha 0.2           /* pheromone decay factor */
#define beta  2.0           /* tradeoff factor between pheromone and distance */
#define tau0  0.01          /* initial intensity of pheromone */
#define EpisodeLimit 20     /* limit of episode */
#define Route "route.txt"   /* file name for route map */

/* RAND: Macro for random number between 0 and 1 */
#define RAND ((float)rand()/((float)RAND_MAX+1.0))

typedef struct {
    float x;  /* x coordinate */
    float y;  /* y coordinate */
} CityType;

typedef struct {
    int   route[NumberOfCity];  /* visiting sequence of cities */
    float length;               /* length of route */
} RouteType;

CityType  city[NumberOfCity];                 /* city array */
float     delta[NumberOfCity][NumberOfCity];  /* distance matrix */
float     eta[NumberOfCity][NumberOfCity];    /* weighted visibility matrix */
float     tau[NumberOfCity][NumberOfCity];    /* pheromone intensity matrix */
RouteType BestRoute;                          /* shortest route */
RouteType ant[NumberOfAnt];                   /* ant array */
float p[NumberOfCity];        /* path-taking probability array */
int   visited[NumberOfCity];  /* array for visiting status */
float delta_tau[NumberOfCity][NumberOfCity];  /* sum of change in tau */

int main(void)
{
    FILE *mapfpr;    /* file pointer for city map */
    int   r, s;      /* indices for cities */
    int   k;         /* index for ant */
    int   episode;   /* index for ant system cycle */
    int   step;      /* index for routing step */
    float tmpf;      /* temporary variable */
    FILE *routefpr;  /* file pointer for route map */

    /* Set random seed */
    srand(1);

    /* Read city map */
    mapfpr = fopen(Map, "r");
    for (r = 0; r < NumberOfCity; r++)
        fscanf(mapfpr, "%f %f", &(city[r].x), &(city[r].y));
    fclose(mapfpr);

    /* Evaluate distance matrix and weighted visibility matrix */
    for (r = 0; r < NumberOfCity; r++)
        for (s = 0; s < NumberOfCity; s++)
        {
            delta[r][s] = sqrt((city[r].x-city[s].x)*(city[r].x-city[s].x)
                              +(city[r].y-city[s].y)*(city[r].y-city[s].y));
            eta[r][s] = (r == s) ? 0.0 : pow(1.0/delta[r][s], beta);
        }

    /* Initialize pheromone intensity matrix */
    for (r = 0; r < NumberOfCity; r++)
        for (s = 0; s < NumberOfCity; s++)
            tau[r][s] = tau0;
    BestRoute.length = 1.0e30;

    for (episode = 0; episode < EpisodeLimit; episode++)
    {
        /* Construct a route for each ant */
        for (k = 0; k < NumberOfAnt; k++)
        {
            for (r = 0; r < NumberOfCity; r++) visited[r] = 0;
            ant[k].route[0] = rand()%NumberOfCity;
            visited[ant[k].route[0]] = 1;
            for (step = 1; step < NumberOfCity; step++)
            {
                r = ant[k].route[step-1];
                /* Path-taking probabilities for unvisited cities */
                tmpf = 0.0;
                for (s = 0; s < NumberOfCity; s++)
                {
                    p[s] = visited[s] ? 0.0 : tau[r][s]*eta[r][s];
                    tmpf += p[s];
                }
                for (s = 0; s < NumberOfCity; s++) p[s] /= tmpf;
                /* Roulette-wheel selection of the next city */
                tmpf = RAND;
                s = 0;
                while (s < NumberOfCity-1 && tmpf > p[s]) { tmpf -= p[s]; s++; }
                while (visited[s]) s = (s+1)%NumberOfCity;  /* guard against round-off */
                ant[k].route[step] = s;
                visited[s] = 1;
            }
            /* Evaluate route length and keep the shortest route */
            ant[k].length = 0.0;
            for (step = 0; step < NumberOfCity; step++)
                ant[k].length += delta[ant[k].route[step]]
                                      [ant[k].route[(step+1)%NumberOfCity]];
            if (ant[k].length < BestRoute.length) BestRoute = ant[k];
        }

        /* Update pheromone intensity */
        /* Reset matrix for sum of change in tau */
        for (r = 0; r < NumberOfCity; r++)
            for (s = 0; s < NumberOfCity; s++)
                delta_tau[r][s] = 0.0;
        for (k = 0; k < NumberOfAnt; k++)
            for (step = 0; step < NumberOfCity; step++)
                delta_tau[ant[k].route[step]][ant[k].route[(step+1)%NumberOfCity]]
                    += 1.0/ant[k].length;
        for (r = 0; r < NumberOfCity; r++)
            for (s = 0; s < NumberOfCity; s++)
                tau[r][s] = (1.0-alpha)*tau[r][s] + delta_tau[r][s];
    }

    /* Write route map */
    routefpr = fopen(Route, "w");
    for (step = 0; step < NumberOfCity; step++)
        fprintf(routefpr, "%f %f\n", city[BestRoute.route[step]].x,
                                     city[BestRoute.route[step]].y);
    fclose(routefpr);
    return 0;
}

Program 3. An implementation of particle swarm optimization in C language.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define N  10       /* Swarm Size */
#define D  2        /* Number of Dimensions */
#define MI 1000     /* Maximal Number of Iterations */
#define Vmax 500.0  /* Velocity Limit */
#define c1 2.0      /* Learning Factor */
#define c2 2.0      /* Learning Factor */

/* RAND: Macro for random number between 0 and 1 */
#define RAND ((float)rand()/((float)RAND_MAX+1.0))

typedef struct {
    float x[D];           /* position */
    float v[D];           /* velocity */
    float fitness;        /* object value at x */
    float best_x[D];      /* pbest: personal best position */
    float best_fitness;   /* object value at best_x */
} ParticleType;

ParticleType p[N];        /* particle array */
float gbest_x[D];         /* gbest: global best position */
float gbest_fitness;      /* object value at gbest_x */

/* Object function subject to maximization */
float Schwefel(float x[D])
{
    int   d;
    float sum = 0.0;
    for (d = 0; d < D; d++)
        sum += -x[d]*sin(sqrt(fabs(x[d])));
    return sum;
}

int main(void)
{
    int   i, d, iteration;
    float w;  /* inertia weight */

    /* Set random seed */
    srand(1);

    /* Random initialization of positions and velocities */
    gbest_fitness = -1.0e30;
    for (i = 0; i < N; i++)
    {
        for (d = 0; d < D; d++)
        {
            p[i].x[d] = RAND*1000.0 - 500.0;
            p[i].v[d] = RAND*2.0*Vmax - Vmax;
            p[i].best_x[d] = p[i].x[d];
        }
        p[i].fitness = Schwefel(p[i].x);
        p[i].best_fitness = p[i].fitness;
        if (p[i].best_fitness > gbest_fitness)
        {
            for (d = 0; d < D; d++) gbest_x[d] = p[i].best_x[d];
            gbest_fitness = p[i].best_fitness;
        }
    }

    for (iteration = 0; iteration < MI; iteration++)
    {
        /* Linearly decreasing inertia weight */
        w = 0.9 - 0.5*(float)iteration/(float)MI;

        for (i = 0; i < N; i++)
        {
            for (d = 0; d < D; d++)
            {
                /* Update velocity and clip it to [-Vmax, Vmax] */
                p[i].v[d] = w*p[i].v[d]
                          + c1*RAND*(p[i].best_x[d]-p[i].x[d])
                          + c2*RAND*(gbest_x[d]-p[i].x[d]);
                if (p[i].v[d] > Vmax)  p[i].v[d] = Vmax;
                if (p[i].v[d] < -Vmax) p[i].v[d] = -Vmax;

                /* Update position and keep it inside [-500, 500] */
                p[i].x[d] = p[i].x[d] + p[i].v[d];
                if (p[i].x[d] > 500.0)  p[i].x[d] = 500.0;
                if (p[i].x[d] < -500.0) p[i].x[d] = -500.0;
            }
            p[i].fitness = Schwefel(p[i].x);

            /* Update pbest */
            if (p[i].fitness > p[i].best_fitness)
            {
                for (d = 0; d < D; d++) p[i].best_x[d] = p[i].x[d];
                p[i].best_fitness = p[i].fitness;
            }

            /* Update gbest */
            if (p[i].best_fitness > gbest_fitness)
            {
                for (d = 0; d < D; d++) gbest_x[d] = p[i].best_x[d];
                gbest_fitness = p[i].best_fitness;
            }
        }
    }

    printf("Best solution: f(%f, %f) = %f\n",
           gbest_x[0], gbest_x[1], gbest_fitness);
    return 0;
}