Particle swarm optimization

A basic PSO works with a population (a swarm) of candidate solutions (called particles). These particles are moved around in the search-space according to a few simple formulae. The movements of the particles are guided by their own best-known position in the search-space as well as the entire swarm's best-known position. When improved positions are discovered, these then come to guide the movements of the swarm. The process is repeated and, by doing so, it is hoped, but not guaranteed, that a satisfactory solution will eventually be discovered.

However, it was shown that these simplifications do not affect the boundaries found by these studies for the parameters under which the swarm is convergent. Considerable effort has been made in recent years to weaken the modeling assumptions utilized during the stability analysis of PSO, with the most recent generalized result applying to numerous PSO variants and utilizing what was shown to be the minimal necessary modeling assumptions.
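To make the movement rule concrete, here is a minimal sketch of such a loop in Python. It is only an illustration of the scheme described above, not a reference implementation; the swarm size, iteration count, and coefficient values are assumptions chosen for the example, and f stands for any cost function mapping a list of floats to a float.

    import random

    def pso_minimize(f, lower, upper, n_particles=30, n_iter=200,
                     w=0.729, phi_p=1.49445, phi_g=1.49445):
        # Minimal PSO sketch: lower/upper are per-dimension bounds of the search-space.
        dim = len(lower)
        x = [[random.uniform(lower[d], upper[d]) for d in range(dim)]
             for _ in range(n_particles)]
        v = [[0.0] * dim for _ in range(n_particles)]             # zero initial velocities
        p = [xi[:] for xi in x]                                   # personal best positions
        p_val = [f(xi) for xi in x]
        g_idx = min(range(n_particles), key=lambda i: p_val[i])
        g, g_val = p[g_idx][:], p_val[g_idx]                      # swarm best position and value
        for _ in range(n_iter):
            for i in range(n_particles):
                for d in range(dim):
                    rp, rg = random.random(), random.random()
                    v[i][d] = (w * v[i][d]
                               + phi_p * rp * (p[i][d] - x[i][d])   # pull toward own best
                               + phi_g * rg * (g[d] - x[i][d]))     # pull toward swarm best
                    x[i][d] += v[i][d]
                fx = f(x[i])
                if fx < p_val[i]:                                   # improved personal best
                    p[i], p_val[i] = x[i][:], fx
                    if fx < g_val:                                  # improved swarm best
                        g, g_val = x[i][:], fx
        return g, g_val

For instance, pso_minimize(lambda x: sum(xi * xi for xi in x), [-5.0] * 3, [5.0] * 3) should return a point close to the origin for this toy objective.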
APSO can perform a global search over the entire search space with a higher convergence speed. It enables automatic control of the inertia weight, acceleration coefficients, and other algorithmic parameters at run time, thereby improving search effectiveness and efficiency at the same time. APSO can also act on the globally best particle to jump out of likely local optima. However, although APSO introduces new algorithm parameters, it does not add design or implementation complexity.

A promising variant of a genetic algorithm (another popular metaheuristic) was later found to be defective: it was strongly biased in its optimization search towards similar values for different dimensions in the search space, which happened to be the optimum of the benchmark problems considered. This bias was caused by a programming error and has since been fixed.

The orthogonal learning strategy forms a leading converging exemplar and is designed to be effective with any PSO topology. The aims are to improve the performance of PSO overall, including faster global convergence, higher solution quality, and stronger robustness. However, such studies do not provide theoretical evidence to actually prove their claims.
As the PSO equations given above work on real numbers, a commonly used method to solve discrete problems is to map the discrete search space to a continuous domain, apply a classical PSO, and then demap the result. Such a mapping can be very simple (for example, by just using rounded values) or more sophisticated.
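As a sketch of the simplest such mapping (rounding), the wrapper below lets a continuous PSO drive an integer-valued objective. The names f_int and pso_minimize are placeholders for this example, not established APIs.

    def rounded_objective(f_int):
        # Wrap an integer-domain objective so a continuous optimizer can be applied.
        def f_cont(x):
            return f_int([int(round(xi)) for xi in x])   # continuous -> discrete mapping
        return f_cont

    # After optimizing f_cont with a continuous PSO, demap the result the same way:
    # best_int = [int(round(xi)) for xi in best_continuous]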
Another school of thought is that the behaviour of a PSO swarm is not well understood in terms of how it affects actual optimization performance, especially for higher-dimensional search-spaces and optimization problems that may be discontinuous, noisy, and time-varying. This school of thought merely tries to find algorithms and parameters that perform well regardless of how the swarm behaviour is interpreted.
The topology of the swarm defines the subset of particles with which each particle can exchange information. The basic version of the algorithm uses the global topology as the swarm communication structure. This topology allows all particles to communicate with all the other particles, so the whole swarm shares the same best position g from a single particle.
New and more sophisticated PSO variants are also continually being introduced in an attempt to improve optimization performance. There are certain trends in that research; one is to make a hybrid optimization method using PSO combined with other optimizers, e.g., combining PSO with biogeography-based optimization, and the incorporation of an effective learning method.
Without the need for a trade-off between convergence ('exploitation') and divergence ('exploration'), an adaptive mechanism can be introduced. Adaptive particle swarm optimization (APSO) features better search efficiency than standard PSO.
Usually a position and a velocity are represented by n real numbers, and these operators are simply -, *, +, and again +. But all these mathematical objects can be defined in a completely different way, in order to cope with binary problems (or more generally discrete ones), or even combinatorial ones. One approach is to redefine the operators based on sets.
A common belief amongst researchers is that the swarm behaviour varies between exploratory behaviour, that is, searching a broader region of the search-space, and exploitative behaviour, that is, a locally oriented search so as to get closer to a (possibly local) optimum. This school of thought has been prevalent since the inception of PSO.
Convergence of the sequence of solutions has been investigated for PSO. These analyses have resulted in guidelines for selecting PSO parameters that are believed to cause convergence to a point and prevent divergence of the swarm's particles (particles do not move unboundedly and will converge somewhere).
However, this approach might lead the swarm to be trapped in a local minimum, so different topologies have been used to control the flow of information among particles. For instance, in local topologies, particles only share information with a subset of particles. This subset can be a geometrical one (for example, "the m nearest particles") or, more often, a social one, i.e. a set of particles that does not depend on any distance.
A commonly used swarm topology is the ring, in which each particle has just two neighbours, but there are many others. The topology is not necessarily static. In fact, since the topology is related to the diversity of communication among the particles, some efforts have been made to create adaptive topologies.
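As an illustration, a ring (local best) neighbourhood can be implemented by replacing the single swarm-wide best with the best personal position among a particle and its two ring neighbours; the data layout below (parallel lists of positions and values) is an assumption made for the example.

    def ring_local_best(p, p_val, i):
        # Best known position among particle i and its two neighbours on the ring.
        n = len(p)
        neighbours = [(i - 1) % n, i, (i + 1) % n]
        best = min(neighbours, key=lambda j: p_val[j])
        return p[best]

In the velocity update, ring_local_best(p, p_val, i) would then take the place of the global best position.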
A series of standard implementations have been created by leading researchers, "intended for use both as a baseline for performance testing of improvements to the technique, as well as to represent PSO to the wider optimization community."

Each particle's movement is influenced by its local best known position, but is also guided toward the best known positions in the search-space, which are updated as better positions are found by other particles. This is expected to move the swarm toward the best solutions.

Simplifying PSO was originally suggested by Kennedy and has been studied more extensively, where it appeared that optimization performance was improved, and that the parameters were easier to tune and performed more consistently across different optimization problems.
To prevent divergence ("explosion") the inertia weight must be smaller than 1. The two other parameters can then be derived via the constriction approach, or freely selected, but the analyses suggest convergence domains to constrain them. Typical values are in [1, 3].
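One widely quoted way to obtain such parameter values is the constriction approach of Clerc and Kennedy, sketched below; the default phi_p = phi_g = 2.05 (so that phi = 4.1 > 4) is only the usual textbook example, not a requirement.

    import math

    def constriction(phi_p=2.05, phi_g=2.05):
        # Clerc-Kennedy constriction factor; assumes phi_p + phi_g > 4.
        phi = phi_p + phi_g
        chi = 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))
        # chi plays the role of the inertia weight; the acceleration terms are scaled by chi too.
        return chi, chi * phi_p, chi * phi_g

With the defaults this yields roughly (0.7298, 1.4962, 1.4962), values often seen in the literature.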
However, the analyses were criticized by Pedersen for being oversimplified, as they assume the swarm has only one particle, that it does not use stochastic variables, and that the points of attraction, that is, the particle's best known position p and the swarm's best known position g, remain constant throughout the optimization process.
Another research trend is to try to alleviate premature convergence (that is, optimization stagnation), e.g. by reversing or perturbing the movement of the PSO particles; another approach to deal with premature convergence is the use of multiple swarms (multi-swarm optimization).
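A minimal sketch of the "perturb the movement" idea: re-scatter a particle and reset its velocity once the swarm best has stagnated for a while. The stagnation counter, the patience of 20 iterations, and the uniform re-initialization are assumptions made only for illustration, not a method prescribed by the works cited here.

    import random

    def perturb_if_stagnant(x, v, lower, upper, stalled_iters, patience=20):
        # Re-scatter the particle after the swarm best has not improved for `patience` iterations.
        if stalled_iters >= patience:
            dim = len(x)
            x = [random.uniform(lower[d], upper[d]) for d in range(dim)]
            v = [0.0] * dim
        return x, v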
This school of thought tries to find PSO algorithms and parameters that cause good performance regardless of how the swarm behaviour can be interpreted in relation to, e.g., exploration and exploitation. Such studies have led to the simplification of the PSO algorithm.
Numerous variants of even a basic PSO algorithm are possible. For example, there are different ways to initialize the particles and velocities (e.g. start with zero velocities instead), to dampen the velocity, or to update the best known positions only after the entire swarm has been updated, etc.
Another simpler variant is the accelerated particle swarm optimization (APSO), which also does not need to use velocity and can speed up the convergence in many applications. A simple demo code of APSO is available.
Almasi, O. N. and Khooban, M. H. (2017). A parsimonious SVM model selection criterion for classification of real-world data sets via an adaptive population-based algorithm. Neural Computing and Applications, 1-9.
Nobile, M.; Besozzi, D.; Cazzaniga, P.; Mauri, G.; Pescini, D. (2012). "A GPU-Based Multi-Swarm PSO Method for Parameter Estimation in Stochastic Biological Systems Exploiting Discrete-Time Target Series".
In such cases, the PSO variant is said to be local best (vs global best for the basic PSO).
This school of thought contends that the PSO algorithm and its parameters must be chosen so as to properly balance between exploration and exploitation, to avoid premature convergence to a local optimum while still ensuring a good rate of convergence to the optimum.

The multi-swarm approach can also be used to implement multi-objective optimization. Finally, there are developments in adapting the behavioural parameters of PSO during optimization.
One attempt at addressing this issue is the development of an "orthogonal learning" strategy for an improved use of the information already existing in the relationship between the particle's best known position p and the swarm's best known position g.
The choice of PSO parameters can have a large impact on optimization performance. Selecting PSO parameters that yield good performance has therefore been the subject of much research.
Jarboui, B.; Damak, N.; Siarry, P.; Rebai, A. (2008). "A combinatorial particle swarm optimization for solving multi-mode resource-constrained project scheduling problems".
"Pathological Brain Detection in Magnetic Resonance Imaging Scanning by Wavelet Entropy and Hybridization of Biogeography-based Optimization and Particle Swarm Optimization"
Nobile, M.S; Pasi, G.; Cazzaniga, P.; Besozzi, D.; Colombo, R.; Mauri, G. (2015). "Proactive particles in swarm optimization: a self-tuning algorithm based on fuzzy logic".
In this variant of PSO one dispenses with both the particle's velocity and the particle's best position. The particle position is updated according to the following rule:

    x_i ← (1 − β) x_i + β g + α L u,

where u is a random uniformly distributed vector, L is the typical length of the problem at hand, and β ~ 0.1-0.7 and α ~ 0.1-0.5 are the parameters of the method. As a refinement of the method one can decrease α with each iteration, α_n = α_0 γ^n, where n is the number of the iteration and 0 < γ < 1 is the decrease control parameter.
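A per-dimension sketch of one accelerated-PSO move under the rule above; the choice of uniform noise in [-0.5, 0.5] and the parameter defaults are assumptions made only for this example.

    import random

    def apso_step(x, g, L, alpha=0.3, beta=0.5):
        # Accelerated PSO move: x_i <- (1 - beta) * x_i + beta * g + alpha * L * u
        return [(1.0 - beta) * xi + beta * gi + alpha * L * (random.random() - 0.5)
                for xi, gi in zip(x, g)]   # u drawn uniformly, here in [-0.5, 0.5] per dimension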
Mason, Karl; Duggan, Jim; Howley, Enda (2018). "A Meta Optimisation Analysis of Particle Swarm Optimisation Velocity Update Equations for Watershed Management Learning".
Initialization of velocities may require extra inputs. The Bare Bones PSO variant was proposed in 2003 by James Kennedy, and does not need to use velocity at all.
Having a well-known, strictly-defined standard algorithm "provides a valuable point of comparison which can be used throughout the field of research to better test new advances." The latest is Standard PSO 2011 (SPSO-2011).
The algorithm was simplified and it was observed to be performing optimization. The book by Kennedy and Eberhart describes many philosophical aspects of PSO and swarm intelligence.
In addition, through a scale-adaptive fitness evaluation mechanism, PSO can efficiently address computationally expensive optimization problems.
Nobile, M.S; Cazzaniga, P.; Besozzi, D.; Colombo, R.; Mauri, G.; Pasi, G. (2018). "Fuzzy Self-Tuning PSO: a settings-free algorithm for global optimization".
Cheung, N. J.; Ding, X.-M.; Shen, H.-B. (2013). "OptiFel: A Convergent Heterogeneous Particle Swarm Optimization Algorithm for Takagi-Sugeno Fuzzy Modeling".
Cleghorn, Christopher W.; Engelbrecht, Andries. (2018). "Particle Swarm Stability: A Theoretical Extension using the Non-Stagnate Distribution Assumption".
In this variant of PSO one dispenses with the velocity of the particles and instead updates the positions of the particles using the following simple rule:

    x_i = G( (p_i + g) / 2 , ||p_i − g|| ),

where x_i and p_i are the position and the best known position of particle i, g is the global best position, G(x, σ) is the normal distribution with mean x and standard deviation σ, and ||…|| signifies the norm of a vector.
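A sketch of this update, sampled independently in each dimension with the per-coordinate spread |p_d − g_d| (a common reading of the bare-bones rule; using the full vector norm as a single standard deviation is an equally valid interpretation of the formula above).

    import random

    def bare_bones_step(p_i, g):
        # Sample the new position of particle i around the midpoint of its best and the swarm best.
        return [random.gauss((pd + gd) / 2.0, abs(pd - gd)) for pd, gd in zip(p_i, g)]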
The termination criterion can be the number of iterations performed, or a solution where the adequate objective function value is found. The parameters w, φp, and φg are selected by the practitioner and control the behaviour and efficacy of the PSO method.
"Scale adaptive fitness evaluation-based particle swarm optimisation for hyperparameter and architecture optimisation in neural networks and deep learning"
Convergence to a local optimum has been analyzed for PSO. It has been proven that PSO needs some modification to guarantee finding a local optimum.
Another school of thought is that PSO should be simplified as much as possible without impairing its performance; a general concept often referred to as Occam's razor.
PSO is a metaheuristic, as it makes few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. Also, PSO does not use the gradient of the problem being optimized, which means PSO does not require that the optimization problem be differentiable, as is required by classic optimization methods such as gradient descent and quasi-Newton methods.
Zambrano-Bigiarini, M.; Clerc, M.; Rojas, R. (2013). "Standard Particle Swarm Optimisation 2011 at CEC-2013: A baseline for future PSO improvements".
Cazzaniga, P.; Nobile, M.S.; Besozzi, D. (2015). "The impact of particles initialization in PSO: parameter estimation as a case in point, (Canada)".
Some of these choices and their possible performance impact have been discussed in the literature.
Mason, Karl; Duggan, Jim; Howley, Enda (2017). "Multi-objective dynamic economic emission dispatch using particle swarm optimisation variants".
This increases the risk of making errors in a variant's description and implementation. A good example of this presented a promising variant of a genetic algorithm (another popular metaheuristic) that was later found to be defective because of a programming error.
Performance landscape showing how a simple PSO variant performs in aggregate on several benchmark problems when varying two PSO parameters.
Liu, Yang (2009). "Automatic calibration of a rainfall–runoff model using a fast and elitist multi-objective particle swarm algorithm".
Let f: ℝⁿ → ℝ be the cost function which must be minimized. The function takes a candidate solution as an argument in the form of a vector of real numbers.
Oliveira, M.; Pinheiro, D.; Andrade, B.; Bastos-Filho, C.; Menezes, R. (2016). "Communication Diversity in Particle Swarm Optimizers".
The values b_lo and b_up represent the lower and upper boundaries of the search-space, respectively. The w parameter is the inertia weight. The parameters φp and φg are often called the cognitive coefficient and the social coefficient.
Clerc, M.; Kennedy, J. (2002). "The particle swarm - explosion, stability, and convergence in a multidimensional complex space".
Metaheuristics such as PSO can only have their efficacy demonstrated empirically, by doing computational experiments on a finite number of optimization problems. This means a metaheuristic such as PSO cannot be proven correct.
Bonyadi, M. R.; Michalewicz, Z. (2017). "Particle swarm optimization for single objective continuous space problems: a review".
In 2017, a comprehensive review of theoretical and experimental works on PSO was published by Bonyadi and Michalewicz.
Chen, Wei-neng; Zhang, Jun (2010). "A novel set-based particle swarm optimization method for discrete optimization problem".
Taherkhani, M.; Safabakhsh, R. (2016). "A novel stability-based adaptive inertia weight for particle swarm optimization".
PSO optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. It solves a problem by having a population of candidate solutions, here dubbed particles, and moving these particles around in the search-space according to simple mathematical formulae over the particle's position and velocity.
In multi-objective PSO the objective function comparison takes Pareto dominance into account when moving the PSO particles, and non-dominated solutions are stored so as to approximate the Pareto front.
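The archive of non-dominated solutions relies on a Pareto dominance test such as the sketch below, where every objective is assumed to be minimized.

    def dominates(a, b):
        # True if objective vector a Pareto-dominates b: no worse everywhere, strictly better somewhere.
        return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))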
By using the ring topology, PSO can attain generation-level parallelism, significantly enhancing the evolutionary speed.
It produces a real number as output which indicates the objective function value of the given candidate solution. The gradient of f is not known.
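A toy example of such a cost function is the sphere benchmark below; the choice of benchmark is ours for illustration, not part of the definition.

    def sphere(x):
        # f: R^n -> R; takes a candidate solution (list of floats) and returns its cost. Minimum at the origin.
        return sum(xi * xi for xi in x)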
Tu, Z.; Lu, Y. (2008). "Corrections to 'A Robust Stochastic Genetic Algorithm (StGA) for Global Numerical Optimization'".
This means that determining the convergence capabilities of different PSO algorithms and parameters still depends on empirical results.
511: 182: 1325:{\displaystyle {\vec {x}}_{i}=G\left({\frac {{\vec {p}}_{i}+{\vec {g}}}{2}},||{\vec {p}}_{i}-{\vec {g}}||\right)\,,} 4831: 262: 129: 37: 3161:"Generation-Level Parallelism for Evolutionary Computation: A Pipeline-Based Parallel Particle Swarm Optimization" 4513: 3561:
Niknam, T.; Amiri, B. (2010). "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis".
2996: 2734: 2021: 64: 3540: 2562:
Trelea, I.C. (2003). "The Particle Swarm Optimization Algorithm: convergence analysis and parameter selection".
3742:
Evolutionary Computation, Machine Learning and Data Mining in Bioinformatics. Lecture Notes in Computer Science
3190:
Cleghorn, Christopher W (2014). "Particle Swarm Convergence: Standardized Analysis and Topological Influence".
3100:
Clerc, M. (2006). Particle Swarm Optimization. ISTE (International Scientific and Technical Encyclopedia), 2006
162: 3905:
Accelerated particle swarm optimization and support vector machine for business optimization and applications
2320: 2056: 493:
be the best known position of the entire swarm. A basic PSO algorithm to minimize the cost function is then:
215: 192: 172: 134: 1031:) in which all particles have converged to a point in the search-space, which may or may not be the optimum, 4384: 4335: 2031: 1971: 1459: 230: 177: 3088: 2862:
Proceedings of the 2015 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2015), Istanbul (Turkey)
4584: 4272: 3788:
Tu, Z.; Lu, Y. (2004). "A robust stochastic genetic algorithm (StGA) for global numerical optimization".
2725:. University of Southampton, School of Engineering Sciences, Computational Engineering and Design Group. 1990:
However, it can be noted that the equations of movement make use of operators that perform four actions:
1811: 240: 2885:
Proceedings of IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology
2664:"Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training" 1779: 1717:{\displaystyle {\vec {x}}_{i}\leftarrow (1-\beta ){\vec {x}}_{i}+\beta {\vec {g}}+\alpha L{\vec {u}}\,,} 4340: 4252: 3931:
Parsopoulos, K.; Vrahatis, M. (2002). "Particle swarm optimization method in multiobjective problems".
3091:. International Journal of Computational Intelligence Research (IJCIR), Volume 4, Number 2, pp. 105-116 1933: 220: 167: 119: 4518: 4489: 4389: 4327: 4038:
Clerc, M. (2005). Binary Particle Swarm Optimisers: toolbox, derivations, and mathematical insights,
891: 302: 4092: 4001:
A Novel Particle Swarm Optimization Algorithm for Multi-Objective Combinatorial Optimization Problem
2941: 2916: 2770: 2536: 1374: 1338: 4669: 3012:." Evolutionary Computation, 1999. CEC 99. Proceedings of the 1999 Congress on. Vol. 3. IEEE, 1999. 2036: 1994:
computing the difference of two positions. The result is a velocity (more precisely a displacement)
1128: 84: 4170: 3639: 3334: 382:
of the problem being optimized, which means PSO does not require that the optimization problem be
4826: 4750: 4535: 4267: 4016:, Conference on Systems, Man, and Cybernetics, Piscataway, NJ: IEEE Service Center, pp. 4104-4109 3917: 3541:"The LifeCycle Model: combining particle swarm optimisation, genetic algorithms and hillclimbers" 2475: 2351: 383: 4026: 2236: 1730: 1556: 1507: 1430: 4770: 4430: 4303: 4242: 4222: 4087: 2936: 2911: 2765: 2531: 2350:
Bratton, Daniel; Kennedy, James (2007). "Defining a Standard for Particle Swarm Optimization".
938:
The PSO parameters can also be tuned by using another overlaying optimizer, a concept known as
74: 54: 45: 3076: 2637:. The University of Texas - Pan American, Department of Electrical Engineering. Archived from 4579: 4484: 4237: 4232: 1843: 1536: 988: 290: 225: 187: 4790: 4755: 4593: 4540: 4447: 879:
are selected by the practitioner and control the behaviour and efficacy of the PSO method (
394:. However, metaheuristics such as PSO do not guarantee an optimal solution is ever found. 391: 3135: 8: 4800: 4775: 4765: 4355: 4277: 4247: 4227: 1501: 1155: 1028: 996: 356: 349: 341: 206: 139: 94: 4603: 3009: 2756:
Pedersen, M.E.H.; Chipperfield, A.J. (2010). "Simplifying particle swarm optimization".
2634:
An Automatic Regrouping Mechanism to Deal with Stagnation in Particle Swarm Optimization
4780: 4641: 4636: 4217: 4105: 3886: 3842: 3805: 3721: 3660:
Xinchao, Z. (2010). "A perturbed particle swarm algorithm for numerical optimization".
3521: 3465:
Wang, Ye-Qun; Li, Jian-Yu; Chen, Chun-Hua; Zhang, Jun; Zhan, Zhi-Hui (September 2023).
3447: 3357: 3293: 3248: 3045: 2976: 2964: 2933:
Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600)
2931:
Kennedy, J.; Mendes, R. (2002). "Population structure and particle swarm performance".
2726: 2690: 2663: 2593: 2378: 2261: 2104: 2051: 2046: 1913: 1759: 1410: 980: 403: 364: 310: 89: 69: 2575: 2426:
Shi, Y.; Eberhart, R.C. (1998). "Parameter selection in particle swarm optimization".
2298:"A Comprehensive Survey on Particle Swarm Optimization Algorithm and Its Applications" 906: 402:
A basic variant of the PSO algorithm works by having a population (called a swarm) of
4745: 4664: 4651: 4611: 4452: 4296: 3876: 3768: 3525: 3511: 3439: 3240: 3203: 3035: 2954: 2730: 2695: 2368: 2204: 2096: 1975: 1159: 939: 326: 153: 114: 79: 4143: 4109: 3890: 3809: 3725: 3451: 3252: 3049: 2968: 4626: 4574: 4557: 4479: 4471: 4159: 4097: 4060: 4039: 4027:
Discrete Particle Swarm Optimization, illustrated by the Traveling Salesman Problem
3982: 3936: 3868: 3865:
Proceedings of the 2003 IEEE Swarm Intelligence Symposium. SIS'03 (Cat. No.03EX706)
3846: 3834: 3797: 3745: 3713: 3669: 3618: 3570: 3503: 3478: 3431: 3394: 3361: 3349: 3335:"A locally convergent rotationally invariant particle swarm optimization algorithm" 3297: 3283: 3275: 3232: 3195: 3172: 3027: 2946: 2888: 2869: 2865: 2840: 2832: 2802: 2775: 2685: 2675: 2608: 2571: 2541: 2443:"Comparing inertia weights and constriction factors in particle swarm optimization" 2442: 2405: 2382: 2360: 2276: 2181: 2153: 2127: 2108: 2088: 1140: 387: 352: 306: 250: 144: 4175: 4129:
is a repository for information on PSO. Several source codes are freely available.
3588: 2504:(PhD thesis). University of Pretoria, Faculty of Natural and Agricultural Science. 2297: 4442: 4319: 4003:. 'International Journal of Applied Metaheuristic Computing (IJAMC)', 2(4), 41-57 3986: 3687: 3199: 2262:"Analysis of the publications on the applications of particle swarm optimisation" 2041: 1840:
are the parameters of the method. As a refinement of the method one can decrease
1042:, approaches a local optimum of the problem, regardless of how the swarm behaves. 59: 3955: 3749: 3031: 2836: 4785: 4719: 4696: 4530: 4525: 4508: 4501: 4394: 4163: 3717: 3673: 3574: 3435: 3160: 2892: 2806: 2779: 2409: 2026: 314: 124: 26: 21: 4462: 4101: 4064: 3398: 3353: 3279: 3176: 2638: 2496: 4820: 4659: 4401: 4372: 4350: 4262: 3872: 3838: 3801: 3507: 3502:. Evolutionary Computation (CEC), 2013 IEEE Congress on. pp. 2337–2344. 3379: 3089:
Stochastic Star Communication Topology in Evolutionary Particle Swarms (EPSO)
2950: 2185: 2157: 2131: 1147: 992: 942:, or even fine-tuned during the optimization, e.g., by means of fuzzy logic. 539:) Initialize the particle's best known position to its initial position: 375: 368: 197: 3223:
Liu, Q (2015). "Order-2 stability analysis of particle swarm optimization".
2906:
Pedersen, M.E.H. (2010). "Good parameters for particle swarm optimization".
2680: 2364: 4740: 4714: 4704: 4681: 4562: 3443: 3416: 3244: 3113:. International Journal of Swarm Intelligence Research (IJSIR), 2(2), 22-41 2699: 2100: 3940: 2613: 2281: 2176:
Kennedy, J. (1997). "The particle swarm: social adaptation of knowledge".
4760: 4709: 4550: 4406: 3623: 3606: 3236: 2226:"An analysis of publications on particle swarm optimisation applications" 2092: 968:
topologies (SPSO, APSO, stochastic star, TRIBES, Cyber Swarm, and C-PSO)
418: 360: 2845: 2545: 2178:
Proceedings of IEEE International Conference on Evolutionary Computation
2150:
Proceedings of IEEE International Conference on Evolutionary Computation
4567: 4545: 3483: 3466: 3194:. Lecture Notes in Computer Science. Vol. 8667. pp. 134–145. 2997:
Population Topologies and Their Influence in Particle Swarm Performance
2148:
Shi, Y.; Eberhart, R.C. (1998). "A modified particle swarm optimizer".
999:
to the optimum. This belief is the precursor of many PSO variants, see
414: 3956:"MOPSO: A Proposal for Multiple Objective Particle Swarm Optimization" 3288: 355:, as a stylized representation of the movement of organisms in a bird 4795: 4435: 3640:"Extending Particle Swarm Optimisers with Self-Organized Criticality" 3026:. Lecture Notes in Computer Science. Vol. 9882. pp. 77–88. 1151: 1118:
optimization, and the incorporation of an effective learning method.
1062: 345: 3647:
Proceedings of the Fourth Congress on Evolutionary Computation (CEC)
945:
Parameters have also been tuned for various optimization scenarios.
4724: 4686: 4418: 4360: 4029:, New Optimization Techniques in Engineering, Springer, pp. 219-239 3021: 1027:
Convergence of the sequence of solutions (aka, stability analysis,
422: 379: 330: 4139:
Simulation of PSO convergence in a two-dimensional space (Matlab).
3904: 4345: 3589:
DEPSO: hybrid particle swarm with differential evolution operator
2122:
Kennedy, J.; Eberhart, R. (1995). "Particle Swarm Optimization".
322: 245: 4126: 3063: 464:
be the number of particles in the swarm, each having a position
4379: 2124:
Proceedings of IEEE International Conference on Neural Networks
868:
are often called cognitive coefficient and social coefficient.
3593:
IEEE International Conference on Systems, Man, and Cybernetics
3548:
Proceedings of Parallel Problem Solving from Nature VII (PPSN)
983:
as to why and how the PSO algorithm can perform optimization.
4621: 4132: 2822: 3972: 3953: 3497: 2792: 4367: 4288: 3933:
Proceedings of the ACM Symposium on Applied Computing (SAC)
3738: 2859: 2755: 1596: 4014:
A discrete binary version of the particle swarm algorithm
3015: 2661: 4135:
of particle swarms optimizing three benchmark functions.
2751: 2749: 2747: 2458: 2456: 1034:
Convergence to a local optimum where all personal bests
4050: 3918:"Search Results: APSO - File Exchange - MATLAB Central" 3686:
Xie, Xiao-Feng; Zhang, Wen-Jun; Yang, Zhi-Lian (2002).
3109:
Yin, P., Glover, F., Laguna, M., & Zhu, J. (2011).
2472:
Proceedings of the Particle Swarm Optimization Workshop
2447:
Proceedings of the Congress on Evolutionary Computation
1407:
are the position and the best position of the particle
386:
as is required by classic optimization methods such as
3491: 3313:"A convergence proof for the particle swarm optimizer" 2882: 2587: 2585: 1981: 1146:
Another argument in favour of simplifying PSO is that
367:. An extensive survey of PSO applications is made by 3924: 3863:
Kennedy, James (2003). "Bare bones particle swarms".
2924: 2744: 2453: 2389: 2007:
Usually a position and a velocity are represented by
1936: 1916: 1866: 1846: 1814: 1782: 1762: 1733: 1614: 1559: 1539: 1510: 1462: 1433: 1413: 1377: 1341: 1182: 909: 3415:
Zhan, Z-H.; Zhang, J.; Li, Y; Chung, H.S-H. (2009).
3010:
Particle swarm optimiser with neighbourhood operator
2395: 4205: 3947: 3930: 3631: 3532: 3265: 3122: 2711: 2709: 2655: 2582: 2557: 2555: 2517: 2515: 2513: 2511: 2434: 2171: 2169: 2167: 1974:, in which the objective function comparison takes 1038:or, alternatively, the swarm's best known position 959:subset can be a geometrical one – for example "the 4806:Task allocation and partitioning of social insects 3653: 3424:IEEE Transactions on Systems, Man, and Cybernetics 3332: 2899: 2662:Meissner, M.; Schmuker, M.; Schneider, G. (2006). 2428:Proceedings of Evolutionary Programming VII (EP98) 2078: 1954: 1922: 1903:{\displaystyle \alpha _{n}=\alpha _{0}\gamma ^{n}} 1902: 1852: 1832: 1800: 1776:is the typical length of the problem at hand, and 1768: 1748: 1716: 1585: 1545: 1525: 1492: 1448: 1419: 1399: 1363: 1324: 1121: 927: 683:(0,1) Update the particle's velocity: 3554: 3410: 3408: 3380:"Orthogonal Learning Particle Swarm Optimization" 3333:Bonyadi, Mohammad reza.; Michalewicz, Z. (2014). 2719:Tuning & Simplifying Heuristical Optimization 2421: 2419: 2074: 2072: 1997:multiplying a velocity by a numerical coefficient 4818: 3907:, NDT 2011, Springer CCIS 136, pp. 53-66 (2011). 3414: 3378:Zhan, Z-H.; Zhang, J.; Li, Y; Shi, Y-H. (2011). 3134:. Honolulu, HI. pp. 289–296. Archived from 3132:IEEE Swarm Intelligence Symposium 2007 (SIS2007) 2706: 2626: 2624: 2601:Journal of Artificial Evolution and Applications 2591: 2552: 2508: 2490: 2488: 2269:Journal of Artificial Evolution and Applications 2164: 2115: 948: 3960:Congress on Evolutionary Computation (CEC'2002) 3954:Coello Coello, C.; Salazar Lechuga, M. (2002). 3816: 3464: 3087:Miranda, V., Keko, H. and Duque, Á. J. (2008). 2198: 2143: 2141: 2121: 1965: 1023:typically refers to two different definitions: 3999:Roy, R., Dehuri, S., & Cho, S. B. (2012). 3703: 3500:2013 IEEE Congress on Evolutionary Computation 3405: 3377: 3373: 3371: 2462: 2416: 2349: 2345: 2343: 2341: 2339: 2337: 2069: 4304: 4191: 4080:IEEE Transactions on Evolutionary Computation 3827:IEEE Transactions on Evolutionary Computation 3790:IEEE Transactions on Evolutionary Computation 3637: 3538: 3387:IEEE Transactions on Evolutionary Computation 2930: 2876: 2818: 2816: 2786: 2621: 2524:IEEE Transactions on Evolutionary Computation 2494: 2485: 2253: 2217: 2192: 471: βˆˆ ℝ in the search-space and a velocity 429:is not known. The goal is to find a solution 270: 3966: 3756: 3471:CAAI Transactions on Intelligence Technology 3259: 3123:Elshamy, W.; Rashad, H.; Bahgat, A. (2007). 3116: 2521: 2138: 3560: 3368: 3326: 2440: 2425: 2334: 2147: 794:Update the particle's best known position: 317:, and moving these particles around in the 4311: 4297: 4198: 4184: 4071: 4012:Kennedy, J. & Eberhart, R. C. (1997). 3781: 3732: 2853: 2813: 2312: 1756:is a random uniformly distributed vector, 1150:can only have their efficacy demonstrated 745:) Update the particle's position: 510:Initialize the particle's position with a 277: 263: 4091: 4077: 3688:A dissipative particle swarm optimization 3680: 3622: 3604: 3581: 3482: 3310: 3287: 3125:"Clubs-based Particle Swarm Optimization" 3077:https://doi.org/10.1007/s00521-017-2930-y 2940: 2935:. Vol. 2. pp. 1671–1676 vol.2. 2915: 2844: 2769: 2689: 2679: 2612: 2535: 2280: 1710: 1318: 3765:Nature-Inspired Metaheuristic Algorithms 3189: 2905: 2715: 2498:An Analysis of Particle Swarm Optimizers 890: 880: 833:Update the swarm's best known position: 578:update the swarm's best known position: 20: 3862: 3659: 3595:(SMCC), Washington, DC, USA: 3816-3821. 3587:Zhang, Wen-Jun; Xie, Xiao-Feng (2003). 
3216: 3158: 2175: 1597:Accelerated Particle Swarm Optimization 485:be the best known position of particle 4819: 3417:"Adaptive Particle Swarm Optimization" 2561: 2353:2007 IEEE Swarm Intelligence Symposium 2321:"Standard Particle Swarm Optimisation" 1076: 886: 453:in the search-space, which would mean 4414:Patterns of self-organization in ants 4292: 4179: 3611:Progress in Electromagnetics Research 3111:A Complementary Cyber Swarm Algorithm 2630: 2318: 2295: 1493:{\displaystyle G({\vec {x}},\sigma )} 3822: 3787: 3762: 3694:(CEC), Honolulu, HI, USA: 1456-1461. 3692:Congress on Evolutionary Computation 2999:(PhD thesis). Universidade do Minho. 2302:Mathematical Problems in Engineering 2259: 2223: 2199:Kennedy, J.; Eberhart, R.C. (2001). 2126:. Vol. IV. pp. 1942–1948. 1051:and the swarm's best known position 1008: 589:Initialize the particle's velocity: 105:Evolutionary multimodal optimization 4149: 4053:Applied Mathematics and Computation 3222: 2592:Bratton, D.; Blackwell, T. (2008). 1982:Binary, discrete, and combinatorial 1962:is the decrease control parameter. 1930:is the number of the iteration and 1833:{\displaystyle \alpha \sim 0.1-0.5} 954:swarm share the same best position 632:a termination criterion is not met 25:A particle swarm searching for the 13: 4424:symmetry breaking of escaping ants 3744:. Vol. 7264. pp. 74–85. 3706:IEEE Transactions on Fuzzy Systems 3649:. Vol. 2. pp. 1588–1593. 2825:Swarm and Evolutionary Computation 1801:{\displaystyle \beta \sim 0.1-0.7} 1134: 14: 4843: 4258:Infinite-dimensional optimization 4120: 2463:Carlisle, A.; Dozier, G. (2001). 2003:applying a velocity to a position 1955:{\displaystyle 0<\gamma <1} 1165: 974: 301:) is a computational method that 4461: 4152:Expert Systems with Applications 3903:X. S. Yang, S. Deb and S. Fong, 3638:Lovbjerg, M.; Krink, T. (2002). 3539:Lovbjerg, M.; Krink, T. (2002). 3165:IEEE Transactions on Cybernetics 2441:Eberhart, R.C.; Shi, Y. (2000). 1593:signifies the norm of a vector. 1112: 995:yet still ensure a good rate of 336:PSO is originally attributed to 130:Promoter based genetic algorithm 4207:Major subfields of optimization 4044: 4032: 4019: 4006: 3993: 3910: 3897: 3856: 3697: 3598: 3458: 3304: 3183: 3152: 3103: 3094: 3081: 3068: 3056: 3002: 2989: 2022:Artificial bee colony algorithm 1122:Alleviate premature convergence 658: = 1, ...,  644: = 1, ...,  503: = 1, ...,  65:Cellular evolutionary algorithm 2870:10.1109/FUZZ-IEEE.2015.7337957 2594:"A Simplified Recombinant PSO" 2564:Information Processing Letters 2474:. pp. 1–6. Archived from 2449:. Vol. 1. pp. 84–88. 2289: 1740: 1704: 1683: 1659: 1649: 1637: 1634: 1622: 1579: 1574: 1566: 1561: 1517: 1487: 1475: 1466: 1440: 1400:{\displaystyle {\vec {p}}_{i}} 1385: 1364:{\displaystyle {\vec {x}}_{i}} 1349: 1309: 1304: 1297: 1276: 1265: 1260: 1244: 1223: 1190: 1014: 922: 910: 1: 2981:: CS1 maint: date and year ( 2576:10.1016/S0020-0190(02)00447-7 2062: 2057:Dispersive flies optimisation 1970:PSO has also been applied to 1456:is the global best position; 1000: 949:Neighbourhoods and topologies 216:Cartesian genetic programming 135:Spiral optimization algorithm 4385:Mixed-species foraging flock 4336:Agent-based model in biology 4318: 3987:10.1016/j.neucom.2017.03.086 3605:Zhang, Y.; Wang, S. (2015). 
3200:10.1007/978-3-319-09952-1_12 2032:Derivative-free optimization 1966:Multi-objective optimization 1019:In relation to PSO the word 397: 231:Multi expression programming 7: 4632:Particle swarm optimization 4273:Multiobjective optimization 3750:10.1007/978-3-642-29066-4_7 3032:10.1007/978-3-319-44427-7_7 3008:Suganthan, Ponnuthurai N. " 2837:10.1016/j.swevo.2017.09.001 2015: 1088: 348:and was first intended for 295:particle swarm optimization 110:Particle swarm optimization 18:Iterative simulation method 10: 4848: 4341:Collective animal behavior 4253:Combinatorial optimization 4164:10.1016/j.eswa.2008.10.086 3718:10.1109/TFUZZ.2013.2278972 3674:10.1016/j.asoc.2009.06.010 3575:10.1016/j.asoc.2009.07.001 3436:10.1109/TSMCB.2009.2015956 2893:10.1109/CIBCB.2015.7300288 2807:10.1016/j.asoc.2017.10.018 2780:10.1016/j.asoc.2009.08.029 2495:van den Bergh, F. (2001). 2410:10.1016/j.asoc.2015.10.004 1749:{\displaystyle {\vec {u}}} 1586:{\displaystyle ||\dots ||} 1526:{\displaystyle {\vec {x}}} 1449:{\displaystyle {\vec {g}}} 221:Linear genetic programming 168:Clonal selection algorithm 120:Natural evolution strategy 4733: 4695: 4650: 4602: 4470: 4459: 4326: 4213: 4102:10.1109/tevc.2009.2030331 4065:10.1016/j.amc.2007.04.096 3399:10.1109/TEVC.2010.2052054 3354:10.1007/s11721-014-0095-1 3280:10.1007/s11721-017-0141-x 3177:10.1109/TCYB.2020.3028070 2716:Pedersen, M.E.H. (2010). 4670:Self-propelled particles 4171:Links to PSO source code 3873:10.1109/SIS.2003.1202251 3839:10.1109/TEVC.2008.926734 3802:10.1109/TEVC.2004.831258 3508:10.1109/CEC.2013.6557848 3225:Evolutionary Computation 2951:10.1109/CEC.2002.1004493 2233:Technical Report CSM-469 2186:10.1109/ICEC.1997.592326 2158:10.1109/ICEC.1998.699146 2132:10.1109/ICNN.1995.488968 2081:Evolutionary Computation 2037:Multi-swarm optimization 1972:multi-objective problems 1129:multi-swarm optimization 85:Evolutionary computation 4832:Evolutionary algorithms 4751:Collective intelligence 4617:Ant colony optimization 4268:Constraint satisfaction 3320:Fundamenta Informaticae 2908:Technical Report HL1001 2681:10.1186/1471-2105-7-125 2365:10.1109/SIS.2007.368035 2328:HAL Open Access Archive 1853:{\displaystyle \alpha } 1546:{\displaystyle \sigma } 1533:and standard deviation 457:is the global minimum. 4771:Microbial intelligence 4431:Shoaling and schooling 4243:Stochastic programming 4223:Fractional programming 4127:Particle Swarm Central 3662:Applied Soft Computing 3563:Applied Soft Computing 3064:Particle Swarm Central 2795:Applied Soft Computing 2758:Applied Soft Computing 2465:"An Off-The-Shelf PSO" 2398:Applied Soft Computing 1956: 1924: 1904: 1854: 1834: 1802: 1770: 1750: 1718: 1587: 1547: 1527: 1494: 1450: 1421: 1401: 1365: 1326: 929: 896: 75:Differential evolution 55:Artificial development 46:Evolutionary algorithm 30: 4238:Nonlinear programming 4233:Quadratic programming 3962:. pp. 1051–1056. 3941:10.1145/508791.508907 2000:adding two velocities 1957: 1925: 1905: 1860:with each iteration, 1855: 1835: 1803: 1771: 1751: 1719: 1588: 1548: 1528: 1495: 1451: 1422: 1402: 1366: 1327: 989:premature convergence 930: 894: 665:Pick random numbers: 512:uniformly distributed 323:mathematical formulae 291:computational science 226:Grammatical evolution 188:Genetic fuzzy systems 24: 4791:Spatial organization 4756:Decentralised system 4594:Sea turtle migration 4448:Swarming (honey bee) 3935:. pp. 603–607. 3624:10.2528/pier15040602 3237:10.1162/EVCO_a_00129 3159:Jian-Yu, Li (2021). 2359:. pp. 120–127. 2180:. pp. 303–308. 
2093:10.1162/EVCO_r_00180 1987:more sophisticated. 1934: 1914: 1864: 1844: 1812: 1780: 1760: 1731: 1612: 1557: 1537: 1508: 1460: 1431: 1411: 1375: 1339: 1180: 907: 392:quasi-newton methods 325:over the particle's 321:according to simple 309:trying to improve a 4766:Group size measures 4328:Biological swarming 4278:Simulated annealing 4248:Robust optimization 4228:Integer programming 3763:Yang, X.S. (2008). 3550:. pp. 621–630. 2995:Mendes, R. (2004). 2614:10.1155/2008/654184 2546:10.1109/4235.985692 2430:. pp. 591–600. 2282:10.1155/2008/685175 2203:. Morgan Kaufmann. 1502:normal distribution 1077:Adaptive mechanisms 887:Parameter selection 404:candidate solutions 236:Genetic Improvement 207:Genetic programming 140:Self-modifying code 95:Gaussian adaptation 4781:Predator satiation 4642:Swarm (simulation) 4637:Swarm intelligence 4612:Agent-based models 4443:Swarming behaviour 4218:Convex programming 4025:Clerc, M. (2004). 3867:. pp. 80–87. 3484:10.1049/cit2.12106 3342:Swarm Intelligence 3311:Van den Bergh, F. 3268:Swarm Intelligence 3192:Swarm Intelligence 3024:Swarm Intelligence 2668:BMC Bioinformatics 2631:Evers, G. (2009). 2319:Clerc, M. (2012). 2296:Zhang, Y. (2015). 2201:Swarm Intelligence 2152:. pp. 69–73. 2052:Fish School Search 2047:Swarm intelligence 1952: 1920: 1900: 1850: 1830: 1798: 1766: 1746: 1714: 1583: 1543: 1523: 1490: 1446: 1417: 1397: 1361: 1322: 981:schools of thought 979:There are several 925: 897: 365:swarm intelligence 311:candidate solution 90:Evolution strategy 70:Cultural algorithm 31: 4814: 4813: 4801:Military swarming 4746:Animal navigation 4665:Collective motion 4652:Collective motion 4519:reverse migration 4453:Swarming motility 4286: 4285: 3774:978-1-905986-10-1 3767:. Luniver Press. 3517:978-1-4799-0454-9 3209:978-3-319-09951-4 3171:(10): 4848-4859. 3041:978-3-319-44426-0 2960:978-0-7803-7282-5 2642:(Master's thesis) 2374:978-1-4244-0708-8 2260:Poli, R. (2008). 2224:Poli, R. (2007). 2210:978-1-55860-595-4 1923:{\displaystyle n} 1769:{\displaystyle L} 1743: 1707: 1686: 1662: 1625: 1520: 1478: 1443: 1420:{\displaystyle i} 1388: 1352: 1300: 1279: 1254: 1247: 1226: 1193: 1160:genetic algorithm 940:meta-optimization 287: 286: 154:Genetic algorithm 115:Memetic algorithm 100:Grammar induction 80:Effective fitness 4839: 4627:Crowd simulation 4604:Swarm algorithms 4575:Insect migration 4480:Animal migration 4472:Animal migration 4465: 4390:Mobbing behavior 4313: 4306: 4299: 4290: 4289: 4200: 4193: 4186: 4177: 4176: 4167: 4158:(5): 9533–9538. 4114: 4113: 4095: 4075: 4069: 4068: 4048: 4042: 4040:Open Archive HAL 4036: 4030: 4023: 4017: 4010: 4004: 3997: 3991: 3990: 3970: 3964: 3963: 3951: 3945: 3944: 3928: 3922: 3921: 3914: 3908: 3901: 3895: 3894: 3860: 3854: 3851: 3820: 3814: 3813: 3785: 3779: 3778: 3760: 3754: 3753: 3736: 3730: 3729: 3701: 3695: 3684: 3678: 3677: 3657: 3651: 3650: 3644: 3635: 3629: 3628: 3626: 3602: 3596: 3585: 3579: 3578: 3558: 3552: 3551: 3545: 3536: 3530: 3529: 3495: 3489: 3488: 3486: 3462: 3456: 3455: 3430:(6): 1362–1381. 3421: 3412: 3403: 3402: 3384: 3375: 3366: 3365: 3339: 3330: 3324: 3323: 3317: 3308: 3302: 3301: 3291: 3263: 3257: 3256: 3220: 3214: 3213: 3187: 3181: 3180: 3156: 3150: 3149: 3147: 3146: 3140: 3129: 3120: 3114: 3107: 3101: 3098: 3092: 3085: 3079: 3072: 3066: 3060: 3054: 3053: 3019: 3013: 3006: 3000: 2993: 2987: 2986: 2980: 2972: 2944: 2928: 2922: 2921: 2919: 2903: 2897: 2896: 2880: 2874: 2873: 2864:. pp. 1–8. 2857: 2851: 2850: 2848: 2820: 2811: 2810: 2790: 2784: 2783: 2773: 2753: 2742: 2741: 2739: 2733:. 
Archived from 2724: 2713: 2704: 2703: 2693: 2683: 2659: 2653: 2652: 2650: 2649: 2643: 2628: 2619: 2618: 2616: 2598: 2589: 2580: 2579: 2559: 2550: 2549: 2539: 2519: 2506: 2505: 2503: 2492: 2483: 2482: 2480: 2469: 2460: 2451: 2450: 2438: 2432: 2431: 2423: 2414: 2413: 2393: 2387: 2386: 2358: 2347: 2332: 2331: 2325: 2316: 2310: 2309: 2293: 2287: 2286: 2284: 2266: 2257: 2251: 2250: 2248: 2247: 2241: 2235:. Archived from 2230: 2221: 2215: 2214: 2196: 2190: 2189: 2173: 2162: 2161: 2145: 2136: 2135: 2119: 2113: 2112: 2076: 1976:Pareto dominance 1961: 1959: 1958: 1953: 1929: 1927: 1926: 1921: 1909: 1907: 1906: 1901: 1899: 1898: 1889: 1888: 1876: 1875: 1859: 1857: 1856: 1851: 1839: 1837: 1836: 1831: 1807: 1805: 1804: 1799: 1775: 1773: 1772: 1767: 1755: 1753: 1752: 1747: 1745: 1744: 1736: 1723: 1721: 1720: 1715: 1709: 1708: 1700: 1688: 1687: 1679: 1670: 1669: 1664: 1663: 1655: 1633: 1632: 1627: 1626: 1618: 1592: 1590: 1589: 1584: 1582: 1577: 1569: 1564: 1552: 1550: 1549: 1544: 1532: 1530: 1529: 1524: 1522: 1521: 1513: 1499: 1497: 1496: 1491: 1480: 1479: 1471: 1455: 1453: 1452: 1447: 1445: 1444: 1436: 1426: 1424: 1423: 1418: 1406: 1404: 1403: 1398: 1396: 1395: 1390: 1389: 1381: 1370: 1368: 1367: 1362: 1360: 1359: 1354: 1353: 1345: 1331: 1329: 1328: 1323: 1317: 1313: 1312: 1307: 1302: 1301: 1293: 1287: 1286: 1281: 1280: 1272: 1268: 1263: 1255: 1250: 1249: 1248: 1240: 1234: 1233: 1228: 1227: 1219: 1214: 1201: 1200: 1195: 1194: 1186: 934: 932: 931: 928:{\displaystyle } 926: 388:gradient descent 353:social behaviour 279: 272: 265: 251:Parity benchmark 145:Polymorphic code 33: 32: 4847: 4846: 4842: 4841: 4840: 4838: 4837: 4836: 4817: 4816: 4815: 4810: 4729: 4691: 4646: 4598: 4466: 4457: 4322: 4317: 4287: 4282: 4209: 4204: 4123: 4118: 4117: 4093:10.1.1.224.5378 4076: 4072: 4049: 4045: 4037: 4033: 4024: 4020: 4011: 4007: 3998: 3994: 3971: 3967: 3952: 3948: 3929: 3925: 3916: 3915: 3911: 3902: 3898: 3883: 3861: 3857: 3821: 3817: 3786: 3782: 3775: 3761: 3757: 3737: 3733: 3702: 3698: 3685: 3681: 3658: 3654: 3642: 3636: 3632: 3603: 3599: 3586: 3582: 3559: 3555: 3543: 3537: 3533: 3518: 3496: 3492: 3463: 3459: 3419: 3413: 3406: 3382: 3376: 3369: 3337: 3331: 3327: 3315: 3309: 3305: 3264: 3260: 3221: 3217: 3210: 3188: 3184: 3157: 3153: 3144: 3142: 3138: 3127: 3121: 3117: 3108: 3104: 3099: 3095: 3086: 3082: 3073: 3069: 3061: 3057: 3042: 3020: 3016: 3007: 3003: 2994: 2990: 2974: 2973: 2961: 2942:10.1.1.114.7988 2929: 2925: 2917:10.1.1.298.4359 2904: 2900: 2881: 2877: 2858: 2854: 2821: 2814: 2791: 2787: 2771:10.1.1.149.8300 2754: 2745: 2737: 2722: 2714: 2707: 2660: 2656: 2647: 2645: 2641: 2629: 2622: 2596: 2590: 2583: 2560: 2553: 2537:10.1.1.460.6608 2520: 2509: 2501: 2493: 2486: 2478: 2467: 2461: 2454: 2439: 2435: 2424: 2417: 2394: 2390: 2375: 2356: 2348: 2335: 2323: 2317: 2313: 2294: 2290: 2264: 2258: 2254: 2245: 2243: 2239: 2228: 2222: 2218: 2211: 2197: 2193: 2174: 2165: 2146: 2139: 2120: 2116: 2077: 2070: 2065: 2042:Particle filter 2018: 1984: 1968: 1935: 1932: 1931: 1915: 1912: 1911: 1894: 1890: 1884: 1880: 1871: 1867: 1865: 1862: 1861: 1845: 1842: 1841: 1813: 1810: 1809: 1781: 1778: 1777: 1761: 1758: 1757: 1735: 1734: 1732: 1729: 1728: 1699: 1698: 1678: 1677: 1665: 1654: 1653: 1652: 1628: 1617: 1616: 1615: 1613: 1610: 1609: 1599: 1578: 1573: 1565: 1560: 1558: 1555: 1554: 1538: 1535: 1534: 1512: 1511: 1509: 1506: 1505: 1470: 1469: 1461: 1458: 1457: 1435: 1434: 1432: 1429: 1428: 1412: 1409: 1408: 1391: 1380: 1379: 1378: 1376: 1373: 1372: 1355: 1344: 1343: 1342: 1340: 1337: 1336: 1308: 1303: 1292: 1291: 1282: 1271: 
1270: 1269: 1264: 1259: 1239: 1238: 1229: 1218: 1217: 1216: 1215: 1213: 1212: 1208: 1196: 1185: 1184: 1183: 1181: 1178: 1177: 1168: 1137: 1135:Simplifications 1124: 1115: 1100: 1091: 1079: 1017: 977: 951: 908: 905: 904: 889: 878: 874: 867: 863: 858: 851: 844: 843: 820: 807: 800: 789: 778: 765: 758: 751: 744: 737: 730: 724: 720: 713: 706: 700: 696: 690: β† w 689: 678: 671: 654:each dimension 626: 619: 612: 605: 595: 588: 565: 552: 545: 537: 530: 520: 514:random vector: 484: 478: βˆˆ ℝ. Let 477: 470: 400: 283: 60:Artificial life 19: 12: 11: 5: 4845: 4835: 4834: 4829: 4827:Metaheuristics 4812: 4811: 4809: 4808: 4803: 4798: 4793: 4788: 4786:Quorum sensing 4783: 4778: 4773: 4768: 4763: 4758: 4753: 4748: 4743: 4737: 4735: 4734:Related topics 4731: 4730: 4728: 4727: 4722: 4720:Swarm robotics 4717: 4712: 4707: 4701: 4699: 4697:Swarm robotics 4693: 4692: 4690: 4689: 4684: 4679: 4678: 4677: 4667: 4662: 4656: 4654: 4648: 4647: 4645: 4644: 4639: 4634: 4629: 4624: 4619: 4614: 4608: 4606: 4600: 4599: 4597: 4596: 4591: 4590: 4589: 4588: 4587: 4572: 4571: 4570: 4565: 4555: 4554: 4553: 4548: 4543: 4538: 4531:Fish migration 4528: 4526:Cell migration 4523: 4522: 4521: 4516: 4509:Bird migration 4506: 4505: 4504: 4502:coded wire tag 4499: 4498: 4497: 4487: 4476: 4474: 4468: 4467: 4460: 4458: 4456: 4455: 4450: 4445: 4440: 4439: 4438: 4428: 4427: 4426: 4421: 4411: 4410: 4409: 4399: 4398: 4397: 4395:feeding frenzy 4387: 4382: 4377: 4376: 4375: 4365: 4364: 4363: 4358: 4348: 4343: 4338: 4332: 4330: 4324: 4323: 4316: 4315: 4308: 4301: 4293: 4284: 4283: 4281: 4280: 4275: 4270: 4265: 4263:Metaheuristics 4260: 4255: 4250: 4245: 4240: 4235: 4230: 4225: 4220: 4214: 4211: 4210: 4203: 4202: 4195: 4188: 4180: 4174: 4173: 4168: 4147: 4141: 4136: 4130: 4122: 4121:External links 4119: 4116: 4115: 4086:(2): 278–300. 4070: 4043: 4031: 4018: 4005: 3992: 3975:Neurocomputing 3965: 3946: 3923: 3909: 3896: 3881: 3855: 3815: 3796:(5): 456–470. 3780: 3773: 3755: 3731: 3712:(4): 919–933. 3696: 3679: 3668:(1): 119–124. 3652: 3630: 3597: 3580: 3569:(1): 183–197. 3553: 3531: 3516: 3490: 3477:(3): 849-862. 3457: 3404: 3393:(6): 832–847. 3367: 3348:(3): 159–198. 3325: 3303: 3258: 3231:(2): 187–216. 3215: 3208: 3182: 3151: 3115: 3102: 3093: 3080: 3067: 3055: 3040: 3014: 3001: 2988: 2959: 2923: 2898: 2875: 2852: 2812: 2785: 2764:(2): 618–628. 2743: 2740:on 2020-02-13. 2705: 2654: 2620: 2581: 2570:(6): 317–325. 2551: 2507: 2484: 2481:on 2003-05-03. 
2452: 2433: 2415: 2388: 2373: 2333: 2311: 2288: 2252: 2216: 2209: 2191: 2163: 2137: 2114: 2067: 2066: 2064: 2061: 2060: 2059: 2054: 2049: 2044: 2039: 2034: 2029: 2027:Bees algorithm 2024: 2017: 2014: 2005: 2004: 2001: 1998: 1995: 1983: 1980: 1967: 1964: 1951: 1948: 1945: 1942: 1939: 1919: 1897: 1893: 1887: 1883: 1879: 1874: 1870: 1849: 1829: 1826: 1823: 1820: 1817: 1797: 1794: 1791: 1788: 1785: 1765: 1742: 1739: 1725: 1724: 1713: 1706: 1703: 1697: 1694: 1691: 1685: 1682: 1676: 1673: 1668: 1661: 1658: 1651: 1648: 1645: 1642: 1639: 1636: 1631: 1624: 1621: 1598: 1595: 1581: 1576: 1572: 1568: 1563: 1542: 1519: 1516: 1504:with the mean 1489: 1486: 1483: 1477: 1474: 1468: 1465: 1442: 1439: 1416: 1394: 1387: 1384: 1358: 1351: 1348: 1333: 1332: 1321: 1316: 1311: 1306: 1299: 1296: 1290: 1285: 1278: 1275: 1267: 1262: 1258: 1253: 1246: 1243: 1237: 1232: 1225: 1222: 1211: 1207: 1204: 1199: 1192: 1189: 1167: 1166:Bare Bones PSO 1164: 1156:proven correct 1148:metaheuristics 1136: 1133: 1123: 1120: 1114: 1111: 1098: 1090: 1087: 1078: 1075: 1044: 1043: 1032: 1016: 1013: 976: 975:Inner workings 973: 950: 947: 924: 921: 918: 915: 912: 888: 885: 876: 872: 865: 861: 856: 849: 841: 818: 805: 798: 787: 776: 763: 756: 749: 742: 735: 728: 722: 718: 711: 704: 698: 694: 687: 676: 669: 640:each particle 624: 617: 610: 603: 593: 586: 563: 550: 543: 535: 528: 518: 499:each particle 495: 482: 475: 468: 441:) β‰€  409:Formally, let 399: 396: 384:differentiable 285: 284: 282: 281: 274: 267: 259: 256: 255: 254: 253: 248: 243: 238: 233: 228: 223: 218: 210: 209: 203: 202: 201: 200: 195: 190: 185: 183:Genetic memory 180: 175: 170: 165: 157: 156: 150: 149: 148: 147: 142: 137: 132: 127: 125:Neuroevolution 122: 117: 112: 107: 102: 97: 92: 87: 82: 77: 72: 67: 62: 57: 49: 48: 42: 41: 27:global minimum 17: 9: 6: 4: 3: 2: 4844: 4833: 4830: 4828: 4825: 4824: 4822: 4807: 4804: 4802: 4799: 4797: 4794: 4792: 4789: 4787: 4784: 4782: 4779: 4777: 4774: 4772: 4769: 4767: 4764: 4762: 4759: 4757: 4754: 4752: 4749: 4747: 4744: 4742: 4739: 4738: 4736: 4732: 4726: 4723: 4721: 4718: 4716: 4713: 4711: 4708: 4706: 4703: 4702: 4700: 4698: 4694: 4688: 4685: 4683: 4680: 4676: 4673: 4672: 4671: 4668: 4666: 4663: 4661: 4660:Active matter 4658: 4657: 4655: 4653: 4649: 4643: 4640: 4638: 4635: 4633: 4630: 4628: 4625: 4623: 4620: 4618: 4615: 4613: 4610: 4609: 4607: 4605: 4601: 4595: 4592: 4586: 4583: 4582: 4581: 4578: 4577: 4576: 4573: 4569: 4566: 4564: 4561: 4560: 4559: 4556: 4552: 4549: 4547: 4544: 4542: 4539: 4537: 4536:diel vertical 4534: 4533: 4532: 4529: 4527: 4524: 4520: 4517: 4515: 4512: 4511: 4510: 4507: 4503: 4500: 4496: 4493: 4492: 4491: 4488: 4486: 4483: 4482: 4481: 4478: 4477: 4475: 4473: 4469: 4464: 4454: 4451: 4449: 4446: 4444: 4441: 4437: 4434: 4433: 4432: 4429: 4425: 4422: 4420: 4417: 4416: 4415: 4412: 4408: 4405: 4404: 4403: 4400: 4396: 4393: 4392: 4391: 4388: 4386: 4383: 4381: 4378: 4374: 4373:herd behavior 4371: 4370: 4369: 4366: 4362: 4359: 4357: 4354: 4353: 4352: 4349: 4347: 4344: 4342: 4339: 4337: 4334: 4333: 4331: 4329: 4325: 4321: 4314: 4309: 4307: 4302: 4300: 4295: 4294: 4291: 4279: 4276: 4274: 4271: 4269: 4266: 4264: 4261: 4259: 4256: 4254: 4251: 4249: 4246: 4244: 4241: 4239: 4236: 4234: 4231: 4229: 4226: 4224: 4221: 4219: 4216: 4215: 4212: 4208: 4201: 4196: 4194: 4189: 4187: 4182: 4181: 4178: 4172: 4169: 4165: 4161: 4157: 4153: 4148: 4145: 4142: 4140: 4137: 4134: 4133:A brief video 4131: 4128: 4125: 4124: 4111: 4107: 4103: 4099: 4094: 4089: 4085: 4081: 4074: 4066: 4062: 4058: 4054: 4047: 4041: 4035: 4028: 4022: 4015: 4009: 4002: 
3996: 3988: 3984: 3980: 3976: 3969: 3961: 3957: 3950: 3942: 3938: 3934: 3927: 3919: 3913: 3906: 3900: 3892: 3888: 3884: 3882:0-7803-7914-4 3878: 3874: 3870: 3866: 3859: 3853: 3850: 3848: 3844: 3840: 3836: 3832: 3826: 3819: 3811: 3807: 3803: 3799: 3795: 3791: 3784: 3776: 3770: 3766: 3759: 3751: 3747: 3743: 3735: 3727: 3723: 3719: 3715: 3711: 3707: 3700: 3693: 3689: 3683: 3675: 3671: 3667: 3663: 3656: 3648: 3641: 3634: 3625: 3620: 3616: 3612: 3608: 3601: 3594: 3590: 3584: 3576: 3572: 3568: 3564: 3557: 3549: 3542: 3535: 3527: 3523: 3519: 3513: 3509: 3505: 3501: 3494: 3485: 3480: 3476: 3472: 3468: 3461: 3453: 3449: 3445: 3441: 3437: 3433: 3429: 3425: 3418: 3411: 3409: 3400: 3396: 3392: 3388: 3381: 3374: 3372: 3363: 3359: 3355: 3351: 3347: 3343: 3336: 3329: 3321: 3314: 3307: 3299: 3295: 3290: 3285: 3281: 3277: 3273: 3269: 3262: 3254: 3250: 3246: 3242: 3238: 3234: 3230: 3226: 3219: 3211: 3205: 3201: 3197: 3193: 3186: 3178: 3174: 3170: 3166: 3162: 3155: 3141:on 2013-10-23 3137: 3133: 3126: 3119: 3112: 3106: 3097: 3090: 3084: 3078: 3071: 3065: 3059: 3051: 3047: 3043: 3037: 3033: 3029: 3025: 3018: 3011: 3005: 2998: 2992: 2984: 2978: 2970: 2966: 2962: 2956: 2952: 2948: 2943: 2938: 2934: 2927: 2918: 2913: 2909: 2902: 2894: 2890: 2886: 2879: 2871: 2867: 2863: 2856: 2847: 2842: 2838: 2834: 2830: 2826: 2819: 2817: 2808: 2804: 2800: 2796: 2789: 2781: 2777: 2772: 2767: 2763: 2759: 2752: 2750: 2748: 2736: 2732: 2728: 2721: 2720: 2712: 2710: 2701: 2697: 2692: 2687: 2682: 2677: 2673: 2669: 2665: 2658: 2644:on 2011-05-18 2640: 2636: 2635: 2627: 2625: 2615: 2610: 2606: 2602: 2595: 2588: 2586: 2577: 2573: 2569: 2565: 2558: 2556: 2547: 2543: 2538: 2533: 2529: 2525: 2518: 2516: 2514: 2512: 2500: 2499: 2491: 2489: 2477: 2473: 2466: 2459: 2457: 2448: 2444: 2437: 2429: 2422: 2420: 2411: 2407: 2403: 2399: 2392: 2384: 2380: 2376: 2370: 2366: 2362: 2355: 2354: 2346: 2344: 2342: 2340: 2338: 2329: 2322: 2315: 2307: 2303: 2299: 2292: 2283: 2278: 2274: 2270: 2263: 2256: 2242:on 2011-07-16 2238: 2234: 2227: 2220: 2212: 2206: 2202: 2195: 2187: 2183: 2179: 2172: 2170: 2168: 2159: 2155: 2151: 2144: 2142: 2133: 2129: 2125: 2118: 2110: 2106: 2102: 2098: 2094: 2090: 2086: 2082: 2075: 2073: 2068: 2058: 2055: 2053: 2050: 2048: 2045: 2043: 2040: 2038: 2035: 2033: 2030: 2028: 2025: 2023: 2020: 2019: 2013: 2010: 2002: 1999: 1996: 1993: 1992: 1991: 1988: 1979: 1977: 1973: 1963: 1949: 1946: 1943: 1940: 1937: 1917: 1895: 1891: 1885: 1881: 1877: 1872: 1868: 1847: 1827: 1824: 1821: 1818: 1815: 1795: 1792: 1789: 1786: 1783: 1763: 1737: 1711: 1701: 1695: 1692: 1689: 1680: 1674: 1671: 1666: 1656: 1646: 1643: 1640: 1629: 1619: 1608: 1607: 1606: 1603: 1594: 1570: 1540: 1514: 1503: 1484: 1481: 1472: 1463: 1437: 1414: 1392: 1382: 1356: 1346: 1319: 1314: 1294: 1288: 1283: 1273: 1256: 1251: 1241: 1235: 1230: 1220: 1209: 1205: 1202: 1197: 1187: 1176: 1175: 1174: 1171: 1163: 1161: 1157: 1153: 1149: 1144: 1142: 1141:Occam's razor 1132: 1130: 1119: 1113:Hybridization 1110: 1106: 1104: 1097: 1086: 1083: 1074: 1072: 1068: 1064: 1059: 1056: 1054: 1050: 1041: 1037: 1033: 1030: 1026: 1025: 1024: 1022: 1012: 1010: 1004: 1002: 998: 994: 993:local optimum 990: 984: 982: 972: 969: 965: 962: 957: 946: 943: 941: 936: 919: 916: 913: 900: 893: 884: 882: 869: 859: 852: 840: 837: β†  836: 832: 828: 824: 817: 813: 810: 804: 801: β†  797: 793: 786: 782: 775: 771: 768: 762: 755: 752: β†  748: 741: 734: 727: 717: 710: 703: 693: 686: 682: 675: 668: 664: 661: 657: 653: 650: 647: 643: 639: 635: 631: 627: 620: 613: 606: 599: 596: ~  592: 585: 582: β†  581: 577: 573: 569: 
562: 558: 555: 549: 546: β†  542: 538: 531: 524: 521: ~  517: 513: 509: 506: 502: 498: 494: 492: 488: 481: 474: 467: 463: 458: 456: 452: 448: 444: 440: 436: 432: 428: 424: 420: 416: 412: 407: 405: 395: 393: 389: 385: 381: 377: 376:metaheuristic 372: 370: 366: 362: 358: 354: 351: 347: 343: 339: 334: 332: 328: 324: 320: 316: 312: 308: 305:a problem by 304: 300: 296: 292: 280: 275: 273: 268: 266: 261: 260: 258: 257: 252: 249: 247: 244: 242: 239: 237: 234: 232: 229: 227: 224: 222: 219: 217: 214: 213: 212: 211: 208: 205: 204: 199: 198:Fly algorithm 196: 194: 191: 189: 186: 184: 181: 179: 176: 174: 171: 169: 166: 164: 161: 160: 159: 158: 155: 152: 151: 146: 143: 141: 138: 136: 133: 131: 128: 126: 123: 121: 118: 116: 113: 111: 108: 106: 103: 101: 98: 96: 93: 91: 88: 86: 83: 81: 78: 76: 73: 71: 68: 66: 63: 61: 58: 56: 53: 52: 51: 50: 47: 44: 43: 39: 35: 34: 29:of a function 28: 23: 16: 4741:Allee effect 4715:Nanorobotics 4705:Ant robotics 4682:Vicsek model 4631: 4155: 4151: 4144:Applications 4083: 4079: 4073: 4056: 4052: 4046: 4034: 4021: 4008: 3995: 3978: 3974: 3968: 3959: 3949: 3932: 3926: 3912: 3899: 3864: 3858: 3852: 3830: 3828: 3824: 3818: 3793: 3789: 3783: 3764: 3758: 3741: 3734: 3709: 3705: 3699: 3691: 3682: 3665: 3661: 3655: 3646: 3633: 3614: 3610: 3600: 3592: 3583: 3566: 3562: 3556: 3547: 3534: 3499: 3493: 3474: 3470: 3460: 3427: 3423: 3390: 3386: 3345: 3341: 3328: 3319: 3306: 3271: 3267: 3261: 3228: 3224: 3218: 3191: 3185: 3168: 3164: 3154: 3143:. Retrieved 3136:the original 3131: 3118: 3105: 3096: 3083: 3070: 3058: 3023: 3017: 3004: 2991: 2932: 2926: 2907: 2901: 2884: 2878: 2861: 2855: 2846:10446/106467 2828: 2824: 2798: 2794: 2788: 2761: 2757: 2738:(PhD thesis) 2735:the original 2718: 2671: 2667: 2657: 2646:. Retrieved 2639:the original 2633: 2604: 2600: 2567: 2563: 2530:(1): 58–73. 2527: 2523: 2497: 2476:the original 2471: 2446: 2436: 2427: 2401: 2397: 2391: 2352: 2327: 2314: 2305: 2301: 2291: 2272: 2268: 2255: 2244:. Retrieved 2237:the original 2232: 2219: 2200: 2194: 2177: 2149: 2123: 2117: 2084: 2080: 2008: 2006: 1989: 1985: 1969: 1726: 1604: 1600: 1553:; and where 1334: 1172: 1169: 1145: 1138: 1125: 1116: 1107: 1102: 1095: 1092: 1084: 1080: 1070: 1066: 1060: 1057: 1052: 1048: 1045: 1039: 1035: 1020: 1018: 1005: 985: 978: 970: 966: 960: 955: 952: 944: 937: 901: 898: 870: 854: 847: 845: 838: 834: 830: 826: 822: 815: 811: 808: 802: 795: 791: 784: 780: 773: 769: 766: 760: 753: 746: 739: 732: 725: 715: 708: 701: 691: 684: 680: 673: 666: 662: 659: 655: 651: 648: 645: 641: 637: 633: 629: 622: 615: 608: 601: 597: 590: 583: 579: 575: 571: 567: 560: 556: 553: 547: 540: 533: 526: 522: 515: 507: 504: 500: 496: 490: 486: 479: 472: 465: 461: 459: 454: 450: 446: 442: 438: 434: 430: 426: 419:real numbers 410: 408: 401: 373: 335: 319:search-space 298: 294: 288: 109: 15: 4761:Eusociality 4710:Microbotics 4580:butterflies 4551:sardine run 4485:altitudinal 4407:pack hunter 4059:: 299–308. 3981:: 188–197. 3274:(1): 1–22. 2801:: 148–161. 2404:: 281–295. 2087:(1): 1–54. 1152:empirically 1021:convergence 1015:Convergence 997:convergence 846:The values 361:fish school 307:iteratively 4821:Categories 4675:clustering 4568:philopatry 4546:salmon run 4541:Lessepsian 3833:(6): 781. 3289:2263/62934 3145:2012-04-27 2674:(1): 125. 2648:2010-05-05 2246:2010-05-03 2063:References 1029:converging 449:) for all 433:for which 350:simulating 163:Chromosome 4796:Stigmergy 4776:Mutualism 4436:bait ball 4088:CiteSeerX 3617:: 41–58. 3526:206553432 2977:cite book 2937:CiteSeerX 2912:CiteSeerX 2831:: 70–85. 
2766:CiteSeerX 2731:107805461 2532:CiteSeerX 2308:: 931256. 1944:γ 1892:γ 1882:α 1869:α 1848:α 1825:− 1819:∼ 1816:α 1793:− 1787:∼ 1784:β 1741:→ 1705:→ 1693:α 1684:→ 1675:β 1660:→ 1647:β 1644:− 1635:← 1623:→ 1571:… 1541:σ 1518:→ 1485:σ 1476:→ 1441:→ 1386:→ 1350:→ 1298:→ 1289:− 1277:→ 1245:→ 1224:→ 1191:→ 1063:empirical 614:|, | 398:Algorithm 374:PSO is a 315:particles 303:optimizes 193:Selection 173:Crossover 4725:Symbrion 4687:BIO-LGCA 4490:tracking 4419:ant mill 4361:sort sol 4356:flocking 4320:Swarming 4110:17984726 3891:37185749 3810:22382958 3726:27974467 3452:11191625 3444:19362911 3253:25471827 3245:24738856 3050:37588745 2969:14364974 2700:16529661 2607:: 1–10. 2275:: 1–10. 2101:26953883 2016:See also 1910:, where 1089:Variants 489:and let 423:gradient 380:gradient 342:Eberhart 331:velocity 327:position 178:Mutation 38:a series 36:Part of 4585:monarch 4514:flyways 4495:history 4346:Droving 4146:of PSO. 3847:2864886 3362:2261683 3298:9778346 2691:1464136 2383:6217309 2109:8783143 1500:is the 875:, and Ο† 821:) < 779:) < 566:) < 532:,  338:Kennedy 246:Eurisko 4558:Homing 4380:Locust 4108:  4090:  3889:  3879:  3845:  3808:  3771:  3724:  3524:  3514:  3450:  3442:  3360:  3296:  3251:  3243:  3206:  3048:  3038:  2967:  2957:  2939:  2914:  2768:  2729:  2698:  2688:  2534:  2381:  2371:  2207:  2107:  2099:  2012:sets. 1727:where 1335:where 864:and Ο† 636:: 415:vector 241:Schema 40:on the 4622:Boids 4563:natal 4351:Flock 4106:S2CID 3887:S2CID 3843:S2CID 3806:S2CID 3722:S2CID 3643:(PDF) 3544:(PDF) 3522:S2CID 3448:S2CID 3420:(PDF) 3383:(PDF) 3358:S2CID 3338:(PDF) 3316:(PDF) 3294:S2CID 3249:S2CID 3139:(PDF) 3128:(PDF) 3062:SPSO 3046:S2CID 2965:S2CID 2727:S2CID 2723:(PDF) 2597:(PDF) 2502:(PDF) 2479:(PDF) 2468:(PDF) 2379:S2CID 2357:(PDF) 2324:(PDF) 2265:(PDF) 2240:(PDF) 2229:(PDF) 2105:S2CID 1009:below 1001:below 991:to a 881:below 721:) + Ο† 630:while 357:flock 4402:Pack 4368:Herd 3877:ISBN 3769:ISBN 3512:ISBN 3440:PMID 3241:PMID 3204:ISBN 3036:ISBN 2983:link 2955:ISBN 2696:PMID 2605:2008 2369:ISBN 2306:2015 2273:2008 2205:ISBN 2097:PMID 1947:< 1941:< 1808:and 1101:and 1069:and 883:). 853:and 831:then 792:then 576:then 460:Let 390:and 369:Poli 344:and 329:and 4160:doi 4098:doi 4061:doi 4057:195 3983:doi 3979:270 3937:doi 3869:doi 3835:doi 3825:". 3798:doi 3746:doi 3714:doi 3670:doi 3619:doi 3615:152 3571:doi 3504:doi 3479:doi 3432:doi 3395:doi 3350:doi 3284:hdl 3276:doi 3233:doi 3196:doi 3173:doi 3028:doi 2947:doi 2889:doi 2866:doi 2841:hdl 2833:doi 2803:doi 2776:doi 2686:PMC 2676:doi 2609:doi 2572:doi 2542:doi 2406:doi 2361:doi 2277:doi 2182:doi 2154:doi 2128:doi 2089:doi 1828:0.5 1822:0.1 1796:0.7 1790:0.1 743:i,d 719:i,d 712:i,d 697:+ Ο† 695:i,d 688:i,d 652:for 638:for 628:|) 600:(-| 497:for 425:of 417:of 359:or 346:Shi 299:PSO 289:In 4823:: 4156:36 4154:. 4104:. 4096:. 4084:14 4082:. 4055:. 3977:. 3958:. 3885:. 3875:. 3841:. 3831:12 3829:. 3804:. 3792:. 3720:. 3710:22 3708:. 3690:. 3666:10 3664:. 3645:. 3613:. 3609:. 3591:. 3567:10 3565:. 3546:. 3520:. 3510:. 3473:. 3469:. 3446:. 3438:. 3428:39 3426:. 3422:. 3407:^ 3391:15 3389:. 3385:. 3370:^ 3356:. 3344:. 3340:. 3318:. 3292:. 3282:. 3272:12 3270:. 3247:. 3239:. 3229:23 3227:. 3202:. 3169:51 3167:. 3163:. 3130:. 3044:. 3034:. 2979:}} 2975:{{ 2963:. 2953:. 2945:. 2910:. 2887:. 2839:. 2829:39 2827:. 2815:^ 2799:62 2797:. 2774:. 2762:10 2760:. 2746:^ 2708:^ 2694:. 2684:. 2670:. 2666:. 2623:^ 2603:. 2599:. 2584:^ 2568:85 2566:. 2554:^ 2540:. 2526:. 2510:^ 2487:^ 2470:. 2455:^ 2445:. 2418:^ 2402:38 2400:. 2377:. 2367:. 2336:^ 2326:. 
2304:. 2300:. 2271:. 2267:. 2231:. 2166:^ 2140:^ 2103:. 2095:. 2085:25 2083:. 2071:^ 1427:; 1371:, 1011:. 1003:. 935:. 857:up 850:lo 829:) 809:if 790:) 767:if 759:+ 679:~ 672:, 663:do 649:do 634:do 625:lo 618:up 611:lo 604:up 574:) 554:if 536:up 529:lo 508:do 340:, 293:, 4312:e 4305:t 4298:v 4199:e 4192:t 4185:v 4166:. 4162:: 4112:. 4100:: 4067:. 4063:: 3989:. 3985:: 3943:. 3939:: 3920:. 3893:. 3871:: 3849:. 3837:: 3812:. 3800:: 3794:8 3777:. 3752:. 3748:: 3728:. 3716:: 3676:. 3672:: 3627:. 3621:: 3577:. 3573:: 3528:. 3506:: 3487:. 3481:: 3475:8 3454:. 3434:: 3401:. 3397:: 3364:. 3352:: 3346:8 3322:. 3300:. 3286:: 3278:: 3255:. 3235:: 3212:. 3198:: 3179:. 3175:: 3148:. 3052:. 3030:: 2985:) 2971:. 2949:: 2920:. 2895:. 2891:: 2872:. 2868:: 2849:. 2843:: 2835:: 2809:. 2805:: 2782:. 2778:: 2702:. 2678:: 2672:7 2651:. 2617:. 2611:: 2578:. 2574:: 2548:. 2544:: 2528:6 2412:. 2408:: 2385:. 2363:: 2330:. 2285:. 2279:: 2249:. 2213:. 2188:. 2184:: 2160:. 2156:: 2134:. 2130:: 2111:. 2091:: 2009:n 1950:1 1938:0 1918:n 1896:n 1886:0 1878:= 1873:n 1764:L 1738:u 1712:, 1702:u 1696:L 1690:+ 1681:g 1672:+ 1667:i 1657:x 1650:) 1641:1 1638:( 1630:i 1620:x 1580:| 1575:| 1567:| 1562:| 1515:x 1488:) 1482:, 1473:x 1467:( 1464:G 1438:g 1415:i 1393:i 1383:p 1357:i 1347:x 1320:, 1315:) 1310:| 1305:| 1295:g 1284:i 1274:p 1266:| 1261:| 1257:, 1252:2 1242:g 1236:+ 1231:i 1221:p 1210:( 1206:G 1203:= 1198:i 1188:x 1127:( 1103:g 1099:i 1096:p 1071:g 1067:p 1053:g 1049:p 1040:g 1036:p 961:m 956:g 923:] 920:3 917:, 914:1 911:[ 877:g 873:p 866:g 862:p 855:b 848:b 842:i 839:p 835:g 827:g 825:( 823:f 819:i 816:p 814:( 812:f 806:i 803:x 799:i 796:p 788:i 785:p 783:( 781:f 777:i 774:x 772:( 770:f 764:i 761:v 757:i 754:x 750:i 747:x 740:x 738:- 736:d 733:g 731:( 729:g 726:r 723:g 716:x 714:- 709:p 707:( 705:p 702:r 699:p 692:v 685:v 681:U 677:g 674:r 670:p 667:r 660:n 656:d 646:S 642:i 623:b 621:- 616:b 609:b 607:- 602:b 598:U 594:i 591:v 587:i 584:p 580:g 572:g 570:( 568:f 564:i 561:p 559:( 557:f 551:i 548:x 544:i 541:p 534:b 527:b 525:( 523:U 519:i 516:x 505:S 501:i 491:g 487:i 483:i 480:p 476:i 473:v 469:i 466:x 462:S 455:a 451:b 447:b 445:( 443:f 439:a 437:( 435:f 431:a 427:f 411:f 297:( 278:e 271:t 264:v
