Affiliation:
1. Istituto per i Circuiti Elettronici-C.N.R., Genoa, Italy
Abstract
A new global optimization algorithm for functions of continuous variables is presented, derived from the “Simulated Annealing” algorithm recently introduced in combinatorial optimization. The algorithm is essentially an iterative random search procedure with adaptive moves along the coordinate directions. It permits uphill moves under the control of a probabilistic criterion, thus tending to avoid the first local minima encountered. The algorithm has been tested against the Nelder and Mead simplex method and against a version of Adaptive Random Search. The test functions were Rosenbrock valleys and multiminima functions in 2, 4, and 10 dimensions. The new method proved to be more reliable than the others, being always able to find the optimum, or at least a point very close to it. It is quite costly in terms of function evaluations, but its cost can be predicted in advance, depending only slightly on the starting point.
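The abstract describes an iterative random search with moves along the coordinate directions and a probabilistic criterion for accepting uphill moves. Below is a minimal, hedged sketch of that kind of procedure in Python, assuming a Metropolis-style acceptance test and a simple geometric cooling schedule; it is not the published algorithm and omits its adaptive step-size control and termination test. All names and parameters (anneal, step, t0, cooling, n_temps, moves_per_temp) are illustrative.

```python
import math
import random

def anneal(f, x0, step=1.0, t0=10.0, cooling=0.95,
           n_temps=100, moves_per_temp=200, seed=0):
    """Hedged sketch of coordinate-wise simulated annealing for a
    continuous function f: R^n -> R.  Not the published algorithm:
    step-size adaptation and the termination test are simplified."""
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    best, fbest = list(x), fx
    t = t0
    for _ in range(n_temps):
        for _ in range(moves_per_temp):
            # Trial move along one coordinate direction at a time.
            i = rng.randrange(len(x))
            cand = list(x)
            cand[i] += rng.uniform(-step, step)
            fc = f(cand)
            # Metropolis criterion: always accept downhill moves,
            # accept uphill moves with probability exp(-(fc - fx) / t).
            if fc <= fx or rng.random() < math.exp(-(fc - fx) / t):
                x, fx = cand, fc
                if fx < fbest:
                    best, fbest = list(x), fx
        t *= cooling  # geometric cooling schedule (an assumption here)
    return best, fbest

# Example: a 2-D Rosenbrock valley, one of the test functions mentioned.
rosenbrock = lambda v: 100.0 * (v[1] - v[0] ** 2) ** 2 + (1.0 - v[0]) ** 2
print(anneal(rosenbrock, [-1.2, 1.0]))
```

The Rosenbrock function's global minimum is 0 at (1, 1), which gives a simple check on the output; as the abstract notes, this style of search tends to be expensive in function evaluations but relatively insensitive to the starting point.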
Publisher
Association for Computing Machinery (ACM)
Subject
Applied Mathematics, Software
Cited by
1148 articles.