20 similar documents found.
1.
A Numerical Evaluation of Several Stochastic Algorithms on Selected Continuous Global Optimization Test Problems
There is a need for a methodology to fairly compare and present evaluation study results of stochastic global optimization algorithms. This need raises two important questions: (i) an appropriate set of benchmark test problems on which the algorithms may be tested, and (ii) a methodology to compactly and completely present the results. To address the first question, we compiled a collection of test problems, some better known than others. Although the compilation is not exhaustive, it provides an easily accessible collection of standard test problems for continuous global optimization. Five different stochastic global optimization algorithms have been tested on these problems, and a performance profile plot based on the improvement of objective function values is constructed to investigate the macroscopic behavior of the algorithms. The paper also investigates the microscopic behavior of the algorithms through quartile sequential plots, and contrasts the information gained from these two kinds of plots. The effect of the length of run is explored by using three maximum numbers of function evaluations, and it is shown to significantly impact the behavior of the algorithms.
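A rough sketch of how such a performance-profile plot might be assembled is shown below; the solver names, the randomly generated results, and the gap-to-best metric are illustrative assumptions, not the paper's benchmark set or its exact improvement-based metric.

```python
# Illustrative performance-profile sketch (assumed solvers, problems, metric).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_problems = 30
# results[name] = best objective value each solver reached on every problem
results = {
    "solver_A": rng.normal(0.0, 1.0, n_problems),
    "solver_B": rng.normal(0.3, 1.0, n_problems),
    "solver_C": rng.normal(0.6, 1.0, n_problems),
}

# Compare every solver against the best value found on each problem.
best = np.min(np.vstack(list(results.values())), axis=0)
taus = np.linspace(0.0, 3.0, 200)

for name, vals in results.items():
    gaps = vals - best                        # 0 means "best on this problem"
    profile = [(gaps <= t).mean() for t in taus]
    plt.step(taus, profile, where="post", label=name)

plt.xlabel("objective-value gap to best solver")
plt.ylabel("fraction of problems within gap")
plt.legend()
plt.show()
```

Curves that rise quickly and reach 1 early indicate a solver that is close to the best on most problems; this is the macroscopic view a profile plot is meant to give.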
2.
Functions whose local minima and the sizes of their regions of attraction are known a priori are often needed for testing the performance of algorithms that solve global optimization problems. In this paper we investigate a technique for constructing test functions for global optimization problems for which we fix a priori: (i) the problem dimension, (ii) the number of local minima, (iii) the local minima points, (iv) the function values of the local minima. Further, the size of the region of attraction of each local minimum may be made large or small. The technique consists of first constructing a convex quadratic function and then systematically distorting selected parts of this function so as to introduce local minima.
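The sketch below only conveys the general idea of carving local minima into a convex quadratic; it is not the paper's construction, which fixes the minima locations, their function values, and the sizes of their regions of attraction exactly. The dent centers, depths, and widths used here are illustrative parameters.

```python
# Illustrative only: distort a convex quadratic with localized "dents" so that
# extra local minima appear.  Depth and width loosely control the value and
# the size of the region of attraction of each introduced minimum.
import numpy as np

def make_test_function(x_global, dents):
    """dents: list of (center, depth, width) triples (assumed interface)."""
    x_global = np.asarray(x_global, dtype=float)

    def f(x):
        x = np.asarray(x, dtype=float)
        val = np.sum((x - x_global) ** 2)                    # convex quadratic base
        for center, depth, width in dents:
            r2 = np.sum((x - np.asarray(center)) ** 2)
            val -= depth * np.exp(-r2 / (2.0 * width ** 2))  # local distortion
        return val

    return f

f = make_test_function(x_global=[0.0, 0.0],
                       dents=[([2.0, 1.0], 0.5, 0.3),    # shallow, narrow dent
                              ([-1.5, 2.0], 1.0, 0.6)])  # deeper, wider dent
print(f([0.0, 0.0]), f([2.0, 1.0]), f([-1.5, 2.0]))
```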
3.
4.
Global Optimization Requires Global Information
There are many global optimization algorithms which do not use global information. We broaden previous results, showing limitations on such algorithms even if they are allowed to run forever. We show that deterministic algorithms must sample a dense set to find the global optimum value and can never be guaranteed to converge only to global optimizers. Analogous results show that introducing a stochastic element does not overcome these limitations; simulated annealing as used in practice is one example. Our results show that there are functions for which the probability of success is arbitrarily small.
5.
There is a lack of a representative set of test problems for comparing global optimization methods. To remedy this, a classification of essentially unconstrained global optimization problems into unimodal, easy, moderately difficult, and difficult problems is proposed. The problem features giving this classification are the chance of missing the region of attraction of the global minimum, the embeddedness of the global minimum, and the number of minimizers. The classifications of some often-used test problems are given, and it is recognized that most of them are easy and some are even unimodal. The global optimization solution techniques treated are global, local, and adaptive search, and their use for tackling different classes of problems is discussed. The problem of fair comparison of methods is then addressed. Finally, possible components of a general global optimization tool based on the problem classes and solution techniques are presented.
6.
Parallel Simulated Annealing Algorithms in Global Optimization
Global optimization involves the difficult task of identifying the global extrema of mathematical functions. Such problems are often encountered in practice in various fields, e.g., molecular biology, physics, and industrial chemistry. In this work, we develop five different parallel Simulated Annealing (SA) algorithms and compare them on an extensive test bed used previously for the assessment of various solution approaches in global optimization. The parallel SA algorithms fall into several categories: the asynchronous approach, where no information is exchanged among parallel runs, and synchronous approaches, where solutions are exchanged using genetic operators, or where solutions are transmitted only occasionally, or where highly coupled synchronization is achieved at every iteration. One of these approaches, which occasionally applies partial information exchanges (controlled in terms of solution quality), provides particularly notable results for functions with vast search spaces of up to 400 dimensions. Previous attempts with other approaches, such as sequential SA, adaptive partitioning algorithms, and clustering algorithms, to identify the global optima of these functions have failed without exception.
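A minimal sketch of the occasional, quality-controlled exchange idea is given below; the chains are stepped sequentially for clarity, and the objective, cooling schedule, and exchange rule are assumptions rather than the algorithms of the paper.

```python
# Several SA chains with occasional, quality-controlled solution exchange.
# Stepped sequentially here; a real implementation would run chains in parallel.
import numpy as np

def sphere(x):                                   # placeholder objective (assumption)
    return float(np.sum(x ** 2))

def coupled_sa(f, dim=10, n_chains=4, n_iters=5000, exchange_every=250,
               step=0.5, t0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    xs = rng.uniform(-5.0, 5.0, size=(n_chains, dim))
    fs = np.array([f(x) for x in xs])
    for it in range(1, n_iters + 1):
        temp = t0 / np.log(it + 1.0)             # simple logarithmic cooling
        for c in range(n_chains):
            cand = xs[c] + rng.normal(0.0, step, dim)
            fc = f(cand)
            if fc < fs[c] or rng.random() < np.exp(-(fc - fs[c]) / temp):
                xs[c], fs[c] = cand, fc
        if it % exchange_every == 0:             # occasional partial exchange
            best = int(np.argmin(fs))
            for c in range(n_chains):
                # adopt the best solution only if it is clearly better (quality control)
                if fs[c] - fs[best] > 0.05 * (abs(fs[best]) + 1e-12):
                    xs[c], fs[c] = xs[best].copy(), fs[best]
    best = int(np.argmin(fs))
    return xs[best], fs[best]

print(coupled_sa(sphere)[1])
```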
7.
H. P. Benson, Journal of Optimization Theory and Applications, 2002, 112(1): 1-29
This article presents a branch-and-bound algorithm for globally solving the nonlinear sum-of-ratios problem (P). The algorithm economizes the required computations by conducting the branch-and-bound search in a space of dimension p rather than n, where p is the number of ratios in the objective function of problem (P) and n is the number of decision variables. To implement the algorithm, the main computations involve solving a sequence of convex programming problems for which standard algorithms are available.
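For orientation, a common way of stating the sum-of-ratios problem is shown below; it makes explicit why branching over the p ratio values can be far cheaper than branching over the n decision variables when p is much smaller than n. The direction of optimization and the exact assumptions on the numerators and denominators in the article may differ.

```latex
% A common form of the sum-of-ratios problem (illustrative assumptions):
% X is a nonempty convex subset of R^n and each d_j is positive on X.
\max_{x \in X \subseteq \mathbb{R}^n} \;\; \sum_{j=1}^{p} \frac{n_j(x)}{d_j(x)}
```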
8.
H. Tuy, Journal of Optimization Theory and Applications, 2003, 118(1): 201-216
We discuss global optimality conditions and cutting plane algorithms for DC optimization. The discussion is motivated by certain incorrect results that have appeared recently in the literature on these topics. Incidentally, we investigate the relation of the Tikhonov reciprocity theorem to the optimality conditions for general nonconvex global optimization problems and show how the outer-approximation scheme developed earlier for DC programming can be used to solve a wider class of problems.
9.
We describe global optimization problems from three different fields, representing many-body potentials in physical chemistry, optimal control of a chemical reactor, and fitting a statistical model to empirical data. Historical background for each of the problems, as well as the practical significance of the first two, is given. The problems are solved using eight recently developed stochastic global optimization algorithms representing controlled random search (4 algorithms), simulated annealing (2 algorithms), and clustering (2 algorithms). The results are discussed, and the importance of global optimization in each respective field is highlighted.
10.
M. Locatelli, Journal of Optimization Theory and Applications, 2009, 140(1): 93-102
We discuss the complexity of a class of highly structured global optimization problems, namely the maximization of separable functions, with each one-dimensional component convex and nondecreasing, over polytopes defined by a 0-1 constraint matrix with at most two variables involved in each constraint. In particular, we prove some inapproximability and approximability results.
11.
Global Multiobjective Optimization Using Evolutionary Algorithms
Thomas Hanne, Journal of Heuristics, 2000, 6(3): 347-360
Since the 1960s, several approaches (genetic algorithms, evolution strategies, etc.) have been developed which apply evolutionary concepts for simulation and optimization purposes. In the area of multiobjective programming, such approaches (mainly genetic algorithms) have also been used (Evolutionary Computation 3(1), 1–16). In our presentation, we consider a generalization of common approaches such as evolution strategies: a multiobjective evolutionary algorithm (MOEA) for analyzing decision problems with alternatives taken from a real-valued vector space and evaluated according to several objective functions. The algorithm is implemented within the Learning Object-Oriented Problem Solver (LOOPS) framework developed by the author. Various test problems are analyzed using the MOEA: (multiobjective) linear programming, convex programming, and global programming. Especially for hard problems with disconnected or local efficient regions, the algorithm seems to be a useful tool.
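A very small sketch of an evolutionary algorithm with Pareto-dominance-based selection is shown below; the bi-objective test function and all parameter choices are assumptions, and this is not the LOOPS-based MOEA described in the paper.

```python
# Tiny multiobjective evolutionary algorithm: Gaussian mutation plus
# Pareto-dominance-based survivor selection (illustrative objectives/parameters).
import numpy as np

def objectives(x):                    # assumed bi-objective test problem
    return np.array([np.sum(x ** 2), np.sum((x - 2.0) ** 2)])

def dominates(a, b):
    return bool(np.all(a <= b) and np.any(a < b))

def moea(dim=2, pop_size=40, generations=150, sigma=0.3, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-4.0, 6.0, size=(pop_size, dim))
    for _ in range(generations):
        children = pop + rng.normal(0.0, sigma, pop.shape)   # mutation
        union = np.vstack([pop, children])
        vals = np.array([objectives(x) for x in union])
        # keep nondominated individuals first, fill up with the rest
        nondom = [i for i in range(len(union))
                  if not any(dominates(vals[j], vals[i]) for j in range(len(union)))]
        rest = [i for i in range(len(union)) if i not in nondom]
        pop = union[(nondom + rest)[:pop_size]]
    return pop

print(moea()[:5])      # points approximating the efficient set between 0 and (2, 2)
```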
12.
This paper investigates the properties of inclusion functions on subintervals while a branch-and-bound algorithm solves global optimization problems. It has been found that the relative place of the global minimum value within the inclusion interval of the objective function's inclusion function at the current interval largely indicates whether the given interval is close to a minimizer point. This information is used in a heuristic interval rejection rule that can save a substantial amount of computation. Illustrative examples are discussed, and a numerical study completes the investigation.
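The "relative place" indicator described above can be illustrated with a hand-coded natural interval extension of a one-dimensional function; the function, the boxes, and how the indicator would feed a rejection threshold inside a full branch-and-bound loop are assumptions, not the paper's rule.

```python
# Relative place of the best known value inside the inclusion interval F(box),
# shown for f(x) = x**2 - x (global minimum -0.25 at x = 0.5).  Boxes containing
# the minimizer yield noticeably larger indicator values than distant boxes.

def sq_interval(a, b):
    """Interval extension of x**2 on [a, b]."""
    lo = 0.0 if a <= 0.0 <= b else min(a * a, b * b)
    return lo, max(a * a, b * b)

def f_inclusion(a, b):
    """Natural interval extension of f(x) = x**2 - x on [a, b]."""
    lo_sq, hi_sq = sq_interval(a, b)
    return lo_sq - b, hi_sq - a          # [x**2] - [x] in interval arithmetic

f_best = -0.25                           # best function value found so far

for box in [(0.25, 0.75), (0.4, 0.6), (3.0, 4.0), (-2.0, -1.0)]:
    lo, hi = f_inclusion(*box)
    indicator = (f_best - lo) / (hi - lo)   # relative place of f_best in F(box)
    print(box, round(indicator, 3))
```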
13.
A method is presented for the construction of test problems involving the minimization, over convex sets, of sums of ratios of affine functions. Given a nonempty, compact convex set, the method determines a function that is the sum of linear fractional functions and attains a global minimum over the set at a point that can be found by convex programming and univariate search. Generally, the function will also have local minima over the set that are not global minima.
14.
Motivated by the recent developments in digital diffusion networks, this work is devoted to the rates of convergence issue for a class of global optimization algorithms. By means of weak convergence methods, we show that a sequence of suitably scaled estimation errors converges weakly to a diffusion process (a solution of a stochastic differential equation). The scaling together with the stationary covariance of the limit diffusion process gives the desired rates of convergence. Application examples are also provided for some image estimation problems.
15.
A stochastic global optimization method is applied to the challenging problem of finding the minimum energy conformation of a cluster of identical atoms interacting through the Lennard-Jones potential. The proposed method incorporates, within an already existing and quite successful method, monotonic basin hopping, a two-phase local search procedure that is capable of significantly enlarging the basin of attraction of the global optimum. The experiments reported confirm the considerable advantages of this approach, in particular for the cases considered in the literature to be the most challenging ones, namely clusters of 75, 98, and 102 atoms. While discovering all putative global optima in the range considered, the proposed method improves by more than two orders of magnitude the speed and the percentage of success in finding the global optima of clusters of 75, 98, and 102 atoms.
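A plain monotonic basin-hopping sketch for a small Lennard-Jones cluster is given below, using SciPy's local minimizer; the cluster size, perturbation step, and iteration budget are assumptions, and the two-phase local search procedure of the paper is not reproduced.

```python
# Monotonic basin hopping for a Lennard-Jones cluster in reduced units:
# perturb, locally minimize, and accept only if the energy improves.
import numpy as np
from scipy.optimize import minimize

def lj_energy(flat_coords):
    x = np.asarray(flat_coords).reshape(-1, 3)
    diff = x[:, None, :] - x[None, :, :]
    r2 = np.sum(diff ** 2, axis=-1)
    iu = np.triu_indices(len(x), k=1)
    inv6 = 1.0 / r2[iu] ** 3
    return float(np.sum(4.0 * (inv6 ** 2 - inv6)))

def monotonic_basin_hopping(n_atoms=13, n_steps=100, step=0.35, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.5, 1.5, size=3 * n_atoms)
    res = minimize(lj_energy, x, method="L-BFGS-B")
    x, e = res.x, res.fun
    for _ in range(n_steps):
        cand = x + rng.normal(0.0, step, size=x.shape)    # random perturbation
        res = minimize(lj_energy, cand, method="L-BFGS-B")
        if res.fun < e:                                    # monotonic acceptance
            x, e = res.x, res.fun
    return x, e

coords, energy = monotonic_basin_hopping()
print(energy)    # for 13 atoms the putative global minimum is about -44.33
```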
16.
This paper examines the complexity of global verification for MAX-SAT, MAX-k-SAT (for k ≥ 3), Vertex Cover, and the Traveling Salesman Problem. These results are obtained by adaptations of the transformations that prove such problems to be NP-complete. The class of problems PGS is defined to be those discrete optimization problems for which there exists a polynomial-time algorithm such that, given any solution, either a solution with a better objective function value can be found or it can be concluded that no such solution exists and the given solution is a global optimum. This paper demonstrates that if any one of MAX-SAT, MAX-k-SAT (for k ≥ 3), Vertex Cover, or the Traveling Salesman Problem is in PGS, then P = NP.
17.
Volker Stix, Journal of Global Optimization, 2003, 26(3): 261-277
We introduce a very simple but efficient idea for branch-and-bound (B&B) algorithms in global optimization (GO). As input for our generic algorithm, we need an upper-bound algorithm for the GO maximization problem and a branching rule. The latter reduces the problem into several smaller subproblems of the same type. The new B&B approach delivers one global optimizer or, if stopped before finishing, improved upper and lower bounds for the problem. Its main difference from commonly used B&B techniques is its ability to approximate the problem from above and from below while traversing the problem tree. It needs no supplementary information about the system being optimized and does not consume more time than classical B&B techniques. Experimental results with the maximum clique problem illustrate the benefit of this new method.
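The driver below sketches a generic branch-and-bound skeleton of the kind described: it takes an upper-bound routine and a branching rule as inputs and keeps both the best feasible value found (a lower bound) and the outstanding upper bounds. The subproblem representation and the toy maximization example are assumptions, not the paper's maximum-clique formulation.

```python
# Generic best-first branch-and-bound driver for a maximization problem.
import heapq

def branch_and_bound(root, upper_bound, branch, evaluate, tol=1e-9):
    best_val, best_sol = float("-inf"), None
    heap = [(-upper_bound(root), root)]            # max-heap via negated bounds
    while heap:
        neg_ub, node = heapq.heappop(heap)
        if -neg_ub <= best_val + tol:              # cannot improve: prune
            continue
        val, sol = evaluate(node)                  # a feasible value in this node
        if val > best_val:
            best_val, best_sol = val, sol
        for child in branch(node):
            ub = upper_bound(child)
            if ub > best_val + tol:
                heapq.heappush(heap, (-ub, child))
    return best_val, best_sol

# Toy usage (assumed example): maximize f(x) = x*(3 - x) on [0, 3] with a
# Lipschitz upper bound (|f'| <= 3 on [0, 3]) and bisection branching.
f = lambda x: x * (3.0 - x)

def upper_bound(iv):
    a, b = iv
    return f((a + b) / 2.0) + 3.0 * (b - a) / 2.0

def evaluate(iv):
    m = (iv[0] + iv[1]) / 2.0
    return f(m), m

def branch(iv):
    a, b = iv
    if b - a < 1e-6:
        return []
    m = (a + b) / 2.0
    return [(a, m), (m, b)]

print(branch_and_bound((0.0, 3.0), upper_bound, branch, evaluate))  # about (2.25, 1.5)
```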
18.
We develop new algorithms for global optimization by combining well-known branch-and-bound methods with multilevel subdivision techniques for the computation of invariant sets of dynamical systems. The basic idea is to view iteration schemes for local optimization problems – e.g. Newton's method or conjugate gradient methods – as dynamical systems and to compute set coverings of their fixed points. The combination with bounding techniques allows for the computation of coverings of the global optima only. We show convergence of the new algorithms and present a particular implementation.
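A rough one-dimensional sketch of the subdivision idea follows: a gradient-descent step is treated as a dynamical-system map, boxes not hit by images of sample points are discarded, and a crude sampled bound mimics the combination with bounding techniques that restricts the covering to the global optima. The objective, step size, sampling density, and tolerance are assumptions, not the authors' implementation.

```python
# Subdivision sketch: cover the fixed points of a descent map with small boxes,
# keeping only boxes hit by images of sample points and whose sampled function
# values can still be near the best value seen so far.
import numpy as np

def f(x):
    return (x ** 2 - 1.0) ** 2          # global minima at x = -1 and x = 1

def step(x, alpha=0.05):                # one gradient-descent step as a map
    return x - alpha * (4.0 * x * (x ** 2 - 1.0))

boxes = [(-2.0, 2.0)]
for _ in range(10):
    # subdivision: halve every surviving box
    refined = []
    for a, b in boxes:
        m = 0.5 * (a + b)
        refined += [(a, m), (m, b)]
    samples = np.concatenate([np.linspace(a, b, 10) for a, b in refined])
    images = step(samples)
    # selection: keep boxes hit by images of the current collection ...
    hit = [(a, b) for a, b in refined
           if np.any((images >= a) & (images <= b))]
    # ... and drop boxes whose sampled values are far from the best one (bounding)
    best = f(samples).min()
    boxes = [(a, b) for a, b in hit
             if f(np.linspace(a, b, 10)).min() <= best + 1e-3]

print(sorted(round(0.5 * (a + b), 3) for a, b in boxes))
# the surviving box midpoints cluster around the global minimizers -1 and 1
```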
19.
UEGO is a general clustering technique capable of accelerating and/or parallelizing existing search methods. UEGO is an abstraction of GAS, a genetic algorithm (GA) with subpopulation support, so the niching (i.e., clustering) technique of GAS can be applied along with any kind of optimizer, not only a genetic algorithm. The aim of this paper is to analyze the behavior of the algorithm as a function of different parameter settings and types of functions, and to examine its reliability with the help of Csendes' method. Comparisons to other methods are also presented.