11.
Mirjam Dür Charoenchai Khompatraporn Zelda B. Zabinsky 《Annals of Operations Research》2007,156(1):25-44
Fractional programming has numerous applications in economics and engineering. While some fractional problems are easy, in the sense that they are equivalent to an ordinary linear program, other problems, like maximizing a sum or product of several ratios, are known to be hard, as these functions are highly nonconvex and multimodal. In contrast to the standard Branch-and-Bound type algorithms proposed for specific types of fractional problems, we treat general fractional problems with stochastic algorithms developed for multimodal global optimization. Specifically, we propose Improving Hit-and-Run with restarts, based on a theoretical analysis of Multistart Pure Adaptive Search (cf. the dissertation of Khompatraporn (2004)), which prescribes a way to utilize problem-specific information to sample until a certain confidence level α is achieved. For this purpose, we analyze the Lipschitz properties of fractional functions, and then utilize a unified method to solve general fractional problems. The paper ends with a report on numerical experiments.
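To give a rough feel for the approach, here is a minimal sketch of Improving Hit-and-Run with restarts on a small sum-of-ratios problem. The objective, box bounds, and all parameters below are invented for illustration; the paper's Lipschitz analysis and confidence-level stopping rule are not reproduced.

```python
import random

# Hypothetical sum-of-ratios test objective on the box [1, 5]^2:
# f(x) = (x1 + 2)/(x2 + 1) + (x2 + 3)/(x1 + 1), maximized at (1, 5) with f = 4.5
def f(x):
    return (x[0] + 2.0) / (x[1] + 1.0) + (x[1] + 3.0) / (x[0] + 1.0)

LO, HI = 1.0, 5.0  # box bounds

def improving_hit_and_run(rng, iters=2000):
    """One IHR run: pick a random direction, sample a point uniformly on the
    feasible segment through the current point, move only if it improves."""
    x = [rng.uniform(LO, HI) for _ in range(2)]
    best = f(x)
    for _ in range(iters):
        # random direction on the unit circle
        d = [rng.gauss(0.0, 1.0) for _ in range(2)]
        norm = sum(c * c for c in d) ** 0.5
        d = [c / norm for c in d]
        # feasible step range [t_min, t_max] keeping x + t*d inside the box
        t_min, t_max = float("-inf"), float("inf")
        for xi, di in zip(x, d):
            if abs(di) < 1e-12:
                continue
            a, b = (LO - xi) / di, (HI - xi) / di
            t_min, t_max = max(t_min, min(a, b)), min(t_max, max(a, b))
        t = rng.uniform(t_min, t_max)
        cand = [xi + t * di for xi, di in zip(x, d)]
        if f(cand) > best:          # accept improving steps only
            x, best = cand, f(cand)
    return best, x

def ihr_with_restarts(n_restarts=10, seed=0):
    """Restart IHR from fresh random points and keep the best result."""
    rng = random.Random(seed)
    return max(improving_hit_and_run(rng) for _ in range(n_restarts))

best_val, best_x = ihr_with_restarts()
```

Because each candidate is drawn from the whole feasible segment rather than a small neighborhood, a run can jump out of the basin of a local maximum, which is what makes restarts plus IHR reasonable for multimodal ratios.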
This work was initiated while Mirjam Dür was spending a three-month research visit at the University of Washington. She would
like to thank the Fulbright Commission for financial support and the optimization group at UW for their warm hospitality.
The work of C. Khompatraporn and Z.B. Zabinsky was partially supported by NSF grant DMI-0244286.
12.
Suppose a sequential sample is taken from an unknown discrete probability distribution on an unknown range of integers, in an effort to sample its maximum. A crucial issue is an appropriate stopping rule determining when to terminate the sampling process. We approach this problem from a Bayesian perspective, and derive stopping rules that minimize loss functions which assign a loss to the sample size and to the deviation between the maximum in the sample and the true (unknown) maximum. We will show that our rules offer an extremely simple approximate solution to the well-known problem of when to terminate the Multistart method for continuous global optimization.
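The flavor of such a stopping rule can be sketched as follows. This is a simple stale-counter heuristic in the same spirit, not the Bayesian rule derived in the paper; the discrete distribution and the `patience` value are invented for illustration, with `patience` playing the role of the sampling-cost term in the loss.

```python
import random

def sample_until_confident(draw, patience=50):
    """Sequentially sample a discrete distribution and stop once the running
    maximum has not improved for `patience` consecutive draws. A larger
    `patience` trades extra sampling cost for a smaller chance of stopping
    before the true (unknown) maximum has been observed."""
    best = draw()
    n, stale = 1, 0
    while stale < patience:
        x = draw()
        n += 1
        if x > best:
            best, stale = x, 0
        else:
            stale += 1
    return best, n

# Usage: an unknown discrete distribution on {0, ..., 9},
# skewed toward small values so the maximum is rarely drawn.
rng = random.Random(1)
best, n = sample_until_confident(lambda: min(rng.randrange(10), rng.randrange(10)))
```

In the Multistart setting, `draw()` would be one local-search run from a random starting point, and `best` the best local optimum found so far.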
13.
A Population-based Approach for Hard Global Optimization Problems based on Dissimilarity Measures   Cited by: 1 (self-citations: 0, other citations: 1)
When dealing with extremely hard global optimization problems, i.e. problems with a large number of variables and a huge number
of local optima, heuristic procedures are the only possible choice. In this situation, lacking any possibility of guaranteeing
global optimality for most problem instances, it is quite difficult to establish rules for discriminating among different
algorithms. We think that in order to judge the quality of new global optimization methods, different criteria might be adopted
like, e.g., the following:
1. efficiency – measured in terms of the computational effort necessary to obtain the putative global optimum;
2. robustness – measured in terms of the “percentage of successes”, i.e., the number of times the algorithm, restarted with different seeds or starting points, is able to end up at the putative global optimum;
3. discovery capability – measured in terms of the possibility that an algorithm discovers, for the first time, a putative optimum for a given problem that is better than the best known so far.
Of course the third criterion cannot be considered a compulsory one, as it might be the case that, for a given problem, the best known putative global optimum is indeed the global one, so that no algorithm will ever be able to discover a better one. In this paper we present a computational framework based on a population-based stochastic method in which different candidate solutions for a single problem are maintained in a population which evolves in such a way as to guarantee sufficient diversity among solutions. This diversity enforcement is obtained through a dissimilarity measure whose definition depends on the specific problem class. We show in the paper that, for some well-known and particularly hard test classes, the proposed method satisfies the above criteria, in that it is both much more efficient and more robust than other published approaches. Moreover, for the very hard problem of determining the minimum-energy conformation of a cluster of particles interacting through a short-range Morse potential, our approach was able to discover four new putative optima.
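A minimal sketch of the diversity-enforcement idea follows. The Euclidean dissimilarity, Gaussian perturbations, Rastrigin test function, and all thresholds are generic choices made here for illustration; in the paper the dissimilarity measure is tailored to the specific problem class, and candidates would come from a full local search rather than a single perturbation.

```python
import math
import random

def dissimilarity(a, b):
    # Problem-specific in the paper; plain Euclidean distance here.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def evolve(obj, dim, pop_size=10, d_min=0.5, iters=500, seed=0):
    """Population-based search with diversity enforcement: a candidate that
    is too similar to an existing member may only replace its nearest
    neighbour (and only if better), while a sufficiently dissimilar
    candidate may replace the worst member. Minimization."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):
        parent = rng.choice(pop)
        cand = [x + rng.gauss(0.0, 0.5) for x in parent]
        nearest = min(pop, key=lambda p: dissimilarity(p, cand))
        if dissimilarity(nearest, cand) < d_min:
            # too similar: only allowed to improve on its nearest neighbour
            if obj(cand) < obj(nearest):
                pop[pop.index(nearest)] = cand
        else:
            # sufficiently diverse: may replace the worst member
            worst = max(pop, key=obj)
            if obj(cand) < obj(worst):
                pop[pop.index(worst)] = cand
    return min(pop, key=obj)

# Usage: a standard multimodal test function (Rastrigin) in 2-D
def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

best = evolve(rastrigin, dim=2)
```

The key design point is that a strong candidate cannot crowd out distant members: it either improves its own neighbourhood or must be genuinely dissimilar, which keeps several basins of attraction represented at once.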