Similar Literature
20 similar records found.
1.
In this paper, we are concerned with the development of parallel algorithms for solving some classes of nonconvex optimization problems. We present an introductory survey of parallel algorithms that have been used to solve structured problems (partially separable and large-scale block-structured problems), and algorithms based on parallel local searches for solving general nonconvex problems. Indefinite quadratic programming, posynomial optimization, and the general global concave minimization problem can be solved using these approaches. In addition, for the minimum concave cost network flow problem, we present new parallel search algorithms for large-scale problems. Computational results of an efficient implementation on a multi-transputer system are presented.

2.
Traditionally, minimum cost transshipment problems have been simplified as linear cost problems, which are not practical in real applications. Some advanced local search algorithms have been developed to solve concave cost bipartite network problems. These have been found to be more effective than the traditional linear approximation methods and local search methods. Recently, a genetic algorithm and an ant colony system algorithm were employed to develop two global search algorithms for solving concave cost transshipment problems. These two global search algorithms were found to be more effective than the advanced local search algorithms for solving concave cost transshipment problems. Although the particle swarm optimization algorithm has been used to obtain good results in many applications, to the best of our knowledge, it has not yet been applied to minimum concave cost network flow problems. Thus, in this study, we employ an arc-based particle swarm optimization algorithm, coupled with some genetic algorithm and threshold accepting method techniques, as well as concave cost network heuristics, to develop a hybrid global search algorithm for efficiently solving minimum cost network flow problems with concave arc costs. The proposed algorithm is evaluated by solving several randomly generated network flow problems. The results indicate that the proposed algorithm is more effective than several other recently designed methods, such as local search algorithms, genetic algorithms and ant colony system algorithms, for solving minimum cost network flow problems with concave arc costs.

3.
This paper examines the influence of two major aspects on the solution quality of surrogate model algorithms for computationally expensive black-box global optimization problems, namely the surrogate model choice and the method of iteratively selecting sample points. A random sampling strategy (algorithm SO-M-c) and a strategy where the minimum point of the response surface is used as the new sample point (algorithm SO-M-s) are compared in numerical experiments. Various surrogate models and their combinations have been used within the SO-M-c and SO-M-s sampling frameworks. The Dempster–Shafer theory approach from the algorithm of Müller and Piché (J Glob Optim 51:79–104, 2011) is used for combining the surrogate models. The algorithms are numerically compared on 13 deterministic literature test problems with 2–30 dimensions, an application problem that deals with groundwater bioremediation, and an application that arises in energy generation using tethered kites. NOMAD and the particle swarm pattern search algorithm (PSWARM), which are derivative-free optimization methods, have been included in the comparison. The algorithms have also been compared to a kriging method that uses the expected improvement as its sampling strategy (FEI), which is similar to the Efficient Global Optimization (EGO) algorithm. Data and performance profiles show that surrogate model combinations containing the cubic radial basis function (RBF) model work best regardless of the sampling strategy, whereas using only a polynomial regression model should be avoided. Kriging and combinations including kriging generally perform worse than when RBF models are used. NOMAD, PSWARM, and FEI perform worse than SO-M-s and SO-M-c on most problems. Within the scope of this study a Matlab toolbox has been developed that allows the user to choose, among other options, between various sampling strategies and surrogate models and their combinations. The open source toolbox is available from the authors upon request.
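To make the two sampling rules concrete, the sketch below (my own, not the authors' Matlab toolbox) fits a cubic RBF surrogate with a linear tail and then either returns the candidate that minimizes the response surface (the SO-M-s idea) or a random candidate (a simplified stand-in for the SO-M-c random sampling strategy). The helper names and the random candidate-set construction are assumptions of this sketch.

```python
import numpy as np

def fit_cubic_rbf(X, y):
    """Fit a cubic RBF interpolant s(x) = sum_i w_i ||x - x_i||^3 + p(x), p linear."""
    n, d = X.shape
    Phi = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2) ** 3
    P = np.hstack([X, np.ones((n, 1))])                       # linear polynomial tail
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    coef = np.linalg.solve(A, np.concatenate([y, np.zeros(d + 1)]))
    w, c = coef[:n], coef[n:]
    def s(x):
        x = np.atleast_2d(x)
        phi = np.linalg.norm(x[:, None, :] - X[None, :, :], axis=2) ** 3
        return phi @ w + x @ c[:-1] + c[-1]
    return s

def next_sample(X, y, lb, ub, strategy="min_surface", n_cand=2000,
                rng=np.random.default_rng(0)):
    """Pick the next expensive evaluation point from a random candidate set."""
    cand = rng.uniform(lb, ub, size=(n_cand, len(lb)))
    if strategy == "min_surface":            # SO-M-s style: minimum of the response surface
        s = fit_cubic_rbf(X, y)
        return cand[np.argmin(s(cand))]
    return cand[rng.integers(n_cand)]        # simplified random sampling (SO-M-c style)
```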

4.
This paper introduces a novel methodology for the global optimization of general constrained grey-box problems. A grey-box problem may contain a combination of black-box constraints and constraints with a known functional form. The novel features of this work include (i) the selection of initial samples through a subset selection optimization problem from a large number of faster low-fidelity model samples (when a low-fidelity model is available), (ii) the exploration of a diverse set of interpolating and non-interpolating functional forms for representing the objective function and each of the constraints, (iii) the global optimization of the parameter estimation of surrogate functions and the global optimization of the constrained grey-box formulation, and (iv) the updating of variable bounds based on a clustering technique. The performance of the algorithm is presented for a set of case studies representing an expensive non-linear algebraic partial differential equation simulation of a pressure swing adsorption system for CO2. We address three significant sources of variability and their effects on the consistency and reliability of the algorithm: (i) the initial sampling variability, (ii) the type of surrogate function, and (iii) global versus local optimization of the surrogate function parameter estimation and overall surrogate constrained grey-box problem. It is shown that globally optimizing the parameters in the parameter estimation model and globally optimizing the constrained grey-box formulation have a significant impact on the performance. The effect of sampling variability is mitigated by a two-stage sampling approach which exploits information from reduced-order models. Finally, the proposed global optimization approach is compared to existing constrained derivative-free optimization algorithms.

5.
Surrogate constraint methods have been embedded in a variety of mathematical programming applications over the past thirty years, yet their potential uses and underlying principles remain incompletely understood by a large segment of the optimization community. In a number of significant domains of combinatorial optimization, researchers have produced solution strategies without recognizing that they can be derived as special instances of surrogate constraint methods. Once the connection to surrogate constraint ideas is exposed, additional ways to exploit this framework become visible, frequently offering opportunities for improvement. We provide a tutorial on surrogate constraint approaches for optimization in graphs, illustrating the key ideas by reference to independent set and graph coloring problems, including constructions for weighted independent sets which have applications to associated covering and weighted maximum clique problems. In these settings, the surrogate constraints can be generated relative to well-known packing and covering formulations that are convenient for exposing key notions. The surrogate constraint approaches yield widely used heuristics for identifying independent sets as simple special cases, and also afford previously unidentified heuristics that have greater power in these settings. Our tutorial also shows how the use of surrogate constraints can be placed within the context of vocabulary building strategies for independent set and coloring problems, providing a framework for applying surrogate constraints that can be used in other applications. At a higher level, we show how to make use of surrogate constraint information, together with specialized algorithms for solving associated sub-problems, to obtain stronger objective function bounds and improved choice rules for heuristic or exact methods. The theorems that support these developments yield further strategies for exploiting surrogate constraint relaxations, both in graph optimization and integer programming generally.

6.
Optimization, 2012, 61(3): 403–419
In this article, the application of the electromagnetism-like method (EM) for solving constrained optimization problems is investigated. A number of penalty functions have been tested with EM in this investigation, and their merits and demerits have been discussed. We have also provided motivations for such an investigation. Finally, we have compared EM with two recent global optimization algorithms from the literature. We have shown that EM is a suitable alternative to these methods and that it has a role to play in solving constrained global optimization problems.
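As a minimal illustration of how a penalty function turns a constrained problem into an unconstrained one that a population-based method such as EM can then attack, here is a classical static quadratic penalty; the specific penalty functions compared in the article are not reproduced, and the value of mu and the toy constraint are assumptions of this sketch.

```python
import numpy as np

def quadratic_penalty(f, g_list, h_list, mu):
    """Static quadratic penalty: F(x) = f(x) + mu * (sum max(0, g_i(x))^2 + sum h_j(x)^2),
    where g_i(x) <= 0 are inequality constraints and h_j(x) = 0 are equality constraints."""
    def F(x):
        pen = sum(max(0.0, g(x)) ** 2 for g in g_list) + sum(h(x) ** 2 for h in h_list)
        return f(x) + mu * pen
    return F

# Example: minimize x0^2 + x1^2 subject to x0 + x1 >= 1, written as 1 - x0 - x1 <= 0.
f = lambda x: x[0] ** 2 + x[1] ** 2
g = lambda x: 1.0 - x[0] - x[1]
F = quadratic_penalty(f, [g], [], mu=1e3)
# F can now be handed to any unconstrained global optimizer (EM-like, GA, PSO, ...).
print(F(np.array([0.5, 0.5])))   # feasible point, so the penalty term is zero
```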

7.
Pengcheng Ye, Optimization, 2017, 66(7): 1135–1155
As a robust and efficient technique for global optimization, the surrogate-based optimization method has been widely used for dealing with complicated and computation-intensive engineering design optimization problems. In most cases, it is hard to select an appropriate surrogate model without knowing the behaviour of the real system a priori. To overcome this difficulty, a global optimization method using an adaptive and parallel ensemble of surrogates, combining three representative surrogate models with optimized weight factors, has been proposed. The selection of weight factors is treated as an optimization problem with the desired solution being one that minimizes the generalized mean square cross-validation error. The proposed optimization method is tested on several well-known numerical examples and one industrial problem and compared with other optimization methods. The results show that the proposed optimization method can be a robust and efficient approach in surrogate-based optimization for locating the global optimum.
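A hedged sketch of the weight-selection idea: given leave-one-out cross-validation residuals of each surrogate, the weights are chosen on the simplex so as to minimize the mean squared residual of the weighted ensemble. The exact form of the generalized mean square cross-validation error used in the paper may differ, and the helper names and the SLSQP solve are assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import minimize

def ensemble_weights(cv_residuals):
    """cv_residuals: (n_points, n_models) array; entry [i, k] is the leave-one-out
    cross-validation residual of surrogate k at sample point i. Returns weights on the
    simplex minimizing the mean squared residual of the weighted ensemble."""
    n, m = cv_residuals.shape
    def mse(w):
        return np.mean((cv_residuals @ w) ** 2)
    cons = ({'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0},)
    bounds = [(0.0, 1.0)] * m
    res = minimize(mse, np.full(m, 1.0 / m), bounds=bounds, constraints=cons, method='SLSQP')
    return res.x

def ensemble_predict(predictions, weights):
    """predictions: (n_models,) surrogate predictions at a point; weighted average."""
    return float(np.dot(weights, predictions))
```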

8.
Evolutionary algorithms are robust and powerful global optimization techniques for solving large-scale problems that have many local optima. However, they require high CPU times, and they are very poor in terms of convergence performance. On the other hand, local search algorithms can converge in a few iterations but lack a global perspective. The combination of global and local search procedures should offer the advantages of both optimization methods while offsetting their disadvantages. This paper proposes a new hybrid optimization technique that merges a genetic algorithm with a local search strategy based on the interior point method. The efficiency of this hybrid approach is demonstrated by solving a constrained multi-objective mathematical test-case.

9.
We present a new strategy for the constrained global optimization of expensive black box functions using response surface models. A response surface model is simply a multivariate approximation of a continuous black box function which is used as a surrogate model for optimization in situations where function evaluations are computationally expensive. Prior global optimization methods that utilize response surface models were limited to box-constrained problems, but the new method can easily incorporate general nonlinear constraints. In the proposed method, which we refer to as the Constrained Optimization using Response Surfaces (CORS) method, the next point for costly function evaluation is chosen to be the one that minimizes the current response surface model subject to the given constraints and to additional constraints that the point be at some distance from previously evaluated points. The distance requirement is allowed to cycle, starting from a high value (global search) and ending with a low value (local search). The purpose of the constraint is to drive the method towards unexplored regions of the domain and to prevent the premature convergence of the method to some point which may not even be a local minimizer of the black box function. The new method can be shown to converge to the global minimizer of any continuous function on a compact set regardless of the response surface model that is used. Finally, we consider two particular implementations of the CORS method that utilize a radial basis function model (CORS-RBF) and apply them to the box-constrained Dixon–Szegö test functions and to a simple nonlinearly constrained test function. The results indicate that the CORS-RBF algorithms are competitive with existing global optimization algorithms for costly functions on the box-constrained test problems. The results also show that the CORS-RBF algorithms are better than other algorithms for constrained global optimization on the nonlinearly constrained test problem.
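A simplified sketch of the CORS next-point rule: among random candidates that keep a minimum distance from previously evaluated points, pick the one that minimizes the surrogate; cycling the distance factor from large to small alternates global and local search. Solving the constrained subproblem by candidate sampling, and the particular cycle of factors, are my simplifications, not the paper's implementation.

```python
import numpy as np

def cors_next_point(surrogate, X_eval, lb, ub, beta, n_cand=5000,
                    rng=np.random.default_rng(1)):
    """Next costly evaluation point: minimize the response surface over random candidates
    subject to a minimum-distance requirement from already evaluated points X_eval.
    beta in [0, 1] scales the distance threshold; surrogate is assumed vectorized over rows."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    delta = beta * np.linalg.norm(ub - lb)                      # distance requirement
    cand = rng.uniform(lb, ub, size=(n_cand, len(lb)))
    dmin = np.min(np.linalg.norm(cand[:, None, :] - X_eval[None, :, :], axis=2), axis=1)
    feasible = cand[dmin >= delta]
    if len(feasible) == 0:                                      # threshold too strict:
        return cand[np.argmax(dmin)]                            # fall back to the farthest point
    return feasible[np.argmin(surrogate(feasible))]

# One possible cycle of distance factors, from global (large) to local (small) search:
beta_cycle = [0.9, 0.75, 0.25, 0.05, 0.03, 0.0]
```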

10.
Variable neighborhood search: Principles and applications
Systematic change of neighborhood within a possibly randomized local search algorithm yields a simple and effective metaheuristic for combinatorial and global optimization, called variable neighborhood search (VNS). We present a basic scheme for this purpose, which can easily be implemented using any local search algorithm as a subroutine. Its effectiveness is illustrated by solving several classical combinatorial or global optimization problems. Moreover, several extensions are proposed for solving large problem instances: using VNS within the successive approximation method yields a two-level VNS, called variable neighborhood decomposition search (VNDS); modifying the basic scheme to explore valleys far from the incumbent solution more easily yields an efficient skewed VNS (SVNS) heuristic. Finally, we show how to stabilize column generation algorithms with the help of VNS and discuss various ways to use VNS in graph theory, i.e., to suggest, disprove or give hints on how to prove conjectures, an area where metaheuristics do not appear to have been applied before.
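A minimal sketch of the basic VNS scheme described above, with the shaking and local search routines left as user-supplied callables; the parameter names and the fixed iteration budget are illustrative.

```python
def basic_vns(x0, f, shake, local_search, k_max, max_iter=100):
    """Basic variable neighborhood search: shake in the k-th neighborhood, run any local
    search, move and reset k on improvement, otherwise try the next (larger) neighborhood."""
    x_best, f_best = x0, f(x0)
    for _ in range(max_iter):
        k = 1
        while k <= k_max:
            x_shaken = shake(x_best, k)          # random point in the k-th neighborhood
            x_local = local_search(x_shaken)     # descent from the shaken point
            f_local = f(x_local)
            if f_local < f_best:
                x_best, f_best = x_local, f_local
                k = 1                            # improvement: restart from the first neighborhood
            else:
                k += 1                           # no improvement: enlarge the neighborhood
    return x_best, f_best
```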

11.
In this paper we present a chaos-based evolutionary algorithm (EA) for solving nonlinear programming problems, named the chaotic genetic algorithm (CGA). CGA integrates a genetic algorithm (GA) with a chaotic local search (CLS) strategy to accelerate the search for the optimum and to speed up convergence to the global solution. The integration of the global search performed by the genetic algorithm with the CLS procedure should offer the advantages of both optimization methods while offsetting their disadvantages. In this way, the aim is to enhance global convergence and to prevent the search from getting stuck at a local solution. The inherent characteristics of chaos can enhance optimization algorithms by enabling them to escape from local solutions and improving convergence towards the global solution. Twelve chaotic maps have been analyzed in the proposed approach. The simulation results on the CEC'2005 test set show that the application of chaotic mapping may be an effective strategy for improving the performance of EAs.
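To illustrate the CLS component, the sketch below perturbs the incumbent solution with a logistic-map chaotic sequence of shrinking radius. The logistic map is only one of the twelve maps analyzed in the paper, and the shrinking schedule and parameter values are assumptions of this sketch.

```python
import numpy as np

def chaotic_local_search(x_best, f, lb, ub, n_steps=200, radius=0.1):
    """Chaotic local search around the incumbent: a logistic map generates a sequence in
    (0, 1) that is mapped to perturbations of shrinking radius around x_best."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    x_best = np.asarray(x_best, float)
    f_best = f(x_best)
    z = np.random.uniform(0.01, 0.99, size=x_best.shape)   # chaotic state, away from 0 and 1
    for t in range(n_steps):
        z = 4.0 * z * (1.0 - z)                             # logistic map, fully chaotic regime
        shrink = radius * (1.0 - t / n_steps)               # gradually focus the search
        x_trial = np.clip(x_best + shrink * (ub - lb) * (2.0 * z - 1.0), lb, ub)
        f_trial = f(x_trial)
        if f_trial < f_best:
            x_best, f_best = x_trial, f_trial
    return x_best, f_best
```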

12.
This paper provides a comprehensive analysis of computational problems concerning the calculation of general correlation coefficients for interval data. Exact algorithms solving this task have unacceptable computational complexity for larger samples; therefore, we concentrate on computational problems arising in approximate algorithms. General correlation coefficients for interval data are also given by intervals. We derive bounds on their lower and upper endpoints. Moreover, we propose a set of heuristic solutions and optimization methods for approximate computation. Extensive simulation experiments show that the heuristics yield very good solutions for strong dependencies. In other cases, global optimization using an evolutionary algorithm performs best. A real-data example of the autocorrelation of cloud cover data confirms the applicability of the approach.

13.
The speed and memory requirements of branch and bound algorithms depend on the strategy for selecting which candidate node to process next. The goal of this paper is to experimentally investigate the influence of this strategy on the performance of sequential and parallel branch and bound algorithms. The experiments have been performed by solving a number of multidimensional test problems for global optimization. A branch and bound algorithm using simplicial partitions and a combination of Lipschitz bounds has been investigated. Similar results may be expected for other branch and bound algorithms.
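A toy illustration of how the node selection strategy enters a Lipschitz branch and bound: the sketch below uses a univariate interval partition (the paper uses simplicial partitions in higher dimensions) with best-first selection via a heap; replacing the heap by a stack or a FIFO queue gives depth-first or breadth-first selection instead. The bound formula and parameter names are assumptions of this sketch.

```python
import heapq

def lipschitz_bnb_best_first(f, a, b, L, tol=1e-3, max_nodes=100000):
    """Best-first branch and bound on [a, b] for a function with Lipschitz constant L.
    Lower bound over [lo, hi]: f(mid) - L * (hi - lo) / 2. The heap realizes the
    'process the node with the smallest lower bound next' selection strategy."""
    mid = 0.5 * (a + b)
    f_best, x_best = f(mid), mid
    heap = [(f_best - L * (b - a) / 2, a, b)]
    nodes = 0
    while heap and nodes < max_nodes:
        lower, lo, hi = heapq.heappop(heap)
        if lower > f_best - tol:                     # node cannot improve the incumbent
            continue
        nodes += 1
        for sub_lo, sub_hi in ((lo, 0.5 * (lo + hi)), (0.5 * (lo + hi), hi)):
            m = 0.5 * (sub_lo + sub_hi)
            fm = f(m)
            if fm < f_best:
                f_best, x_best = fm, m
            bound = fm - L * (sub_hi - sub_lo) / 2
            if bound < f_best - tol:
                heapq.heappush(heap, (bound, sub_lo, sub_hi))
    return x_best, f_best
```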

14.
Heuristic optimization provides a robust and efficient approach for solving complex real-world problems. The aim of this paper is to introduce a hybrid approach combining two heuristic optimization techniques, particle swarm optimization (PSO) and genetic algorithms (GA). Our approach integrates the merits of both GA and PSO and has two characteristic features. Firstly, the algorithm is initialized with a set of random particles which travel through the search space; during this travel, an evolution of these particles is performed by integrating PSO and GA. Secondly, to restrict and control the velocity of the particles, we introduce a modified constriction factor. Finally, the results of various experimental studies using a suite of multimodal test functions taken from the literature demonstrate the superiority of the proposed approach in finding the globally optimal solution.
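For reference, the standard Clerc-Kennedy constriction-factor velocity update is sketched below. The paper introduces a modified constriction factor whose exact form is not given here, so this shows only the baseline mechanism, with the commonly used values c1 = c2 = 2.05 as illustrative assumptions.

```python
import numpy as np

def constriction_factor(c1=2.05, c2=2.05):
    """Clerc-Kennedy constriction coefficient chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|,
    defined for phi = c1 + c2 > 4; it keeps particle velocities bounded without an
    explicit velocity clamp."""
    phi = c1 + c2
    return 2.0 / abs(2.0 - phi - np.sqrt(phi ** 2 - 4.0 * phi))

def pso_velocity_update(v, x, p_best, g_best, c1=2.05, c2=2.05,
                        rng=np.random.default_rng()):
    """One constricted PSO velocity update; GA-style crossover/mutation of the particles
    (as in the hybrid described above) would be interleaved between such updates."""
    chi = constriction_factor(c1, c2)
    r1, r2 = rng.random(len(x)), rng.random(len(x))
    return chi * (v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x))
```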

15.
Rollout algorithms are innovative methods, recently proposed by Bertsekas et al. [3], for solving NP-hard combinatorial optimization problems. The main advantage of these approaches is related to their capability of magnifying the effectiveness of any given heuristic algorithm. However, one of the main limitations of rollout algorithms in solving large-scale problems is their computational complexity. Innovative versions of rollout algorithms, aimed at reducing the computational complexity in sequential environments, have been proposed in our previous work [9]. In this paper, we show that a further reduction can be accomplished by using parallel technologies. Indeed, rollout algorithms have very appealing characteristics that make them suitable for efficient and effective implementations in parallel environments, thus extending their range of relevant practical applications. We propose two strategies for parallelizing rollout algorithms and we analyze their performance by considering a shared-memory paradigm. The computational experiments have been carried out on an SGI Origin 2000 with 8 processors, by considering two classical combinatorial optimization problems. The numerical results show that a good reduction of the execution time can be obtained by exploiting parallel computing systems.
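A generic sequential rollout sketch: each candidate decision is scored by completing the partial solution with the base heuristic, which is exactly the lookahead loop that lends itself to parallelization (e.g. one candidate per processor). The function names and the permutation-style solution representation are assumptions of this sketch, not the paper's strategies.

```python
def rollout(elements, base_heuristic, cost):
    """Generic rollout: build a solution one decision at a time. Each candidate decision is
    scored by completing the partial solution with the base heuristic and taking the cost
    of that completed solution; the best-scoring decision is committed."""
    partial, remaining = [], list(elements)
    while remaining:
        best_e, best_cost = None, float('inf')
        for e in remaining:                                   # one-step lookahead
            rest = [r for r in remaining if r != e]
            completed = partial + [e] + base_heuristic(partial + [e], rest)
            c = cost(completed)
            if c < best_cost:
                best_e, best_cost = e, c
        partial.append(best_e)
        remaining.remove(best_e)
    return partial
```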

16.
This paper proposes an efficient method for solving complex multicriterial optimization problems, for which the optimality criteria may be multiextremal and the calculations of the criteria values may be time-consuming. The approach involves reducing multicriterial problems to global optimization ones through minimax convolution of partial criteria, reducing dimensionality by using Peano curves and implementing efficient information-statistical methods for global optimization. To efficiently find the set of Pareto-optimal solutions, it is proposed to reuse all the search information obtained in the course of optimization. The results of computational experiments indicate that the proposed approach greatly reduces the computational complexity of solving multicriterial optimization problems.
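A hedged reading of the minimax convolution step: the partial criteria are collapsed into F_lam(x) = max_i lam_i * f_i(x), and each choice of the weight vector lam yields one scalar global optimization problem whose solution is a candidate Pareto-optimal point. The Peano-curve dimensionality reduction and the information-statistical search are not shown, and the toy criteria below are assumptions.

```python
import numpy as np

def minimax_scalarization(criteria, lam):
    """Minimax convolution of partial criteria: F_lam(x) = max_i lam_i * f_i(x).
    Minimizing F_lam for different weight vectors lam yields different candidate
    Pareto-optimal solutions of the multicriterial problem."""
    def F(x):
        return max(l * f(x) for l, f in zip(lam, criteria))
    return F

# Example with two criteria; each scalarized problem would be solved by a global optimizer.
f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 1.0) ** 2
for lam in ([0.2, 0.8], [0.5, 0.5], [0.8, 0.2]):
    F = minimax_scalarization([f1, f2], lam)
    print(lam, F(np.array([0.5, 0.5])))
```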

17.
The nonmonotone line search approach is a new technique for solving optimization problems. It relaxes the line search range and finds a larger step-size at each iteration, so as to possibly avoid local minimizers and escape from narrow curved valleys. This is helpful for finding the global minimizer of optimization problems. In this paper we develop a new modification of the matrix-free nonmonotone Armijo line search and analyze the global convergence and convergence rate of the resulting method. We also address several approaches for estimating the Lipschitz constant of the gradient of the objective function that can be used in line search algorithms. Numerical results show that this new modification of the Armijo line search is efficient for solving large-scale unconstrained optimization problems.
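A minimal sketch of a nonmonotone Armijo backtracking step, where the sufficient-decrease test compares against the maximum of the last few objective values rather than the current one (with a history of length one it reduces to the classical monotone Armijo rule). The matrix-free aspects and the Lipschitz-constant estimation discussed in the paper are omitted, and the parameter defaults are assumptions.

```python
import numpy as np

def nonmonotone_armijo(f, grad_f, x, d, f_history,
                       sigma=1e-4, beta=0.5, alpha0=1.0, max_back=50):
    """Nonmonotone Armijo backtracking: accept alpha if
        f(x + alpha*d) <= max(recent f-values) + sigma * alpha * grad_f(x)^T d,
    where f_history holds the objective values of the last M iterates."""
    f_ref = max(f_history)                       # relaxation: compare against a recent maximum
    g_d = float(np.dot(grad_f(x), d))            # directional derivative, assumed negative
    alpha = alpha0
    for _ in range(max_back):
        if f(x + alpha * d) <= f_ref + sigma * alpha * g_d:
            return alpha
        alpha *= beta                            # backtrack
    return alpha
```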

18.
We show the importance of exploiting the complementary convex structure for efficiently solving a wide class of specially structured nonconvex global optimization problems. Roughly speaking, a specific feature of these problems is that their nonconvex nucleus can be transformed into a complementary convex structure which can then be shifted to a subspace of much lower dimension than the original underlying space. This approach leads to quite efficient algorithms for many problems of practical interest, including linear and convex multiplicative programming problems, concave minimization problems with few nonlinear variables, bilevel linear optimization problems, etc.

19.
There is a need for a methodology to fairly compare and present evaluation study results of stochastic global optimization algorithms. This need raises two important questions: (i) what is an appropriate set of benchmark test problems on which the algorithms may be tested, and (ii) how can the results be presented compactly and completely. To address the first question, we compiled a collection of test problems, some of which are better known than others. Although the compilation is not exhaustive, it provides an easily accessible collection of standard test problems for continuous global optimization. Five different stochastic global optimization algorithms have been tested on these problems, and a performance profile plot based on the improvement of objective function values is constructed to investigate the macroscopic behavior of the algorithms. The paper also investigates the microscopic behavior of the algorithms through quartile sequential plots, and contrasts the information gained from these two kinds of plots. The effect of the length of run is explored by using three maximum numbers of function evaluations, and it is shown to significantly impact the behavior of the algorithms.
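The performance profiles mentioned above can be computed as sketched below (Dolan-Moré style). The paper's profiles use the improvement of objective function values as the underlying merit measure, so the `costs` matrix here is a generic placeholder for whatever per-problem measure is chosen; the function name is illustrative.

```python
import numpy as np

def performance_profile(costs, taus):
    """costs[s, p]: cost of solver s on problem p (e.g. evaluations to reach a target),
    np.inf if the solver failed on that problem. Returns rho[s, t]: the fraction of
    problems that solver s solves within a factor taus[t] of the best solver."""
    costs = np.asarray(costs, float)
    best = np.min(costs, axis=0)                     # best cost per problem
    ratios = costs / best                            # performance ratios r_{s,p}
    n_solvers, n_problems = costs.shape
    rho = np.empty((n_solvers, len(taus)))
    for t, tau in enumerate(taus):
        rho[:, t] = np.sum(ratios <= tau, axis=1) / n_problems
    return rho
```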

20.
In recent years, it has been shown that strategies based on an interval-Newton approach can be used to reliably solve a variety of nonlinear equation solving and optimization problems in chemical process engineering, including problems in parameter estimation and in the computation of phase behavior. These strategies provide a mathematical and computational guarantee either that all solutions have been located in an equation solving problem or that the global optimum has been found in an optimization problem. The primary drawback to this approach is the potentially high computational cost. In this paper, we consider strategies for bounding the solution set of the linear interval equation system that must be solved in the context of the interval-Newton method. Recent preconditioning techniques for this purpose are reviewed, and a new bounding approach based on the use of linear programming (LP) techniques is presented. Using this approach it is possible to determine the desired bounds exactly (within roundout), leading to significant overall improvements in computational efficiency. These techniques will be demonstrated using several global optimization problems, with focus on problems arising in chemical engineering, including parameter estimation and molecular modeling. These problems range in size from under ten variables to over two hundred, and are solved deterministically using the interval methodology.
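A univariate toy version of the interval-Newton step, assuming the derivative enclosure does not contain zero; the extended interval division, directed outward rounding, and the LP-based bounding of the linear interval system proposed in the paper are all omitted. The `Interval` class and the example function are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

def interval_newton_step(f, df_interval, X):
    """One univariate interval-Newton step: N(X) = m - f(m) / F'(X), intersected with X,
    where m is the midpoint and F'(X) = [dlo, dhi] encloses the derivative over X.
    Assumes 0 is not in F'(X). Any root of f in X lies in the returned interval."""
    m = 0.5 * (X.lo + X.hi)
    fm = f(m)
    dlo, dhi = df_interval(X)
    q = [fm / dlo, fm / dhi]                    # interval division, 0 not in [dlo, dhi]
    n_lo, n_hi = m - max(q), m - min(q)
    return Interval(max(X.lo, n_lo), min(X.hi, n_hi))   # intersect with X

# Example: f(x) = x^2 - 2 on X = [1, 2]; f'(x) = 2x, so F'([1, 2]) = [2, 4] excludes zero.
f = lambda x: x * x - 2.0
df = lambda X: (2.0 * X.lo, 2.0 * X.hi)
X = Interval(1.0, 2.0)
for _ in range(5):
    X = interval_newton_step(f, df, X)
print(X)   # tight enclosure of sqrt(2)
```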
