Similar literature
A total of 20 similar documents were found (search time: 15 ms).
1.
Determining the maximum outerplanar subgraph of a given graph is known to be an NP-complete problem. In the literature there are no earlier experiments on approximating the maximum outerplanar subgraph problem. In this paper we compare the solution quality and running times of different heuristics for finding maximum outerplanar subgraphs. We compare a greedy heuristic against a triangular cactus heuristic and its greedy variation. We also use the solutions from the greedy heuristics as initial solutions for a simulated annealing algorithm. The main experimental result is that simulated annealing, with its initial solution taken from the greedy triangular cactus heuristic, yields the best known approximations for the maximum outerplanar subgraph problem. Work funded by the Tampere Graduate School in Information Science and Engineering (TISE) and supported by the Academy of Finland (Project 51528).
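A minimal sketch of the last step, assuming a generic maximization objective: a simulated annealing loop seeded with a solution produced by one of the greedy heuristics. The objective function, neighbourhood move and cooling parameters are placeholders, not the settings used in the paper.

```python
import math
import random

def simulated_annealing(initial_solution, objective, random_neighbor,
                        t0=1.0, cooling=0.995, iters=10000):
    """Generic maximizing SA loop seeded with a (greedy) initial solution."""
    current = initial_solution
    best = current
    t = t0
    for _ in range(iters):
        candidate = random_neighbor(current)
        delta = objective(candidate) - objective(current)
        # Accept improving moves always, worsening moves with Boltzmann probability.
        if delta >= 0 or random.random() < math.exp(delta / t):
            current = candidate
        if objective(current) > objective(best):
            best = current
        t *= cooling
    return best
```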

2.
In this paper, a Lagrangian-based heuristic is proposed for the degree constrained minimum spanning tree problem. The heuristic uses Lagrangian relaxation information to guide the construction of feasible solutions to the problem. The scheme operates, within a Lagrangian relaxation framework, with calls to a greedy construction heuristic, followed by a heuristic improvement procedure. A look-ahead infeasibility-prevention mechanism, introduced into the greedy heuristic, allowed us to solve instances of the problem where some of the vertices are restricted to having degrees 1 or 2. Furthermore, in order to cut down on CPU time, a restricted version of the original problem is formulated and used to generate feasible solutions. Extensive computational experiments were conducted and indicate that the proposed heuristic is competitive with the best heuristics and metaheuristics in the literature.
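As a rough illustration of how Lagrangian information can guide a greedy construction, the sketch below runs a Kruskal-style greedy on edge costs adjusted by per-vertex multipliers (c_uv + lam[u] + lam[v]) while respecting degree caps. The multiplier updates, the look-ahead mechanism and the improvement procedure of the paper are not reproduced, and this simple greedy may fail to complete a tree on hard instances.

```python
def greedy_dcmst(n, edges, degree_cap, lam):
    """Kruskal-style greedy for a degree-constrained spanning tree.

    edges: list of (cost, u, v); lam: per-vertex Lagrangian multipliers
    (lam[v] = 0 for all v reduces this to a plain degree-bounded greedy).
    Returns the chosen edges, or None if the greedy gets stuck.
    """
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    degree = [0] * n
    tree = []
    # Scan edges by Lagrangian-adjusted cost c_uv + lam[u] + lam[v].
    for cost, u, v in sorted(edges, key=lambda e: e[0] + lam[e[1]] + lam[e[2]]):
        if degree[u] >= degree_cap[u] or degree[v] >= degree_cap[v]:
            continue
        ru, rv = find(u), find(v)
        if ru == rv:
            continue
        parent[ru] = rv
        degree[u] += 1
        degree[v] += 1
        tree.append((u, v))
    return tree if len(tree) == n - 1 else None
```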

3.
In this paper, we develop new heuristic procedures for the maximum diversity problem (MDP). This NP-hard problem has a significant number of practical applications such as environmental balance, telecommunication services or genetic engineering. The proposed algorithm is based on the tabu search methodology and incorporates memory structures for both construction and improvement. Although proposed in seminal tabu search papers, memory-based constructions have often been implemented in naïve ways that disregard important elements of the fundamental tabu search proposals. We will compare our tabu search construction with a memory-less design and with previous algorithms recently developed for this problem. The constructive method can be coupled with a local search procedure or a short-term tabu search for improved outcomes. Extensive computational experiments with medium and large instances show that the proposed procedure outperforms the best heuristics reported in the literature within short computational times.
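For reference, a memory-less greedy construction for the MDP, the kind of baseline the memory-based construction is compared against, can be sketched as follows; the tabu memory structures themselves are not shown.

```python
def greedy_mdp(dist, m):
    """Memory-less greedy construction for the maximum diversity problem.

    dist: symmetric matrix of pairwise distances; m: number of elements to select.
    Returns the indices of the selected subset.
    """
    n = len(dist)
    # Seed with the pair of elements at maximum distance.
    i, j = max(((i, j) for i in range(n) for j in range(i + 1, n)),
               key=lambda pair: dist[pair[0]][pair[1]])
    selected = {i, j}
    while len(selected) < m:
        # Add the unselected element with the largest distance sum to the selection.
        best = max((v for v in range(n) if v not in selected),
                   key=lambda v: sum(dist[v][u] for u in selected))
        selected.add(best)
    return sorted(selected)
```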

4.
Over the last decade, many metaheuristics have been applied to the flowshop scheduling problem, ranging from Simulated Annealing or Tabu Search to complex hybrid techniques. Some of these methods provide excellent effectiveness and efficiency at the expense of being utterly complicated. In fact, several published methods require substantial implementation efforts, exploit problem-specific speed-up techniques that cannot be applied to slight variations of the original problem, and often re-implementations of these methods by other researchers produce results that are quite different from the original ones. In this work we present a new iterated greedy algorithm that applies two phases iteratively, named destruction, where some jobs are eliminated from the incumbent solution, and construction, where the eliminated jobs are reinserted into the sequence using the well-known NEH construction heuristic. Optionally, a local search can be applied after the construction phase. Our iterated greedy algorithm is both very simple to implement and, as shown by experimental results, highly effective when compared to state-of-the-art methods.
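A compact sketch of this destruction/construction cycle, assuming the standard permutation-flowshop makespan recurrence; the destruction size d and the acceptance rule are simplified placeholders rather than the calibrated settings of the paper.

```python
import random

def makespan(p, seq):
    """Permutation flowshop makespan; p[j][k] = processing time of job j on machine k."""
    m = len(p[0])
    completion = [0] * m
    for j in seq:
        completion[0] += p[j][0]
        for k in range(1, m):
            completion[k] = max(completion[k], completion[k - 1]) + p[j][k]
    return completion[-1]

def best_insertion(p, seq, job):
    """Insert `job` at the position of `seq` minimizing the makespan (NEH-style step)."""
    candidates = [seq[:i] + [job] + seq[i:] for i in range(len(seq) + 1)]
    return min(candidates, key=lambda s: makespan(p, s))

def iterated_greedy(p, d=4, iters=500):
    """Destruction (remove d random jobs) + construction (best-position reinsertion)."""
    seq = list(range(len(p)))
    best = seq[:]
    for _ in range(iters):
        removed = random.sample(seq, d)
        partial = [j for j in seq if j not in removed]
        for job in removed:                            # construction phase
            partial = best_insertion(p, partial, job)
        if makespan(p, partial) <= makespan(p, seq):   # simple acceptance criterion
            seq = partial
        if makespan(p, seq) < makespan(p, best):
            best = seq[:]
    return best
```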

5.
A greedy randomized adaptive search procedure (GRASP) is an iterative multistart metaheuristic for difficult combinatorial optimization problems. Each GRASP iteration consists of two phases: a construction phase, in which a feasible solution is produced, and a local search phase, in which a local optimum in the neighborhood of the constructed solution is sought. Repeated applications of the construction procedure yield different starting solutions for the local search, and the best overall solution is kept as the result. The GRASP local search applies iterative improvement until a locally optimal solution is found. During this phase, starting from the current solution, an improving neighbor solution is accepted and becomes the new current solution. In this paper, we propose a variant of the GRASP framework that uses a new “nonmonotone” strategy to explore the neighborhood of the current solution. We formally state the convergence of the nonmonotone local search to a locally optimal solution and illustrate the effectiveness of the resulting Nonmonotone GRASP on three classical hard combinatorial optimization problems: the maximum cut problem (MAX-CUT), the weighted maximum satisfiability problem (MAX-SAT), and the quadratic assignment problem (QAP).
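The standard (monotone) GRASP loop described above can be sketched as follows. The construction phase uses a value-based restricted candidate list; `greedy_value`, `local_search` and `evaluate` are user-supplied callables, and the paper's nonmonotone local search is not reproduced here.

```python
import random

def grasp(candidates, greedy_value, local_search, evaluate,
          alpha=0.3, iterations=100):
    """Standard GRASP skeleton for a minimization problem: randomized greedy
    construction followed by local search, keeping the best overall solution."""
    best, best_val = None, float("inf")
    for _ in range(iterations):
        # Construction phase: add one element at a time from a restricted candidate list.
        solution, remaining = [], list(candidates)
        while remaining:
            values = {c: greedy_value(solution, c) for c in remaining}
            lo, hi = min(values.values()), max(values.values())
            rcl = [c for c in remaining if values[c] <= lo + alpha * (hi - lo)]
            chosen = random.choice(rcl)
            solution.append(chosen)
            remaining.remove(chosen)
        # Local search phase: improve the constructed solution to a local optimum.
        solution = local_search(solution)
        val = evaluate(solution)
        if val < best_val:
            best, best_val = solution, val
    return best
```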

6.
In this work, the NP-hard maximum clique problem on graphs is considered. Starting from basic greedy heuristics, modifications and improvements are proposed and combined in a two-phase heuristic procedure. In the first phase an improved greedy procedure is applied starting from each node of the graph; on the basis of the results of this phase a reduced subset of nodes is selected and an adaptive greedy algorithm is repeatedly started to build cliques around such nodes. In each restart the selection of nodes is biased by the maximal clique generated in the previous execution. Computational results are reported on the DIMACS benchmark suite. Remarkably, the two-phase procedure successfully solves the difficult Brockington-Culberson instances, and is generally competitive with much more complex state-of-the-art heuristics.
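A minimal version of the kind of basic greedy clique construction such a procedure starts from might look like the sketch below: grow a clique around a seed vertex, always adding the candidate with the most neighbours among the remaining candidates. The improvements, the adaptive restarts and the bias towards previously found cliques are not shown.

```python
def greedy_clique(adj, seed):
    """Grow a clique around `seed`, always adding the best-connected candidate.

    adj: dict mapping each vertex to the set of its neighbours.
    """
    clique = {seed}
    candidates = set(adj[seed])
    while candidates:
        # Pick the candidate with most neighbours among the remaining candidates.
        v = max(candidates, key=lambda u: len(adj[u] & candidates))
        clique.add(v)
        candidates &= adj[v]   # keep only vertices adjacent to the whole clique
    return clique
```

Running it from every node and keeping the largest result corresponds roughly to the first phase described above: `max((greedy_clique(adj, v) for v in adj), key=len)`.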

7.
We study the flow-interception facility location problem of maximizing a firm's total profit when it builds new facilities in a market where competing facilities already exist. Building on the standard flow-interception location model, a gravity model is introduced: the probability that a consumer is served by a facility depends on the deviation distance and the facility's attractiveness, while the construction cost of a facility is positively correlated with its attractiveness. A nonlinear integer programming model is formulated and solved with a greedy algorithm. Numerical experiments show that the algorithm is fast and the model is solved with good accuracy.
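A toy sketch of a greedy selection under a generic gravity-style capture probability (attractiveness divided by a power of the deviation distance, normalized over competing facilities). All data structures, the decay exponent `beta` and the profit expression are illustrative assumptions, not the paper's exact model.

```python
def greedy_facilities(candidates, demands, competitors, beta=2.0):
    """Toy greedy for profit-maximizing facility location with gravity-style capture.

    candidates:  list of (site_id, attraction, build_cost, {demand_id: distance})
    demands:     {demand_id: (volume, revenue_per_unit)}
    competitors: {demand_id: [(attraction, distance), ...]} for existing facilities
    """
    def gravity(attraction, distance):
        return attraction / distance ** beta

    def profit(opened):
        revenue = 0.0
        for d, (volume, unit_revenue) in demands.items():
            own = sum(gravity(a, dists[d])
                      for _, a, _, dists in opened if d in dists)
            rival = sum(gravity(a, dd) for a, dd in competitors.get(d, []))
            if own + rival > 0:
                revenue += volume * unit_revenue * own / (own + rival)
        return revenue - sum(cost for _, _, cost, _ in opened)

    opened, remaining = [], list(candidates)
    while remaining:
        best = max(remaining, key=lambda f: profit(opened + [f]))
        if profit(opened + [best]) <= profit(opened):
            break                      # no candidate improves total profit
        opened.append(best)
        remaining.remove(best)
    return opened
```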

8.
Graph coloring is one of the hardest combinatorial optimization problems, for which a wide variety of algorithms have been proposed over the last 30 years. The problem is as follows: given a graph, one has to assign a label to each vertex such that no monochromatic edge appears and the number of different labels used is minimized. In this paper we present a new heuristic for this problem which works with two different functionalities. One is defined by two greedy subroutines, one constructive and the other performing greedy modification. The other functionality is a perturbation subroutine, which may also produce infeasible colorings, from which feasible solutions are subsequently recovered. In our experimentation the proper tuning of this optimization scheme produced good results on known graph coloring benchmarks.
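The greedy constructive idea can be illustrated by the classical smallest-feasible-label rule sketched below (with a decreasing-degree default order); the greedy modification and perturbation subroutines of the paper are not shown.

```python
def greedy_coloring(adj, order=None):
    """Assign each vertex the smallest label not used by its already-colored neighbours.

    adj: dict mapping each vertex to an iterable of neighbours.
    order: vertex ordering (defaults to decreasing degree).
    """
    if order is None:
        order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    color = {}
    for v in order:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color
```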

9.
We consider the linking set problem, which can be seen as a particular case of the multiple-choice knapsack problem. This problem occurs as a subproblem in a decomposition procedure for solving large-scale p-median problems such as the optimal diversity management problem. We show that if a non-increasing difference property of the costs in the linking set problem holds, then the problem can be solved by a greedy algorithm and the corresponding linear relaxation gap is zero.

10.
The single-sink fixed-charge transportation problem (SSFCTP) consists of finding a minimum cost flow from a number of nodes to a single sink. Besides a cost proportional to the amount shipped, the flow cost encompasses a fixed charge. The SSFCTP is an important subproblem of the well-known fixed-charge transportation problem. Nevertheless, only a few methods for solving this problem have been proposed in the literature. In this paper, some greedy heuristic solution methods for the SSFCTP are investigated. It is shown that two greedy approaches for the SSFCTP known from the literature can be arbitrarily bad, whereas an approximation algorithm proposed in the literature for the binary min-knapsack problem has a guaranteed worst-case bound if adapted appropriately to the SSFCTP.
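One natural greedy ordering for the SSFCTP, sorting sources by a linearized unit cost (variable cost plus fixed charge spread over capacity) and filling the sink demand, is sketched below. Note that the paper shows simple rules of this kind can be arbitrarily bad, so this is only a baseline illustration, not the analyzed algorithms.

```python
def ssfctp_greedy(sources, demand):
    """Simple greedy for the single-sink fixed-charge transportation problem.

    sources: list of (capacity, unit_cost, fixed_cost).
    Ship from sources in order of linearized unit cost until `demand` is met.
    Returns (total_cost, flows) or None if total capacity is insufficient.
    """
    order = sorted(sources,
                   key=lambda s: s[1] + s[2] / s[0])   # unit_cost + fixed/capacity
    flows, total_cost, remaining = [], 0.0, demand
    for capacity, unit_cost, fixed_cost in order:
        if remaining <= 0:
            break
        shipped = min(capacity, remaining)
        flows.append(shipped)
        total_cost += fixed_cost + unit_cost * shipped
        remaining -= shipped
    return (total_cost, flows) if remaining <= 0 else None
```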

11.
Merging words according to their overlap yields a superstring. This basic operation makes it possible to infer long strings from a collection of short pieces, as in genome assembly. To capture a maximum of overlaps, the goal is to infer the shortest superstring of a set of input words. The Shortest Cyclic Cover of Strings (SCCS) problem asks, instead of a single linear superstring, for a set of cyclic strings that contain the words as substrings and whose sum of lengths is minimal. SCCS is used as a crucial step in polynomial-time approximation algorithms for the notably hard Shortest Superstring problem, but it is solved in cubic time. The cyclic strings are then cut and merged to build a linear superstring. SCCS can also be solved by a greedy algorithm. Here, we propose a linear-time algorithm for solving SCCS based on an Eulerian graph that captures all greedy solutions in linear space. Because the graph is Eulerian, this algorithm can also find a greedy solution of SCCS with the least number of cyclic strings. This has implications for solving certain instances of the Shortest linear or cyclic Superstring problems.

12.
In this paper we prove the equivalence between a pivoting-based heuristic (PBH) for the maximum weight clique problem and a combinatorial greedy heuristic. It is also proved that PBH always returns a local solution although this is not always guaranteed for Lemke's method, on which PBH is based.

13.
The analysis of data concerning the deterioration of pavement over time yielded a problem of aggregating the data in a manner that preserved independence of the aggregated data points and maximized the number of points. We show that this problem can be modeled as a maximum cardinality vertex packing problem on a proper interval graph and thus can be solved very efficiently by a greedy algorithm.
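Assuming each data point corresponds to an interval on the time axis and independence means non-overlap, the greedy in question is essentially the classical earliest-finish-time rule for maximum sets of disjoint intervals, sketched below.

```python
def max_independent_intervals(intervals):
    """Earliest-finish-time greedy: a maximum set of pairwise disjoint intervals.

    intervals: list of (start, end) pairs with start <= end.
    """
    chosen, last_end = [], float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_end:      # does not overlap the previously chosen interval
            chosen.append((start, end))
            last_end = end
    return chosen
```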

14.
Journal of Graph Theory, 2018, 88(3): 402-410
We improve by an exponential factor the lower bound of Körner and Muzi for the cardinality of the largest family of Hamilton paths in a complete graph of n vertices in which the union of any two paths has maximum degree 4. The improvement is through an explicit construction while the previous bound was obtained by a greedy algorithm. We solve a similar problem for permutations up to an exponential factor.

15.
We report a real application project in a car industry optimization problem known as the optimal diversity management problem. We provide an alternative proof of NP-hardness, and we give and discuss the results obtained from a greedy algorithm applied to very large instances.

16.
This work deals with the parallel machine scheduling problem, which consists in the assignment of n jobs to m parallel machines. The most general variant of this problem is when the processing time depends on the machine to which each job is assigned. This case is known as the unrelated parallel machine problem. Similarly to most of the literature, this paper deals with the minimization of the maximum completion time of the jobs, commonly referred to as makespan (Cmax). Many algorithms and methods have been proposed for this hard combinatorial problem, including several highly sophisticated procedures. By contrast, in this paper we propose a set of simple iterated greedy local search based metaheuristics that produce solutions of very good quality in a very short amount of time. Extensive computational campaigns show that these solutions are, most of the time, better than the current state-of-the-art methodologies by a statistically significant margin.
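For contrast with the metaheuristics above, a very simple constructive greedy for this problem (assign each job to the machine with the smallest resulting load) can be sketched as follows; it is only a baseline dispatching rule, not the iterated greedy local search of the paper.

```python
def greedy_unrelated_assignment(p):
    """Greedy list scheduling for unrelated parallel machines (makespan objective).

    p[j][i] = processing time of job j on machine i.
    Jobs are taken in decreasing order of their shortest processing time
    (an LPT-like rule) and placed on the machine with the smallest resulting load.
    Returns (assignment, makespan).
    """
    m = len(p[0])
    load = [0] * m
    assign = [None] * len(p)
    for j in sorted(range(len(p)), key=lambda j: min(p[j]), reverse=True):
        i = min(range(m), key=lambda k: load[k] + p[j][k])
        load[i] += p[j][i]
        assign[j] = i
    return assign, max(load)
```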

17.
A new heuristic procedure, called Smart Greedy, is proposed for solving a class of general reliability optimization problems (non-DGR type knapsack problems). Smart Greedy uses Recursive Greedy with multiple greedy functions designated by balance coefficients, generates several solutions and then determines the best solution among them as the smart greedy solution. Recursive Greedy first checks the feasibility of sets of items for a given problem and removes infeasible items from the item sets. Second, the procedure checks the gain ratio of increments of the objective function to the constraint function and reduces the problem to a DGR type problem by invoking LP dominance. Third, the procedure continues to allocate the increments for current items until the constraint is violated. With the current solution, the procedure then repeats the greedy step for the current items together with the items removed by the LP dominance in the previous step. Computational results show that Smart Greedy is more effective than previously reported methods.

18.
This paper addresses the central spanning tree problem (CTP). The problem consists in finding a spanning tree that minimizes the so-called robust deviation, i.e. the deviation from a maximally distant tree. The distance between two trees is measured by means of the symmetric difference of their edge sets. The central tree problem is known to be NP-hard. We attack the problem with a hybrid heuristic consisting of: (1) a greedy construction heuristic to get a good initial solution and (2) fast local search improvement. We computationally illustrate the efficiency of the proposed approach.

19.
This paper describes a polynomial-time heuristic for the permutation flow-shop scheduling problem with the makespan criterion. The proposed method consists of two phases: arranging the jobs in priority order and then constructing a sequence. A fuzzy greedy evaluation function is employed to prioritize the jobs for incorporating into the construction phase of the heuristic. Computational experiments using standard benchmark problems indicate an improvement of the new heuristic over the well-known Nawaz, Enscore and Ham (NEH) heuristic. It will be seen that the NEH heuristic is a special case of our more general heuristic.

20.
The location of base stations (BS) and the allocation of channels are of paramount importance for the performance of cellular radio networks. Cellular service providers are also driven by the goal of enhancing performance, particularly as it relates to the receipt and transmission of emergency crash notification messages generated by automobile telematics systems. In this paper, a Mixed Integer Programming (MIP) problem is proposed, which integrates into the same model the base station location problem, the frequency channel assignment problem and the emergency notification problem. The purpose of unifying these three problems in the same model is to treat the tradeoffs among them, providing a higher quality solution to the cellular system design. Some properties of the formulation are proposed that give us more insight into the problem structure. An instance generator is developed that randomly creates test problems. A few greedy heuristics are proposed to obtain quick solutions that turn out to be very good in some cases. To further improve the optimality gap, we develop a Lagrangean heuristic technique that builds on the solution obtained by the greedy heuristics. Finally, the performance of these methods is analyzed by extensive numerical tests and a sample case study is presented.

