Similar Literature
20 similar documents found.
1.
We evaluate two variants of depth-first search algorithms, using the classic job shop scheduling problem as a test bed. The first is the well-known branch-and-bound algorithm proposed by P. Brucker et al., which uses a single chronological backtracking strategy. The second is a variant that uses a partially informed depth-first search strategy instead. Both algorithms use the same heuristic estimation; in the first case it is only used for pruning states that cannot improve the incumbent solution, whereas in the second it is also used to sort the successors of an expanded state. We also propose and analyze a new heuristic estimation which is more informed, but more time consuming, than that used by Brucker’s algorithm. We conducted an experimental study over well-known instances showing that the proposed partially informed depth-first search algorithm outperforms Brucker’s original algorithm.
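A minimal sketch of the idea (not the authors' implementation) is given below: a depth-first branch-and-bound loop in which a heuristic lower bound is used both to prune nodes that cannot improve the incumbent and to order the successors of an expanded node. The callbacks successors, lower_bound, is_complete and cost are hypothetical problem-specific hooks supplied by the caller.

```python
# Sketch of a partially informed depth-first branch-and-bound search.
# All four callbacks are assumed, problem-specific functions.

def informed_dfs_bnb(root, successors, lower_bound, is_complete, cost):
    best_cost = float("inf")
    best_solution = None
    stack = [root]
    while stack:
        node = stack.pop()
        if lower_bound(node) >= best_cost:
            continue                      # prune: cannot improve the incumbent
        if is_complete(node):
            if cost(node) < best_cost:    # new incumbent solution
                best_cost, best_solution = cost(node), node
            continue
        children = [c for c in successors(node) if lower_bound(c) < best_cost]
        # "Partially informed" ordering: the most promising child must be
        # expanded first, so it is pushed onto the stack last.
        children.sort(key=lower_bound, reverse=True)
        stack.extend(children)
    return best_solution, best_cost
```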

2.
The intensional transformation is a technique that can be used in order to eliminate higher-order functions from a functional program by introducing appropriate context-manipulation operators. The transformation can be applied to a significant class of higher-order programs and results in equivalent zero-order intensional programs that can be executed in a simple demand-driven way. Despite its simplicity, the transformation has never been seriously evaluated with respect to its efficiency and potential. Certain simple implementations of the technique have been performed, but questions regarding the merits of the method have remained inconclusive. In this paper we demonstrate that the transformation can be efficiently implemented by using what we call lazy activation records, namely activation records in which some entries are filled on-demand. An evaluation of our implementation demonstrates that the technique outperforms some of the most well-known functional programming systems, for the class of programs that can be transformed. This work has been partially supported by the University of Athens under the project “Kapodistrias” (grant no. 70/4/5827).

3.
In this work we consider the problem of Hidden Markov Model (HMM) training. This problem can be considered as a global optimization problem, and we focus our study on the Particle Swarm Optimization (PSO) algorithm. To take advantage of the search strategy adopted by PSO, we need to modify the HMM's search space. Moreover, we introduce a local search technique from the field of HMMs known as the Baum–Welch algorithm. A parameter study is then presented to evaluate the importance of several PSO parameters on artificial data and on natural data extracted from images.
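For orientation, here is a generic PSO loop with an optional local-search hook, a minimal sketch under stated assumptions: it is not the paper's HMM-specific encoding, and the local_search argument merely stands in for a refinement step such as Baum–Welch re-estimation.

```python
# Generic particle swarm optimisation over a box-constrained continuous space.
# `objective` maps a position (list of floats) to a scalar to be minimised;
# `local_search`, if given, refines a candidate position (hypothetical hook).
import random

def pso(objective, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 1.0), local_search=None):
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            # the refined candidate only updates the personal/global bests
            cand = local_search(pos[i]) if local_search else pos[i]
            val = objective(cand)
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = cand[:], val
                if val < gbest_val:
                    gbest, gbest_val = cand[:], val
    return gbest, gbest_val
```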

4.
This paper proposes a fast exact algorithm to solve the Pallet Loading Problem (PLP) using a depth-first strategy. A new concept called Maximal Breadth Filling Sequence (MBFS) is introduced to bring down the size of the search tree. The algorithm makes use of two pruning rules: lower-bound pruning and state-dominance pruning. Although depth-first search, by itself, requires very little memory, the dominance pruning rule makes effective use of the available memory. For large problems, the more memory available, the more effective the dominance pruning. The algorithm has been tested on standard problem sets and has been found to be quite fast in producing optimal solutions. Empirical findings are given in detail.

5.
A new efficient interval partitioning approach to solve constrained global optimization problems is proposed. This involves a new parallel subdivision direction selection method as well as an adaptive tree search. The latter explores nodes (intervals in variable domains) using a restricted hybrid depth-first and best-first branching strategy. This hybrid approach is also used for activating local search to identify feasible stationary points. The new tree search management technique results in improved performance across standard solution and computational indicators when compared to previously proposed techniques. On the other hand, the new parallel subdivision direction selection rule detects infeasible and suboptimal boxes earlier than existing rules, and this contributes to performance by enabling earlier reliable deletion of such subintervals from the search space.

6.
A fault-tolerant routing algorithm has been developed for star graph interconnection topology by using a depth-first search strategy. The proposed algorithm routes a message from the source to the destination along an optimal path with a very high probability and is guaranteed to trace a path as long as the source and the destination are not disconnected. We derive exact mathematical expressions for the probabilities that the algorithm will compute an optimal path for a given number of faulty links in the network. The analysis reveals many interesting topological properties of the star graphs.
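The backtracking idea can be illustrated with a small sketch, not the paper's algorithm: depth-first routing in an arbitrary graph that simply avoids faulty links. The star-graph specifics (optimal-path preference, probability analysis) are not modelled; adjacency and faulty are assumed inputs.

```python
# Depth-first routing that backtracks around faulty links.
# `adjacency` maps each node to its neighbours; `faulty` is a set of
# undirected links given as frozensets of their two endpoints.
def dfs_route(adjacency, faulty, source, destination):
    visited = {source}
    path = [source]

    def visit(node):
        if node == destination:
            return True
        for nxt in adjacency[node]:
            if nxt in visited or frozenset((node, nxt)) in faulty:
                continue
            visited.add(nxt)
            path.append(nxt)
            if visit(nxt):
                return True
            path.pop()                 # backtrack over a dead end
        return False

    return path if visit(source) else None
```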

7.
The paper presents an exact procedure for a general resource-constrained project scheduling problem where multiple modes are available for the performance of the individual activities and minimum as well as maximum time lags between the different activities may be given. The objective is to determine a mode and a start time for each activity such that all constraints are observed and the project duration is minimized. Project scheduling problems of this type occur, e.g. in process industries. The solution method is a depth-first search based branch-and-bound procedure. It makes use of a branching strategy where the branching rule is selected dynamically. The solution approach is an integration approach where the modes and start times are determined simultaneously. Within an experimental performance analysis this procedure is compared with existing solution procedures.

8.
Fuzzy Sets and Systems, 1987, 23(1): 119-129
Prolog, a programming language based on the first order predicate calculus, has been widely used in artificial intelligence research. One of its shortcomings is the lack of a natural mechanism to deal with uncertainty. A possible solution to this problem, outlined here, is to base Prolog on fuzzy logic rather than on conventional two-valued logic. This leads to a more general system, of which standard Prolog is a special case. To give the system greater flexibility, the fuzzy Prolog interpreter can link with the Fril system developed at Bristol, yielding a powerful language with breadth-first and depth-first search capabilities.

9.
Sleator and Tarjan have invented a form of self-adjusting binary search tree called the splay tree. On any sufficiently long access sequence, splay trees are as efficient, to within a constant factor, as both dynamically balanced and static optimum search trees. Sleator and Tarjan have made a much stronger conjecture; namely, that on any sufficiently long access sequence and to within a constant factor, splay trees are as efficient as any form of dynamically updated search tree. This dynamic optimality conjecture implies as a special case that accessing the items in a splay tree in sequential order takes linear time, i.e., O(1) time per access. In this paper we prove this special case of the conjecture, generalizing an unpublished result of Wegman. Our sequential access theorem not only supports belief in the dynamic optimality conjecture but provides additional insight into the workings of splay trees. As a corollary of our result, we show that splay trees can be used to simulate output-restricted deques (double-ended queues) in linear time. We pose several open problems related to our result.

10.
In this paper, we propose a two-phase local search for vertex coloring. The algorithm alternately executes two closely interacting functionalities, namely a stochastic and a deterministic local search. The stochastic phase is based on biased random sampling: according to a probability matrix storing the probability that a vertex can be assigned to a color, it iteratively constructs feasible colorings. The deterministic phase, instead, consists in assigning each vertex sequentially, according to a given ordering, to the color which causes the lowest increase of the solution penalty; once the assignment is constructed, swap operations are executed to improve the performance. The interaction between the two phases is implemented by passing information about what happened during one phase to the subsequent ones. Beyond the algorithm scheme, the novelty of the approach stems from the fact that the objective function does not minimize the number of colors but a new penalty function. The proposed approach is tested on known benchmarks for the studied problem available in the public domain. A comparison with the state of the art shows that the proposed approach is robust and able to achieve best known results.
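A hedged sketch of the deterministic idea only follows: assign vertices, in a given order, to the colour whose choice increases a penalty the least. The penalty used here (number of conflicting edges) is an illustrative stand-in; the paper's actual penalty function is not specified in the abstract.

```python
# Greedy least-penalty colour assignment over a fixed vertex ordering.
# `vertices` is the ordering, `edges` a list of pairs, `n_colors` the palette size.
def greedy_min_penalty_coloring(vertices, edges, n_colors):
    adjacency = {v: set() for v in vertices}
    for u, v in edges:
        adjacency[u].add(v)
        adjacency[v].add(u)

    coloring = {}
    for v in vertices:                        # 'vertices' defines the ordering
        def penalty(c):
            # conflicts this colour would create with already-coloured neighbours
            return sum(1 for u in adjacency[v] if coloring.get(u) == c)
        coloring[v] = min(range(n_colors), key=penalty)
    return coloring
```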

11.
Following the pioneering work of Kierstead, we present here some complexity results about the construction of depth-first greedy linear extensions. We prove that the recognition of Dilworth partially ordered sets of height 2, as defined by Syslo, is NP-complete. This last result yields a new proof of the NP-completeness of the jump number problem, first proved by Pulleyblank.

12.
This paper introduces a diversified local search strategy under the scatter search framework for the probabilistic traveling salesman problem (PTSP). Different combinations of three local search methods commonly used for the PTSP, i.e., 1-shift, 2-opt, and 3-opt, were used to investigate the strategy's effects. A set of numerical experiments was conducted on randomly generated test instances to test the validity of the proposed strategy. The numerical results and the permutation test showed that the diversified local search strategy, especially the combination of the 1-shift and 2-opt algorithms, solves the homogeneous and heterogeneous PTSP most effectively in most of the tested instances, in comparison with a single local search strategy under the scatter search framework.

13.
Constraint Programming typically uses the technique of depth-first branch and bound as the method of solving optimization problems. Although this method can give the optimal solution, for large problems, the time needed to find the optimal can be prohibitive. This paper introduces a method for using local search techniques within a Constraint Programming framework, and applies this technique to vehicle routing problems. We introduce a Constraint Programming model for vehicle routing, and a system for integrating Constraint Programming and local search techniques. We then describe how the method can be accelerated by handling core constraints using fast local checks, while other more complex constraints are left to the constraint propagation system. We have coupled our local search method with a meta-heuristic to avoid the search being trapped in local minima. Several meta-heuristics are investigated ranging from a simple Tabu Search method to Guided Local Search. An empirical study over benchmark problems shows the relative merits of these techniques. Investigations indicate that the specific long-term memory technique used by Guided Local Search can be used as a diversification method for Tabu Search, resulting in significant benefit. Several new best solutions on the Solomon problems are found in relatively few iterations of our algorithm.

14.
To perform efficient inference in Bayesian networks by means of a Junction Tree method, the network graph needs to be triangulated. The quality of this triangulation largely determines the efficiency of the subsequent inference, but the triangulation problem is unfortunately NP-hard. It is common for existing methods to use the treewidth criterion for optimality of a triangulation. However, this criterion may lead to a somewhat harder inference problem than the total table size criterion. We therefore investigate new methods for depth-first search and best-first search for finding optimal total table size triangulations. The search methods are made faster by efficient dynamic maintenance of the cliques of a graph. This problem was investigated by Stix, and in this paper we derive a new simple method based on the Bron-Kerbosch algorithm that compares favourably to Stix’ approach. The new approach is generic in the sense that it can be used with other algorithms than just Bron-Kerbosch. The algorithms for finding optimal triangulations are mainly supposed to be off-line methods, but they may form the basis for efficient any-time heuristics. Furthermore, the methods make it possible to evaluate the quality of heuristics precisely and allow us to discover parts of the search space that are most important to direct randomized sampling to.
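Since the abstract builds on Bron-Kerbosch, a minimal sketch of that classical maximal-clique enumeration is shown below; the paper's dynamic clique-maintenance machinery is not reproduced here.

```python
# Classical Bron-Kerbosch (without pivoting): R is the growing clique,
# P the candidate vertices, X the already-processed vertices.
def bron_kerbosch(R, P, X, adjacency, cliques):
    if not P and not X:
        cliques.append(set(R))            # R is a maximal clique
        return
    for v in list(P):
        bron_kerbosch(R | {v}, P & adjacency[v], X & adjacency[v],
                      adjacency, cliques)
        P.remove(v)
        X.add(v)

# Example usage on a small graph given as an adjacency dictionary of sets:
adjacency = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
cliques = []
bron_kerbosch(set(), set(adjacency), set(), adjacency, cliques)
# cliques now contains the maximal cliques {1, 2, 3} and {3, 4}
```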

15.
Scale factor local search in differential evolution
This paper proposes the scale factor local search differential evolution (SFLSDE). The SFLSDE is a differential evolution (DE) based memetic algorithm which employs, within a self-adaptive scheme, two local search algorithms. These local search algorithms aim at detecting a value of the scale factor corresponding to an offspring with high performance while the generation is executed. The local search algorithms thus assist the global search and generate high-performance offspring which are subsequently expected to promote the generation of enhanced solutions within the evolutionary framework. Despite its simplicity, the proposed algorithm seems to have very good performance on various test problems. Numerical results are shown in order to justify the use of a double local search instead of a single search. In addition, the SFLSDE has been compared with a standard DE and three other modern DE based metaheuristics for a large and varied set of test problems. Numerical results are given for relatively low and high dimensional cases. A statistical analysis of the optimization results has been included in order to compare the results in terms of the final solution detected and convergence speed. The efficiency of the proposed algorithm seems to be very high, especially for large scale problems and complex fitness landscapes.
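For context, the sketch below shows a basic DE/rand/1/bin loop with a per-individual scale factor F. It is an illustration only: the SFLSDE additionally runs local searches on F itself, whereas here F is simply jittered, a simplification rather than the authors' method.

```python
# Basic differential evolution with one scale factor per individual.
import random

def de(objective, dim, pop_size=30, generations=200, cr=0.9, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [objective(x) for x in pop]
    F = [0.5] * pop_size                      # one scale factor per individual
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            Fi = min(1.0, max(0.1, F[i] + random.gauss(0.0, 0.1)))  # jitter F
            j_rand = random.randrange(dim)
            trial = []
            for d in range(dim):
                if random.random() < cr or d == j_rand:
                    v = pop[a][d] + Fi * (pop[b][d] - pop[c][d])    # mutation
                else:
                    v = pop[i][d]                                   # crossover keeps parent gene
                trial.append(min(hi, max(lo, v)))
            f_trial = objective(trial)
            if f_trial <= fit[i]:             # greedy selection keeps improvements
                pop[i], fit[i], F[i] = trial, f_trial, Fi
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```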

16.
We describe a simple breadth-first tree search scheme for minimizing the makespan of a project consisting of a partially ordered network of activities under multiple resource constraints. The method compares quite favourably with existing techniques that employ depth-first or best-first search; in particular, it is able to solve optimally on a Pentium PC running SCO UNIX the entire set of 680 benchmark problems (First Lot: 480, Second Lot: 200) generated by Kolisch et al., 1995. The new algorithm has also been checked out experimentally on additional random test problems at graded levels of difficulty that were generated using two parameters: the threshold, which determined the predecessors of an activity, and the total resource availability of each resource type. The breadth-first scheme can be modified quite readily to do best-first search or to minimize measures other than makespan such as mean flow time and maximum tardiness.

17.
A comparison is presented of the advantages and disadvantages of the classical random-based, exhaustive, and gradient searches, together with a global search control strategy combined with a precise local search, including a new, systematic point selection that makes it possible to escape from local minima over time. As a demonstration, electrochemically etched porous silicon (PS) samples were investigated by spectroscopic ellipsometry (SE). The evaluation process (a global optimisation task) was carried out in different ways to expose the difficulties and the differences among the evaluation options. The new topographical search (named Gradient Cube search) was compared with some classical methods (Grid search, Random or Monte-Carlo search, and Levenberg-Marquardt gradient search) and with two more complex algorithms (Genetic Algorithms and Simulated Annealing) by evaluating real measurements. The application results show that the classical methods have difficulty providing sufficient reliability and precision at the same time in global optimisation tasks when the error surface is hilly. There is therefore a strong need to escape from local minima, and a need for systematic evaluation to avoid the uncertainty of random-based evaluation. The Gradient Cube search is an effective, systematic hill-climbing search with high precision, and so it can be useful in ellipsometry.

18.
The Nisan–Wigderson pseudo-random generator [19] was constructed to derandomize probabilistic algorithms under the assumption that there exist explicit functions which are hard for small circuits. We give the first explicit construction of a pseudo-random generator with asymptotically optimal seed length even when given a function which is hard for relatively small circuits. Generators with optimal seed length were previously known only assuming hardness for exponential size circuits [13,26]. We also give the first explicit construction of an extractor which uses asymptotically optimal seed length for random sources of arbitrary min-entropy. Our construction is the first to use the optimal seed length for sub-polynomial entropy levels. It builds on the fundamental connection between extractors and pseudo-random generators discovered by Trevisan [29], combined with the construction above. The key is a new analysis of the NW-generator [19]. We show that it fails to be pseudorandom only if a much harder function can be efficiently constructed from the given hard function. By repeatedly using this idea we get a new recursive generator, which may be viewed as a reduction from the general case of arbitrary hardness to the solved case of exponential hardness. * This paper is based on two conference papers [11,12] by the same authors. † Research supported by NSF Award CCR-9734911, NSF Award CCR-0098197, Sloan Research Fellowship BR-3311, grant #93025 of the joint US-Czechoslovak Science and Technology Program, and USA-Israel BSF Grant 97-00188. ‡ Part of this work was done while at the Hebrew University and the Institute for Advanced Study. § This research was supported by grant number 69/96 of the Israel Science Foundation, founded by the Israel Academy for Sciences and Humanities, and USA-Israel BSF Grant 97-00188.

19.
This paper tackles the problem of showing that evolutionary algorithms for fuzzy clustering can be more efficient than systematic (i.e. repetitive) approaches when the number of clusters in a data set is unknown. To do so, a fuzzy version of an Evolutionary Algorithm for Clustering (EAC) is introduced. A fuzzy cluster validity criterion and a fuzzy local search algorithm are used instead of their hard counterparts employed by EAC. Theoretical complexity analyses for both the systematic and evolutionary algorithms under interest are provided. Examples with computational experiments and statistical analyses are also presented.
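As background, a minimal fuzzy c-means sketch is included to illustrate the kind of fuzzy local search such algorithms rely on; it is not the EAC variant from the paper, and all parameter choices here are assumptions.

```python
# Plain fuzzy c-means: alternate membership and centre updates.
# `data` is a list of feature vectors (lists of floats), m is the fuzzifier (> 1).
import random

def fuzzy_c_means(data, n_clusters, m=2.0, iters=50):
    dim = len(data[0])
    centers = random.sample(data, n_clusters)
    for _ in range(iters):
        # membership update: u_ik proportional to d(x_i, c_k)^(-2/(m-1))
        U = []
        for x in data:
            dists = [max(1e-12, sum((a - b) ** 2 for a, b in zip(x, c)) ** 0.5)
                     for c in centers]
            weights = [d ** (-2.0 / (m - 1)) for d in dists]
            total = sum(weights)
            U.append([w / total for w in weights])
        # centre update: weighted mean of the data with weights u_ik^m
        new_centers = []
        for k in range(n_clusters):
            w = [U[i][k] ** m for i in range(len(data))]
            s = sum(w)
            new_centers.append([sum(w[i] * data[i][d] for i in range(len(data))) / s
                                for d in range(dim)])
        centers = new_centers
    return centers, U
```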

20.
We show that the problems of deciding whether an ordered set has at least k depth-first linear extensions and whether an ordered set has at least k greedy linear extensions are NP-hard. Supported by Office of Naval Research contract N00014-85K-0494. Supported by National Science Foundation grant DMS-8713994.
