Similar Articles
20 similar articles found (search time: 62 ms)
1.
2.
3.
Gilmore and Gomory's algorithm is one of the best currently known exact algorithms for solving unconstrained guillotine two-dimensional cutting problems. Herz's algorithm is more effective, but only for the unweighted case. We propose a new exact algorithm suited to both the weighted and unweighted cases, which is more powerful than both algorithms. The algorithm uses dynamic-programming procedures and a one-dimensional knapsack problem to obtain efficient lower and upper bounds and important optimality criteria that permit significant pruning of branches in a recursive tree-search procedure. Recursivity, computational power, suitability for parallel implementation, and generalizability to constrained two-dimensional cutting problems are some important features of the new algorithm.
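For context, the core recursion that Gilmore–Gomory-style algorithms build on evaluates every guillotine cut of a rectangle by dynamic programming. The sketch below is a minimal, unoptimized version of that recursion only; it omits the paper's knapsack-based bounds and branch pruning, and the function name and `piece_value` callback are illustrative assumptions.

```python
from functools import lru_cache

def guillotine_value(W, H, piece_value):
    """Best profit obtainable from a W x H sheet using guillotine cuts.

    piece_value(w, h) is assumed to return the profit of a piece of exactly
    w x h (0 if no such piece exists); all dimensions are integers.
    """
    @lru_cache(maxsize=None)
    def f(w, h):
        best = piece_value(w, h)
        for x in range(1, w // 2 + 1):   # vertical guillotine cuts (symmetric in x)
            best = max(best, f(x, h) + f(w - x, h))
        for y in range(1, h // 2 + 1):   # horizontal guillotine cuts (symmetric in y)
            best = max(best, f(w, y) + f(w, h - y))
        return best

    return f(W, H)
```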

4.
We give a generalization of the hypergreedy algorithm for minimum-weight perfect matching on a complete edge-weighted graph whose weights satisfy the triangle inequality. With a modified version of this algorithm we obtain a (log n)-approximate perfect matching heuristic for points in the Euclidean plane, in O(n log² n) time. This research was supported in part by the DIMACS Grant No. NSF-STC88-09648. This research was supported in part by the NSF under Grant No. CCR 88-07518.
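The hypergreedy algorithm itself is intricate; purely as a point of comparison, the sketch below implements the naive shortest-edge-first greedy matching for points in the plane. It is not the paper's heuristic and is known to have a much weaker approximation guarantee, but it shows the simple baseline that hypergreedy-style methods improve upon. All names are illustrative.

```python
import math
from itertools import combinations

def naive_greedy_matching(points):
    """Pair up an even number of 2-D points by repeatedly taking the shortest
    edge whose endpoints are both still unmatched. O(n^2 log n) time."""
    pairs = sorted(combinations(range(len(points)), 2),
                   key=lambda p: math.dist(points[p[0]], points[p[1]]))
    matched, matching = set(), []
    for i, j in pairs:
        if i not in matched and j not in matched:
            matching.append((i, j))
            matched.update((i, j))
    return matching
```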

5.
In this paper, we present a new smoothing Newton method for solving the monotone weighted linear complementarity problem (WCP). Our algorithm needs to solve only one linear system of equations and performs one line search per iteration. Any accumulation point of the iteration sequence generated by our algorithm is a solution of the WCP. Under suitable conditions, the algorithm has a local quadratic convergence rate. Numerical experiments show the feasibility and efficiency of the algorithm.
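A standard smoothing device for WCPs replaces the weighted complementarity condition x_i·s_i = w_i (with x, s ≥ 0) by the smooth equation x_i + s_i − sqrt((x_i − s_i)² + 4w_i + 4μ²) = 0 and drives μ → 0, since at μ = 0 this equation is equivalent to x_i·s_i = w_i with both variables nonnegative. The toy homotopy below, with s = Mx + q, illustrates that idea only; the paper's method instead solves a single linear system with a line search per iteration, which this sketch does not replicate, and all names are illustrative.

```python
import numpy as np
from scipy.optimize import fsolve

def smoothed_wcp(M, q, w, mus=(1.0, 1e-1, 1e-2, 1e-4, 1e-8)):
    """Toy smoothing homotopy for the WCP: x >= 0, s = Mx + q >= 0, x*s = w."""
    x = np.ones(len(q))
    for mu in mus:                      # solve the smoothed system for decreasing mu
        def F(x):
            s = M @ x + q
            return x + s - np.sqrt((x - s) ** 2 + 4 * w + 4 * mu ** 2)
        x = fsolve(F, x)                # warm-start from the previous solution
    return x, M @ x + q
```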

6.
7.
Let {X_k} be a sequence of i.i.d. random variables with d.f. F(x). In the first part of the paper the weak convergence of the d.f.'s F_n(x) of the weighted sums is studied, where 0 < α ≤ 2, a_nk > 0 for 1 ≤ k ≤ m_n, and, as n → ∞, max_{1≤k≤m_n} a_nk → 0. It is shown that such convergence, with suitably chosen A_n's and necessarily stable limit laws, holds for all such arrays {a_nk} provided it holds for the special case a_nk = 1/n, 1 ≤ k ≤ n. Necessary and sufficient conditions for such convergence are classical. Conditions are given for the convergence of the moments of the sequence {F_n(x)}, as well as for its convergence in mean. The second part of the paper deals with the almost sure convergence of weighted sums S_n, where a_n ≠ 0, b_n > 0, and max_{1≤k≤n} a_k/b_n → 0. The strong law is said to hold if there are constants A_n for which S_n → 0 almost surely. Let N(0) = 0 and, for x > 0, let N(x) be the number of n ≥ 1 for which b_n/|a_n| < x. The main result is as follows. If the strong law holds, then E N(|X_1|) < ∞. Conversely, under a suitable moment condition for some 0 < p ≤ 2, the strong law holds with appropriately chosen A_n if 1 ≤ p ≤ 2 and with A_n = 0 if 0 < p < 1. This extends the results of Heyde and of Jamison, Orey, and Pruitt. The strong law is shown to hold under various conditions imposed on F(x), the coefficients a_n and b_n, and the function N(x). Proceedings of the Seminar on Stability Problems for Stochastic Models, Moscow, 1993.

8.
The set cover problem is that of computing, given a family F of weighted subsets of a base set U, a minimum-weight subfamily F′ ⊆ F such that every element of U is covered by some subset in F′. The k-set cover problem is the variant in which every subset has size at most k. It has long been known that the problem can be approximated within a factor of H(k), the k-th harmonic number, by the greedy heuristic, but no better bound has been shown except for the case of unweighted subsets. In this paper we consider approximation of a restricted version of the weighted 3-set cover problem, in which any two distinct subset costs differ by a multiplicative factor of at least 2, as a first step towards better approximation of the general k-set cover problem. It will be shown, via LP duality, that an improved approximation bound of H(3) − 1/6 can be attained when the greedy heuristic is suitably modified for this case. A key to our algorithm design and analysis is the Gallai-Edmonds structure theorem for maximum matchings.
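For reference, the H(k) bound mentioned above is achieved by the classical greedy heuristic, which repeatedly picks the subset minimizing cost per newly covered element. A minimal sketch, with illustrative names and assuming a feasible instance:

```python
def greedy_set_cover(universe, subsets, cost):
    """subsets: dict name -> set of elements; cost: dict name -> positive weight.
    Returns the names of the chosen subsets; achieves an H(k) approximation,
    where k is the largest subset size."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # pick the subset with the smallest cost per newly covered element
        name = min((s for s in subsets if subsets[s] & uncovered),
                   key=lambda s: cost[s] / len(subsets[s] & uncovered))
        chosen.append(name)
        uncovered -= subsets[name]
    return chosen
```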

9.
Computing a maximum-weight stable set in a bipartite graph is considered well solved and is usually approached with preflow-push, Ford-Fulkerson, or network simplex algorithms. We present a combinatorial algorithm for the problem that is not based on flows. Numerical tests suggest that this algorithm performs quite well in practice and is competitive with flow-based algorithms, especially in the case of dense graphs.
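The flow-based approach the abstract contrasts with reduces the problem to a minimum s-t cut: the maximum stable-set weight equals the total vertex weight minus the minimum-weight vertex cover, which the cut computes. A sketch using networkx (function and variable names are illustrative):

```python
import networkx as nx

def bipartite_max_weight_stable_set(left_w, right_w, edges):
    """left_w/right_w: dict vertex -> positive weight; edges: (u, v) pairs
    with u on the left and v on the right. Returns (weight, stable_set)."""
    G = nx.DiGraph()
    for u, w in left_w.items():
        G.add_edge("s", ("L", u), capacity=w)
    for v, w in right_w.items():
        G.add_edge(("R", v), "t", capacity=w)
    for u, v in edges:
        # no capacity attribute -> treated as infinite, so the cut must
        # pass through source/sink edges, i.e. through vertices
        G.add_edge(("L", u), ("R", v))
    cut, (S, _) = nx.minimum_cut(G, "s", "t")   # cut = min-weight vertex cover
    stable = [u for u in left_w if ("L", u) in S] + \
             [v for v in right_w if ("R", v) not in S]
    return sum(left_w.values()) + sum(right_w.values()) - cut, stable
```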

10.
A linear-time approximation algorithm for the weighted set-covering problem is presented. For the special case of the weighted vertex cover problem, it produces a solution whose weight is at most twice the weight of an optimal solution.
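A linear-time 2-approximation for the vertex cover special case can be realized with the local-ratio idea: scan the edges once and, for each uncovered edge, charge the smaller remaining weight against both endpoints; a vertex enters the cover when its weight is exhausted. The sketch below assumes this primal-dual/local-ratio scheme; the paper's exact procedure may differ.

```python
def vertex_cover_2approx(edges, weight):
    """edges: iterable of (u, v); weight: dict vertex -> nonnegative weight.
    Returns a cover of weight at most twice the optimum, in linear time."""
    w = dict(weight)          # residual weights, decreased as edges are charged
    cover = set()
    for u, v in edges:
        if u in cover or v in cover:
            continue          # edge already covered
        delta = min(w[u], w[v])
        w[u] -= delta
        w[v] -= delta
        if w[u] == 0:         # at least one endpoint is exhausted
            cover.add(u)
        if w[v] == 0:
            cover.add(v)
    return cover
```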

11.
We consider a two-stage defender-attacker game that takes place on a network, in which the attacker seeks to take control over (or "influence") as many nodes as possible. The defender acts first in this game by protecting a subset of nodes that cannot be influenced by the attacker. With full knowledge of the defender's action, the attacker then influences an initial subset of unprotected nodes. The influence spreads over a finite number of time stages, where an uninfluenced node becomes influenced at time t if a threshold number of its neighbors are influenced at time t−1. The attacker's objective is to maximize the weighted number of nodes that are influenced over the time horizon, where the weights depend both on the node and on the time at which that node is influenced. This defender-attacker game is especially difficult to optimize because the attacker's problem itself is NP-hard, which precludes the standard inner-dualization approach that is common in many interdiction studies. We provide three models for solving the attacker's problem, and we develop a tailored cutting-plane algorithm for solving the defender's problem. We then demonstrate the computational efficacy of our proposed algorithms on a set of randomly generated instances.
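The spread dynamics inside the attacker's problem form a simple threshold process that can be simulated directly. The sketch below (all names illustrative) propagates influence over a finite horizon while skipping protected nodes; the node- and time-dependent objective weights are omitted.

```python
def simulate_spread(neighbors, threshold, protected, seeds, horizon):
    """neighbors: dict node -> list of adjacent nodes; threshold: dict node -> int.
    Returns the set of influenced nodes after `horizon` time stages."""
    influenced = set(seeds) - set(protected)
    for _ in range(horizon):
        newly = {v for v in neighbors
                 if v not in influenced and v not in protected
                 and sum(u in influenced for u in neighbors[v]) >= threshold[v]}
        if not newly:         # fixed point reached early
            break
        influenced |= newly
    return influenced
```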

12.
Let r be a fixed constant and let H be an r-uniform, D-regular hypergraph on N vertices. Assume further that D > N^ε for some fixed ε > 0. Consider the random greedy algorithm for forming an independent set in H. An independent set is chosen at random by iteratively choosing vertices at random to be in the independent set. At each step we choose a vertex uniformly at random from the collection of vertices that could be added to the independent set (i.e. the collection of vertices v with the property that v is not in the current independent set I and I ∪ {v} contains no edge of H). Note that this process terminates at a maximal subset of vertices with the property that this set contains no edge of H; that is, the process terminates at a maximal independent set. We prove that if H satisfies certain degree and codegree conditions then there are Ω(N·((log N)/D)^{1/(r−1)}) vertices in the independent set produced by the random greedy algorithm with high probability. This result generalizes a lower bound on the number of steps in the H-free process due to Bohman and Keevash and produces objects of interest in additive combinatorics. © 2016 Wiley Periodicals, Inc. Random Struct. Alg., 49, 479–502, 2016
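The process described above is easy to state in code. The sketch below is a direct, deliberately unoptimized transcription for an r-uniform hypergraph given as a collection of hyperedges; names are illustrative.

```python
import random

def random_greedy_independent_set(vertices, edges, seed=None):
    """edges: iterable of frozensets (hyperedges, each of size >= 2).
    Returns a maximal independent set built by the random greedy process."""
    rng = random.Random(seed)
    independent = set()
    candidates = set(vertices)
    while candidates:
        v = rng.choice(tuple(candidates))   # uniform choice among addable vertices
        independent.add(v)
        candidates.discard(v)
        # drop vertices whose addition would now complete a hyperedge
        candidates -= {u for u in candidates
                       if any(e <= independent | {u} for e in edges)}
    return independent
```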

13.
Independent component analysis (ICA) aims to recover a set of unknown, mutually independent components (ICs) from their observed mixtures without knowledge of the mixing coefficients. In the classical ICA model the ICs suffer an indeterminacy of permutation and dilation. Constrained ICA is one method for resolving this problem by introducing constraints into the classical ICA model. In this paper we first present a new constrained ICA model composed of three parts: a maximum-likelihood criterion as the objective function, statistical measures as inequality constraints, and the normalization of the demixing matrix as equality constraints. Next, we incorporate the new fixed-point (newFP) algorithm into this constrained ICA model to construct a new constrained fixed-point algorithm. Simulations on synthesized signals and speech signals demonstrate that this combination can both eliminate the ICs' indeterminacy to a certain extent and provide better performance. Moreover, comparisons with the existing algorithm further verify the efficiency of the new algorithm and show that it is simpler to implement, owing to its advantage of not using a learning rate. Finally, the new algorithm is also applied to real-world fetal ECG data, and the experimental results further indicate the efficiency of the new constrained fixed-point algorithm.
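The permutation/dilation indeterminacy that constrained ICA targets is easy to observe with a classical ICA implementation. The snippet below uses scikit-learn's FastICA (not the paper's constrained fixed-point algorithm) on two synthetic sources; the recovered components come back in arbitrary order and scale.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]  # two independent sources
A = rng.normal(size=(2, 2))                       # unknown mixing matrix
X = S @ A.T                                       # observed mixtures

S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)
# S_hat matches S only up to a permutation of columns and per-column scaling,
# which is exactly the indeterminacy constrained ICA tries to remove.
```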

14.
Novosibirsk. Translated from Sibirskii Matematicheskii Zhurnal, Vol. 32, No. 1, pp. 171–173, January–February, 1991.

15.
This paper presents a finite algorithm for computing the weighted Moore-Penrose inverse of a rectangular matrix. This algorithm may be viewed as an extension of Decell's algorithm.
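Decell's finite algorithm is based on the Faddeev–Leverrier recursion. A quick numerical cross-check of the weighted Moore–Penrose inverse can instead use the standard reduction to the ordinary pseudoinverse, A⁺_{MN} = N^{−1/2} (M^{1/2} A N^{−1/2})⁺ M^{1/2}, valid for symmetric positive definite weight matrices M and N. A sketch of that reduction (not the paper's algorithm):

```python
import numpy as np
from scipy.linalg import sqrtm

def weighted_pinv(A, M, N):
    """Weighted Moore-Penrose inverse of A for SPD weights M (m x m) and
    N (n x n), via reduction to the ordinary pseudoinverse."""
    M12 = np.real(sqrtm(M))             # principal square roots
    N12 = np.real(sqrtm(N))
    N12_inv = np.linalg.inv(N12)
    return N12_inv @ np.linalg.pinv(M12 @ A @ N12_inv) @ M12
```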

16.
17.
In this paper, sequential and parallel algorithms are presented to find a maximum-weight independent set in a weighted permutation graph. The sequential algorithm, which is based on dynamic programming, runs in O(n log n) time and requires O(n) space. The parallel algorithm runs in O(log² n) time using O(n³/log n) processors on the CREW PRAM, or in O(log n) time using O(n³) processors on the CRCW PRAM.
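In a permutation graph where vertices i < j are adjacent exactly when π(i) > π(j), independent sets correspond to increasing subsequences of π, so the O(n log n) bound can be matched by a maximum-weight increasing subsequence computation over a Fenwick tree of prefix maxima. A sketch under that convention (the paper's own O(n log n) dynamic program may be organized differently):

```python
def mwis_permutation_graph(perm, weight):
    """perm: permutation of 0..n-1; weight[i]: weight of vertex i.
    Vertices i < j are adjacent iff perm[i] > perm[j], so independent sets
    are increasing subsequences of perm. Runs in O(n log n)."""
    n = len(perm)
    tree = [0] * (n + 1)                # Fenwick tree storing prefix maxima

    def update(v, val):                 # record a chain of weight val ending at value v
        v += 1
        while v <= n:
            tree[v] = max(tree[v], val)
            v += v & -v

    def prefix_max(v):                  # best chain over values strictly below v
        best = 0
        while v > 0:
            best = max(best, tree[v])
            v -= v & -v
        return best

    answer = 0
    for i in range(n):
        cur = prefix_max(perm[i]) + weight[i]   # best chain ending at vertex i
        answer = max(answer, cur)
        update(perm[i], cur)
    return answer
```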

18.
Equivalent conditions of complete convergence for independent weighted sums
For a very general weight function, equivalent conditions for complete convergence of weighted sums of independent, but not necessarily identically distributed, random variables are given. This changes the previous situation, in which only sufficient results were available except for particular weight functions. These results can be used to deduce many known ones and bring richer content to light. Project supported by the National Natural Science Foundation of China (Grant No. 19671078), the Natural Science Foundation of the Higher Education Department of Guangdong Province, and the National Science Foundation of the Education Commission of Jiangsu Province.

19.
This work is concerned with the prime factor decomposition (PFD) of strong product graphs. A new quasi-linear-time algorithm for the PFD with respect to the strong product of arbitrary finite, connected, undirected graphs is derived. Moreover, since most graphs are prime even though they can have a product-like structure (also known as approximate graph products), the practical applicability of the well-known "classical" prime factorization algorithm is strictly limited. The new PFD algorithm is based on a local approach that covers a graph by small factorizable subgraphs and then uses this information to derive the global factors. As a consequence, this approach additionally yields a method for the recognition of approximate graph products.

20.
Very recently, Thomassé et al. (2017) have given an FPT algorithm for Weighted Independent Set in bull-free graphs parameterized by the weight of the solution, running in time 2^{O(k^5)}·n^9. In this article we improve this running time to 2^{O(k^2)}·n^7. As a byproduct, we also improve the previous Turing kernel for this problem from O(k^5) to O(k^2). Furthermore, for the subclass of bull-free graphs without holes of length at most 2p−1 for p ≥ 3, we speed up the running time to 2^{O(k·k^{1/(p−1)})}·n^7. As p grows, this running time is asymptotically tight in terms of k, since we prove that for each integer p ≥ 3, Weighted Independent Set cannot be solved in time 2^{o(k)}·n^{O(1)} in the class of {bull, C_4, …, C_{2p−1}}-free graphs unless the ETH fails.
