Similar documents
20 similar documents found (search time: 46 ms)
1.
A continuous space/time approximation of the well-known ‘directed polymer’ problem is considered. A connection between the ‘Helmholtz free energy’ and the ‘two-walker problem’ is shown. A rigorous proof of the superdiffusive mean squared displacement exponent of 4/3 is given when there is one space dimension and one time dimension. Asymptotically diffusive behaviour, with mean squared displacement c(k)·t, is shown when there are one ‘time’ and two ‘space’ dimensions. For higher dimensions the behaviour is diffusive and the mean squared displacement is asymptotically t·d. These results hold for all temperatures, because the phase transition in the discrete model is no longer present in the continuous model; the renormalization procedure has set the transition temperature to k_crit = 0. The joint distribution is also shown to be asymptotically sub-Gaussian for all dimensions and all temperatures (in the sense that the p-th moments, as a function of p, increase more slowly than the moments of a Gaussian distribution). The ‘Helmholtz free energy’ is also calculated for this model, and the quenched and annealed free energies are shown to be identical for all temperatures.

2.
In this paper, a deteriorating simple repairable system with k + 1 states, including k failure states and one working state, is studied. The system after repair is not ‘as good as new’ and the deterioration of the system is stochastic. Under these assumptions, we study a replacement policy, called policy N, based on the failure number of the system. The objective is to maximize the long-run expected profit per unit time. The explicit expression of the long-run expected profit per unit time is derived and the corresponding optimal solution may be determined analytically or numerically. Furthermore, we prove that the model for the multistate system in this paper forms a general monotone process model which includes the geometric process repair model as a special case. A numerical example is given to illustrate the theoretical results.
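To make the renewal-reward style of objective concrete, here is a minimal sketch, not the paper's model: it evaluates the long-run expected profit per unit time of policy N under a plain geometric process (working times shrink by a factor a ⩾ 1, repair times grow by a factor 1/b with 0 < b < 1). All parameter values are made up for illustration.

```python
def long_run_profit(N, lam=100.0, mu=5.0, a=1.1, b=0.95,
                    reward=1.0, repair_cost=0.5,
                    replace_cost=200.0, replace_time=10.0):
    """Long-run expected profit per unit time when the system is
    replaced at the N-th failure (renewal-reward argument)."""
    # expected working time before the i-th failure: lam / a**(i-1)
    work = sum(lam / a ** (i - 1) for i in range(1, N + 1))
    # expected duration of the i-th repair: mu / b**(i-1)
    repair = sum(mu / b ** (i - 1) for i in range(1, N))
    cycle = work + repair + replace_time
    return (reward * work - repair_cost * repair - replace_cost) / cycle

# the optimal policy can be found by a simple scan over candidate N
best_N = max(range(1, 60), key=long_run_profit)
```

Because working times shrink and repair times grow, the profit rate first rises and then falls in N, so a scan (or a monotonicity argument, as in the paper) locates the optimum.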

3.
In a paper that initiated the modern study of the stochastic block model (SBM), Decelle, Krzakala, Moore, and Zdeborová, backed by Mossel, Neeman, and Sly, conjectured that detecting clusters in the symmetric SBM in polynomial time is always possible above the Kesten‐Stigum (KS) threshold, while it is possible to detect clusters information theoretically (i.e., not necessarily in polynomial time) below the KS threshold when the number of clusters k is at least 4. Massoulié, Mossel et al., and Bordenave, Lelarge, and Massoulié proved that the KS threshold is in fact efficiently achievable for k = 2, while Mossel et al. proved that it cannot be crossed at k = 2. The above conjecture remained open for k ≥ 3. This paper proves the two parts of the conjecture, further extending the results to general SBMs. For the efficient part, an approximate acyclic belief propagation (ABP) algorithm is developed and proved to detect communities for any k down to the KS threshold in quasi‐linear time. Achieving this requires showing optimality of ABP in the presence of cycles, a challenge for message‐passing algorithms. The paper further connects ABP to a power iteration method on a nonbacktracking operator of generalized order, formalizing the interplay between message passing and spectral methods. For the information‐theoretic part, a nonefficient algorithm sampling a typical clustering is shown to cross the KS threshold at k = 4. The emerging gap is shown to be large in some cases, making the SBM a good case study for information-computation gaps. © 2017 Wiley Periodicals, Inc.

4.
Papadimitriou and Steiglitz constructed ‘traps’ for the symmetric travelling salesman problem (TSP) with n = 8k cities. The constructed problem instances have exponentially many suboptimal solutions with arbitrarily large weight, which differ from the unique optimal solution in exactly 3k edges, and hence local search algorithms are ineffective at solving these instances. However, we show that this class of ‘catastrophic’ examples can be solved by linear programming relaxation appended with k subtour elimination constraints. It follows that this class of problem instances of TSP can be optimized in polynomial time.

5.
A partially ordered set (P, ≤) is called k‐homogeneous if any isomorphism between k‐element subsets extends to an automorphism of (P, ≤). Assuming the set‐theoretic assumption ◊(ℵ1), it is shown that for each k, there exist partially ordered sets of size ℵ1 which embed each countable partial order and are k‐homogeneous, but not (k + 1)‐homogeneous. This is impossible in the countable case for k ≥ 4.

6.
We consider homogeneous solutions of the Vlasov–Fokker–Planck equation in plasma theory, proving that they converge to equilibrium at an exponential rate in time in various norms. By the Csiszár–Kullback inequality, strong L1-convergence is a consequence of the ‘sharp’ exponential decay of relative entropy and relative Fisher information. To prove exponential strong decay in Sobolev spaces H^k, k ⩾ 0, we take into account the smoothing effect of the Fokker–Planck kernel. Finally, we prove that in a metric for probability distributions recently introduced in [9] and studied in [4, 14], the decay towards equilibrium is exponential at a rate depending on the number of moments bounded initially. Uniform bounds on the solution in various norms are then combined, by interpolation inequalities, with the convergence in this weak metric, to recover the optimal rate of decay in Sobolev spaces. © 1998 by B. G. Teubner Stuttgart–John Wiley & Sons, Ltd.

7.
Alain Bruguières, Communications in Algebra, 2013, 41(14): 5817–5860
Inspired by a recent paper by Deligne [2], we extend the Tannaka-Krein duality results (over a field) to the non-commutative situation. To be precise, we establish a 1-1 correspondence between ‘tensorial autonomous categories’ equipped with a ‘fibre functor’ (i.e. tannakian categories without the commutativity condition on the tensor product), and ‘quantum groupoids’ (as defined by Maltsiniotis, [9]) which are ‘transitive’ (7.1.). When the base field is perfect, a quantum groupoid over Spec B is transitive iff it is projective and faithfully flat over B ⊗k B. Moreover, the fibre functor is unique up to ‘quantum isomorphism’ (7.6.). Actually, we show Tannaka-Krein duality results in the more general setting where there is no monoidal structure on the category (and the functor); the algebraic object corresponding to such a category is a ‘semi-transitive’ coalgebroid (5.2. and 5.8.).

8.
Summary: Based on the framework of subspace splitting and the additive Schwarz scheme, we give bounds for the condition number of multilevel preconditioners for sparse grid discretizations of elliptic model problems. For a BPX-like preconditioner we derive an estimate of the optimal order O(1), and for an HB-like variant we obtain an estimate of order O(k^2 · 2^{k/2}), where k denotes the number of levels employed. Furthermore, we confirm these results by numerically computed condition numbers.

9.
Consider a Poisson process X in R d with density 1. We connect each point of X to its k nearest neighbors by undirected edges. The number k is the parameter in this model. We show that, for k = 1, no percolation occurs in any dimension, while, for k = 2, percolation occurs when the dimension is sufficiently large. We also show that if percolation occurs, then there is exactly one infinite cluster. Another percolation model is obtained by putting balls of radius zero around each point of X and letting the radii grow linearly in time until they hit another ball. We show that this model exists and that there is no percolation in the limiting configuration. Finally we discuss some general properties of percolation models where balls placed at Poisson points are not allowed to overlap (but are allowed to be tangent). © 1996 John Wiley & Sons, Inc.
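The k-nearest-neighbor graph at the heart of this model is easy to build on a finite sample. The sketch below is our own illustration (uniform points in a box standing in for a conditioned Poisson process, union-find for components); it is a finite-volume toy, not the infinite-volume construction of the paper.

```python
import math
import random

def knn_graph_components(points, k):
    """Connect each point to its k nearest neighbors (undirected)
    and return the connected-component sizes, largest first."""
    n = len(points)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    def union(x, y):
        parent[find(x)] = find(y)
    for i, p in enumerate(points):
        order = sorted(range(n), key=lambda j: math.dist(p, points[j]))
        for j in order[1:k + 1]:            # order[0] is i itself
            union(i, j)
    sizes = {}
    for i in range(n):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return sorted(sizes.values(), reverse=True)

random.seed(0)
pts = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(200)]
comps = knn_graph_components(pts, 2)
```

Comparing the largest component for k = 1 versus k = 2 on growing samples gives a rough empirical feel for the dichotomy the paper proves.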

10.
Jim Propp's rotor–router model is a deterministic analog of a random walk on a graph. Instead of distributing chips randomly, each vertex serves its neighbors in a fixed order. Cooper and Spencer (Comb Probab Comput 15 (2006) 815–822) show a remarkable similarity of both models. If an (almost) arbitrary population of chips is placed on the vertices of a grid ℤ^d and does a simultaneous walk in the Propp model, then at all times and on each vertex, the number of chips on this vertex deviates from the expected number the random walk would have gotten there by at most a constant. This constant is independent of the starting configuration and the order in which each vertex serves its neighbors. This result raises the question of whether all graphs have this property. With quite some effort, we are now able to answer this question negatively. For the graph being an infinite k‐ary tree (k ≥ 3), we show that for any deviation D there is an initial configuration of chips such that after running the Propp model for a certain time there is a vertex with at least D more chips than expected in the random walk model. However, to achieve a deviation of D it is necessary that at least exp(Ω(D2)) vertices contribute by being occupied by a number of chips not divisible by k at a certain time. © 2010 Wiley Periodicals, Inc. Random Struct. Alg., 2010
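A toy version of the Propp machine on ℤ is easy to state in code. The sketch below is our own illustration (a two-direction rotor serving right, then left); it runs 8 chips for 4 synchronized steps, after which the occupation numbers can be compared with the binomial profile a simultaneous random walk would give in expectation.

```python
from collections import defaultdict

def propp_step(chips, rotors):
    """One synchronized step of the rotor-router (Propp) model on Z:
    each vertex routes its chips alternately right, left, right, ..."""
    new = defaultdict(int)
    for v, c in chips.items():
        for _ in range(c):
            d = 1 if rotors[v] == 0 else -1
            rotors[v] ^= 1                 # advance this vertex's rotor
            new[v + d] += 1
    return new

rotors = defaultdict(int)                  # all rotors initially point right
chips = defaultdict(int, {0: 8})           # 8 chips at the origin
for _ in range(4):
    chips = propp_step(chips, rotors)
```

After 4 steps the chips sit at {4: 1, 2: 2, 0: 3, −2: 2}, while the random-walk expectations are {4: 0.5, 2: 2, 0: 3, −2: 2, −4: 0.5}: every vertex is within 0.5 of expectation, matching the constant-deviation phenomenon on ℤ^d described above.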

11.
The Integer Knapsack Problem with Set-up Weights (IKPSW) is a generalization of the classical Integer Knapsack Problem (IKP), where each item type has a set-up weight that is added to the knapsack if any copies of the item type are in the knapsack solution. The k-item IKPSW (kIKPSW) is also considered, where a cardinality constraint imposes a value k on the total number of items in the knapsack solution. IKPSW and kIKPSW have applications in the area of aviation security. This paper provides dynamic programming algorithms for each problem that produce optimal solutions in pseudo-polynomial time. Moreover, four heuristics are presented that provide approximate solutions to IKPSW and kIKPSW. For each problem, a Greedy heuristic is presented that produces solutions within a factor of 1/2 of the optimal solution value, and a fully polynomial time approximation scheme (FPTAS) is presented that produces solutions within a factor of ε of the optimal solution value. The FPTAS for IKPSW has time and space requirements of O(n log n + n/ε^2 + 1/ε^3) and O(1/ε^2), respectively, and the FPTAS for kIKPSW has time and space requirements of O(k n^2/ε^3) and O(k/ε^2), respectively.
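The set-up-weight recursion is straightforward to sketch. The following is our own illustration of a pseudo-polynomial DP for a minimal IKPSW variant (integer copies, one set-up weight per type), not the authors' algorithm: taking the first copy of a type pays its set-up weight plus one unit weight, and each further copy pays only the unit weight.

```python
def ikpsw(types, W):
    """Maximum value achievable with knapsack capacity W.
    types: list of (setup_weight, unit_weight, unit_value)."""
    NEG = float('-inf')
    f = [0] * (W + 1)                      # best value using types seen so far
    for s, w, v in types:
        h = [NEG] * (W + 1)                # best value with >= 1 copy of this type
        for x in range(W + 1):
            if x >= s + w and f[x - s - w] + v > h[x]:
                h[x] = f[x - s - w] + v    # first copy: setup + unit weight
            if x >= w and h[x - w] != NEG and h[x - w] + v > h[x]:
                h[x] = h[x - w] + v        # one more copy of the same type
        f = [max(f[x], h[x]) for x in range(W + 1)]
    return f[W]
```

The running time is O(nW) for n types, the usual pseudo-polynomial bound; the cardinality-constrained kIKPSW would add one more DP dimension for the item count.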

12.
When (R1,...,Rn) is a random permutation of the numbers (1,...,n), a ‘near-match’ at the ith place is defined to have occurred if |Ri − i| < k, for some fixed integer k. This note studies the asymptotic distribution of the number of ‘near-matches’ when k is fixed and when k is allowed to go to infinity with n.
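For intuition: for fixed k, each interior position i has 2k − 1 admissible values of Ri, so the expected number of near-matches is about 2k − 1 (slightly less because of boundary positions). A quick Monte Carlo check, our own illustration:

```python
import random

def near_matches(perm, k):
    """Count positions i (1-based) with |perm[i] - i| < k."""
    return sum(1 for i, r in enumerate(perm, start=1) if abs(r - i) < k)

random.seed(1)
n, k, trials = 1000, 2, 2000
mean = sum(near_matches(random.sample(range(1, n + 1), n), k)
           for _ in range(trials)) / trials
# for k = 2 the mean should land near 2k - 1 = 3
```

The asymptotic distribution studied in the note (fixed k) is Poisson-like with this mean; the regime where k grows with n requires a finer analysis.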

13.
This article analyses intangible constructs that affect sales in the Internet retailing industry. We suggest an explanatory model for the success of retailers that operate on the Internet. Non-financial information has been used to identify several intangible constructs: ‘web traffic generation’, ‘relevance in search engines’, ‘link popularity’, and ‘blogs popularity’. Success is measured through items derived from financial statements: sales and profits. The model has been built within a structural modelling framework. It has been estimated using Partial Least Squares with a sample of USA e-tailers. The results show that there is a significant relationship between the intangible constructs and accounting figures. This relationship is stronger when we consider Sales from Internet Operations rather than Total Sales or Net Profit.

14.
A function is said to be of ‘type k’ near zero if it behaves roughly like x^k there. This notion is defined precisely, then it is used to obtain modifications of the Newton and Halley iteration schemes. It also gives an idea of the location of the constants c which arise in various mean value theorems.
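As a concrete instance: at a root where f is of ‘type k’ (it behaves like x^k), plain Newton converges only linearly, and the classical remedy is to scale the Newton step by k. The sketch below is our own illustration of that standard modification, not the paper's exact scheme.

```python
def modified_newton(f, df, x0, k, tol=1e-12, max_iter=100):
    """Newton's method with the step scaled by k, restoring fast
    convergence at a root where f is of 'type k'."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if fx == 0:
            break                       # exact root reached
        step = k * fx / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# f(x) = (x - 1)**3 is of 'type 3' near its root x = 1
root = modified_newton(lambda x: (x - 1) ** 3,
                       lambda x: 3 * (x - 1) ** 2, 2.0, 3)
```

For this cubic the scaled step (x − 1) lands on the root in a single iteration, whereas the unscaled Newton step only shrinks the error by a factor of 2/3 per iteration.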

15.
In this paper we consider a common optimization problem faced by a printing company while designing masters for advertisement material. A printing company may receive, from various customers, advertisements for their products and services, together with a demand for a specified number of copies to be printed. In a particular case, the printer receives the orders, to be delivered the following week, from the customers by the Thursday of a week, and the printed copies have to be delivered to the customers by Monday. The advertisement items of the various customers are to be printed on large sheets of paper of specified standard sizes. A size is called a k-up if k items can be printed on one sheet. It is a given constraint that only items of the same size can be loaded on a master. This constraint results in a decomposition of the original problem of designing masters into many sub-problems, one for each size. The objective is to minimize the number of masters required while meeting the requirements of the customers. We formulate this optimization problem mathematically, discuss the computational issues and present some heuristic approaches for solving the problem.

16.
The graph coloring problem is to color a given graph with the minimum number of colors. This problem is known to be NP-hard even if we are only aiming at approximate solutions. On the other hand, the best known approximation algorithms require n^δ (δ > 0) colors even for bounded chromatic (k-colorable for fixed k) n-vertex graphs. The situation changes dramatically if we look at the average performance of an algorithm rather than its worst case performance. A k-colorable graph drawn from certain classes of distributions can be k-colored almost surely in polynomial time. It is also possible to k-color such random graphs in polynomial average time. In this paper, we present polynomial time algorithms for k-coloring graphs drawn from the semirandom model. In this model, the graph is supplied by an adversary each of whose decisions regarding inclusion of edges is reversed with some probability p. In terms of randomness, this model lies between the worst case model and the usual random model where each edge is chosen with equal probability. We present polynomial time algorithms of two different types. The first type of algorithms always run in polynomial time and succeed almost surely. Blum and Spencer [J. Algorithms, 19, 204–234 (1995)] have also obtained independently such algorithms, but our results are based on different proof techniques which are interesting in their own right. The second type of algorithms always succeed and have polynomial running time on the average. Such algorithms are more useful and more difficult to obtain than the first type of algorithms. Our algorithms work for semirandom graphs drawn from a wide range of distributions and work as long as p ⩾ n^{−α(k)+ϵ}, where α(k) = (2k)/((k−1)(k+2)) and ϵ is a positive constant. © 1998 John Wiley & Sons, Inc. Random Struct. Alg., 13, 125–158 (1998)

17.
We present error estimates of a linear fully discrete scheme for a three-dimensional mass diffusion model for incompressible fluids (also called Kazhikhov–Smagulov model). All unknowns of the model (velocity, pressure and density) are approximated in space by C^0 finite elements and in time an Euler type scheme is used, decoupling the density from the velocity–pressure pair. If we assume that the velocity and pressure finite-element spaces satisfy the inf–sup condition and the density finite-element space contains the products of any two discrete velocities, we first obtain point-wise stability estimates for the density, under the constraint lim_{(h,k)→0} h/k = 0 (h and k being the space and time discrete parameters, respectively), and error estimates for the velocity and density in energy type norms, at the same time. Afterwards, error estimates for the density in stronger norms are deduced. All these error estimates will be optimal (of order O(h + k)) for regular enough solutions without imposing nonlocal compatibility conditions at the initial time. Finally, we also study two convergent iterative methods for the two problems to solve at each time step, which have constant matrices (independent of iterations).

18.
The paradigm of many choices has influenced significantly the design of efficient data structures and, most notably, hash tables. Cuckoo hashing is a technique that extends this concept. There, we are given a table with n locations, and we assume that each location can hold one item. Each item to be inserted chooses randomly k ≥ 2 locations and has to be placed in any one of them. How much load can cuckoo hashing handle before collisions prevent the successful assignment of the available items to the chosen locations? Practical evaluations and theoretical analysis of this method have shown that one can allocate a number of elements that is a large proportion of the size of the table, being very close to 1 even for small values of k such as 4 or 5. In this paper we show that there is a critical value for this proportion: with high probability, when the amount of available items is below this value, then these can be allocated successfully, but when it exceeds this value, the allocation becomes impossible. We give this critical value explicitly for each k ≥ 3. This answers an open question posed by Mitzenmacher (ESA '09) and underpins theoretically the experimental results. Our proofs are based on the translation of the question into a hypergraph setting, and the study of the related typical properties of random k‐uniform hypergraphs. © 2012 Wiley Periodicals, Inc. Random Struct. Alg., 2012
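The insertion procedure itself is simple to sketch. Below is our own random-walk variant (evict a random occupant when all k choices are full), with made-up table sizes, just to show empirically that loads well below the critical value are easy to place; it is an illustration, not the paper's analysis.

```python
import random

def cuckoo_insert_all(n_slots, items, k=3, max_kicks=2000, seed=0):
    """Try to place every item into one of its k random locations,
    evicting a random occupant whenever all choices are full."""
    rng = random.Random(seed)
    choices = {x: rng.sample(range(n_slots), k) for x in items}
    table = [None] * n_slots
    for x in items:
        cur = x
        for _ in range(max_kicks):
            free = [s for s in choices[cur] if table[s] is None]
            if free:
                table[rng.choice(free)] = cur
                break
            s = rng.choice(choices[cur])   # kick a random occupant
            cur, table[s] = table[s], cur
        else:
            return False                   # gave up: allocation failed
    return True

# load 0.7 with k = 3 is comfortably below the critical proportion
ok = cuckoo_insert_all(1000, list(range(700)), k=3)
```

Whether a full allocation exists at all is exactly the orientability question for the random k-uniform hypergraph of choices that the paper studies; the random-walk insertion above is merely one practical way to find an allocation when it exists.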

19.
Scale-free graphs have attracted attention because their non-uniform structure can serve as a model for many social networks, including the WWW and the Internet. In this paper, we propose a simple random model for generating scale free k-trees. For any fixed integer k, a k-tree consists of a generalized tree parameterized by k, and is one of the basic notions in the area of graph minors. Our model is quite simple and natural; it first picks a maximal clique of size k + 1 uniformly at random, it then picks k vertices in the clique uniformly at random, and adds a new vertex incident to the k vertices. That is, the model only makes uniform random choices twice per vertex. Then (asymptotically) the distribution of vertex degree in the resultant k-tree follows a power law with exponent 2 + 1/k, the k-tree has a large clustering coefficient, and the diameter is small. Moreover, our experimental results indicate that the resultant k-trees have extremely small diameter, proportional to o(log n), where n is the number of vertices in the k-tree and the implied factor depends on k.
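The generation process is direct to implement. The sketch below is our own reading of the model (the (k + 1)-cliques created so far are kept as the candidate pool, so the maximal-clique bookkeeping is simplified); parameter values are illustrative.

```python
import random

def random_k_tree(n, k, seed=0):
    """Grow a random k-tree on n >= k + 1 vertices:
    repeatedly pick a (k+1)-clique uniformly at random, pick k of its
    vertices uniformly at random, and attach a new vertex to them."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    cliques = [list(range(k + 1))]          # start from a (k+1)-clique
    for u in range(k + 1):
        for w in range(u + 1, k + 1):
            adj[u].add(w)
            adj[w].add(u)
    for v in range(k + 1, n):
        clique = rng.choice(cliques)        # first uniform choice
        nbrs = rng.sample(clique, k)        # second uniform choice
        for u in nbrs:
            adj[v].add(u)
            adj[u].add(v)
        cliques.append(nbrs + [v])          # the newly created (k+1)-clique
    return adj

g = random_k_tree(200, 2)
```

Every k-tree on n vertices has exactly k(k + 1)/2 + (n − k − 1)·k edges, which gives a quick sanity check on the construction; degree histograms of larger samples exhibit the power-law tail with exponent 2 + 1/k.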

20.
In this article, a local discontinuous Galerkin (LDG) method is studied for numerically solving the fractal mobile/immobile transport equation with a new time Caputo–Fabrizio fractional derivative. The stability of the LDG scheme is proven, and a priori error estimates with the second‐order temporal convergence rate and the (k + 1)th-order spatial convergence rate are derived in detail. Finally, numerical experiments based on P_k elements, k = 0, 1, 2, 3, are provided to verify our theoretical results.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号