Similar Documents
20 similar documents found.
1.
For the uniform random regular directed graph we prove concentration inequalities for (1) codegrees and (2) the number of edges passing from one set of vertices to another. As a consequence, we can deduce discrepancy properties for the distribution of edges essentially matching results for Erdős–Rényi digraphs obtained from Chernoff‐type bounds. The proofs make use of the method of exchangeable pairs, developed for concentration of measure by Chatterjee (Probab Theory Relat Fields 138 (2007), 305–321). Exchangeable pairs are constructed using two involutions on the set of regular digraphs: a well‐known “simple switching” operation, as well as a novel “reflection” operation. © 2016 Wiley Periodicals, Inc. Random Struct. Alg., 50, 23–58, 2017
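To make the “simple switching” operation concrete, here is a minimal Python sketch (our own illustration with hypothetical names, not the paper's construction): it swaps the heads of two directed edges, which preserves every in‑ and out‑degree and therefore maps a regular digraph to another regular digraph.

```python
import random

def simple_switch(edges):
    """Swap the heads of two directed edges (a->b, c->d) -> (a->d, c->b).

    This preserves every vertex's in-degree and out-degree, so a regular
    digraph is mapped to a regular digraph; switching the same pair of
    edges again returns the original graph. Illustrative sketch only.
    """
    edges = set(edges)
    (a, b), (c, d) = random.sample(sorted(edges), 2)
    # Only switch if the result is again a loopless digraph without
    # duplicate edges.
    if a != d and c != b and (a, d) not in edges and (c, b) not in edges:
        edges.discard((a, b))
        edges.discard((c, d))
        edges.add((a, d))
        edges.add((c, b))
    return edges

# Example: a 1-regular digraph on 4 vertices (a directed 4-cycle).
print(simple_switch({(0, 1), (1, 2), (2, 3), (3, 0)}))
```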

2.
With a view to enhancing our computer implementation, we explore the possibility of an algorithmic optimization of the various proof‐theoretic techniques employed by Kohlenbach for the synthesis of new (and better) effective uniform bounds out of established qualitative proofs in Numerical Functional Analysis. Concretely, we prove that the method (developed by the author in his thesis, as an adaptation to Dialectica interpretations of Berger's original technique for modified realizability and A‐translation) of “colouring” some of the quantifiers as “non‐computational” extends well to ε‐arithmetization, elimination‐of‐extensionality and model‐interpretation (© 2009 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)

3.
In this paper we point out a mistake in P. J. Browne's paper “Abstract Multiparameter Theory I” and give new proofs for some of its results.

4.
A k‐piece of a graph G is a connected subgraph of G in which every node has degree at most k and at least one node has degree exactly k. We consider the problem of covering the maximum number of nodes of a graph by node‐disjoint k‐pieces. When k = 1 this is the maximum matching problem, and when k = 2 it is the problem, recently studied by Kaneko [19], of covering the maximum number of nodes by disjoint paths of length greater than 1. We present a polynomial‐time algorithm for the problem as well as a Tutte‐type existence theorem and a Berge‐type min–max formula. We also solve the problem in the more general situation where the “pieces” are defined in terms of lower and upper bounds on the degrees. © 2006 Wiley Periodicals, Inc. J Graph Theory
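As a small illustration of the definition (a sketch of our own using the networkx library; the function name is hypothetical and this is not the paper's algorithm), the following check tests whether an induced subgraph is a k‑piece.

```python
import networkx as nx

def is_k_piece(G, nodes, k):
    """Check whether the subgraph of G induced by `nodes` is a k-piece:
    connected, every node has degree at most k in the subgraph, and at
    least one node has degree exactly k. Illustrative sketch only; the
    paper's covering algorithm is not reproduced here.
    """
    H = G.subgraph(nodes)
    if len(H) == 0 or not nx.is_connected(H):
        return False
    degrees = [d for _, d in H.degree()]
    return max(degrees) == k  # all degrees <= k and at least one equal to k

# A path on 3 vertices is a 2-piece (its middle vertex has degree 2),
# and a single edge is a 1-piece.
P3 = nx.path_graph(3)
print(is_k_piece(P3, [0, 1, 2], 2))  # True
print(is_k_piece(P3, [0, 1], 1))     # True
```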

5.
Two embeddings of a graph in a surface S are said to be “equivalent” if they are identical under a homeomorphism of S that is orientation‐preserving for orientable S. Two graphs cellularly embedded simultaneously in S are said to be “jointly embedded” if the only points of intersection involve an edge of one graph transversally crossing an edge of the other. The problem is to find equivalent embeddings of the two graphs that minimize the number of these edge crossings; this minimum we call the “joint crossing number” of the two graphs. In this paper, we calculate the exact value of the joint crossing number for two graphs simultaneously embedded in the projective plane. Furthermore, we give upper and lower bounds when the surface is the torus, which in many cases give an exact answer. In particular, we give a construction for re‐embedding (equivalently) the graphs in the torus so that the number of crossings is best possible up to a constant factor. Finally, we show that if one of the embeddings is replaced by its “mirror image,” then the joint crossing number can decrease, but not by more than 6.066%. © 2001 John Wiley & Sons, Inc. J Graph Theory 36: 198–216, 2001

6.
Joint progressive censoring schemes are quite useful for conducting comparative life‐testing experiments on competing products. Recently, Mondal and Kundu (“A New Two Sample Type‐II Progressive Censoring Scheme,” Commun Stat Theory Methods, 2018) introduced a joint progressive censoring scheme on two samples known as the balanced joint progressive censoring (BJPC) scheme. Optimal planning of such a censoring scheme is an important issue for the experimenter. This article considers optimal life‐testing plans under the BJPC scheme using the Bayesian precision and D‐optimality criteria, assuming that the lifetimes follow a Weibull distribution. Obtaining an optimal BJPC life‐testing plan in principle requires an exhaustive search over the set of all admissible plans, which becomes impractical for large sample sizes. A metaheuristic algorithm based on the variable neighborhood search method is therefore employed to compute the optimal plan. Optimal plans are provided under different scenarios; they depend on the values of the hyperparameters of the prior distribution, and the effect of different prior information on the optimal scheme is studied.
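The variable neighborhood search idea mentioned above can be sketched generically as follows (a minimal illustrative skeleton with a toy objective and hypothetical names, not the authors' implementation and not a censoring criterion): perturb the incumbent in increasingly large neighborhoods and restart from the smallest neighborhood whenever an improvement is found.

```python
import random

def reduced_vns(objective, x0, shake, k_max=3, iters=500, seed=1):
    """A minimal 'reduced' variable neighborhood search loop. `shake(x, k, rng)`
    returns a random point in the k-th neighborhood of x; smaller objective
    values are better. Illustrative sketch only.
    """
    rng = random.Random(seed)
    best, best_val = x0, objective(x0)
    for _ in range(iters):
        k = 1
        while k <= k_max:
            cand = shake(best, k, rng)
            val = objective(cand)
            if val < best_val:
                best, best_val = cand, val  # accept and restart from the
                k = 1                       # first (smallest) neighborhood
            else:
                k += 1                      # otherwise try a larger one
    return best, best_val

# Toy use: find an integer vector summing to 10 that is as balanced as
# possible (purely illustrative objective).
def shake(x, k, rng):
    x = list(x)
    for _ in range(k):                      # move k units between coordinates
        i, j = rng.randrange(len(x)), rng.randrange(len(x))
        if x[i] > 0:
            x[i] -= 1
            x[j] += 1
    return tuple(x)

print(reduced_vns(lambda x: max(x) - min(x), (10, 0, 0), shake))
```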

7.
We introduce a unifying framework for studying edge‐coloring problems on multigraphs. It is defined in terms of a rooted directed multigraph, naturally associated to the set of fans based at a given vertex u in a multigraph G, which we call the “Fan Digraph.” We show that fans in G based at u are in one‐to‐one correspondence with directed trails in the fan digraph that start at its root. We state and prove a central theorem about the fan digraph, which embodies many edge‐coloring results and expresses them at a higher level of abstraction. Using this result, we derive short proofs of classical theorems. We conclude with a new, generalized version of Vizing's Adjacency Lemma for multigraphs, which is stronger than all those known to the author. © 2005 Wiley Periodicals, Inc. J Graph Theory 51: 301–318, 2006

8.
A sentence of the usual language of set theory is said to be stratified if it is obtained by “erasing” type indices in a sentence of the language of Russell's Simple Theory of Types. In this paper we give an alternative presentation of a proof of the ambiguity theorem, which states that any provable stratified sentence has a stratified proof. To this end, we introduce a new set of ambiguity axioms, inspired by Fraïssé's characterization of elementary equivalence; these axioms can be naturally used to give different proofs of the ambiguity theorem (semantic or syntactic, classical or intuitionistic). MSC: 03B15, 03F50, 03F55.

9.
Let G be a graph of order n, maximum degree Δ, and minimum degree δ. Let P(G, λ) be the chromatic polynomial of G. It is known that the multiplicity of the zero “0” of P(G, λ) is one if G is connected, and the multiplicity of the zero “1” of P(G, λ) is one if G is 2‐connected. Is the multiplicity of the zero “2” of P(G, λ) at most one if G is 3‐connected? In this article, we first construct an infinite family of 3‐connected graphs G such that the multiplicity of the zero “2” of P(G, λ) is more than one, and then characterize 3‐connected graphs G with Δ + δ ≥ n such that the multiplicity of the zero “2” of P(G, λ) is at most one. In particular, we show that for a 3‐connected graph G, if Δ + δ ≥ n and (Δ, δ3) ≠ (n − 3, 3), where δ3 is the third smallest degree of G, then the multiplicity of the zero “2” of P(G, λ) is at most one. © 2011 Wiley Periodicals, Inc. J Graph Theory
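For a concrete instance of the quantities involved (an illustrative example of our own, not taken from the article), consider the complete graph K4, which is 3‑connected and satisfies the degree condition above:

```latex
% K_4 is 3-connected with \Delta = \delta = 3, so \Delta + \delta = 6 \ge n = 4.
P(K_4,\lambda) \;=\; \lambda(\lambda-1)(\lambda-2)(\lambda-3),
% hence each of the zeros 0, 1 and 2 of P(K_4,\lambda) has multiplicity exactly one.
```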

10.
This study compared one lesson across four U.S. “traditional” textbook series, two U.S. reform‐based textbook series, and one Chinese mathematics textbook series in teaching the connection between multiplication and division. The results showed differences between the U.S. and Chinese lessons, in both the teaching and the practice parts of the lesson, across three dimensions (i.e., problem schemata, response requirement, and algebra readiness). In particular, the Chinese lesson's penetrating analysis and explanation of the topic is reflected in its deliberately constructed examples and in the wide range of problems (in problem types and difficulty levels) present in the teaching and practice sections of the lesson. None of the analyzed U.S. lessons is comparable with the Chinese lesson with respect to the breadth and depth of its treatment of the topic. The deliberate emphasis, both arithmetic and algebraic, on problem schema acquisition found in the Chinese lesson promotes symbolic, higher‐order conceptual understanding. The findings are discussed within the context of teaching big ideas through problem schema acquisition and the importance of a symbolic level of conceptual understanding.

11.
12.
Consider the class of unlabeled trees with n vertices, and denote by Hn a tree drawn uniformly at random from this class. The asymptotic behavior of the random variable degk(Hn), which counts the vertices of degree k in Hn, was studied, among others, by Drmota and Gittenberger [J Graph Theory 31(3) (1999), 227–253], who showed that this quantity satisfies a central limit theorem. This result provides a very precise characterization of the “central region” of the distribution, but does not give any non‐trivial information about its tails. In this work, we study further the number of vertices of degree k in Hn. In particular, for k = O((log n/log log n)^(1/2)) we show exponential‐type bounds for the probability that degk(Hn) deviates from its expectation. On the technical side, our proofs are based on the analysis of a randomized algorithm that generates unlabeled trees in the so‐called Boltzmann model. The analysis of such algorithms is quite well understood for classes of labeled graphs; see e.g. the work [Bernasconi et al., SODA '08: Proceedings of the 19th Annual ACM‐SIAM Symposium on Discrete Algorithms, Society for Industrial and Applied Mathematics, Philadelphia, PA, 2008, pp. 132–141; Bernasconi et al., Proceedings of the 11th International Workshop, APPROX 2008, and 12th International Workshop, RANDOM 2008, on Approximation, Randomization and Combinatorial Optimization, Springer, Berlin, 2008, pp. 303–316] by Bernasconi, the first author, and Steger. Comparable algorithms for unlabeled classes are unfortunately much more complex. We demonstrate in this work that they too can be analyzed very precisely for classes of unlabeled graphs. © 2011 Wiley Periodicals, Inc. J Graph Theory 69: 114–130, 2012
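For the labeled analogue of the quantity degk, a uniform random labeled tree is easy to sample via Prüfer sequences, as in the sketch below (our own illustration with hypothetical names; the paper's harder unlabeled setting requires the Boltzmann‑sampler machinery mentioned above):

```python
import random
from collections import Counter

def degree_counts_random_labeled_tree(n, seed=None):
    """Count vertices of each degree in a uniformly random *labeled* tree on
    n vertices, sampled via a random Pruefer sequence. Quick illustration of
    the quantity deg_k only; not the paper's unlabeled-tree sampler.
    """
    rng = random.Random(seed)
    # A uniform Pruefer sequence of length n-2 encodes a uniform labeled tree,
    # and vertex v has degree 1 + (number of occurrences of v in the sequence).
    pruefer = [rng.randrange(n) for _ in range(n - 2)]
    occurrences = Counter(pruefer)
    degrees = [1 + occurrences[v] for v in range(n)]
    return Counter(degrees)   # maps degree k -> number of vertices of degree k

print(degree_counts_random_labeled_tree(1000, seed=42))
```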

13.
Coalitional games raise a number of important questions from the point of view of computer science, key among them being how to represent such games compactly, and how to efficiently compute solution concepts given such representations. Marginal contribution nets (MC‐nets), introduced by Ieong and Shoham, are one of the simplest and most influential representation schemes for coalitional games. MC‐nets are a rule‐based formalism in which rules take the form pattern → value, where “pattern” is a Boolean condition over agents and “value” is a numeric value. Ieong and Shoham showed that, for a class of what we will call “basic” MC‐nets, where patterns are constrained to be conjunctions of literals, marginal contribution nets permit the easy computation of solution concepts such as the Shapley value. However, there are very natural classes of coalitional games that require an exponential number of such basic MC‐net rules. We present read‐once MC‐nets, a new class of MC‐nets that is provably more compact than basic MC‐nets, while retaining their attractive computational properties. We show how the techniques we develop for read‐once MC‐nets can be applied to other domains, in particular to computing solution concepts in network flow games on series‐parallel networks (© 2009 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
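To see why basic MC‑nets make the Shapley value easy, here is a minimal sketch restricted to patterns that are conjunctions of positive literals (our own illustration with hypothetical names): each such rule is a scaled unanimity game, so by linearity its value splits equally among the agents in the pattern; rules with negated literals need a slightly different per‑rule formula and are omitted here.

```python
from collections import defaultdict

def shapley_from_positive_mc_net(rules, agents):
    """Shapley values for a game given by basic MC-net rules whose patterns are
    conjunctions of *positive* literals only. Each rule is (pattern, value),
    where `pattern` is the set of agents that must all be present. A rule
    (T, v) is v times the unanimity game on T, whose Shapley value is v/|T|
    for members of T and 0 otherwise; by linearity we sum over rules.
    (Rules with negated literals are not handled in this sketch.)
    """
    phi = defaultdict(float)
    for pattern, value in rules:
        for agent in pattern:
            phi[agent] += value / len(pattern)
    return {a: phi[a] for a in agents}

# Example game: v(S) = 5 if {1,2} is contained in S, plus 2 more if 3 is in S.
rules = [({1, 2}, 5.0), ({3}, 2.0)]
print(shapley_from_positive_mc_net(rules, agents=[1, 2, 3]))
# -> {1: 2.5, 2: 2.5, 3: 2.0}
```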

14.
During the restoration planning phase of the natural resource damage assessment (NRDA) process, potential injuries to natural resources and services are evaluated in terms of the nature, degree, and extent of injury so that the need for and scale of restoration actions can be ascertained. Injuries are quantified by comparing the condition of the injured natural resource with baseline (pre‐injury) conditions. The “Type A” procedures are used to quantify damages from smaller spills and rely on a standardized methodology and computer model to calculate injury and the value of damages. In this model, fishery stock changes caused by injuries, and the resulting changes in user participation, are not treated as dynamic. If true stock growth and re‐growth are indeed dynamic, then the Type A model is likely underestimating fishery losses. The purpose of this paper is to illustrate the potential for such underestimation by comparing simulated stock and harvest losses under a dynamic treatment and under a static treatment that more closely represents the way stock and service losses are estimated under the current NRDA process.
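To make the static‑versus‑dynamic contrast concrete, here is a toy simulation (our own illustration with made‑up parameters and names, not the Type A model): a one‑time kill reduces the stock, the stock then recovers logistically toward its baseline, and the dynamic loss accumulates the harvest shortfall over the whole recovery period rather than for a single year.

```python
def cumulative_harvest_loss(kill_fraction=0.3, carrying_capacity=1000.0,
                            growth_rate=0.4, harvest_rate=0.1, years=25):
    """Toy comparison of a static vs. a dynamic treatment of a fishery injury.
    All parameters are hypothetical. Baseline stock is assumed to sit at
    carrying capacity, and harvest each year is a fixed fraction of the
    current stock.
    """
    baseline = carrying_capacity
    stock = carrying_capacity * (1.0 - kill_fraction)  # stock just after the spill

    # Static treatment: count only the first year's harvest shortfall.
    static_loss = harvest_rate * (baseline - stock)

    # Dynamic treatment: let the stock recover logistically and accumulate
    # the harvest shortfall until recovery (or the end of the horizon).
    dynamic_loss = 0.0
    for _ in range(years):
        dynamic_loss += harvest_rate * (baseline - stock)
        stock += growth_rate * stock * (1.0 - stock / carrying_capacity)

    return static_loss, dynamic_loss

static_loss, dynamic_loss = cumulative_harvest_loss()
print(f"static: {static_loss:.1f}  dynamic: {dynamic_loss:.1f}")
```

In this toy setting the dynamic figure is several times the static one, which is the kind of underestimation the abstract describes.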

15.
We investigate modules over “systematic” rings. Such rings are “almost graded” and have appeared under various names in the literature; they are special cases of the G-systems of Grzeszczuk. We analyse their K-theory in the presence of conditions on the support, and explain how this generalises and unifies calculations of graded and filtered K-theory scattered in the literature. Our treatment makes systematic use of the formalism of idempotent completion and a theory of triangular objects in additive categories, leading to elementary and transparent proofs throughout.

16.
G. Karch, W.A. Woyczynski, PAMM 2007, 7(1): 1030205–1030206
Nonlinear and nonlocal evolution equations of the form u_t = ℒu ± |∇u|^q, where ℒ is a pseudodifferential operator representing the infinitesimal generator of a Lévy stochastic process, have been derived (see [6]) as models for growing interfaces in the case when the continuous Brownian diffusion surface transport is augmented by a random hopping mechanism. The goal of this note is to report properties of solutions to this equation resulting from the interplay between the strengths of the “diffusive” linear and “hyperbolic” nonlinear terms, posed in the whole space R^N and supplemented with nonnegative, bounded, and sufficiently regular initial conditions. The full text of the paper, including complete proofs and other results, will appear in the Transactions of the American Mathematical Society. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

17.
In (J Graph Theory 33 (2000), 14–24), Hell and Zhu proved that if a series–parallel graph G has girth at least 2⌊(3k−1)/2⌋, then χc(G) ≤ 4k/(2k−1). In (J Graph Theory 33 (2000), 185–198), Chien and Zhu proved that the girth condition given in (J Graph Theory 33 (2000), 14–24) is sharp. Short proofs of both results are given in this note. © 2010 Wiley Periodicals, Inc. J Graph Theory 66: 83–88, 2010

18.
The Moran process is a random process that models the spread of genetic mutations through graphs. On connected graphs, the process eventually reaches “fixation,” where all vertices are mutants, or “extinction,” where none are. Our main result is an almost‐tight upper bound on the expected absorption time. For all ε > 0, we show that the expected absorption time on an n‐vertex graph is o(n^(3+ε)); specifically, we give an explicit upper bound, and there is a family of graphs where the expected absorption time is Ω(n^3). In proving this, we establish a phase transition in the probability of fixation, depending on the mutants' fitness r. We show that no similar phase transition occurs for digraphs, where it is already known that the expected absorption time can be exponential. Finally, we give an improved fully polynomial randomized approximation scheme (FPRAS) for approximating the probability of fixation. On degree‐bounded graphs where some basic properties are given, its running time is independent of the number of vertices.
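The process itself is easy to simulate; the sketch below estimates the fixation probability empirically on a small graph (an illustration of our own with hypothetical names, not the paper's FPRAS).

```python
import random

def moran_fixation_probability(adjacency, r=1.5, trials=2000, seed=0):
    """Estimate the fixation probability of the Moran process on an undirected
    graph given as an adjacency list {v: [neighbours]}. A single mutant with
    fitness r (residents have fitness 1) is placed uniformly at random; each
    step, a vertex is chosen with probability proportional to fitness and
    copies its type onto a uniformly random neighbour.
    """
    rng = random.Random(seed)
    vertices = list(adjacency)
    fixations = 0
    for _ in range(trials):
        mutants = {rng.choice(vertices)}
        while 0 < len(mutants) < len(vertices):
            weights = [r if v in mutants else 1.0 for v in vertices]
            v = rng.choices(vertices, weights=weights, k=1)[0]
            u = rng.choice(adjacency[v])
            if v in mutants:
                mutants.add(u)
            else:
                mutants.discard(u)
        fixations += (len(mutants) == len(vertices))
    return fixations / trials

# Complete graph K_5; for regular graphs the fixation probability has the
# closed form (1 - 1/r) / (1 - 1/r^n), here about 0.38.
K5 = {v: [u for u in range(5) if u != v] for v in range(5)}
print(moran_fixation_probability(K5))
```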

19.
The first proof of the fundamental theorem of algebra, proposed by D'Alembert in 1746 and practically unknown to this day, stimulated a series of “analytic” proofs which made essential use of the properties of polynomials as analytic functions and placed the theorem within complex analysis. One of the simplest and most widely known modern proofs is formed from the “analytic” proofs of Gauss, Argand, Legendre and Cauchy, which used and developed the ideas of D'Alembert.

20.
Electoral control refers to attempts by an election's organizer (“the chair”) to influence the outcome by adding, deleting, or partitioning voters or candidates. The important paper of Bartholdi, Tovey, and Trick [1] that introduced (constructive) control proposes computational complexity as a means of resisting control attempts: look for election systems where the chair's task in seeking control is itself computationally infeasible. We introduce and study a method of combining two or more candidate‐anonymous election schemes in such a way that the combined scheme possesses all the resistances to control (i.e., all the NP‐hardnesses of control) possessed by any of its constituents: it combines their strengths. From this and from new resistance constructions, we prove for the first time that there exists a neutral, anonymous election scheme (whose winner problem is computable in polynomial time) that is resistant to all twenty standard types of electoral control (© 2009 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)
