Seymour’s distance two conjecture states that in any digraph there exists a vertex (a “Seymour vertex”) that has at least as many neighbors at distance two as it does at distance one. We explore the validity of probabilistic statements along lines suggested by Seymour’s conjecture, proving that almost surely there are a “large” number of Seymour vertices in random tournaments and “even more” in general random digraphs.
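The defining count is easy to check directly. A minimal sketch in Python (the generator and parameters are illustrative, not from the paper) that finds the Seymour vertices of a digraph, i.e. the vertices v with |N²(v)| ≥ |N¹(v)|:

```python
import random

def seymour_vertices(adj):
    """Return the vertices whose second out-neighborhood is at least as
    large as their first, i.e. |N2(v)| >= |N1(v)|; adj maps each vertex
    to its set of out-neighbors."""
    result = []
    for v, n1 in adj.items():
        # N2(v): vertices reachable in exactly two steps, excluding N1(v) and v
        n2 = set()
        for u in n1:
            n2.update(adj[u])
        n2 -= n1
        n2.discard(v)
        if len(n2) >= len(n1):
            result.append(v)
    return result

def random_tournament(n, rng):
    """Orient each edge of the complete graph K_n uniformly at random."""
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < 0.5:
                adj[u].add(v)
            else:
                adj[v].add(u)
    return adj

rng = random.Random(0)
T = random_tournament(50, rng)
print(len(seymour_vertices(T)))  # how many Seymour vertices this sample has
```

In a directed 3-cycle every vertex is a Seymour vertex (one out-neighbor at distance one, one at distance two), which makes a convenient sanity check.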
We introduce an equivalence class of varied properties for hypergraphs. Any hypergraph possessing any one of these properties must of necessity possess them all. Since almost all random hypergraphs share these properties, we term these properties quasi-random. With these results, it becomes quite easy to show that many natural explicit constructions result in hypergraphs which imitate random hypergraphs in a variety of ways.
We characterize the asymptotic collective behavior of rectangular random matrices whose sizes tend to infinity at different rates. It appears that one can compute the limits of all noncommutative moments (thus all spectral properties) of the random matrices we consider because, when embedded in a space of larger square matrices, independent rectangular random matrices are asymptotically free with amalgamation over a subalgebra. Therefore, we can define a “rectangular-free convolution”, which allows one to deduce the singular values of the sum of two large independent rectangular random matrices from the individual singular values. This convolution is linearized by cumulants and by an analytic integral transform, which we call the “rectangular R-transform”.
We study some properties of graphs (or, rather, graph sequences) defined by demanding that the number of subgraphs of a given type, with vertices in subsets of given sizes, approximately equals the number expected in a random graph. It has been shown by several authors that several such conditions are quasi-random, but that there are exceptions. In order to understand this better, we investigate some new properties of this type. We show that these properties too are quasi-random, at least in some cases; however, there are also cases that are left as open problems, and we discuss why the proofs fail in these cases. The proofs are based on the theory of graph limits and on the method and results developed by Janson (2011), which translate the combinatorial problem into an analytic problem, which in turn is translated into an algebraic one.
This paper introduces a new notion of a “procedural” value for cooperative TU games. A procedural value is determined by an underlying procedure of sharing marginal contributions to coalitions formed by players joining in random order. We consider procedures under which players can only share their marginal contributions with their predecessors in the ordering, and study the set of all resulting values. The most prominent procedural value is, of course, the Shapley value, obtained under the simplest procedure of every player just retaining his entire marginal contribution. But different sharing rules lead to other interesting values, including the “egalitarian solution” and the Nowak and Radzik “solidarity value”. All procedural values are efficient, symmetric and linear. Moreover, it is shown that these properties, together with two very natural monotonicity postulates, characterize the class of procedural values. Some possible modifications and generalizations are also discussed. In particular, it is shown that dropping one of the monotonicity axioms is equivalent to allowing for sharing marginal contributions with both predecessors and successors in the ordering.
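The simplest procedure mentioned above, every player keeping its entire marginal contribution, can be sketched directly: averaging marginal contributions over all joining orders recovers the Shapley value. The three-player game below is an illustrative toy example, not one taken from the paper:

```python
import itertools

def shapley(players, v):
    """Shapley value: average each player's marginal contribution v(S+{p}) - v(S)
    over all orders in which players can join the coalition."""
    phi = {p: 0.0 for p in players}
    orders = list(itertools.permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: x / len(orders) for p, x in phi.items()}

# Hypothetical game: a coalition is worth 1 exactly when it contains
# player 0 together with at least one other player.
v = lambda S: 1.0 if 0 in S and len(S) >= 2 else 0.0
print(shapley([0, 1, 2], v))
```

Here player 0 receives 2/3 and players 1 and 2 receive 1/6 each, and the payoffs sum to v(N) = 1, illustrating efficiency and symmetry.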
A modification of a scheme of allocating a random number of particles to cells is considered in which particles are allocated according to some multinomial scheme “in groups” with a random number of particles in each. The asymptotic properties of the distribution of the number of trials (allocated “groups” of particles) needed to fill a given number of cells are investigated.
Complete balanced Howell rotations are mathematical designs for highly structured paired-comparison experiments. They have been widely used in bridge tournaments, and many special cases are applicable to round-robin tournaments for all kinds of sports. In this paper we construct complete balanced Howell rotations of 8k + 5 teams, for 8k + 5 a prime power greater than 5. We also show that “table balanced” complete balanced Howell rotations for 8k + 5 teams do not exist unless 8k + 5 is the sum of two squares of integers.
Haviland and Thomason and Chung and Graham were the first to investigate systematically some properties of quasi-random hypergraphs. In particular, in a series of articles, Chung and Graham considered several quite disparate properties of random-like hypergraphs of density 1/2 and proved that they are in fact equivalent. The central concept in their work turned out to be the so-called deviation of a hypergraph. They proved that having small deviation is equivalent to a variety of other properties that describe quasi-randomness. In this paper, we consider the concept of discrepancy for k-uniform hypergraphs with an arbitrary constant density d (0<d<1) and prove that the condition of having asymptotically vanishing discrepancy is equivalent to several other quasi-random properties, similar to the ones introduced by Chung and Graham. In particular, we prove that the correct “spectrum” of the s-vertex subhypergraphs is equivalent to quasi-randomness for any s⩾2k. Our work may be viewed as a continuation of the work of Chung and Graham, although our proof techniques are different in certain important parts.
We characterize the hamiltonian tournaments that admit exactly one spanning cycle, as well as the hamiltonian tournaments with the least number of 3-cycles, in terms of their hamiltonian subtournaments having the same respective properties.
We show that if a graph G has the property that all subsets of vertices of size n/4 contain the “correct” number of triangles one would expect to find in a random graph G(n, 1/2), then G behaves like a random graph, that is, it is quasi-random in the sense of Chung, Graham, and Wilson [6]. This answers positively an open problem of Simonovits and Sós [10], who showed that in order to deduce that G is quasi-random one needs to assume that all sets of vertices have the correct number of triangles. A similar improvement of [10] is also obtained for any fixed graph other than the triangle, and for any edge density other than 1/2. The proof relies on a theorem of Gottlieb [7] in algebraic combinatorics, concerning the rank of set inclusion matrices.
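The property in question can be probed empirically. The sketch below (illustrative sizes: n = 40, subsets of size n/4 = 10, 20 sampled subsets) samples a graph from G(n, 1/2) and compares triangle counts in random vertex subsets with the expected count C(m, 3) · (1/2)³:

```python
import itertools
import random

def triangles_in(S, edges):
    """Count the triangles of the subgraph spanned by the vertex set S,
    where edges is a set of pairs (u, v) with u < v."""
    return sum(1 for a, b, c in itertools.combinations(sorted(S), 3)
               if (a, b) in edges and (a, c) in edges and (b, c) in edges)

rng = random.Random(1)
n = 40
# Sample G(n, 1/2): each pair becomes an edge independently with probability 1/2
edges = {(u, v) for u, v in itertools.combinations(range(n), 2)
         if rng.random() < 0.5}

m = n // 4                                    # subset size n/4, as in the property
expected = (m * (m - 1) * (m - 2) / 6) / 8    # C(m, 3) * (1/2)^3
counts = [triangles_in(rng.sample(range(n), m), edges) for _ in range(20)]
print(expected, sum(counts) / len(counts))
```

For a quasi-random G the sampled counts should concentrate near the expectation (15 for these sizes), while a structured counterexample would show systematic deviations.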
For a stationary Gaussian process either almost all sample paths are almost everywhere differentiable or almost all sample paths are almost nowhere differentiable. In this paper it is shown by means of an example involving a random lacunary trigonometric series that “almost everywhere differentiable” and “almost nowhere differentiable” cannot in general be replaced by “everywhere differentiable” and “nowhere differentiable”, respectively.
When attributes are rare and few or none are observed in the selected sample from a finite universe, sampling statisticians are increasingly challenged to use whatever methods are available to declare with high probability or confidence that the universe is nearly or completely attribute-free. This is especially true when the attribute is undesirable. Approximations such as those based on normal theory are frequently inadequate with rare attributes. For simple random sampling without replacement, an appropriate probability distribution for statistical inference is the hypergeometric distribution. But even with the hypergeometric distribution, the investigator is limited in claiming with high confidence that the universe is attribute-free unless the sample size is quite large, when using nonrandomized techniques. For students of statistical theory, this short article seeks to revive the question of the relevance of randomized methods. When comparing methods for constructing confidence bounds in discrete settings, randomization methods are useful for fixing like confidence levels and hence facilitating the comparisons. Under simple random sampling, this article defines and presents a simple algorithm for the construction of exact “randomized” upper confidence bounds, which may permit one to report tighter bounds than the exact bounds obtained using “nonrandomized” methods. A general theory of exact randomized confidence bounds is presented in Lehmann (1959, p. 81), but Lehmann's development requires more mathematical machinery than this application demands. Not only is the development of these “randomized” bounds elementary, but their desirable properties and their link with the usual nonrandomized bounds are easy to see with the presented approach, which leads to the same results as the method of Lehmann.
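Under the setup described (simple random sampling without replacement, hypergeometric model), one standard construction of such bounds can be sketched as follows: with an auxiliary uniform variate u, take the largest attribute count M for which P(X < x) + u·P(X = x) ≥ α, where x is the observed count; setting u = 1 recovers the usual exact nonrandomized upper bound, and u < 1 can only tighten it. The function names, parameters, and the closing example are illustrative, not the paper's notation:

```python
import math

def hyper_pmf(k, N, M, n):
    """P(X = k) for X ~ Hypergeometric(population N, M marked, sample n)."""
    if k < 0 or k > min(M, n) or n - k > N - M:
        return 0.0
    return math.comb(M, k) * math.comb(N - M, n - k) / math.comb(N, n)

def upper_bound(N, n, x, alpha, u=1.0):
    """Largest M with P(X < x) + u * P(X = x) >= alpha.
    u = 1.0 gives the exact nonrandomized 1 - alpha upper bound;
    drawing u ~ Uniform(0, 1) gives the randomized bound."""
    best = x  # M can never be smaller than the observed count
    for M in range(x, N + 1):
        tail = sum(hyper_pmf(k, N, M, n) for k in range(x))
        if tail + u * hyper_pmf(x, N, M, n) >= alpha:
            best = M
    return best

# Illustrative numbers: population 1000, sample 100, zero attributes observed
print(upper_bound(1000, 100, 0, 0.05), upper_bound(1000, 100, 0, 0.05, u=0.5))
```

With x = 0 the randomized condition reduces to u·P(X = 0 | M) ≥ α, so any u < 1 yields a bound no larger than the nonrandomized one, which is the tightening effect the article discusses.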
Simulated Annealing and Genetic Algorithms are important methods for solving discrete optimization problems and are often used to find approximate solutions to diverse NP-complete problems. They depend on randomness to change their current configuration and transition to a new state. In Simulated Annealing, the random choice influences the construction of the new state as well as the acceptance of that new state. In Genetic Algorithms, selection, mutation, and crossover depend on random choices. We experimentally investigate the robustness of the two generic search heuristics when using pseudorandom numbers of limited quality. To this end, we conducted experiments with linear congruential generators of various period lengths, a Mersenne Twister with artificially reduced period lengths, as well as quasi-random numbers as the source of randomness. Both heuristics were used to solve several instances of the Traveling Salesman Problem in order to compare optimization results. Our experiments show that both Simulated Annealing and the Genetic Algorithm produce inferior solutions when using random numbers with small period lengths or quasi-random numbers of inappropriate dimension. The influence on Simulated Annealing, however, is more severe than on Genetic Algorithms. Interestingly, we found that when using diverse quasi-random sequences, the Genetic Algorithm outperforms its own results obtained using quantum random numbers.
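As an illustration of this kind of experimental setup, a minimal sketch: a small linear congruential generator drives a 2-opt simulated annealer on a toy TSP instance whose points lie on a circle, so the optimal tour length is known. The LCG parameters, cooling schedule, and instance are illustrative, not those used in the experiments:

```python
import math

class LCG:
    """Minimal linear congruential generator; shrinking the modulus m is one
    way to obtain the short periods studied above (parameters illustrative)."""
    def __init__(self, seed, a=1103515245, c=12345, m=2**31):
        self.state = seed % m
        self.a, self.c, self.m = a, c, m
    def random(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state / self.m
    def randrange(self, n):
        return int(self.random() * n)

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def anneal(dist, rng, steps=20000, t0=10.0, cooling=0.9995):
    """Simulated annealing with 2-opt segment reversals; rng supplies both the
    move construction and the acceptance decision."""
    n = len(dist)
    tour = list(range(n))
    best = cur = tour_length(tour, dist)
    t = t0
    for _ in range(steps):
        i, j = sorted((rng.randrange(n), rng.randrange(n)))
        if i == j:
            continue
        new = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt reversal
        delta = tour_length(new, dist) - cur
        if delta < 0 or rng.random() < math.exp(-delta / t):  # Metropolis rule
            tour, cur = new, cur + delta
            best = min(best, cur)
        t *= cooling
    return best

# Toy instance: points on a unit circle, visited in a scrambled order so the
# starting tour is poor; the optimum is the regular n-gon perimeter.
n = 12
order = [(5 * k) % n for k in range(n)]
pts = [(math.cos(2 * math.pi * j / n), math.sin(2 * math.pi * j / n)) for j in order]
dist = [[math.hypot(px - qx, py - qy) for qx, qy in pts] for px, py in pts]
print(anneal(dist, LCG(seed=42)))
```

Swapping in generators of different period lengths (or quasi-random sequences) and comparing the returned tour lengths reproduces the shape of the comparison described above in miniature.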
A new method is developed for the statistical mechanics of composite materials, the generalized self-adjustment method, which makes it possible to reduce the problem of predicting the effective elastic properties of composites with random structures to the solution of two simpler “averaged” problems of an inclusion with transitional layers in a medium with the desired effective elastic properties. The inhomogeneous elastic properties and dimensions of the transitional layers take into account both the “approximate” order of mutual positioning and the variation in the dimensions and elastic properties of inclusions, through appropriate special averaged indicator functions of the random structure of the composite. A numerical calculation of the averaged indicator functions and effective elastic characteristics is performed by the generalized self-adjustment method for a unidirectional fiberglass on the basis of various models of actual random structures in the plane of isotropy.
Fuzzy random variables were introduced by Puri and Ralescu as an extension of random sets. In this paper, we first introduce a real-valued generalized measure of the “relative variation” (or inequality) associated with a fuzzy random variable. This measure is inspired by Csiszár's f-divergence and extends many well-known inequality indices to fuzzy random variables. To guarantee certain relevant properties of this measure, we have to distinguish two main families of measures, which will be characterized. Then the fundamental properties are derived, and an outstanding measure in each family is separately examined on the basis of an additive decomposition property and an additive decomposability property. Finally, two examples illustrate the application of the study in this paper.
Numerous problems in numerical analysis, including matrix inversion, eigenvalue calculations, and polynomial zero finding, share the following property: the difficulty of solving a given problem is large when the distance from that problem to the nearest “ill-posed” one is small. For example, the closer a matrix is to the set of noninvertible matrices, the larger its condition number with respect to inversion. We show that the sets of ill-posed problems for matrix inversion, eigenvalue problems, and polynomial zero finding all have a common algebraic and geometric structure which lets us compute the probability distribution of the distance from a “random” problem to the set. From this probability distribution we derive, for example, the distribution of the condition number of a random matrix. We examine the relevance of this theory to the analysis and construction of numerical algorithms destined to be run in finite-precision arithmetic.
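The inversion example admits a self-contained illustration: for a 2 × 2 matrix the singular values have a closed form, the distance (in spectral norm) to the nearest singular matrix equals σ_min, and the condition number σ_max/σ_min grows like the reciprocal of that distance. The matrices below are illustrative:

```python
import math

def singular_values_2x2(a, b, c, d):
    """Closed-form singular values of [[a, b], [c, d]], using
    s1^2 + s2^2 = ||A||_F^2 and s1 * s2 = |det A|."""
    e = a * a + b * b + c * c + d * d
    det = a * d - b * c
    disc = math.sqrt(max(e * e - 4 * det * det, 0.0))
    s1 = math.sqrt((e + disc) / 2)
    s2 = math.sqrt(max((e - disc) / 2, 0.0))   # guard tiny negative round-off
    return s1, s2

# As diag(1, eps) approaches the nearest singular matrix (distance eps),
# the condition number s1 / s2 blows up like 1 / eps.
for eps in (1.0, 0.1, 0.01):
    s1, s2 = singular_values_2x2(1.0, 0.0, 0.0, eps)
    print(eps, s1 / s2)
```

Sampling the entries at random and histogramming s1/s2 would give a small-scale empirical version of the condition-number distribution discussed in the abstract.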