Similar Documents
20 similar documents found (search time: 15 ms)
1.
Coupling procedures for Markov renewal processes are described. Applications to ergodic theorems for processes with semi-Markov switchings are considered. This paper was partly prepared with the support of NFR Grant F-UP 10257-300.

2.
Limit theorems for branching Markov processes
We establish almost sure limit theorems for a branching symmetric Hunt process in terms of the principal eigenvalue and the ground state of an associated Schrödinger operator. Here the branching rate and the branching mechanism can be state-dependent. In particular, the branching rate can be a measure belonging to a certain Kato class and is allowed to be singular with respect to the symmetrizing measure for the underlying Hunt process X. The almost sure limit theorems are established under the assumption that the associated Schrödinger operator of X has a spectral gap. Such an assumption is satisfied if the underlying process X is a Brownian motion, a symmetric α-stable-like process, or a relativistic symmetric stable process.

3.
We derive universal strong limit theorems for increments of compound renewal processes which unify the strong law of large numbers, Erdős–Rényi law, Csörgő–Révész law, and law of the iterated logarithm for such processes. New results are obtained under various moment assumptions on distributions of random variables generating the process. In particular, we study the case of distributions from domains of attraction of the normal law and completely asymmetric stable laws with index α ∈ (1, 2). Bibliography: 15 titles.
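As a purely illustrative aside (not taken from the paper above), the Python sketch below simulates a compound renewal process and numerically checks the strong-law normalization S(t)/t; the exponential inter-renewal times, normal jumps, and all parameter names are assumptions chosen only for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def compound_renewal(t_max, mean_gap=1.0, jump_mean=0.5, jump_sd=1.0):
    """Simulate a compound renewal process S on [0, t_max].

    Inter-renewal times are exponential and jumps are i.i.d. normal --
    illustrative choices only, not the distributional classes of the paper.
    Returns the renewal epochs and the running sums at those epochs.
    """
    times, sums = [0.0], [0.0]
    while times[-1] < t_max:
        times.append(times[-1] + rng.exponential(mean_gap))
        sums.append(sums[-1] + rng.normal(jump_mean, jump_sd))
    return np.array(times), np.array(sums)

# Strong law of large numbers: S(t) / t -> E[jump] / E[gap] = 0.5 here.
for t_max in (1e2, 1e3, 1e4):
    times, sums = compound_renewal(t_max)
    print(int(t_max), sums[-2] / times[-2])   # last epoch before t_max
```

Under these choices the printed ratio settles near 0.5 as t_max grows, in line with the strong law of large numbers that the universal increment theorems above refine.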

4.
5.
A necessary and sufficient condition for convergence of Markov processes is given. As a consequence we get a theorem concerning the convergence of Harris processes. This paper is a part of the author’s Ph.D. thesis to be submitted to the Hebrew University of Jerusalem. The author wishes to express his thanks to Professor S. R. Foguel for much valuable advice and encouragement.

6.
7.
Summary Let $X(t) = (X_1(t), X_2(t), \ldots, X_k(t))$ be a $k$-type ($2 \le k < \infty$) continuous time, supercritical, nonsingular, positively regular Markov branching process. Let $M(t) = ((m_{ij}(t)))$ be the mean matrix, where $m_{ij}(t) = E(X_j(t) \mid X_r(0) = \delta_{ir}$ for $r = 1, 2, \ldots, k)$, and write $M(t) = \exp(At)$. Let $\xi$ be an eigenvector of $A$ corresponding to an eigenvalue $\lambda$. Assuming second moments, this paper studies the limit behavior as $t \to \infty$ of the stochastic process $\xi \cdot X(t)$. It is shown that i) if $2\,\mathrm{Re}\,\lambda > \lambda_1$, then $\xi \cdot X(t)\,e^{-\lambda t}$ converges a.s. and in mean square to a random variable; ii) if $2\,\mathrm{Re}\,\lambda \le \lambda_1$, then $[\xi \cdot X(t)]\, f(v \cdot X(t))$ converges in law to a normal distribution, where $f(x) = x^{-1/2}$ if $2\,\mathrm{Re}\,\lambda < \lambda_1$ and $f(x) = (x \log x)^{-1/2}$ if $2\,\mathrm{Re}\,\lambda = \lambda_1$, with $\lambda_1$ the largest real eigenvalue of $A$ and $v$ the corresponding right eigenvector. Research supported in part under contracts N0014-67-A-0112-0015 and NIH USPHS 10452 at Stanford University.
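A minimal numerical sketch of the mean-matrix relation $M(t) = \exp(At)$ used above, assuming a toy 2-type matrix A and treating $\xi$ as a right eigenvector (one consistent reading of the abstract's dot products); the matrix, and the use of SciPy, are assumptions, not the paper's example.

```python
import numpy as np
from scipy.linalg import expm

# A toy 2-type matrix A (an assumption; it is not the paper's example).
A = np.array([[0.3, 0.2],
              [0.1, 0.4]])

t = 5.0
M = expm(A * t)                         # mean matrix M(t) = exp(At)

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)             # lambda_1, the largest real eigenvalue
lam1, xi = eigvals[i].real, eigvecs[:, i].real

# Since A @ xi = lam1 * xi, we have M(t) @ xi = exp(lam1 * t) * xi, so the
# projection xi . X(t), discounted by exp(-lam1 * t), has constant mean --
# the first-moment picture behind the a.s. convergence in case i).
print(np.allclose(M @ xi, np.exp(lam1 * t) * xi))
```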

8.
Translated from Problemy Ustoichivosti Stokhasticheskikh Modelei. Trudy Seminara, 1988, pp. 100–115.

9.
10.
Given a sequence $\mathcal{U} = \{U_n : n \in \omega\}$ of non-empty open subsets of a space $X$, a set $\{x_n : n \in \omega\}$ is a selection of $\mathcal{U}$ if $x_n \in U_n$ for every $n \in \omega$. We show that a space $X$ is uncountable if and only if every sequence of non-empty open subsets of $C_p(X)$ has a closed discrete selection. The same statement is not true for $C_p(X, [0,1])$, so we study when the above selection property (which we call discrete selectivity) holds in $C_p(X, [0,1])$. We prove, among other things, that $C_p(X, [0,1])$ is discretely selective if $X$ is an uncountable Lindelöf $\Sigma$-space. We also give a characterization, in terms of the topology of $X$, of discrete selectivity of $C_p(X, [0,1])$ if $X$ is an $\omega$-monolithic space of countable tightness.

11.
12.
13.
Summary The empirical measure $P_n$ for i.i.d. sampling from a distribution $P$ is formed by placing mass $n^{-1}$ at each of the first $n$ observations. Generalizations of the classical Glivenko–Cantelli theorem for empirical measures have been proved by Vapnik and Červonenkis using combinatorial methods. They found simple conditions on a class $\mathcal{C}$ to ensure that $\sup\{|P_n(C) - P(C)| : C \in \mathcal{C}\}$ converges in probability to zero. They used a randomization device that reduced the problem to finding exponential bounds on the tails of a hypergeometric distribution. In this paper an alternative randomization is proposed. The role of the hypergeometric distribution is thereby taken over by the binomial distribution, for which the elementary Bernstein inequalities provide exponential bounds on the tails. This leads to easier proofs of both the basic results of Vapnik–Červonenkis and the extensions due to Steele. A similar simplification is made in the proof of Dudley's central limit theorem for $n^{1/2}(P_n - P)$, a result that generalizes Donsker's functional central limit theorem for empirical distribution functions. This research was supported in part by the Air Force Office of Scientific Research, Contract No. F49620-79-C-0164.
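As an illustrative sketch (my own, not Pollard's), the uniform deviation $\sup_C |P_n(C) - P(C)|$ can be computed exactly for the simplest Vapnik–Červonenkis class, the half-lines $(-\infty, x]$, where it reduces to the Kolmogorov–Smirnov distance; the choice of a standard normal $P$ and the sample sizes are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def sup_deviation(sample):
    """sup_C |P_n(C) - P(C)| over the class of half-lines (-inf, x],
    i.e. the Kolmogorov-Smirnov distance to the true N(0, 1) distribution."""
    x = np.sort(sample)
    n = len(x)
    cdf = norm.cdf(x)
    # The supremum is attained at jump points of the empirical distribution.
    upper = np.max(np.arange(1, n + 1) / n - cdf)
    lower = np.max(cdf - np.arange(0, n) / n)
    return max(upper, lower)

for n in (100, 1_000, 10_000):
    print(n, sup_deviation(rng.standard_normal(n)))  # shrinks roughly like n**-0.5
```

The printed values shrink roughly like $n^{-1/2}$, the rate behind Dudley's central limit theorem mentioned at the end of the abstract.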

14.
Summary This paper applies renewal theoretic arguments and an elementary lemma on the transforms of convolutions to prove solidarity and limit theorems for semi-Markov processes. I wish to thank Dr. Vere-Jones for his helpful comments and suggestions.

15.
A Markov Renewal Process (M.R.P.) is a process similar to a Markov chain, except that the time required to move from one state to another is not fixed, but is a random variable whose distribution may depend on the two states between which the transition is made. For an M.R.P. of m (< ∞) states we derive a goodness-of-fit test for a hypothetical matrix of transition probabilities. This test is similar to the test Bartlett has derived for Markov chains. We calculate the first two moments of the test statistic and modify it to fit the moments of a standard χ². Finally, we illustrate the above procedure numerically for a particular case of a two-state M.R.P. Dwight B. Brock is a mathematical statistician, Office of Statistical Methods, National Center for Health Statistics, Rockville, Maryland. A. M. Kshirsagar is Associate Professor, Department of Statistics, Southern Methodist University. This research was partially supported by Office of Naval Research Contract No. N00014-68-A-0515 and by NIH Training Grant GM-951, both with Southern Methodist University. This article is partially based on Dwight B. Brock's Ph.D. dissertation accepted by Southern Methodist University.
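To make the setting concrete, here is a hedged sketch of the classical Pearson chi-square comparison of observed embedded-chain transition counts with a hypothesized matrix for a two-state M.R.P.; it is not the moment-corrected statistic derived in the paper, and the matrix P0, the path length, and the helper names are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def embedded_chain_chi2(states, P0):
    """Classical Pearson chi-square statistic comparing the observed
    transition counts of the embedded chain of an M.R.P. with a fully
    specified hypothesized matrix P0 (all entries assumed positive)."""
    m = P0.shape[0]
    counts = np.zeros((m, m))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    expected = counts.sum(axis=1, keepdims=True) * P0
    stat = ((counts - expected) ** 2 / expected).sum()
    df = m * (m - 1)                 # m rows, m - 1 free probabilities each
    return stat, chi2.sf(stat, df)

# Two-state example: simulate under the hypothesized matrix, then test it.
P0 = np.array([[0.3, 0.7],
               [0.6, 0.4]])
rng = np.random.default_rng(2)
path = [0]
for _ in range(500):
    path.append(rng.choice(2, p=P0[path[-1]]))
print(embedded_chain_chi2(np.array(path), P0))
```

Under the null this statistic is approximately χ² with m(m − 1) degrees of freedom; the paper's refinement is to match the statistic's first two moments to those of a standard χ² for the Markov renewal setting.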

16.
17.
The aim of this paper is to prove some limit theorems for Markov processes using only functional analytic methods. Some of our results were proved in [7], [8] and [5] by probabilistic methods. We prove in the Appendix a theorem on Markov processes that have no finite invariant measure. This paper is a part of the author’s Ph.D. thesis to be submitted to the Hebrew University of Jerusalem. The author wishes to express his thanks to Professor S. R. Foguel for much valuable advice and encouragement.

18.
Convergence of $\mu P^n(B)/\mu P^n(A)$ is established for a certain class of Markov operators $P$, where $\mu$ is a measure and $B$ is a subset of $A$. The results are proved under certain conditions on $P$ and the set $A$.
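A small numerical sketch of this kind of ratio convergence, using a finite stochastic matrix as a stand-in for the Markov operator P; the 3-state matrix, the sets A and B, and the initial measure are all illustrative assumptions, not the paper's setting.

```python
import numpy as np

# A 3-state stochastic matrix standing in for the Markov operator P
# (an illustrative assumption; the paper treats a general class of operators).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])
mu = np.array([1.0, 0.0, 0.0])        # an initial measure mu
A = [0, 1]                            # a set A ...
B = [0]                               # ... and a subset B of A

for n in range(1, 31):
    mu = mu @ P                       # mu P^n
    if n % 5 == 0:
        print(n, mu[B].sum() / mu[A].sum())   # the ratio stabilizes as n grows
```

In this finite example the limit is π(B)/π(A) for the stationary measure π; the paper gives conditions on P and the set A under which such ratio limits exist in a much more general setting.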

19.
20.
We consider a Markov chain with a general state space, but whose behavior is governed by finite matrices. After a brief exposition of the basic properties of this chain, its convenience as a model is illustrated by three limit theorems. The ergodic theorem, the central limit theorem, and an extreme-value theorem are expressed in terms of dominant eigenvalues of finite matrices and proved by simple matrix theory.
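As a hedged illustration of the "dominant eigenvalues of finite matrices" viewpoint, the sketch below computes the stationary distribution of a toy 3-state chain as the dominant left eigenvector and checks it against long-run occupation frequencies; the matrix and the simulation length are assumptions, not the paper's example.

```python
import numpy as np

# A toy 3-state transition matrix (an assumption, not the paper's example;
# the paper's chain has a general state space governed by finite matrices).
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])

eigvals, eigvecs = np.linalg.eig(P.T)     # left eigenvectors of P
k = np.argmax(eigvals.real)               # dominant eigenvalue (equal to 1)
pi = np.abs(eigvecs[:, k].real)
pi /= pi.sum()                            # stationary distribution

# Ergodic theorem, empirically: long-run occupation frequencies approach pi.
rng = np.random.default_rng(3)
x, counts = 0, np.zeros(3)
for _ in range(100_000):
    counts[x] += 1
    x = rng.choice(3, p=P[x])
print(pi)
print(counts / counts.sum())
```

The empirical frequencies converge to the distribution determined by the dominant (Perron) eigenvalue 1 of the finite matrix, which is the content of the ergodic theorem mentioned in the abstract.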
