Similar Literature
20 similar records found.
1.
In this paper we give explicit constructions of the Martin boundary for some discrete Markov chains. This construction is extended to the product of discrete Markov chains.

2.
3.
Summary  The error bound O(1/√n) is derived in the central limit theorem for partial sums Σj f(Xj), where (Xj) is a recurrent discrete Markov chain and f is a real-valued function on the state space. In particular it is shown that for bounded f and a starting distribution dominated by some multiple of the stationary one, it is sufficient for the chain to have recurrence times with third moments in order to get this bound.
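A rough numerical illustration of this setting (not from the paper; the two-state chain, the function f, and all parameters below are invented) simulates the normalized partial sums whose approximate normality the quoted error bound quantifies:

```python
# Illustrative sketch (not from the paper): simulate the CLT setting for
# partial sums S_n = sum_j f(X_j) of a two-state recurrent Markov chain.
# The chain, f, and all parameters below are invented for demonstration.
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])          # toy transition matrix
f = np.array([1.0, -1.0])           # real-valued function on the state space

# stationary distribution (left eigenvector of P for eigenvalue 1)
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()
mean_f = pi @ f

def partial_sum(n):
    """One realization of S_n - n*E_pi[f], starting from the stationary law."""
    x = rng.choice(2, p=pi)
    s = 0.0
    for _ in range(n):
        s += f[x] - mean_f
        x = rng.choice(2, p=P[x])
    return s

n, reps = 2000, 500
samples = np.array([partial_sum(n) for _ in range(reps)]) / np.sqrt(n)
print("sample mean (should be near 0):", samples.mean())
print("sample std (approximate asymptotic sigma):", samples.std())
```

The normalized sums cluster around a centered normal law; the paper's contribution is the rate at which this approximation becomes exact, which a simulation of this kind cannot establish.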

4.
5.
A class of models called interactive Markov chains is studied in both discrete and continuous time. These models were introduced by Conlisk and serve as a rich class for sociological modeling, because they allow for interactions among individuals. In discrete time, it is proved that the Markovian processes converge to a deterministic process almost surely as the population size becomes infinite. More importantly, the normalized process is shown to be asymptotically normal with specified mean vector and covariance matrix. In continuous time, the chain is shown to converge weakly to a diffusion process with specified drift and scale terms. The distributional results will allow for the construction of a likelihood function from interactive Markov chain data, so these results will be important for questions of statistical inference. An example from manpower planning is given which indicates the use of this theory in constructing and evaluating control policies for certain social systems.
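As a toy illustration of the convergence of the normalized process to a deterministic path (the two-state model and the transition functions p01, p10 below are invented, not Conlisk's specification), one can compare the empirical fraction of individuals in state 1 with the corresponding mean-field recursion as the population size N grows:

```python
# Toy illustration (not Conlisk's exact specification): a two-state
# interactive chain in which each individual's transition probabilities
# depend on the current fraction y of the population in state 1.
# The functional forms p01, p10 below are an invented example.
import numpy as np

rng = np.random.default_rng(1)

def p01(y):  # probability of moving 0 -> 1, increasing in y (imitation)
    return 0.1 + 0.6 * y

def p10(y):  # probability of moving 1 -> 0
    return 0.3 * (1.0 - y)

def simulate(N, steps, y0=0.2):
    """Empirical fraction in state 1 for a population of size N."""
    state = (rng.random(N) < y0).astype(int)
    path = [state.mean()]
    for _ in range(steps):
        y = state.mean()
        u = rng.random(N)
        move_up = (state == 0) & (u < p01(y))
        move_down = (state == 1) & (u < p10(y))
        state = state + move_up.astype(int) - move_down.astype(int)
        path.append(state.mean())
    return np.array(path)

def deterministic(steps, y0=0.2):
    """Mean-field recursion that the normalized process approaches as N grows."""
    y, path = y0, [y0]
    for _ in range(steps):
        y = y + (1 - y) * p01(y) - y * p10(y)
        path.append(y)
    return np.array(path)

steps = 30
det = deterministic(steps)
for N in (10, 100, 10_000):
    emp = simulate(N, steps)
    print(f"N={N:6d}  max |empirical - deterministic| = {np.abs(emp - det).max():.3f}")
```

The gap between the empirical and deterministic paths shrinks as N increases, in line with the almost-sure convergence stated above; the Gaussian fluctuation result describes the size of the remaining gap.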

6.
Markov chains are often used as mathematical models of natural phenomena, with transition probabilities defined in terms of parameters that are of interest in the scientific question at hand. Sensitivity analysis is an important way to quantify the effects of changes in these parameters on the behavior of the chain. Many properties of Markov chains can be written as simple matrix expressions, and hence matrix calculus is a powerful approach to sensitivity analysis. Using matrix calculus, we derive the sensitivity and elasticity of a variety of properties of absorbing and ergodic finite-state chains. For absorbing chains, we present the sensitivities of the moments of the number of visits to each transient state, the moments of the time to absorption, the mean number of states visited before absorption, the quasistationary distribution, and the probabilities of absorption in each of several absorbing states. For ergodic chains, we present the sensitivity of the stationary distribution, the mean first passage time matrix, the fundamental matrix, and the Kemeny constant. We include two examples of application of the results to demographic and ecological problems.
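For the stationary distribution of an ergodic chain, this kind of sensitivity computation can be sketched numerically with the classical first-order perturbation formula π(P+E) ≈ π(P) + π(P)EZ, where Z is the fundamental matrix and E has zero row sums. That single formula is assumed here for illustration and is not the paper's full matrix-calculus treatment; the 3-state chain and the perturbation are invented.

```python
# A minimal numerical sketch of stationary-distribution sensitivity for an
# ergodic chain, using the classical first-order perturbation formula
#   pi(P + E) ≈ pi(P) + pi(P) @ E @ Z,   Z = (I - P + 1·pi)^(-1),
# where E has zero row sums.  This is one standard result assumed here for
# illustration; the paper derives this and many further sensitivities with
# matrix calculus.  The 3-state chain and the perturbation are invented.
import numpy as np

def stationary(P):
    """Stationary distribution as the left eigenvector for eigenvalue 1."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

pi = stationary(P)
n = P.shape[0]
Z = np.linalg.inv(np.eye(n) - P + np.outer(np.ones(n), pi))  # fundamental matrix

# a small perturbation with zero row sums, so P + eps*E stays stochastic
E = np.array([[-0.1, 0.1, 0.0],
              [ 0.0, 0.0, 0.0],
              [ 0.05, 0.0, -0.05]])
eps = 1e-3

pi_exact = stationary(P + eps * E)
pi_first_order = pi + eps * (pi @ E @ Z)

print("exact       :", pi_exact)
print("first order :", pi_first_order)
print("discrepancy :", np.abs(pi_exact - pi_first_order).max())  # of order eps^2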

7.
8.
9.
Efficient algorithms for finding steady state probabilities are presented and compared with the Gaussian elimination method for two special classes of finite state Markov chains. One class has block matrix steps and a possible jump of up to k block steps, and the other is a generalization of the class considered by Shanthikumar and Sargent where each element is a matrix.
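The Gaussian-elimination baseline mentioned in the comparison amounts to solving the linear system π(I − P) = 0 together with the normalization Σ πi = 1. A minimal sketch follows, with an invented 4-state chain and without the paper's specialized block-structured algorithms:

```python
# Sketch of the generic Gaussian-elimination baseline the paper compares
# against: solve pi (I - P) = 0 together with sum(pi) = 1 by replacing one
# row of the transposed system with the normalization equation.  The
# specialized block-structured algorithms of the paper are not reproduced
# here; the 4-state chain is invented.
import numpy as np

def steady_state(P):
    n = P.shape[0]
    A = np.eye(n) - P.T          # A @ pi = 0 encodes pi (I - P) = 0
    A[-1, :] = 1.0               # replace the (redundant) last equation with sum(pi) = 1
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

P = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.25, 0.5, 0.25, 0.0],
              [0.0, 0.25, 0.5, 0.25],
              [0.0, 0.0, 0.5, 0.5]])

pi = steady_state(P)
print(pi, "residual:", np.abs(pi @ P - pi).max())
```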

10.
In principle it is possible to characterize the long run behavior of any evolutionary game by finding an analytical expression for its limit probability distribution. However, it is cumbersome to do so when the state space is large and the rate of mutation is significant. This paper gives upper and lower bounds for the limit distribution, which are easy to compute. The bounds are expressed in terms of the maximal and minimal row sums of parts of the transition matrix.

11.
We give simple proofs of large deviation theorems for the occupation measure of a Markov chain, using a regeneration argument to establish existence and convexity theory to identify the rate function.

12.
The isomorphism theorem of Dynkin is an important tool for investigating problems formulated in terms of local times of Markov processes. This theorem concerns continuous-time Markov processes. We give here an equivalent version for Markov chains.

13.
Motivated by the problem of finding a satisfactory quantum generalization of the classical random walks, we construct a new class of quantum Markov chains which are at the same time purely generated and uniquely determined by a corresponding classical Markov chain. We argue that this construction yields, as a corollary, a solution to the problem of constructing quantum analogues of classical random walks which are “entangled” in a sense specified in the paper. The formula giving the joint correlations of these quantum chains is obtained from the corresponding classical formula by replacing the usual matrix multiplication by Schur multiplication. The connection between Schur multiplication and entanglement is clarified by showing that these quantum chains are the limits of vector states whose amplitudes, in a given basis (e.g. the computational basis of quantum information), are complex square roots of the joint probabilities of the corresponding classical chains. In particular, when restricted to the projectors on this basis, the quantum chain reduces to the classical one. In this sense we speak of an entangled lifting, to the quantum case, of a classical Markov chain. Since random walks are particular Markov chains, our general construction also gives a solution to the problem that motivated our study. In view of possible applications to quantum statistical mechanics too, we prove that the ergodic type of an entangled Markov chain with finite state space (thus excluding random walks) is completely determined by the corresponding ergodic type of the underlying classical chain. Mathematics Subject Classification (2000): Primary 46L53, 60J99; Secondary 46L60, 60G50, 62B10.
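One statement above, that the amplitudes are complex square roots of the classical joint probabilities, can be checked on the classical side with a short sketch. The phases, the two-state chain and the initial law below are invented, and the operator-algebraic (purely generated) construction itself is not reproduced:

```python
# A classical-side check of one statement in the abstract: if the amplitude of
# a path (i0, ..., in) is a complex square root of its classical joint
# probability, then squared moduli reproduce the classical chain.  The phases,
# the 2-state chain and the initial law are invented.
import itertools
import numpy as np

rng = np.random.default_rng(2)

p0 = np.array([0.6, 0.4])                  # initial distribution
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])                 # classical transition matrix

# arbitrary phases: any choice gives a valid "complex square root"
phase0 = np.exp(1j * rng.uniform(0, 2 * np.pi, size=p0.shape))
phaseP = np.exp(1j * rng.uniform(0, 2 * np.pi, size=P.shape))

amp0 = phase0 * np.sqrt(p0)                # amplitudes for the initial state
ampP = phaseP * np.sqrt(P)                 # entrywise ("Schur-style") square root

def path_amplitude(path):
    a = amp0[path[0]]
    for i, j in zip(path[:-1], path[1:]):
        a *= ampP[i, j]
    return a

def classical_prob(path):
    p = p0[path[0]]
    for i, j in zip(path[:-1], path[1:]):
        p *= P[i, j]
    return p

n_steps = 3
for path in itertools.product(range(2), repeat=n_steps + 1):
    assert np.isclose(abs(path_amplitude(path)) ** 2, classical_prob(path))
print("squared amplitudes match the classical joint probabilities")
```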

14.
We give a generalization of a theorem of Csiszár and Körner for sequences of independent random variables; a corollary of that theorem concerns relative entropy and plays an important role in statistical hypothesis testing and coding theory. Using a strong law of large numbers for nonhomogeneous Markov chains, we extend the theorem to nonhomogeneous Markov chains.
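For reference, the relative entropy mentioned above is, in its standard finite form, D(p‖q) = Σ pi log(pi/qi). The snippet below implements only this textbook definition, with invented distributions, not the generalized theorem of the paper:

```python
# The relative entropy (Kullback-Leibler divergence) referred to above, in its
# standard finite form D(p || q) = sum_i p_i log(p_i / q_i).  The two
# distributions are invented; this is only the textbook definition.
import numpy as np

def relative_entropy(p, q):
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                        # 0 * log(0/q) is taken to be 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print("D(p||q) =", relative_entropy(p, q))   # >= 0, and 0 iff p == q
```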

15.
This paper develops bounds on the rate of decay of powers of Markov kernels on finite state spaces. These are combined with eigenvalue estimates to give good bounds on the rate of convergence to stationarity for finite Markov chains whose underlying graph has moderate volume growth. Roughly, for such chains, order (diameter)² steps are necessary and suffice to reach stationarity. We consider local Poincaré inequalities and use them to prove Nash inequalities. These are bounds on ℓ²-norms in terms of Dirichlet forms and ℓ¹-norms which yield decay rates for iterates of the kernel. This method is adapted from arguments developed by a number of authors in the context of partial differential equations and, later, in the study of random walks on infinite graphs. The main results do not require reversibility.

16.
17.
In this paper, subgeometric ergodicity is investigated for continuous-time Markov chains. Several equivalent conditions, based on the first hitting time or the drift function, are derived as the main theorem. In its corollaries, practical drift criteria are given for ?-ergodicity and computable bounds on subgeometric convergence rates are obtained for stochastically monotone Markov chains. These results are illustrated by examples.

18.
Let T be an irreducible n × n stochastic matrix with stationary distribution vector π. Set A = I − T, and define the quantity κ3(T) in terms of the matrices Aj, j = 1, …, n, the (n − 1) × (n − 1) principal submatrices of A obtained by deleting the jth row and column of A. Results of Cho and Meyer, and of Kirkland, show that κ3 provides a sensitive measure of the conditioning of π under perturbation of T. Moreover, a lower bound on κ3 is known. In this paper, we investigate the class of irreducible stochastic matrices T of order n for which κ3 attains this lower bound, for such matrices correspond to Markov chains with desirable conditioning properties. We identify some restrictions on the zero-nonzero patterns of such matrices, and construct several infinite classes of matrices for which κ3 is as small as possible.

19.
20.
Summary  This paper deals with asymptotic optimal inference in a time-continuous ergodic Markov chain with countable state space, based on observation of the process up to time t. Let the infinitesimal generator depend on an unknown parameter. Under weak assumptions on the parametrization, we show local asymptotic normality for the statistical model as t → ∞. As a consequence, limit distributions of sequences of competing estimators for the unknown parameter are more spread out than a specified normal distribution.
