Similar Literature
20 similar documents found.
1.
The practical usefulness of Markov models and Markovian decision processes has been severely limited by their extremely large dimension. A reduced model that does not sacrifice significant accuracy is therefore of considerable interest.

The long-run behaviour of a homogeneous finite Markov chain is determined by its persistent states, obtained after the decomposition of the state space into classes of communicating states. In this paper we present a new reduction method for the ergodic classes formed by such persistent states. An ergodic class has a steady state that is independent of the initial distribution; it constitutes an irreducible finite ergodic Markov chain, which evolves independently once the chain has been captured by the class.

The reduction is made according to the significance of the steady-state probabilities. To be treatable by this method, the ergodic chain must have the Two-Time-Scale property.

The presented reduction method is approximate. We begin by arranging the states of the irreducible Markov chain in decreasing order of their steady-state probabilities. The Two-Time-Scale property of the chain then justifies the approximation underlying the reduction: the ergodic class is reduced to its stronger part, which contains the most important events and also evolves on the slower time scale. The reduced system retains the stochastic property, so it remains a Markov chain.
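As a rough illustration of this kind of ordering-based reduction (a minimal sketch, not the paper's exact construction), the code below computes the stationary distribution of a small irreducible chain, keeps the states that carry most of the steady-state mass, and renormalizes the transition matrix on that subset; the mass threshold keep_mass, the example matrix, and the row renormalization are illustrative assumptions.

```python
import numpy as np

def stationary_distribution(P):
    """Left eigenvector of P for eigenvalue 1, normalized to a probability vector."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

def reduce_chain(P, keep_mass=0.9):
    """Keep the states carrying keep_mass of the steady-state probability
    and renormalize the rows of the restricted transition matrix."""
    pi = stationary_distribution(P)
    order = np.argsort(pi)[::-1]                    # states in decreasing order of pi
    cum = np.cumsum(pi[order])
    kept = order[: np.searchsorted(cum, keep_mass) + 1]
    Q = P[np.ix_(kept, kept)]
    Q = Q / Q.sum(axis=1, keepdims=True)            # renormalize -> stochastic again
    return kept, Q

# Example: state 2 is rarely visited and carries little steady-state mass.
P = np.array([[0.70, 0.25, 0.05],
              [0.30, 0.65, 0.05],
              [0.50, 0.40, 0.10]])
kept, Q = reduce_chain(P)
print(kept)   # the retained "stronger part"
print(Q)      # reduced transition matrix, still stochastic
```

Because the rows of the reduced matrix are renormalized, the reduced system is again stochastic, matching the abstract's remark that the result remains a Markov chain.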

2.
Previous results describing the generalization ability of the Empirical Risk Minimization (ERM) algorithm are usually based on the assumption of independent and identically distributed (i.i.d.) samples. In this paper we go beyond this classical framework by establishing the first exponential bound on the rate of uniform convergence of the ERM algorithm with V-geometrically ergodic Markov chain samples. As an application of this bound, we also obtain generalization bounds for the ERM algorithm with V-geometrically ergodic Markov chain samples and prove that the ERM algorithm with such samples is consistent. The main results extend the previously known results for i.i.d. observations to the case of V-geometrically ergodic Markov chain samples.
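As a toy illustration of ERM on dependent samples generated by an ergodic Markov chain (a sketch only: the two-state chain, the finite class of constant-pair predictors, and the squared loss are assumptions, not the paper's setting), consider:

```python
import numpy as np

rng = np.random.default_rng(0)

# A two-state Markov chain on {0, 1}; it is uniformly ergodic,
# hence V-geometrically ergodic with a bounded V.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def sample_chain(n, x0=0):
    xs = [x0]
    for _ in range(n - 1):
        xs.append(rng.choice(2, p=P[xs[-1]]))
    return np.array(xs)

# Dependent training data: the label is the current state plus noise.
x = sample_chain(2000)
y = x + 0.3 * rng.standard_normal(x.size)

# A finite hypothesis class: predictors of the form h(x) = a if x == 0 else b.
candidates = [(a, b) for a in np.linspace(-1, 2, 13) for b in np.linspace(-1, 2, 13)]

def empirical_risk(h):
    a, b = h
    pred = np.where(x == 0, a, b)
    return np.mean((pred - y) ** 2)            # empirical squared-loss risk

h_hat = min(candidates, key=empirical_risk)    # the empirical risk minimizer
print("ERM estimate:", h_hat)                  # close to the true pair (0, 1)
```

The point of bounds of the kind described above is that, despite the dependence along the chain, the empirical risk of such a minimizer converges uniformly to the true risk at an exponential rate.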

3.
洪沆 《数学杂志》2015,35(5):1259-1268
This paper studies the strong ergodicity of single chains in random environments. Sufficient conditions for strong ergodicity of the single chain are obtained, together with several formulations equivalent to strong ergodicity. Using the martingale convergence theorem, the structure of the tail σ-field under strong ergodicity of the single chain is described. Finally, under the condition of a stationary environment, the relations among strong ergodicity, triviality of the tail σ-field, and weak ergodicity are established, extending the corresponding results of classical Markov chain theory.

4.
A key result underlying the theory of MCMC is that any η-irreducible Markov chain having a transition density with respect to η and possessing a stationary distribution π is automatically positive Harris recurrent. This paper provides a short self-contained proof of this fact using the ergodic theorem in its standard form as the most advanced tool.

5.
The central limit theorem for Markov chains started at a point  (Cited by 2; self-citations: 0; citations by others: 2)
The aim of this paper is to prove a central limit theorem and an invariance principle for an additive functional of an ergodic Markov chain on a general state space, with respect to the law of the chain started at a point. No irreducibility assumption nor mixing conditions are imposed; the only assumption is a growth condition on the L^2-norms of the ergodic sums of the function generating the additive functional. The result holds almost surely with respect to the invariant probability of the chain. Received: 17 October 2001 / Revised version: 5 April 2002 / Published online: 24 October 2002. Mathematics Subject Classification (2000): 60F05, 60J05.

6.
The ergodic theory of Markov chains in random environments  (Cited by 70; self-citations: 0; citations by others: 70)
A general formulation of the stochastic model for a Markov chain in a random environment is given, including an analysis of the dependence relations between the environmental process and the controlled Markov chain, in particular the problem of feedback. Assuming stationary environments, the ergodic theory of Markov processes is applied to give conditions for the existence of finite invariant measures (equilibrium distributions) and to obtain ergodic theorems, which provide results on the convergence of products of random stochastic matrices. Coupling theory is used to obtain results on the direct convergence of these products and the structure of the tail σ-field. State properties, including classification and communication properties, are discussed.

7.
This paper considers the existence and uniqueness of stationary distributions for countable-state, discrete-time, homogeneous Markov chains. Most of the earlier literature requires the chain to be irreducible, positive recurrent and aperiodic (that is, ergodic); here we only require the chain to be irreducible and positive recurrent (it may be periodic, and hence non-ergodic). Under this weaker condition we not only give a concise proof of the existence and uniqueness of the stationary distribution, but also provide a method for computing it.
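A small numerical illustration of this situation (a sketch with a made-up period-2 example, not the computational method of the paper): the chain below is irreducible and positive recurrent but periodic, so it is not ergodic, yet πP = π still has a unique probability solution.

```python
import numpy as np

# Irreducible, positive recurrent, but periodic (period 2) chain:
# from state 0 it moves to {1, 2}, and from {1, 2} it returns to 0.
P = np.array([[0.0, 0.5, 0.5],
              [1.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])

# Solve pi P = pi together with sum(pi) = 1 as an overdetermined linear system.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones((1, n))])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)   # [0.5, 0.25, 0.25] -- the unique stationary distribution

# Note: the powers of P do not converge because of periodicity,
# but the stationary distribution above is still unique.
```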

8.
A homogeneous Markov chain on a countable state space can be classified as ergodic, geometrically ergodic, or strongly ergodic. Ergodicity and strong ergodicity have previously been characterized in terms of a coefficient of ergodicity; in this paper the same coefficient is used to characterize geometric ergodicity.

9.
We extend the central limit theorem for additive functionals of a stationary, ergodic Markov chain with normal transition operator due to Gordin and Lifšic, 1981 [A remark about a Markov process with normal transition operator, In: Third Vilnius Conference on Probability and Statistics 1, pp. 147–148] to continuous-time Markov processes with normal generators. As examples, we discuss random walks on compact commutative hypergroups as well as certain random walks on non-commutative, compact groups.

10.
Continuing the work of [1], this paper further studies generalized birth–death Markov chains. The distribution of the downward first-passage time and its moments of all orders are obtained, a necessary and sufficient condition for the chain to be ergodic is given together with a formula for the mean return time, and, under the ergodicity condition, the stationary distribution is derived.
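For orientation, the stationary distribution of an ordinary (not generalized) birth–death chain can be computed from the standard product formula π_k / π_0 = ∏_{i<k} λ_i / μ_{i+1}; the sketch below does this for illustrative, made-up rates.

```python
import numpy as np

def birth_death_stationary(lam, mu):
    """Stationary distribution of a finite birth-death chain.
    lam[i] = birth rate from state i     (i = 0 .. n-2)
    mu[i]  = death rate from state i + 1 (i = 0 .. n-2)
    Uses pi_k / pi_0 = prod_{i < k} lam[i] / mu[i]."""
    n = len(lam) + 1
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * lam[k - 1] / mu[k - 1]
    return w / w.sum()

lam = [1.0, 1.0, 1.0, 1.0]   # birth rates (drift toward 0 since mu > lam)
mu  = [2.0, 2.0, 2.0, 2.0]   # death rates
print(birth_death_stationary(lam, mu))   # geometric-like, concentrated near state 0
```

For infinite birth–death chains, summability of these product weights is the classical necessary and sufficient condition for ergodicity.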

11.
The members of a set of conditional probability density functions are called compatible if there exists a joint probability density function that generates them. We generalize this concept by calling the conditionals functionally compatible if there exists a non-negative function that behaves like a joint density as far as generating the conditionals according to the probability calculus, but whose integral over the whole space is not necessarily finite. A necessary and sufficient condition for functional compatibility is given that provides a method of calculating this function, if it exists. A Markov transition function is then constructed using a set of functionally compatible conditional densities and it is shown, using the compatibility results, that the associated Markov chain is positive recurrent if and only if the conditionals are compatible. A Gibbs Markov chain, constructed via “Gibbs conditionals” from a hierarchical model with an improper posterior, is a special case. Therefore, the results of this article can be used to evaluate the consequences of applying the Gibbs sampler when the posterior's impropriety is unknown to the user. Our results cannot, however, be used to detect improper posteriors. Monte Carlo approximations based on Gibbs chains are shown to have undesirable limiting behavior when the posterior is improper. The results are applied to a Bayesian hierarchical one-way random effects model with an improper posterior distribution. The model is simple, but also quite similar to some models with improper posteriors that have been used in conjunction with the Gibbs sampler in the literature.
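A classical toy instance of this phenomenon (used here purely for illustration; it is not the article's random-effects model) takes the conditionals X | Y = y ~ Exponential(rate y) and Y | X = x ~ Exponential(rate x). They are functionally compatible, being generated by f(x, y) ∝ exp(−xy) on (0, ∞)², but f is not integrable, so no proper joint density exists and the two-step Gibbs chain is not positive recurrent.

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_exponential(n_iter, x0=1.0):
    """Alternate draws from X | Y = y ~ Exp(y) and Y | X = x ~ Exp(x).
    These conditionals are functionally compatible (f(x, y) ~ exp(-x*y))
    but not compatible, so the chain has no stationary probability law."""
    x = x0
    xs = np.empty(n_iter)
    for t in range(n_iter):
        y = rng.exponential(scale=1.0 / x)   # Y | X = x has rate x
        x = rng.exponential(scale=1.0 / y)   # X | Y = y has rate y
        xs[t] = x
    return xs

xs = gibbs_exponential(20000)
# Ergodic averages fail to settle: compare the beginning and end of the run.
print(np.median(xs[:1000]), np.median(xs[-1000:]))
```

This is the kind of Gibbs chain, built from functionally compatible but incompatible conditionals, whose limiting behavior the article characterizes.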

12.
Consider a Markov additive chain (V,Z) with a negative horizontal drift on a half-plane. We provide the limiting distribution of Z when V passes a threshold for the first time, as V tends to infinity. Our contribution is to allow the Markovian part of an associated twisted Markov chain to be null recurrent or transient. The positive recurrent case was treated by Kesten [Ann. Probab. 2 (1974), 355–386]. Moreover, a ratio limit will be established for a transition kernel with unbounded jumps.

13.
Potential theory for ergodic Markov chains (with a discrete state space and a continuous parameter) is developed in terms of the fundamental matrix of a chain. A notion of an ergodic potential for a chain is introduced and a form of the Riesz decomposition theorem for measures is proved. Ergodic potentials of charges (with total charge zero) are shown to play the role of Green potentials for transient chains.
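For reference, in the discrete-time, finite-state setting (a simplifying assumption; the paper works with a continuous parameter) the fundamental matrix of an ergodic chain is commonly taken as Z = (I − P + 1π)⁻¹, and quantities such as mean first-passage times can be read off from it:

```python
import numpy as np

def fundamental_matrix(P):
    """Kemeny-Snell fundamental matrix Z = (I - P + 1*pi)^{-1} of an ergodic chain."""
    n = P.shape[0]
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    pi = pi / pi.sum()
    Z = np.linalg.inv(np.eye(n) - P + np.outer(np.ones(n), pi))
    return Z, pi

P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
Z, pi = fundamental_matrix(P)
# Mean first-passage times: m_ij = (z_jj - z_ij) / pi_j  (zero on the diagonal).
M = (np.diag(Z)[None, :] - Z) / pi[None, :]
print(pi)
print(M)
```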

14.
He  Qi-Ming  Li  Hui 《Queueing Systems》2003,44(2):137-160
In this paper, we study the stability conditions of the MMAP[K]/G[K]/1/LCFS preemptive repeat queue. We introduce an embedded Markov chain of matrix M/G/1 type with a tree structure and identify conditions for the Markov chain to be ergodic. First, we present three conventional methods for the stability problem of the queueing system of interest. These methods are either computationally demanding or do not provide accurate information for system stability. Then we introduce a novel approach that develops two linear programs whose solutions provide sufficient conditions for stability or instability of the queueing system. The new approach is numerically efficient. The advantages and disadvantages of the methods introduced in this paper are analyzed both theoretically and numerically.

15.
Let (X, μ, T) be an ergodic dynamic system and let $\xi = (C_1, C_2, \dots)$ be a discrete decomposition of X. Conditions are considered for the existence almost everywhere of $$\mathop {\lim }\limits_{n \to \infty } \frac{1}{n}\left| {\log \mu (C_{\xi ^n } (x))} \right|,$$ where $C_{\xi^n}(x)$ is the element of the decomposition $\xi^n = \xi \vee T\xi \vee \cdots \vee T^{n-1}\xi$ containing x. It is proved that the condition H(ξ) < ∞ is close to being necessary. If T is a Markov automorphism and ξ is the decomposition into states, then the limit exists, even if H(ξ) = ∞, and is equal to the entropy of the chain.
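For orientation, the entropy of the chain mentioned at the end of the abstract is, for a stationary Markov chain with transition probabilities $p_{ij}$ and stationary distribution $(\pi_i)$, the usual quantity $$h = -\sum_i \pi_i \sum_j p_{ij} \log p_{ij},$$ while the partition entropy for the decomposition into states is $H(\xi) = -\sum_i \pi_i \log \pi_i$ (standard formulas, stated here only as background to the abstract's final remark).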

16.
朱志锋  张绍义 《数学学报》2019,62(2):287-292
This paper studies the exponential ergodicity of Markov chains on a general state space. For an exponentially ergodic Markov chain, under the additional condition π(f^p) < ∞ for some p > 1, a coupling argument shows that there exists a full absorbing set on which the chain is f-exponentially ergodic.

17.
Let {T(t)}_{t≥0} be a strongly continuous semi-group of Markov operators on C(X) with generator G. If m ∈ C(X) is strictly positive, mG generates a semi-group. If {T(t)} is a group given by a flow, m may have isolated zeros and, under some regularity conditions, mG will still generate a flow, constructed explicitly. The connection between some ergodic properties of the new and original flow is studied. For the Markov semi-groups, the new one is strongly ergodic if and only if the original one is strongly ergodic. Our thanks to Howard Levine for pointing out the literature on multiplication perturbations of semi-group generators and to the referee whose comments enabled us to considerably shorten the argument for Theorem 2.1(b). Research partly supported by NSF grant GP 34118.

18.
Decision-making under uncertainty and imprecision for real-world problems is a complex task. In this paper general finite-state fuzzy Markov chains are introduced that converge in finitely many steps to a stationary (possibly periodic) solution. The Cesàro average and the potential for fuzzy Markov chains are defined, and it is shown that the relationship between them corresponds to the Blackwell formula in the classical theory of Markov decision processes. Furthermore, it is pointed out that recurrence does not necessarily imply ergodicity. However, if a fuzzy Markov chain is ergodic, then the rows of its ergodic projection equal the greatest eigen fuzzy set of the transition matrix. The fuzzy Markov chain is then shown to be a robust system with respect to small perturbations of the transition matrix, which is not the case for classical probabilistic Markov chains. Fuzzy Markov decision processes are finally introduced and discussed.
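As a small illustration of finite convergence under one common convention for fuzzy Markov chains (max–min composition; this convention and the example matrix are assumptions and need not match the paper's exact definitions), the sketch below iterates the fuzzy transition matrix until its powers repeat:

```python
import numpy as np

def maxmin(A, B):
    """Max-min composition of two fuzzy relations (entries in [0, 1])."""
    # result[i, k] = max_j min(A[i, j], B[j, k])
    return np.max(np.minimum(A[:, :, None], B[None, :, :]), axis=1)

def fuzzy_powers(P, max_steps=50):
    """Compute P, P^2, ... under max-min composition until a power repeats."""
    seen = [P]
    for _ in range(max_steps):
        nxt = maxmin(seen[-1], P)
        for k, Q in enumerate(seen):
            if np.array_equal(nxt, Q):
                return seen, k            # the powers cycle back to seen[k]
        seen.append(nxt)
    return seen, None

P = np.array([[0.8, 0.3, 0.1],
              [0.4, 0.7, 0.2],
              [0.2, 0.5, 0.6]])
powers, cycle_start = fuzzy_powers(P)
print(len(powers), cycle_start)
print(powers[-1])   # stationary (possibly periodic) behaviour reached in finitely many steps
```

Because max–min composition only ever combines entry values already present in the matrix, the sequence of powers must eventually repeat, which is the finite convergence referred to above.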

19.
20.
The relation between the ergodic coefficient and deficiency relative to the least informative experiment is investigated. The result is applied to nonhomogeneous Markov chains (NMC's). Our main result can be described as follows: given an NMC, define, for each n ≥ 1 and each j, the experiment consisting in observing the (n+j)-th state of the chain, the j-th state being the unknown parameter. Then the chain is weakly ergodic if and only if, for every j, this experiment converges as n → ∞ (with respect to deficiencies) to the least informative experiment. It is finally shown that in the homogeneous case, the rate of convergence is always exponential.
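As background for the ergodic-coefficient side of this result (a standard formulation, included for orientation; the paper's exact coefficient may differ), weak ergodicity of a nonhomogeneous chain is usually expressed through Dobrushin's contraction coefficient of the forward products, which the sketch below computes:

```python
import numpy as np

def dobrushin_delta(P):
    """Dobrushin contraction coefficient: half the largest total-variation
    distance between two rows of the stochastic matrix P."""
    n = P.shape[0]
    return max(0.5 * np.abs(P[i] - P[k]).sum() for i in range(n) for k in range(n))

# Illustrative nonhomogeneous chain: random 2x2 stochastic matrices
# with diagonal entries at least 0.5.
rng = np.random.default_rng(2)
Ps = [0.5 * np.eye(2) + 0.5 * rng.dirichlet([1.0, 1.0], size=2) for _ in range(30)]

prod = np.eye(2)
for P in Ps:
    prod = prod @ P

# Weak ergodicity amounts to the delta of the forward products tending to 0.
print(dobrushin_delta(prod))   # close to 0 after 30 factors
```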
