Similar documents (20 results)
1.
We justify and discuss expressions for joint lower and upper expectations in imprecise probability trees, in terms of the sub- and supermartingales that can be associated with such trees. These imprecise probability trees can be seen as discrete-time stochastic processes with finite state sets and transition probabilities that are imprecise, in the sense that they are only known to belong to some convex closed set of probability measures. We derive various properties for their joint lower and upper expectations, and in particular a law of iterated expectations. We then focus on the special case of imprecise Markov chains, investigate their Markov and stationarity properties, and use these, by way of an example, to derive a system of non-linear equations for lower and upper expected transition and return times. Most importantly, we prove a game-theoretic version of the strong law of large numbers for submartingale differences in imprecise probability trees, and use this to derive point-wise ergodic theorems for imprecise Markov chains.
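A minimal sketch of the backward recursion that the law of iterated (lower) expectations licenses, assuming each node's credal set is represented by a finite list of candidate transition distributions (for instance, its extreme points). The two-state tree, the candidate distributions and the gamble counting visits to state "a" are invented for illustration and are not taken from the paper.

```python
# Lower expectation of a function of the path in a depth-`horizon` imprecise
# probability tree, computed by backward recursion (iterated lower expectations).
# Credal sets are given as finite lists of candidate probability vectors (dicts).

def lower_expectation(f, credal, states, horizon):
    """f maps a path (tuple of states) to a number; credal(path) returns the
    list of candidate transition distributions at the node reached by `path`."""
    def rec(path):
        if len(path) == horizon + 1:          # leaf: the gamble's value on this path
            return f(path)
        vals = {s: rec(path + (s,)) for s in states}
        # lower expectation at this node = minimum over candidate distributions
        return min(sum(p[s] * vals[s] for s in states) for p in credal(path))
    return {s0: rec((s0,)) for s0 in states}

if __name__ == "__main__":
    states = ("a", "b")
    # toy credal set: the transition distribution is only known up to two candidates
    credal = lambda path: [{"a": 0.4, "b": 0.6}, {"a": 0.7, "b": 0.3}]
    gamble = lambda path: sum(1 for s in path if s == "a")   # visits to state "a"
    print(lower_expectation(gamble, credal, states, horizon=3))
```

Replacing min by max gives the corresponding upper expectation.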

2.
Discrete time Markov chains with interval probabilities (total citations: 1; self-citations: 0; citations by others: 1)
The parameters of Markov chain models are often not known precisely. Instead of ignoring this problem, a better way to cope with it is to incorporate the imprecision into the models. This has become possible with the development of models of imprecise probabilities, such as the interval probability model. In this paper we discuss some modelling approaches which range from simple probability intervals to the general interval probability models and further to the models allowing completely general convex sets of probabilities. The basic idea is that precisely known initial distributions and transition matrices are replaced by imprecise ones, which effectively means that sets of possible candidates are considered. Consequently, sets of possible results are obtained and represented using similar imprecise probability models. We first set up the model and then show how to perform calculations of the distributions corresponding to the consecutive steps of a Markov chain. We present several approaches to such calculations and compare them with respect to the accuracy of the results. Next we consider a generalisation of the concept of regularity and study the convergence of regular imprecise Markov chains. We also give some numerical examples to compare different approaches to calculations of the sets of probabilities.
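As an illustration of one such calculation, the sketch below propagates a function of the next state through one step of an interval transition model: with row-wise probability intervals [l, u], the lower expectation for each current state is the optimum of a small linear programme, which for intervals admits the greedy solution implemented here. The 3-state intervals and the function f are invented for illustration.

```python
# One step of an imprecise Markov chain with row-wise probability intervals:
# for each current state, compute the lower expectation of a function f of the
# next state over all rows p with l <= p <= u and sum(p) = 1.

def lower_expectation_interval(f, lower, upper):
    """Greedy LP solution: start from the lower bounds, then spend the remaining
    probability mass on the states with the smallest values of f."""
    n = len(f)
    p = list(lower)
    mass = 1.0 - sum(lower)
    for i in sorted(range(n), key=lambda i: f[i]):
        add = min(upper[i] - lower[i], mass)
        p[i] += add
        mass -= add
    assert abs(mass) < 1e-12, "the intervals must admit a probability vector"
    return sum(p[i] * f[i] for i in range(n))

if __name__ == "__main__":
    # illustrative 3-state interval transition model, one row per current state
    L = [[0.1, 0.3, 0.2], [0.0, 0.5, 0.2], [0.3, 0.3, 0.1]]
    U = [[0.5, 0.6, 0.5], [0.3, 0.8, 0.4], [0.5, 0.5, 0.4]]
    f = [1.0, 0.0, 2.0]                      # function of the next state
    print([lower_expectation_interval(f, L[i], U[i]) for i in range(3)])
```

Under the usual law of iterated lower expectations, feeding the resulting vector back in as the new f and repeating gives the lower expectations after several steps.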

3.
We analyse the structure of imprecise Markov chains and study their convergence by means of accessibility relations. We first identify the sets of states, so-called minimal permanent classes, that are the minimal sets capable of containing and preserving the whole probability mass of the chain. These classes generalise the essential classes known from the classical theory. We then define a class of extremal imprecise invariant distributions and show that they are uniquely determined by the values of the upper probability on minimal permanent classes. Moreover, we give conditions for unique convergence to these extremal invariant distributions.
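For orientation, in the precise special case the essential classes referred to above are the closed communicating classes of the accessibility relation. The sketch below computes them, taking positivity of an upper transition probability as one-step accessibility; the 4-state matrix is invented, and this is only the classical benchmark that the paper's minimal permanent classes generalise.

```python
# Essential (closed communicating) classes of a finite chain, computed from an
# accessibility relation; here j is taken to be accessible from i in one step
# whenever the upper transition probability from i to j is positive.
import numpy as np

def essential_classes(P_upper):
    n = P_upper.shape[0]
    reach = (P_upper > 0) | np.eye(n, dtype=bool)
    for _ in range(n):                       # transitive closure
        reach = reach | ((reach.astype(int) @ reach.astype(int)) > 0)
    classes, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        cls = {j for j in range(n) if reach[i, j] and reach[j, i]}
        seen |= cls
        # essential = no state of the class can reach a state outside the class
        if all(not reach[j, k] for j in cls for k in range(n) if k not in cls):
            classes.append(sorted(cls))
    return classes

if __name__ == "__main__":
    U = np.array([[0.6, 0.7, 0.0, 0.0],
                  [0.5, 0.8, 0.0, 0.0],
                  [0.3, 0.0, 0.4, 0.9],
                  [0.0, 0.2, 0.5, 0.7]])     # illustrative upper transition matrix
    print(essential_classes(U))              # -> [[0, 1]]
```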

4.
We propose a model of random walks on weighted graphs where the weights are interval-valued, and connect it to reversible imprecise Markov chains. While the theory of imprecise Markov chains is now well established, this is a first attempt to model reversible chains. In contrast with the existing theory, the probability models that have to be considered are now non-convex. This presents a computational difficulty, since convexity is critical for the efficient optimization algorithms used in the existing models. The second part of the paper therefore addresses the computational issues of the model. The goal is to find sets of weights which maximize or minimize expectations corresponding to multi-step transition probabilities. In particular, we present a local optimization algorithm and numerically test its efficiency. We show that its application allows finding close approximations of the globally best solutions in reasonable time.
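A rough sketch of a coordinate-wise local search in this spirit, under simplifying assumptions that are ours rather than the paper's: edge weights range over given intervals, the walk's transition probabilities are the row-normalised weights, and the objective is a single m-step transition probability. Because the induced set of transition matrices is non-convex, only a local optimum is guaranteed; restarting from several initial weight matrices is a cheap check on its quality.

```python
# Coordinate-wise local search over interval-valued edge weights of a random walk
# on a weighted graph, maximising an m-step transition probability.
import numpy as np

def transition_matrix(W):
    return W / W.sum(axis=1, keepdims=True)    # P(i, j) = w_ij / sum_k w_ik

def m_step_prob(W, m, src, dst):
    return np.linalg.matrix_power(transition_matrix(W), m)[src, dst]

def local_search(lo, hi, m, src, dst, grid=5, sweeps=20):
    W = (lo + hi) / 2.0                        # start in the middle of the intervals
    best = m_step_prob(W, m, src, dst)
    for _ in range(sweeps):
        improved = False
        for i, j in zip(*np.nonzero(hi > lo)): # sweep over the free (interval) weights
            for w in np.linspace(lo[i, j], hi[i, j], grid):
                trial = W.copy()
                trial[i, j] = trial[j, i] = w  # keep the weights symmetric (reversibility)
                val = m_step_prob(trial, m, src, dst)
                if val > best + 1e-12:
                    W, best, improved = trial, val, True
        if not improved:
            break
    return W, best

if __name__ == "__main__":
    lo = np.array([[0.0, 1.0, 0.5], [1.0, 0.0, 1.0], [0.5, 1.0, 0.0]])
    hi = np.array([[0.0, 2.0, 1.5], [2.0, 0.0, 3.0], [1.5, 3.0, 0.0]])
    W, best = local_search(lo, hi, m=3, src=0, dst=2)
    print(best)
```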

5.
We study the limit behaviour of a nonlinear differential equation whose solution is a superadditive generalisation of a stochastic matrix, prove convergence, and provide necessary and sufficient conditions for ergodicity. In the linear case, the solution of our differential equation is equal to the matrix exponential of an intensity matrix and can then be interpreted as the transition operator of a homogeneous continuous-time Markov chain. Similarly, in the generalised nonlinear case that we consider, the solution can be interpreted as the lower transition operator of a specific set of non-homogeneous continuous-time Markov chains, called an imprecise continuous-time Markov chain. In this context, our convergence result shows that for a fixed initial state, an imprecise continuous-time Markov chain always converges to a limiting distribution, and our ergodicity result provides a necessary and sufficient condition for this limiting distribution to be independent of the initial state.
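The limiting object can be approximated numerically. The sketch below assumes the set of rate matrices is given by finitely many candidates, applies the lower rate operator componentwise, and integrates the nonlinear equation d/dt (T_t h) = Q_lower(T_t h) with an explicit Euler scheme; the two 2x2 intensity matrices are invented. That the printed vectors become nearly constant across the two initial states mirrors the ergodic behaviour discussed above.

```python
# Euler approximation of the lower transition operator T_t of an imprecise
# continuous-time Markov chain described by a finite set of candidate rate matrices.
import numpy as np

def lower_rate(h, rate_matrices):
    """(Q_lower h)(x) = min over the candidate rate matrices Q of (Q h)(x)."""
    return np.min([Q @ h for Q in rate_matrices], axis=0)

def lower_transition(h, rate_matrices, t, steps=10000):
    dt = t / steps
    for _ in range(steps):
        h = h + dt * lower_rate(h, rate_matrices)   # explicit Euler step
    return h

if __name__ == "__main__":
    # two illustrative 2-state intensity matrices (rows sum to zero)
    Q1 = np.array([[-1.0, 1.0], [2.0, -2.0]])
    Q2 = np.array([[-3.0, 3.0], [0.5, -0.5]])
    h = np.array([1.0, 0.0])                        # indicator of state 0
    for t in (0.5, 2.0, 10.0):
        # lower probability of being in state 0 at time t, for each initial state
        print(t, lower_transition(h, [Q1, Q2], t))
```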

6.
Suppose we observe a stationary Markov chain with unknown transition distribution. The empirical estimator for the expectation of a function of two successive observations is known to be efficient. For reversible Markov chains, an appropriate symmetrization is efficient. For functions of more than two arguments, these estimators cease to be efficient. We determine the influence function of efficient estimators of expectations of functions of several observations, both for completely unknown and for reversible Markov chains. We construct simple efficient estimators in both cases.
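A minimal sketch of the two estimators in question, the plain empirical estimator of E f(X0, X1) and its symmetrised version for reversible chains; the two-state chain and the function f are arbitrary illustrations, not taken from the paper.

```python
# Empirical and symmetrised estimators of E f(X_0, X_1) for a stationary Markov chain.
import random

def empirical(xs, f):
    pairs = list(zip(xs, xs[1:]))
    return sum(f(x, y) for x, y in pairs) / len(pairs)

def symmetrised(xs, f):
    # for reversible chains, averaging f(x, y) with f(y, x) is the appropriate choice
    return empirical(xs, lambda x, y: 0.5 * (f(x, y) + f(y, x)))

def simulate(P, start, n, rng):
    xs, x = [start], start
    for _ in range(n - 1):
        x = rng.choices(range(len(P)), weights=P[x])[0]
        xs.append(x)
    return xs

if __name__ == "__main__":
    rng = random.Random(0)
    P = [[0.9, 0.1], [0.2, 0.8]]               # a reversible two-state chain
    xs = simulate(P, 0, 100_000, rng)
    f = lambda x, y: float(x == 0 and y == 1)  # probability of a 0 -> 1 transition
    print(empirical(xs, f), symmetrised(xs, f))
```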

7.
高小燕 《大学数学》2013,29(1):38-42
We study the strong law of large numbers for functionals of a class of non-homogeneous Markov chains, the asymptotically circular Markov chains. We first introduce the notion of an asymptotically circular Markov chain and then give several lemmas. Using the strong law of large numbers for the frequencies of occurrence of ordered pairs of states of asymptotically circular Markov chains, we state and prove a strong law of large numbers for functionals of asymptotically circular Markov chains; known results are obtained as corollaries of this theorem.

8.
Previously known works on the generalization of the least-square regularized regression algorithm are usually based on the assumption of independent and identically distributed (i.i.d.) samples. In this paper we go beyond this classical framework by studying the generalization of the least-square regularized regression algorithm with Markov chain samples. We first establish a novel concentration inequality for uniformly ergodic Markov chains, then establish bounds on the generalization of the least-square regularized regression algorithm with uniformly ergodic Markov chain samples, and show that the least-square regularized regression algorithm with uniformly ergodic Markov chains is consistent.
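For concreteness, a small sketch of the algorithm being analysed, least-square regularised (kernel ridge) regression, fitted to samples generated by a simple ergodic chain rather than i.i.d. data; the Gaussian kernel, the AR(1)-type chain on [0, 1] and the regularisation parameter are illustrative choices of ours, not the paper's.

```python
# Least-square regularised (kernel ridge) regression trained on Markov chain samples.
import numpy as np

def gaussian_kernel(X, Y, sigma=0.5):
    return np.exp(-((X[:, None] - Y[None, :]) ** 2) / (2 * sigma ** 2))

def fit(X, y, lam=0.1):
    K = gaussian_kernel(X, X)
    alpha = np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)
    return lambda x: gaussian_kernel(np.atleast_1d(x), X) @ alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # an ergodic AR(1)-type chain on [0, 1] used as the sampling process
    x, X = 0.5, []
    for _ in range(500):
        x = 0.7 * x + 0.3 * rng.uniform()
        X.append(x)
    X = np.array(X)
    y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=len(X))   # noisy regression target
    f_hat = fit(X, y)
    print(f_hat(np.array([0.25, 0.5, 0.75])))   # predictions at a few test points
```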

9.
We establish a necessary and sufficient condition for positive recurrence for a large class of Markov chains on permutations known as mixed libraries. This condition takes into account both geometrical and probabilistic properties of these chains.

10.
宋明珠  吴永锋 《数学杂志》2015,35(2):368-374
This paper studies the strong law of large numbers for functions of Markov double chains in Markovian random environments. By treating the double-chain functions piecewise, we obtain a sufficient condition for the strong law of large numbers for functions of Markov double chains in a Markovian environment to hold. Applying this law, we derive limit properties of the transition probabilities of a Markov double chain from one state to another, thereby extending the limit properties of Markov double chains.

11.
It is known that each Markov chain has associated with it a polytope and a family of Markov measures indexed by the interior points of the polytope. Measure-preserving factor maps between Markov chains must preserve the associated families. In the present paper, we augment this structure by identifying measures corresponding to points on the boundary of the polytope. These measures are also preserved by factor maps. We examine the data they provide and give examples to illustrate the use of this data in ruling out the existence of factor maps between Markov chains. E. Cawley was partially supported by the Modern Analysis joint NSF grant in Berkeley. S. Tuncel was partially supported by NSF Grant DMS-9303240.

12.
We consider Markov chains of M/G/1 and GI/M/1 type whose one-step transitions depend on the times at which the transitions occur. Such Markov chains arise in several stochastic models, including queueing systems, dams, inventory systems and insurance risk models. We show that when the time parameters are periodic, these systems can be analyzed using extensions of known results from the matrix-analytic methods literature. To keep the focus, we limit our examples to queueing systems. An example application of the model to a real-life problem is presented.

13.
Journal of Complexity, 1998, 14(3): 319-332
We study the relaxation time of product-type Markov chains approaching a product distribution. We bound the approach to stationarity for such Markov chains in terms of the mixing times of the component Markov chains. In cases where the component mixing times differ considerably, we propose an optimized visiting scheme which makes such product-type Markov chains comparable to Gibbs-type samplers. We conclude the paper by discussing the relaxation of Metropolis-type samplers for separable energy functions.
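One natural reading of a product-type chain with a visiting scheme, sketched under assumptions of ours rather than a construction quoted from the paper: at each step a component i is selected with probability q_i and updated by its own transition matrix while the other components stay put, so the overall transition matrix is a mixture of Kronecker products and its spectral gap (which governs the relaxation time) can be computed directly. The example compares a uniform visiting scheme with one biased towards the slowly mixing component.

```python
# Product-type Markov chain: at each step, pick component i with probability q[i]
# and update it with its own transition matrix Ps[i]; the other components stay put.
import numpy as np

def product_chain(Ps, q):
    dims = [P.shape[0] for P in Ps]
    T = np.zeros((int(np.prod(dims)), int(np.prod(dims))))
    for i, P in enumerate(Ps):
        factors = [P if j == i else np.eye(d) for j, d in enumerate(dims)]
        K = factors[0]
        for F in factors[1:]:
            K = np.kron(K, F)
        T += q[i] * K
    return T

def spectral_gap(T):
    moduli = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
    return 1.0 - moduli[1]                    # 1 minus the second-largest eigenvalue modulus

if __name__ == "__main__":
    P1 = np.array([[0.9, 0.1], [0.1, 0.9]])   # slowly mixing component
    P2 = np.array([[0.5, 0.5], [0.5, 0.5]])   # fast component
    for q in ([0.5, 0.5], [0.8, 0.2]):        # uniform vs. biased visiting scheme
        print(q, spectral_gap(product_chain([P1, P2], q)))
```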

14.
We study the generalized entropy ergodic theorem for non-homogeneous Markov chains indexed by trees. We first prove a strong limit theorem for delayed averages of bivariate functions of tree-indexed non-homogeneous Markov chains. We then obtain a strong law of large numbers for the delayed frequencies of occurrence of states of such chains, as well as the generalized entropy ergodic theorem for tree-indexed non-homogeneous Markov chains. As corollaries, several known results are extended. We also prove the uniform integrability of the generalized entropy density of finite-state stochastic processes indexed by locally finite infinite trees.

15.
We study countable Markov chains in Markovian environments and prove that the number of returns of the process to small cylinder sets is asymptotically Poisson distributed. To this end we introduce an entropy function h and first give a Shannon-McMillan-Breiman theorem for Markov chains in Markovian environments; we also give an example of Poisson approximation for a non-Markov process. When the environment process degenerates to a constant sequence, we obtain the Poisson limit theorem for countable Markov chains. This extends the corresponding result of Pitskel for finite Markov chains.

16.
We give Gaussian lower and upper bounds for reversible Markov chains on a graph under two geometric assumptions (volume regularity and Poincaré inequality). This is first proved for continuous-time Markov chains via a parabolic Harnack inequality. Then, the estimates for the discrete-time Markov chains are derived by comparison.

17.
We consider stationary ℕ₀-valued Markov chains whose transition probabilities are associated with convolution structures of measures which are induced by linearization formulas of orthogonal polynomials. The best known examples are random walks on polynomial hypergroups and generalized birth and death random walks. Using central limit theorems derived in a recent paper by the author and some martingale arguments, we here prove a law of the iterated logarithm for a class of such Markov chains.

18.
We generalize previously known conditions for uniqueness of the Gibbs measure in statistical physics models by presenting conditions of any finite size for models on any underlying graph. We give two dual conditions, one requiring that the total influence on a site is small, and the other that the total influence of a site is small. Our proofs are combinatorial in nature and use tools from the analysis of discrete Markov chains, in particular the path coupling method. The implications of our conditions for the mixing time of natural Markov chains associated with the models are discussed as well. We also present some examples of models for which the conditions hold.

19.
We study several properties of tree-indexed Markov chains, which are analogous to the corresponding properties of ordinary Markov chains indexed by the line.

20.
We consider a class of continuous time Markov chains on ℤ^d. These chains are the discrete space analogue of Markov processes with jumps. Under some conditions, as we show, harmonic functions associated with these Markov chains are Hölder continuous.
