Similar documents (20 results)
1.
We prove a smoothing property and the irreducibility of transition semigroups corresponding to a class of semilinear stochastic equations on a separable Hilbert space H. Existence and uniqueness of invariant measures are discussed as well.

2.
Shur, M. G. Mathematical Notes, 2001, 69(1-2): 116-125.
The author presents a revised and detailed version of his theorem on the existence of Feller's extensions of Markov chains; to this end the broader notion of quasi-Feller extension is used. The existence of Markov chains dual to the chains with Borel space of states is derived from this result. Chains irreducible in the Orey sense are studied in most detail. For example, we prove that for such chains the quasi-Feller extension can be chosen recurrent or Liouville if the original chains possess these properties.

3.
Let A be an elliptic differential operator with unbounded coefficients on R^N and assume that the associated Feller semigroup (T(t))_{t≥0} has an invariant measure μ. Then (T(t))_{t≥0} extends to a strongly continuous semigroup (T_p(t))_{t≥0} on L^p(μ) = L^p(R^N, μ) for every 1 ≤ p < ∞. We prove that, under mild conditions on the coefficients of A, the space of test functions is a core for the generator (A_p, D_p) of (T_p(t))_{t≥0} in L^p(μ) for 1 ≤ p < ∞.
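For orientation, a standard concrete instance of this setting (an illustrative example, not taken from the abstract itself): the Ornstein-Uhlenbeck operator on R^N,
A u = Δu − x·∇u,
has unbounded drift coefficients and the Gaussian invariant measure dμ(x) = (2π)^{−N/2} e^{−|x|²/2} dx, in the sense that ∫_{R^N} Au dμ = 0 for every test function u ∈ C_c^∞(R^N).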

4.
This paper concerns a Markov operator T on a space L^1, and a Markov process P which defines a Markov operator on a space M of finite signed measures. For T, the paper presents necessary and sufficient conditions for:
(a) the existence of invariant probability densities (IPDs);
(b) the existence of strictly positive IPDs; and
(c) the existence and uniqueness of IPDs.
Similar results on invariant probability measures for P are presented. The basic approach is to pose a fixed-point problem as the problem of solving a certain linear equation in a suitable Banach space, and then obtain necessary and sufficient conditions for this equation to have a solution. 1991 Mathematics Subject Classification: 60J05, 47B65, 47N30.
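As a reminder of the standard definitions involved (a restatement, not of the paper's criteria): an IPD for T is a fixed point of T in the positive part of the unit sphere of L^1, i.e. an f ∈ L^1 with f ≥ 0, ‖f‖_1 = 1 and Tf = f; it is strictly positive when f > 0 almost everywhere. Likewise, an invariant probability measure for P is a probability measure μ ∈ M with μP = μ.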

5.
6.
Persi Diaconis and Phil Hanlon in their interesting paper (4) give the rates of convergence of some Metropolis Markov chains on the cube Z^d(2). Markov chains on finite groups that are actually random walks are easier to analyze because the machinery of harmonic analysis is available. Unfortunately, Metropolis Markov chains are, in general, not random walks on a group structure. In attempting to understand Diaconis and Hanlon's work, the authors were led to the idea of a hypergroup deformation of a finite group G, i.e., a continuous family of hypergroups whose underlying space is G and whose structure is naturally related to that of G. Such a deformation is provided for Z^d(2), and it is shown that the Metropolis Markov chains studied by Diaconis and Hanlon can be viewed as random walks on the deformation. A direct application of the Diaconis-Shahshahani Upper Bound Lemma, which applies to random walks on hypergroups, is used to obtain the rate of convergence of the Metropolis chains starting at any point. When the Markov chains start at 0, a result in Diaconis and Hanlon (4) is obtained with exactly the same rate of convergence. These results are extended to Z^d(3). Research supported in part by the Office of Research and Sponsored Programs, University of Oregon.
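A minimal sketch of a Metropolis chain on the cube (a generic illustration; the target distribution π(x) ∝ θ^{Σx_i}, the coordinate-flip proposal, and all names and parameters are assumptions, not necessarily the exact chain analyzed by Diaconis and Hanlon):

```python
import random

def metropolis_cube_step(x, theta, rng=random):
    """One Metropolis step on the cube {0,1}^d.

    Proposal: flip a uniformly chosen coordinate.
    Assumed target for illustration: pi(x) proportional to theta**sum(x).
    """
    d = len(x)
    i = rng.randrange(d)                  # pick a coordinate to flip
    y = list(x)
    y[i] = 1 - y[i]
    # acceptance ratio pi(y)/pi(x) equals theta (0 -> 1) or 1/theta (1 -> 0)
    ratio = theta if y[i] == 1 else 1.0 / theta
    if rng.random() < min(1.0, ratio):    # Metropolis accept/reject
        return tuple(y)
    return tuple(x)

def run_chain(d=10, theta=0.5, steps=1000, seed=0):
    rng = random.Random(seed)
    x = (0,) * d                          # start at the zero vertex
    for _ in range(steps):
        x = metropolis_cube_step(x, theta, rng)
    return x

if __name__ == "__main__":
    print(run_chain())
```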

7.
Let S be a denumerable state space and let P be a transition probability matrix on S. If a denumerable set M of nonnegative matrices is such that the sum of the matrices is equal to P, then we call M a partition of P.
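A minimal worked example of the definition (added here for concreteness): for the 2×2 transition matrix P with rows (0.5, 0.5) and (0.25, 0.75), the nonnegative matrices M_1 with rows (0.5, 0) and (0.25, 0) and M_2 with rows (0, 0.5) and (0, 0.75) satisfy M_1 + M_2 = P, so M = {M_1, M_2} is a partition of P.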

8.
Let {X_n} be a φ-irreducible Markov chain on an arbitrary space. Sufficient conditions are given under which the chain is ergodic or recurrent. These extend known results for chains on a countable state space. In particular, it is shown that if the space is a normed topological space, then under some continuity conditions on the transition probabilities of {X_n} the conditions for ergodicity will be met if there is a compact set K and an ε > 0 such that E{‖X_{n+1}‖ − ‖X_n‖ | X_n = x} ≤ −ε whenever x lies outside K and E{‖X_{n+1}‖ | X_n = x} is bounded for x ∈ K; whilst the conditions for recurrence will be met if there exists a compact K with E{‖X_{n+1}‖ − ‖X_n‖ | X_n = x} ≤ 0 for all x outside K. An application to queueing theory is given.
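A worked one-dimensional illustration of the ergodicity criterion (an example added here, not taken from the paper): for the reflected random walk X_{n+1} = max(X_n + ξ_{n+1}, 0) on [0, ∞) with i.i.d. increments satisfying E ξ = −δ < 0 and E|ξ| < ∞, one has E{X_{n+1} − X_n | X_n = x} = E ξ + E[(−(x + ξ))^+] → −δ as x → ∞, so the drift condition holds with K = [0, a] for a large enough and ε = δ/2, while E{X_{n+1} | X_n = x} ≤ x + E|ξ| is bounded on K; hence such a chain (the Lindley waiting-time recursion of a single-server queue, say) is ergodic.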

9.
In this paper, we develop an algorithmic method for the evaluation of the steady state probability vector of a special class of finite state Markov chains. For the class of Markov chains considered here, it is assumed that the matrix associated with the set of linear equations for the steady state probabilities possesses a special structure, such that it can be rearranged and decomposed as a sum of two matrices, one lower triangular with nonzero diagonal elements, and the other an upper triangular matrix with only very few nonzero columns. Almost all Markov chain models of queueing systems with finite source and/or finite capacity and first-come-first-served or head of the line nonpreemptive priority service discipline belong to this special class.
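The specialized triangular decomposition of the paper is not reproduced here; as a generic baseline (a sketch using a dense linear solve, with the function name and example matrix chosen for illustration), the steady-state vector of a finite irreducible chain can be obtained by solving πP = π together with the normalization Σ_i π_i = 1:

```python
import numpy as np

def steady_state(P):
    """Stationary distribution of a finite, irreducible transition matrix P.

    Solves the global balance equations (I - P^T) pi = 0 together with
    sum(pi) = 1 by replacing one balance equation with the normalization.
    """
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    A = np.eye(n) - P.T          # global balance equations as rows
    A[-1, :] = 1.0               # replace last equation by sum(pi) = 1
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

if __name__ == "__main__":
    P = [[0.5, 0.5, 0.0],
         [0.25, 0.5, 0.25],
         [0.0, 0.5, 0.5]]
    print(steady_state(P))       # small birth-death example; sums to 1
```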

10.
This paper presents some conditions for the minimal Q-function to be a Feller transition function, for a given q-matrix Q. We derive a sufficient condition that is stated explicitly in terms of the transition rates. Furthermore, some necessary and sufficient conditions are derived of a more implicit nature, namely in terms of properties of a system of equations (or inequalities) and in terms of the operator induced by the q-matrix. The criteria lead to some perturbation results. These results are applied to birth-death processes with killing, yielding some sufficient and some necessary conditions for the Feller property directly in terms of the rates. An essential step in the analysis is the idea of associating the Feller property with individual states.

11.
A necessary and sufficient condition is given for the existence of a finite invariant measure equivalent to a given reference measure for a discrete time, general state Markov process. The condition is an extension of one given by D. Maharam in the deterministic case and involves an averaging method (called by Maharam 'density averaging') applied to the Radon-Nikodym derivatives with respect to the reference measure of the usual sequence of measures induced by the Markov process acting on the fixed reference measure.

12.
The concepts of π-irreducibility, recurrence and transience are introduced into the research field of Markov chains in random environments. It is proved that a π-irreducible chain must be either recurrent or transient; a criterion is given for recurrent Markov chains in double-infinite random environments; the existence of invariant measures for π-irreducible chains in double-infinite environments is discussed; and Orey's open questions are partially answered.

13.
In a Markov chain model of a social process, interest often centers on the distribution of the population by state. One question, the stability question, is whether this distribution converges to an equilibrium value. For an ordinary Markov chain (a chain with constant transition probabilities), complete answers are available. For an interactive Markov chain (a chain which allows the transition probabilities governing each individual to depend on the locations by state of the rest of the population), few stability results are available. This paper presents new results. Roughly, the main result is that an interactive Markov chain with unique equilibrium will be stable if the chain satisfies a certain monotonicity property. The property is a generalization to interactive Markov chains of the standard definition of monotonicity for ordinary Markov chains.
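A toy simulation sketch of what "interactive" means here (a two-state model invented for illustration; the specific dependence of the transition matrix on the population share, and all numbers, are assumptions, not the paper's model):

```python
import numpy as np

def transition_matrix(share):
    """Toy interactive chain on states {0, 1}.

    The probability of moving to state 1 increases with the current
    population share in state 1 (a simple imitation effect).
    """
    p01 = 0.1 + 0.8 * share           # 0 -> 1 more likely when state 1 is popular
    p10 = 0.1 + 0.8 * (1 - share)
    return np.array([[1 - p01, p01],
                     [p10, 1 - p10]])

def iterate_population(x0, steps=50):
    """Evolve the population distribution via x_{t+1} = x_t P(x_t)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x @ transition_matrix(x[1])
    return x

if __name__ == "__main__":
    print(iterate_population([0.9, 0.1]))   # converges to the symmetric equilibrium (0.5, 0.5)
```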

14.
Estimation of spectral gap for Markov chains
The study of the convergence rate (spectral gap) in the L^2-sense is motivated from several different fields: probability, statistics, mathematical physics, computer science and so on, and it is now an active research topic. Based on a new approach (the coupling technique) introduced in [7] for the estimate of the convergence rate, and as a continuation of [4], [5], [7-9], [23] and [24], this paper studies the estimate of the rate for time-continuous Markov chains. Two variational formulas for the rate are presented here for the first time for birth-death processes. For diffusions, similar results are presented in an accompanying paper [10]. The new formulas enable us to recover or improve the main known results. The connection between the sharp estimate and the corresponding eigenfunction is explored and illustrated by various examples. A previous result on optimal Markovian couplings [4] is also extended in the paper. Research supported in part by NSFC, Qin Shi Sci & Tech. Foundation and the State Education Commission of China.
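A numerical companion (a sketch only; the paper's variational formulas are not reproduced, and the function name and rates below are illustrative): for a finite birth-death chain the generator Q is tridiagonal and reversible, so the spectral gap in L^2(π) can be computed directly as the smallest nonzero eigenvalue of −Q after symmetrizing by the stationary distribution π:

```python
import numpy as np

def birth_death_gap(birth, death):
    """Spectral gap in L^2(pi) of a finite birth-death generator.

    birth[i] is the rate i -> i+1 and death[i] is the rate i+1 -> i
    (both for i = 0, ..., n-2).  The chain is reversible, so Q can be
    symmetrized by the stationary distribution pi, and the gap is the
    smallest nonzero eigenvalue of -Q.
    """
    n = len(birth) + 1
    Q = np.zeros((n, n))
    for i in range(n - 1):
        Q[i, i + 1] = birth[i]
        Q[i + 1, i] = death[i]
    np.fill_diagonal(Q, -Q.sum(axis=1))
    # stationary distribution from detailed balance: pi_{i+1} death[i] = pi_i birth[i]
    pi = np.ones(n)
    for i in range(1, n):
        pi[i] = pi[i - 1] * birth[i - 1] / death[i - 1]
    pi /= pi.sum()
    # S = D^{1/2} Q D^{-1/2} with D = diag(pi) is symmetric for reversible Q
    d = np.sqrt(pi)
    S = Q * d[:, None] / d[None, :]
    eig = np.linalg.eigvalsh(S)          # ascending; the largest eigenvalue is 0
    return -eig[-2]                      # gap = -(second largest eigenvalue)

if __name__ == "__main__":
    # illustrative chain truncated at 10 states: birth rate 1, death rate 2
    print(birth_death_gap([1.0] * 10, [2.0] * 10))
```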

15.
In this paper, subgeometric ergodicity is investigated for continuous-time Markov chains. Several equivalent conditions, based on the first hitting time or the drift function, are derived as the main theorem. In its corollaries, practical drift criteria are given for ?-ergodicity and computable bounds on subgeometric convergence rates are obtained for stochastically monotone Markov chains. These results are illustrated by examples.

16.
The main purpose of this paper is to show that Markov solutions to the 3D Navier-Stokes equations driven by Gaussian noise have the strong Feller property up to the critical topology given by the domain of the Stokes operator to the power one-fourth.

17.
18.
Using the Hopf maximal ergodic lemma and the Brunel maximal ergodic lemma for Markov chains in random environments, the Chacon-Ornstein theorem and the Chacon identification theorem for Markov chains in random environments are established.

19.
The isomorphism theorem of Dynkin is an important tool for investigating problems raised in terms of local times of Markov processes. This theorem concerns continuous-time Markov processes. We give here an equivalent version for Markov chains.

20.
Existence of the following factorization (F) is proved:
Here A is a stochastic or semi-stochastic (substochastic) d×d matrix; I is the unit matrix; B and C are nonnegative, upper and lower triangular matrices; B is a semistochastic matrix; the diagonal entries of C are 1. Exact information on the properties of the matrices B and C is obtained in particular cases. Some results on the existence of an invariant distribution x for Markov chains, in the cases of absence or presence of sources g of walking particles, are obtained using the factorization (F). These problems are described by the homogeneous or nonhomogeneous equation (I − A)x = g.
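A hedged numerical analogue (a sketch only; the paper's specific factors B and C are not reconstructed, and the matrix entries below are illustrative): for a strictly substochastic A the matrix I − A is nonsingular, and a standard LU factorization of I − A already exhibits a lower/upper triangular splitting and lets one solve (I − A)x = g:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# A strictly substochastic matrix: nonnegative rows summing to less than 1,
# so I - A is nonsingular (illustrative numbers, not from the paper).
A = np.array([[0.2, 0.3, 0.1],
              [0.1, 0.4, 0.2],
              [0.3, 0.1, 0.3]])
g = np.array([1.0, 0.0, 0.0])           # a source vector

M = np.eye(3) - A
lu, piv = lu_factor(M)                  # triangular (LU) factorization of I - A
x = lu_solve((lu, piv), g)              # solves (I - A) x = g

print(x)
print(np.allclose(M @ x, g))            # sanity check: True
```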
