Similar Documents
Found 20 similar documents.
1.
This paper considers weak convergence theorems for functionals of non-homogeneous Markov chains. The limit processes are a class of stochastic processes of stochastic-integral type. As applications, we first consider the unit root test problem, and then the asymptotic distribution of the least squares estimator of the parameters in a cointegration regression model. Both problems involve weak convergence theorems whose limits are stochastic integrals.
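As a quick illustration of the unit-root application, the following Python sketch simulates the classical statistic n*(rho_hat - 1) under the simplest assumptions (i.i.d. Gaussian innovations rather than the non-homogeneous Markov-chain functionals studied in the paper); its weak limit is the stochastic-integral functional ∫W dW / ∫W² dt.

import numpy as np

rng = np.random.default_rng(0)

def unit_root_stat(n):
    e = rng.standard_normal(n)
    y = np.cumsum(e)                           # random walk y_t = y_{t-1} + e_t
    y_lag = np.concatenate(([0.0], y[:-1]))
    num = np.sum(y_lag * e)                    # behaves like int W dW after scaling
    den = np.sum(y_lag ** 2)                   # behaves like int W^2 dt after scaling
    return n * num / den                       # n * (rho_hat - 1)

stats = [unit_root_stat(1000) for _ in range(5000)]
print("empirical 5% quantile:", np.quantile(stats, 0.05))   # close to the tabulated Dickey-Fuller value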

2.
高勇  张文修 《中国科学A辑》1994,37(2):113-121
This paper introduces, for the first time, the concept of selection operators on hyperspaces (spaces of subsets) and gives existence theorems for several classes of selection operators. As applications, a selection characterization of identically distributed set-valued random variables is given; the vector-valued selection problem for sequences of set-valued random variables converging in distribution is completely solved; regular selections and Markov selections of set-valued stochastic processes are studied; a discretization theorem for set-valued Markov processes is given; and an existence theorem for vector-valued asymptotic martingale selections of compact convex set-valued asymptotic martingales is proved.

3.
Taking hitting times (or return times) as the main thread, this paper surveys some recent progress in the study of Markov chains from three aspects: ergodicity of Markov chains, quasi-stationary distributions, and non-reversibility. The topics include: (1) functional inequalities expressed through moments of hitting times; (2) modified return times introduced to characterize various types of non-recurrence; (3) return times used to treat functional inequalities for discrete-time Markov chains; (4) distributional representations of hitting times of Markov chains; (5) criteria, in terms of moments of hitting times, for the cutoff phenomenon produced when a family of ergodic Markov processes converges to its stationary distributions; (6) existence and uniqueness of quasi-stationary distributions derived from the distribution of the lifetime of a Markov chain; (7) a Dirichlet principle developed to decide when a non-reversible Markov chain converges to its stationary distribution "better" than the corresponding reversible process.

4.
乔正阳  刘易成 《应用数学》2021,34(1):146-157
This paper studies the flocking behaviour of the Cucker-Smale model with switching topology and random failures. Under certain constraints on the switching intervals and the failure probability, sufficient conditions are given for the system to flock almost surely. Owing to the combined influence of random failures and topology switching, the Cucker-Smale model exhibits complex flocking behaviour, in which the rate of convergence to a flock slows down markedly as the failure probability increases. Meanwhile, during the switching process...
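A minimal Euler simulation of a Cucker-Smale system with randomly failing links is sketched below; the failure mechanism (each link drops independently with probability p_fail at every step) and all parameters are illustrative assumptions, not the paper's exact switching model.

import numpy as np

rng = np.random.default_rng(1)
N, dim, dt, steps = 20, 2, 0.01, 5000
lam, beta, p_fail = 1.0, 0.4, 0.3

x = rng.standard_normal((N, dim))
v = rng.standard_normal((N, dim))

def psi(r):
    return 1.0 / (1.0 + r ** 2) ** beta            # Cucker-Smale communication weight

for _ in range(steps):
    diff = x[None, :, :] - x[:, None, :]           # x_j - x_i
    dist = np.linalg.norm(diff, axis=2)
    w = psi(dist) * (rng.random((N, N)) > p_fail)  # randomly failed links get weight 0
    dv = lam / N * np.einsum('ij,ijk->ik', w, v[None, :, :] - v[:, None, :])
    x, v = x + dt * v, v + dt * dv

print("velocity diameter:", np.max(np.linalg.norm(v - v.mean(0), axis=1)))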

5.
Mathematical modelling of complex systems and processes calls for the ideas and methods of stochastic dynamics. The theory of stochastic dynamics has two different mathematical formulations: stochastic processes and random dynamical systems. The latter is a finer mathematical model than the former: it not only gives the stochastic process corresponding to each initial value, but also describes in full how multiple random trajectories from different initial values evolve simultaneously in time. The former describes the motion of an individual with intrinsic randomness, whereas the latter reflects many identical deterministic individuals simultaneously experiencing the same random environment. This paper calls these two situations intrinsic noise and extrinsic noise; both are widely used in chemistry and biology. The probabilistic Boolean networks based on a graph G(V,E) that have emerged in recent years are precisely a class of random dynamical systems (RDS) with state space {0,1}^V. This paper introduces discrete-time, discrete-space RDS and also gives an application to estimating convergence rates in statistical inference for hidden Markov models.
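The following toy sketch illustrates the random-dynamical-systems viewpoint for a probabilistic Boolean network on {0,1}^V: a single realisation of the environment (the random sequence of update rules) drives trajectories started from different initial states. The three-node network, its candidate rules and their probabilities are made-up examples.

import random

random.seed(0)

# Two candidate global update rules for a 3-node network (toy example).
def f0(s):
    a, b, c = s
    return (b & c, a | c, a ^ b)

def f1(s):
    a, b, c = s
    return (1 - a, a & b, b | c)

rules, probs = [f0, f1], [0.7, 0.3]

# One realisation of the environment: a fixed sequence of rule choices.
noise = random.choices(range(len(rules)), weights=probs, k=20)

def trajectory(s0):
    s, path = s0, [s0]
    for k in noise:                 # the SAME noise drives every initial state
        s = rules[k](s)
        path.append(s)
    return path

print(trajectory((0, 0, 1))[-3:])
print(trajectory((1, 1, 0))[-3:])   # may coalesce with the first trajectory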

6.
A set-valued Markov chain is proposed; the model generalizes the Markov chain based on random variables by lifting random variables to random sets. The model inherits many good properties of the classical Markov chain and can degenerate to the classical Markov chain model. To analyse the model further, the falling-shadow theory of random sets is introduced, notions such as the transition falling shadow and the falling-shadow distribution are proposed, and some results and properties are given. Finally, an application example is presented.

7.
Using sample paths of stochastic processes, the Markov-modulated risk model U = (Q, G, F; J, s, X) is characterized rigorously; it generalizes existing Markov-modulated risk models. Based on the model U, the Markov-modulated risk processes R^u = {R^u(t), t ≥ 0} with premium-rate vector C and R^u(γ) = {R^u(γ, t), t ≥ 0} with tax-rate vector γ are given. Given the characteristic triple A = (Q, G, F), the model U is constructed by probabilistic methods, which lays a solid stochastic-process foundation for studying Markov-modulated risk models and processes with the theory and methods of stochastic processes.
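A sketch of one sample path of a Markov-modulated risk process is given below; the two-state generator, premium-rate vector, claim intensities and exponential claim sizes are all illustrative assumptions rather than the paper's general model U.

import numpy as np

rng = np.random.default_rng(2)
Q = np.array([[-0.5, 0.5],
              [ 1.0, -1.0]])       # environment generator (assumption)
c = np.array([2.0, 1.0])           # premium rates per state (assumption)
lam = np.array([1.0, 1.5])         # claim arrival intensities per state (assumption)
claim_mean = 1.2                   # exponential claim sizes (assumption)

def simulate(u=10.0, horizon=100.0):
    t, j, R = 0.0, 0, u
    while t < horizon:
        # time to next event: environment switch or claim, whichever comes first
        t_switch = rng.exponential(1.0 / -Q[j, j])
        t_claim = rng.exponential(1.0 / lam[j])
        dt = min(t_switch, t_claim, horizon - t)
        R += c[j] * dt             # premiums accrue at the state-dependent rate
        t += dt
        if dt == t_claim:
            R -= rng.exponential(claim_mean)
            if R < 0:
                return t, R        # ruin time and deficit
        elif dt == t_switch:
            j = 1 - j              # two-state environment: jump to the other state
    return None, R                 # no ruin before the horizon

print(simulate())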

8.
Ergodic degree of continuous-time Markov chains
毛永华 《中国科学A辑》2003,33(5):409-420
Moments of hitting times are used to study the existence and finiteness of higher-order deviation matrices of continuous-time Markov chains, and from this, estimates of the polynomial rate at which the transition matrix converges to the stationary distribution are obtained. For birth-death processes, explicit expressions are given.
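The explicit birth-death formulas of the paper are not reproduced here, but the role of hitting-time moments can be checked numerically: for a finite birth-death chain, the first and second moments of the hitting time of state 0 solve standard linear systems in the generator, as in the following sketch (rates chosen arbitrarily).

import numpy as np

n = 10                                  # states 0..n
birth = np.full(n + 1, 1.0)             # b_i (illustrative)
death = np.full(n + 1, 2.0)             # a_i (illustrative)
Q = np.zeros((n + 1, n + 1))
for i in range(1, n):
    Q[i, i - 1], Q[i, i + 1] = death[i], birth[i]
Q[n, n - 1] = death[n]
Q[0, 1] = birth[0]
np.fill_diagonal(Q, Q.diagonal() - Q.sum(axis=1))   # generator rows sum to 0

# Hitting time T_0: (Q m1)_i = -1 and (Q m2)_i = -2 m1_i for i != 0, m1_0 = m2_0 = 0.
A = Q[1:, 1:]                           # restrict to states 1..n (target = {0})
m1 = np.linalg.solve(A, -np.ones(n))    # E_i[T_0]
m2 = np.linalg.solve(A, -2.0 * m1)      # E_i[T_0^2]
print(m1[0], m2[0])                     # first two moments starting from state 1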

9.
We consider a reducible Markov chain on a finite state space and the family of irreducible Markov chains obtained from it by exponential perturbation; this is a generalization of Freidlin-Wentzell theory, motivated originally by stochastic Ising models, neural networks, and simulated annealing. It is shown that a large class of such families of Markov chains exhibits metastability, the metastable states being the recurrent classes of the reducible chain. Higher-order attractors, higher-order basins of attraction, and their pyramid structure are analysed in detail; the asymptotic exponential orders of the first hitting times of different sets are estimated; and the first hitting time, divided by its mean, converges to an exponential distribution with mean 1.

10.
Based on Markov chain theory, the stability of a class of modified Cooper-Frieze models is studied. By relating the node degrees in the model to a Markov chain and using the idea of first-passage probabilities from Markov theory, a proof of the existence of the steady-state degree distribution is provided, and an explicit expression for the degree distribution is derived mathematically. Finally, simulation analyses of the degree distribution and clustering of the model are provided and compared with the BA model.
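The modified Cooper-Frieze model itself is not reproduced here; as a hedged stand-in for the kind of simulation comparison mentioned, the sketch below grows a Barabási-Albert preferential-attachment graph and inspects its empirical degree distribution.

import random
from collections import Counter

random.seed(0)
m, n = 2, 20_000                 # edges per new node, total nodes (illustrative)
degree_bag = [0, 1]              # node listed once per unit of degree; start from edge 0-1

for new in range(2, n):
    chosen = set()
    while len(chosen) < m:
        chosen.add(random.choice(degree_bag))   # preferential attachment
    for t in chosen:
        degree_bag.extend([new, t])             # add edge new-t

deg = Counter(degree_bag)        # degree of each node
hist = Counter(deg.values())     # number of nodes with each degree
for k in sorted(hist)[:8]:
    print(k, hist[k] / n)        # empirical P(degree = k), roughly ~ k^{-3}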

11.
The ergodic theory of Markov chains in random environments
A general formulation of the stochastic model for a Markov chain in a random environment is given, including an analysis of the dependence relations between the environmental process and the controlled Markov chain, in particular the problem of feedback. Assuming stationary environments, the ergodic theory of Markov processes is applied to give conditions for the existence of finite invariant measures (equilibrium distributions) and to obtain ergodic theorems, which provide results on convergence of products of random stochastic matrices. Coupling theory is used to obtain results on direct convergence of these products and the structure of the tail σ-field. State properties including classification and communication properties are discussed.
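A small numerical illustration of the last-mentioned ergodic behaviour: products of i.i.d. strictly positive random stochastic matrices rapidly approach a matrix with nearly identical rows, so the product forgets its starting row. The matrix model below is an arbitrary illustrative choice.

import numpy as np

rng = np.random.default_rng(6)

def random_stochastic(d=4):
    M = rng.random((d, d)) + 0.1          # strictly positive entries
    return M / M.sum(axis=1, keepdims=True)

P = np.eye(4)
for _ in range(50):
    P = random_stochastic() @ P           # product M_n ... M_1

print(P.max(axis=0) - P.min(axis=0))      # column ranges ~ 0: rows nearly identical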

12.
The strong convergence of sequences of arbitrary random variables is studied. Using a convergence theorem for series of martingale differences, a strong limit theorem for arbitrary random sequences is proved; as corollaries, strong laws of large numbers for Markov processes, martingale-difference sequences, and sequences of independent random variables are obtained.
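A toy numerical check of the martingale-difference strong law (not the theorem of the paper itself): with d_n = e_n * sign(S_{n-1}) for i.i.d. centred e_n, the running averages (1/n) * sum of d_k shrink toward 0.

import numpy as np

rng = np.random.default_rng(3)
n = 200_000
e = rng.standard_normal(n)

S, total, avgs = 0.0, 0.0, []
for k in range(n):
    d = e[k] * (1.0 if S >= 0 else -1.0)   # martingale difference w.r.t. the past
    total += d
    S += e[k]                              # update the past only after using it
    if (k + 1) % 50_000 == 0:
        avgs.append(total / (k + 1))

print(avgs)   # running averages, shrinking toward 0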

13.
Recursive equations are derived for the conditional distribution of the state of a Markov chain, given observations of a function of the state. Mainly continuous time chains are considered. The equations for the conditional distribution are given in matrix form and in differential equation form. The conditional distribution itself forms a Markov process. Special cases considered are doubly stochastic Poisson processes with a Markovian intensity, Markov chains with a random time, and Markovian approximations of semi-Markov processes. Further the results are used to compute the Radon-Nikodym derivative for two probability measures for a Markov chain, when a function of the state is observed.
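A discrete-time analogue of such a filter is easy to write down (the paper's equations are mainly for continuous-time chains, so this is only a sketch): given observations y_k = h(X_k) of a non-invertible function h of the state, the conditional distribution is updated by a predict-and-restrict recursion. The chain, h and the observation sequence below are toy choices.

import numpy as np

P = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.7, 0.2],
              [0.0, 0.3, 0.7]])          # transition matrix (toy example)
h = np.array([0, 1, 1])                  # observed function of the state
pi0 = np.array([1/3, 1/3, 1/3])

def filter_step(pi, y):
    pred = pi @ P                        # one-step prediction
    like = (h == y).astype(float)        # indicator of the observed level set of h
    post = pred * like
    return post / post.sum()

pi = pi0
for y in [1, 1, 0, 1]:                   # a possible observation sequence
    pi = filter_step(pi, y)
print(pi)                                # conditional distribution of the current state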

14.
Stochastic Analysis and Applications, 2013, 31(2): 419-441
We consider the stochastic model of water pollution, which mathematically can be written with a stochastic partial differential equation driven by Poisson measure noise. We use a stochastic particle Markov chain method to produce an implementable approximate solution. Our main result is the annealed law of large numbers establishing convergence in probability of our Markov chains to the solution of the stochastic reaction-diffusion equation while considering the Poisson source as a random medium for the Markov chains.

15.
This work develops numerical approximation algorithms for solutions of stochastic differential equations with Markovian switching. The existing numerical algorithms all use a discrete-time Markov chain for the approximation of the continuous-time Markov chain. In contrast, we generate the continuous-time Markov chain directly, and then use its skeleton process in the approximation algorithm. Focusing on weak approximation, we take a re-embedding approach, and define the approximation and the solution to the switching stochastic differential equation on the same space. In our approximation, we use a sequence of independent and identically distributed (i.i.d.) random variables in lieu of the common practice of using Brownian increments. By virtue of the strong invariance principle, we ascertain rates of convergence in the pathwise sense for the weak approximation scheme.
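A rough sketch of this type of weak approximation, under assumed two-regime coefficients: the continuous-time chain is generated directly from its generator via exponential holding times, its skeleton supplies the regime on the Euler grid, and the Brownian increments are replaced by i.i.d. two-point variables of size ±√dt.

import numpy as np

rng = np.random.default_rng(4)
Q = np.array([[-1.0, 1.0],
              [ 2.0, -2.0]])                 # two-regime generator (assumption)
b = lambda x, a: (-0.5 * x, 1.0 - x)[a]      # drift in each regime (assumption)
s = lambda x, a: (0.3, 0.8)[a]               # diffusion in each regime (assumption)

def simulate(T=1.0, dt=1e-3, x0=1.0, a0=0):
    # generate the continuous-time chain directly via exponential holding times;
    # with two states, every jump simply switches to the other state
    jump_times, t, a = [], 0.0, a0
    while True:
        t += rng.exponential(1.0 / -Q[a, a])
        if t >= T:
            break
        jump_times.append(t)
        a = 1 - a
    # weak Euler step on a uniform grid, reading the regime off the chain's skeleton
    x, a, nxt = x0, a0, 0
    n = int(round(T / dt))
    xi = rng.choice([-1.0, 1.0], size=n) * np.sqrt(dt)   # i.i.d. two-point increments
    for k in range(n):
        while nxt < len(jump_times) and jump_times[nxt] <= k * dt:
            a, nxt = 1 - a, nxt + 1
        x += b(x, a) * dt + s(x, a) * xi[k]
    return x

print(np.mean([simulate() for _ in range(2000)]))        # weak estimate of E[X_T]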

16.
We study the convergence of iterated random functions for stochastic feasibility in the consistent case (in the sense of Butnariu and Flåm [Numer. Funct. Anal. Optimiz., 1995]) in several different settings, under decreasingly restrictive regularity assumptions of the fixed point mappings. The iterations are Markov chains and, for the purposes of this study, convergence is understood in very restrictive terms. We show that sufficient conditions for geometric (linear) convergence in expectation of stochastic projection algorithms presented in Nedić [Math. Program., 2011], are in fact necessary for geometric (linear) convergence in expectation more generally of iterated random functions.
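Randomised Kaczmarz iteration is a concrete instance of such a stochastic projection method in the consistent case: each step applies one randomly chosen projection (an iterated random function), and the error contracts geometrically in expectation. The sketch below uses randomly generated, consistent linear data purely for illustration.

import numpy as np

rng = np.random.default_rng(5)
m, d = 50, 10
A = rng.standard_normal((m, d))
x_true = rng.standard_normal(d)
b = A @ x_true                          # consistent system: the intersection is nonempty

x = np.zeros(d)
for k in range(2000):
    i = rng.integers(m)                 # random function chosen i.i.d.
    a = A[i]
    x = x + (b[i] - a @ x) / (a @ a) * a   # orthogonal projection onto the i-th hyperplane

print(np.linalg.norm(x - x_true))       # should be tiny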

17.
We present a new class of interacting Markov chain Monte Carlo methods to approximate numerically discrete-time nonlinear measure-valued equations. These stochastic processes belong to the class of self-interacting Markov chains with respect to their occupation measures. We provide several convergence results for these new methods including exponential estimates and a uniform convergence theorem with respect to the time parameter, yielding what seems to be the first results of this kind for this type of self-interacting models. We illustrate these models in the context of Feynman–Kac distribution semigroups arising in physics, biology and in statistics.

18.
Asymptotic properties of singularly perturbed Markov chains having measurable and/or continuous generators are developed in this work. The Markov chain under consideration has a finite-state space and is allowed to be nonstationary. Its generator consists of a rapidly varying part and a slowly changing part. The primary concerns are on the properties of the probability vectors and an aggregated process that depend on the characteristics of the fast varying part of the generators. The fast changing part of the generators can either consist of l recurrent classes, or include also transient states in addition to the recurrent classes. The case of inclusion of transient states is examined in detail. Convergence of the probability vectors under the weak topology of L2 is obtained first. Then under slightly stronger conditions, it is shown that the convergence also takes place pointwise. Moreover, convergence under the norm topology of L2 is derived. Furthermore, a process with aggregated states is obtained which converges to a Markov chain in distribution.

19.
The Zakai equation for the unnormalized conditional density is derived as a mild stochastic bilinear differential equation on a suitable L2 space. It is assumed that the Markov semigroup corresponding to the state process is C0 on such a space. This allows the establishment of the existence and uniqueness of the solution by means of general theorems on stochastic differential equations in Hilbert space. Moreover, an easy treatment of convergence conditions can be given for a general class of finite-dimensional approximations, including Galerkin schemes. This is done by using a general continuity result for the solution of a mild stochastic bilinear differential equation on a Hilbert space with respect to the semigroup, the forcing operator, and the initial state, within a suitable topology.

20.
In this article we propose a numerical method for reflected backward stochastic differential equations (RBSDE). This method is based on the simple random walk, and the convergence is related to the Skorohod topology.
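For a reflected BSDE with zero driver, the simple-random-walk scheme reduces to a discrete Snell-envelope recursion on the binomial tree of the scaled walk, as in the sketch below; the terminal condition and obstacle are illustrative choices, and the general driver treated in the paper is not reproduced here.

import numpy as np

T, n = 1.0, 200
h = T / n
sq = np.sqrt(h)

L = lambda t, w: np.maximum(1.0 - w, 0.0)      # obstacle / payoff (assumption)

# nodes of the recombining binomial tree at level k: w = (2j - k) * sqrt(h), j = 0..k
y = L(T, (2 * np.arange(n + 1) - n) * sq)      # terminal condition xi = L(T, W_T)
for k in range(n - 1, -1, -1):
    w = (2 * np.arange(k + 1) - k) * sq
    cond_exp = 0.5 * (y[1:] + y[:-1])          # E[y_{k+1} | F_k] on the tree
    y = np.maximum(L(k * h, w), cond_exp)      # reflection at the obstacle

print(y[0])                                    # approximation of Y_0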
