Similar Documents
20 similar documents found.
1.
Any stationary 1-dependent Markov chain with up to four states is a 2-block factor of independent, identically distributed random variables. There is a stationary 1-dependent Markov chain with five states which is not, even though every 1-dependent renewal process is a 2-block factor.
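A minimal sketch of the 2-block-factor construction referenced here (our illustration, with a hypothetical choice of f; not taken from the paper): given i.i.d. draws X_0, X_1, ..., set Y_n = f(X_n, X_{n+1}).

```python
import random

# 2-block factor construction: Y_n = f(X_n, X_{n+1}) for i.i.d. X_i.
# Y_i and Y_j depend on disjoint pairs of X's whenever |i - j| > 1,
# so the resulting stationary process is 1-dependent.

def two_block_factor(f, n, sampler=random.random):
    """Return Y_0, ..., Y_{n-1} built from n + 1 i.i.d. draws."""
    x = [sampler() for _ in range(n + 1)]
    return [f(x[i], x[i + 1]) for i in range(n)]

# A hypothetical choice of f giving a {0,1}-valued 1-dependent process.
print(two_block_factor(lambda a, b: int(a < b), 10))
```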

2.
A standard strategy in simulation, for comparing two stochastic systems, is to use a common sequence of random numbers to drive both systems. Since regenerative output analysis of the steady-state of a system requires that the process be regenerative, it is of interest to derive conditions under which the method of common random numbers yields a regenerative process. It is shown here that if the stochastic systems are positive recurrent Markov chains with countable state space, then the coupled system is necessarily regenerative; in fact, we allow couplings more general than those induced by common random numbers. An example is given which shows that the regenerative property can fail to hold in general state space, even if the individual systems are regenerative.
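A minimal sketch of the common-random-numbers coupling (the update rules phi1 and phi2 are hypothetical reflected random walks, not from the paper): a single uniform stream drives both chains, and the pair process is itself a Markov chain.

```python
import random

# One uniform stream U_1, U_2, ... drives both chains through their
# update functions; the bivariate process (X1_n, X2_n) is the coupled chain.

def phi1(x, u):
    return x + 1 if u < 0.4 else max(x - 1, 0)

def phi2(x, u):
    return x + 1 if u < 0.3 else max(x - 1, 0)

x1, x2 = 0, 0
for _ in range(20):
    u = random.random()          # the common random number
    x1, x2 = phi1(x1, u), phi2(x2, u)
    print((x1, x2))              # one step of the coupled system
```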

3.
4.
Let Q be a convex solid in R^n, partitioned into two volumes u and v by an area s. We show that s > min(u, v)/diam Q, and use this inequality to obtain the lower bound n^{-5/2} on the conductance of order Markov chains, which describe nearly uniform generators of linear extensions for posets of size n. We also discuss an application of the above results to the problem of sorting of posets.
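A LaTeX restatement of the two bounds in the abstract's notation (the conductance symbol \Phi is our label, not the paper's):

```latex
% Isoperimetric inequality for a convex solid Q in R^n, cut into
% volumes u and v by a surface of area s, and the resulting
% conductance bound for the order Markov chain on linear extensions
% of a poset of size n:
s \;>\; \frac{\min(u, v)}{\operatorname{diam} Q}
\qquad\Longrightarrow\qquad
\Phi \;\ge\; n^{-5/2}
```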

5.
6.
Let {X_n}_{n≥0} be an irreducible recurrent Markov chain on the nonnegative integers. A result of Chosid and Isaac (1978) gives a sufficient condition for n^{-1}R_n → 0 w.p.1, where R_n is the range of the chain. We give an alternative proof using Kingman's subadditive ergodic theorem (Kingman, 1973). Some examples are also given.
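A quick numerical illustration (ours, not from the paper) of the range R_n for a recurrent chain, using a simple symmetric random walk on the integers:

```python
import random

# R_n is the number of distinct states visited by time n. For the
# symmetric random walk on Z (recurrent), R_n grows like sqrt(n),
# so the ratio R_n / n tends to 0 almost surely.

def range_of_walk(n):
    pos, visited = 0, {0}
    for _ in range(n):
        pos += random.choice((-1, 1))
        visited.add(pos)
    return len(visited)

for n in (10**3, 10**4, 10**5):
    print(n, range_of_walk(n) / n)   # ratio shrinks as n grows
```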

7.
8.
The present paper proposes a non-homogeneous multivariate Markov manpower system in the general category of mathematical human resource planning. More specifically, we suggest a model which takes into account the divisions existing in an organization, categorizing its employees into several groups (departments). In this context, it considers not only possible transitions within departments (intra-department transitions) but also transfers of personnel between departments (inter-department transitions). Additionally, the proposed modeling structure is accompanied by cost and stock (personnel) objectives, which can subsequently be met by controlling the recruitment policy, the allocation policy for employees transferred to other departments, or both. We use a minmax fuzzy goal-programming approach, under different operating assumptions, to keep the operational cost below desired aspiration levels and to reach desirable stock structures in the presence of the system's constraints and regulations. The paper concludes with a numerical illustration.
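A generic minmax fuzzy goal program of the kind described, as a hedged sketch (the membership functions and constraint set are placeholders, not the paper's exact model):

```latex
% Each goal k (cost or stock target) has a membership function mu_k
% measuring its degree of attainment; the policy variables x
% (recruitment / allocation) maximize the worst-attained goal:
\max_{x}\ \lambda
\quad \text{s.t.} \quad
\mu_k(x) \ge \lambda, \quad k = 1, \dots, K,
\qquad x \in \mathcal{X}, \quad 0 \le \lambda \le 1
```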

9.
10.
Kingman and Williams [6] showed that a pattern of positive elements can occur in a transition matrix of a finite state, nonhomogeneous Markov chain if and only if it may be expressed as a finite product of reflexive and transitive patterns. In this paper we solve a similar problem for doubly stochastic chains. We prove that a pattern of positive elements can occur in a transition matrix of a doubly stochastic Markov chain if and only if it may be expressed as a finite product of reflexive, transitive, and symmetric patterns. We provide an algorithm for determining whether a given pattern may be expressed as a finite product of reflexive, transitive, and symmetric patterns. This result has implications for the embedding problem for doubly stochastic Markov chains. We also give an application of the obtained characterization to chain majorization.
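A minimal sketch of the ingredients (helper names are ours; this is not the paper's algorithm): a pattern as a 0/1 matrix, the Boolean product, and the three defining checks.

```python
# A "pattern" is the 0/1 matrix of positions allowed to be positive.

def bool_product(A, B):
    n = len(A)
    return [[int(any(A[i][k] and B[k][j] for k in range(n)))
             for j in range(n)] for i in range(n)]

def is_reflexive(A):
    return all(A[i][i] for i in range(len(A)))

def is_symmetric(A):
    n = len(A)
    return all(A[i][j] == A[j][i] for i in range(n) for j in range(n))

def is_transitive(A):
    AA = bool_product(A, A)      # transitive iff A o A <= A entrywise
    n = len(A)
    return all(AA[i][j] <= A[i][j] for i in range(n) for j in range(n))

# Example: a reflexive-transitive pattern times a symmetric one.
R = [[1, 1], [0, 1]]
S = [[0, 1], [1, 0]]
print(bool_product(R, S), is_transitive(R), is_symmetric(S))
```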

11.
12.
Let {S_n}_{n≥0} be a Harris-recurrent Markov chain on a measurable state space. We prove strong approximation results for the additive functionals of the chain.
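A hedged guess at the standard form of the additive functionals in question (an assumption on our part; the abstract does not display the formula):

```latex
% Additive functional of the Harris-recurrent chain {S_n}, in the
% usual convention, for a measurable function f:
S(f, n) \;=\; \sum_{k=0}^{n} f(S_k)
```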

13.
Two theorems on the existence of the potential of an ergodic Markov chain in an arbitrary phase space are proved. Translated from Ukrainskii Matematicheskii Zhurnal, Vol. 46, No. 4, pp. 446–449, April 1994.
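For intuition, a finite-state numerical sketch of the potential (the paper works in an arbitrary phase space; the transition matrix below is a toy example of ours):

```python
import numpy as np

# For an ergodic finite chain with stationary row vector pi, the
# fundamental matrix Z = (I - P + 1 pi)^(-1) plays the role of the
# potential: it solves the Poisson equation for the chain.

P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

Z = np.linalg.inv(np.eye(3) - P + np.outer(np.ones(3), pi))
print(Z)
```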

14.
15.
Necessary and sufficient conditions are given for the convergence of the first moment of functionals of Markov chains.

16.
17.
18.
19.
Let X be a chain with discrete state space I, and V be the matrix of entries V_{i,n}, where V_{i,n} denotes the position of the process immediately after the nth visit to i. We prove that the law of X is a mixture of laws of Markov chains if and only if the distribution of V is invariant under finite permutations within rows (i.e., the V_{i,n}'s are partially exchangeable in the sense of de Finetti). We also prove that an analogous statement holds true for mixtures of laws of Markov chains with a general state space and atomic kernels. Going back to the discrete case, we analyze the relationships between partial exchangeability of V and Markov exchangeability in the sense of Diaconis and Freedman. The main statement is that the former is stronger than the latter, but the two are equivalent under the assumption of recurrence. Combination of this equivalence with the aforesaid representation theorem gives the Diaconis and Freedman basic result for mixtures of Markov chains.
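A small sketch (ours) of how the matrix V is read off a sample path: row i collects the states occupied immediately after each successive visit to i.

```python
from collections import defaultdict

# V[i][n] is the position right after the n-th visit to state i,
# so row i is just the list of successors observed from i along the path.

def successor_matrix(path):
    V = defaultdict(list)
    for x, y in zip(path, path[1:]):
        V[x].append(y)
    return dict(V)

print(successor_matrix([0, 1, 0, 2, 1, 0, 1]))
# {0: [1, 2, 1], 1: [0, 0], 2: [1]} -- invariance of the law of V under
# within-row permutations is the partial exchangeability in the theorem.
```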

20.
Markov chains are often used as mathematical models of natural phenomena, with transition probabilities defined in terms of parameters that are of interest in the scientific question at hand. Sensitivity analysis is an important way to quantify the effects of changes in these parameters on the behavior of the chain. Many properties of Markov chains can be written as simple matrix expressions, and hence matrix calculus is a powerful approach to sensitivity analysis. Using matrix calculus, we derive the sensitivity and elasticity of a variety of properties of absorbing and ergodic finite-state chains. For absorbing chains, we present the sensitivities of the moments of the number of visits to each transient state, the moments of the time to absorption, the mean number of states visited before absorption, the quasistationary distribution, and the probabilities of absorption in each of several absorbing states. For ergodic chains, we present the sensitivity of the stationary distribution, the mean first passage time matrix, the fundamental matrix, and the Kemeny constant. We include two examples of application of the results to demographic and ecological problems.
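One representative computation of this type, as a hedged sketch (the identity dπ = π·dP·Z is a standard matrix-calculus result of the kind the paper derives; the example matrices are ours):

```python
import numpy as np

# Sensitivity of the stationary distribution pi to a perturbation dP of
# the transition matrix: d(pi) = pi @ dP @ Z, with Z = (I - P + 1 pi)^(-1).
# We verify the analytic formula against a finite difference.

def stationary(P):
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    return pi / pi.sum()

P = np.array([[0.9, 0.1], [0.4, 0.6]])
pi = stationary(P)
Z = np.linalg.inv(np.eye(2) - P + np.outer(np.ones(2), pi))

dP = np.array([[-1.0, 1.0], [0.0, 0.0]])   # perturb row 0; rows keep sum 1
print("analytic :", pi @ dP @ Z)
print("numeric  :", (stationary(P + 1e-6 * dP) - pi) / 1e-6)
```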
