Similar documents
20 similar documents retrieved.
1.
2.
3.
By a result of F. Hofbauer [11], piecewise monotonic maps of the interval can be identified with topological Markov chains with respect to measures with large entropy. We generalize this to arbitrary piecewise invertible dynamical systems under the following assumption: the total entropy of the system should be greater than the topological entropy of the boundary of some reasonable partition separating almost all orbits. We obtain a sufficient condition for these maps to have a finite number of invariant and ergodic probability measures with maximal entropy. We illustrate our results with an application to a class of multi-dimensional, non-linear, non-expansive smooth dynamical systems. Part of this work was done at Université Paris-Sud, dép. de mathématiques, Orsay.

4.
5.
The problem of multivariate information analysis is considered. First, the interaction information in each dimension is defined following McGill [4] and then applied to Markov chains. The property of vanishing interaction information is closely related to a certain class of weakly dependent random variables. For homogeneous, recurrent Markov chains with m states, m, n ≥ 3, the zero criterion for the n-dimensional interaction information is met only by (n − 2)-dependent Markov chains, which are generated by certain nilpotent matrices. Further, for Gaussian Markov chains, it gives a decomposition rule of the variables into mutually correlated subchains.
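For orientation only: for three variables the interaction information can be written (up to an overall sign, on which conventions differ between authors) as
$$I(X;Y;Z) \;=\; I(X;Y) - I(X;Y\mid Z),$$
and the n-dimensional quantity referred to above extends this construction to n variables; the "zero criterion" is the identical vanishing of that quantity.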

6.
Reversible Markov chains are the basis of many applications. However, computing transition probabilities by a finite sampling of a Markov chain can lead to truncation errors. Even if the original Markov chain is reversible, the approximated Markov chain might be non-reversible and will lose important properties, like the real-valued spectrum. In this paper, we show how to find the closest reversible Markov chain to a given transition matrix. It turns out that this matrix can be computed by solving a convex minimization problem.
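A minimal sketch of one convex formulation of this problem, assuming the stationary distribution pi is given and strictly positive; the function name, the use of cvxpy, and the Frobenius-norm objective are illustrative choices, not taken from the cited paper:

```python
import numpy as np
import cvxpy as cp

def closest_reversible(P, pi):
    """Nearest row-stochastic matrix to P (Frobenius norm) that is reversible
    with respect to the given stationary distribution pi (illustrative sketch)."""
    n = P.shape[0]
    D = np.diag(pi)
    X = cp.Variable((n, n), nonneg=True)
    constraints = [
        cp.sum(X, axis=1) == 1,   # rows sum to one
        D @ X == (D @ X).T,       # detailed balance: pi_i X_ij = pi_j X_ji
    ]
    problem = cp.Problem(cp.Minimize(cp.norm(X - P, "fro")), constraints)
    problem.solve()
    return X.value
```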

7.
The topological Markov chain, or subshift of finite type, is the restriction of the shift to an invariant subset determined by a 0,1-matrix, and it has important applications in the theory of dynamical systems. In this paper the topological Markov chain is discussed. First, we introduce the structure of a directed graph on a 0,1-matrix, and then, using it as a tool, we give equivalent conditions relating topological entropy, chaos, the nonwandering set, the set of periodic points and the 0,1-matrix involved. This work is supported in part by the Foundation of Advanced Research Centre, Zhongshan University.
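As a companion to the directed-graph viewpoint: for an irreducible 0,1-matrix the topological entropy of the associated subshift of finite type equals the logarithm of the spectral radius of the matrix. This is a standard fact quoted here only for illustration; the names below are not from the paper:

```python
import numpy as np

def sft_topological_entropy(A):
    """Topological entropy of the subshift of finite type defined by the
    0,1-matrix A: the log of the spectral radius of A (standard fact for
    irreducible A)."""
    radius = max(abs(np.linalg.eigvals(A.astype(float))))
    return float(np.log(radius))

# Golden-mean shift: the word "11" is forbidden.
A = np.array([[1, 1],
              [1, 0]])
print(sft_topological_entropy(A))  # log((1 + sqrt(5)) / 2) ≈ 0.4812
```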

8.
We show that for sufficiently large knapsacks the associated Markov chain on the state space of the admissible packings of the knapsack is rapidly mixing. Our condition basically states that at least half of all items should fit into the knapsack. This is much weaker than the condition assumed by Saloff-Coste (1997).
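For concreteness, a commonly used Markov chain on the admissible packings of a knapsack is the single-item insert/delete chain sketched below; the paper's chain and its mixing-time analysis may differ in details, so treat this purely as an illustration of the state space and the moves (all names are made up):

```python
import random

def knapsack_step(packing, weights, capacity, rng=random):
    """One step of the single-item insert/delete chain on admissible packings:
    pick an item uniformly at random and toggle its presence, provided the
    resulting packing still fits into the knapsack."""
    i = rng.randrange(len(weights))
    if i in packing:
        packing.remove(i)
    elif sum(weights[j] for j in packing) + weights[i] <= capacity:
        packing.add(i)
    return packing

weights, capacity = [3, 5, 2, 7, 1], 9   # toy instance
state = set()
for _ in range(10_000):                  # run the chain for a while
    state = knapsack_step(state, weights, capacity)
```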

9.
We investigate the zeros of a family of hypergeometric polynomials $M_n(x;\beta,c)=(\beta)_n\,{}_2F_1(-n,-x;\beta;1-\frac{1}{c})$, $n\in\mathbb{N}$, known as Meixner polynomials, that are orthogonal on $(0,\infty)$ with respect to a discrete measure for $\beta>0$ and $0<c<1$. When $\beta=-N$, $N\in\mathbb{N}$, and $c=\frac{p}{p-1}$, the polynomials $K_n(x;p,N)=(-N)_n\,{}_2F_1(-n,-x;-N;\frac{1}{p})$, $n=0,1,\ldots,N$, $0<p<1$, are referred to as Krawtchouk polynomials. We prove results on the zero location of the orthogonal polynomials $M_n(x;\beta,c)$, $c<0$ and $n<1-\beta$; the quasi-orthogonal polynomials $M_n(x;\beta,c)$, $-k<\beta<-k+1$, $k=1,\ldots,n-1$ and $0<c<1$ or $c>1$; as well as the polynomials $K_{n}(x;p,N)$ with non-Hermitian orthogonality for $0<p<1$ and $n=N+1,N+2,\ldots$. We also show that the polynomials $M_n(x;\beta,c)$, $\beta\in\mathbb{R}$, are real-rooted when $c\rightarrow 0$.

10.
We focus on continuous-time Markov chains as a model to describe the evolution of credit ratings. In this work it is checked whether a simple, tridiagonal type of generator provides a good approximation to a general one. Three different tridiagonal approximations are proposed and their performance is checked against two generators, corresponding to a volatile and a stable period, respectively.
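The three tridiagonal approximations compared in the paper are not reproduced here. The sketch below only makes the notion concrete with a naive baseline: truncate a generator to its first off-diagonals and rebuild the diagonal so that every row again sums to zero (an assumed formulation, not the paper's):

```python
import numpy as np

def tridiagonal_generator(Q):
    """Naive tridiagonal approximation of a CTMC generator: keep only the
    first sub- and super-diagonal rates and rebuild the diagonal so that
    every row sums to zero, which yields a valid generator again."""
    n = Q.shape[0]
    T = np.zeros((n, n))
    for i in range(n):
        for j in (i - 1, i + 1):
            if 0 <= j < n:
                T[i, j] = max(Q[i, j], 0.0)
        T[i, i] = -T[i].sum()
    return T
```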

11.
Markov chain theory is proving to be a powerful approach to bootstrapping finite-state processes, especially where the time dependence is non-linear. In this work we extend the approach to bootstrapping discrete-time, continuous-valued processes. To this purpose we solve a minimization problem to partition the state space of a continuous-valued process into a finite number of intervals or unions of intervals (i.e. its states) and to identify the time lags which provide "memory" to the process. A distance is used as the objective function to encourage the clustering of states having similar transition probabilities. The problem of the exploding number of alternative partitions in the solution space (which grows with the number of states and the order of the Markov chain) is addressed through a Tabu Search algorithm. The method is applied to bootstrap the series of German and Spanish electricity prices. The analysis of the results confirms the good consistency properties of the proposed method.
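A bare-bones sketch of the underlying idea (first-order chain, equiprobable quantile bins); the optimised partition and the lag selection via Tabu Search described above are deliberately omitted, and every name below is illustrative:

```python
import numpy as np

def markov_bootstrap(series, n_states=5, length=None, seed=None):
    """Bare-bones first-order Markov bootstrap of a continuous-valued series:
    discretise into quantile bins, estimate the transition matrix, simulate a
    state path, and draw values from the historical observations of each state."""
    rng = np.random.default_rng(seed)
    series = np.asarray(series, dtype=float)
    length = length or len(series)
    edges = np.quantile(series, np.linspace(0, 1, n_states + 1)[1:-1])
    states = np.digitize(series, edges)             # state label of each observation
    counts = np.full((n_states, n_states), 1e-9)    # tiny prior avoids empty rows
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    P = counts / counts.sum(axis=1, keepdims=True)  # estimated transition matrix
    pools = [series[states == s] for s in range(n_states)]
    s, out = states[-1], []
    for _ in range(length):
        s = rng.choice(n_states, p=P[s])
        out.append(rng.choice(pools[s]) if len(pools[s]) else series.mean())
    return np.array(out)
```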

12.
13.
This paper introduces the Markov chain model as a simple tool for analyzing the pattern of financial asset holdings over time. The model is based on transition probabilities which give the probability of switching $1 of wealth from one asset to another. An illustrative application is provided.
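A toy illustration of how such transition probabilities act on an allocation of wealth; the three assets and all numbers below are hypothetical and not taken from the paper:

```python
import numpy as np

# Hypothetical matrix: entry (i, j) is the probability that $1 currently held
# in asset i is switched to asset j in the next period.
P = np.array([[0.90, 0.07, 0.03],   # stocks
              [0.10, 0.85, 0.05],   # bonds
              [0.20, 0.20, 0.60]])  # cash

shares = np.array([0.5, 0.3, 0.2])  # current split of wealth across the assets
for _ in range(4):                  # evolve the allocation over four periods
    shares = shares @ P
print(shares.round(3))
```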

14.
15.
A batch Markov arrival process (BMAP) X* = (N, J) is a two-dimensional Markov process with two components: the counting process N and the phase process J. It is proved that the phase process is a time-homogeneous Markov chain with a finite state space (for short, a Markov chain). In this paper a new, inverse problem is posed: given a Markov chain J, can we construct a process N such that the two-dimensional process X* = (N, J) is a BMAP? Such a process X* = (N, J) is called an adjoining BMAP for the Markov chain J. For a given Markov chain, adjoining processes exist and are not unique. Two kinds of adjoining BMAPs are constructed: BMAPs with fixed constant batches, and BMAPs with independent and identically distributed (i.i.d.) random batches. The method used here is not the usual matrix-analytic method for studying BMAPs but a path-analytic one: sample paths of the adjoining BMAPs are constructed directly. The characteristic matrices (D_k, k = 0, 1, 2, ...) and the transition probabilities of the adjoining BMAP are expressed in terms of the density matrix Q of the given Markov chain J. Two further theorems are also obtained, and these expressions are presented here for the first time.
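One way to make the fixed-constant-batch case concrete is sketched below: declare every jump of the phase chain J to be an arrival of batch size c, so that D_0 collects the diagonal of Q, D_c its off-diagonal part, and the matrices sum to Q. This is only a sketch consistent with the description above, not necessarily the paper's exact construction:

```python
import numpy as np

def adjoining_bmap_constant_batch(Q, c):
    """Illustrative adjoining BMAP with fixed batch size c for a phase chain J
    with density (generator) matrix Q: every transition of J triggers a batch
    of size c, so D_0 = diag(Q), D_c = Q - diag(Q), and D_k = 0 otherwise.
    Then sum_k D_k = Q, which keeps J as the phase process."""
    Q = np.asarray(Q, dtype=float)
    D0 = np.diag(np.diag(Q))
    return {0: D0, c: Q - D0}
```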

16.
In this paper we study the flux through a finite Markov chain of a quantity, which we call mass, that moves through the states of the chain according to the Markov transition probabilities. Mass is supplied by an external source and accumulates in the absorbing states of the chain. We believe that studying how this conserved quantity evolves through the transient (non-absorbing) states of the chain could be useful for modelling open systems whose dynamics has a Markov property.
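A minimal discrete-time sketch of this bookkeeping: the external source injects mass into the transient states at every step, all mass then moves according to the transition matrix, and mass reaching an absorbing state stays there (absorbing rows of P are identity rows). The discrete-time setting and all names are assumptions made for illustration:

```python
import numpy as np

def propagate_mass(P, source, steps):
    """Evolve a conserved mass vector: move the mass according to P and inject
    `source` at every step.  If absorbing states have identity rows in P, the
    mass accumulates there; the total injected mass is conserved."""
    mass = np.zeros(P.shape[0])
    for _ in range(steps):
        mass = mass @ P + source
    return mass

# Toy chain: states 0 and 1 are transient, state 2 is absorbing.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.0, 0.0, 1.0]])
source = np.array([1.0, 0.0, 0.0])   # one unit of mass injected into state 0 per step
print(propagate_mass(P, source, 100))
```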

17.
18.
The authors continue to study generalized coherent states for oscillator-like systems connected with a given family of orthogonal polynomials. In this work, we consider oscillators connected with the Meixner and Meixner–Pollaczek polynomials and define generalized coherent states for these oscillators. A completeness condition for these states is proved by solving a related classical moment problem. The results are compared with those of other authors. In particular, we show that the Hamiltonian of the relativistic model of a linear harmonic oscillator can be treated as the linearization of a quadratic Hamiltonian which arises naturally in our formalism. Bibliography: 56 titles. The authors dedicate this work to their friend and colleague P. P. Kulish on the occasion of his 60th birthday. Translated from Zapiski Nauchnykh Seminarov POMI, Vol. 317, 2004, pp. 66–93.

19.
Observation and statistics of hierarchical-model Markov chains
For continuous-time hierarchical-model Markov chains, all transition rates are uniquely determined by the sojourn-time and hitting-time distributions of the lowest-level states, so the statistical properties of the whole Markov chain are determined by the statistics of these distributions. Corresponding algorithms and numerical examples are given.

20.
Classifying the states of a finite Markov chain requires the identification of all irreducible closed sets and of the set of transient states. This paper presents an algorithm for identifying these states that executes in time O(max(|V|, |E|)), where |V| is the number of states and |E| is the number of positive entries in the Markov matrix. The algorithm finds the closed strongly connected components of the transition graph using a depth-first search.
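A sketch of the classification described above, using Kosaraju's two-pass depth-first search for the strongly connected components; an SCC with no outgoing edge is an irreducible closed set, and every other state is transient. The implementation is illustrative rather than the paper's own:

```python
def classify_states(P):
    """Classify the states of a finite Markov chain with transition matrix P:
    build the transition graph, find its strongly connected components
    (Kosaraju's two-pass DFS), keep the components with no outgoing edge as
    the irreducible closed sets, and report every other state as transient."""
    n = len(P)
    adj = [[j for j in range(n) if P[i][j] > 0] for i in range(n)]
    radj = [[] for _ in range(n)]
    for i in range(n):
        for j in adj[i]:
            radj[j].append(i)

    order, seen = [], [False] * n
    def dfs1(v):                      # first pass: record finishing order
        seen[v] = True
        stack = [(v, iter(adj[v]))]
        while stack:
            u, it = stack[-1]
            for w in it:
                if not seen[w]:
                    seen[w] = True
                    stack.append((w, iter(adj[w])))
                    break
            else:
                order.append(u)
                stack.pop()
    for v in range(n):
        if not seen[v]:
            dfs1(v)

    comp = [-1] * n
    def dfs2(v, label):               # second pass: label SCCs on the reverse graph
        comp[v] = label
        stack = [v]
        while stack:
            u = stack.pop()
            for w in radj[u]:
                if comp[w] == -1:
                    comp[w] = label
                    stack.append(w)
    ncomp = 0
    for v in reversed(order):
        if comp[v] == -1:
            dfs2(v, ncomp)
            ncomp += 1

    closed = [True] * ncomp           # a closed SCC has no edge leaving it
    for i in range(n):
        for j in adj[i]:
            if comp[i] != comp[j]:
                closed[comp[i]] = False

    irreducible_closed = [[v for v in range(n) if comp[v] == k]
                          for k in range(ncomp) if closed[k]]
    transient = [v for v in range(n) if not closed[comp[v]]]
    return irreducible_closed, transient

P = [[0.5, 0.5, 0.0],
     [0.2, 0.8, 0.0],
     [0.3, 0.3, 0.4]]
print(classify_states(P))   # ([[0, 1]], [2])
```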
