Similar literature
20 similar documents retrieved (search time: 343 ms)
1.
A rigorous definition of the semi-Markov dependent risk model is given. This model generalizes the Markov dependent risk model. A criterion and necessary conditions for the semi-Markov dependent risk model are obtained. The results clarify the relations between the elements of the semi-Markov dependent risk model and are also applicable to the Markov dependent risk model.
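For orientation, here is one standard way such a model is often formulated (a hedged sketch with illustrative notation, not the definition given in the paper): the surplus process is

$$R(t) \;=\; u + \int_0^t c_{J(s)}\,ds \;-\; \sum_{k=1}^{N(t)} X_k,$$

where $u$ is the initial capital, $J$ is a semi-Markov environment process, $c_j$ is the premium rate while the environment is in state $j$, $N(t)$ counts the claims up to time $t$, and the claim sizes $X_k$ and inter-claim times are allowed to depend on the environment state at the corresponding epochs. When the sojourn times of $J$ are exponential, the Markov dependent risk model is recovered.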

2.
Limit theorems for functionals of classical (homogeneous) Markov renewal and semi-Markov processes have been known for a long time, since the pioneering work of Pyke and Schaufele (Limit theorems for Markov renewal processes, Ann. Math. Statist., 35(4):1746–1764, 1964). Since then, these processes, as well as their time-inhomogeneous generalizations, have found many applications, for example, in finance and insurance. To the best of the authors' knowledge, however, no limit theorems have been obtained to date for functionals of inhomogeneous Markov renewal and semi-Markov processes. In this article, we provide strong law of large numbers and central limit theorem results for such processes. In particular, we make an important connection between our results and the theory of ergodicity of inhomogeneous Markov chains. Finally, we provide an application to risk processes used in insurance by considering an inhomogeneous semi-Markov version of the well-known continuous-time Markov chain model, widely used in the literature.

3.
It is well known that under a Markov model the sojourn time in each state follows an exponential distribution. This is often too restrictive, especially for repairable power-plant systems, where it fits the operational data poorly. We therefore propose using a semi-Markov process to study repairable power-plant systems, and apply Markov renewal theory and probabilistic analysis to derive the steady-state availability of the system. Finally, a numerical example illustrates the results obtained.
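As a minimal illustration of the kind of computation described above (not the paper's model), the stationary distribution of a finite semi-Markov process can be obtained from the embedded chain and the mean sojourn times, and the steady-state availability is the stationary probability of the "up" states. The transition matrix, sojourn means, and state labels below are invented for the example.

```python
import numpy as np

# Embedded-chain transition matrix of a hypothetical 3-state repairable system:
# state 0 = working, state 1 = degraded, state 2 = under repair.
P = np.array([[0.0, 0.7, 0.3],
              [0.2, 0.0, 0.8],
              [1.0, 0.0, 0.0]])

# Mean sojourn times in each state (arbitrary units); in a semi-Markov model
# these may come from any distribution (Weibull, lognormal, ...), not only the
# exponential distribution forced by a Markov model.
m = np.array([100.0, 20.0, 5.0])

# Stationary distribution nu of the embedded Markov chain: nu = nu P, sum(nu) = 1.
eigvals, eigvecs = np.linalg.eig(P.T)
nu = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
nu = nu / nu.sum()

# The time-stationary distribution of the semi-Markov process weights each
# state by its mean sojourn time.
pi = nu * m / (nu @ m)

up_states = [0, 1]                     # states counted as "available" (an assumption)
availability = pi[up_states].sum()
print(f"steady-state availability ≈ {availability:.4f}")
```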

4.
A continuous semi-Markov process with values in a closed interval is considered. This process coincides with a Markov diffusion process inside the interval. Thus, violation of the Markov property is only possible at the boundary of the interval. We prove a sufficient condition under which a semi-Markov process is Markov. We show that, in addition to Markov processes with instantaneous reflection from the boundary of the interval, there exists a class of Markov processes with delayed reflection from the boundary. Such a process has a positive average measure of time at which its trajectory belongs to the boundaries. This gives a different proof of a similar result obtained by Gikhman and Skorokhod in 1968. Bibliography: 5 titles.

5.
On dispatching unequally capable service technicians
A common problem that faces many high-tech electronic product companies is how to effectively provide systems support for their products/machines/systems installed at various customer sites using different levels of technical personnel. The problem can be viewed as a model with multiple-level servers and multiple-volume demands. In this paper, a semi-Markov decision process with the average-cost criterion is employed to formulate the dynamics of the system. Optimal policies for the human resource allocation decision with minimum costs are derived, and the loss rate for customer requests is considered. We prove that the embedded Markov chain induced by any stationary deterministic policy is ergodic. We obtain some related parameters which can answer questions from managers. We also provide some numerical examples.
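As background (standard semi-Markov decision theory, not a formula quoted from the paper), the long-run average-cost criterion used for such dispatching problems can be written, for a stationary policy $\pi$ whose embedded chain is ergodic with stationary distribution $\nu^{\pi}$, as

$$g(\pi)\;=\;\frac{\sum_{i}\nu^{\pi}(i)\,c^{\pi}(i)}{\sum_{i}\nu^{\pi}(i)\,\tau^{\pi}(i)},$$

where $c^{\pi}(i)$ is the expected cost incurred and $\tau^{\pi}(i)$ the expected sojourn time until the next decision epoch when the system is in state $i$ and the dispatching action prescribed by $\pi$ is taken. The ergodicity of the embedded chain, proved in the paper, is what makes this ratio well defined for every stationary deterministic policy, and the optimal policy minimizes $g(\pi)$.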

6.
This work studies the threshold dynamics and ergodicity of a stochastic SIRS epidemic model with the disease transmission rate driven by a semi-Markov process. The semi-Markov process used in this paper for describing a randomly changing environment is a very large extension of the most common Markov regime-switching process. We define a basic reproduction number for the semi-Markov regime-switching environment and show that its position with respect to 1 determines the extinction or persistence of the disease. In the case of disease persistence, we give mild sufficient conditions for ensuring the existence and absolute continuity of the invariant probability measure. Under the same conditions, we also prove the global attractivity of the Ω-limit set of the system and the convergence in total variation norm of the transition probability to the invariant measure. Compared with the existing results for the Markov regime-switching environment, our generalized results require almost no additional conditions.
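To make the modelling idea concrete, here is a minimal simulation sketch of an SIRS system whose transmission rate is switched by a two-state semi-Markov environment with non-exponential holding times. This is not the authors' model: the skeleton below is deterministic between switches (its only randomness comes from the environment), and all rates, distributions, and the Euler step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two environmental regimes with different transmission rates beta.
beta = {0: 0.30, 1: 0.05}
gamma, mu, delta = 0.10, 0.01, 0.02      # recovery, demography, immunity loss
P_env = np.array([[0.0, 1.0],            # embedded chain of the environment
                  [1.0, 0.0]])

def holding_time(state):
    # Non-exponential sojourns are what make the environment semi-Markov
    # (Weibull here, purely for illustration).
    shape = 1.8 if state == 0 else 0.7
    return rng.weibull(shape) * 30.0

def simulate(T=2000.0, dt=0.01):
    S, I, R = 0.99, 0.01, 0.0
    t, env = 0.0, 0
    next_switch = holding_time(env)
    traj = []
    while t < T:
        if t >= next_switch:              # semi-Markov regime switch
            env = rng.choice(2, p=P_env[env])
            next_switch = t + holding_time(env)
        b = beta[env]
        dS = mu - b * S * I - mu * S + delta * R
        dI = b * S * I - (gamma + mu) * I
        dR = gamma * I - (mu + delta) * R
        S, I, R = S + dS * dt, I + dI * dt, R + dR * dt
        traj.append((t, env, I))
        t += dt
    return traj

traj = simulate()
print("final infectious fraction:", traj[-1][2])
```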

7.
We study stochastic processes with age-dependent transition rates. A typical example of such a process is a semi-Markov process, which is completely determined by the holding time distributions in each state and the transition probabilities of the embedded Markov chain. The process we construct generalizes semi-Markov processes. One important feature of this process is that, unlike semi-Markov processes, its transition probabilities are age-dependent. Under certain conditions we establish the Feller property of the process. Finally, we compute the limiting distribution of the process.

8.
We study asymptotic averaging and diffusion approximation schemes for semi-Markov queuing systems by a random evolution approach, using the compensating operator of the corresponding extended Markov renewal process. These results generalize known results for queuing systems with Markov and renewal flows.

9.
Mixtures of recurrent semi-Markov processes are characterized through a partial exchangeability condition of the array of successor states and holding times. A stronger invariance condition on the joint law of successor states and holding times leads to mixtures of Markov laws.

10.
This paper concerns the study of asymptotic properties of the maximum likelihood estimator (MLE) for the general hidden semi-Markov model (HSMM) with backward recurrence time dependence. By transforming the general HSMM into a general hidden Markov model, we prove that under some regularity conditions, the MLE is strongly consistent and asymptotically normal. We also provide useful expressions for asymptotic covariance matrices, involving the MLE of the conditional sojourn times and the embedded Markov chain of the hidden semi-Markov chain. Bibliography: 17 titles.

11.
Optimization, 2012, 61(3–4):367–382
This paper investigates discrete type shock semi-Markov decision processes (SMDPs for short) with Borel state and action spaces. The discrete type shock SMDP describes a system which behaves like a discrete type SMDP, except that the system is subject to random shocks from its environment. Following each shock, an instantaneous state transition occurs and the parameters of the SMDP are changed. After presenting the model, we transform the discrete type shock SMDP into an equivalent discrete time Markov decision process under the condition that one of the assumptions P, N, or D holds. Thus most results from discrete time Markov decision processes can be generalized directly to hold for the discrete type shock SMDP.

12.
In this paper we show how to construct efficient migration models for the study of the credit risk problems presented in Jarrow et al. (Rev Financ Stud 10:481–523, 1997) in a Markov environment. Recently, semi-Markov processes were introduced into migration models (D'Amico et al., Decis Econ Finan 28:79–93, 2005a). The introduction of semi-Markov processes makes it possible to overcome some of the Markov constraints, because transition probabilities are allowed to depend on the duration spent inside a rating category. In this paper, it is shown how to take into account simultaneously the backward and forward processes at the beginning and at the end of the period over which the credit risk model is observed. With such a generalization, it is possible to consider what happens inside that period, after the first transition and before the last one. This paper generalizes earlier papers. The model is presented in a discrete time environment.
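As a hedged illustration of why duration dependence matters in rating migration (a toy Monte Carlo example, not the model of the paper): in a semi-Markov migration model the next rating is drawn from the embedded chain, while the time spent in the current rating follows an arbitrary, rating-specific distribution. All ratings, matrices, distributions, and the default horizon below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
ratings = ["A", "B", "D"]                 # "D" = default, absorbing

# Embedded-chain transition probabilities (toy numbers).
P = np.array([[0.0, 0.9, 0.1],
              [0.6, 0.0, 0.4],
              [0.0, 0.0, 1.0]])

def sojourn(r):
    # Rating-specific, non-exponential waiting times (gamma here, an arbitrary
    # choice); this duration dependence is what the semi-Markov model adds.
    shape, scale = (2.5, 2.0) if r == 0 else (1.2, 1.0)
    return rng.gamma(shape, scale)

def default_prob(horizon=5.0, start=0, n=50_000):
    defaults = 0
    for _ in range(n):
        t, r = 0.0, start
        while t < horizon and r != 2:
            t += sojourn(r)                 # stay in the current rating
            if t < horizon:
                r = rng.choice(3, p=P[r])   # then move via the embedded chain
        defaults += (r == 2)
    return defaults / n

print("P(default within 5 years | start in A) ≈", default_prob())
```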

13.
The context of planned preventive maintenance lends itself readily to probabilistic modelling. Indeed, many of the published theoretical models to be found in the literature adopt a Markov approach, where states are usually ‘operating’, ‘operating at one of several levels of deterioration’, and ‘failed’. However, most of these models assume the required Markovian property and do not address the issue of testing the assumption, or the related task of estimating parameters. It is possible that data are inadequate to test the assumption, or that the Markov property is believed to be not strictly valid, but acceptable as an approximation. In this paper we consider, within a specific inspection–maintenance context, the robustness of a Markov-based model when the Markov assumption is not valid. This is achieved by comparing the output of an exact delay time model of an inspection–maintenance problem with that of a semi-Markov approximation. The importance of establishing the validity of the Markov property in the modelling application is highlighted. If the plant behaviour is seen to be nearly Markov, in the case considered the semi-Markov model gives a good approximation to the exact model. Conversely, if the Markov assumption is not a good approximation, the semi-Markov model can lead to inappropriate advice.

14.
We propose a computational approach for implementing discrete hidden semi-Markov chains. A discrete hidden semi-Markov chain is composed of a non-observable or hidden process which is a finite semi-Markov chain and a discrete observable process. Hidden semi-Markov chains possess both the flexibility of hidden Markov chains for approximating complex probability distributions and the flexibility of semi-Markov chains for representing temporal structures. Efficient algorithms for computing characteristic distributions organized according to the intensity, interval and counting points of view are described. The proposed computational approach in conjunction with statistical inference algorithms previously proposed makes discrete hidden semi-Markov chains a powerful model for the analysis of samples of non-stationary discrete sequences. Copyright © 1999 John Wiley & Sons, Ltd.
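To fix ideas about the model structure, here is a generative sketch under invented parameters (two hidden states, three output symbols, shifted-Poisson occupancy distributions), not the algorithms described in the paper: a discrete hidden semi-Markov chain draws a hidden state from an embedded chain, holds it for a duration drawn from a state-specific occupancy distribution, and emits one discrete observation per time step from a state-specific output distribution.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hidden semi-Markov chain with 2 states and 3 output symbols (toy parameters).
A = np.array([[0.0, 1.0],        # embedded transition matrix (no self-loops:
              [1.0, 0.0]])       # state durations are modelled explicitly instead)
B = np.array([[0.7, 0.2, 0.1],   # state-conditional output distributions
              [0.1, 0.3, 0.6]])
initial = np.array([0.5, 0.5])

def duration(state):
    # State occupancy distributions: shifted Poisson here, but any discrete
    # law with support on {1, 2, ...} may be used.
    mean_stay = 4 if state == 0 else 8
    return 1 + rng.poisson(mean_stay - 1)

def sample(length=50):
    obs, states = [], []
    s = rng.choice(2, p=initial)
    while len(obs) < length:
        d = duration(s)                       # how long to stay in state s
        for _ in range(min(d, length - len(obs))):
            states.append(s)
            obs.append(rng.choice(3, p=B[s])) # one observation per time step
        s = rng.choice(2, p=A[s])             # next hidden state
    return states, obs

states, obs = sample()
print(obs)
```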

15.
This article addresses the estimation of hidden semi-Markov chains from nonstationary discrete sequences. Hidden semi-Markov chains are particularly useful to model the succession of homogeneous zones or segments along sequences. A discrete hidden semi-Markov chain is composed of a nonobservable state process, which is a semi-Markov chain, and a discrete output process. Hidden semi-Markov chains generalize hidden Markov chains and enable the modeling of various durational structures. From an algorithmic point of view, a new forward-backward algorithm is proposed whose complexity is similar to that of the Viterbi algorithm in terms of sequence length (quadratic in the worst case in time and linear in space). This opens the way to the maximum likelihood estimation of hidden semi-Markov chains from long sequences. This statistical modeling approach is illustrated by the analysis of branching and flowering patterns in plants.

16.
In this paper, we consider a class of semi-Markov processes, known as phase semi-Markov processes, which can be considered as an extension of Markov processes, but whose times between transitions are phase-type random variables. Based on the theory of generalized inverses, we derive expressions for the moments of the first-passage time distributions, generalizing the results obtained by Kemeny and Snell (1960) for Markov chains.
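For context, the Markov chain result of Kemeny and Snell (1960) that the paper generalizes can be sketched numerically: for an ergodic chain with transition matrix P and stationary distribution π, the fundamental matrix Z = (I − P + 1π)⁻¹ gives the mean first-passage times m_{ij} = (z_{jj} − z_{ij})/π_j. The chain below is invented; the paper's phase-type extension via generalized inverses is not reproduced here.

```python
import numpy as np

# Toy ergodic Markov chain.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
n = P.shape[0]

# Stationary distribution pi: pi = pi P, sum(pi) = 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

# Fundamental matrix Z = (I - P + 1 pi)^(-1)  (Kemeny & Snell).
Z = np.linalg.inv(np.eye(n) - P + np.outer(np.ones(n), pi))

# Mean first-passage times m_ij = (z_jj - z_ij) / pi_j.
# Diagonal entries come out 0 here; mean recurrence times are 1/pi_j.
M = (np.diag(Z)[None, :] - Z) / pi[None, :]
print(M)
```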

17.
We propose a system approach to the asymptotic analysis of stochastic systems in the scheme of series with averaging and diffusion approximation. Stochastic systems are defined by Markov processes with locally independent increments in a Euclidean space with random switchings that are described by jump Markov and semi-Markov processes. We use the asymptotic analysis of Markov and semi-Markov random evolutions. We construct the diffusion approximation using the asymptotic decomposition of generating operators and solutions of singular perturbation problems for reducibly invertible operators. Translated from Ukrains'kyi Matematychnyi Zhurnal, Vol. 57, No. 9, pp. 1235–1252, September 2005.

18.
Markov chain usage models have been used successfully to model systems and software. The most prominent approaches are the so-called failure state models of Whittaker and Thomason (1994) and the arc-based Bayesian models of Sayre and Poore (2000). In this paper we propose arc-based semi-Markov usage models for testing systems. We extend previous studies that rely on the Markov chain assumption to the more general semi-Markovian setting. Among the obtained results we give a closed-form representation of the first and second moments of the single-use reliability. The model and the validity of the results are illustrated through a numerical example.
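For background on the quantity whose moments are derived in the paper, the single-use reliability of a Markov chain usage model with a failure state can be computed by standard absorption analysis. The sketch below uses an invented usage chain with per-arc failure mass folded into transitions to the failure state; in the semi-Markov extension studied in the paper, holding-time distributions would additionally be attached to the arcs.

```python
import numpy as np

# Usage model: states 0 = invoke, 1 = menu, 2 = terminate (success), 3 = failure.
# All transition probabilities are invented for illustration.
P = np.array([[0.00, 0.97, 0.00, 0.03],
              [0.20, 0.50, 0.28, 0.02],
              [0.00, 0.00, 1.00, 0.00],
              [0.00, 0.00, 0.00, 1.00]])

transient = [0, 1]
absorbing = [2, 3]

Q = P[np.ix_(transient, transient)]   # transient -> transient
R = P[np.ix_(transient, absorbing)]   # transient -> absorbing

# Absorption probabilities B[i, k] = P(absorbed in absorbing state k | start in i),
# obtained by solving (I - Q) B = R.
B = np.linalg.solve(np.eye(len(transient)) - Q, R)

single_use_reliability = B[0, 0]      # reach "terminate" from "invoke" without failing
print(f"single-use reliability ≈ {single_use_reliability:.4f}")
```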

19.
A semi-Markov process is easily made Markov by adding some auxiliary random variables. This paper discusses the I-type quasi-stationary distributions of such “extended” processes and the α-invariant distributions for the corresponding Markov transition probabilities; we show that there is an intimate relation between the two. The results have relevance in the study of the time to “absorption” or “death” of semi-Markov processes. The particular case of a terminating renewal process is studied as an example.

20.
We propose an approach to the proof of the weak convergence of a semi-Markov process to a Markov process under certain conditions imposed on local characteristics of the semi-Markov process.

