Similar Articles
20 similar articles found (search time: 15 ms)
1.
In this work, we investigate sequential Bayesian estimation for inference of stochastic volatility with variance‐gamma (SVVG) jumps in returns. We develop an estimation algorithm that combines the sequential learning auxiliary particle filter with the particle learning filter. Simulation evidence and empirical estimation results indicate that this approach is able to filter latent variances, identify latent jumps in returns, and provide sequential learning about the static parameters of SVVG. We demonstrate the comparative performance of the sequential algorithm and off‐line Markov chain Monte Carlo in synthetic and real data applications.
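The filtering step can be illustrated with a bootstrap particle filter for a basic log-variance stochastic volatility model. This is a simplified stand-in for the SVVG model: the model equations, parameter values, and function names below are illustrative assumptions, not the authors' specification.

```python
import math
import random

# Toy SV model (illustrative only):
#   h_t = MU + PHI * (h_{t-1} - MU) + TAU * eta_t,  eta_t ~ N(0, 1)
#   y_t = exp(h_t / 2) * eps_t,                     eps_t ~ N(0, 1)
MU, PHI, TAU = -1.0, 0.95, 0.2

def simulate(T, seed=0):
    rng = random.Random(seed)
    h, ys = MU, []
    for _ in range(T):
        h = MU + PHI * (h - MU) + TAU * rng.gauss(0, 1)
        ys.append(math.exp(h / 2) * rng.gauss(0, 1))
    return ys

def particle_filter(ys, n=500, seed=1):
    """Bootstrap particle filter: returns estimates of E[h_t | y_1..y_t]."""
    rng = random.Random(seed)
    sd0 = TAU / math.sqrt(1 - PHI ** 2)      # stationary std of h
    parts = [MU + sd0 * rng.gauss(0, 1) for _ in range(n)]
    filtered = []
    for y in ys:
        # 1) propagate particles through the state equation
        parts = [MU + PHI * (h - MU) + TAU * rng.gauss(0, 1) for h in parts]
        # 2) weight by the observation density N(y; 0, exp(h))
        w = [math.exp(-0.5 * y * y * math.exp(-h) - 0.5 * h) for h in parts]
        s = sum(w)
        w = [wi / s for wi in w]
        filtered.append(sum(wi * h for wi, h in zip(w, parts)))
        # 3) multinomial resampling
        parts = rng.choices(parts, weights=w, k=n)
    return filtered
```

Sequential learning of the static parameters (the particle-learning ingredient) would additionally carry sufficient statistics with each particle; that part is omitted here for brevity.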

2.
We propose a Bayesian approach for inference in the multivariate probit model, taking into account the association structure between binary observations. We model the association through the correlation matrix of the latent Gaussian variables. Conditional independence is imposed by setting some off-diagonal elements of the inverse correlation matrix to zero and this sparsity structure is modeled using a decomposable graphical model. We propose an efficient Markov chain Monte Carlo algorithm relying on a parameter expansion scheme to sample from the resulting posterior distribution. This algorithm updates the correlation matrix within a simple Gibbs sampling framework and allows us to infer the correlation structure from the data, generalizing methods used for inference in decomposable Gaussian graphical models to multivariate binary observations. We demonstrate the performance of this model and of the Markov chain Monte Carlo algorithm on simulated and real datasets. This article has online supplementary materials.
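As a minimal illustration of the data-augmentation idea underlying such samplers, the Albert–Chib Gibbs scheme for an intercept-only univariate probit model alternates between latent Gaussian draws and the regression parameter. This is a textbook sketch under a flat prior, not the authors' parameter-expanded multivariate algorithm; all names are hypothetical.

```python
import math
import random

def probit_gibbs(y, iters=1500, seed=0):
    """Albert-Chib data-augmentation Gibbs sampler for an intercept-only
    probit model, P(y_i = 1) = Phi(beta), under a flat prior on beta."""
    rng = random.Random(seed)
    n = len(y)
    beta, draws = 0.0, []
    for _ in range(iters):
        # z_i | y_i, beta ~ N(beta, 1) truncated to (0, inf) if y_i = 1,
        # else (-inf, 0); sampled here by simple rejection
        z = []
        for yi in y:
            while True:
                zi = rng.gauss(beta, 1)
                if (zi > 0) == (yi == 1):
                    z.append(zi)
                    break
        # beta | z ~ N(mean(z), 1/n) under the flat prior
        beta = rng.gauss(sum(z) / n, 1 / math.sqrt(n))
        draws.append(beta)
    return draws
```

The multivariate version of the paper replaces the scalar latent draw with a draw from a truncated multivariate Gaussian and adds a correlation-matrix update.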

3.
This article proposes a new approach for Bayesian and maximum likelihood parameter estimation for stationary Gaussian processes observed on a large lattice with missing values. We propose a Markov chain Monte Carlo approach for Bayesian inference, and a Monte Carlo expectation-maximization algorithm for maximum likelihood inference. Our approach uses data augmentation and circulant embedding of the covariance matrix, and provides likelihood-based inference for the parameters and the missing data. Using simulated data and an application to satellite sea surface temperatures in the Pacific Ocean, we show that our method provides accurate inference on lattices of sizes up to 512 × 512, and is competitive with two popular methods: composite likelihood and spectral approximations.
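A one-dimensional sketch of the circulant-embedding ingredient follows; the article works with two-dimensional lattices and adds data augmentation for missing values, and the naive O(n²) DFT below is for illustration only.

```python
import cmath
import math
import random

def dft(x):
    """Naive discrete Fourier transform (illustration only)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * math.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def circulant_embedding_sample(cov, rng):
    """Draw one sample of a stationary Gaussian vector with covariance
    c(|i - j|) = cov[|i - j|], by embedding it in a circulant matrix
    whose eigenvalues are the DFT of its first row."""
    m = len(cov)
    first = cov + cov[-2:0:-1]           # circulant first row, length 2m - 2
    n = len(first)
    lam = [v.real for v in dft(first)]   # eigenvalues (real by symmetry)
    assert min(lam) > -1e-9, "embedding is not nonnegative definite"
    # spectral-domain complex Gaussians; the real part of the inverse
    # transform has exactly the target covariance
    xi = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
    w = [math.sqrt(max(l, 0.0)) * z for l, z in zip(lam, xi)]
    field = [sum(w[j] * cmath.exp(2j * math.pi * j * k / n)
                 for j in range(n)) / math.sqrt(n) for k in range(n)]
    return [v.real for v in field[:m]]
```

In practice an FFT replaces the naive DFT, which is what makes the approach feasible on 512 × 512 lattices.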

4.
Our article considers the class of recently developed stochastic models that combine claims payments and incurred losses information into a coherent reserving methodology. In particular, we develop a family of hierarchical Bayesian paid–incurred claims (PIC) models, combining the claims reserving models of Hertig (1985) and Gogol (1993). In the process we extend the independent log-normal model of Merz and Wüthrich (2010) by incorporating different dependence structures using a data-augmented mixture copula PIC model. The paper thus makes two main contributions. First, we develop an extended class of model structures for the paid–incurred chain-ladder models, giving a precise Bayesian formulation: PIC models for general dependence structures, with specialised properties relating to conjugacy and to consistency of tail dependence across development years and accident years and between payment and incurred loss data. Second, we develop advanced Markov chain Monte Carlo sampling algorithms that make inference under these copula-dependence PIC models accurate and efficient, making such models accessible to practitioners who wish to explore their suitability in practice. The focus of the paper is not to argue that the PIC framework is a good class of models for a particular data set; the suitability of such PIC-type models is discussed in Merz and Wüthrich (2010) and Happ and Wüthrich (2013). Instead, we develop generalised model classes for the PIC family of Bayesian models and provide advanced Monte Carlo methods for inference that practitioners may use with confidence in their efficiency and validity.

5.
This paper studies a class of stochastic epidemic models in which both the latent period and the infectious period follow Weibull distributions and susceptibility varies randomly. Using an MCMC algorithm, we carry out Bayesian inference for the parameters of the latent and infectious periods and for the hyperparameters of susceptibility. This approach is more widely applicable across diseases than previous methods.

6.
We propose a special panel quantile regression model with multiple stochastic change‐points to analyze latent structural breaks in the short‐term post‐offering price–volume relationships in China's growth enterprise market where the piecewise quantile equations are defined by change point indication functions. We also develop a new Bayesian inference and Markov chain Monte Carlo simulation approach to estimate the parameters, including the locations of change points, and put forth simulation‐based posterior Bayesian factor tests to find the best number of change points. Our empirical evidence suggests that the single change point effect is significant on quantile‐based price–volume relationships in China's growth enterprise market. The lagged initial public offering (IPO) return and the IPO volume rate of change have positive impacts on the current IPO return before and after the change point. Along with investors' gradually declining hot sentiment toward a new IPO, the market index volume rate of change induces the abnormal short‐term post‐offering IPO return to move back to the equilibrium. Copyright © 2015 John Wiley & Sons, Ltd.

7.
Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the “factor slice sampler,” a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice samplers. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. This article has online supplementary materials.
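The univariate building block, the slice sampler with stepping-out and shrinkage (Neal, 2003), can be sketched as follows. The factor slice sampler of the article applies this move along adaptively chosen coordinate directions; here `w`, the initial interval width, is the tuning parameter the article automates.

```python
import math
import random

def slice_sample(logf, x0, n, w=1.0, seed=0):
    """Univariate slice sampler (stepping-out + shrinkage).
    logf: log of an unnormalized target density."""
    rng = random.Random(seed)
    xs, x = [], x0
    for _ in range(n):
        # auxiliary height: log u + log f(x), u ~ Uniform(0, 1)
        logy = logf(x) + math.log(rng.random())
        # stepping out: grow the interval until both ends leave the slice
        left = x - w * rng.random()
        right = left + w
        while logf(left) > logy:
            left -= w
        while logf(right) > logy:
            right += w
        # shrinkage: sample uniformly, shrinking the interval on rejection
        while True:
            x1 = rng.uniform(left, right)
            if logf(x1) > logy:
                x = x1
                break
            if x1 < x:
                left = x1
            else:
                right = x1
        xs.append(x)
    return xs
```

For a multivariate target, applying this move coordinate-by-coordinate mixes poorly under strong dependence, which is precisely what motivates choosing better factor directions.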

8.
A Multivariate Copula-GARCH Model and Its Application to Financial Risk Analysis
To address the shortcomings of traditional risk analysis models, we combine copula techniques with GARCH models and propose a multivariate Copula-GARCH model. The model not only captures nonlinear dependence between financial markets but also yields more flexible multivariate distributions that can be used for portfolio VaR analysis. After a detailed discussion of copula-based Monte Carlo simulation techniques for asset portfolios, we apply multivariate Copula-GARCH models with different marginal distributions to the Shanghai stock market; the results confirm the feasibility and effectiveness of the proposed model and methods.
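A toy version of the copula-based Monte Carlo VaR computation is sketched below. Constant volatilities stand in for fitted GARCH margins, and the Gaussian copula, equal weights, and parameter values are illustrative assumptions, not the paper's specification.

```python
import math
import random

def gaussian_copula_var(rho, sig1, sig2, alpha=0.99, n=20000, seed=1):
    """Monte Carlo portfolio VaR for two assets whose returns are linked
    by a bivariate Gaussian copula with normal margins (toy stand-in for
    GARCH-filtered margins)."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n):
        # correlated standard normals via the 2x2 Cholesky factor
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)
        # equally weighted portfolio return; loss is its negative
        losses.append(-(0.5 * sig1 * z1 + 0.5 * sig2 * z2))
    losses.sort()
    return losses[int(alpha * n)]    # empirical alpha-quantile of the loss
```

With a fitted GARCH model, `sig1` and `sig2` would be the one-step-ahead conditional volatilities, and the copula could be non-Gaussian to capture tail dependence.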

9.
We propose an approach to a twofold optimal parameter search for a combined variance reduction technique of control variates and importance sampling in a suitable pure-jump Lévy process framework. The parameter search procedure is based on a two-time-scale stochastic approximation algorithm with an equilibrated control-variates component and a quasi-static importance-sampling component. We prove the almost sure convergence of the algorithm to a unique optimum. The parameter search algorithm is further embedded in adaptive Monte Carlo simulations in the case of the gamma distribution and process. Numerical examples of CDO tranche pricing with the Gamma copula model and the intensity Gamma model are provided to illustrate the effectiveness of our method.
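The control-variates half of the technique can be shown on a toy integrand; the paper couples it with importance sampling and tunes both components by two-time-scale stochastic approximation, which is omitted here. The target E[exp(U)] and all names are illustrative.

```python
import math
import random

def cv_estimate(n=20000, seed=0):
    """Plain Monte Carlo vs. a control-variate estimator of E[exp(U)],
    U ~ Uniform(0, 1), using U (known mean 1/2) as the control variate."""
    rng = random.Random(seed)
    us = [rng.random() for _ in range(n)]
    fs = [math.exp(u) for u in us]
    plain = sum(fs) / n
    # estimate the optimal coefficient b* = Cov(f(U), U) / Var(U)
    mu = sum(us) / n
    cov = sum((f - plain) * (u - mu) for f, u in zip(fs, us)) / n
    var = sum((u - mu) ** 2 for u in us) / n
    b = cov / var
    # corrected estimator: subtract b * (U - E[U]) from each draw
    cv = sum(f - b * (u - 0.5) for f, u in zip(fs, us)) / n
    return plain, cv
```

In the adaptive setting of the paper, the coefficient is not estimated once from the whole sample but updated recursively on the fast time scale as the simulation runs.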

10.
In this article, we develop a new approach within the framework of asset pricing models that incorporates two key features of the latent volatility: co‐movement among conditionally heteroscedastic financial returns and switching between different unobservable regimes. By combining latent factor models with hidden Markov chain models we derive a dynamical local model for segmentation and prediction of multivariate conditionally heteroscedastic financial time series. We concentrate more precisely on situations where the factor variances are modelled by univariate generalized quadratic autoregressive conditionally heteroscedastic processes. The expectation maximization algorithm that we have developed for the maximum likelihood estimation is based on a quasi‐optimal switching Kalman filter approach combined with a generalized pseudo‐Bayesian approximation, which yields inferences about the unobservable path of the common factors, their variances and the latent variable of the state process. Extensive Monte Carlo simulations and preliminary experiments obtained with daily foreign exchange rate returns of eight currencies show promising results. Copyright © 2007 John Wiley & Sons, Ltd.

11.
In this article, we review the concept of a Lévy copula to describe the dependence structure of a bivariate compound Poisson process. In this first statistical approach we consider a parametric model for the Lévy copula and estimate the parameters of the full dependent model based on a maximum likelihood approach. This approach ensures that the estimated model remains in the class of multivariate compound Poisson processes. A simulation study investigates the small sample behaviour of the MLEs, where we also suggest a new simulation algorithm. Finally, we apply our method to Danish fire insurance data.

12.
Single-index models have found applications in econometrics and biometrics, where multidimensional regression models are often encountered. This article proposes a nonparametric estimation approach that combines wavelet methods for nonequispaced designs with Bayesian models. We consider a wavelet series expansion of the unknown regression function and set prior distributions for the wavelet coefficients and the other model parameters. To ensure model identifiability, the direction parameter is represented via its polar coordinates. We employ ad hoc hierarchical mixture priors that perform shrinkage on wavelet coefficients and use Markov chain Monte Carlo methods for a posteriori inference. We investigate an independence-type Metropolis-Hastings algorithm to produce samples for the direction parameter. Our method leads to simultaneous estimates of the link function and of the index parameters. We present results on both simulated and real data, where we look at comparisons with other methods.

13.
Probabilistic programming is an area of research that aims to develop general inference algorithms for probabilistic models expressed as probabilistic programs whose execution corresponds to inferring the parameters of those models. In this paper, we introduce a probabilistic programming language (PPL) based on abductive logic programming for performing inference in probabilistic models involving categorical distributions with Dirichlet priors. We encode these models as abductive logic programs enriched with probabilistic definitions and queries, and show how to execute and compile them to Boolean formulas. Using the latter, we perform generalized inference using one of two proposed Markov chain Monte Carlo (MCMC) sampling algorithms: an adaptation of uncollapsed Gibbs sampling from related work and a novel collapsed Gibbs sampling (CGS). We show that CGS converges faster than the uncollapsed version on a latent Dirichlet allocation (LDA) task using synthetic data. On similar data, we compare our PPL with LDA-specific algorithms and other PPLs. We find that all methods, except one, perform similarly and that the more expressive the PPL, the slower it is. We illustrate applications of our PPL on real data in two variants of LDA models (Seed and Cluster LDA), and in the repeated insertion model (RIM). In the latter, our PPL yields similar conclusions to inference with EM for Mallows models.
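A generic collapsed Gibbs sampler for LDA, in which only the topic indicators are sampled and the topic proportions θ and topic-word distributions φ are integrated out, looks roughly as follows. This is a textbook sketch, not the abductive-logic-program compilation described in the paper; all names are illustrative.

```python
import random
from collections import defaultdict

def collapsed_gibbs_lda(docs, K, alpha=0.1, beta=0.1, iters=50, seed=0):
    """Collapsed Gibbs sampling for LDA: resample each token's topic from
    its full conditional with theta and phi analytically integrated out."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})          # vocabulary size
    ndk = [[0] * K for _ in docs]                  # topic counts per doc
    nkw = [defaultdict(int) for _ in range(K)]     # word counts per topic
    nk = [0] * K                                   # total tokens per topic
    z = []
    for d, doc in enumerate(docs):                 # random initialization
        zd = []
        for w in doc:
            k = rng.randrange(K)
            zd.append(k); ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
        z.append(zd)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                        # remove token's counts
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # collapsed full conditional, up to normalization
                ps = [(ndk[d][t] + alpha) * (nkw[t][w] + beta)
                      / (nk[t] + V * beta) for t in range(K)]
                k = rng.choices(range(K), weights=ps)[0]
                z[d][i] = k; ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return z, ndk
```

The uncollapsed variant would instead alternate explicit Dirichlet draws of θ and φ with the topic indicators, which typically mixes more slowly, consistent with the paper's finding.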

14.
We develop and implement a method for maximum likelihood estimation of a regime-switching stochastic volatility model. Our model uses a continuous time stochastic process for the stock dynamics with the instantaneous variance driven by a Cox–Ingersoll–Ross process and each parameter modulated by a hidden Markov chain. We propose an extension of the EM algorithm through the Baum–Welch implementation to estimate our model and filter the hidden state of the Markov chain while using the VIX index to invert the latent volatility state. Using Monte Carlo simulations, we test the convergence of our algorithm and compare it with an approximate likelihood procedure where the volatility state is replaced by the VIX index. We find that our method is more accurate than the approximate procedure. Then, we apply Fourier methods to derive a semi-analytical expression of S&P500 and VIX option prices, which we calibrate to market data. We show that the model is sufficiently rich to encapsulate important features of the joint dynamics of the stock and the volatility and to consistently fit option market prices.

15.
In this paper we propose forecasting market risk measures, such as Value at Risk (VaR) and Expected Shortfall (ES), for large dimensional portfolios via copula modeling. To this end, we compare several high dimensional copula models, from naive ones to complex factor copulas, which are able to simultaneously tackle the curse of dimensionality and introduce a high level of complexity into the model. We explore both static and dynamic copula fitting. In the dynamic case we allow different levels of flexibility for the dependence parameters, which are driven by a GAS (Generalized Autoregressive Score) model, in the spirit of Oh and Patton (2015). Our empirical results, for assets negotiated at the Brazilian BOVESPA stock market from January 2008 to December 2014, suggest that, compared to the other copula models, the GAS dynamic factor copula approach has a superior performance in terms of AIC (Akaike Information Criterion) and a non-inferior performance with respect to VaR and ES forecasting.

16.
In this paper we examine the relationship between a newly developed local dependence measure, the local Gaussian correlation, and standard copula theory. We are able to describe characteristics of the dependence structure in different copula models in terms of the local Gaussian correlation. Further, we construct a goodness-of-fit test for bivariate copula models. An essential ingredient of this test is the use of a canonical local Gaussian correlation and Gaussian pseudo-observations which make the test independent of the margins, so that it is a genuine test of the copula structure. A Monte Carlo study reveals that the test performs very well compared to a commonly used alternative test. We also propose two types of diagnostic plots which can be used to investigate the cause of a rejected null. Finally, our methods are applied to a “classical” insurance data set.

17.
Spatial Regression Models for Extremes
Meteorological data are often recorded at a number of spatial locations. This gives rise to the possibility of pooling data through a spatial model to overcome some of the limitations imposed on an extreme value analysis by a lack of information. In this paper we develop a spatial model for extremes based on a standard representation for site-wise extremal behavior, combined with a spatial latent process for parameter variation over the region. A smooth, but possibly non-linear, spatial structure is an intrinsic feature of the model, and difficulties in computation are solved using Markov chain Monte Carlo inference. A simulation study is carried out to illustrate the potential gain in efficiency achieved by the spatial model. Finally, the model is applied to data generated from a climatological model in order to characterize the hurricane climate of the Gulf and Atlantic coasts of the United States.

18.
We present a Bayesian decision theoretic approach for developing replacement strategies. In so doing, we consider a semiparametric model to describe the failure characteristics of systems by specifying a nonparametric form for cumulative intensity function and by taking into account effect of covariates by a parametric form. Use of a gamma process prior for the cumulative intensity function complicates the Bayesian analysis when the updating is based on failure count data. We develop a Bayesian analysis of the model using Markov chain Monte Carlo methods and determine replacement strategies. Adoption of Markov chain Monte Carlo methods involves a data augmentation algorithm. We show the implementation of our approach using actual data from railroad tracks. Copyright © 2016 John Wiley & Sons, Ltd.

19.
Cure rate models offer a convenient way to model time-to-event data by allowing a proportion of individuals in the population to be completely cured so that they never face the event of interest (say, death). The most studied cure rate models can be defined through a competing cause scenario in which the random variables corresponding to the time-to-event for each competing cause are conditionally independent and identically distributed while the actual number of competing causes is a latent discrete random variable. The main interest is then in the estimation of the cured proportion as well as in developing inference about failure times of the susceptibles. The existing literature consists of parametric and non/semi-parametric approaches, while the expectation maximization (EM) algorithm offers an efficient tool for the estimation of the model parameters due to the presence of right censoring in the data. In this paper, we study the cases wherein the number of competing causes is either a binary or Poisson random variable and a piecewise linear function is used for modeling the hazard function of the time-to-event. Exact likelihood inference is then developed based on the EM algorithm and the inverse of the observed information matrix is used for developing asymptotic confidence intervals. The Monte Carlo simulation study demonstrates the accuracy of the proposed non-parametric approach compared to the results attained from the true parametric model. The proposed model and the inferential method are finally illustrated with a data set on cutaneous melanoma.

20.
The purpose of this paper is to present a comprehensive Monte Carlo simulation study on the performance of minimum-distance (MD) and maximum-likelihood (ML) estimators for bivariate parametric copulas. In particular, I consider Cramér-von-Mises-, Kolmogorov-Smirnov- and L1-variants of the CvM-statistic based on the empirical copula process, Kendall’s dependence function and Rosenblatt’s probability integral transform. The results presented in this paper show that regardless of the parametric form of the copula, the sample size or the location of the parameter, maximum-likelihood yields smaller estimation biases at less computational effort than any of the MD-estimators. The MD-estimators based on copula goodness-of-fit metrics, on the other hand, suffer from large biases especially when used for estimating the parameters of archimedean copulas. Moreover, the results show that the bias and efficiency of the minimum-distance estimators are strongly influenced by the location of the parameter. Conversely, the results for the maximum-likelihood estimator are relatively stable over the parameter interval of the respective parametric copula.
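The maximum-likelihood side of such a comparison can be sketched for a Clayton copula, with simulation by conditional inversion and a crude grid search over θ. The grid, sample size, and true parameter below are illustrative, and the study's MD estimators are omitted.

```python
import math
import random

def simulate_clayton(theta, n, seed=0):
    """Sample (u, v) pairs from a Clayton copula by conditional inversion:
    solve C(v | u) = w for v, with u, w independent Uniform(0, 1)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u, w = rng.random(), rng.random()
        v = (u ** -theta * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
        out.append((u, v))
    return out

def clayton_loglik(theta, uv):
    """Log-likelihood of the Clayton density
    c(u, v) = (1 + theta) (u v)^(-theta-1) (u^-theta + v^-theta - 1)^(-2-1/theta)."""
    ll = 0.0
    for u, v in uv:
        s = u ** -theta + v ** -theta - 1
        ll += (math.log(1 + theta)
               - (theta + 1) * (math.log(u) + math.log(v))
               - (2 + 1 / theta) * math.log(s))
    return ll

def clayton_mle(uv, grid=None):
    """Crude grid-search MLE for the Clayton dependence parameter."""
    if grid is None:
        grid = [0.2 + 0.05 * i for i in range(97)]   # 0.2 .. 5.0
    return max(grid, key=lambda t: clayton_loglik(t, uv))
```

A minimum-distance estimator would instead minimize a discrepancy (e.g. a Cramér-von-Mises statistic) between the empirical copula and the parametric one, at considerably higher computational cost, as the study reports.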

