Similar Documents
20 similar documents retrieved.
1.
This paper discusses the modeling and fitting of count data that exhibit overdispersion. In view of the causes of overdispersion and the structure of several commonly used candidate models, a Bayesian method for simultaneous model and variable selection is given for nested models, and methods for model checking and comparison are given for non-nested models. Building on these, a fairly systematic and complete procedure for model and variable selection is proposed. A real example illustrates the implementation and effectiveness of the method.

2.
Gaussian process models have been widely used in spatial statistics but face tremendous modeling and computational challenges for very large nonstationary spatial datasets. To address these challenges, we develop a Bayesian modeling approach using a nonstationary covariance function constructed from adaptively selected partitions. The partitioned nonstationary class allows one to knit together local covariance parameters into a valid global nonstationary covariance for prediction, where the local covariance parameters are estimated within each partition to reduce computational cost. To further facilitate the computations in local covariance estimation and global prediction, we use the full-scale covariance approximation (FSA) approach for the Bayesian inference of our model. One of our contributions is to model the partitions stochastically by embedding a modified treed partitioning process into the hierarchical model, which leads to automated partitioning and substantial computational benefits. We illustrate the utility of our method with simulation studies and the global Total Ozone Mapping Spectrometer (TOMS) data. Supplementary materials for this article are available online.

3.
Incurred but not reported (IBNR) loss reserving is an important issue for Property & Casualty (P&C) insurers. To calculate the IBNR reserve, one needs to model claim arrivals and then predict IBNR claims. However, factors such as temporal dependence among claim arrivals and environmental variation are often not incorporated in many current loss reserving models, which may greatly affect the accuracy of IBNR predictions. In this paper, we propose to model the claim arrival process together with its reporting delays as a marked Cox process. Our model is versatile in modeling temporal dependence while allowing natural interpretations. This paper focuses mainly on the theoretical aspects of the proposed model. We show that the associated reported claim process and IBNR claim process are both marked Cox processes with easily convertible intensity functions and marking distributions. The proposed model can also account for fluctuations in the exposure. By an order statistics property, we show that the corresponding discretely observed process preserves all the information about the claim arrivals. Finally, we derive closed-form expressions for both the autocorrelation function (ACF) and the distributions of the numbers of reported claims and IBNR claims. Model estimation and its applications are considered in a subsequent paper, Badescu et al. (2015b).
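As a rough illustration of the reported/IBNR split described above, the following sketch simulates a toy claim arrival process with a randomly varying intensity, attaches independent reporting delays as marks, and classifies each claim at a valuation date. It is not the authors' model or estimation procedure; the gamma-distributed period intensities, exponential delays, and all numerical values are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy Cox process: the intensity is piecewise constant over unit periods,
# with each period's level drawn from a Gamma distribution (hypothetical choice).
n_periods = 24          # months of exposure
valuation_time = 18.0   # the reserve is set at this date
lambdas = rng.gamma(shape=5.0, scale=2.0, size=n_periods)

arrivals = []
for t0, lam in enumerate(lambdas):
    n = rng.poisson(lam)                       # number of claims in (t0, t0+1]
    arrivals.extend(t0 + rng.uniform(size=n))  # uniform placement within the period
arrivals = np.sort(np.array(arrivals))
arrivals = arrivals[arrivals <= valuation_time]   # only claims already incurred

# Marks: independent reporting delays (exponential here, purely illustrative).
delays = rng.exponential(scale=2.5, size=arrivals.size)
report_times = arrivals + delays

# At the valuation date, a claim is "reported" if its report time has passed;
# otherwise it is incurred but not reported (IBNR).
reported = report_times <= valuation_time
print(f"claims incurred by t={valuation_time:.0f}: {arrivals.size}")
print(f"reported: {reported.sum()},  IBNR: {(~reported).sum()}")
```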

4.
In comparing two populations, sometimes a model incorporating a certain probability order is desired. In this setting, Bayesian modeling is attractive since a probability order restriction imposed a priori on the population distributions is retained a posteriori. Extending the work in Gelfand and Kottas (2001) for stochastic order specifications, we formulate modeling for distributions ordered in variability. We work with Dirichlet process mixtures resulting in a fully Bayesian semiparametric approach. The details for simulation-based model fitting and prior specification are provided. An example, based on two small subsets of time intervals between eruptions of the Old Faithful geyser, illustrates the methodology.

5.
Pricing life insurance contracts with early exercise features
In this paper we describe an algorithm based on the Least Squares Monte Carlo method to price life insurance contracts embedding American options. We focus on equity-linked contracts with surrender options and terminal guarantees on benefits payable upon death, survival and surrender. The framework allows for randomness in mortality as well as stochastic volatility and jumps in financial risk factors. We provide numerical experiments demonstrating the performance of the algorithm in the context of multiple risk factors and exercise dates.
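The regression-based pricing idea referenced above (the Least Squares Monte Carlo method) can be sketched on a deliberately stripped-down case: a Bermudan put on a single lognormal asset, with no mortality risk, surrender guarantees, stochastic volatility, or jumps. This is only the generic Longstaff-Schwartz building block, not the paper's algorithm; all parameters and the polynomial basis are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Least Squares Monte Carlo for a Bermudan put (plain Black-Scholes dynamics).
S0, K, r, sigma, T = 100.0, 100.0, 0.03, 0.2, 1.0
n_steps, n_paths = 50, 20_000
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate geometric Brownian motion paths.
z = rng.standard_normal((n_paths, n_steps))
log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
S = S0 * np.exp(log_paths)                      # shape (n_paths, n_steps)

payoff = lambda s: np.maximum(K - s, 0.0)

# Backward induction: regress discounted continuation values on basis functions
# of the current price, using only in-the-money paths.
cash = payoff(S[:, -1])                         # exercise value at maturity
for t in range(n_steps - 2, -1, -1):
    cash *= disc
    itm = payoff(S[:, t]) > 0
    if itm.sum() > 0:
        x = S[itm, t]
        basis = np.column_stack([np.ones_like(x), x, x**2])   # simple polynomial basis
        coef, *_ = np.linalg.lstsq(basis, cash[itm], rcond=None)
        continuation = basis @ coef
        exercise = payoff(x)
        do_exercise = exercise > continuation
        cash[np.flatnonzero(itm)[do_exercise]] = exercise[do_exercise]

price = disc * cash.mean()
print(f"LSMC Bermudan put estimate: {price:.3f}")
```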

6.
In this paper a simulation approach for defaultable yield curves is developed within the Heath et al. (1992) framework. The default event is modelled using a Cox process whose stochastic intensity represents the credit spread. The forward credit spread volatility function is affected by the entire credit spread term structure. The paper provides the defaultable bond price and the credit default swap option price in a probability setting equipped with a subfiltration structure. The Euler-Maruyama stochastic integral approximation and the Monte Carlo method are applied to develop a numerical scheme for pricing. Finally, the antithetic variable technique is used to reduce the variance of credit default swap option prices.
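The numerical ingredients mentioned above (Euler-Maruyama discretisation plus Monte Carlo) can be illustrated on a much simpler object than the paper's HJM setting: simulating a mean-reverting, CIR-type default intensity and averaging the resulting discount factors to price a zero-recovery defaultable zero-coupon bond under a constant risk-free rate. This is a hedged sketch only; the intensity dynamics, full-truncation scheme, and parameter values are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Euler-Maruyama simulation of a CIR-type default intensity (credit spread proxy):
#   d lambda_t = kappa*(theta - lambda_t) dt + xi*sqrt(lambda_t) dW_t
# and Monte Carlo pricing of a zero-recovery defaultable zero-coupon bond
#   P(0,T) = E[ exp(-integral_0^T (r + lambda_t) dt) ]   (constant r for simplicity).
kappa, theta, xi = 1.5, 0.03, 0.15
lam0, r, T = 0.02, 0.02, 5.0
n_steps, n_paths = 500, 50_000
dt = T / n_steps

lam = np.full(n_paths, lam0)
integral = np.zeros(n_paths)                 # accumulates the integral of lambda dt
for _ in range(n_steps):
    integral += lam * dt
    dw = rng.standard_normal(n_paths) * np.sqrt(dt)
    # full-truncation Euler step: the sqrt is evaluated at max(lambda, 0)
    lam = lam + kappa * (theta - lam) * dt + xi * np.sqrt(np.maximum(lam, 0.0)) * dw

defaultable = np.exp(-r * T - integral)
riskfree = np.exp(-r * T)
print(f"defaultable ZCB: {defaultable.mean():.4f}  (risk-free: {riskfree:.4f})")
print(f"MC std error: {defaultable.std(ddof=1) / np.sqrt(n_paths):.5f}")
```

The antithetic variable technique mentioned at the end of the abstract could be layered on top by reusing each vector of Gaussian draws with its sign flipped and averaging the paired discount factors.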

7.
In this article, Swendsen-Wang-Wolff algorithms are extended to simulate spatial point processes with symmetric and stationary interactions. Convergence of these algorithms is considered. Some further generalizations of the algorithms are discussed. The ideas presented in this article can also be useful in handling some large and complicated systems.

8.
In Bayesian analysis of mixture models, the label-switching problem occurs as a result of the posterior distribution being invariant to any permutation of cluster indices under symmetric priors. To solve this problem, we propose a novel relabeling algorithm and its variants by investigating an approximate posterior distribution of the latent allocation variables instead of dealing with the component parameters directly. We demonstrate that our relabeling algorithm can be formulated in a rigorous framework based on information theory. Under some circumstances, it is shown to resemble the classical Kullback-Leibler relabeling algorithm and include the recently proposed equivalence classes representatives relabeling algorithm as a special case. Using simulation studies and real data examples, we illustrate the efficiency of our algorithm in dealing with various label-switching phenomena. Supplemental materials for this article are available online.

9.
To simulate a multivariate density with multiple humps, Markov chain Monte Carlo methods face the obstacle of escaping from one hump to another: crossing usually takes an extraordinarily long time, which makes the simulation practically impossible to perform. To overcome this difficulty, a reversible scheme for generating a Markov chain is suggested under which the simulated density can, in rather general cases, avoid becoming trapped in local humps.
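The trapping problem the abstract refers to is easy to reproduce. The sketch below runs a plain random-walk Metropolis sampler on a two-hump target: with small steps the chain essentially never leaves the hump it first enters, whereas occasionally proposing a large jump lets it move between humps. This is a generic illustration, not the reversible scheme proposed in the paper (which the abstract does not spell out); the target, step sizes, and jump probability are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

# Target: equal mixture of two well-separated Gaussian humps at -5 and +5.
def log_target(x):
    return np.logaddexp(-0.5 * (x + 5.0) ** 2, -0.5 * (x - 5.0) ** 2)

def metropolis(n_iter, proposal):
    x, trace = 0.0, np.empty(n_iter)
    lp = log_target(x)
    for i in range(n_iter):
        y = proposal(x)
        lp_y = log_target(y)
        if np.log(rng.uniform()) < lp_y - lp:   # symmetric proposal => plain MH ratio
            x, lp = y, lp_y
        trace[i] = x
    return trace

n = 50_000
# (a) small random-walk steps: the chain almost never escapes its current hump
small_rw = metropolis(n, lambda x: x + 0.5 * rng.standard_normal())
# (b) occasional large jumps mixed with small steps: the chain visits both humps
mixed = metropolis(n, lambda x: x + (10.0 if rng.uniform() < 0.1 else 0.5)
                                   * rng.standard_normal())

for name, tr in [("small steps", small_rw), ("with large jumps", mixed)]:
    print(f"{name:17s} fraction of samples in right hump: {(tr > 0).mean():.3f}")
```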

10.
In inference for contingency tables, when the cell counts are not large enough for asymptotic approximation, conditional exact methods are used, but they are often computationally impractical for large tables. Various sampling methods can be used instead. Permutation-based Monte Carlo sampling may again become impractical for large tables, and the existing Markov chain method, which samples only a few cells of the table at each iteration, is inefficient. Here we consider a Markov chain in which a sub-table of user-specified size is updated at each iteration, achieving high sampling efficiency. Some theoretical properties of the chain and its applications to some commonly used tables are discussed. As an illustration, this method is applied to the exact test of Hardy-Weinberg equilibrium in the population genetics context.
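A minimal sketch of the kind of margin-preserving Markov chain discussed above: the classical move picks two rows and two columns and adds a ±1 checkerboard pattern to that 2×2 sub-table, which leaves every row and column total unchanged (the paper's contribution is to update larger, user-specified sub-tables). The example table is hypothetical, and the target here is taken to be uniform on the reference set for simplicity.

```python
import numpy as np

rng = np.random.default_rng(3)

# Metropolis sampler over contingency tables with fixed row/column margins.
# Proposal: pick two rows and two columns and add a +1/-1 checkerboard pattern
# to that 2x2 sub-table, which leaves every margin unchanged. The target here
# is uniform on the reference set; for an exact test the (hyper)geometric
# weights of the null distribution would enter the acceptance ratio.
table = np.array([[10, 3, 2],
                  [4, 8, 5],
                  [1, 6, 9]])

def swap_step(t):
    r = rng.choice(t.shape[0], size=2, replace=False)
    c = rng.choice(t.shape[1], size=2, replace=False)
    eps = rng.choice([-1, 1])
    prop = t.copy()
    prop[r[0], c[0]] += eps
    prop[r[1], c[1]] += eps
    prop[r[0], c[1]] -= eps
    prop[r[1], c[0]] -= eps
    return prop if (prop >= 0).all() else t   # out-of-support proposals are rejected

n_iter, burn = 20_000, 2_000
cell00 = []
for i in range(n_iter):
    table = swap_step(table)
    if i >= burn:
        cell00.append(table[0, 0])

print("row margins:", table.sum(axis=1))      # unchanged by construction
print("col margins:", table.sum(axis=0))
print(f"mean of cell (0,0) over the chain: {np.mean(cell00):.2f}")
```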

11.
Many Bayesian analyses use Markov chain Monte Carlo (MCMC) techniques. MCMC techniques work fastest (per iteration) when the prior distribution of the parameters is chosen conveniently, such as a conjugate prior. However, this is sometimes at odds with the prior desired by the investigator. We describe two motivating examples where nonconjugate priors are preferred. One is a Dirichlet process where it is difficult to implement alternative, nonconjugate priors. We develop a method that allows computation to be done with a convenient prior but adjusts the equilibrium distribution of the Markov chain to be the posterior distribution from the desired prior. In addition to allowing more freedom in choosing prior distributions, the method enables the investigator to perform quick sensitivity analyses, even in nonparametric settings.

12.
Since Ferguson's seminal article on the Dirichlet process, the area of Bayesian nonparametric statistics has seen development of many flexible prior classes. At the center of the development lies the neutral to the right (NTR) process proposed by Doksum. Although the class of NTR processes is very rich in its members and has well-developed theoretical properties, its application has been restricted to very small portions of the class—mainly the Dirichlet, gamma, and beta processes. We believe that this is due to the lack of flexible computational algorithms that can be used as a component in a Markov chain Monte Carlo (MCMC) algorithm.

The main purpose of this article is to introduce a collection of algorithms (or a tool box), some already available in the literature and others newly proposed here, so that one can construct a suitable combination of algorithms from this collection to solve one's problem.

13.
Spatial Regression Models for Extremes
Meteorological data are often recorded at a number of spatial locations. This gives rise to the possibility of pooling data through a spatial model to overcome some of the limitations imposed on an extreme value analysis by a lack of information. In this paper we develop a spatial model for extremes based on a standard representation for site-wise extremal behavior, combined with a spatial latent process for parameter variation over the region. A smooth, but possibly non-linear, spatial structure is an intrinsic feature of the model, and difficulties in computation are solved using Markov chain Monte Carlo inference. A simulation study is carried out to illustrate the potential gain in efficiency achieved by the spatial model. Finally, the model is applied to data generated from a climatological model in order to characterize the hurricane climate of the Gulf and Atlantic coasts of the United States.

14.
This article proposes a probability model for k-dimensional ordinal outcomes, that is, it considers inference for data recorded in k-dimensional contingency tables with ordinal factors. The proposed approach is based on full posterior inference, assuming a flexible underlying prior probability model for the contingency table cell probabilities. We use a variation of the traditional multivariate probit model, with latent scores that determine the observed data. In our model, a mixture of normals prior replaces the usual single multivariate normal model for the latent variables. By augmenting the prior model to a mixture of normals we generalize inference in two important ways. First, we allow for varying local dependence structure across the contingency table. Second, inference in ordinal multivariate probit models is plagued by problems related to the choice and resampling of cutoffs defined for these latent variables. We show how the proposed mixture model approach entirely removes these problems. We illustrate the methodology with two examples, one simulated dataset and one dataset of interrater agreement.

15.
This paper investigates the behaviour of the random walk Metropolis algorithm in high-dimensional problems. Here we concentrate on the case where the components of the target density form a spatially homogeneous Gibbs distribution with finite range. The performance of the algorithm is strongly linked to the presence or absence of phase transition for the Gibbs distribution, the convergence time being approximately linear in dimension for problems where phase transition is not present. Related to this, there is an optimal way to scale the variance of the proposal distribution in order to maximise the speed of convergence of the algorithm. This turns out to involve scaling the variance of the proposal as the reciprocal of dimension (at least in the phase transition-free case). Moreover, the actual optimal scaling can be characterised in terms of the overall acceptance rate of the algorithm, the maximising value being 0.234, as predicted by studies on simpler classes of target density. The results are proved in the framework of a weak convergence result, which shows that the algorithm actually behaves like an infinite-dimensional diffusion process in high dimensions.
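The scaling result summarized above can be checked numerically on the simplest possible target, a d-dimensional standard normal with i.i.d. components (not a Gibbs distribution with interactions). The sketch below runs random-walk Metropolis with proposal standard deviation ℓ/√d for several values of ℓ and reports the acceptance rate together with the expected squared jumping distance; near the best ℓ the acceptance rate is close to 0.234. The dimension, iteration count, and ℓ grid are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random-walk Metropolis on a d-dimensional standard normal target, with the
# proposal standard deviation scaled as ell / sqrt(d). Sweeping ell shows the
# familiar trade-off: tiny steps are accepted often but move slowly, huge steps
# are mostly rejected; near the optimum the acceptance rate is close to 0.234.
def rwm(d=50, ell=2.38, n_iter=20_000):
    x = np.zeros(d)
    lp = -0.5 * x @ x
    acc, sq_jump = 0, 0.0
    step = ell / np.sqrt(d)
    for _ in range(n_iter):
        y = x + step * rng.standard_normal(d)
        lp_y = -0.5 * y @ y
        if np.log(rng.uniform()) < lp_y - lp:
            sq_jump += np.sum((y - x) ** 2)
            x, lp = y, lp_y
            acc += 1
    return acc / n_iter, sq_jump / n_iter     # acceptance rate, expected squared jump

for ell in (0.5, 1.0, 2.38, 4.0, 6.0):
    rate, esjd = rwm(ell=ell)
    print(f"ell={ell:4.2f}  acceptance={rate:.3f}  ESJD={esjd:.3f}")
```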

16.
This paper presents a dynamic forecasting model that accommodates asymmetric market responses to a marketing-mix variable, price promotion, via threshold models. As a threshold variable to generate a mechanism for different market responses, we use the counterpart to the concept of a price threshold applied to a representative consumer in a store. A Bayesian approach is taken for statistical modelling because of the advantages it offers for estimation and forecasting. The proposed model incorporates the lagged effects of a price variable, so that a variety of pricing strategies can be implemented over the time horizon and their effectiveness evaluated using the predictive density. We intend to improve the forecasting performance over conventional linear time series models. Furthermore, we discuss efficient dynamic pricing in a store using strategic simulations under some scenarios suggested by the estimated structure of the models. Empirical studies illustrate the superior forecasting performance of our model against conventional linear models in terms of the root mean square error of the forecasts. Useful information for dynamic pricing is derived from its structural parameter estimates. Copyright © 2005 John Wiley & Sons, Ltd.
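A toy version of the threshold idea above, without the Bayesian dynamic structure: log sales respond to promotion depth with one slope below an unknown threshold and a steeper slope above it, and the threshold is estimated by a grid search combined with least squares in each regime. The data are simulated and every name and parameter is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy threshold regression for asymmetric price response: log sales respond
# with one elasticity for small discounts and a steeper one once the discount
# exceeds an (unknown) threshold. The threshold is chosen by a grid search
# minimizing the residual sum of squares; each regime is fit by least squares.
n = 400
discount = rng.uniform(0.0, 0.5, size=n)                   # price promotion depth
true_tau = 0.2
log_sales = (2.0 + 1.0 * discount
             + 4.0 * np.maximum(discount - true_tau, 0.0)  # kink beyond the threshold
             + 0.2 * rng.standard_normal(n))

def fit_threshold(x, y, grid):
    best = (np.inf, None, None)
    for tau in grid:
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - tau, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        if rss < best[0]:
            best = (rss, tau, beta)
    return best

rss, tau_hat, beta_hat = fit_threshold(discount, log_sales, np.linspace(0.05, 0.45, 81))
print(f"estimated threshold: {tau_hat:.3f}  (true {true_tau})")
print("coefficients [intercept, slope, extra slope above threshold]:", np.round(beta_hat, 2))
```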

17.
The modified mixture model with a Markov switching volatility specification is introduced to analyze the relationship between stock return volatility and trading volume. We propose an algorithm based on Markov chain Monte Carlo simulation methods to estimate all the parameters in the model using a Bayesian approach. The series of returns and trading volume of the British Petroleum stock is analyzed. Copyright © 2009 John Wiley & Sons, Ltd.
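For intuition about the Markov switching volatility component mentioned above, the sketch below simulates returns from a two-state hidden Markov chain in which each state carries its own volatility. It covers only the simulation side (no trading volume and no Bayesian MCMC estimation), and the transition matrix and volatilities are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate returns from a two-state Markov-switching volatility model:
# a hidden state s_t in {0, 1} follows a Markov chain, and the return is
# r_t ~ Normal(0, sigma[s_t]^2). State 1 is the high-volatility regime.
P = np.array([[0.98, 0.02],       # persistent regimes
              [0.05, 0.95]])
sigma = np.array([0.008, 0.025])  # per-period volatility in each state
n = 1_000

states = np.empty(n, dtype=int)
states[0] = 0
for t in range(1, n):
    states[t] = rng.choice(2, p=P[states[t - 1]])

returns = sigma[states] * rng.standard_normal(n)

print(f"time spent in high-vol state: {states.mean():.2%}")
print(f"sample std in low-vol periods : {returns[states == 0].std():.4f}")
print(f"sample std in high-vol periods: {returns[states == 1].std():.4f}")
```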

18.
We present an extension of continuous domain Simulated Annealing. Our algorithm employs a globally reaching candidate generator, adaptive stochastic acceptance probabilities, and converges in probability to the optimal value. An application to simulation-optimization problems with asymptotically diminishing errors is presented. Numerical results on a noisy protein-folding problem are included.

19.
20.
In this paper, we propose a new hybrid scheme of parallel tempering and simulated annealing (hybrid PT/SA). Within the hybrid PT/SA scheme, a composite system with multiple conformations evolves in parallel on a temperature ladder with various transition step sizes. The simulated annealing (SA) process uses a cooling scheme to decrease the temperature values in the ladder to the target temperature. The parallel tempering (PT) scheme is employed to reduce the equilibration relaxation time of the composite system at a particular temperature ladder configuration in the SA process. The hybrid PT/SA method reduces the waiting time in deep local minima and thus leads to more efficient sampling on high-dimensional, complicated objective function landscapes. Compared to PT and to parallel SA with the same temperature ladder, transition step sizes, and cooling scheme, our preliminary results with the hybrid PT/SA method confirm the expected improvements in simulations of several test objective functions, including the Rosenbrock function and a "rugged" funnel-like function, and several instantiations of the traveling salesman problem. The hybrid PT/SA may converge more slowly than genetic algorithms (GA) with good crossover heuristics, but it has the advantage of tolerating "bad" initial values and displaying robust sampling capability, even in the absence of additional information. Moreover, the hybrid PT/SA has natural parallelization potential.
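A compact sketch of how the two ingredients described above can be combined: several random-walk chains run on a temperature ladder with temperature-dependent step sizes, neighbouring chains occasionally attempt a parallel-tempering swap, and the whole ladder is cooled geometrically, as in simulated annealing. The target is the 2-D Rosenbrock function mentioned in the abstract; the ladder, step sizes, cooling rate, and swap schedule are hypothetical, and this is not the authors' exact scheme.

```python
import numpy as np

rng = np.random.default_rng(2024)

def rosenbrock(x):
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

# Hybrid parallel tempering / simulated annealing sketch:
#  - several chains run at different temperatures (a ladder), each with its own
#    random-walk step size (fixed here for simplicity),
#  - neighbouring chains occasionally swap states (the PT move),
#  - the whole ladder is cooled geometrically (the SA part).
n_chains, n_sweeps = 6, 4_000
temps = 10.0 ** np.linspace(2, 0, n_chains)       # initial ladder, hottest first
steps = 0.5 * np.sqrt(temps / temps[-1])          # larger moves at higher temperature
cool = 0.999                                      # geometric cooling factor per sweep

xs = rng.uniform(-3, 3, size=(n_chains, 2))
fs = np.array([rosenbrock(x) for x in xs])
best_x, best_f = xs[np.argmin(fs)].copy(), fs.min()

for sweep in range(n_sweeps):
    for k in range(n_chains):                     # local Metropolis move in each chain
        prop = xs[k] + steps[k] * rng.standard_normal(2)
        f_prop = rosenbrock(prop)
        if np.log(rng.uniform()) < (fs[k] - f_prop) / temps[k]:
            xs[k], fs[k] = prop, f_prop
            if f_prop < best_f:
                best_x, best_f = prop.copy(), f_prop
    k = rng.integers(n_chains - 1)                # PT swap between neighbouring temperatures
    log_a = (fs[k] - fs[k + 1]) * (1.0 / temps[k] - 1.0 / temps[k + 1])
    if np.log(rng.uniform()) < log_a:
        xs[[k, k + 1]] = xs[[k + 1, k]]
        fs[[k, k + 1]] = fs[[k + 1, k]]
    temps = np.maximum(temps * cool, 1e-3)        # SA cooling, floored away from zero

print(f"best value found: {best_f:.6f} at x = {np.round(best_x, 4)} (optimum is 0 at [1, 1])")
```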
