Similar Documents
A total of 20 similar documents were retrieved.
1.
The multiperiod Bayesian forecast under the normal-gamma prior assumption for univariate AR models with strongly exogenous variables is investigated. A two-stage approximate method is proposed that provides an estimator of the posterior predictive density for any future observation in a convenient closed form. Some properties of the proposed method are proven analytically for the one-step-ahead forecast. The precision of the proposed method is examined on simulated data and two real data sets, for forecasts up to twelve steps ahead, by comparison with a path sampling method. Most of the results for the two methods are quite close for short-period forecasts. In particular, when the sample size is sufficiently large, the estimated predictive density provided by the two-stage method converges asymptotically to the true density. A heuristic proof of this asymptotic property is also presented.
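As a rough illustration of the conjugate machinery involved (this is a generic sketch, not the paper's two-stage method), the following computes the closed-form one-step-ahead Student-t predictive density for an AR(1) model under a normal-gamma prior; the hyperparameters and simulated series are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def ar1_one_step_predictive(y, m0=0.0, v0=10.0, a0=2.0, b0=1.0):
    """Posterior predictive of y[n+1] given y[0..n]: a Student-t density."""
    x, z = y[:-1], y[1:]              # regress y_t on y_{t-1}
    vn = 1.0 / (1.0 / v0 + x @ x)     # posterior variance factor of phi
    mn = vn * (m0 / v0 + x @ z)       # posterior mean of phi
    an = a0 + len(z) / 2.0
    bn = b0 + 0.5 * (z @ z + m0**2 / v0 - mn**2 / vn)
    loc = mn * y[-1]                                  # predictive location
    scale = np.sqrt(bn / an * (1.0 + vn * y[-1]**2))  # predictive scale
    return stats.t(df=2 * an, loc=loc, scale=scale)

rng = np.random.default_rng(0)
y = np.empty(200); y[0] = 0.0
for t in range(1, 200):               # simulate an AR(1) path with phi = 0.6
    y[t] = 0.6 * y[t - 1] + rng.normal()
pred = ar1_one_step_predictive(y)
print(pred.mean(), pred.interval(0.95))
```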

2.
In a Bayesian setup, we consider the problem of predicting a dependent variable given an independent variable and past observations on the two variables. An asymptotic formula for the relevant posterior predictive density is worked out. Considering posterior quantiles and highest predictive density regions, we then characterize priors that ensure approximate frequentist validity of Bayesian prediction in the above setting. Application to regression models is also discussed.

3.
In this paper the Bayesian approach to nonlinear multivariate calibration is illustrated by applying the Gibbs sampler to the rhinoceros data given by Clarke (1992, Biometrics, 48(4), 1081–1094). It is shown that the point estimates obtained from the profile likelihoods and those calculated from the marginal posterior densities using improper priors are in most cases similar.
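For intuition only, here is a minimal Gibbs sampler for the much simpler linear, univariate calibration problem with flat (improper) priors — a toy analogue of the nonlinear multivariate analysis above; the data and the unknown x0 below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 30)
y = 2.0 + 1.5 * x + rng.normal(0, 0.5, 30)   # training data
y0 = 10.0                                     # new response; x0 is unknown

def gibbs_calibration(x, y, y0, n_iter=5000, x0_init=5.0):
    x0, draws = x0_init, []
    for _ in range(n_iter):
        # Augment the training pairs with the current (x0, y0) pair.
        X = np.column_stack([np.ones(len(x) + 1), np.append(x, x0)])
        z = np.append(y, y0)
        XtX_inv = np.linalg.inv(X.T @ X)
        coef_hat = XtX_inv @ X.T @ z
        resid = z - X @ coef_hat
        # sigma2 | x0, data (regression coefficients integrated out)
        sigma2 = (resid @ resid / 2) / rng.gamma((len(z) - 2) / 2)
        # (alpha, beta) | sigma2, x0, data: conjugate normal draw
        alpha, beta = rng.multivariate_normal(coef_hat, sigma2 * XtX_inv)
        # x0 | alpha, beta, sigma2: invert the regression at y0
        x0 = rng.normal((y0 - alpha) / beta, np.sqrt(sigma2) / abs(beta))
        draws.append(x0)
    return np.array(draws[1000:])            # drop burn-in

print(np.percentile(gibbs_calibration(x, y, y0), [2.5, 50, 97.5]))
```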

4.
We analyze the reliability of NASA composite pressure vessels by using a new Bayesian semiparametric model. The data set consists of lifetimes of pressure vessels, wrapped with a Kevlar fiber, grouped by spool, subject to different stress levels; 10% of the data are right censored. The model that we consider is a regression on the log‐scale for the lifetimes, with fixed (stress) and random (spool) effects. The prior of the spool parameters is nonparametric, namely they are a sample from a normalized generalized gamma process, which encompasses the well‐known Dirichlet process. The nonparametric prior is assumed to robustify inferences to misspecification of the parametric prior. Here, this choice of likelihood and prior yields a new Bayesian model in reliability analysis. Via a Bayesian hierarchical approach, it is easy to analyze the reliability of the Kevlar fiber by predicting quantiles of the failure time when a new spool is selected at random from the population of spools. Moreover, for comparative purposes, we review the most interesting frequentist and Bayesian models analyzing this data set. Our credibility intervals of the quantiles of interest for a new random spool are narrower than those derived by previous Bayesian parametric literature, although the predictive goodness‐of‐fit performances are similar. Finally, as an original feature of our model, by means of the discreteness of the random‐effects distribution, we are able to cluster the spools into three different groups. Copyright © 2012 John Wiley & Sons, Ltd.

5.
A variety of statistical problems (e.g. the x-intercept in linear regression, the abscissa of the point of intersection of two simple linear regression lines, or the point of extremum in quadratic regression) can be viewed as questions of inference on nonlinear functions of the parameters in the general linear regression model. In this paper, inferences are made on the threshold temperatures and summation constants in crop development. A Bayesian approach for the general formulation of this problem is developed. Using numerical integration, credibility intervals can be obtained for individual functions as well as for linear combinations of the functions of the parameters. The implementation of an odds-ratio procedure is facilitated by placing a proper prior on the ratio of the relevant parameters. Financially supported by the University of the Orange Free State Research Fund.
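A simulation-based version of the idea, for the x-intercept example mentioned above, looks as follows — a sketch with a flat prior and illustrative data, using posterior draws of the coefficients to get a credibility interval for the nonlinear function -beta0/beta1 (the paper itself works via numerical integration).

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(5, 25, 40)
y = -3.0 + 0.4 * x + rng.normal(0, 1.0, 40)

X = np.column_stack([np.ones_like(x), x])
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
ssr = np.sum((y - X @ beta_hat) ** 2)

draws = []
for _ in range(10000):
    sigma2 = (ssr / 2) / rng.gamma((len(y) - 2) / 2)   # sigma2 | y (flat prior)
    b0, b1 = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
    draws.append(-b0 / b1)                             # x-intercept draw
print(np.percentile(draws, [2.5, 97.5]))               # 95% credibility interval
```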

6.
We apply a Bayesian approach to the problem of prediction in an unbalanced growth curve model using noninformative priors. Because of the complexity of the model, no analytic forms of the predictive densities are available. We propose both approximations and a prediction-oriented Metropolis-Hastings sampling algorithm for two types of prediction, namely the prediction of future observations for a new subject and the prediction of future values for a partially observed subject. They are illustrated and compared through real data and simulation studies. Two of the approximations compare favorably with the approximation in Fearn (1975, Biometrika, 62, 89–100) and are very comparable to the more accurate Rao-Blackwellization from the Metropolis-Hastings sampling algorithm.
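As background for the sampling algorithm mentioned above, here is the generic random-walk Metropolis-Hastings engine — not the paper's prediction-oriented variant — targeting an arbitrary log-posterior; the toy bivariate-normal target is an illustrative assumption.

```python
import numpy as np

def rw_metropolis(log_post, theta0, n_iter=10000, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.normal(size=theta.size)  # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:           # accept/reject step
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Example: sample a standard bivariate normal "posterior".
chain = rw_metropolis(lambda t: -0.5 * np.sum(t**2), [0.0, 0.0])
print(chain[2000:].mean(axis=0), chain[2000:].std(axis=0))
```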

7.
Bayesian inference is considered for seemingly unrelated regressions with an elliptically contoured error distribution. We show that the posterior distribution of the regression parameters and the predictive distribution of future observations under the elliptical-error assumption are identical to those obtained under independently distributed normal errors when an improper prior is used. This gives inference robustness with respect to departures from the reference case of independent sampling from the normal distribution.

8.
The continuity of the densities given by the weight functions n_α, α ∈ [−1, ∞), with respect to the parameter α is investigated. This work is supported by MIUR Italy, Program Barrande n. 2003-009-2, MSM6198898701 and GA ČR no. 201/04/0381.

9.
Under the assumption of normality, change-point problems fall into four cases according to how the mean and the variance change. In this paper, the test for threshold nonlinearity in the TAR model is treated as the change-point problem in which the mean changes while the variance remains constant. Reversible-jump Markov chain Monte Carlo (RJMCMC) is then used to compute the posterior probabilities of the two competing models (the AR and TAR models). A posterior probability favoring the TAR model indicates the presence of threshold nonlinearity. Simulation results show that this test based on Bayesian inference distinguishes well between AR and TAR models.
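A much cruder stand-in for the RJMCMC comparison — useful only to fix ideas — is to approximate the posterior model probabilities via BIC, using exp(-BIC/2) as an approximate marginal likelihood; the sketch below fits AR(1) and a two-regime TAR(1) with the threshold fixed at 0 for simplicity, all of which are illustrative simplifications.

```python
import numpy as np

def bic_ls(X, z):
    """BIC of a Gaussian least-squares fit of z on the columns of X."""
    beta, _, _, _ = np.linalg.lstsq(X, z, rcond=None)
    n = len(z)
    sigma2 = np.sum((z - X @ beta) ** 2) / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + (X.shape[1] + 1) * np.log(n)

rng = np.random.default_rng(3)
y = np.zeros(500)
for t in range(1, 500):                      # simulate a TAR(1) process
    phi = 0.7 if y[t - 1] <= 0 else -0.4
    y[t] = phi * y[t - 1] + rng.normal()

x, z = y[:-1], y[1:]
X_ar = x[:, None]
X_tar = np.column_stack([x * (x <= 0), x * (x > 0)])   # regime-specific slopes
bics = np.array([bic_ls(X_ar, z), bic_ls(X_tar, z)])
probs = np.exp(-0.5 * (bics - bics.min()))
print(dict(zip(["AR", "TAR"], probs / probs.sum())))
```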

10.
It is often the case that some information is available on the parameter of a failure time distribution from previous experiments or analyses of failure time data. The Bayesian approach provides the methodology for incorporating this previous information with the current data. In this paper, given a progressively type II censored sample from a Rayleigh distribution, Bayesian estimators and credible intervals are obtained for the parameter and the reliability function. We also derive the Bayes predictive estimator and the highest posterior density prediction interval for future observations. Two numerical examples are presented for illustration, and a simulation study and comparisons are performed. Copyright © 2006 John Wiley & Sons, Ltd.
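To make the conjugate structure concrete, here is a sketch of the posterior updating under the parameterization f(x; lam) = 2·lam·x·exp(-lam·x²) with a Gamma(a, b) prior on lam; the censoring scheme, data, and prior values are illustrative assumptions, not the paper's examples.

```python
import numpy as np

x = np.array([0.8, 1.1, 1.5, 1.9, 2.4])   # observed (ordered) failure times
R = np.array([1, 0, 2, 0, 1])             # units removed at each failure
a, b = 2.0, 1.0                            # gamma prior: shape a, rate b

# Posterior is Gamma(a + m, b + sum((1 + R_i) * x_i^2)): conjugacy holds
# because each censored unit contributes the survival term exp(-lam * x^2).
m = len(x)
a_n = a + m
b_n = b + np.sum((1 + R) * x**2)

lam_bayes = a_n / b_n                      # Bayes estimator (squared-error loss)
t = 1.0
rel_bayes = (b_n / (b_n + t**2)) ** a_n    # posterior mean of R(t) = exp(-lam*t^2)
print(lam_bayes, rel_bayes)
```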

11.
New Bayesian cohort models designed to resolve the identification problem in cohort analysis are proposed in this paper. First, the basic cohort model, which represents the statistical structure of time-series social survey data in terms of age, period and cohort effects, is explained. The logit cohort model for qualitative data from a binomial distribution and the normal-type cohort model for quantitative data from a normal distribution are considered as two special cases of the basic model. In order to overcome the identification problem in cohort analysis, a Bayesian approach is adopted, based on the assumption that the effect parameters change gradually. A Bayesian information criterion, ABIC, is introduced for the selection of the optimal model. This approach is flexible enough that both the logit and the normal-type cohort models can be applied not only to standard cohort tables but also to general cohort tables in which the width of the age groups is not equal to the interval between periods. The practical utility of the proposed models is demonstrated by analysing two data sets from the literature on cohort analysis.

12.
A method for determining Bayesian estimators of the special ratios of variance components known as intraclass correlation coefficients is presented. The exact posterior distribution for these ratios of variance components is obtained, and the approximate posterior mean of this distribution is also derived. All computations are non-iterative and avoid numerical integration.

13.
In this paper we investigate a Bayesian procedure for the estimation of a flexible generalised distribution, notably the MacGillivray adaptation of the g-and-k distribution. This distribution, described through its inverse cdf or quantile function, generalises the standard normal through extra parameters which together describe skewness and kurtosis. The standard quantile-based methods for estimating the parameters of generalised distributions are often arbitrary and do not rely on computation of the likelihood. MCMC, however, provides a simulation-based alternative for obtaining the maximum likelihood estimates of the parameters of these distributions or for deriving posterior estimates of the parameters through a Bayesian framework. In this paper we adopt the latter approach. The proposed methodology is illustrated through an application in which the variable of interest is slightly skewed.
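For reference, the quantile function of the standard g-and-k distribution (with the usual fixed overall-asymmetry constant c = 0.8) and simulation by inversion look as follows; the parameter values below are illustrative, and this sketch covers only the distribution itself, not the paper's MCMC estimation.

```python
import numpy as np
from scipy.stats import norm

def gk_quantile(u, a, b, g, k, c=0.8):
    """Q(u) = a + b*(1 + c*tanh(g*z/2))*(1 + z^2)^k * z,  z = Phi^{-1}(u)."""
    z = norm.ppf(u)
    return a + b * (1 + c * np.tanh(g * z / 2)) * (1 + z**2) ** k * z

rng = np.random.default_rng(4)
u = rng.uniform(size=100000)
draws = gk_quantile(u, a=0.0, b=1.0, g=0.5, k=0.1)   # mildly skewed, heavy-tailed
print(np.percentile(draws, [5, 50, 95]))
```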

14.
The aim of this paper is to introduce a new methodology for operational risk management, based on Bayesian copulae. One of the main problems in operational risk management is understanding the complex dependence structure of the associated variables. In order to model this structure in a flexible way, we construct a method based on copulae. This allows us to split the joint multivariate probability distribution of a random vector of losses into individual components characterized by univariate marginals. Copula functions thus embody all the information about the dependence between variables and provide a useful technique for modelling the dependency of a large number of marginals. Another important problem in operational risk modelling is the lack of loss data. This suggests the use of Bayesian models, computed via simulation methods and, in particular, Markov chain Monte Carlo. We propose a new methodology for modelling operational risk and for estimating the required capital that combines the use of copulae and Bayesian models.
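The copula mechanics alone can be sketched as follows: dependent losses for two business lines are simulated through a Gaussian copula with lognormal marginals, and a VaR-style capital figure is read off. The correlation, marginal parameters, and confidence level are illustrative assumptions; the paper estimates the copula within a Bayesian MCMC framework rather than fixing it as done here.

```python
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(5)
rho = 0.5                                   # assumed copula correlation
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal([0, 0], cov, size=100000)
u = norm.cdf(z)                             # Gaussian copula: correlated uniforms
loss1 = lognorm.ppf(u[:, 0], s=1.0, scale=np.exp(8.0))   # line-1 marginal
loss2 = lognorm.ppf(u[:, 1], s=1.2, scale=np.exp(7.5))   # line-2 marginal
total = loss1 + loss2

print("VaR 99.9%:", np.percentile(total, 99.9))          # capital estimate
```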

15.
A finite mixture model using the multivariate t distribution has been well recognized as a robust extension of Gaussian mixtures. This paper presents an efficient PX-EM algorithm for supervised learning of multivariate t mixture models in the presence of missing values. To simplify the development of new theoretic results and facilitate the implementation of the PX-EM algorithm, two auxiliary indicator matrices are incorporated into the model and shown to be effective. The proposed methodology is a flexible mixture analyzer that allows practitioners to handle real-world multivariate data sets with complex missing patterns in a more efficient manner. The performance of computational aspects is investigated through a simulation study and the procedure is also applied to the analysis of real data with varying proportions of synthetic missing values.

16.
Bayesian Nonparametric Analysis for a Generalized Dirichlet Process Prior
This paper considers a generalization of the Dirichlet process which is obtained by suitably normalizing superposed independent gamma processes having increasing integer-valued scale parameter. A comprehensive treatment of this random probability measure is provided. We prove results concerning its finite-dimensional distributions, moments, predictive distributions and the distribution of its mean. Most expressions are given in terms of multiple hypergeometric functions, thus highlighting the interplay between Bayesian Nonparametrics and special functions. Finally, a suitable simulation algorithm is applied in order to compute quantities of statistical interest.
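For orientation, the Dirichlet process special case that this construction generalizes can be sampled via a truncated stick-breaking representation, as in the sketch below; the truncation level and base measure are illustrative choices, and the paper's normalized superposed-gamma prior requires a different algorithm.

```python
import numpy as np

def dp_sample(alpha, base_rvs, n_atoms=500, rng=None):
    """Atoms and weights of a truncated draw from DP(alpha, base measure)."""
    if rng is None:
        rng = np.random.default_rng()
    v = rng.beta(1.0, alpha, size=n_atoms)        # stick-breaking proportions
    w = v * np.cumprod(np.append(1.0, 1.0 - v[:-1]))
    atoms = base_rvs(n_atoms, rng)                # iid draws from the base measure
    return atoms, w / w.sum()                     # renormalize after truncation

atoms, w = dp_sample(2.0, lambda n, r: r.normal(0.0, 1.0, n))
print("mean functional of this DP draw:", np.sum(w * atoms))
```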

17.
Tremendous progress has been made in the last two decades in the area of high-dimensional regression, especially in the “large p, small n” setting. Such sample-starved settings inevitably lead to models which are potentially very unstable and hence quite unreliable. To this end, Bayesian shrinkage methods have generated a lot of recent interest in the modern high-dimensional regression and model selection context. Such methods span the wide spectrum of modern regression approaches and include, among others, spike-and-slab priors, the Bayesian lasso, ridge regression, and global-local shrinkage priors such as the Horseshoe prior and the Dirichlet–Laplace prior. These methods naturally facilitate tractable uncertainty quantification and have thus been used extensively across diverse applications. A common unifying feature of these models is that the corresponding priors on the regression coefficients can be expressed as a scale mixture of normals. This property has been leveraged extensively to develop various three-step Gibbs samplers to explore the corresponding intractable posteriors. The convergence of such samplers, however, is very slow in high-dimensional settings, disconnecting them from the very setting they are intended for. To address this challenge, we propose a comprehensive and unifying framework for drawing from the same family of posteriors via a class of tractable and scalable two-step blocked Gibbs samplers. We demonstrate that our proposed class of two-step blocked samplers exhibits vastly superior convergence behavior compared to the original three-step sampler in high-dimensional regimes on simulated data as well as data from a variety of applications including gene expression data, infrared spectroscopy data, and socio-economic/law-enforcement data. We also provide a detailed theoretical underpinning for the new method by deriving explicit upper bounds for the (geometric) rate of convergence, and by proving that the proposed two-step sampler has superior spectral properties. Supplementary material for this article is available online.
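The two-step blocking idea can be sketched for one member of the scale-mixture family, the Bayesian lasso: step 1 draws (sigma2, beta) jointly given the local scales tau (sigma2 from its marginal with beta integrated out via the Woodbury identity, then beta | sigma2), and step 2 draws tau | beta, sigma2 with the Park–Casella inverse-Gaussian conditional. The fixed penalty `lam`, the improper prior on sigma2, and the data are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from scipy.stats import invgauss

def two_block_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    tau = np.ones(p)                             # local variance scales tau_j^2
    betas = np.empty((n_iter, p))
    for i in range(n_iter):
        # Block 1: (sigma2, beta) | tau, y
        A = X.T @ X + np.diag(1.0 / tau)
        A_inv = np.linalg.inv(A)
        mean_beta = A_inv @ X.T @ y
        q = y @ y - y @ X @ mean_beta            # y'(I + X D X')^{-1} y via Woodbury
        sigma2 = (q / 2) / rng.gamma(n / 2)      # sigma2 | tau, y (beta integrated out)
        beta = rng.multivariate_normal(mean_beta, sigma2 * A_inv)
        # Block 2: tau | beta, sigma2 -- inverse-Gaussian draw for 1/tau_j
        mu = np.sqrt(lam**2 * sigma2 / beta**2)
        inv_tau = invgauss.rvs(mu / lam**2, scale=lam**2, random_state=rng)
        tau = 1.0 / inv_tau
        betas[i] = beta
    return betas

rng = np.random.default_rng(6)
X = rng.normal(size=(50, 10))
y = X[:, 0] * 2.0 + rng.normal(size=50)          # sparse truth: only beta_1 != 0
print(two_block_lasso_gibbs(X, y)[500:].mean(axis=0).round(2))
```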

18.
In the common Fourier regression model we determine the optimal designs for estimating the coefficients corresponding to the lower frequencies. An analytical solution is provided which is found by an alternative characterization of c-optimal designs. Several examples are provided and the performance of the D-optimal design with respect to the estimation of the lower order coefficients is investigated. The results give a complete answer to an open question which was recently raised in the literature.

19.
This paper proposes a conditional technique for the estimation of VaR and expected shortfall measures based on the skewed generalized t (SGT) distribution. The estimation of the conditional mean and conditional variance of returns is based on ten popular variations of the GARCH model. The results indicate that the TS-GARCH and EGARCH models have the best overall performance. The remaining GARCH specifications, except in a few cases, produce acceptable results. An unconditional SGT-VaR performs well in an in-sample evaluation but fails the tests in an out-of-sample evaluation. The latter indicates the need to incorporate time-varying mean and volatility estimates in the computation of VaR and expected shortfall measures.
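The conditional VaR/ES computation can be sketched with a GARCH(1,1) filter and symmetric Student-t innovations — a stand-in for the SGT, named as such — with the GARCH parameters fixed by assumption rather than estimated from ten model variants as in the paper; the simulated returns are illustrative.

```python
import numpy as np
from scipy.stats import t as student_t

def garch_var_es(returns, omega=0.05, alpha=0.08, beta=0.90, nu=6, level=0.01):
    sigma2 = np.var(returns)                 # initialize at the sample variance
    for r in returns:                        # GARCH(1,1) variance recursion
        sigma2 = omega + alpha * r**2 + beta * sigma2
    scale = np.sqrt(sigma2 * (nu - 2) / nu)  # rescale t to unit variance
    q = student_t.ppf(level, nu)
    var = scale * q                          # 1% one-day VaR (a negative return)
    es = -scale * student_t.pdf(q, nu) * (nu + q**2) / ((nu - 1) * level)
    return var, es

rng = np.random.default_rng(7)
rets = rng.standard_t(6, 1000) * 1.2         # illustrative daily returns (%)
print(garch_var_es(rets))
```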

20.
Constrained optimal discrimination designs for Fourier regression models
In this article, the problem of constructing efficient discrimination designs in a Fourier regression model is considered. We propose designs which maximize the power of the F-test, which discriminates between the two highest order models, subject to the constraints that the tests that discriminate between lower order models have at least some given relative power. A complete solution is presented in terms of the canonical moments of the optimal designs, and for the special case of equal constraints even more specific formulae are available.
