Similar articles (20 results)
1.
Univariate or multivariate ordinal responses are often assumed to arise from a latent continuous parametric distribution, with covariate effects that enter linearly. We introduce a Bayesian nonparametric modeling approach for univariate and multivariate ordinal regression, which is based on mixture modeling for the joint distribution of latent responses and covariates. The modeling framework enables highly flexible inference for ordinal regression relationships, avoiding assumptions of linearity or additivity in the covariate effects. In standard parametric ordinal regression models, computational challenges arise from identifiability constraints and estimation of parameters requiring nonstandard inferential techniques. A key feature of the nonparametric model is that it achieves inferential flexibility, while avoiding these difficulties. In particular, we establish full support of the nonparametric mixture model under fixed cut-off points that relate through discretization the latent continuous responses with the ordinal responses. The practical utility of the modeling approach is illustrated through application to two datasets from econometrics, an example involving regression relationships for ozone concentration, and a multirater agreement problem. Supplementary materials with technical details on theoretical results and on computation are available online.

2.
A novel method is proposed to compute the Bayes estimate for a logistic Gaussian process prior for density estimation. The method gains speed by drawing samples from the posterior of a finite-dimensional surrogate prior, which is obtained by imputation of the underlying Gaussian process. We establish that imputation results in quite accurate computation. Simulation studies show that accuracy and high speed can be combined. This fact, along with the known flexibility of logistic Gaussian priors for modeling smoothness and recent results on their large support, makes these priors and the resulting density estimate very attractive.

3.
This paper focuses on the situation in which the parameters of an NHPP model change during the software testing process, and carries out a Bayesian change-point analysis of the GGO model. An MCMC method based on Gibbs sampling is used to simulate Markov chains for the posterior distributions of the parameters, and the BUGS software package is then used to build and simulate the model on the Musa software failure dataset. The results show that the model is intuitive and effective for change-point analysis in software reliability.
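As a rough illustration of the kind of computation involved, the following is a minimal sketch of Gibbs sampling for a simple Poisson change-point model with conjugate gamma priors; it is not the paper's GGO/NHPP model, and the synthetic interval counts merely stand in for the Musa failure data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic interval failure counts with a change point (stand-in for real data).
y = np.concatenate([rng.poisson(6.0, 40), rng.poisson(2.0, 30)])
n = len(y)

a, b = 1.0, 1.0          # Gamma(a, b) priors on both rates
n_iter, burn = 5000, 1000
k = n // 2               # initial change point
draws = np.empty((n_iter, 3))
cum = np.cumsum(y)

for t in range(n_iter):
    # Conjugate updates for the two rates given the current change point k.
    lam1 = rng.gamma(a + y[:k].sum(), 1.0 / (b + k))
    lam2 = rng.gamma(a + y[k:].sum(), 1.0 / (b + n - k))
    # Full conditional of the change point: discrete, evaluated on a log scale.
    ks = np.arange(1, n)
    logp = (cum[ks - 1] * np.log(lam1) - ks * lam1
            + (cum[-1] - cum[ks - 1]) * np.log(lam2) - (n - ks) * lam2)
    p = np.exp(logp - logp.max())
    k = rng.choice(ks, p=p / p.sum())
    draws[t] = (lam1, lam2, k)

post = draws[burn:]
print("posterior means (rate before, rate after, change point):", post.mean(axis=0))
```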

4.
This article proposes a probability model for k-dimensional ordinal outcomes, that is, it considers inference for data recorded in k-dimensional contingency tables with ordinal factors. The proposed approach is based on full posterior inference, assuming a flexible underlying prior probability model for the contingency table cell probabilities. We use a variation of the traditional multivariate probit model, with latent scores that determine the observed data. In our model, a mixture of normals prior replaces the usual single multivariate normal model for the latent variables. By augmenting the prior model to a mixture of normals we generalize inference in two important ways. First, we allow for varying local dependence structure across the contingency table. Second, inference in ordinal multivariate probit models is plagued by problems related to the choice and resampling of cutoffs defined for these latent variables. We show how the proposed mixture model approach entirely removes these problems. We illustrate the methodology with two examples, one simulated dataset and one dataset of interrater agreement.

5.
In comparing two populations, sometimes a model incorporating a certain probability order is desired. In this setting, Bayesian modeling is attractive since a probability order restriction imposed a priori on the population distributions is retained a posteriori. Extending the work in Gelfand and Kottas (2001) for stochastic order specifications, we formulate modeling for distributions ordered in variability. We work with Dirichlet process mixtures resulting in a fully Bayesian semiparametric approach. The details for simulation-based model fitting and prior specification are provided. An example, based on two small subsets of time intervals between eruptions of the Old Faithful geyser, illustrates the methodology.

6.
It is common in practice to estimate the quantiles of a complicated distribution by using the order statistics of a simulated sample. If the distribution of interest has a known population mean, then it is often possible to improve the mean square error of the standard quantile estimator substantially through the simple device of mean-correction: subtract off the sample mean and add on the known population mean. Asymptotic results for the mean-corrected quantile estimator are derived and compared to the standard sample quantile. Simulation results for a variety of distributions and processes illustrate the asymptotic theory. Application to Markov chain Monte Carlo and to simulation-based uncertainty analysis is described.
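As a hedged illustration only (not the paper's asymptotic analysis), the sketch below compares the standard sample quantile with its mean-corrected version by Monte Carlo for an Exp(1) target, whose population mean is known to be 1; the distribution, sample size, and quantile level are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def quantile_estimators(sample, q, true_mean):
    """Standard sample quantile and its mean-corrected version."""
    standard = np.quantile(sample, q)
    # Mean-correction: subtract the sample mean, add the known population mean.
    corrected = np.quantile(sample - sample.mean() + true_mean, q)
    return standard, corrected

# Compare mean squared errors for an Exp(1) target (known mean 1, known quantile).
q, n, reps = 0.9, 200, 5000
true_q = -np.log(1 - q)                 # 0.9 quantile of Exp(1)
ests = np.array([quantile_estimators(rng.exponential(1.0, n), q, 1.0)
                 for _ in range(reps)])
mse = ((ests - true_q) ** 2).mean(axis=0)
print("MSE of standard vs mean-corrected estimator:", mse)
```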

7.
Widely used parametric generalized linear models are, unfortunately, a somewhat limited class of specifications. Nonparametric aspects are often introduced to enrich this class, resulting in semiparametric models. Focusing on single or k-sample problems, many classical nonparametric approaches are limited to hypothesis testing. Those that allow estimation are limited to certain functionals of the underlying distributions. Moreover, the associated inference often relies upon asymptotics, even though nonparametric specifications are most appealing for smaller sample sizes. Bayesian nonparametric approaches avoid asymptotics but have, to date, been limited in the range of inference. Working with Dirichlet process priors, we overcome the limitations of existing simulation-based model fitting approaches, which yield inference that is confined to posterior moments of linear functionals of the population distribution. This article provides a computational approach to obtain the entire posterior distribution for more general functionals. We illustrate with three applications: investigation of extreme value distributions associated with a single population, comparison of medians in a k-sample problem, and comparison of survival times from different populations under fairly heavy censoring.
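To make the idea of simulating the posterior of a general functional concrete, here is a minimal sketch (not the authors' algorithm) that draws from a Dirichlet process posterior via a truncated stick-breaking representation and records the induced posterior distribution of the median; the base measure, precision parameter, and data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def draw_dp_posterior_median(data, alpha=1.0, trunc=500):
    """One draw of the median functional under the DP posterior,
    approximated by truncated stick-breaking (Sethuraman representation)."""
    n = len(data)
    # Atoms come from the updated base measure: a data point w.p. n/(alpha+n),
    # otherwise a fresh draw from the prior base measure G0 (here N(0, 10^2)).
    from_data = rng.random(trunc) < n / (alpha + n)
    atoms = np.where(from_data, rng.choice(data, trunc), rng.normal(0, 10, trunc))
    # Stick-breaking weights with total mass alpha + n, renormalized after truncation.
    v = rng.beta(1, alpha + n, trunc)
    w = v * np.concatenate([[1.0], np.cumprod(1 - v[:-1])])
    w /= w.sum()
    # Median of the (discrete) random distribution G.
    order = np.argsort(atoms)
    cdf = np.cumsum(w[order])
    return atoms[order][np.searchsorted(cdf, 0.5)]

data = rng.lognormal(0.0, 0.75, 60)          # illustrative sample
medians = np.array([draw_dp_posterior_median(data) for _ in range(2000)])
print("posterior mean and 95% interval for the median:",
      medians.mean(), np.quantile(medians, [0.025, 0.975]))
```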

8.
Gaussian process models have been widely used in spatial statistics but face tremendous modeling and computational challenges for very large nonstationary spatial datasets. To address these challenges, we develop a Bayesian modeling approach using a nonstationary covariance function constructed based on adaptively selected partitions. The partitioned nonstationary class allows one to knit together local covariance parameters into a valid global nonstationary covariance for prediction, where the local covariance parameters are allowed to be estimated within each partition to reduce computational cost. To further facilitate the computations in local covariance estimation and global prediction, we use the full-scale covariance approximation (FSA) approach for the Bayesian inference of our model. One of our contributions is to model the partitions stochastically by embedding a modified treed partitioning process into the hierarchical models, which leads to automated partitioning and substantial computational benefits. We illustrate the utility of our method with simulation studies and the global Total Ozone Mapping Spectrometer (TOMS) data. Supplementary materials for this article are available online.

9.
In this article, we model multivariate categorical (binary and ordinal) response data using a very rich class of scale mixture of multivariate normal (SMMVN) link functions to accommodate heavy-tailed distributions. We consider both noninformative and informative prior distributions for SMMVN-link models. The notion of informative prior elicitation is based on the availability of similar historical studies. The main objectives of this article are (i) to derive theoretical properties of the noninformative and informative priors as well as the resulting posteriors and (ii) to develop an efficient Markov chain Monte Carlo algorithm to sample from the resulting posterior distribution. A real data example from prostate cancer studies is used to illustrate the proposed methodologies.

10.
The members of a set of conditional probability density functions are called compatible if there exists a joint probability density function that generates them. We generalize this concept by calling the conditionals functionally compatible if there exists a non-negative function that behaves like a joint density as far as generating the conditionals according to the probability calculus, but whose integral over the whole space is not necessarily finite. A necessary and sufficient condition for functional compatibility is given that provides a method of calculating this function, if it exists. A Markov transition function is then constructed using a set of functionally compatible conditional densities and it is shown, using the compatibility results, that the associated Markov chain is positive recurrent if and only if the conditionals are compatible. A Gibbs Markov chain, constructed via “Gibbs conditionals” from a hierarchical model with an improper posterior, is a special case. Therefore, the results of this article can be used to evaluate the consequences of applying the Gibbs sampler when the posterior's impropriety is unknown to the user. Our results cannot, however, be used to detect improper posteriors. Monte Carlo approximations based on Gibbs chains are shown to have undesirable limiting behavior when the posterior is improper. The results are applied to a Bayesian hierarchical one-way random effects model with an improper posterior distribution. The model is simple, but also quite similar to some models with improper posteriors that have been used in conjunction with the Gibbs sampler in the literature.
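A standard toy example of functionally compatible but incompatible conditionals is the pair X | Y = y ~ Exp(y) and Y | X = x ~ Exp(x), which are generated by h(x, y) = exp(-xy), whose integral over (0, inf)^2 diverges, so no proper joint exists. The sketch below is an assumption-laden illustration of the phenomenon (not the article's random effects model): it runs the corresponding Gibbs chain and shows the ergodic average of X failing to settle.

```python
import numpy as np

rng = np.random.default_rng(3)

# Exponential conditionals: X | Y=y ~ Exp(rate y), Y | X=x ~ Exp(rate x).
# Functionally compatible (via h(x, y) = exp(-x*y)) but not compatible, since
# h is not integrable; the Gibbs chain is therefore not positive recurrent.
n_iter = 200_000
x, y = 1.0, 1.0
running_mean = np.empty(n_iter)
s = 0.0
for t in range(n_iter):
    x = rng.exponential(1.0 / y)     # numpy parameterizes by scale = 1 / rate
    y = rng.exponential(1.0 / x)
    s += x
    running_mean[t] = s / (t + 1)

# The "ergodic average" of X never stabilizes; it wanders as the chain drifts.
print(running_mean[[999, 9_999, 99_999, n_iter - 1]])
```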

11.
Very often, one needs to perform (classical or Bayesian) inference when essentially nothing is known about the distribution of the dependent variable given certain covariates. The paper proposes to approximate the unknown distribution by its non-parametric counterpart, a step function, and to treat the points of the support and the corresponding density values as parameters, whose posterior distributions should be determined based on the available data. The paper proposes Markov chain Monte Carlo methods to perform the posterior analysis, and applies the new method to an analysis of stock returns.

12.
Bayesian inference using Markov chain Monte Carlo (MCMC) is computationally prohibitive when the posterior density of interest, π, is computationally expensive to evaluate. We develop a derivative-free algorithm GRIMA to accurately approximate π by interpolation over its high-probability density (HPD) region, which is initially unknown. Our local approach reduces the waste of computational budget on approximation of π in the low-probability region, which is inherent in global experimental designs. However, estimation of the HPD region is nontrivial when derivatives of π are not available or are not informative about the shape of the HPD region. Without relying on derivatives, GRIMA iterates (a) sequential knot selection over the estimated HPD region of π to refine the surrogate posterior and (b) re-estimation of the HPD region using an MCMC sample from the updated surrogate density, which is inexpensive to obtain. GRIMA is applicable to approximation of general unnormalized posterior densities. To determine the range of tractable problem dimensions, we conduct simulation experiments on test densities with linear and nonlinear component-wise dependence, skewness, kurtosis and multimodality. Subsequently, we use GRIMA in a case study to calibrate a computationally intensive nonlinear regression model to real data from the Town Brook watershed. Supplemental materials for this article are available online.

13.
This article proposes a four-pronged approach to efficient Bayesian estimation and prediction for complex Bayesian hierarchical Gaussian models for spatial and spatiotemporal data. The method involves reparameterizing the covariance structure of the model, reformulating the mean structure, marginalizing the joint posterior distribution, and applying a simplex-based slice sampling algorithm. The approach permits fusion of point-source data and areal data measured at different resolutions and accommodates nonspatial correlation and variance heterogeneity as well as spatial and/or temporal correlation. The method produces Markov chain Monte Carlo samplers with low autocorrelation in the output, so that fewer iterations are needed for Bayesian inference than would be the case with other sampling algorithms. Supplemental materials are available online.

14.
When simulating a multivariate density with multiple humps, Markov chain Monte Carlo methods encounter the obstacle of escaping from one hump to another: such transitions usually take an extraordinarily long time, making the simulation practically infeasible. To overcome these difficulties, a reversible scheme for generating a Markov chain is suggested under which the simulation, in rather general cases, avoids becoming trapped in local humps.

15.
In this article, Swendsen–Wang–Wolff algorithms are extended to simulate spatial point processes with symmetric and stationary interactions. Convergence of these algorithms is considered. Some further generalizations of the algorithms are discussed. The ideas presented in this article can also be useful in handling some large and complicated systems.

16.
In latent Dirichlet allocation, the number of topics, T, is a hyperparameter of the model that must be specified before one can fit the model. The need to specify T in advance is restrictive. One way of dealing with this problem is to put a prior on T, but unfortunately the distribution on the latent variables of the model is then a mixture of distributions on spaces of different dimensions, and estimating this mixture distribution by Markov chain Monte Carlo is very difficult. We present a variant of the Metropolis–Hastings algorithm that can be used to estimate this mixture distribution, and in particular the posterior distribution of the number of topics. We evaluate our methodology on synthetic data and compare it with procedures that are currently used in the machine learning literature. We also give an illustration on two collections of articles from Wikipedia. Supplemental materials for this article are available online.

17.
Since Ferguson's seminal article on the Dirichlet process, the area of Bayesian nonparametric statistics has seen development of many flexible prior classes. At the center of the development lies the neutral to the right (NTR) process proposed by Doksum. Although the class of NTR processes is very rich in its members and has well-developed theoretical properties, its application has been restricted to very small portions of the class—mainly the Dirichlet, gamma, and beta processes. We believe that this is due to the lack of flexible computational algorithms that can be used as a component in a Markov chain Monte Carlo (MCMC) algorithm.

The main purpose of this article is to introduce a collection of algorithms (or a tool box), some already available in the literature and others newly proposed here, so that one can construct a suitable combination of algorithms from this collection to solve one's problem.

18.
The accurate estimation of the outstanding liabilities of an insurance company is an essential task, not only to meet regulatory requirements but also to achieve efficient internal capital management. In recent years, there has been increasing interest in utilising insurance data at a more granular level and in modelling claims using stochastic processes. So far, this so-called 'micro-level reserving' approach has mainly focused on the Poisson process. In this paper, we propose and apply a Cox process approach to model the arrival process and reporting pattern of insurance claims. This allows for over-dispersion and serial dependency in claim counts, which are typical features of real data. We explicitly consider risk exposure and reporting delays, and show how to use our model to predict the number of Incurred-But-Not-Reported (IBNR) claims. The model is calibrated and illustrated using real data from the AUSI data set.

19.
Spatial Regression Models for Extremes
Meteorological data are often recorded at a number of spatial locations. This gives rise to the possibility of pooling data through a spatial model to overcome some of the limitations imposed on an extreme value analysis by a lack of information. In this paper we develop a spatial model for extremes based on a standard representation for site-wise extremal behavior, combined with a spatial latent process for parameter variation over the region. A smooth, but possibly non-linear, spatial structure is an intrinsic feature of the model, and difficulties in computation are solved using Markov chain Monte Carlo inference. A simulation study is carried out to illustrate the potential gain in efficiency achieved by the spatial model. Finally, the model is applied to data generated from a climatological model in order to characterize the hurricane climate of the Gulf and Atlantic coasts of the United States.

20.
This paper studies Bayesian estimation of the parameters of a Dirichlet population and of other quantities of interest. A uniform prior distribution is placed on a practically meaningful function of the parameters, and the Metropolis algorithm is applied to suitably transformed parameters to obtain Markov chain Monte Carlo samples from the posterior, from which Bayesian estimates of the parameters and of the other quantities of interest are obtained.
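A minimal sketch of the general recipe, under illustrative assumptions: random-walk Metropolis on the log-transformed Dirichlet parameters with a flat prior on the log scale (the paper places a uniform prior on a practically meaningful function of the parameters, which this sketch does not reproduce), applied to synthetic data.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(4)

# Synthetic data from a Dirichlet(2, 5, 3) population (stand-in for real data).
true_alpha = np.array([2.0, 5.0, 3.0])
data = rng.dirichlet(true_alpha, size=200)
n, S = len(data), np.log(data).sum(axis=0)        # sufficient statistics

def log_post(theta):
    """Log posterior of theta = log(alpha) under a flat prior on theta (illustrative)."""
    alpha = np.exp(theta)
    return n * (gammaln(alpha.sum()) - gammaln(alpha).sum()) + ((alpha - 1) * S).sum()

theta = np.zeros(3)                               # start at alpha = (1, 1, 1)
cur = log_post(theta)
draws = []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.1, size=3)     # random-walk proposal on the log scale
    new = log_post(prop)
    if np.log(rng.random()) < new - cur:          # Metropolis accept/reject
        theta, cur = prop, new
    draws.append(np.exp(theta))

alpha_draws = np.array(draws[5_000:])             # discard burn-in
print("posterior means of alpha:", alpha_draws.mean(axis=0))
# A derived quantity of interest: the mean vector alpha / sum(alpha).
print("posterior mean of E[X]:", (alpha_draws / alpha_draws.sum(1, keepdims=True)).mean(axis=0))
```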
