Similar Articles
20 similar articles found (search time: 15 ms)
1.
In this paper, we develop a semi-parametric Bayesian estimation approach through the Dirichlet process (DP) mixture in fitting linear mixed models. The random-effects distribution is specified by introducing a multivariate skew-normal distribution as the base distribution of the Dirichlet process. The proposed approach efficiently handles modeling issues across a wide range of non-normally distributed random effects. We adopt Gibbs sampling techniques to obtain the parameter estimates. A small simulation study shows that the proposed DP prior yields better predictions of the random effects. Two real data sets are analyzed under several candidate models to illustrate the usefulness of the proposed approach.
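As a loose illustration of the ingredients named above (a DP with a skew-normal base, not the authors' actual sampler), the following sketch draws one truncated stick-breaking realization of DP(α, G0), where G0 is a skew-normal generated via Azzalini's |Z| representation. The truncation level K and all function names are our own.

```python
import math
import random

def skew_normal(loc=0.0, scale=1.0, shape=2.0, rng=random):
    # Azzalini representation: delta*|Z1| + sqrt(1 - delta^2)*Z2
    delta = shape / math.sqrt(1.0 + shape * shape)
    z1, z2 = abs(rng.gauss(0, 1)), rng.gauss(0, 1)
    return loc + scale * (delta * z1 + math.sqrt(1.0 - delta * delta) * z2)

def dp_stick_breaking(alpha, base_draw, K=200, rng=random):
    """Truncated stick-breaking draw from DP(alpha, G0): returns (weights, atoms)."""
    weights, atoms, remaining = [], [], 1.0
    for _ in range(K):
        v = rng.betavariate(1.0, alpha)   # stick proportions v_k ~ Beta(1, alpha)
        weights.append(remaining * v)
        atoms.append(base_draw())
        remaining *= (1.0 - v)
    weights.append(remaining)             # leftover stick mass on one extra atom
    atoms.append(base_draw())
    return weights, atoms

random.seed(1)
w, a = dp_stick_breaking(alpha=2.0, base_draw=skew_normal)
```

A draw from this discrete random measure is then used as the random-effects distribution in a mixed model.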

2.
This article describes posterior simulation methods for mixture models whose mixing distribution has a Normalized Random Measure prior. The methods use slice sampling ideas and introduce no truncation error. The approach can be easily applied to both homogeneous and nonhomogeneous Normalized Random Measures and allows the updating of the parameters of the random measure. The methods are illustrated on data examples using both Dirichlet and Normalized Generalized Gamma process priors. In particular, the methods are shown to be computationally competitive with previously developed samplers for Dirichlet process mixture models. Matlab code to implement these methods is available as supplemental material.

3.
The core of nonparametric/semiparametric Bayesian analysis is to relax particular parametric assumptions by treating the distributions of interest as unknown and random, and to assign them a prior. Selecting a suitable prior is therefore especially critical in nonparametric Bayesian fitting. As a distribution over distributions, the Dirichlet process (DP) is the most widely used nonparametric prior, owing to its nice theoretical properties, modeling flexibility and computational feasibility. In this paper, we review and summarize developments of the DP during the past decades. Our focus is mainly on its theoretical properties, various extensions, statistical modeling and applications to latent variable models.

4.
Summary A random measure is said to be selected by a weighted gamma prior probability if the values it assigns to disjoint sets are independent gamma random variables with positive multipliers. If the intensity measure of a nonhomogeneous Poisson point process is selected by a weighted gamma prior probability and if a sample is drawn from the Poisson point process having this intensity measure, then the posterior random intensity measure given the observations is also selected by a weighted gamma prior probability. If the measure space is Euclidean and if the true intensity measure is continuous and finite, the centered posterior process, rescaled by the square root of the sample size, will converge weakly in the Skorohod topology to a Wiener process subject to a change of time scale. This research was supported in part by the National Science Foundation Grants MCS 77-10376 and MCS 75-14194.
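A scalar version of the conjugacy described above can be sketched in a few lines: for a fixed set A, a gamma prior on the Poisson intensity updates by adding the observed count to the shape and the exposure to the rate. The function name is illustrative.

```python
def posterior_gamma(shape0, rate0, n_points, exposure=1.0):
    """Gamma prior on a Poisson intensity: observing n_points events over
    the given exposure yields a Gamma(shape0 + n_points, rate0 + exposure)
    posterior -- the scalar analogue of the weighted-gamma conjugacy."""
    return shape0 + n_points, rate0 + exposure

# Prior Gamma(2, 1) (mean 2), then 5 observed points over unit exposure:
s, r = posterior_gamma(2.0, 1.0, n_points=5)
# Posterior mean (s / r = 3.5) sits between prior mean 2 and observed count 5.
```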

5.
There exists a wide variety of models for returns, and the chosen model determines the tool required to calculate the value at risk (VaR). This paper introduces an alternative to model-based simulation by using a Monte Carlo simulation of the Dirichlet process. The model is constructed in a Bayesian framework, using properties initially described by Ferguson. A notable advantage of this model is that, on average, the random draws are sampled from a mixed distribution consisting of a prior guess by an expert and the empirical process based on a random sample of historical asset returns. The method is relatively automatic and, similar to machine learning tools, the estimate is updated as new data arrive.
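A minimal sketch of this kind of Dirichlet-process Monte Carlo for VaR, assuming Ferguson's posterior: a predictive draw comes from the expert's prior guess with probability α/(α+n) and otherwise resamples a historical return. All names and parameter values are illustrative.

```python
import random

def dp_predictive_draw(returns, alpha, prior_draw, rng=random):
    """One draw from the DP posterior predictive: prior guess with probability
    alpha/(alpha + n), otherwise a uniformly resampled historical return."""
    n = len(returns)
    if rng.random() < alpha / (alpha + n):
        return prior_draw()
    return rng.choice(returns)

def var_estimate(returns, alpha, prior_draw, level=0.05, n_sim=20000, rng=random):
    """VaR at the given level: negative empirical quantile of simulated returns."""
    sims = sorted(dp_predictive_draw(returns, alpha, prior_draw, rng)
                  for _ in range(n_sim))
    return -sims[int(level * n_sim)]

random.seed(7)
hist = [-0.03, -0.01, 0.0, 0.01, 0.02] * 40       # toy historical returns
v = var_estimate(hist, alpha=5.0, prior_draw=lambda: random.gauss(0.0, 0.02))
```

As new returns arrive they are simply appended to `hist`, which is the sense in which the estimate updates automatically.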

6.
Summary The Bayes method is seldom applied to nonparametric statistical problems, for the reason that it is hard to find mathematically tractable prior distributions on a set of probability measures. However, it is found that the Dirichlet process generates randomly a family of probability distributions which can be taken as a family of prior distributions for an application of the Bayes method to such problems. This paper presents a Bayesian analysis of a nonparametric problem of selecting a distribution with the largest p-th quantile value, from k ≥ 2 given distributions. It is assumed a priori that the given distributions have been generated from a Dirichlet process. This work was supported by the U.S. Office of Naval Research under Contract No. 00014-75-C-0451.
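The selection problem above can be sketched with the posterior mean of the DP, F̂ = (αG0 + nFn)/(α+n): estimate each population's p-th quantile from F̂ and pick the largest. This grid-search sketch, with a uniform base measure as an assumption, is ours and not the paper's procedure.

```python
import bisect

def posterior_quantile(sample, p, alpha, g0_cdf, grid):
    """p-th quantile of the DP posterior-mean CDF (alpha*G0 + n*Fn)/(alpha + n),
    found by scanning an ascending grid of candidate points."""
    xs = sorted(sample)
    n = len(xs)
    for x in grid:
        fn = bisect.bisect_right(xs, x) / n       # empirical CDF at x
        f_hat = (alpha * g0_cdf(x) + n * fn) / (alpha + n)
        if f_hat >= p:
            return x
    return grid[-1]

def select_best(samples, p, alpha, g0_cdf, grid):
    """Index of the sample whose posterior p-th quantile is largest."""
    qs = [posterior_quantile(s, p, alpha, g0_cdf, grid) for s in samples]
    return max(range(len(qs)), key=qs.__getitem__)

uniform_cdf = lambda x: min(max(x, 0.0), 1.0)     # G0 = Uniform(0, 1), an assumption
grid = [i / 1000 for i in range(1001)]
a = [0.2, 0.3, 0.4, 0.5]
b = [0.6, 0.7, 0.8, 0.9]
best = select_best([a, b], p=0.5, alpha=1.0, g0_cdf=uniform_cdf, grid=grid)
```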

7.
Bayesian Nonparametric Analysis for a Generalized Dirichlet Process Prior
This paper considers a generalization of the Dirichlet process which is obtained by suitably normalizing superposed independent gamma processes having increasing integer-valued scale parameter. A comprehensive treatment of this random probability measure is provided. We prove results concerning its finite-dimensional distributions, moments, predictive distributions and the distribution of its mean. Most expressions are given in terms of multiple hypergeometric functions, thus highlighting the interplay between Bayesian Nonparametrics and special functions. Finally, a suitable simulation algorithm is applied in order to compute quantities of statistical interest.

8.
Clustering is one of the most widely used procedures in the analysis of microarray data, for example with the goal of discovering cancer subtypes based on observed heterogeneity of genetic marks between different tissues. It is well known that in such high-dimensional settings, the existence of many noise variables can overwhelm the few signals embedded in the high-dimensional space. We propose a novel Bayesian approach based on the Dirichlet process with a sparsity prior that simultaneously performs variable selection and clustering, and also discovers variables that only distinguish a subset of the cluster components. Unlike previous Bayesian formulations, we use the Dirichlet process (DP) both for clustering the samples and for regularizing the high-dimensional mean/variance structure. To address the computational challenge brought by this double use of the DP, we propose a sequential sampling scheme embedded within Markov chain Monte Carlo (MCMC) updates to improve on the naive implementation of existing algorithms for DP mixture models. Our method is demonstrated on a simulation study and illustrated with the leukemia gene expression dataset.

9.
Two characterisations of a random mean from a Dirichlet process, as a limit of finite sums of a simple symmetric form and as a solution of a certain stochastic equation, are developed and investigated. These are used to derive results on, and new insights into, the distributions of such random means. In particular, identities involving functional transforms and recursive moment formulae are established. Furthermore, characterisations for several choices of the Dirichlet process parameter (leading to symmetric, unimodal, stable, and finite mixture distributions) are provided. Our methods lead to exact simulation recipes for prior and posterior random means, an approximation algorithm for the exact densities of these means, and limiting normality theorems for posterior distributions. The theory also extends to mixtures of Dirichlet processes and to the case of several random means considered simultaneously.
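A crude, truncated version of such a simulation recipe: approximate the Dirichlet random mean Σ_k w_k θ_k by stick-breaking, stopping after K sticks. Its expectation equals the mean of the base distribution, which the toy check below exploits; the truncation level is our choice, not an exact recipe from the paper.

```python
import random

def dp_random_mean(alpha, base_draw, K=500, rng=random):
    """Truncated stick-breaking approximation to the Dirichlet random mean
    sum_k w_k * theta_k, with theta_k ~ G0."""
    mean, remaining = 0.0, 1.0
    for _ in range(K):
        v = rng.betavariate(1.0, alpha)
        mean += remaining * v * base_draw()
        remaining *= 1.0 - v
    mean += remaining * base_draw()   # leftover stick mass
    return mean

random.seed(3)
# G0 = Uniform(0, 1): each random mean lies in [0, 1] and averages to 1/2.
draws = [dp_random_mean(4.0, lambda: random.uniform(0, 1)) for _ in range(2000)]
avg = sum(draws) / len(draws)
```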

10.
The two-parameter Poisson–Dirichlet distribution PD(α, θ) is the distribution of an infinite-dimensional random discrete probability. It is a generalization of Kingman's Poisson–Dirichlet distribution. The two-parameter Dirichlet process $\Pi_{\alpha,\theta,\nu_0}$ is the law of a purely atomic random measure with masses following the two-parameter Poisson–Dirichlet distribution. In this article we focus on the construction and the properties of the infinite-dimensional symmetric diffusion processes with respective symmetric measures PD(α, θ) and $\Pi_{\alpha,\theta,\nu_0}$. The methods used come from the theory of Dirichlet forms.

11.
Here, a Mandelbrot measure is a statistically self-similar measure μ on the boundary of a c-ary tree, obtained by multiplying random weights indexed by the nodes of the tree. We take a particular interest in the random variable Y = ‖μ‖: we study the existence of finite moments of negative orders for Y, conditionally on Y > 0, and the continuity properties of Y with respect to the weights. Our results on moments make it possible to study, with probability one, the existence of a local Hölder exponent for μ, almost everywhere with respect to another Mandelbrot measure, as well as to perform the multifractal analysis of μ, under hypotheses that are weaker than those usually assumed.
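A toy simulation of the total mass Y = ‖μ‖ of a Mandelbrot measure on a binary tree, truncated at a finite depth, with mean-one uniform weights. The weight law, depth and names are our assumptions, not the paper's setting.

```python
import random

def mandelbrot_mass(depth, c=2, weight=lambda rng: rng.random() * 2, rng=random):
    """Total mass Y of a depth-truncated Mandelbrot measure on a c-ary tree:
    Y = (1/c) * sum over children of (weight * child mass), with i.i.d.
    mean-one weights on the edges and unit mass at the leaves."""
    if depth == 0:
        return 1.0
    return sum(weight(rng) * mandelbrot_mass(depth - 1, c, weight, rng)
               for _ in range(c)) / c

random.seed(5)
ys = [mandelbrot_mass(8) for _ in range(300)]
# With mean-one weights, E[Y] = 1 at every truncation depth.
```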

12.
Local limit theorems for the minimum of a random walk with Markovian increments are given, using Presman's factorization theory. This result yields the asymptotic behaviour of the survival probability for a critical branching process in a Markov-dependent random environment.

13.
We develop a multi-element probabilistic collocation method (ME-PCM) for arbitrary discrete probability measures with finite moments and apply it to solve partial differential equations with random parameters. The method is based on numerical construction of orthogonal polynomial bases in terms of a discrete probability measure. To this end, we compare the accuracy and efficiency of five different constructions. We develop an adaptive procedure for decomposition of the parametric space using the local variance criterion. We then couple the ME-PCM with sparse grids to study the Korteweg–de Vries (KdV) equation subject to random excitation, where the random parameters are associated with either a discrete or a continuous probability measure. Numerical experiments demonstrate that the proposed algorithms lead to high accuracy and efficiency for hybrid (discrete–continuous) random inputs.
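One standard way to numerically construct orthogonal polynomials with respect to a discrete measure (possibly among the constructions compared in the paper, though we do not assume so) is Gram-Schmidt on the monomial basis under the discrete inner product:

```python
def discrete_inner(p, q, points, weights):
    """<p, q> under the discrete measure sum_i w_i * delta_{x_i};
    p and q are polynomial coefficient lists (lowest degree first)."""
    def ev(c, x):
        return sum(ci * x**i for i, ci in enumerate(c))
    return sum(w * ev(p, x) * ev(q, x) for x, w in zip(points, weights))

def orthogonal_polys(points, weights, degree):
    """Gram-Schmidt on the monomials 1, x, ..., x^degree with respect to the
    discrete measure; returns coefficient lists of the orthogonal polynomials."""
    polys = []
    for d in range(degree + 1):
        c = [0.0] * (d + 1)
        c[d] = 1.0                        # start from the monomial x^d
        for q in polys:
            proj = (discrete_inner(c, q, points, weights)
                    / discrete_inner(q, q, points, weights))
            for i, qi in enumerate(q):
                c[i] -= proj * qi
        polys.append(c)
    return polys

# Uniform measure on {-1, 0, 1}: the degree-1 orthogonal polynomial is x itself.
pts, wts = [-1.0, 0.0, 1.0], [1 / 3, 1 / 3, 1 / 3]
p0, p1, p2 = orthogonal_polys(pts, wts, 2)
```

Collocation points and quadrature weights for the PCM are then obtained from these polynomials.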

14.
The aim of this paper is to derive new near-ignorance models on the probability simplex, which do not directly involve the Dirichlet distribution and thus are alternatives to the Imprecise Dirichlet Model (IDM). We focus our investigation on a particular class of distributions on the simplex known as the class of Normalized Infinitely Divisible (NID) distributions; it includes the Dirichlet distribution as a particular case. For this class it is possible to derive general formulae for prior and posterior predictive inferences by exploiting the Lévy–Khintchine representation theorem. This allows us to characterize the near-ignorance properties of the NID class in general. After deriving these general properties, we focus our attention on three members of this class. We show that one of these near-ignorance models satisfies the representation invariance principle and, for a given value of the prior strength, always provides inferences that encompass those of the IDM. The other two models do not satisfy this principle, but their imprecision depends linearly or almost linearly on the number of observed categories; we argue that this is sometimes a desirable property for a predictive model.

15.
Subjective estimates of probability or utility are prone to two kinds of error: random and systematic (bias). The usual approach to reducing random error is averaging. How to reduce systematic error is not clear. This paper deals with the question of bias propagation when the estimates that are averaged are bias-prone. Both simple and weighted averages are examined. The idea of using averaging as a bias-reduction technique is explored. When the weights used in the averaging process are themselves biased, the random error in the average is also affected, possibly adversely. A balance has to be struck between bias-reduction requirements and random-error-reduction requirements.
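The core point, that averaging shrinks random error but leaves a shared bias untouched, can be checked with a small simulation (all names and numbers are illustrative):

```python
import random
import statistics

def simulate_average(true_value, bias, noise_sd, n_estimates, n_trials, rng):
    """Average n biased, noisy estimates; return the mean and standard
    deviation of the averaged estimate over many trials."""
    avgs = []
    for _ in range(n_trials):
        est = [true_value + bias + rng.gauss(0, noise_sd)
               for _ in range(n_estimates)]
        avgs.append(sum(est) / n_estimates)
    return statistics.mean(avgs), statistics.stdev(avgs)

rng = random.Random(0)
m1, s1 = simulate_average(10.0, bias=0.5, noise_sd=1.0,
                          n_estimates=1, n_trials=4000, rng=rng)
m25, s25 = simulate_average(10.0, bias=0.5, noise_sd=1.0,
                            n_estimates=25, n_trials=4000, rng=rng)
# s25 is roughly s1 / 5 (random error shrinks like 1/sqrt(n)),
# while both m1 and m25 stay near 10.5: the +0.5 bias survives averaging.
```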

16.
We analyze the reliability of NASA composite pressure vessels by using a new Bayesian semiparametric model. The data set consists of lifetimes of pressure vessels, wrapped with a Kevlar fiber, grouped by spool and subject to different stress levels; 10% of the data are right censored. The model that we consider is a regression on the log-scale for the lifetimes, with fixed (stress) and random (spool) effects. The prior of the spool parameters is nonparametric: they are a sample from a normalized generalized gamma process, which encompasses the well-known Dirichlet process. The nonparametric prior is chosen to robustify inferences against misspecification of a parametric prior. Here, this choice of likelihood and prior yields a new Bayesian model in reliability analysis. Via a Bayesian hierarchical approach, it is easy to analyze the reliability of the Kevlar fiber by predicting quantiles of the failure time when a new spool is selected at random from the population of spools. Moreover, for comparative purposes, we review the most interesting frequentist and Bayesian models analyzing this data set. Our credibility intervals for the quantiles of interest for a new random spool are narrower than those derived in the previous Bayesian parametric literature, although the predictive goodness-of-fit performances are similar. Finally, as an original feature of our model, by means of the discreteness of the random-effects distribution, we are able to cluster the spools into three different groups. Copyright © 2012 John Wiley & Sons, Ltd.

17.
This paper distinguishes between objective probability, or chance, and subjective probability. Most statistical methods in machine learning are based on the hypothesis that there is a random experiment from which we get a set of observations. This random experiment can be identified with a chance or objective probability, but these probabilities depend on some unknown parameters. Our knowledge of these parameters is not objective, and in order to learn about them we must assess some epistemic probabilities about their values. In some cases, our objective knowledge about these parameters is vacuous, so the question is: what epistemic probabilities should be assumed? In this paper we argue for the assumption of non-vacuous (a proper subset of [0, 1]) interval probabilities. There are several reasons for this; some are based on the betting interpretation of epistemic probabilities, while others are based on the learning capabilities under the vacuous representation. The implications of the selection of epistemic probabilities for concepts such as conditioning and learning are studied. It is shown that in order to maintain some reasonable learning capabilities we have to assume more informative prior models than those frequently used in the literature, such as the imprecise Dirichlet model.
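For concreteness, the imprecise Dirichlet model mentioned above attaches to each category the posterior probability interval [n_j/(N+s), (n_j+s)/(N+s)], whose width s/(N+s) shrinks as data accumulate. A minimal sketch (function name is ours):

```python
def idm_interval(n_j, N, s=2.0):
    """Imprecise Dirichlet model posterior interval for the chance of
    category j after observing it n_j times in N trials, with prior
    strength s: [n_j/(N+s), (n_j+s)/(N+s)]."""
    return n_j / (N + s), (n_j + s) / (N + s)

lo_b, hi_b = idm_interval(n_j=3, N=10, s=2.0)
# Interval [3/12, 5/12]; its width s/(N+s) = 2/12 narrows as N grows.
```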

18.
This work develops Feynman–Kac formulas for a class of regime-switching jump diffusion processes, in which the jump part is driven by a Poisson random measure associated with a general Lévy process and the switching part depends on the jump diffusion processes. Under broad conditions, the connections of such stochastic processes and the corresponding partial integro-differential equations are established. Related initial, terminal and boundary value problems are also treated. Moreover, based on weak convergence of probability measures, it is demonstrated that a sequence of random variables related to the regime-switching jump diffusion process converges in distribution to the arcsine law.

19.
Transient random walk on a tree induces a Dirichlet form on its Martin boundary, which is the Cantor set. The procedure of the inducement is analogous to that of the Douglas integral on S^1 associated with the Brownian motion on the unit disk. In this paper, those Dirichlet forms on the Cantor set induced by random walks on trees are investigated. Explicit expressions of the hitting distribution (harmonic measure) ν and the induced Dirichlet form on the Cantor set are given in terms of the effective resistances. An intrinsic metric on the Cantor set associated with the random walk is constructed. Under the volume doubling property of ν with respect to the intrinsic metric, asymptotic behaviors of the heat kernel, the jump kernel and moments of displacements of the process associated with the induced Dirichlet form are obtained. Furthermore, the relation to noncommutative Riemannian geometry is discussed.

20.
This paper presents a consensus model for group decision making with interval multiplicative and fuzzy preference relations based on two consensus criteria: (1) a consensus measure, which indicates the agreement between experts' preference relations, and (2) a proximity measure, which indicates how far the individual opinions are from the group opinion. These measures are calculated by using the relative projections of individual preference relations on the collective one, obtained by extending the relative projection of vectors. First, the weights of the experts are determined by the relative projections of the individual preference relations on the initial collective one. Then, using these weights, all individual preference relations are aggregated into a collective one. The consensus and proximity measures are calculated from the relative projections of the experts' preference relations on the collective one. The consensus measure is used to guide the consensus process until the collective solution is achieved, while the proximity measure is used to guide the discussion phase of the consensus reaching process. In this way, an iterative algorithm is designed to guide the experts through the consensus reaching process. Finally, expected value preference relations are defined to transform the interval collective preference relation into a crisp one, and the weights of the alternatives are obtained from the expected value preference relations. Two numerical examples are given to illustrate the models and approaches.
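A rough sketch of one aggregation round in the spirit of the projection-based weighting described above. The formula (a·b)/(b·b) used for the relative projection is our hypothetical reading, and the flattened preference vectors are toy data, so this is not the paper's exact model:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def relative_projection(a, b):
    """Relative projection of vector a onto b: (a.b)/(b.b). A hypothetical
    reading of the projection measure; the paper's formula may differ."""
    return dot(a, b) / dot(b, b)

def consensus_round(individuals):
    """One aggregation round: weight experts by the relative projection of
    their opinion onto the initial (plain-average) collective opinion, then
    form the weighted collective opinion."""
    n, m = len(individuals), len(individuals[0])
    initial = [sum(v[j] for v in individuals) / n for j in range(m)]
    raw = [relative_projection(v, initial) for v in individuals]
    total = sum(raw)
    weights = [x / total for x in raw]            # normalize to sum to 1
    collective = [sum(wt * v[j] for wt, v in zip(weights, individuals))
                  for j in range(m)]
    return weights, collective

# Three experts' flattened preference vectors (toy data):
experts = [[0.5, 0.7, 0.3], [0.6, 0.6, 0.4], [0.4, 0.8, 0.2]]
w, coll = consensus_round(experts)
```

In the iterative algorithm, the consensus measure computed from these projections decides whether another discussion round is needed.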
