Similar documents (20 results)
1.

Maximum pseudo-likelihood estimation has hitherto been viewed as a practical but flawed alternative to maximum likelihood estimation, necessary because the maximum likelihood estimator is too hard to compute, but flawed because of its inefficiency when the spatial interactions are strong. We demonstrate that a single Newton-Raphson step starting from the maximum pseudo-likelihood estimator produces an estimator which is close to the maximum likelihood estimator in terms of its actual value, attained likelihood, and efficiency, even in the presence of strong interactions. This hybrid technique greatly increases the practical applicability of pseudo-likelihood-based estimation. Additionally, in the case of spatial point processes, we propose a proper maximum pseudo-likelihood estimator which is different from the conventional one. The proper maximum pseudo-likelihood estimator clearly performs better than the conventional one when the spatial interactions are strong.
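As a rough illustration of the one-step idea (not the authors' implementation, and ignoring the fact that the score and Hessian of the true likelihood of a Gibbs model typically require Monte Carlo approximation), a single Newton-Raphson update from a pseudo-likelihood starting value can be sketched as follows; the quadratic toy log-likelihood is purely hypothetical.

```python
import numpy as np

def one_step_newton(theta_pl, score, hessian):
    """Single Newton-Raphson step of the log-likelihood, started from the
    maximum pseudo-likelihood estimate theta_pl."""
    g = score(theta_pl)
    H = hessian(theta_pl)
    return theta_pl - np.linalg.solve(H, g)

# Hypothetical toy example: for a quadratic log-likelihood
# l(theta) = -0.5 * (theta - m)' A (theta - m), one step from any starting
# value lands exactly on the maximiser m.
A = np.array([[2.0, 0.3], [0.3, 1.0]])
m = np.array([1.0, -0.5])
score = lambda t: -A @ (t - m)
hessian = lambda t: -A
theta_pl = np.array([0.4, 0.1])          # pretend this came from pseudo-likelihood
print(one_step_newton(theta_pl, score, hessian))   # -> [ 1.  -0.5]
```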

2.
This article introduces a novel and flexible framework for investigating the roles of actors within a network. Particular interest is in roles as defined by local network connectivity patterns, identified using the ego-networks extracted from the network. A mixture of exponential-family random graph models (ERGM) is developed for these ego-networks to cluster the nodes into roles. We refer to this model as the ego-ERGM. An expectation-maximization algorithm is developed to infer the unobserved cluster assignments and to estimate the mixture model parameters using a maximum pseudo-likelihood approximation. We demonstrate the flexibility and utility of the method using examples of simulated and real networks.
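A minimal sketch of the ego-network extraction step, using networkx and Zachary's karate club graph as a stand-in dataset; the ego-ERGM itself would then fit a mixture of ERGMs to these ego-networks by EM with a pseudo-likelihood approximation, which is not shown here.

```python
import networkx as nx

G = nx.karate_club_graph()   # hypothetical stand-in network

def ego_statistics(G, node, radius=1):
    """Simple local-connectivity summaries of the ego-network around `node`."""
    ego = nx.ego_graph(G, node, radius=radius)
    return {
        "nodes": ego.number_of_nodes(),
        "edges": ego.number_of_edges(),
        "triangles": sum(nx.triangles(ego).values()) // 3,
    }

stats = [ego_statistics(G, v) for v in G.nodes()]
print(stats[:3])
```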

3.
In this paper, we carry out an in-depth theoretical investigation of the existence of maximum likelihood estimates for the Cox model [D.R. Cox, Regression models and life tables (with discussion), Journal of the Royal Statistical Society, Series B 34 (1972) 187–220; D.R. Cox, Partial likelihood, Biometrika 62 (1975) 269–276], both in the full data setting and in the presence of missing covariate data. The main motivation for this work arises from missing data problems, where models can easily become difficult to estimate with certain missing data configurations or large missing data fractions. We establish necessary and sufficient conditions for existence of the maximum partial likelihood estimate (MPLE) for completely observed data (i.e., no missing data), as well as sufficient conditions for existence of the maximum likelihood estimate (MLE) for survival data with missing covariates via a profile likelihood method. Several theorems are given to establish these conditions. A real dataset from a cancer clinical trial is presented to further illustrate the proposed methodology.

4.
Total Variation-based regularization, well established for image processing applications such as denoising, was recently introduced for Maximum Penalized Likelihood Estimation (MPLE) as an effective way to estimate nonsmooth probability densities. While the estimates show promise for a variety of applications, the nonlinearity of the regularization leads to computational challenges, especially in multiple dimensions. In this article we present a numerical methodology, based upon the Split Bregman L1 minimization technique, that overcomes these challenges, allowing for the fast and accurate computation of 2D TV-based MPLE. We test the methodology with several examples, including V-fold cross-validation with large 2D datasets, and highlight the application of TV-based MPLE to point process crime modeling. The proposed algorithm is implemented as the Matlab function TVMPLE. The Matlab (mex) code and datasets for examples and simulations are available as online supplements.
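A sketch of the kind of objective being minimized, for a density discretized on a 2D grid; the grid, the anisotropic TV discretization, and the handling of normalization are illustrative choices, and the article's Split Bregman solver (implemented in TVMPLE) is what actually performs the minimization.

```python
import numpy as np

def tv_penalized_nll(density, counts, lam):
    """Negative penalized log-likelihood of a density discretized on a 2D grid.

    density : 2D array of nonnegative cell values (assumed normalized)
    counts  : 2D array of observation counts per grid cell
    lam     : total-variation regularization weight
    """
    nll = -np.sum(counts * np.log(np.maximum(density, 1e-300)))
    # anisotropic discrete total variation (scaling conventions vary)
    tv = np.abs(np.diff(density, axis=0)).sum() + np.abs(np.diff(density, axis=1)).sum()
    return nll + lam * tv
```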

5.
In this article, we focus on statistical models for binary data on a regular two-dimensional lattice. We study two classes of models, the Markov mesh models (MMMs) based on causal-like, asymmetric spatial dependence, and symmetric Markov random fields (SMFs) based on noncausal-like, symmetric spatial dependence. Building on results of Enting (1977), we give sufficient conditions for the asymmetrically defined binary MMMs (of third order) to be equivalent to a symmetrically defined binary SMF. Although not every binary SMF can be written as a binary MMM, our results show that many can. For such SMFs, their joint distribution can be written in closed form and their realizations can be simulated with just one pass through the lattice. An important consequence of the latter observation is that there are nontrivial spatial processes for which exact probabilities can be used to benchmark the performance of Markov chain Monte Carlo and other algorithms.
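A one-pass (raster-scan) simulation of a simple binary third-order Markov mesh model, sketched with a hypothetical logistic parameterization of the conditional probabilities; the boundary convention and the link function are illustrative choices, not those of the article.

```python
import numpy as np

def simulate_mmm(nrows, ncols, a0, a_w, a_n, a_nw, seed=None):
    """One-pass simulation of a binary third-order Markov mesh model:
    P(x[i,j]=1 | past) = logistic(a0 + a_w*x[i,j-1] + a_n*x[i-1,j] + a_nw*x[i-1,j-1]),
    with off-lattice neighbours treated as 0 (an illustrative boundary convention)."""
    rng = np.random.default_rng(seed)
    x = np.zeros((nrows, ncols), dtype=int)
    for i in range(nrows):
        for j in range(ncols):
            w  = x[i, j-1] if j > 0 else 0
            n  = x[i-1, j] if i > 0 else 0
            nw = x[i-1, j-1] if i > 0 and j > 0 else 0
            p = 1.0 / (1.0 + np.exp(-(a0 + a_w*w + a_n*n + a_nw*nw)))
            x[i, j] = rng.random() < p
    return x

field = simulate_mmm(64, 64, a0=-0.5, a_w=0.8, a_n=0.8, a_nw=-0.3, seed=1)
```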

6.
Estimation and asymptotic properties of semiparametric generalized linear mixed-effects models
Semiparametric generalized linear mixed-effects models are widely used in psychology, plant and animal breeding, medicine, and other fields. Zhang (1998) estimated the parametric and nonparametric parts of the model by maximum penalized likelihood estimation (MPLE), but that MPLE method applies only to models with normally distributed data. For commonly used models such as the Poisson, the usual approach is to treat the random effects as missing data and introduce an EM algorithm. In this paper, the MCNR algorithm proposed by McCulloch (1997) is extended to semiparametric generalized linear mixed-effects models, and a corresponding estimation algorithm is obtained. For the nonparametric part, P-splines are used for fitting and the smoothing parameter is selected by GCV; the consistency and asymptotic normality of the resulting estimators are also proved. Finally, simulations and a real-data example are used to compare the proposed method with other algorithms and to verify its effectiveness.
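A hedged sketch of the nonparametric ingredient only, for a Gaussian response: a penalized B-spline (P-spline) smoother with the smoothing parameter chosen by GCV. The knot placement, penalty order, and lambda grid are illustrative; the paper's semiparametric GLMM additionally carries fixed and random effects and uses the MCNR algorithm, none of which is shown here.

```python
import numpy as np
from scipy.interpolate import BSpline

def pspline_gcv_fit(x, y, n_knots=20, degree=3, lambdas=np.logspace(-4, 4, 41)):
    """Penalized B-spline (P-spline) smoother with GCV choice of the smoothing parameter."""
    order = np.argsort(x)
    x, y = np.asarray(x)[order], np.asarray(y)[order]
    inner = np.linspace(x[0], x[-1], n_knots)
    t = np.r_[[x[0]] * degree, inner, [x[-1]] * degree]          # clamped knot vector
    B = BSpline.design_matrix(x, t, degree).toarray()
    D = np.diff(np.eye(B.shape[1]), n=2, axis=0)                 # 2nd-order difference penalty
    P = D.T @ D
    best = None
    for lam in lambdas:
        H = B @ np.linalg.solve(B.T @ B + lam * P, B.T)          # hat matrix
        yhat = H @ y
        edf = np.trace(H)                                        # effective degrees of freedom
        gcv = np.sum((y - yhat) ** 2) / (len(y) * (1 - edf / len(y)) ** 2)
        if best is None or gcv < best[0]:
            best = (gcv, lam, yhat)
    return best[1], best[2]

# toy usage on simulated data
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=200)
lam_hat, fitted = pspline_gcv_fit(x, y)
```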

7.
We show that copulae and kernel estimation can be combined to estimate the risk of an economic loss. We analyze the properties of the Sarmanov copula. We find that maximum pseudo-likelihood estimation of the dependence parameter of the copula, combined with double transformed kernel estimation of the marginal cumulative distribution functions, is a useful method for approximating the risk of extreme dependent losses when large data sets are available. We use a bivariate sample of losses from a real database of auto insurance claims.
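A minimal sketch of rank-based maximum pseudo-likelihood estimation of a copula dependence parameter. The FGM copula used here is only the simplest member of the Sarmanov family, and the double transformed kernel estimation of the marginals described in the article is replaced by plain empirical ranks, so this is an illustration of the general idea rather than the paper's method.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

def mpl_dependence(x, y):
    """Maximum pseudo-likelihood estimate of the dependence parameter of the
    FGM copula, c(u,v) = 1 + theta*(1-2u)*(1-2v), from rank-based pseudo-observations."""
    n = len(x)
    u, v = rankdata(x) / (n + 1), rankdata(y) / (n + 1)
    neg_log_pl = lambda th: -np.sum(np.log(1.0 + th * (1 - 2 * u) * (1 - 2 * v)))
    res = minimize_scalar(neg_log_pl, bounds=(-1 + 1e-6, 1 - 1e-6), method="bounded")
    return res.x

# toy usage on simulated positively dependent data
rng = np.random.default_rng(0)
xy = rng.multivariate_normal([0, 0], [[1, 0.4], [0.4, 1]], size=1000)
print(mpl_dependence(xy[:, 0], xy[:, 1]))
```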

8.
Parameters of Gaussian multivariate models are often estimated using the maximum likelihood approach. In spite of its merits, this methodology is not practical when the sample size is very large, as, for example, in the case of massive georeferenced data sets. In this paper, we study the asymptotic properties of the estimators that minimize three alternatives to the likelihood function, designed to increase the computational efficiency. This is achieved by applying the information sandwich technique to expansions of the pseudo-likelihood functions as quadratic forms of independent normal random variables. Theoretical calculations are given for a first-order autoregressive time series and then extended to a two-dimensional autoregressive process on a lattice. We compare the efficiency of the three estimators to that of the maximum likelihood estimator as well as among themselves, using numerical calculations of the theoretical results and simulations.

9.
Maximum quasi-likelihood estimation of a diffusion model for the Chinese interbank offered rate
This paper uses maximum quasi-likelihood estimation to estimate the parameters of a diffusion model for the 7-day offered rate in the Chinese interbank market. A bootstrap generalized quasi-likelihood ratio test is then applied to a range of competing models. The results show that the Chinese money-market rate exhibits mean reversion, and the interest-rate sensitivity coefficient γ is estimated as 1.421265, indicating high sensitivity to the level of the interest rate.
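The kind of quasi-likelihood involved can be sketched with an Euler discretization of the CKLS-type diffusion dr = κ(μ − r)dt + σ r^γ dW; the simulated series, starting values, and optimizer below are all hypothetical, and this is not the paper's exact estimator or its bootstrap test.

```python
import numpy as np
from scipy.optimize import minimize

def ckls_quasi_nll(params, r, dt):
    """Euler-discretized quasi negative log-likelihood for the CKLS-type diffusion
    dr = kappa*(mu - r) dt + sigma * r**gamma dW."""
    kappa, mu, sigma, gamma = params
    if sigma <= 0:
        return np.inf
    mean = r[:-1] + kappa * (mu - r[:-1]) * dt
    var = sigma ** 2 * r[:-1] ** (2 * gamma) * dt
    resid = r[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid ** 2 / var)

# toy usage on a simulated path (the real application would use the observed
# 7-day interbank rate series)
rng = np.random.default_rng(0)
dt, n = 7 / 365, 500
r = np.empty(n); r[0] = 0.03
for t in range(1, n):
    r[t] = abs(r[t-1] + 0.5 * (0.03 - r[t-1]) * dt
               + 0.1 * r[t-1] ** 1.4 * np.sqrt(dt) * rng.standard_normal())
res = minimize(ckls_quasi_nll, x0=[0.5, 0.03, 0.1, 1.0], args=(r, dt), method="Nelder-Mead")
print(res.x)
```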

10.
The problem of marginal density estimation for a multivariate density function f(x) can be generally stated as a problem of density function estimation for a random vector λ(x) of dimension lower than that of x. In this article, we propose a technique, the so-called continuous Contour Monte Carlo (CCMC) algorithm, for solving this problem. CCMC can be viewed as a continuous version of the contour Monte Carlo (CMC) algorithm recently proposed in the literature. CCMC abandons the use of sample space partitioning and incorporates the techniques of kernel density estimation into its simulations. CCMC is more general than other marginal density estimation algorithms. First, it works for any density function, even for those having a rugged or unbalanced energy landscape. Second, it works for any transformation λ(x) regardless of the availability of the analytical form of the inverse transformation. In this article, CCMC is applied to estimate the unknown normalizing constant function for a spatial autologistic model, and the estimate is then used in a Bayesian analysis for the spatial autologistic model in place of the true normalizing constant function. Numerical results on the U.S. cancer mortality data indicate that the Bayesian method can produce much more accurate estimates than the MPLE and MCMLE methods for the parameters of the spatial autologistic model.

11.
The pseudo likelihood method of Besag (1974) has remained a popular method for estimating Markov random fields on very large lattices, despite various documented deficiencies. This is partly because it remains the only computationally tractable method for large lattices. We introduce a novel method to estimate Markov random fields defined on a regular lattice. The method takes advantage of conditional independence structures and recursively decomposes a large lattice into smaller sublattices. An approximation is made at each decomposition. Doing so completely avoids the need to compute the troublesome normalizing constant. The computational complexity is O(N), where N is the number of pixels in the lattice, making it computationally attractive for very large lattices. We show through simulations that the proposed method performs well, even when compared with methods using exact likelihoods. Supplementary material for this article is available online.

12.
Parameter estimation for two-dimensional point pattern data is difficult, because most of the available stochastic models have intractable likelihoods which usually depend on an unknown scaling factor. However, this problem can be bypassed using the pseudo-likelihood estimation method. Baddeley and Turner (1998) presented a numerical algorithm for computing approximate maximum pseudo-likelihood estimates for Gibbs point processes with exponential family likelihoods. We use their method, together with a new technique based on Voronoi polygons for choosing the quadrature points, in an intensive comparative simulation study that evaluates the performance of these two approaches against the traditional approximation under varying circumstances. Two Gibbs point process models, the Strauss and saturation processes, are used.
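A rough sketch of the quadrature-based approximation in question: the log pseudo-likelihood of a Strauss process, log PL(θ) = Σ_i log λ_θ(x_i; x) − ∫_W λ_θ(u; x) du, evaluated with equal-weight dummy points. Baddeley and Turner (1998) and the Voronoi-polygon scheme compared in the paper construct the quadrature weights much more carefully; the window, dummy grid, interaction radius, and starting values below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial import cKDTree

def strauss_log_pl(params, data, dummy, window_area, R):
    """Equal-weight quadrature approximation of the log pseudo-likelihood of a
    Strauss process with conditional intensity lambda(u|x) = beta * gamma**t(u, x),
    where t(u, x) counts data points within distance R of u."""
    log_beta, log_gamma = params          # a valid Strauss model needs gamma <= 1 (not enforced here)
    quad = np.vstack([data, dummy])
    is_data = np.r_[np.ones(len(data)), np.zeros(len(dummy))]
    tree = cKDTree(data)
    t = np.array([len(tree.query_ball_point(u, R)) for u in quad]) - is_data
    w = window_area / len(quad)           # equal quadrature weights
    log_lam = log_beta + t * log_gamma
    return np.sum(is_data * log_lam) - w * np.sum(np.exp(log_lam))

# toy usage on the unit square with a coarse dummy grid
rng = np.random.default_rng(1)
data = rng.random((100, 2))
gx = np.linspace(0.05, 0.95, 10)
dummy = np.array([(a, b) for a in gx for b in gx])
res = minimize(lambda p: -strauss_log_pl(p, data, dummy, 1.0, R=0.05),
               x0=[np.log(100.0), 0.0], method="Nelder-Mead")
beta_hat, gamma_hat = np.exp(res.x)
print(beta_hat, gamma_hat)
```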

13.
This paper discusses maximum likelihood estimation, interval estimation, and Bayes estimation of the environmental factor of the two-parameter exponential distribution under type-II censored samples. Taking the quotient density of the posterior densities of the parameters as the posterior density of the environmental factor, and incorporating expert experience, Bayes estimates of the environmental factor are derived under squared-error loss and under LINEX loss. Finally, the mean squared errors (MSE) of the various estimates are compared by Monte Carlo simulation. The results show that the estimate of the environmental factor under LINEX loss performs better.
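A small sketch of the two Bayes estimates in question, computed from posterior draws of the environmental factor. The posterior itself (derived in the paper as a quotient density) is not reproduced here, and the gamma draws and loss parameter a are purely hypothetical.

```python
import numpy as np

def bayes_estimates(posterior_draws, a=1.0):
    """Bayes estimates from posterior draws of a scalar parameter:
    the posterior mean (squared-error loss) and the LINEX-loss estimator
    -(1/a) * log E[exp(-a * theta) | data]."""
    sq = np.mean(posterior_draws)
    linex = -np.log(np.mean(np.exp(-a * posterior_draws))) / a
    return sq, linex

# toy usage with hypothetical posterior draws of the environmental factor
draws = np.random.default_rng(0).gamma(5.0, 0.4, size=10_000)
print(bayes_estimates(draws, a=1.0))
```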

14.
Estimation of the environmental factor of the Pareto distribution and its application
This paper defines the environmental factor of the Pareto distribution, discusses its maximum likelihood estimation and modified maximum likelihood estimation under type-II censored samples, and attempts to apply the environmental factor to reliability assessment. Finally, the mean squared errors (MSE) of the maximum likelihood estimate, the modified maximum likelihood estimate, and the reliability index are compared by Monte Carlo simulation. The results show that the modified maximum likelihood estimate outperforms the maximum likelihood estimate, and that the reliability assessment accounting for the environmental factor performs well.

15.
We consider point processes defined through a pairwise interaction potential and admitting a two-dimensional sufficient statistic. It is shown that the pseudo maximum likelihood estimate can be stochastically normed so that the limiting distribution is a standard normal distribution. This result holds irrespective of the possible existence of phase transitions. The work here is an extension of the work of Guyon and Künsch (1992, Lecture Notes in Statist., 74, Springer, New York) and is based on viewing a point process interchangeably as a lattice field.

16.
The equilibrium statistics of the Euler equations in two dimensions are studied, and a new continuum model of coherent, or organized, states is proposed. This model is defined by a maximum entropy principle similar to that governing the Miller-Robert model except that the family of global vorticity invariants is relaxed to a family of inequalities on all convex enstrophy integrals. This relaxation is justified by constructing the continuum model from a sequence of lattice models defined by Gibbs measures whose invariants are derived from the exact vorticity dynamics, not a spectral truncation or spatial discretization of it. The key idea is that the enstrophy integrals can be partially lost to vorticity fluctuations on a range of scales smaller than the lattice microscale, while energy is retained in the larger scales. A consequence of this relaxation is that many of the convex enstrophy constraints can be inactive in equilibrium, leading to a simplification of the mean-field equation for the coherent state. Specific examples of these simplified theories are established for vortex patch dynamics. In particular, a universal relation between mean vorticity and stream function is obtained in the dilute limit of the vortex patch theory, which is different from the sinh relation predicted by the Montgomery-Joyce theory of point vortices.

17.
Three semiparametric methods for estimating dependence parameters in copula models are compared, namely maximum pseudo-likelihood estimation and the two method-of-moment approaches based on the inversion of Spearman’s rho and Kendall’s tau. For each of these three asymptotically normal estimators, an estimator of their asymptotic (co)variance is stated in three different situations, namely the bivariate one-parameter case, the multivariate one-parameter case and the multivariate multiparameter case. An extensive Monte Carlo study is carried out to compare the finite-sample performance of the three estimators under consideration in these three situations. In the one-parameter case, it involves up to six bivariate and four-variate copula families, and up to five levels of dependence. In the multiparameter case, attention is restricted to trivariate and four-variate normal and t copulas. The maximum pseudo-likelihood estimator appears as the best choice in terms of mean square error in all situations except for small and weakly dependent samples. It is followed by the method-of-moment estimator based on Kendall’s tau, which overall appears to be significantly better than its analogue based on Spearman’s rho. The simulation results are complemented by asymptotic relative efficiency calculations. The numerical computation of Spearman’s rho, Kendall’s tau and their derivatives in the case of copula families for which explicit expressions are not available is also investigated.
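For concreteness, the inversion-of-Kendall's-tau moment estimator for a one-parameter copula can be sketched as follows, using the Clayton family as an illustrative choice (not necessarily one of the families studied in the article), for which τ = θ/(θ + 2).

```python
import numpy as np
from scipy.stats import kendalltau

def clayton_theta_from_tau(x, y):
    """Method-of-moments estimate of the Clayton copula parameter obtained by
    inverting Kendall's tau via tau = theta / (theta + 2)."""
    tau, _ = kendalltau(x, y)
    return 2 * tau / (1 - tau)

# toy usage on positively dependent (not actually Clayton) data
rng = np.random.default_rng(0)
xy = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
print(clayton_theta_from_tau(xy[:, 0], xy[:, 1]))
```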

18.
The standard tool used in the analysis of geostatistical data is the variogram. When the variogram is applied to lattice data, most commonly the data associated with each region are assumed to have been observed and arbitrarily assigned at the center or centroid of the region. Distances between centroids are then used to develop the spatial covariance structure through the variogram function directly. This arbitrariness of assigning the data to the centroid causes concern because the spatial structure estimated by the variogram depends heavily on the distances between observations. This article investigates what happens to the estimation of the variogram when each lattice value is, in fact, placed randomly within its associated region. We examine the effect that this randomly placed location has on the empirical variogram, the fitted theoretical variogram, and testing for the existence of spatial correlation. Both a regular lattice and an irregular lattice are used for demonstration. In particular, county level summaries of standardized mortality rates for lung, pancreas, and stomach cancer are investigated to see how placing data points randomly throughout the county affects the estimation of the variogram.
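A minimal sketch of the classical empirical (Matheron) variogram whose sensitivity to data placement the article studies; the bin edges, the toy coordinates, and the handling of empty bins are all illustrative.

```python
import numpy as np

def empirical_variogram(coords, z, bins):
    """Classical (Matheron) empirical variogram: half the average squared
    difference over all pairs whose separation distance falls in each bin."""
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)
    d, sq = d[iu], sq[iu]
    which = np.digitize(d, bins)
    return np.array([sq[which == b].mean() if np.any(which == b) else np.nan
                     for b in range(1, len(bins))])

# toy usage: 100 random locations standing in for region centroids (or for
# points placed randomly within each region)
rng = np.random.default_rng(0)
coords = rng.random((100, 2))
z = rng.normal(size=100)
print(empirical_variogram(coords, z, bins=np.linspace(0.0, 0.7, 8)))
```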

19.
Multiresolution spatial models are able to capture complex dependence structures in spatial data and are excellent alternatives to the traditional random field models for mapping spatial processes. Because of the multiresolution structures, spatial process prediction can be obtained by direct and fast computation algorithms. However, the existing multiresolution models usually assume a simple constant mean structure, which may not be suitable in practice. In this article, we focus on a multiresolution tree-structured spatial model and extend the model to incorporate a linear regression mean. We explore the properties of the multiresolution tree-structured spatial linear model in depth and estimate the parameters in the linear regression mean and the spatial-dependence structure simultaneously. An expectation-maximization algorithm is adopted to obtain the maximum likelihood estimates of the model parameters and the corresponding information matrix. Given the estimated parameters, a one-pass change-of-resolution Kalman filter algorithm is implemented to obtain the best linear unbiased predictor of the true underlying spatial process. For illustration, the methodology is applied to optimally map crop yield in a Wisconsin field, after accounting for the field conditions by a linear regression.

20.
For mixed effects models with balanced data, a new ordering of the design matrices of the random effects is defined, and a simple formula for the spectral decomposition of the covariance matrix is obtained. Compared with two existing methods in the literature, the decomposition not only gives the actual number of distinct eigenvalues and their explicit expressions, but also shows clearly the relationship between the design matrices of the random effects and the decomposition. These results can be used to verify that the analysis-of-variance estimator is minimum-variance unbiased under all random effects models and some mixed effects models with balanced data, to find explicit solutions of the maximum likelihood equations for the general mixed effects model, and to exhibit the relationship between the spectral decomposition estimator and the analysis-of-variance estimator.
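A hedged numerical illustration of the kind of structure involved, for the balanced one-way random effects model only (a special case; the paper's ordering and decomposition cover general balanced mixed models): the covariance matrix has just two distinct eigenvalues.

```python
import numpy as np

# Balanced one-way random effects model y_ij = mu + a_i + e_ij with a groups of
# size n: the covariance matrix sigma_e^2 * I + sigma_a^2 * Z Z' has eigenvalues
# sigma_e^2 + n*sigma_a^2 (multiplicity a) and sigma_e^2 (multiplicity a*(n-1)).
a, n = 4, 5
sigma_a2, sigma_e2 = 2.0, 1.0
Z = np.kron(np.eye(a), np.ones((n, 1)))       # design matrix of the random effect
Sigma = sigma_e2 * np.eye(a * n) + sigma_a2 * Z @ Z.T
print(np.unique(np.round(np.linalg.eigvalsh(Sigma), 8)))   # -> [ 1. 11.]
```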

