Similar Articles
20 similar articles found.
1.
Distribution estimation is important for making statistical inference about parameters, or functions of them, that depend on the distribution. In this work we propose an estimator of the distribution of a variable that exploits non-smooth auxiliary information, for example, symmetry of the distribution. A smoothing technique is employed to handle the non-differentiable function, so the distribution can be estimated from smoothed auxiliary information. Asymptotic properties of the distribution estimator are derived and analyzed. The estimators based on our method are found to be significantly more efficient than the corresponding estimators that ignore the auxiliary information. Simulation studies illustrate the finite sample performance of the proposed estimators.
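The authors' estimator is not reproduced here. As a rough, hypothetical illustration of how symmetry-type auxiliary information and smoothing can enter a distribution estimate, the sketch below averages a kernel-smoothed empirical CDF with its reflection about an assumed symmetry centre `mu`; the bandwidth rule and the centre are assumptions, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def smoothed_ecdf(x, data, h):
    """Kernel-smoothed empirical CDF: the indicator I(X_i <= x) is replaced
    by a Gaussian kernel CDF, which is differentiable in x."""
    x = np.atleast_1d(x)
    return norm.cdf((x[:, None] - data[None, :]) / h).mean(axis=1)

def symmetrized_cdf(x, data, h, mu=0.0):
    """Combine the smoothed ECDF with its reflection about mu.  Under
    symmetry about mu, F(x) = 1 - F(2*mu - x), so averaging the two
    estimates uses the auxiliary information."""
    f1 = smoothed_ecdf(x, data, h)
    f2 = 1.0 - smoothed_ecdf(2.0 * mu - np.atleast_1d(x), data, h)
    return 0.5 * (f1 + f2)

rng = np.random.default_rng(0)
sample = rng.standard_t(df=5, size=200)               # symmetric about 0
grid = np.linspace(-4, 4, 9)
h = 1.06 * sample.std() * sample.size ** (-1 / 5)      # rule-of-thumb bandwidth (assumed)
print(np.round(symmetrized_cdf(grid, sample, h), 3))
```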

2.
Receiver operating characteristic (ROC) curves are often used to study the two-sample problem in medical studies. However, most data in medical studies are censored, and a natural estimator of the ROC curve is then based on the Kaplan-Meier estimator. In this paper we propose a kernel-smoothed estimator of the ROC curve for censored data. The large sample properties of the smoothed estimator are established. Moreover, deficiency is considered in order to compare the proposed smoothed estimator with the empirical estimator based on the Kaplan-Meier estimator. The smoothed estimator is shown to outperform the direct empirical estimator under the deficiency criterion. A simulation study is conducted and a real data set is analyzed.

3.
In this paper, we consider the standard two-sample framework with right censoring. We construct confidence intervals for the ratio or difference of two hazard functions using smoothed empirical likelihood (EL) methods. The empirical log-likelihood ratio is derived and its asymptotic distribution is shown to be a standard chi-squared distribution. Bootstrap confidence bands are also proposed. Simulation studies show that the proposed EL confidence intervals outperform normal approximation methods in terms of coverage probability, indicating that the empirical likelihood methods provide better inference.

4.
This paper deals with the problem of choosing the optimal criterion for selecting the best of a set of nested binary choice models. Special attention is given to procedures derived in a decision-theoretic framework, called model selection criteria (MSC). We propose a new criterion, which we call C2, and compare its theoretical behaviour with that of the AIC and SBIC criteria. The theoretical study shows that the SBIC is the best criterion in every situation we consider, while the AIC and C2 are adequate only in some cases. A Monte Carlo experiment corroborates the theoretical results and adds others concerning finite sample behaviour and robustness to changes in some aspects of the data generating process. The classical hypothesis testing procedures LR and LM are included and compared with the three MSC criteria. The authors thank the Spanish Department of Education for financial support under project BEC 2003-01757.
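The paper's C2 criterion is not reproduced here. As a hedged sketch of the standard ingredients it is compared against, the code below fits two nested binary logit models by maximum likelihood (via `scipy.optimize.minimize`, to avoid extra dependencies) and evaluates the AIC and SBIC for each; the simulated data and variable names are illustrative only.

```python
import numpy as np
from scipy.optimize import minimize

def logit_negloglik(beta, X, y):
    """Negative log-likelihood of a logistic (binary choice) model."""
    eta = X @ beta
    return np.sum(np.logaddexp(0.0, eta) - y * eta)   # stable log(1 + exp(eta))

def fit_logit(X, y):
    beta0 = np.zeros(X.shape[1])
    res = minimize(logit_negloglik, beta0, args=(X, y), method="BFGS")
    return res.x, -res.fun                            # estimates, maximized log-likelihood

def aic(loglik, k):
    return -2.0 * loglik + 2.0 * k

def sbic(loglik, k, n):
    return -2.0 * loglik + k * np.log(n)              # Schwarz/Bayesian criterion

rng = np.random.default_rng(1)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
p = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * x1)))           # true model uses x1 only
y = rng.binomial(1, p)

X_small = np.column_stack([np.ones(n), x1])           # nested (true) model
X_big = np.column_stack([np.ones(n), x1, x2])         # larger model
for name, X in [("small", X_small), ("big", X_big)]:
    _, ll = fit_logit(X, y)
    k = X.shape[1]
    print(name, round(aic(ll, k), 2), round(sbic(ll, k, n), 2))
```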

5.
This article develops a new algorithm, named TTRISK, to solve high-dimensional risk-averse optimization problems governed by differential equations (ODEs and/or partial differential equations, PDEs) under uncertainty. As an example, we focus on the so-called Conditional Value at Risk (CVaR), but the approach applies equally to other coherent risk measures. Both the full and reduced space formulations are considered. The algorithm is based on low-rank tensor approximations of random fields discretized using stochastic collocation. To avoid the nonsmoothness of the objective function underlying the CVaR, we propose an adaptive strategy for selecting the width parameter of the smoothed CVaR that balances the smoothing and tensor approximation errors. Moreover, an unbiased Monte Carlo CVaR estimate can be computed by using the smoothed CVaR as a control variate. To accelerate the computations, we introduce an efficient preconditioner for the Karush–Kuhn–Tucker (KKT) system in the full space formulation. The numerical experiments demonstrate that the proposed method enables accurate CVaR optimization constrained by large-scale discretized systems. In particular, the first example involves an elliptic PDE with random coefficients as constraints, and the second is motivated by a realistic application: devising a lockdown plan for the United Kingdom under COVID-19. The results indicate that the risk-averse framework is feasible with the tensor approximations under tens of random variables.
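This is not the TTRISK algorithm. As a minimal sketch of the smoothing step it builds on, the code below evaluates CVaR through the Rockafellar–Uryasev formulation and replaces the non-smooth positive part with a softplus of width `eps` (an assumed smoothing parameter), showing how the smoothed value approaches the empirical CVaR as `eps` shrinks.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cvar_empirical(losses, alpha):
    """Empirical CVaR_alpha: mean excess over the alpha-quantile of the losses."""
    var = np.quantile(losses, alpha)
    return var + np.mean(np.maximum(losses - var, 0.0)) / (1.0 - alpha)

def cvar_smoothed(losses, alpha, eps):
    """Rockafellar-Uryasev formulation min_t { t + E[(X - t)_+]/(1 - alpha) }
    with the kink max(z, 0) replaced by the softplus eps*log(1 + exp(z/eps)),
    so the objective is smooth in t."""
    def obj(t):
        z = (losses - t) / eps
        softplus = eps * np.logaddexp(0.0, z)          # numerically stable softplus
        return t + softplus.mean() / (1.0 - alpha)
    return minimize_scalar(obj).fun

rng = np.random.default_rng(2)
losses = rng.lognormal(mean=0.0, sigma=0.5, size=20000)
for eps in (0.5, 0.1, 0.02):                           # smaller eps -> closer to empirical CVaR
    print(eps, round(cvar_smoothed(losses, 0.9, eps), 4))
print("empirical", round(cvar_empirical(losses, 0.9), 4))
```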

6.
An alternative to the accelerated failure time model is to regress the median of the failure time on the covariates. In recent years, censored median regression models have proved useful for analyzing a variety of censored survival data, with the added benefit of robustness. Based on the missing information principle, a semiparametric inference procedure for the regression parameter has been developed for the case where the censoring variable depends on a continuous covariate. To improve the low coverage accuracy of that procedure, we apply an empirical likelihood (EL) ratio method to the model and derive the limiting distributions of the estimated and adjusted empirical likelihood ratios for the vector of regression parameters. Two kinds of EL confidence regions for the unknown regression parameter vector are obtained accordingly. We conduct an extensive simulation study to compare the performance of the proposed methods with that of the normal approximation based method. The simulation results suggest that the EL methods outperform the normal approximation based method in terms of coverage probability. Finally, we discuss some aspects of our methods.

7.
In this paper, an online algorithm is proposed for identifying an unknown time-varying input delay in discrete non-linear systems described by a decoupled multimodel. The method relies on minimizing a performance index based on the error between the outputs of the real system and of the partial internal models. In addition, a decoupled internal multimodel control is proposed to compensate discrete non-linear systems with time-varying delay. This control scheme incorporates partial internal model controllers, each associated with a specified operating zone of the non-linear system. Switching between these controllers is ensured by a supervisor that contains a set of local predictors. A simulation example illustrates the effectiveness of the proposed time-varying delay identification algorithm and of the internal multimodel control scheme.
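The decoupled multimodel machinery is not reproduced here. As a toy sketch of the underlying idea of delay identification by minimizing an output-error index, the code below grid-searches candidate delays for a simple first-order discrete model; the model structure, window length, and noise level are assumptions for illustration.

```python
import numpy as np

def simulate(u, delay, a=0.8, b=0.5):
    """First-order discrete model y[k+1] = a*y[k] + b*u[k - delay]."""
    y = np.zeros(len(u))
    for k in range(len(u) - 1):
        u_del = u[k - delay] if k >= delay else 0.0
        y[k + 1] = a * y[k] + b * u_del
    return y

def identify_delay(u, y_meas, max_delay, window):
    """Pick the candidate delay whose internal model best matches the measured
    output over the last `window` samples (least-squares output error)."""
    errors = []
    for d in range(max_delay + 1):
        y_model = simulate(u, d)
        errors.append(np.sum((y_meas[-window:] - y_model[-window:]) ** 2))
    return int(np.argmin(errors))

rng = np.random.default_rng(3)
u = rng.uniform(-1, 1, size=400)
true_delay = 7
y = simulate(u, true_delay) + 0.01 * rng.normal(size=400)   # noisy measurements
print("estimated delay:", identify_delay(u, y, max_delay=15, window=100))
```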

8.

In this paper, we investigate the quantile varying coefficient model for longitudinal data, in which the unknown nonparametric functions are approximated by polynomial splines and the estimators are obtained by minimizing a quadratic inference function. The theoretical properties of the resulting estimators are established, and they achieve the optimal convergence rate for the nonparametric functions. Since the objective function is non-smooth, we propose an estimation procedure that uses induced smoothing and prove that the smoothed estimator is asymptotically equivalent to the original one. Moreover, we propose a variable selection procedure based on regularization, which can simultaneously estimate and select the important nonparametric components and has the asymptotic oracle property. Extensive simulations and a real data analysis show the usefulness of the proposed method.
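The quadratic inference function and spline machinery are not shown here. As a hedged sketch of the induced-smoothing idea for the non-smooth check loss, the code below minimizes a smoothed quantile objective, whose derivative replaces the indicator with a normal CDF, for a plain linear median regression; the bandwidth `h` is an assumed choice.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def smoothed_check_loss(u, tau, h):
    """Smooth surrogate of the check function rho_tau(u) = u*(tau - I(u < 0)):
    its derivative is tau - Phi(-u/h), which tends to tau - I(u < 0) as h -> 0."""
    return u * (tau - norm.cdf(-u / h)) + h * norm.pdf(u / h)

def fit_smoothed_qr(X, y, tau, h):
    def obj(beta):
        return smoothed_check_loss(y - X @ beta, tau, h).sum()
    return minimize(obj, np.zeros(X.shape[1]), method="BFGS").x

rng = np.random.default_rng(4)
n = 1000
x = rng.uniform(0, 2, size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
X = np.column_stack([np.ones(n), x])
h = n ** (-0.5)                                   # a common bandwidth choice, assumed here
print(np.round(fit_smoothed_qr(X, y, tau=0.5, h=h), 3))   # roughly (1, 2)
```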

9.
A class of optimal adaptive multi-arm clinical trial designs is proposed based on an extended generalized Pólya urn (GPU) model. The design is applicable to both qualitative and quantitative responses and asymptotically achieves a pre-specified optimality criterion. The criterion is specified through a functional of the response distributions and is implemented through the relationship between the design matrix and its first eigenvector. The asymptotic properties of the design are studied using existing results on GPU models. Examples for commonly used clinical designs are given as illustration.
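The extended GPU design of the abstract is not implemented here. As a small illustration of the classical generalized Pólya urn mechanism it builds on, the sketch below simulates a two-arm randomized play-the-winner rule with made-up success probabilities.

```python
import numpy as np

def randomized_play_the_winner(p_success, n_patients, seed=0):
    """Classical generalized Polya urn (randomized play-the-winner) with two arms
    and binary responses: a success on an arm adds a ball of that arm's colour,
    a failure adds a ball of the other colour, so allocation adapts to responses."""
    rng = np.random.default_rng(seed)
    urn = np.array([1.0, 1.0])                  # one ball of each colour to start
    assignments = np.zeros(2, dtype=int)
    for _ in range(n_patients):
        arm = rng.choice(2, p=urn / urn.sum())  # draw a ball to assign the next patient
        assignments[arm] += 1
        success = rng.random() < p_success[arm]
        urn[arm if success else 1 - arm] += 1.0
    return assignments, urn

counts, urn = randomized_play_the_winner(p_success=[0.7, 0.4], n_patients=500)
print("patients per arm:", counts)              # the better arm receives more patients
```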

10.
In this paper we address the issue of designing optimal fuzzy interfaces, which are fundamental components of a fuzzy inference system. Because input and output interfaces play different roles, optimality conditions are analyzed separately for the two types of interface. We prove that input interfaces are optimal when based on a particular class of fuzzy sets called "bi-monotonic", provided that mild conditions hold. The class of bi-monotonic fuzzy sets covers a broad range of fuzzy set shapes, including convex fuzzy sets, so the theoretical results apply to several fuzzy models. These results do not carry over to output interfaces, for which a different optimality criterion is proposed. That criterion leads to the definition of an optimality degree measuring the quality of a fuzzy output interface. Illustrative examples highlight the features of the proposed optimality degree in assessing the quality of output interfaces.

11.
In this paper, we use smoothed empirical likelihood methods to construct confidence intervals for hazard and density functions under right censorship. Empirical log-likelihood ratios for the hazard and density functions are obtained and their asymptotic limits are derived. Approximate confidence intervals based on these methods are constructed. Simulation studies compare the empirical likelihood methods with normal approximation methods in terms of coverage accuracy; the empirical likelihood methods are found to provide better inference.

12.
A self-weighted quantile procedure is proposed for inference in a spatial unilateral autoregressive model with independent and identically distributed innovations belonging to the domain of attraction of a stable law with index of stability α ∈ (0, 2]. It is shown that when the model is stationary, the self-weighted quantile estimate of the parameter has a closed form and converges to a normal limiting distribution, which avoids the difficulty encountered by Roknossadati and Zarepour (2010) in deriving the limiting distribution of an M-estimate. In contrast, when the model is not stationary, the proposed estimates have the same limiting distributions as those of Roknossadati and Zarepour. Furthermore, a Wald test statistic is proposed for testing a linear restriction on the parameter, and it is shown that under a local alternative the Wald statistic has a non-central chi-squared distribution. Simulations and a real data example are reported to assess the performance of the proposed method.

13.
Recent advances in the transformation model have made it possible to use this model for analyzing a variety of censored survival data. For inference on the regression parameters, semiparametric procedures based on the normal approximation are available; however, their accuracy can be quite low when the censoring rate is heavy. In this paper, we apply an empirical likelihood ratio method and derive its limiting distribution via U-statistics. We obtain confidence regions for the regression parameters and compare the proposed method with the normal approximation based method in terms of coverage probability. The simulation results demonstrate that the proposed empirical likelihood method substantially overcomes the under-coverage problem and outperforms the normal approximation based method. The proposed method is illustrated with a real data example. Finally, our method can be applied to general U-statistic type estimating equations.

14.
This article deals with inference for a right-censored partially linear single-index model (RCPLSIM). The main focus is local empirical likelihood-based inference on the nonparametric part of the RCPLSIM. Using a synthetic data approach, an empirical log-likelihood ratio statistic for the nonparametric part is defined, and its limiting distribution is shown not to be a central chi-squared distribution. To increase the accuracy of the confidence interval, we also propose a corrected empirical log-likelihood ratio statistic for the nonparametric function, which is proved to follow a standard chi-squared limiting distribution. Simulation studies assess the finite sample performance of the proposed confidence intervals, and a real example is also considered.

15.
This paper deals with nonparametric inference problems in the multiplicative intensity model for counting processes. We propose a Nelson–Aalen type estimator based on discrete observation. The functional asymptotic normality of the estimator is proved. The limit process is the same as in the continuous observation case, so the proposed estimator based on discrete observation shares the properties of the Nelson–Aalen estimator based on continuous observation; for example, its asymptotic efficiency holds even though discrete observation provides less information than continuous observation. A Kaplan–Meier type estimator is also discussed. Nonparametric goodness-of-fit testing is considered, and an asymptotically distribution-free test is proposed.
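The discretely observed variant studied in the paper is not reproduced. As a reference point, the sketch below computes the ordinary Nelson–Aalen cumulative hazard estimate from right-censored data and compares it with the true cumulative hazard of a simulated exponential model; the simulation settings are assumptions.

```python
import numpy as np

def nelson_aalen(time, event):
    """Nelson-Aalen estimate of the cumulative hazard A(t) from right-censored
    data: sum over event times of d(t)/Y(t), where d(t) is the number of events
    at t and Y(t) the number at risk just before t."""
    order = np.argsort(time)
    time, event = np.asarray(time)[order], np.asarray(event)[order]
    cumhaz, A = [], 0.0
    for t in np.unique(time[event == 1]):
        d = np.sum((time == t) & (event == 1))
        at_risk = np.sum(time >= t)
        A += d / at_risk
        cumhaz.append((t, A))
    return cumhaz

rng = np.random.default_rng(5)
T = rng.exponential(scale=2.0, size=300)        # true hazard rate 0.5
C = rng.exponential(scale=4.0, size=300)        # independent censoring
obs, delta = np.minimum(T, C), (T <= C).astype(int)
est = nelson_aalen(obs, delta)
t, A = est[len(est) // 2]                       # inspect a mid-range time point
print(f"A({t:.2f}) = {A:.3f}, true = {0.5 * t:.3f}")
```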

16.
We consider the problem of making statistical inference about the mean of a normal distribution based on a random sample of quantized (digitized) observations. This problem arises, for example, in a measurement process with normally distributed errors and a measurement device of known resolution, such as an analog-to-digital converter or another digital instrument. In this paper we investigate the effect of quantization on subsequent statistical inference about the true mean. If the standard deviation of the measurement error is large relative to the resolution of the measurement device, the effect of quantization diminishes and standard statistical inference remains valid; we therefore focus on situations where the standard deviation of the measurement error is relatively small. Using Monte Carlo simulations, we compare the small sample properties of interval estimators of the mean based on the standard approach (i.e. ignoring the fact that the measurements have been quantized) with some recently suggested methods, including interval estimators based on the maximum likelihood approach and the fiducial approach. The paper extends the original study by Hannig et al. (2007).
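The maximum likelihood and fiducial intervals of the cited work are not implemented here. The sketch below only reproduces, in a hedged way, the phenomenon the abstract describes: a Monte Carlo estimate of the coverage of the standard t-interval computed from quantized data, which degrades as the error standard deviation becomes small relative to the (assumed) resolution.

```python
import numpy as np
from scipy import stats

def t_interval_coverage(mu, sigma, resolution, n=10, reps=20000, seed=6):
    """Fraction of standard 95% t-intervals, computed from quantized data as if
    it were not quantized, that cover the true mean."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, size=(reps, n))
    xq = np.round(x / resolution) * resolution          # quantize to the device grid
    m = xq.mean(axis=1)
    s = xq.std(axis=1, ddof=1)
    half = stats.t.ppf(0.975, df=n - 1) * s / np.sqrt(n)
    return np.mean((m - half <= mu) & (mu <= m + half))

for sigma in (1.0, 0.3, 0.1):       # resolution fixed at 1; coverage drops as sigma shrinks
    print(sigma, round(t_interval_coverage(mu=0.2, sigma=sigma, resolution=1.0), 3))
```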

17.
This paper focuses on testing the parameters of quantile regression models. For complete observations, it has been shown in the literature that test statistics based on the empirical likelihood (EL) method and the smoothed empirical likelihood (SEL) method both converge weakly to the standard chi-squared distribution $\chi_M^2$ under the null hypothesis. For right censored data, the EL-based statistics in the literature have a weighted chi-squared limiting distribution with unknown weights. In this paper, we show that statistics based on the EL and SEL methods also converge weakly to $\chi_M^2$ under the null hypothesis, so no weights need to be estimated. Because its estimating function is smoothed, the SEL method can be Bartlett corrected. Numerical results show that the SEL method with Bartlett correction outperforms some recent methods.

18.
Accelerated failure time (AFT) models are useful regression tools for studying the association between a survival time and covariates. Semiparametric inference procedures have been proposed in an extensive literature; among these, Fygenson and Ritov (1994) proposed an estimating equation that is monotone in the regression parameter and has some excellent properties. However, the associated normal approximation suffers from a serious under-coverage problem for small sample sizes. In this paper, we derive the limiting distribution of the empirical log-likelihood ratio for the regression parameter on the basis of the monotone estimating equations, and obtain empirical likelihood (EL) confidence intervals/regions for the regression parameter. A simulation study compares the proposed EL method with the normal approximation method; the results suggest that the EL based method outperforms the normal approximation based method in terms of coverage probability and thus overcomes its under-coverage problem.
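The empirical likelihood construction is not shown here. As a sketch of a Gehan-type monotone rank estimating function of the kind attributed to Fygenson and Ritov for a one-covariate AFT model, the code below evaluates it on simulated right-censored data and locates its approximate root by grid search; the data-generating choices are assumptions.

```python
import numpy as np

def gehan_estfun(beta, logT, delta, x):
    """Gehan-weighted (monotone) estimating function for a scalar AFT coefficient:
    U(b) = n^-2 * sum_{i,j} delta_i * (x_i - x_j) * I{e_i <= e_j},
    with residuals e_i = log(T_i) - x_i * b computed from the observed times."""
    e = logT - x * beta
    diff_x = x[:, None] - x[None, :]                      # x_i - x_j
    indicator = (e[:, None] <= e[None, :]).astype(float)
    return np.sum(delta[:, None] * diff_x * indicator) / len(x) ** 2

rng = np.random.default_rng(7)
n, beta_true = 400, 1.0
x = rng.normal(size=n)
logT = beta_true * x + rng.normal(scale=0.5, size=n)      # AFT model on the log scale
logC = rng.normal(loc=1.0, scale=1.0, size=n)             # independent censoring
obs = np.minimum(logT, logC)
delta = (logT <= logC).astype(float)

grid = np.linspace(0.0, 2.0, 201)
U = np.array([gehan_estfun(b, obs, delta, x) for b in grid])
print("approximate root:", grid[np.argmin(np.abs(U))])    # should be near beta_true = 1
```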

19.
Expected gain in Shannon information is commonly suggested as a Bayesian design evaluation criterion. Because estimating expected information gains is computationally expensive, examples in which they have been used successfully to identify Bayes optimal designs are few and typically quite simplistic. This article discusses, in general terms, some properties of estimators of expected information gains based on Markov chain Monte Carlo (MCMC) and Laplacian approximations. We then investigate issues that arise when applying these methods to the problem of experimental design in the (technically nontrivial) random fatigue-limit model of Pascual and Meeker. An example comparing follow-up designs for a laminate panel study is provided.
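None of the MCMC or Laplace machinery is reproduced here. As a minimal, hedged illustration of why estimating expected information gain is expensive, the code below uses a nested Monte Carlo estimator for a toy one-parameter normal model and compares candidate designs that differ only in their (assumed) measurement noise.

```python
import numpy as np
from scipy.stats import norm

def expected_information_gain(noise_sd, n_outer=2000, n_inner=2000, seed=8):
    """EIG = E_{theta,y}[ log p(y|theta) - log p(y) ], with the marginal p(y)
    approximated by an inner Monte Carlo average over fresh prior draws.
    Toy model: prior theta ~ N(0,1), likelihood y ~ N(theta, noise_sd^2)."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=n_outer)                      # outer prior draws
    y = theta + noise_sd * rng.normal(size=n_outer)       # simulated observations
    log_lik = norm.logpdf(y, loc=theta, scale=noise_sd)
    theta_in = rng.normal(size=n_inner)                   # inner prior draws
    inner = norm.logpdf(y[:, None], loc=theta_in[None, :], scale=noise_sd)
    log_marg = np.logaddexp.reduce(inner, axis=1) - np.log(n_inner)   # log-mean-exp
    return np.mean(log_lik - log_marg)

for sd in (1.0, 0.5, 0.25):     # a more precise design should gain more information
    print(sd, round(expected_information_gain(sd), 3))
```

The nested structure (an inner average for every outer draw) is what makes these estimators costly, which is the motivation for the approximations discussed in the article.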

20.
At present, methods for measuring and predicting the dynamic strength of materials are complicated and unstandardized. An experimental data processing method based on the incubation time criterion is considered. In practice, only a finite number of measurements containing random errors and limited statistical information are usually available, since dynamic tests are laborious and every individual test requires considerable time. This strongly restricts the number of applicable data processing methods unless approximate and heuristic solutions are acceptable. The method of sign-perturbed sums (SPS) is used to estimate finite-sample confidence regions with a specified confidence probability under the assumption of symmetric noise. It is shown that a few experimental points are sufficient to determine the strength parameter with an accuracy acceptable for engineering calculations. The applicability of the proposed method is demonstrated by processing a number of experiments on the dynamic fracture of rocks.
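The incubation-time application is not reproduced. As a compact sketch of the sign-perturbed sums construction itself, the code below builds an SPS confidence region for a scalar regression parameter under symmetric noise; the model, sample size, and grid are illustrative assumptions. With m = 100 and q = 5, SPS theory gives an exact confidence level of 1 - q/m = 0.95 for finite samples.

```python
import numpy as np

def sps_accepts(theta, x, y, m=100, q=5, seed=9):
    """Sign-Perturbed Sums test for y_i = theta*x_i + noise_i with symmetric noise.
    theta is kept in the (1 - q/m) confidence region when the reference sum |S_0|
    is NOT among the q largest of the m sums.  The fixed seed keeps the
    perturbation signs identical across candidate theta values."""
    rng = np.random.default_rng(seed)
    res = y - theta * x
    s0 = np.abs(np.sum(x * res))
    perturbed = []
    for _ in range(m - 1):
        signs = rng.choice([-1.0, 1.0], size=len(x))
        perturbed.append(np.abs(np.sum(x * signs * res)))
    rank = 1 + np.sum(np.array(perturbed) < s0)     # position of s0 among all m sums
    return rank <= m - q

rng = np.random.default_rng(10)
n, theta_true = 25, 2.0
x = rng.uniform(0.5, 1.5, size=n)
y = theta_true * x + rng.laplace(scale=0.3, size=n)   # symmetric, non-Gaussian noise
grid = np.linspace(1.5, 2.5, 101)
region = [t for t in grid if sps_accepts(t, x, y)]
if region:
    print("approx. 95% SPS region:", round(min(region), 3), "to", round(max(region), 3))
else:
    print("no grid point accepted")
```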
