Similar Articles
20 similar articles found.
1.
The matrix exponential plays a fundamental role in the solution of differential systems that appear in many fields of science. This paper presents an efficient method for computing matrix exponentials based on Hermite matrix polynomial expansions. Hermite series truncation, together with scaling and squaring and the application of floating point arithmetic bounds to intermediate results, provides excellent accuracy compared with the best-known computational methods. A backward-error analysis of the approximation in exact arithmetic is given and used to provide a theoretical estimate for the optimal scaling of matrices. Two algorithms based on this method have been implemented as MATLAB functions and compared with the MATLAB functions funm and expm, obtaining greater accuracy in the majority of tests. A careful cost comparison with expm shows that the proposed algorithms have lower maximum cost for some matrix norm intervals. Numerical tests show that applying floating point arithmetic bounds to intermediate results may considerably reduce computational costs: for the final selected Hermite order, average costs were only 4.43% higher than those of expm, with better accuracy on 77.36% of the test matrices. The MATLAB implementation of the best Hermite matrix polynomial based algorithm has been made available online.
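The scaling-and-squaring idea at the core of such algorithms can be sketched in a few lines. The following Python sketch uses a plain truncated Taylor series rather than the paper's Hermite matrix polynomial expansion; the function name and truncation order are illustrative choices, not taken from the paper:

```python
import numpy as np

def expm_taylor_ss(A, order=12):
    """Matrix exponential via truncated Taylor series with scaling and squaring.

    A generic sketch of the scaling-and-squaring technique; the paper's
    algorithm replaces the Taylor series with a Hermite matrix polynomial
    expansion and adds floating point bounds on intermediate results.
    """
    A = np.asarray(A, dtype=float)
    # Scale A by 2**s so that ||A / 2**s|| is small enough for the series.
    norm = np.linalg.norm(A, 1)
    s = max(0, int(np.ceil(np.log2(norm)))) if norm > 0 else 0
    B = A / (2 ** s)
    # Truncated Taylor series: sum_{k=0}^{order} B**k / k!
    E = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, order + 1):
        term = term @ B / k
        E = E + term
    # Undo the scaling by repeated squaring: exp(A) = exp(A/2**s)**(2**s).
    for _ in range(s):
        E = E @ E
    return E
```

For a diagonal matrix the result can be checked entrywise against the scalar exponential.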

2.
Traditional tabular and graphical displays of results of simultaneous confidence intervals or hypothesis tests are deficient in several respects. Expanding on earlier work, we present new mean–mean multiple comparison (MMC) graphs that succinctly and compactly display the results of traditional procedures for multiple comparisons of population means or linear contrasts involving means. The MMC plot can be used with unbalanced, multifactor designs with covariates. After reviewing the construction of these displays in the S language (S-Plus and R), we demonstrate their application to four multiple comparison scenarios.

3.
The purpose of this article is to review the findings of Professor Fujikoshi, which are primarily in multivariate analysis. He derived many asymptotic expansions for multivariate statistics, including MANOVA tests, dimensionality tests and latent roots, under normality and nonnormality. He has made major contributions to the study of the theoretical accuracy of asymptotic expansions by deriving explicit error bounds. He has also contributed substantially to the important problem of variable selection, introducing "no additional information" hypotheses in several multivariate models and applying model selection criteria. More recently, he has taken on high-dimensional statistical problems. He has also worked on other topics in multivariate analysis, such as power comparisons of classes of tests and monotone transformations with improved approximations.

4.
In this paper, a method is developed to evaluate firms on the basis of the risks they face. In accordance with the multi-factor method, risk is represented as a vector of sensitivities to unexpected changes in risk factors. Subsequently, the sensitivities themselves are related to firm characteristics. In addition, an application of the method to interfirm comparison is presented and illustrated by a numerical example based on estimates from real data; some other possible future applications are also mentioned. Finally, some decision support tools are presented which may enhance the usefulness of the method in practice.

5.
Comparing treatment-specific survival is a common problem, and the weighted log-rank test is the most popular method for group comparison. In observational studies, however, treatments and censoring times are usually not independent, which invalidates the weighted log-rank tests. In this paper, we propose adjusted weighted log-rank tests in the presence of non-random treatment assignment and dependent censoring. A double-inverse weighting technique is developed to adjust the weighted log-rank tests: inverse probability of treatment weighting balances the baseline treatment assignment, while inverse probability of censoring weighting overcomes dependent censoring. We derive the asymptotic distribution of the proposed adjusted tests under the null hypothesis and propose a method to obtain the critical values. Simulation studies show that the adjusted log-rank tests have correct sizes, whereas the traditional weighted log-rank tests may fail in the presence of non-random treatment assignment and dependent censoring. An application to oropharyngeal carcinoma data from the Radiation Therapy Oncology Group is provided for illustration.
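The double-inverse weighting step can be illustrated in isolation. This Python sketch (function and argument names are hypothetical) combines an estimated propensity score with an estimated probability of remaining uncensored into a single weight per subject; in practice both quantities come from fitted models, not from known values as here:

```python
import numpy as np

def double_inverse_weights(treated, propensity, cens_surv):
    """Combine inverse-probability-of-treatment and -censoring weights.

    treated:    0/1 treatment indicators.
    propensity: estimated P(treatment = 1 | covariates) per subject.
    cens_surv:  estimated probability of remaining uncensored at the
                observed time per subject.
    All names are illustrative; this is not the paper's notation.
    """
    treated = np.asarray(treated, dtype=float)
    propensity = np.asarray(propensity, dtype=float)
    cens_surv = np.asarray(cens_surv, dtype=float)
    # Probability of the treatment actually received.
    p_trt = np.where(treated == 1, propensity, 1.0 - propensity)
    # Each subject is up-weighted by both inverse probabilities.
    return 1.0 / (p_trt * cens_surv)
```

A subject with a rare treatment assignment and a high censoring risk receives a correspondingly large weight.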

6.
The physical parameters obtained from modal tests do not satisfy the dynamic constraints of the eigenvalue function and the orthogonality requirements, owing to modeling and measurement errors. The purpose of this study is to present analytical equations for updated stiffness and mass matrices that satisfy such dynamic constraints. By minimizing cost functions of the difference between the analytical and desired physical parameter matrices, the corrected parameter matrices are derived in closed form using the Moore–Penrose inverse, without any multipliers. Cost functions proposed by several earlier researchers are utilized. By comparing existing analytical results with the proposed equations, the validity of the proposed methods is evaluated in an application.
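The role of the Moore–Penrose inverse can be illustrated with a generic minimum-norm correction. This Python sketch (all names illustrative) enforces the eigenvalue constraint K_u @ Phi = M @ Phi @ Lam while staying as close as possible to the analytical stiffness matrix in the Frobenius norm; the paper's own equations additionally enforce orthogonality and symmetry constraints:

```python
import numpy as np

def min_norm_update(K, Phi, Lam, M):
    """Minimum-Frobenius-norm stiffness correction via the pseudoinverse.

    K:   analytical stiffness matrix (n x n)
    Phi: measured mode shapes, one per column (n x m)
    Lam: diagonal matrix of measured eigenvalues (m x m)
    M:   mass matrix (n x n)
    Illustrative of the Moore-Penrose route; not the paper's full method.
    """
    # Residual of the eigenvalue equation for the analytical K.
    R = M @ Phi @ Lam - K @ Phi
    # R @ pinv(Phi) is the minimum-Frobenius-norm solution of dK @ Phi = R.
    return K + R @ np.linalg.pinv(Phi)
```

When Phi has full column rank, the updated matrix satisfies the eigenvalue constraint exactly.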

7.
The Lagrange interpolation problem in Banach spaces is approached by cardinal basis interpolation. Some error estimates are given and the results of several numerical tests are reported in order to show the approximation performances of the proposed interpolants. A comparison between some examples of interpolants is presented in the noteworthy case of Hilbert spaces, with some considerations about the possible localization of the formulas. Finally, some remarks about the cardinal basis interpolation framework are made from the application point of view.

8.
For some general multivariate linear models, linear rank statistics are used in conjunction with Roy's Union-Intersection Principle to develop tests for inference on the parameter vector when it is subject to certain linear constraints. More powerful tests are designed by incorporating the a priori information on these constraints. Profile analysis is an important application of this type of hypothesis testing problem; it consists of a set of hypothesis testing problems for the p-response, q-sample model, where it is assumed a priori that the response–sample interactions are null.

9.
Our discussion in this article centers on the application of Lagrangean relaxation and a subgradient optimization technique to the problem of primary route assignment (PRA) in survivable connection-oriented networks. The PRA problem consists in the static optimization of primary routes minimizing the Lost Flow in Node (LFN) function. The major contribution of this work is the combination of Lagrangean relaxation with other heuristic algorithms. We evaluate the performance of the proposed Lagrangean-based heuristic by comparing it with its counterparts, including an evolutionary algorithm and GRASP, on various network topologies and demand patterns. Simulation tests show that the new algorithm provides sub-optimal solutions that are better than those of the other heuristics.
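The Lagrangean relaxation / subgradient pattern can be shown on a toy problem rather than the PRA model (all names, the step-size rule, and the iteration count are illustrative): relax the constraint x >= 2 of min x^2 into the objective with a multiplier, solve the inner problem in closed form, and update the multiplier along a subgradient of the dual:

```python
def subgradient_dual(steps=5000):
    """Maximize the Lagrangean dual of:  minimize x**2  subject to  x >= 2.

    The Lagrangean is L(x, lam) = x**2 + lam * (2 - x); for fixed lam the
    inner minimizer is x = lam / 2, and (2 - x) is a subgradient of the
    dual function at lam.  Optimum: lam* = 4, x* = 2.
    """
    lam = 0.0
    for k in range(1, steps + 1):
        x = lam / 2.0                        # inner minimization, closed form
        g = 2.0 - x                          # subgradient of the dual at lam
        lam = max(0.0, lam + (1.0 / k) * g)  # diminishing step, project to lam >= 0
    return lam, lam / 2.0
```

With the diminishing 1/k step size the multiplier converges to the dual optimum, and the inner solution approaches the constrained minimizer.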

10.
Summary In a previous paper [5], the author proposed a class of asymptotically optimal (in the sense of Wald [11]) nonparametric tests for testing the hypothesis of no regression in a multiple linear regression model. In the present paper, we are interested in testing that the intercept in the multiple (linear) regression model is zero along with the absence of regression. A class of permutationally distribution-free tests is proposed and their asymptotic optimality established. These results generalize analogous findings of Puri and Sen [9] for ungrouped data. As an important application, an alternative test for the paired comparison problem is considered. Research supported by National Institutes of Health, Institute of General Medical Sciences GM 12868-05.

11.
Within the data envelopment analysis context, problems of discrimination between efficient and inefficient decision-making units often arise, particularly if there is a relatively large number of variables with respect to observations. This paper applies Monte Carlo simulation to generalize and compare two discrimination-improving methods: principal component analysis applied to data envelopment analysis (PCA–DEA) and variable reduction based on partial covariance (VR). Performance criteria are based on the percentage of observations incorrectly classified: efficient decision-making units mistakenly defined as inefficient and inefficient units defined as efficient. A trade-off was observed, with both methods improving discrimination by reducing the probability of the latter error at the expense of a small increase in the probability of the former. A comparison of the methodologies demonstrates that PCA–DEA provides a more powerful tool than VR, with consistently more accurate results. PCA–DEA is applied to all basic DEA models, and guidelines for its application are presented in order to minimize misclassification; the approach proves particularly useful when analyzing relatively small datasets, as it removes the need for additional preference information.
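The PCA step that PCA–DEA applies to the data before running the DEA models can be sketched as follows; the 95% variance threshold and the function name are illustrative choices, not taken from the paper:

```python
import numpy as np

def pca_reduce(X, var_explained=0.95):
    """Project observations onto the leading principal components.

    Keeps the smallest number of components whose cumulative share of
    variance reaches `var_explained`.  A sketch of the dimensionality
    reduction applied to the DEA input/output matrices.
    """
    Xc = X - X.mean(axis=0)                      # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    ratio = (s ** 2) / np.sum(s ** 2)            # variance share per component
    k = int(np.searchsorted(np.cumsum(ratio), var_explained) + 1)
    return Xc @ Vt[:k].T                         # scores on the first k PCs
```

For data lying along a single direction, a single component suffices and the output collapses to one column.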

12.
Arnold, Martin; Murua, Ander. Numerical Algorithms 1998, 19(1-4): 25-41
Non-stiff differential-algebraic equations (DAEs) can be solved efficiently by partitioned methods that combine well-known non-stiff integrators from ODE theory with an implicit method to handle the algebraic part of the system. In the present paper we consider partitioned one-step and partitioned multi-step methods for index-2 DAEs in Hessenberg form and the application of these methods to constrained mechanical systems. The methods are presented from a unified point of view. The comparison of various classes of methods is completed by numerical tests for benchmark problems from the literature. This revised version was published online in June 2006 with corrections to the Cover Date.

13.
李素芳, 张虎, 吴芳. 《运筹与管理》 2019, 28(10): 89-99
To address the sensitivity of traditional panel cointegration tests to outliers and the subjective choice involved in setting their null hypothesis, this paper uses dynamic common factors to capture the latent cross-sectional dependence structure of panel data and proposes a Bayesian quantile panel cointegration test based on this dynamic-factor dependence structure. Combining the conditional posterior distributions of the parameters at each main quantile level, a Gibbs sampling algorithm incorporating Kalman filtering is designed to carry out the test, and Monte Carlo simulation experiments verify its feasibility and effectiveness. An empirical study using panel data on financial development and economic growth across Chinese provinces finds a cointegration relationship between the two at all main quantile levels. The results show that the Bayesian quantile panel cointegration test avoids the misjudgments that traditional panel cointegration methods can make under different null hypothesis settings, overcomes the influence of outliers, and provides comprehensive and accurate parameter estimates and cointegration test results.

14.
A multivariate dispersion ordering based on random simplices is proposed in this paper. Given an R^d-valued random vector, we consider two random simplices determined by the convex hulls of two independent random samples of size d+1 from the vector. By means of the stochastic comparison of the Hausdorff distances between such simplices, a multivariate dispersion ordering is introduced. The main properties of the new ordering are studied, and relationships with other dispersion orderings are considered, with emphasis on the univariate version. Some statistical tests for the new order are proposed. An application of the ordering to the clinical evaluation of human corneal endothelia is provided, with several analyses based on an image database of human corneal endothelia.
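The Hausdorff distance at the heart of the comparison can be sketched for two simplices represented by their vertex sets (names illustrative; the distance between the convex hulls themselves requires measuring distances to the hulls rather than to the vertices, so this vertex-set version is only an upper bound):

```python
import numpy as np

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two finite point sets.

    A, B: arrays of shape (m, d) and (n, d) whose rows are points,
    e.g. the d+1 vertices of two simplices in R^d.
    """
    # D[i, j] = Euclidean distance between A[i] and B[j].
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    # max over A of dist to B, and max over B of dist to A.
    return max(D.min(axis=1).max(), D.min(axis=0).max())
```

The distance is zero exactly when the two vertex sets coincide, and symmetric in its arguments.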

15.
The DEAHP method for weight derivation and aggregation in the analytic hierarchy process (AHP) has been found to be flawed: it sometimes produces counterintuitive priority vectors for inconsistent pairwise comparison matrices, which makes its application very restrictive. This paper proposes a new data envelopment analysis (DEA) method for priority determination in the AHP and extends it to the group AHP situation. In this new DEA methodology, two specially constructed DEA models that differ from the DEAHP model are used to derive the best local priorities from a pairwise comparison matrix, or a group of such matrices, whether perfectly consistent or inconsistent. The new DEA method produces true weights for perfectly consistent pairwise comparison matrices, and for inconsistent matrices it produces best local priorities that are logical and consistent with the decision makers' (DMs') subjective judgments. In hierarchical structures, the new DEA method uses the simple additive weighting (SAW) method to aggregate the best local priorities without the need for normalization. Numerical examples are examined throughout the paper to show the advantages of the new DEA methodology and its potential applications in both the AHP and group decision making.
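For a perfectly consistent pairwise comparison matrix, any sound prioritization method must recover the true weights. The classical row geometric-mean method, a common baseline against which DEA-based approaches such as this one are judged (shown here as an illustration, not as the paper's DEA models), can be sketched as:

```python
import numpy as np

def geometric_mean_priorities(P):
    """Row geometric-mean priorities for a pairwise comparison matrix P.

    P[i, j] holds the judged ratio of alternative i to alternative j.
    For a perfectly consistent P (P[i, j] = w[i] / w[j]) this recovers
    the true weight vector w exactly.
    """
    n = P.shape[0]
    w = np.prod(P, axis=1) ** (1.0 / n)  # geometric mean of each row
    return w / w.sum()                   # normalize to sum to one
```

Building a consistent matrix from known weights and feeding it back in verifies the recovery property.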

16.
Statistical tests are developed regarding linear combinations of the parameters of several independent gamma populations. The tests are based on a generalized minimum chi-square procedure. On utilizing these, one can test hypotheses regarding the means or the scale parameters when the shape parameters are unknown. In these tests there is no need to assume the equality of the shape parameters of the underlying populations. Tests for comparing coefficients of variation of several gamma populations have also been developed. For the two population case, a power comparison of these tests with some existing tests is also presented. Two examples are provided to explain the procedure.

17.
A variety of very useful methods of statistical shape analysis are available for landmark data. In particular, standard methods of multivariate analysis can often be applied after suitable alignment and transformation of the data. An important example is the use of principal components analysis to provide a convenient route to graphical exploration of the main modes of variation in a sample. Where there are many landmarks or shape information is extracted in the form of curves or surfaces, the dimensionality of the resulting data can be very high and it is unlikely that substantial proportions of variability will be captured in one or two principal components. Issues of graphical exploration are explored in this setting, including random tours of a suitable low-dimensional subspace, the comparison of different groups of data, longitudinal changes and the identification of the features which distinguish individual cases from a group of controls. A suitable software environment for handling these methods with three-dimensional data is outlined. Issues of comparing principal components across time are also tackled through appropriately constructed permutation tests. All of these techniques are illustrated on a longitudinal study of facial development in young children, with particular interest in the identification of differences in facial shape between control children and those who have undergone surgical repair of a cleft lip and/or palate.

18.
A theoretical and experimental study on the buckling of pultruded columns under an axial load is reported. The tests were carried out on wide-flange (H-shape) and narrow-flange (I-shape) pultruded structural profiles reinforced with glass fibers (GFRP). The aim of the research was to provide more detailed information on the effective buckling behavior, the interaction between local and global buckling, and the type of failure, considering different slenderness ratios, from 9 to 195. Also, a comparison between the results of a finite-element analysis and experimental data is shown.

19.
1. Summary The main objective of this paper is to find a criterion for the comparison of two tests for non-parametric hypotheses, taking advantage of whatever qualitative information may exist. After a detailed analysis of the problem and some earlier suggestions for its solution (sections 2–4), a criterion is proposed in section 5. In order to apply it to a concrete case, a location problem is specified in section 6. The rank tests to be compared are analyzed in section 7, and the comparison by way of the criterion is carried out in section 8. It turns out that sign tests, sometimes slightly modified, are very often optimal according to the criterion used.

20.
This paper discusses the problem of testing for a single change point in the variance of a normal distribution. We construct three test statistics: the L test, based on a nonparametric U-statistic; the B test, based on a Bayesian approach; and the R test, derived from the maximum likelihood ratio method. Asymptotic critical values for the L, B and R tests are given, and Monte Carlo simulation is used to study and compare the power of these three tests with the squared CUSUM test and the LM test. When the change point lies in the first half of the sequence, the L and R tests perform better; when it lies in the second half, the squared CUSUM and B tests perform better.
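The squared CUSUM statistic used in the power comparison can be sketched as follows (a generic form; the paper's exact normalization may differ):

```python
import numpy as np

def cusum_of_squares(x):
    """CUSUM-of-squares statistic for a variance change point.

    C_k = (sum of x_i**2 for i <= k) / (total sum of squares); under no
    change, C_k stays close to k/n, so the statistic is the maximal
    absolute deviation of C_k from k/n.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    C = np.cumsum(x ** 2) / np.sum(x ** 2)
    k = np.arange(1, n + 1)
    return np.max(np.abs(C - k / n))
```

A series of constant magnitude yields a statistic of zero, while a mid-sample jump in variance pushes it toward its maximum.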
