Similar Documents
20 similar documents found (search time: 78 ms)
1.
For many complex business and industry problems, high‐dimensional data collection and modeling have been conducted. It has been shown that interactions may have important implications beyond the main effects. The number of unknown parameters in an interaction analysis can be larger or much larger than the sample size. As such, results generated from analyzing a single data set are often unsatisfactory. Integrative analysis, which jointly analyzes the raw data from multiple independent studies, has been conducted in a series of recent studies and shown to outperform single–data set analysis, meta‐analysis, and other multi–data set analyses. In this study, our goal is to conduct integrative analysis in interaction analysis. For regularized estimation and selection of important interactions (and main effects), we apply a threshold gradient directed regularization approach. Advancing from the existing studies, the threshold gradient directed regularization approach is modified to respect the “main effects, interactions” hierarchy. The proposed approach has an intuitive formulation and is computationally simple and broadly applicable. Simulations and the analyses of financial early warning system data and news‐APP (application) recommendation behavior data demonstrate its satisfactory practical performance.
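The thresholding-plus-hierarchy idea can be sketched in a few lines of numpy. This is an illustrative reading of threshold gradient directed regularization, not the authors' exact algorithm: the hierarchy rule used here (an interaction coefficient updates only once both of its main effects are active), the threshold, step size, and data are all our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.standard_normal((n, p))
pairs = [(j, k) for j in range(p) for k in range(j + 1, p)]
Z = np.column_stack([X[:, j] * X[:, k] for j, k in pairs])  # interaction columns
W = np.hstack([X, Z])
beta_true = np.zeros(W.shape[1])
beta_true[0] = 2.0        # main effect x0
beta_true[1] = -1.5       # main effect x1
beta_true[p] = 1.0        # interaction x0 * x1 (first pair)
y = W @ beta_true + 0.1 * rng.standard_normal(n)

beta = np.zeros(W.shape[1])
tau, nu, steps = 0.8, 0.01, 500   # threshold, step size, iterations (illustrative)
for _ in range(steps):
    g = W.T @ (y - W @ beta) / n                 # negative gradient of squared loss
    keep = np.abs(g) >= tau * np.abs(g).max()    # threshold relative to largest gradient
    # hierarchy: interaction (j, k) may update only if both main effects are active
    active_main = (beta[:p] != 0) | keep[:p]
    for idx, (j, k) in enumerate(pairs):
        if not (active_main[j] and active_main[k]):
            keep[p + idx] = False
    beta += nu * g * keep

print(sorted(np.nonzero(beta)[0].tolist())[:3])
```

With these settings the two strong main effects enter first, and the interaction is only allowed in once both of its parents are active.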

2.
The meta-synthesis method was proposed by the Chinese system scientist Qian Xuesen (Tsien Hsue-shen) in the early 1990s to tackle open complex giant system problems that cannot be effectively solved by traditional reductionist methods. The method emphasizes synthesizing collected information with the knowledge of various kinds of experts, and combining quantitative methods with qualitative knowledge. Since then, continuous efforts have been made to put those ideas into practice. In this paper, we first briefly review the meta-synthesis approach and other research relevant to complex system modeling. Then we discuss two main issues often confronted when applying the meta-synthesis approach, model integration and opinion synthesis, together with an exhibit of the development of an embryonic meta-synthetic support prototype. The demonstration shows how to model complex problems, such as macro-economic problems, in the Hall for Workshop on Meta-Synthetic Engineering with versatile resources for information collection, model integration and opinion synthesis. Finally, some future work is indicated.

3.
One of the most important steps in the application of modeling using data envelopment analysis (DEA) is the choice of input and output variables. In this paper, we develop a formal procedure for a “stepwise” approach to variable selection that involves sequentially maximizing (or minimizing) the average change in the efficiencies as variables are added or dropped from the analysis. After developing the stepwise procedure, applications from classic DEA studies are presented and the new managerial insights gained from the stepwise procedure are discussed. We discuss how this easy-to-understand and intuitively sound method yields useful managerial results and assists in identifying DEA models that include the variables with the largest impact on the DEA results.
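A minimal sketch of one such step, under stated assumptions: compute input-oriented CCR efficiencies via the multiplier-form LP, then measure the average change in the scores when a candidate output variable is added. The synthetic data and the use of scipy's `linprog` are our choices, not the paper's.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """X: inputs (n_dmu, m); Y: outputs (n_dmu, s). Input-oriented CCR scores <= 1."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.concatenate([-Y[o], np.zeros(m)])             # maximize u . y_o
        A_ub = np.hstack([Y, -X])                            # u.Y_j - v.X_j <= 0
        A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]  # v . x_o = 1
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=[(0, None)] * (s + m))
        scores.append(-res.fun)
    return np.array(scores)

rng = np.random.default_rng(3)
X = rng.uniform(1, 10, (8, 2))        # 8 DMUs, 2 inputs (synthetic)
Y = rng.uniform(1, 10, (8, 2))        # 2 candidate outputs
base = ccr_efficiency(X, Y[:, :1])    # model with the first output only
full = ccr_efficiency(X, Y)           # after adding the second output
print(round((full - base).mean(), 3)) # average efficiency change
```

Adding a variable can never lower a DEA score (the new multiplier can always be set to zero), which is why the average change in efficiencies is a natural quantity to monitor along the stepwise search.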

4.
To support the modeling and analysis of multi-layer, multi-temporal geographical model simulation data, geometric algebra (GA) is introduced to design methods for data modeling, spatio-temporal queries and dynamic visualization. Algorithms, including slice and cross-section extraction, area and volume computation, morphology characteristics computation and change detection, are constructed directly from GA operators. We developed a prototype system, “GA-Coupling Analyzer”, to integrate all the methods. The system is demonstrated with simulation data of Antarctic “Ice–Ocean–Land” coupled changes. The results suggest that our approach provides a unified, geometrically meaningful way to represent and analyze complex geo-simulation data, integrating geometric representation with algebraic computation. With the powerful GA operators, spatio-temporal analysis methods can be constructed and implemented directly and simply.

5.
In this paper we propose forecasting market risk measures, such as Value at Risk (VaR) and Expected Shortfall (ES), for large dimensional portfolios via copula modeling. To that end, we compare several high dimensional copula models, from naive ones to complex factor copulas, which are able to simultaneously tackle the curse of dimensionality and introduce a high level of complexity into the model. We explore both static and dynamic copula fitting. In the dynamic case we allow different levels of flexibility for the dependence parameters, which are driven by a GAS (Generalized Autoregressive Score) model, in the spirit of Oh and Patton (2015). Our empirical results, for assets negotiated at the Brazilian BOVESPA stock market from January 2008 to December 2014, suggest that, compared to the other copula models, the GAS dynamic factor copula approach has a superior performance in terms of AIC (Akaike Information Criterion) and a non-inferior performance with respect to VaR and ES forecasting.
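The simulation route from a copula to VaR and ES can be sketched with a far simpler model than the paper's factor-GAS copulas: a static Gaussian copula with normal marginals. All parameters below are synthetic assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
d = 4
R = np.full((d, d), 0.4) + 0.6 * np.eye(d)   # exchangeable correlation matrix
L = np.linalg.cholesky(R)
n_sim = 100_000
Zc = rng.standard_normal((n_sim, d)) @ L.T   # correlated standard normals
U = norm.cdf(Zc)                             # Gaussian copula sample in [0, 1]^d
mu, sigma = 0.0005, 0.01                     # assumed (identical) marginal parameters
returns = norm.ppf(U, loc=mu, scale=sigma)   # map back through the marginals
port = returns.mean(axis=1)                  # equally weighted portfolio

alpha = 0.95
var = -np.quantile(port, 1 - alpha)          # VaR as a positive loss number
es = -port[port <= -var].mean()              # ES: mean loss beyond VaR
print(round(var, 4), round(es, 4))
```

Swapping the copula or the marginals changes only the simulation step; the VaR/ES computation on the simulated portfolio returns stays the same.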

6.
Dimensionality reduction is used to preserve significant properties of data in a low-dimensional space. In particular, data representation in a lower dimension is needed in applications where information comes from multiple high dimensional sources. Data integration, however, is a challenge in itself. In this contribution, we consider a general framework to perform dimensionality reduction taking into account that data are heterogeneous. We propose a novel approach, called Deep Kernel Dimensionality Reduction, which is designed for learning layers of new compact data representations simultaneously. The method can also be used to learn shared representations between modalities. We show by experiments on standard and on real large-scale biomedical data sets that the proposed method embeds data in a new compact meaningful representation, and leads to a lower classification error compared to the state-of-the-art methods.

7.
Combined evaluation modeling techniques have practical value for economic systems with incomplete information. Given the complexity and nonlinearity of economic systems, we first use 2001 data for Liaoning Province to comprehensively evaluate and rank the overall economic strength of its 14 major regions, applying the composite index method, the weighted average method, principal component analysis, and factor analysis. We then build a fuzzy combined evaluation model and apply it to re-evaluate and re-rank the economic development of the 14 regions, obtaining a more scientific and reasonable result.
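One simple way such per-method rankings can be combined, as a stand-in for the paper's fuzzy combination model, is to average the ranks the individual methods assign. The scores below are synthetic, not the Liaoning data.

```python
import numpy as np

# Rows: evaluation methods; columns: regions (synthetic scores).
scores = np.array([
    [0.9,  0.4,  0.7,  0.2 ],  # composite index method
    [0.8,  0.5,  0.6,  0.3 ],  # weighted average method
    [0.85, 0.35, 0.75, 0.25],  # principal component analysis
])
# Rank each region within each method: 1 = best.
ranks = scores.argsort(axis=1)[:, ::-1].argsort(axis=1) + 1
combined = ranks.mean(axis=0)   # average rank across methods
order = combined.argsort()      # final combined ranking of regions
print(order.tolist())           # → [0, 2, 1, 3]
```

A fuzzy combination replaces the plain average with membership-weighted aggregation, but the overall structure (method-level scores in, a single combined ranking out) is the same.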

8.
This article introduces a new case‐based density approach to modeling big data longitudinally, which uses ordinary differential equations and the linear advection partial differential equation (PDE) to treat macroscopic, dynamical change as a transport issue of aggregate cases across continuous time. The novelty of this approach comes from its unique data‐driven treatment of cases, which are K‐dimensional vectors whose velocity vectors are computed from each case's measurements on a set of empirically defined social, psychological, or biological variables. The three main strengths of this approach are its ability to: (1) translate the data‐driven, nonlinear trajectories of microscopic constituents (cases) into the linear movement of macroscopic trajectories, which take the form of densities; (2) detect the presence of multiple, complex steady state behaviors, including sinks, spiraling sources, saddles, periodic orbits, and attractor points; and (3) predict the motion of novel cases and time instances. To demonstrate the utility of this approach, we used it to model a recognized cohort dynamic: the longitudinal relationship between a country's per capita gross domestic product (GDP) and its longevity rates. Data for the model came from the widely used Gapminder dataset. Empirical results, including the strength of the model's fit and the novelty of its results (particularly on a topic of such extensive study) support the utility of our new approach. © 2014 Wiley Periodicals, Inc. Complexity 20: 45–57, 2015

9.
We introduce a method for learning pairwise interactions in a linear regression or logistic regression model in a manner that satisfies strong hierarchy: whenever an interaction is estimated to be nonzero, both its associated main effects are also included in the model. We motivate our approach by modeling pairwise interactions for categorical variables with arbitrary numbers of levels, and then show how we can accommodate continuous variables as well. Our approach allows us to dispense with explicitly applying constraints on the main effects and interactions for identifiability, which results in interpretable interaction models. We compare our method with existing approaches on both simulated and real data, including a genome-wide association study, all using our R package glinternet.
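The strong-hierarchy rule itself is easy to state in code. The helper below is a hypothetical illustration of the definition, not glinternet's API:

```python
def enforce_strong_hierarchy(main_effects, interactions):
    """Strong hierarchy: if interaction (j, k) is selected, both main
    effects j and k must be in the model.
    main_effects: set of selected variable indices;
    interactions: set of (j, k) pairs with nonzero interaction coefficients."""
    out = set(main_effects)
    for j, k in interactions:
        out.update((j, k))   # pull in both parents of every interaction
    return out

selected = enforce_strong_hierarchy({0}, {(1, 3), (0, 2)})
print(sorted(selected))  # → [0, 1, 2, 3]
```

The paper's contribution is to build this constraint into the penalized estimation itself, rather than repairing the selected set after the fact as this sketch does.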

10.
This paper examines ways in which the addition of data modeling features can enhance the capabilities of mathematical modeling languages. It demonstrates how such integration is achieved as an application of the embedded languages technique proposed by Bhargava and Kimbrough [4]. Decision-making, and decision support systems, require the representation and manipulation of both data and mathematical models. Several data modeling languages as well as several mathematical modeling languages exist, but they have different sets of these capabilities. We motivate with a detailed example the need for the integration of these capabilities. We describe the benefits that might result, and claim that this could lead to a significant improvement in the functionality of model management systems. Then we present our approach for the integration of these languages, and specify how the claimed benefits can be realized. The author's work on this paper was performed in conjunction with research funded by the Naval Postgraduate School.

11.
An underlying assumption in DEA is that the weights coupled with the ratio scales of the inputs and outputs imply linear value functions. In this paper, we present a general modeling approach to deal with outputs and/or inputs that are characterized by nonlinear value functions. To this end, we represent the nonlinear virtual outputs and/or inputs in a piece-wise linear fashion. We give the CCR model that can assess the efficiency of the units in the presence of nonlinear virtual inputs and outputs. Further, we extend the models with the assurance region approach to deal with concave output and convex input value functions. In effect, our formulations amount to a transformation of the original data set to an augmented data set where standard DEA models can then be applied, thus remaining within the grounds of standard DEA methodology. To underline the usefulness of this new development, we revisit a previous work of one of the authors dealing with the assessment of the human development index in the light of DEA.

12.
A survey of data envelopment analysis in energy and environmental studies
Data envelopment analysis has gained great popularity in energy and environmental (E&E) modeling in recent years. In this paper, we present a literature survey on the application of data envelopment analysis (DEA) to E&E studies. We begin with an introduction to the most widely used DEA techniques, which is followed by a classification of 100 publications in this field. The main features observed are summarized. Issues related to the selection of DEA models in E&E studies are discussed.

13.
This paper describes a method for an objective selection of the optimal prior distribution, or for adjusting its hyper-parameter, among the competing priors for a variety of Bayesian models. In order to implement this method, the integration of very high dimensional functions is required to obtain the normalizing constants of the posterior and even of the prior distribution. The logarithm of the high dimensional integral is reduced to the one-dimensional integration of a certain function with respect to a scalar parameter over the unit interval. Once the prior is decided, the Bayes estimate, or posterior mean, is mainly used here, in addition to the posterior mode. All of these are based on the simulation of Gibbs distributions, such as Metropolis' Monte Carlo algorithm. The improvement in the integration's accuracy is substantial in comparison with conventional crude Monte Carlo integration. In the present method, we have essentially no practical restrictions in modeling the prior and the likelihood. Illustrative artificial data from a lattice system are given to show the practicability of the present procedure.
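The reduction of a log normalizing constant to a one-dimensional integral over the unit interval can be illustrated on a toy Gaussian example where every quantity has a closed form. The geometric path and all parameters below are our assumptions for illustration, not the paper's setup.

```python
import numpy as np

# p0 = N(0, s0^2) is a normalized reference (Z0 = 1);
# q1(x) = exp(-x^2 / (2 s1^2)) is unnormalized, true Z1 = sqrt(2*pi) * s1.
# Along the geometric path p_t ∝ p0^(1-t) * q1^t,
#   d/dt log Z_t = E_t[log q1(x) - log p0(x)],
# so log Z1 is a 1-D integral of that expectation over t in [0, 1].
s0, s1 = 2.0, 1.0
t = np.linspace(0.0, 1.0, 1001)
prec = (1 - t) / s0**2 + t / s1**2      # precision of the bridging Gaussian p_t
ex2 = 1.0 / prec                        # E_t[x^2] under p_t (known exactly here)
integrand = (-0.5 / s1**2 + 0.5 / s0**2) * ex2 + 0.5 * np.log(2 * np.pi * s0**2)
# trapezoid rule over the unit interval
log_z1 = np.sum((integrand[1:] + integrand[:-1]) * np.diff(t)) / 2
print(round(log_z1, 4))                 # compare with 0.5 * log(2*pi) ≈ 0.9189
```

In a realistic model the expectation at each t would be estimated by Gibbs or Metropolis sampling rather than computed in closed form, but the outer one-dimensional integral is the same.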

14.
The finite mixture modeling approach is widely used for the analysis of bimodal or multimodal data that are individually observed in many situations. However, in some applications, the analysis becomes substantially challenging as the available data are grouped into categories. In this work, we assume that the observed data are grouped into distinct non-overlapping intervals and follow a finite mixture of normal distributions. For the inference of the model parameters, we propose a parametric approach that accounts for the categorical features of the data. The main idea of our method is to impute the missing information of the original data through the Bayesian framework using Gibbs sampling techniques. The proposed method was compared with the maximum likelihood approach, which uses the Expectation-Maximization algorithm for the estimation of the model parameters. It was also illustrated with an application to the Old Faithful geyser data.
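The imputation idea, drawing each latent observation from its component normal truncated to the observed interval, can be sketched for a single step of one Gibbs sweep. Component parameters, interval boundaries, and counts below are illustrative, and component assignments are taken as given.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(2)
edges = np.array([0.0, 1.0, 2.0, 3.0])   # interval boundaries (illustrative)
counts = np.array([10, 25, 5])           # observations falling in each interval
mu, sd = 1.4, 0.6                        # current parameters of one mixture component

imputed = []
for (lo, hi), m in zip(zip(edges[:-1], edges[1:]), counts):
    a, b = (lo - mu) / sd, (hi - mu) / sd  # truncnorm uses standardized bounds
    imputed.append(truncnorm.rvs(a, b, loc=mu, scale=sd, size=m, random_state=rng))
imputed = np.concatenate(imputed)
print(imputed.size)                      # latent values, each inside its interval
```

A full Gibbs sampler would alternate this imputation with updates of the component assignments and of each component's mean, variance, and weight given the imputed data.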

15.
In this paper we tackle the problem of outlier detection in data envelopment analysis (DEA). We propose a procedure that merges super-efficiency DEA and the forward search. Since DEA provides efficiency scores that are not parameters fitted to the data, we introduce a distance to be monitored along the search. This distance is obtained through the integration of a regression model and super-efficiency DEA. We simulate a Cobb-Douglas production function and compare the super-efficiency DEA and the forward search analysis in both uncontaminated and contaminated settings. For inference about outliers, we exploit envelopes obtained through Monte Carlo simulations.

16.
In this paper, we consider a general bilinear three dimensional ODE system, whose structure generalizes many mathematical models of biological interest, including many from epidemics. Our main goal is to find sufficient conditions, expressed in terms of the parameters of the system, ensuring that the geometric approach to global stability analysis, due to [M.Y. Li, J.S. Muldowney, A geometric approach to global-stability problems, SIAM J. Math. Anal. 27 (4) (1996) 1070-1083], may be successfully applied. We completely determine the dynamics of the general system, including thresholds and global stability of the nontrivial equilibrium. The obtained result is applied to several epidemic models. We further show how the role of new parameters in the stability of well-established models may be highlighted.
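One member of the epidemic family such results apply to is the SIR model with demography. This sketch, with illustrative parameter values of our own choosing, integrates it numerically and checks convergence to the endemic equilibrium when the basic reproduction number R0 = beta / (gamma + mu) exceeds one.

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma, mu = 0.5, 0.1, 0.02   # transmission, recovery, birth/death rates

def sir(t, y):
    s, i, r = y
    return [mu - beta * s * i - mu * s,        # susceptibles: births in, infection out
            beta * s * i - (gamma + mu) * i,   # infectives
            gamma * i - mu * r]                # recovered

sol = solve_ivp(sir, (0, 2000), [0.99, 0.01, 0.0], rtol=1e-8, atol=1e-10)

s_star = (gamma + mu) / beta                   # endemic equilibrium S*
i_star = mu * (1 - s_star) / (beta * s_star)   # endemic equilibrium I*
print(round(sol.y[0, -1], 4), round(sol.y[1, -1], 4))
```

Here R0 ≈ 4.17 > 1, so the trajectory spirals into the nontrivial equilibrium; with R0 below one the infection dies out and only the disease-free equilibrium is stable.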

17.
We demonstrate a real-world application of the interactive multiple objective optimization (MOO) approach to the simultaneous setting of input and output amounts for the opening of new branches. As illustrated by the case example, all the branches of a fast-food company employ multiple inputs to generate multiple outputs. The company launches several new branches each year and, therefore, needs to plan the quantities of inputs and outputs to be used and produced before their operations. Such input–output settings are a vital practical problem that arises whenever a new branch is opened in a host of different industries. In this paper, we show in detail the entire process of the application, from modeling the case problem to generating its solution. In the modeling stage, a data envelopment analysis model and a statistical method are subsequently utilized to form a nonlinear MOO problem for the input–output settings. To solve this problem, we then develop and apply an interactive MOO method, which combines two earlier interactive methods while compensating for their drawbacks and capturing their positive aspects.

18.
We propose a two-step variable selection procedure for censored quantile regression with high dimensional predictors. To account for censored data in the high dimensional case, we employ effective dimension reduction and the informative subset idea. Under some regularity conditions, we show that our procedure enjoys model selection consistency. A simulation study and real data analysis are conducted to evaluate the finite sample performance of the proposed approach.

19.
We consider the numerical evaluation of one-dimensional projections of general multivariate stable densities introduced by Abdul-Hamid and Nolan [H. Abdul-Hamid, J.P. Nolan, Multivariate stable densities as functions of one dimensional projections, J. Multivariate Anal. 67 (1998) 80-89]. In their approach higher order derivatives of one-dimensional densities are used, which seems to be cumbersome in practice. Furthermore, there are some difficulties for even dimensions. In order to overcome these difficulties we obtain an explicit finite-interval integral representation of one-dimensional projections for all dimensions. For this purpose we utilize the imaginary part of complex integration, whose real part corresponds to the derivative of the one-dimensional inversion formula. We also summarize relations between various parametrizations of the stable multivariate density and its one-dimensional projection.

20.
Sinc approximate methods are often used to solve complex boundary value problems such as problems on unbounded domains or problems with endpoint singularities. A recent implementation of the Sinc method [Li, C. and Wu, X., Numerical solution of differential equations using Sinc method based on the interpolation of the highest derivatives, Applied Mathematical Modeling 31 (1) 2007 1–9] in which Sinc basis functions are used to approximate the highest derivative in the governing equation of the boundary value problem is evaluated for structural mechanics applications in which interlaminar stresses are desired. We suggest an alternative approach for specifying the boundary conditions, and we compare the numerical results for analysis of a laminated composite Timoshenko beam, implementing both Li and Wu’s approach and our alternative approach for applying the boundary conditions. For the Timoshenko beam problem, we obtain accurate results using both approaches, including transverse shear stress by integration of the 3D equilibrium equations of elasticity. The beam results indicate our approach is less dependent on the selection of the Sinc mesh size than Li and Wu’s SIHD. We also apply SIHD to analyze a classical laminated composite plate. For the plate example, we experience difficulty in obtaining a complete system of equations using Li and Wu’s approach. For our approach, we suggest that additional necessary information may be obtained by applying the derivatives of the boundary conditions on each edge. Using this technique, we obtain accurate results for deflection and stresses, including interlaminar stresses by integration of the 3D equilibrium equations of elasticity. Our results for both the beam and the plate problems indicate that this approach is easily implemented, has a high level of accuracy, and good convergence properties.
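The plain Whittaker–Sinc approximation underlying such methods, rather than Li and Wu's highest-derivative (SIHD) formulation, can be sketched in a few lines: f(x) ≈ Σ_k f(kh) · sinc((x − kh)/h) on a uniform grid. The grid spacing, truncation range, and test function below are arbitrary choices for illustration.

```python
import numpy as np

h = 0.25                      # Sinc mesh size
k = np.arange(-40, 41)        # truncated index range
nodes = k * h
f = lambda x: np.exp(-x**2)   # rapidly decaying test function

def sinc_approx(x):
    # np.sinc is the normalized sinc: sin(pi t) / (pi t)
    return np.sum(f(nodes) * np.sinc((x - nodes) / h))

x0 = 0.37
print(abs(sinc_approx(x0) - f(x0)))   # approximation error at an off-grid point
```

For smooth, rapidly decaying functions the error of this cardinal-series approximation decreases exponentially as h shrinks, which is the property Sinc-based boundary value solvers exploit.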


Copyright©北京勤云科技发展有限公司  京ICP备09084417号