Similar articles
20 similar articles found (search time: 531 ms)
1.
Environmental impact assessment (EIA) problems are often characterised by a large number of identified environmental factors that are qualitative in nature and can only be assessed on the basis of human judgments, which inevitably involve various types of uncertainty such as ignorance and fuzziness. EIA problems therefore need to be modelled and analysed using methods that can handle such uncertainties. The evidential reasoning (ER) approach provides such a modelling framework and analysis method. In this paper the ER approach is applied to EIA for the first time. The environmental impact consequences are characterised by a set of assessment grades that are assumed to be collectively exhaustive and mutually exclusive. All assessment information, quantitative or qualitative, complete or incomplete, and precise or imprecise, is modelled using the unified framework of a belief structure. The original ER approach with its recursive ER algorithm is introduced, and a new analytical ER algorithm is investigated that provides a means of using the ER approach in decision situations where an explicit ER aggregation function is needed, such as in optimisation problems. The ER approach is used to aggregate multiple environmental factors, resulting in an aggregated distributed assessment for each alternative policy. A numerical example and its modified version are studied to illustrate the detailed implementation process of the ER approach and demonstrate its potential applications in EIA.
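To make the aggregation step concrete, the following is a minimal sketch of an analytical ER combination under the standard ER conventions (attribute weights summing to one; per-attribute belief degrees that may sum to less than one to encode ignorance). The variable names and example numbers are ours, not the paper's:

```python
import numpy as np

def er_aggregate(beta, weights):
    """Analytical ER aggregation over N assessment grades (sketch).

    beta    : (L, N) array; beta[i, n] is the belief that attribute i
              is assessed to grade n (rows may sum to < 1: ignorance).
    weights : (L,) attribute weights summing to 1.
    Returns (combined grade beliefs, residual unassigned belief).
    """
    beta = np.asarray(beta, float)
    w = np.asarray(weights, float)
    N = beta.shape[1]

    # basic probability masses of the ER belief structure
    m = w[:, None] * beta                      # mass assigned to grades
    mh_bar = 1.0 - w                           # unassigned due to weights
    mh_tilde = w * (1.0 - beta.sum(axis=1))    # unassigned due to ignorance

    prod_total = np.prod(m + (mh_bar + mh_tilde)[:, None], axis=0)
    prod_h = np.prod(mh_bar + mh_tilde)
    prod_bar = np.prod(mh_bar)

    k = 1.0 / (prod_total.sum() - (N - 1) * prod_h)   # normalising factor
    mh_bar_c = k * prod_bar
    beta_c = k * (prod_total - prod_h) / (1.0 - mh_bar_c)
    beta_h = k * (prod_h - prod_bar) / (1.0 - mh_bar_c)
    return beta_c, beta_h

# e.g. two factors assessed over three grades ('poor', 'average', 'good'),
# the second factor 10% incomplete:
print(er_aggregate([[0.3, 0.5, 0.2], [0.0, 0.4, 0.5]], [0.6, 0.4]))
```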

2.
This paper proposes a new approach to minimising inventory levels and their associated costs within large, geographically dispersed organisations. For such organisations, attaining a high degree of agility is becoming increasingly important. Linear regression-based tools have traditionally been employed to assist human experts in inventory optimisation endeavours; more recently, neural network (NN) techniques have been proposed for this domain. The objective of this paper is to create a hybrid framework that can be utilised for analysis, modelling and forecasting purposes. This framework combines the two existing approaches and introduces a new associated cost parameter that serves as a surrogate for customer satisfaction. The use of the hybrid framework is described using a running example relating to a large, geographically dispersed organisation.
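As a toy illustration of the kind of hybrid described, and not the paper's actual framework, the sketch below blends a linear-regression forecast with an NN forecast and scores the result with a hypothetical asymmetric cost in which stock-outs, a stand-in for lost customer satisfaction, are penalised more heavily than excess stock. All data and parameter values are made up:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

# Placeholder demand data: 4 hypothetical drivers, 200 periods.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))
y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + 0.2 * rng.standard_normal(200)
X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]

# Blend the two forecasters the abstract contrasts.
lin = LinearRegression().fit(X_tr, y_tr)
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)
forecast = 0.5 * (lin.predict(X_te) + mlp.predict(X_te))

# Asymmetric cost surrogate: unmet demand costs more than excess stock.
stockout_cost, holding_cost = 5.0, 1.0       # hypothetical parameters
shortfall = np.maximum(y_te - forecast, 0.0)
surplus = np.maximum(forecast - y_te, 0.0)
total = stockout_cost * shortfall.sum() + holding_cost * surplus.sum()
print(f"cost surrogate on the hold-out periods: {total:.1f}")
```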

3.
In this paper, an ensemble technique combining principal component analysis (PCA) with the scale-dependent Lyapunov exponent (SDLE) is used to characterize the complexity of the precipitation dynamical system. The spatio-temporal precipitation data are decomposed using PCA, and the SDLEs of the first few principal component (PC) time series are then computed. The first few PC time series are found to exhibit different scaling laws on different time scales. The study illustrates that the spatio-temporal precipitation data are chaotic and that the precipitation system is truly multiscale and complex.
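For readers unfamiliar with the decomposition step, here is a minimal sketch of extracting PC time series from a space-time field via the SVD; the data are random placeholders and the SDLE computation itself is not shown:

```python
import numpy as np

# Placeholder space-time field: 500 time steps at 40 stations.
rng = np.random.default_rng(0)
precip = rng.standard_normal((500, 40))

anom = precip - precip.mean(axis=0)        # remove each station's mean
u, s, vt = np.linalg.svd(anom, full_matrices=False)
pcs = u[:, :3] * s[:3]                     # first three PC time series
evr = s**2 / (s**2).sum()                  # explained-variance ratios
print("explained variance of PC1-3:", evr[:3].round(3))
# Each column of `pcs` would then be passed to an SDLE estimator.
```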

4.
Principal component analysis (PCA) is a canonical tool that reduces data dimensionality by finding linear transformations that project the data into a lower-dimensional subspace while preserving the variability of the data. Selecting the number of principal components (PCs) is essential but challenging for PCA, since it represents an unsupervised learning problem without a clear target label at the sample level. In this article, we propose a new method to determine the optimal number of PCs based on the stability of the space spanned by the PCs. A series of analyses with both synthetic and real data demonstrates the superior performance of the proposed method.
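The abstract does not state the stability criterion, so the sketch below uses an illustrative measure of our own: the mean squared canonical correlation between the top-k PC subspace fitted on the full data and on bootstrap resamples, which one could screen over k:

```python
import numpy as np

def pc_subspace_stability(X, k, n_boot=50, seed=0):
    """Illustrative stability score for the top-k PC subspace (not the
    authors' exact criterion). Values near 1 mean the k-dimensional
    span barely changes under resampling.
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(X, float)
    n = X.shape[0]

    def top_k_basis(M):
        M = M - M.mean(axis=0)
        _, _, vt = np.linalg.svd(M, full_matrices=False)
        return vt[:k].T                      # orthonormal (p, k) basis

    V0 = top_k_basis(X)
    scores = []
    for _ in range(n_boot):
        Vb = top_k_basis(X[rng.integers(0, n, n)])   # bootstrap resample
        s = np.linalg.svd(V0.T @ Vb, compute_uv=False)
        scores.append(np.mean(s ** 2))       # squared canonical correlations
    return float(np.mean(scores))

# One could screen k = 1, 2, ... and keep the largest k that stays stable.
```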

5.
Principal component analysis (PCA) is a widely used tool for data analysis and dimension reduction in applications throughout science and engineering. However, the principal components (PCs) can sometimes be difficult to interpret, because they are linear combinations of all the original variables. To facilitate interpretation, sparse PCA produces modified PCs with sparse loadings, i.e. loadings with very few non-zero elements. In this paper, we propose a new sparse PCA method, namely sparse PCA via regularized SVD (sPCA-rSVD). We use the connection of PCA with singular value decomposition (SVD) of the data matrix and extract the PCs through solving a low rank matrix approximation problem. Regularization penalties are introduced to the corresponding minimization problem to promote sparsity in PC loadings. An efficient iterative algorithm is proposed for computation. Two tuning parameter selection methods are discussed. Some theoretical results are established to justify the use of sPCA-rSVD when only the data covariance matrix is available. In addition, we give a modified definition of variance explained by the sparse PCs. The sPCA-rSVD provides a uniform treatment of both classical multivariate data and high-dimension-low-sample-size (HDLSS) data. Further understanding of sPCA-rSVD and some existing alternatives is gained through simulation studies and real data examples, which suggests that sPCA-rSVD provides competitive results.
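A minimal rank-1 sketch of the regularized-SVD idea, assuming a lasso-type (soft-thresholding) penalty on the loadings; tuning-parameter selection and the deflation needed for further sparse PCs are omitted:

```python
import numpy as np

def sparse_pc_rank1(X, lam=0.1, n_iter=100, tol=1e-8):
    """One sparse PC via a regularized rank-1 SVD iteration (sketch).

    Alternates between the left singular vector u and a soft-thresholded
    loading vector v; lam is an illustrative L1 penalty level.
    """
    X = np.asarray(X, float)
    u, s, vt = np.linalg.svd(X, full_matrices=False)
    u, v = u[:, 0], s[0] * vt[0]             # warm start from plain SVD
    for _ in range(n_iter):
        v_old = v
        z = X.T @ u
        v = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)  # soft threshold
        if np.allclose(v, 0):
            break                             # penalty too large
        Xv = X @ v
        u = Xv / np.linalg.norm(Xv)
        if np.linalg.norm(v - v_old) < tol:
            break
    return u, v                               # v: sparse loading vector
```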

6.
The teaching performance of a university's schools and departments has a direct bearing on the quality of the graduates the university produces. Addressing the problems in the methods currently used by universities to determine the weights of the indicators by which the undergraduate teaching of their schools and departments is assessed, this paper introduces the balanced scorecard (BSC) and the analytic hierarchy process (AHP) into undergraduate teaching performance evaluation and designs a decision method for determining the weights of the quantitative assessment indicators used in the management-by-objectives appraisal of undergraduate teaching. The method is an innovation in both theory and methodology, and it resolves the imbalance in evaluation previously caused by setting indicator weights purely by experience.
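For reference, a minimal sketch of the AHP weight-derivation step (Saaty's eigenvector method with a consistency check), independent of the paper's BSC indicator hierarchy:

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a reciprocal pairwise comparison matrix.

    Returns the principal eigenvector normalised to sum to 1, plus the
    consistency ratio CR (random-index values from Saaty's table).
    """
    A = np.asarray(A, float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                 # principal eigenpair
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    ci = (vals[k].real - n) / (n - 1)        # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
          6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}[n]   # Saaty's random index
    cr = ci / ri if ri else 0.0              # CR < 0.1 is conventionally OK
    return w, cr

# Illustrative 3-indicator comparison matrix:
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_weights(A))
```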

7.
In this article, we propose a new framework for matrix factorization based on principal component analysis (PCA) where sparsity is imposed. The structure to impose sparsity is defined in terms of groups of correlated variables found in correlation matrices or maps. The framework is based on three new contributions: an algorithm to identify the groups of variables in correlation maps, a visualization for the resulting groups, and a matrix factorization. Together with a method to compute correlation maps with minimum noise level, referred to as missing-data for exploratory data analysis (MEDA), these three contributions constitute a complete matrix factorization framework. Two real examples are used to illustrate the approach and compare it with PCA, sparse PCA, and structured sparse PCA. Supplementary materials for this article are available online.

8.
Department of Health staff wished to use systems modelling to discuss acute patient flows with groups of NHS staff. The aim was to assess the usefulness of system dynamics (SD) in a healthcare context and to elicit proposals concerning ways of improving patient experience. Since time restrictions excluded simulation modelling, a hybrid approach using stock/flow symbols from SD was created. Initial interviews and hospital site visits generated a series of stock/flow maps. A ‘Conceptual Framework’ was then created to introduce the mapping symbols and to generate a series of questions about different patient paths and what might speed or slow patient flows. These materials formed the centre of three workshops for NHS staff. The participants were able to propose ideas for improving patient flows, and the elicited data were subsequently employed to create a finalized suite of maps of a general acute hospital. The maps and ideas were communicated back to the Department of Health and subsequently assisted the work of the Modernisation Agency.
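To illustrate what a stock/flow structure computes once quantified, here is a deliberately tiny simulation of an ED-to-ward pathway; the stocks, rates and bed capacity are all hypothetical and are not taken from the Department of Health work:

```python
# Tiny stock/flow sketch of an acute patient pathway (illustrative only;
# all names, rates and the 100-bed capacity are made up).
dt, horizon = 0.25, 30.0                     # time step and run length (days)
ed_queue, ward = 20.0, 80.0                  # stocks: waiting / in beds
arrivals = 40.0                              # inflow to ED (patients/day)
treat_rate, discharge_frac = 0.9, 0.12       # per-day fractional rates

t = 0.0
while t < horizon:
    discharge = discharge_frac * ward                    # outflow from ward
    free_beds = max(100.0 - ward, 0.0)
    admit = min(treat_rate * ed_queue, free_beds / dt)   # blocked by capacity
    ed_queue += (arrivals - admit) * dt                  # stock update: ED queue
    ward += (admit - discharge) * dt                     # stock update: beds
    t += dt

print(f"ED queue after {horizon:.0f} days: {ed_queue:.1f} patients")
```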

9.
In this paper, a new methodology is investigated to support the prioritization of the voices of customers gathered through various customer satisfaction surveys. This new methodology consists of two key components: an innovative evidence-driven decision modelling framework for representing and transforming large amounts of survey data, and a generic reasoning-based decision support process for aggregating evidence to prioritize the voices of customers on the basis of the Evidential Reasoning (ER) approach. Methods and frameworks for data collection and representation via multiple customer satisfaction surveys are examined first, and the distinctive features of quantitative and qualitative survey data are analysed. Several novel yet natural and pragmatic rule-based functions are then proposed to transform survey data systematically and consistently from different measurement scales to a common scale, with the original features and profiles of the data preserved in the transformation process. These new transformation functions are proposed to mimic expert judgement processes and designed to be sufficiently flexible and rigorous so that expert judgements and domain-specific knowledge can be taken into account naturally, systematically and consistently in the transformation process. The ER approach is used for synthesizing quantitative and qualitative data under uncertainty that can be caused by missing data and ambiguous survey questions. A new generic method is also proposed for ranking the voices of customers based on qualitative measurement scales without having to quantify assessment grades to fixed numerical values. A case study is examined using an Intelligent Decision System (IDS) to illustrate the application of the decision modelling framework and decision support process in prioritizing the voices of customers for a world-leading car manufacturer.

10.
An augmented Lagrangian approach for sparse principal component analysis
Principal component analysis (PCA) is a widely used technique for data analysis and dimension reduction with numerous applications in science and engineering. However, the standard PCA suffers from the fact that the principal components (PCs) are usually linear combinations of all the original variables, and it is thus often difficult to interpret the PCs. To alleviate this drawback, various sparse PCA approaches were proposed in the literature (Cadima and Jolliffe in J Appl Stat 22:203–214, 1995; d’Aspremont et al. in J Mach Learn Res 9:1269–1294, 2008; d’Aspremont et al. in SIAM Rev 49:434–448, 2007; Jolliffe in J Appl Stat 22:29–35, 1995; Journée et al. in J Mach Learn Res 11:517–553, 2010; Jolliffe et al. in J Comput Graph Stat 12:531–547, 2003; Moghaddam et al. in Advances in neural information processing systems 18:915–922, MIT Press, Cambridge, 2006; Shen and Huang in J Multivar Anal 99(6):1015–1034, 2008; Zou et al. in J Comput Graph Stat 15(2):265–286, 2006). Despite success in achieving sparsity, some important properties enjoyed by the standard PCA are lost in these methods, such as the uncorrelation of PCs and the orthogonality of loading vectors. Also, the total explained variance that they attempt to maximize can be too optimistic. In this paper we propose a new formulation for sparse PCA, aiming at finding sparse and nearly uncorrelated PCs with orthogonal loading vectors while explaining as much of the total variance as possible. We also develop a novel augmented Lagrangian method for solving a class of nonsmooth constrained optimization problems, which is well suited for our formulation of sparse PCA. We show that it converges to a feasible point, and moreover, under some regularity assumptions, it converges to a stationary point. Additionally, we propose two nonmonotone gradient methods for solving the augmented Lagrangian subproblems, and establish their global and local convergence. Finally, we compare our sparse PCA approach with several existing methods on synthetic (Zou et al. in J Comput Graph Stat 15(2):265–286, 2006), Pitprops (Jeffers in Appl Stat 16:225–236, 1967), and gene expression data (Chin et al. in Cancer Cell 10:529–541, 2006), respectively. The computational results demonstrate that the sparse PCs produced by our approach substantially outperform those by other methods in terms of total explained variance, correlation of PCs, and orthogonality of loading vectors. Moreover, the experiments on random data show that our method is capable of solving large-scale problems within a reasonable amount of time.
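In our own notation, and not necessarily the authors' exact formulation, the kind of problem and augmented Lagrangian described can be sketched as follows, with Σ the sample covariance matrix, ρ the sparsity weight, Λ the Lagrange-multiplier matrix and μ the penalty parameter:

```latex
% sparse PCA with orthogonal loadings (sketch, our notation):
\max_{V\in\mathbb{R}^{p\times r}}\ \operatorname{tr}\!\left(V^{\top}\Sigma V\right)
  -\rho\,\lVert V\rVert_{1}
\quad\text{s.t.}\quad V^{\top}V=I_{r}
% augmented Lagrangian of the constraint h(V)=V^{\top}V-I_{r}=0:
\mathcal{L}_{\mu}(V,\Lambda)=
  -\operatorname{tr}\!\left(V^{\top}\Sigma V\right)+\rho\,\lVert V\rVert_{1}
  +\langle\Lambda,\;V^{\top}V-I_{r}\rangle
  +\tfrac{\mu}{2}\,\lVert V^{\top}V-I_{r}\rVert_{F}^{2}
```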

11.
12.
The global economic crisis has had a significant impact on healthcare resource provision worldwide. The management of limited healthcare resources is further challenged by the high level of uncertainty in demand, which can lead to unbalanced utilization of the available resources and a potential deterioration in patient satisfaction in terms of longer waiting times and perceived reduced quality of services. Therefore, healthcare managers require timely and accurate tools to optimize resource utilization in a complex and ever-changing patient care process. An interactive simulation-based decision support framework is presented in this paper for healthcare process improvement. Complexity and different levels of variability within the process are incorporated into the process modelling phase, followed by the development of a simulation model to examine the impact of potential alternatives. As a performance management tool, the balanced scorecard (BSC) is incorporated within the framework to support continual and sustainable improvement by using strategy-linked performance measures and actions. These actions are evaluated by the simulation model, while the trade-offs between the somewhat conflicting objectives are analysed by a preference model. The preference model is designed in an interactive and iterative process that considers decision makers' preferences regarding the selected key performance indicators (KPIs). A detailed implementation of the framework is demonstrated on the emergency department (ED) of an adult teaching hospital in north Dublin, Ireland. The results show that unblocking ED outflows through in-patient bed management is more effective than increasing only the ED physical capacity or the ED workforce.

13.
Many multiple attribute decision analysis (MADA) problems are characterised by both quantitative and qualitative attributes with various types of uncertainty. Incompleteness (or ignorance) and vagueness (or fuzziness) are among the most common uncertainties in decision analysis. The evidential reasoning (ER) approach was developed in the 1990s and extended in recent years to support the solution of MADA problems with ignorance, a kind of probabilistic uncertainty. In this paper, the ER approach is further developed to deal with MADA problems with both probabilistic and fuzzy uncertainties. In this newly developed ER approach, precise data, ignorance and fuzziness are all modelled under the unified framework of a distributed fuzzy belief structure, leading to a fuzzy belief decision matrix. A utility-based grade match method is proposed to transform both numerical data and qualitative (fuzzy) assessment information of various formats into the fuzzy belief structure. A new fuzzy ER algorithm is developed to aggregate multiple attributes using the information contained in the fuzzy belief matrix, resulting in an aggregated fuzzy distributed assessment for each alternative. Unlike the existing ER algorithm, which is recursive in nature, the new fuzzy ER algorithm provides an analytical means of combining all attributes without iteration, thus providing scope and flexibility for sensitivity analysis and optimisation. A numerical example is provided to illustrate the detailed implementation process of the new ER approach and its validity and wide applicability.
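As a minimal sketch of a grade match for crisp numbers (the fuzzy-grade case in the paper is more involved): a value falling between the reference values of two adjacent grades is split between them in proportion to its distance from each. The linear rule and the names below are ours:

```python
import numpy as np

def grade_match(value, grade_values):
    """Distribute a numerical value over adjacent assessment grades.

    grade_values: increasing reference values at which each grade holds
    fully. Returns a belief vector over the grades (sums to 1).
    """
    g = np.asarray(grade_values, float)
    beta = np.zeros(len(g))
    if value <= g[0]:
        beta[0] = 1.0
    elif value >= g[-1]:
        beta[-1] = 1.0
    else:
        i = np.searchsorted(g, value) - 1    # bracketing grade pair
        beta[i] = (g[i + 1] - value) / (g[i + 1] - g[i])
        beta[i + 1] = 1.0 - beta[i]
    return beta

# e.g. grades 'poor'/'fair'/'good'/'excellent' anchored at 0, 3.5, 7, 10:
print(grade_match(7.2, [0.0, 3.5, 7.0, 10.0]))
```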

14.
Cross-efficiency in data envelopment analysis (DEA) models is an effective way to rank decision-making units (DMUs). The common methods for aggregating cross-efficiencies do not consider the preference structure of the decision maker (DM). When a DM’s preference structure does not satisfy the “additive independence” condition, a new aggregation method must be proposed. This paper uses the evidential reasoning (ER) approach to aggregate the cross-efficiencies obtained from cross-evaluation, through the transformation of the cross-efficiency matrix into pieces of evidence. This paper provides a new method for cross-efficiency aggregation and a new way for DEA models to reflect a DM’s preference or value judgments. Additionally, this paper presents examples that demonstrate the features of cross-efficiency aggregation using the ER approach, including an empirical example of the 2010 evaluation practice of 16 basic research institutes of the Chinese Academy of Sciences (CAS), which illustrates how the ER approach can be used to aggregate the cross-efficiency matrix produced by DEA models.
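Upstream of any aggregation, the cross-efficiency matrix itself comes from the CCR multiplier model. A minimal sketch using scipy's linprog, assuming strictly positive data and ignoring the secondary goals (e.g. aggressive or benevolent formulations) usually added to make the optimal weights unique:

```python
import numpy as np
from scipy.optimize import linprog

def cross_efficiency(X, Y):
    """Cross-efficiency matrix from the CCR multiplier model (sketch).

    X: (n, m) inputs, Y: (n, s) outputs of n DMUs. For each DMU o, solve
    max u.Y_o  s.t.  v.X_o = 1,  u.Y_j - v.X_j <= 0,  u, v >= 0,
    then score every DMU j with o's optimal weights (row o of E).
    """
    n, m = X.shape
    s = Y.shape[1]
    E = np.zeros((n, n))
    for o in range(n):
        c = np.concatenate([-Y[o], np.zeros(m)])          # maximise u.Y_o
        A_ub = np.hstack([Y, -X])                         # u.Y_j - v.X_j <= 0
        A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.X_o = 1
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (s + m))
        u, v = res.x[:s], res.x[s:]
        E[o] = (Y @ u) / (X @ v)              # scores under o's weights
    return E

# Toy example: 4 DMUs, 2 inputs, 1 constant output.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])
Y = np.ones((4, 1))
print(cross_efficiency(X, Y).round(3))
```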

15.
This research further develops the combined use of principal component analysis (PCA) and data envelopment analysis (DEA). The aim is to reduce the curse of dimensionality that occurs in DEA when there is an excessive number of inputs and outputs in relation to the number of decision-making units. Three separate PCA–DEA formulations are developed in the paper utilising the results of PCA to develop objective, assurance region type constraints on the DEA weights. The first model applies PCA to grouped data representing similar themes, such as quality or environmental measures. The second model, if needed, applies PCA to all inputs and separately to all outputs, thus further strengthening the discrimination power of DEA. The third formulation searches for a single set of global weights with which to fully rank all observations. In summary, it is clear that the use of principal components can noticeably improve the strength of DEA models.

16.
Evaluating higher education teaching performance is complex as it involves consideration of both objective and subjective criteria. The student evaluation of teaching (SET) is used to improve higher education quality. However, the traditional approaches to considering students’ responses to SET questionnaires for improving teaching quality have several shortcomings. This study proposes an integrated approach to higher education teaching evaluation that combines the analytical hierarchy process (AHP) and data envelopment analysis (DEA). The AHP allows consideration of the varying importance of each criterion of teaching performance, while DEA enables the comparison of tutors on teaching as perceived by students with a view to identifying the scope for improvement by each tutor. The proposed teaching evaluation method is illustrated using data from a higher education institution in Greece.

17.
Application of the group AHP method to vulnerability analysis of complex systems
The group analytic hierarchy process (group AHP) is a method for quantitatively synthesizing the evaluative judgments of a group of experts. This paper discusses the application of the method to system vulnerability analysis. To address the problem of combining expert opinions in the group AHP, it proposes computing an objective weight for each expert by principal component analysis and synthesizing the experts' judgments according to these weights. Application results show that the method better overcomes the influence of subjective human judgment and preference on the decision, providing an effective technical route to the quantitative vulnerability assessment of complex systems.
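A minimal sketch of the weighting idea, assuming the experts score a common set of items; each expert's loading on the leading principal component is taken as that expert's objective weight, and the paper's subsequent synthesis step is omitted:

```python
import numpy as np

def expert_weights_pca(J):
    """Objective expert weights via PCA (sketch of the idea only).

    J: (n_items, n_experts) matrix of the experts' scores on the same
    items. The absolute loading of each expert on the first principal
    component, normalised to sum to 1, serves as that expert's weight.
    """
    J = np.asarray(J, float)
    Jc = (J - J.mean(axis=0)) / J.std(axis=0)   # standardise each expert
    corr = np.cov(Jc, rowvar=False)             # expert-by-expert correlation
    vals, vecs = np.linalg.eigh(corr)
    w = np.abs(vecs[:, -1])                     # loadings on the leading PC
    return w / w.sum()

# e.g. 6 items scored by 3 experts:
rng = np.random.default_rng(2)
scores = rng.uniform(1, 9, size=(6, 3))
print(expert_weights_pca(scores).round(3))
```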

18.
Application of principal component analysis to metric-based software risk assessment
Software risk assessment methods generally fall into two categories: subjective and objective. Subjective methods rely largely on expert knowledge and managers' experience, whereas objective methods start from the intrinsic attributes of the software product itself and proceed by measuring characteristics of the product such as its complexity. This paper studies the application of principal component analysis, a modern statistical analysis technique, to metric-based software risk assessment, and presents a worked example for software developed with object-oriented methods. The approach can help software developers and managers identify the high-risk modules of a software product during project management, making it easier to carry out risk management effectively.

19.
On the failure of principal component analysis in environmental quality assessment
In environmental quality assessment, principal component analysis is regarded as an objective and practical evaluation method. Extensive data analysis shows, however, that the evaluation results of principal component analysis do not necessarily accord with the actual situation and sometimes fail completely. A series of problems exists, and these problems are determined by the intrinsic properties of principal component analysis itself.

20.
Near infrared (NIR) spectroscopy is a rapid, non-destructive technology for predicting a variety of wood properties, and it provides great opportunities to optimize manufacturing processes through in-line assessment of forest products. In this paper, a novel multivariate regression procedure, a hybrid model of principal component regression (PCR) and partial least squares (PLS), is proposed to develop more accurate prediction models for high-dimensional NIR spectral data. To integrate the merits of PCR and PLS, both the principal components defined in PCR and the latent variables in PLS are utilized in the hybrid models through a common iterative procedure, under the constraint that they remain orthogonal to each other. In addition, we propose a modified sequential forward floating search method, which originated in feature selection for classification problems, to overcome the difficulty of searching the vast number of possible hybrid models. The effectiveness and efficiency of the hybrid models are substantiated by experiments with three real-life datasets of forest products. The proposed hybrid approach can be applied in a wide range of applications involving high-dimensional spectral data.
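As background rather than the paper's hybrid, the sketch below compares plain PCR and PLS on simulated high-dimensional "spectra" with scikit-learn; the data and component counts are placeholders:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Simulated spectra: 120 samples, 600 wavelengths; y depends on a few bands.
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 600))
y = X[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(120)

pcr = make_pipeline(PCA(n_components=10), LinearRegression())
pls = PLSRegression(n_components=10)
for name, model in [("PCR", pcr), ("PLS", pls)]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {r2:.3f}")
```

The hybrid the paper proposes would mix the two kinds of components within one model; the baseline above only shows the two ingredients separately.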
