Similar literature
20 similar documents found (search time: 125 ms)
1.
An enterprise resource planning (ERP) system selection method based on fuzzy data envelopment analysis (DEA) is proposed. Taking implementation complexity, estimated implementation cost, degree of functional fit, and the vendor's corporate image as the main evaluation criteria, fuzzy set theory and related methods are used to characterize the uncertainty in the selection process and to evaluate the relative efficiency of candidate systems objectively, thereby addressing the uncertainty and fuzziness inherent in system selection. An empirical study illustrates the application of the method.
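As an illustration of the DEA building block only (the paper's fuzzy extension is not reproduced), a minimal sketch of the crisp input-oriented CCR multiplier model solved with scipy; the candidate-system data are made up for the example.

```python
# Minimal sketch of the crisp input-oriented CCR DEA multiplier model.
# The candidate-system data below are hypothetical, for illustration only;
# the paper's fuzzy DEA extension is not reproduced here.
import numpy as np
from scipy.optimize import linprog

# rows = candidate ERP systems (DMUs); columns = inputs / outputs
X = np.array([[3.0, 120.0],   # inputs: implementation complexity, estimated cost
              [2.0, 150.0],
              [4.0, 100.0]])
Y = np.array([[0.8, 0.7],     # outputs: functional fit, vendor image (scores)
              [0.9, 0.6],
              [0.7, 0.9]])

def ccr_efficiency(o):
    n, m = X.shape
    _, s = Y.shape
    # decision vector z = [u (s output weights), v (m input weights)]
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximize u.y_o
    A_ub = np.hstack([Y, -X])                          # u.y_j - v.x_j <= 0 for all j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]   # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun                                    # CCR efficiency score

for o in range(X.shape[0]):
    print(f"candidate {o}: efficiency = {ccr_efficiency(o):.3f}")
```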

2.
Application of set pair analysis from uncertainty theory to eutrophication assessment of seawater quality (cited by 3: 0 self-citations, 3 by others)
Set pair analysis (SPA) is a relatively new systems-theoretic method for handling uncertainty. Based on this theory, a new method for comprehensive assessment of seawater eutrophication is established. A case study shows that SPA is practical for grading the degree of seawater eutrophication. Compared with artificial neural networks and support vector machines, SPA is conceptually clear, simple and fast to compute, accurate, highly discriminating, and broadly applicable, and it reduces subjective human factors in the assessment. It also preserves intermediate information in the data and yields results that agree better with reality, providing a simple and practical tool for comprehensive assessment of marine environmental quality.
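A minimal sketch of ternary set pair analysis grading: the indicator thresholds and sample values below are hypothetical, and the simple count-based identity/discrepancy/contrary scheme is only one basic SPA variant, not necessarily the exact formulation used in the paper.

```python
# Minimal sketch of ternary set pair analysis (SPA) grading.  The grade
# thresholds, sample values, and the simple counting scheme are illustrative
# only and do not reproduce the paper's exact formulation.
import numpy as np

# eutrophication indicators (higher = worse), upper limits for grades I..IV
grade_limits = {
    "COD": [1.0, 3.0, 4.0, 5.0],
    "DIN": [0.2, 0.3, 0.4, 0.5],
    "DIP": [0.015, 0.030, 0.045, 0.060],
}
sample = {"COD": 2.6, "DIN": 0.45, "DIP": 0.02}   # measured values (hypothetical)

def grade_of(value, limits):
    """Index of the first grade whose upper limit the value does not exceed."""
    for k, upper in enumerate(limits):
        if value <= upper:
            return k
    return len(limits) - 1

def connection_degree(sample, grade_limits, k, i=0.0, j=-1.0):
    """mu_k = a + b*i + c*j for candidate grade k."""
    a = b = c = 0
    for name, value in sample.items():
        g = grade_of(value, grade_limits[name])
        if g == k:              # indicator agrees with grade k -> identity
            a += 1
        elif abs(g - k) == 1:   # one grade away                -> discrepancy
            b += 1
        else:                   # two or more grades away       -> contrary
            c += 1
    n = len(sample)
    return (a + b * i + c * j) / n

scores = [connection_degree(sample, grade_limits, k) for k in range(4)]
print("connection degrees for grades I-IV:", np.round(scores, 3))
print("assessed grade:", "I II III IV".split()[int(np.argmax(scores))])
```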

3.
刘志勇  王清印 《经济数学》2005,22(2):172-176
An economic system is a dynamic system permeated by various uncertain factors, and risk arises from uncertainty. Introducing uncertain-systems theory into economic-system analysis is therefore of significant theoretical and practical importance. This paper presents the C-type correlation analysis method for handling uncertain information in an integrated way and demonstrates its advantages through an example.

4.
To overcome the limitation that k-factor-based quantification of margins and uncertainties (QMU) applies only to normally distributed experimental data, this paper analyzes the connection between quantiles of a system performance characteristic and the questions that QMU assessment is concerned with. Quantile regression is used to estimate a specified quantile of the performance characteristic together with its confidence limit, yielding a QMU assessment of the experimental data. The new method imposes no requirements on the distributional form of the data or on homogeneity of variance, and so adapts well to the data. A worked example illustrates the approach.
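A minimal sketch of the quantile-regression step using scikit-learn's QuantileRegressor, with a bootstrap percentile interval standing in for the paper's confidence limit; the data, the 0.95 quantile level, and the query point are made up.

```python
# Minimal sketch: estimate the 0.95 quantile of a performance characteristic as
# a function of a load variable via quantile regression, with a bootstrap
# percentile interval standing in for the paper's confidence limit.  The data,
# the quantile level, and the query point x0 = 8 are hypothetical.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=200)
y = 2.0 + 0.5 * x + rng.gumbel(0.0, 0.5 + 0.05 * x)   # skewed, heteroscedastic noise
X = x.reshape(-1, 1)

def q95_at(X, y, x0=8.0, q=0.95):
    """Fit quantile regression and return the estimated q-quantile at x0."""
    model = QuantileRegressor(quantile=q, alpha=0.0, solver="highs").fit(X, y)
    return model.predict([[x0]])[0]

point = q95_at(X, y)
boot = []
for _ in range(200):                                   # bootstrap the quantile estimate
    idx = rng.integers(0, len(y), size=len(y))
    boot.append(q95_at(X[idx], y[idx]))
lo, hi = np.percentile(boot, [5, 95])
print(f"0.95 quantile at x0=8: {point:.2f}  (90% bootstrap interval: [{lo:.2f}, {hi:.2f}])")
```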

5.
For linear uncertain systems with multiple state delays, a memory output-feedback robust H∞ fault-tolerant controller is designed based on Lyapunov stability theory, matrix inequalities, and linear matrix inequality (LMI) methods. The robust H∞ fault-tolerant control problem for multi-delay uncertain systems under actuator failures is studied. A synthesis analysis of the multi-delay memory output-feedback robust H∞ fault-tolerant controller is given first; sufficient conditions for asymptotic stability of the multi-delay system with parameter uncertainties are then derived, together with a design method for a robust H∞ fault-tolerant controller that satisfies a given performance index. Simulation results demonstrate the effectiveness and feasibility of the method.
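A minimal sketch of the underlying LMI machinery, solved with cvxpy: it only certifies asymptotic stability of a nominal delay-free system by finding a Lyapunov matrix P ≻ 0 with AᵀP + PA ≺ 0, not the full multi-delay H∞ fault-tolerant synthesis described in the paper; the system matrix is made up.

```python
# Minimal sketch of the LMI building block: certify asymptotic stability of a
# nominal (delay-free, certain) system x' = A x by finding P > 0 such that
# A'P + PA < 0.  This falls far short of the paper's multi-delay robust
# H-infinity fault-tolerant synthesis; the system matrix A is hypothetical.
import numpy as np
import cvxpy as cp

A = np.array([[-2.0,  1.0],
              [ 0.5, -3.0]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-3
constraints = [P >> eps * np.eye(n),                   # P positive definite
               A.T @ P + P @ A << -eps * np.eye(n)]    # Lyapunov inequality
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)

if prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE):
    print("LMI feasible: the nominal system is asymptotically stable")
    print("Lyapunov matrix P =\n", np.round(P.value, 4))
else:
    print("LMI infeasible or solver failed:", prob.status)
```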

6.
To identify the key input parameters that drive uncertainty in maintenance-support information analysis results, an entropy-based uncertainty sensitivity analysis method is proposed. The forms, measures, and sources of uncertainty in model outputs are first analyzed; sensitivity parameters and a sensitivity analysis procedure based on the idea of conditional entropy are then given; the application of blind number theory to computing with uncertain data is briefly introduced; finally, a case study on a mean-time-to-repair model illustrates the applicability of the entropy-based method and shows that uncertainty sensitivity analysis is not equivalent to taking derivatives.
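A minimal sketch of an entropy-style sensitivity screen using mutual information, which measures the reduction in output entropy given an input, H(Y) − H(Y|X); it uses scikit-learn's estimator rather than the paper's blind-number machinery, and the mean-repair-time model below is made up.

```python
# Minimal sketch: rank input parameters of a (hypothetical) mean-repair-time
# model by mutual information with the output, i.e. by the reduction in output
# entropy H(Y) - H(Y|X_i).  This stands in for the paper's conditional-entropy /
# blind-number procedure, which is not reproduced here.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
N = 5000
diagnose = rng.lognormal(mean=0.0, sigma=0.6, size=N)   # fault diagnosis time (h)
repair   = rng.lognormal(mean=0.5, sigma=0.3, size=N)   # hands-on repair time (h)
wait     = rng.exponential(scale=0.2, size=N)           # spare-part wait time (h)

mttr = diagnose + repair + wait                         # toy output model
X = np.column_stack([diagnose, repair, wait])

mi = mutual_info_regression(X, mttr, random_state=0)
for name, score in sorted(zip(["diagnose", "repair", "wait"], mi), key=lambda t: -t[1]):
    print(f"{name:8s}  mutual information = {score:.3f} nats")
```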

7.
A fuzzy-whitening grey relational model for equipment system evaluation (cited by 7: 0 self-citations, 7 by others)
Grey system theory is an effective approach in modern engineering for dealing with data uncertainty and insufficient data. Evaluation information for equipment systems typically exhibits both fuzziness and greyness. The fuzzy-whitening grey relational model originated in supportability evaluation of weapon systems; extending its scope to weapon system evaluation in general, a fuzzy-whitening grey relational model for equipment system evaluation is proposed. After fuzzy whitening of the qualitative indicators, grey relational analysis is performed on the whole system under evaluation to obtain the evaluation result.
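A minimal sketch of the grey relational analysis step; the fuzzy whitening of qualitative indicators is assumed to have already produced the crisp scores below, which are hypothetical.

```python
# Minimal sketch of grey relational analysis (GRA): rank candidate equipment
# systems by their grey relational grade to an ideal reference sequence.
# The (already fuzzy-whitened) indicator scores are hypothetical.
import numpy as np

# rows = candidate systems, columns = benefit-type indicators in [0, 1]
scores = np.array([[0.85, 0.70, 0.90, 0.60],
                   [0.75, 0.95, 0.65, 0.80],
                   [0.60, 0.80, 0.85, 0.90]])

reference = scores.max(axis=0)                 # ideal sequence: best value per indicator
delta = np.abs(scores - reference)             # absolute differences
rho = 0.5                                      # distinguishing coefficient
coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grade = coef.mean(axis=1)                      # equal indicator weights

for i, g in enumerate(grade):
    print(f"system {i}: grey relational grade = {g:.3f}")
print("ranking (best first):", np.argsort(-grade))
```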

8.
Robustness analysis of two classes of time-delay systems (cited by 1: 0 self-citations, 1 by others)
The robust stability problem for two classes of time-delay systems with parameter uncertainty is considered, and several necessary and sufficient conditions for robust performance are given. These results are of theoretical and practical value for the study of feedback control systems with time delays.

9.
The measurement uncertainty of the tensile strength of steel reinforcement bars comprises components from the bar diameter, the applied tensile force, the repeatability of the test results, and the rounding of data, among others; measurement uncertainty is therefore tied to the measurement data. The connection number is a systems-mathematics construct for handling uncertainty, and measurement uncertainty can be expressed with connection numbers. Accordingly, a new connection-number-based method for evaluating the measurement uncertainty of rebar tensile strength is proposed.

10.
李伯忍  曾金平 《应用数学》2015,28(3):617-627
This paper studies the non-fragile robust stabilization problem for uncertain nonlinear systems with multiple time delays. The parameter uncertainties in the state and input matrices are assumed to be norm-bounded; the delays are unknown but vary with time within known bounds. The non-fragile robust stabilization problem is to design a memory state-feedback controller, itself subject to gain perturbations, such that the closed-loop system is robustly stable for all admissible parameter uncertainties. Using a Lyapunov functional and the free-weighting matrix method, a delay-dependent sufficient stabilization criterion is obtained in the form of linear matrix inequalities (LMIs). A numerical example illustrates the effectiveness of the theoretical results.

11.
Chance constrained uncertain classification via robust optimization (cited by 1: 0 self-citations, 1 by others)
This paper studies the problem of constructing robust classifiers when the training data are plagued with uncertainty. The problem is posed as a Chance-Constrained Program (CCP) which ensures that the uncertain data points are classified correctly with high probability. Unfortunately such a CCP turns out to be intractable. The key novelty is in employing Bernstein bounding schemes to relax the CCP as a convex second order cone program whose solution is guaranteed to satisfy the probabilistic constraint. Prior to this work, only Chebyshev based relaxations were exploited in learning algorithms. Bernstein bounds employ richer partial information and hence can be far less conservative than Chebyshev bounds. Due to this efficient modeling of uncertainty, the resulting classifiers achieve higher classification margins and hence better generalization. Methodologies for classifying uncertain test data points and error measures for evaluating classifiers robust to uncertain data are discussed. Experimental results on synthetic and real-world datasets show that the proposed classifiers are better equipped to handle data uncertainty and outperform the state of the art in many cases.
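A minimal sketch of the kind of second-order cone program involved, solved with cvxpy: each uncertain training point is summarized by a mean and covariance and classified robustly via a Chebyshev-style SOC constraint, which is the simpler relaxation this paper improves on with Bernstein bounds; the synthetic data are made up.

```python
# Minimal sketch of a robust (chance-constraint style) linear classifier posed
# as a second-order cone program in cvxpy: each uncertain training point is
# summarized by a mean and covariance and must be classified with margin even
# under a Chebyshev-style perturbation.  This is the simpler relaxation the
# paper improves on with Bernstein bounds; the synthetic data are illustrative.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n_per_class, dim, kappa, C = 20, 2, 1.5, 1.0

mus = np.vstack([rng.normal([ 2.0,  2.0], 0.5, (n_per_class, dim)),
                 rng.normal([-2.0, -2.0], 0.5, (n_per_class, dim))])
labels = np.array([1] * n_per_class + [-1] * n_per_class)
covs = [0.1 * np.eye(dim) for _ in labels]              # per-point covariance

w, b = cp.Variable(dim), cp.Variable()
xi = cp.Variable(len(labels), nonneg=True)

constraints = []
for i, (mu, y, S) in enumerate(zip(mus, labels, covs)):
    S_half = np.linalg.cholesky(S)
    # y_i (w' mu_i + b) >= 1 - xi_i + kappa * || S_i^{1/2} w ||_2
    constraints.append(y * (w @ mu + b) >= 1 - xi[i] + kappa * cp.norm(S_half @ w))

prob = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi)), constraints)
prob.solve()

print("status:", prob.status)
print("w =", np.round(w.value, 3), " b =", round(float(b.value), 3))
```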

12.
Ground-to-air radar countermeasures are increasingly important in modern warfare, and evaluating the training level of ground-to-air radar countermeasure equipment operators has become a pressing problem. Based on an analysis of the training content, an indicator system for evaluating ground-to-air radar countermeasure training is constructed. The analytic hierarchy process (AHP) is used to determine each expert's individual indicator weights, which are then aggregated into group indicator weights; the grey relational grade (GRG) between each expert's individual weights and the group weights measures their closeness and thus determines the expert weights, and an adaptive weight-adjustment procedure updates the expert weights and group indicator weights until the final indicator weights are obtained. An example illustrates the practicality of the method.
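A minimal sketch of the AHP step for one expert: indicator weights are taken as the principal eigenvector of a pairwise comparison matrix, with a consistency-ratio check; the comparison matrix is made up, and the GRG-based expert weighting is not reproduced.

```python
# Minimal sketch of the AHP step: derive one expert's indicator weights from a
# pairwise comparison matrix via the principal eigenvector, and check the
# consistency ratio.  The comparison matrix below is hypothetical; the paper's
# GRG-based aggregation of expert weights is not reproduced here.
import numpy as np

# Saaty-scale pairwise comparisons among 4 training-evaluation indicators
A = np.array([[1.0, 3.0, 5.0, 2.0],
              [1/3, 1.0, 3.0, 1/2],
              [1/5, 1/3, 1.0, 1/4],
              [1/2, 2.0, 4.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # normalized indicator weights

n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)                   # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
CR = CI / RI                                   # consistency ratio (should be < 0.1)

print("indicator weights:", np.round(w, 3))
print(f"consistency ratio CR = {CR:.3f} ({'acceptable' if CR < 0.1 else 'revise judgments'})")
```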

13.
In this note we present a system (OR-Library) that distributes test problems by electronic mail (e-mail). This system currently has available test problems drawn from a number of different areas of operational research.

14.
Green product development has become a key strategic consideration for many companies due to regulatory requirements and public awareness of environmental protection. Life cycle assessment (LCA) is a popular tool for measuring the environmental impact of new product development. Nevertheless, it is often difficult to conduct a traditional LCA at the design phase due to uncertain and/or unknown data. This research adopts the concept of LCA and introduces a comprehensive method that integrates Fuzzy Extent Analysis and Fuzzy TOPSIS for assessing the environmental performance of different product designs. Methodologically, it combines the strengths of a hierarchical structure with the ease of TOPSIS implementation whilst capturing the vagueness of uncertain data. A case study concerning a consumer electronic product is presented, and data collected through a questionnaire survey are used for the design evaluation. The approach presented in this research is expected to help companies decrease development lead time by screening out poor design options.
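A minimal sketch of the TOPSIS ranking step using crisp scores; the paper's fuzzy extent analysis and fuzzy TOPSIS are not reproduced, and the design alternatives, criteria, and weights are made up.

```python
# Minimal sketch of crisp TOPSIS for ranking product-design alternatives by
# environmental performance.  The paper uses fuzzy extent analysis and fuzzy
# TOPSIS; this crisp version and the data below are illustrative only.
import numpy as np

# rows = design alternatives, columns = criteria
# criteria: energy use, material toxicity (cost-type); recyclability (benefit-type)
D = np.array([[12.0, 3.0, 0.70],
              [ 9.0, 5.0, 0.85],
              [15.0, 2.0, 0.60]])
weights = np.array([0.4, 0.3, 0.3])
benefit = np.array([False, False, True])

R = D / np.linalg.norm(D, axis=0)              # vector-normalize each criterion
V = R * weights                                # weighted normalized matrix

ideal      = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)       # relative closeness to the ideal

for i, c in enumerate(closeness):
    print(f"design {i}: closeness = {c:.3f}")
print("ranking (best first):", np.argsort(-closeness))
```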

15.
Obtaining reliable estimates of the parameters of a probabilistic classification model is often challenging because the amount of available training data is limited. In this paper, we present a classification approach based on belief functions that makes the uncertainty resulting from limited amounts of training data explicit and thereby improves classification performance. In addition, we model classification as an active information acquisition problem in which features are sequentially selected by maximizing the expected information gain with respect to the current belief distribution, thus reducing uncertainty as quickly as possible. For this, we consider different measures of uncertainty for belief functions and provide efficient algorithms for computing them. As a result, only a small subset of features needs to be extracted without negatively impacting the recognition rate. We evaluate our approach on an object recognition task where we compare different evidential and Bayesian methods for obtaining likelihoods from training data and investigate the influence of different uncertainty measures on the feature selection process.

16.
Research on methods for evaluating the competence and quality of military personnel in China (cited by 1: 0 self-citations, 1 by others)
Focusing on evaluating the competence and quality of various types of military personnel in China, and taking a particular unit as the empirical subject, an indicator system and evaluation method for military personnel assessment are constructed in accordance with the General Political Department's documents on personnel evaluation and the demands that future information-based warfare places on personnel; the main current channels of military personnel training are also compared. The results show that a scientifically constructed indicator system and evaluation method allow the competence and quality of different kinds of personnel to be assessed scientifically and quantitatively, providing a sound basis for selecting and employing them. The results also show that training personnel within the military and entrusting training to civilian research institutes each have advantages and disadvantages, so resources, information, and other aspects need to be integrated and made complementary.

17.
This paper considers the problem of uncertainty in reliability data that are used in many decision making processes and describes a simulation approach to dealing explicitly with this uncertainty. The causes of uncertainty are discussed for three different situations: (a) in the process of designing new systems, when failure data are not yet available, (b) after performing reliability tests and gathering failure data, and (c) in mission reliability prediction. It is concluded that, in performing reliability prediction or failure rate prediction, one should use interval estimates rather than point estimates. These intervals can be used to perform component classification and then, employing simulation, to obtain tables or scattergrams for the mean time between failures of a system or for system reliability.
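A minimal sketch of the interval-plus-simulation idea: component failure rates are given as intervals rather than point estimates, sampled by Monte Carlo, and propagated to a distribution of system MTBF and mission reliability; the component data and the series-system assumption are made up for illustration.

```python
# Minimal sketch: propagate interval estimates of component failure rates
# through Monte Carlo simulation to obtain a distribution (rather than a point
# estimate) of system MTBF and mission reliability.  The three components,
# their intervals, and the series-system / exponential-life assumptions are
# hypothetical.
import numpy as np

rng = np.random.default_rng(7)
intervals = np.array([[1e-4, 3e-4],      # failure-rate intervals (failures/hour)
                      [5e-5, 2e-4],
                      [2e-4, 6e-4]])
n_samples, mission_time = 20000, 1000.0  # number of MC samples, mission length (h)

# sample each component's failure rate uniformly from its interval
lam = rng.uniform(intervals[:, 0], intervals[:, 1], size=(n_samples, len(intervals)))
lam_sys = lam.sum(axis=1)                # series system of exponential components
mtbf = 1.0 / lam_sys
reliability = np.exp(-lam_sys * mission_time)

print("system MTBF (h), 5/50/95 percentiles:",
      np.round(np.percentile(mtbf, [5, 50, 95]), 1))
print("mission reliability, 5/50/95 percentiles:",
      np.round(np.percentile(reliability, [5, 50, 95]), 4))
```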

18.
This paper develops a framework for examining the effect of demand uncertainty and forecast error on unit costs and customer service levels in the supply chain, including Material Requirements Planning (MRP) type manufacturing systems. The aim is to overcome the methodological limitations and confusion that have arisen in much earlier research. To illustrate the issues, the problem of estimating the value of improving forecasting accuracy for a manufacturer was simulated. The topic is of practical importance because manufacturers spend large sums of money on purchasing and staffing forecasting support systems to achieve more accurate forecasts. To estimate this value, a two-level MRP system with lot sizing, where the product is manufactured for stock, was simulated. Final product demand was generated by two commonly occurring stochastic processes and with different variances. Different levels of forecasting error were then introduced to arrive at corresponding values of improving forecasting accuracy. The quantitative estimates of improved accuracy were found to depend on both the demand generating process and the forecasting method. Within this more complete framework, the substantive results confirm earlier research that the best lot sizing rules for the deterministic situation are the worst whenever there is uncertainty in demand. However, size matters, both in the demand uncertainty and in the forecasting errors. The quantitative differences depend on the service level and also on the form of demand uncertainty. Unit costs for a given service level increase exponentially as the uncertainty in the demand data increases. The paper also estimates the effects of mis-specification of different sizes of forecast error in addition to demand uncertainty. In manufacturing problems with high demand uncertainty and high forecast error, improved forecast accuracy should lead to substantial percentage improvements in unit costs. Methodologically, the results demonstrate the need to simulate demand uncertainty and the forecasting process separately.
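A small sketch in the same spirit: demand uncertainty and forecast error are simulated as separate inputs, and their effect on service level and unit cost is measured under a simple order-up-to policy, a much simpler setting than the paper's two-level MRP system with lot sizing; all parameters are made up.

```python
# Small sketch in the spirit of the study: demand uncertainty and forecast
# error are simulated as separate inputs, and their effect on service level and
# unit cost is measured under a simple order-up-to policy.  This is far simpler
# than the paper's two-level MRP system with lot sizing; all parameters are
# made up.
import numpy as np

rng = np.random.default_rng(42)

def simulate(demand_std, forecast_err_std, periods=5000,
             mu=100.0, lead_time=2, h=1.0, p=9.0):
    demand = np.maximum(0.0, rng.normal(mu, demand_std, periods))
    forecast = demand + rng.normal(0.0, forecast_err_std, periods)  # unbiased error
    safety = 1.64 * forecast_err_std * np.sqrt(lead_time)           # rough safety stock
    inventory, pipeline = mu * lead_time, [0.0] * lead_time
    served = cost = 0.0
    for t in range(periods):
        inventory += pipeline.pop(0)                 # receive order placed lead_time ago
        target = forecast[t] * lead_time + safety    # order-up-to level
        pipeline.append(max(0.0, target - inventory - sum(pipeline)))
        sold = min(inventory, demand[t])
        served += sold
        inventory -= sold
        cost += h * inventory + p * (demand[t] - sold)   # holding + shortage penalty
    return served / demand.sum(), cost / demand.sum()    # fill rate, cost per unit

for d_std in (5, 20):
    for f_std in (0, 10, 30):
        fill, unit_cost = simulate(d_std, f_std)
        print(f"demand std={d_std:3d}  forecast error std={f_std:3d}  "
              f"fill rate={fill:.3f}  cost/unit={unit_cost:.2f}")
```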

19.
One of the open problems in the field of forward uncertainty quantification (UQ) is the ability to form accurate assessments of uncertainty having only incomplete information about the distribution of random inputs. Another challenge is to efficiently make use of limited training data for UQ predictions of complex engineering problems, particularly with high dimensional random parameters. We address these challenges by combining data-driven polynomial chaos expansions with a recently developed preconditioned sparse approximation approach for UQ problems. The first task in this two-step process is to employ the procedure developed in [1] to construct an "arbitrary" polynomial chaos expansion basis using a finite number of statistical moments of the random inputs. The second step is a novel procedure to effect sparse approximation via l1 minimization in order to quantify the forward uncertainty. To enhance the performance of the preconditioned l1 minimization problem, we sample from the so-called induced distribution, instead of using Monte Carlo (MC) sampling from the original, unknown probability measure. We demonstrate on test problems that induced sampling is a competitive and often better choice compared with sampling from asymptotically optimal measures (such as the equilibrium measure) when we have incomplete information about the distribution. We demonstrate the capacity of the proposed induced sampling algorithm via sparse representation with limited data on test functions, and on a Kirchhoff plate bending problem with random Young's modulus.
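A minimal sketch of the sparse-recovery step: a Legendre polynomial chaos basis for a uniform input and an l1-regularized (Lasso) fit from a small sample; this uses a fixed Legendre basis and plain Monte Carlo rather than the paper's data-driven "arbitrary" basis, preconditioning, and induced sampling, and the test function is made up.

```python
# Minimal sketch of sparse polynomial chaos: expand a scalar test function of a
# uniform random input in a Legendre basis and recover the coefficients from a
# small sample via l1-regularized regression (Lasso).  This uses a fixed
# Legendre basis and plain Monte Carlo sampling, not the paper's data-driven
# ("arbitrary") basis, preconditioning, or induced sampling; the test function
# is hypothetical.
import numpy as np
from numpy.polynomial import legendre
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
f = lambda x: np.exp(0.5 * x) + 0.3 * x**3          # test model, x ~ U(-1, 1)

degree, n_train = 12, 40                            # 13 basis functions, 40 samples
x_train = rng.uniform(-1, 1, n_train)
Phi = legendre.legvander(x_train, degree)           # Legendre design matrix

pce = Lasso(alpha=1e-4, max_iter=50000, fit_intercept=False).fit(Phi, f(x_train))

# validate the surrogate on fresh samples
x_test = rng.uniform(-1, 1, 2000)
y_hat = legendre.legvander(x_test, degree) @ pce.coef_
rel_err = np.linalg.norm(y_hat - f(x_test)) / np.linalg.norm(f(x_test))
print(f"nonzero PCE coefficients: {np.sum(np.abs(pce.coef_) > 1e-8)} / {degree + 1}")
print(f"relative validation error: {rel_err:.2e}")
```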

20.
Recently, Fuzzy Grey Cognitive Maps (FGCMs) have been proposed as an extension of Fuzzy Cognitive Maps (FCMs). They are based on Grey System Theory, which has become a very effective theory for solving problems in environments with high uncertainty and with discrete, small, and incomplete data sets. The proposed approach to learning FGCMs applies a Nonlinear Hebbian Learning (NHL) based algorithm to determine the success of the radiation therapy process by estimating the final dose delivered to the target volume. The scope of this research is to explore an alternative decision support method that uses the main aspects of fuzzy logic and grey systems to cope with the uncertainty inherent in the medical domain and with physicians' uncertainty in numerically describing the influences among concepts in the medical domain. The Supervisor-FGCM, trained by the NHL algorithm adapted to FGCMs, determines the treatment variables of cancer therapy and the acceptance level of the final radiation dose to the target volume. Three clinical case studies were used to test the proposed methodology, with meaningful and promising results that demonstrate the efficiency of the NHL algorithm for the FGCM approach.
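A small sketch of the cognitive-map machinery: fuzzy cognitive map inference with a sigmoid activation plus a simplified nonlinear-Hebbian-style weight update; the concepts, the initial weight matrix, and the simplified update rule are made up for illustration and do not reproduce the paper's grey (FGCM) formulation with interval weights.

```python
# Small sketch of the cognitive-map machinery: fuzzy cognitive map (FCM)
# inference with a sigmoid activation plus a simplified nonlinear-Hebbian-style
# weight update.  The concepts, initial weights, and the simplified update rule
# are illustrative only; the paper's grey (FGCM) formulation with interval
# weights is not reproduced here.
import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# concepts: C0 tumor volume, C1 beam energy, C2 delivered dose (output)
W = np.array([[0.0, 0.0, 0.4],     # W[i, j]: influence of concept i on concept j
              [0.0, 0.0, 0.6],
              [0.0, 0.0, 0.0]])
A = np.array([0.7, 0.5, 0.5])      # initial concept activations in [0, 1]

eta, gamma = 0.05, 0.98            # learning rate and weight-decay factor
for step in range(30):
    A_new = sigmoid(A + A @ W)                     # FCM inference step
    # simplified nonlinear-Hebbian-style update of the nonzero weights
    for i, j in zip(*np.nonzero(W)):
        W[i, j] = gamma * W[i, j] + eta * A_new[j] * (A_new[i] - W[i, j] * A_new[j])
    A = A_new

print("final concept activations:", np.round(A, 3))
print("adapted weights into the dose concept:", np.round(W[:, 2], 3))
```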
