Similar Articles
1.
Credit scoring is a risk-evaluation task considered a critical decision for financial institutions, since a wrong decision may result in huge losses. Classification models are one of the most widely used groups of data mining approaches; they greatly help decision makers and managers reduce the credit risk of granting credit to customers, replacing intuitive experience or portfolio management. Accuracy is one of the most important criteria for choosing a credit-scoring model, and hence research aimed at improving the effectiveness of credit-scoring models has never stopped. In this article, a hybrid binary classification model, namely FMLP, is proposed for credit scoring, based on the basic concepts of fuzzy logic and artificial neural networks (ANNs). In the proposed model, fuzzy numbers are used instead of the crisp weights and biases of traditional multilayer perceptrons (MLPs), in order to better model the uncertainties and complexities of financial data sets. Empirical results on three well-known benchmark credit data sets indicate that the proposed hybrid model outperforms its components as well as other classification models such as support vector machines (SVMs), K-nearest neighbor (KNN), quadratic discriminant analysis (QDA), and linear discriminant analysis (LDA). It can therefore be concluded that the proposed model is an appropriate alternative tool for financial binary classification problems, especially under high uncertainty. © 2013 Wiley Periodicals, Inc. Complexity 18: 46–57, 2013
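The central idea of FMLP, replacing crisp weights and biases with fuzzy numbers, can be sketched with triangular fuzzy numbers. The single-neuron setup, the componentwise sigmoid, and the centroid defuzzification below are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

def tfn(l, m, u):
    """Triangular fuzzy number as (lower, mode, upper)."""
    return np.array([l, m, u], dtype=float)

def tfn_scale(w, x):
    """Multiply a triangular fuzzy weight by a crisp input: components
    scale directly for x >= 0; for x < 0 the bounds swap order."""
    y = w * x
    return y if x >= 0 else y[::-1]

def fuzzy_neuron(weights, bias, inputs):
    """Fuzzy weighted sum followed by a sigmoid applied componentwise
    (a monotone activation preserves the l <= m <= u ordering)."""
    s = bias.copy()
    for w, x in zip(weights, inputs):
        s = s + tfn_scale(w, x)
    return 1.0 / (1.0 + np.exp(-s))

def defuzzify(y):
    """Centroid of a triangular fuzzy number."""
    return float(y.sum()) / 3.0

# hypothetical fuzzy weights for a two-feature credit applicant
weights = [tfn(0.4, 0.5, 0.6), tfn(-0.3, -0.2, -0.1)]
bias = tfn(-0.1, 0.0, 0.1)
score = defuzzify(fuzzy_neuron(weights, bias, [1.0, 2.0]))  # crisp score in (0, 1)
```

Training such a network means learning all three parameters of every weight; the width u - l expresses how uncertain each learned coefficient is.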

2.
3.
This paper discusses a principal–agent problem with multi-dimensional incomplete information between a principal and an agent. Firstly, how to describe the incomplete information in such an agency problem is a challenging issue. This paper characterizes the incomplete information by uncertain variables, an appropriate tool for depicting subjective assessment and modelling human uncertainty. Secondly, the relevant literature has often used expected-utility maximization to measure the two participants' goals. However, the Ellsberg paradox indicates that the expected-utility criterion is not always an appropriate decision rule. For this reason, this paper presents another decision rule based on confidence level. Instead of maximizing expected utility, the principal aims to maximize his potential income under an acceptable confidence level, and the agent's aim depends on whether he has private information about his efforts. According to the agent's different decision rules, three classes of uncertain agency (UA) models and their respective optimal contracts are presented. Finally, a portfolio selection problem is studied to demonstrate the modelling idea and the viability of the proposed UA models.

4.
We propose a framework for building graphical decision models from individual causal mechanisms. Our approach is based on the work of Simon [Simon, H.A., 1953. Causal ordering and identifiability. In: Hood, W.C., Koopmans, T.C. (Eds.), Studies in Econometric Method. Cowles Commission for Research in Economics. Monograph No. 14. John Wiley and Sons Inc., New York, NY, pp. 49–74 (Ch. III)], who proposed a causal ordering algorithm for explicating causal asymmetries among variables in a self-contained set of structural equations. We extend the causal ordering algorithm to under-constrained sets of structural equations, common during the process of problem structuring. We demonstrate that the causal ordering explicated by our extension is an intermediate representation of a modeler’s understanding of a problem and that the process of model construction consists of assembling mechanisms into self-contained causal models. We describe ImaGeNIe, an interactive modeling tool that supports mechanism-based model construction and demonstrate empirically that it can effectively assist users in constructing graphical decision models.

5.
In recent years there has been extensive development of fire computer models, and their use in fire safety studies, fire investigation, etc. has increased. The most important types of fire computer model are the field model and the zone model. The former approximates fire dynamics better, but the latter requires less computational time. In addition, great advances in information processing using artificial neural networks have recently made them a useful tool in very diverse fields. This paper analyzes the possibility of developing a new fire computer model using artificial neural networks. In a first approach to this objective, a simple compartment was analyzed with a field model. After that, simulations employing a General Regression Neural Network (GRNN) were performed. This method achieves results similar to those of the field model with computational times closer to those of zone models. The neural network was trained with the FDS field model, and the resulting model was validated with data from a full-scale test. In later stages other phenomena and different types of networks will be evaluated.
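A General Regression Neural Network is essentially Nadaraya-Watson kernel regression: each prediction is a Gaussian-weighted average of the training targets. The toy one-dimensional data below stand in for the FDS simulation outputs the authors actually trained on:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """GRNN prediction: weight every training target by a Gaussian
    kernel of its distance to the query, then normalise."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    preds = []
    for x in np.atleast_2d(X_query):
        d2 = np.sum((X_train - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        preds.append(np.dot(w, y_train) / w.sum())
    return np.array(preds)

# toy surrogate: learn y = x^2 from 21 samples, query between them
X = np.linspace(0.0, 1.0, 21).reshape(-1, 1)
y = X[:, 0] ** 2
pred = grnn_predict(X, y, [[0.5]], sigma=0.05)
```

There is no iterative training loop; the whole training set is stored, which is how a GRNN can answer in zone-model time once the expensive field-model runs have produced the data.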

6.
Soft set theory was originally proposed by Molodtsov in 1999 as a general mathematical tool for dealing with uncertainty. Recently, research on decision making based on soft sets has made some progress, but few authors consider multi-expert situations. This paper therefore discusses multi-expert group decision making problems. Firstly, we give the concept of an intuitionistic fuzzy soft matrix (IFSM) and prove some relevant properties of IFSMs. Then, an adjustable approach based on the median level soft set and the p-quantile level soft set is presented for dealing with decision making problems based on an IFSM. Thirdly, we study aggregation methods for IFSMs, giving two kinds of aggregation operators together with programming models for determining the experts' weights under different situations; four corresponding algorithms are proposed as well. Finally, a practical example demonstrates the reasonability and efficiency of these new algorithms.
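One standard way to fuse the (membership, non-membership) entries of several experts' intuitionistic fuzzy soft matrices is the intuitionistic fuzzy weighted average (IFWA) operator; the entries and weights below are invented for illustration, and the paper's own operators may differ:

```python
def ifwa(values, weights):
    """Intuitionistic fuzzy weighted average of (mu, nu) pairs:
    mu_agg = 1 - prod (1 - mu_i)^w_i,  nu_agg = prod nu_i^w_i."""
    one_minus_mu = 1.0
    nu_agg = 1.0
    for (mu, nu), w in zip(values, weights):
        one_minus_mu *= (1.0 - mu) ** w
        nu_agg *= nu ** w
    return 1.0 - one_minus_mu, nu_agg

# two experts' opinions on one alternative, equal weights
agg = ifwa([(0.6, 0.3), (0.8, 0.1)], [0.5, 0.5])
```

The operator keeps the aggregated pair intuitionistic-fuzzy valid: membership plus non-membership stays at most 1.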

7.
Supervised classification learning can be considered an important tool for decision support. In this paper, we present a method for supervised classification learning that ensembles decision trees obtained via convex sets of probability distributions (also called credal sets) and uncertainty measures. Our method forces the use of different decision trees and has mainly the following characteristics: it obtains a good percentage of correct classifications and an improvement in processing time compared with known classification methods; it does not need to fix the number of decision trees to be used; and it can be parallelized for application to very large data sets.
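The credal sets in such methods are often built with the imprecise Dirichlet model (IDM): class counts at a tree node become probability intervals rather than point estimates. A minimal sketch under that assumption, with the uncertainty-measure step reduced to picking the maximum-entropy element of a binary credal set:

```python
def idm_intervals(counts, s=1.0):
    """Imprecise Dirichlet Model: counts -> probability intervals
    [n_i / (N + s), (n_i + s) / (N + s)], where s is the prior weight."""
    N = sum(counts)
    return [(n / (N + s), (n + s) / (N + s)) for n in counts]

def max_entropy_binary(interval):
    """Maximum-entropy element of a binary credal set: take the class
    probability as close to 0.5 as its interval allows."""
    lo, hi = interval
    return min(max(0.5, lo), hi)

intervals = idm_intervals([6, 2], s=1.0)   # 6 vs 2 examples at a node
p_upper_entropy = max_entropy_binary(intervals[0])
```

Scoring candidate splits by the entropy of this distribution, instead of the entropy of the raw frequencies, is what makes the resulting trees differ from classical ones.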

8.
Each optimization problem in the area of natural resources calls for specific validation and verification (V&V) procedures which, for the overwhelming majority of models, have not been developed so far. In this paper we develop V&V procedures for crop planning optimization models in agriculture when the randomness of harvests is considered and complex crop rotation restrictions must hold. We list the criteria for developing V&V processes in this particular case, discuss the restrictions imposed by data availability, and suggest the V&V procedures. To show their relevance, they are applied to a recently constructed stochastic programming model intended to serve as a decision support tool for crop plan optimization on a South Moravian farm. We find that the model is verified and valid; if applied in practice, it thus offers a plausible alternative to the standard decision making routine on farms, which often leads to breaking the crop rotation rules.

9.
A Survey of Decision Mechanism Design in Evolutionary Games
Liu Weibing, Wang Xianjia. Operations Research and Management Science (运筹与管理), 2008, 17(1): 84-87, 105
Evolutionary game theory is a highly interdisciplinary theory that has been widely studied and applied both in China and abroad. This paper systematically reviews the decision mechanisms of evolutionary games and their characteristics, and points out trends in evolutionary game research. Evolutionary game theory can serve as a powerful tool for Chinese researchers to learn, study, and apply.

10.
The management of Operational Risk has been a difficult task due to the lack of data and the high number of variables. In this project, we treat operational risks as multivariate variables. In order to model them, copula functions are employed: a widely used tool in finance and engineering for building flexible joint distributions. The purpose of this research is to propose a new methodology for modelling Operational Risks and estimating the required capital. It combines the use of graphical models and copula functions along with a hyper-Markov law. Historical loss data from an Italian bank are used to explore the methodology's behaviour and its potential benefits.
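The copula step can be sketched as follows: draw correlated standard normals, so that the dependence structure comes entirely from the correlation matrix, and push them through the marginal severity distributions. The lognormal marginals, the correlation, and the sigma values below are invented for illustration; the paper's graphical-model and hyper-Markov machinery is not shown:

```python
import numpy as np

def gaussian_copula_lognormal(corr, sigmas, n, rng):
    """Joint loss samples from a Gaussian copula with lognormal
    marginals: Cholesky-correlated normals z, then exp(sigma * z)."""
    L = np.linalg.cholesky(np.asarray(corr, dtype=float))
    z = rng.standard_normal((n, len(sigmas))) @ L.T
    return np.exp(z * np.asarray(sigmas, dtype=float))

rng = np.random.default_rng(1)
corr = [[1.0, 0.6], [0.6, 1.0]]           # dependence between two risk lines
samples = gaussian_copula_lognormal(corr, [1.0, 0.8], 200_000, rng)
total = samples.sum(axis=1)
var_999 = np.quantile(total, 0.999)       # Basel-style 99.9% capital quantile
```

Swapping in heavier-tailed marginals or a different copula family changes only those two ingredients, which is exactly the flexibility the abstract appeals to.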

11.
Geospatial reasoning has been an essential aspect of military planning since the invention of cartography. Although maps have always been a focal point for developing situational awareness, the dawning era of network-centric operations brings the promise of unprecedented battlefield advantage due to improved geospatial situational awareness. Geographic information systems (GIS) and GIS-based decision support systems are ubiquitous within current military forces, as well as civil and humanitarian organizations. Understanding the quality of geospatial data is essential to using it intelligently. A systematic approach to data quality requires: estimating and describing the quality of data as they are collected; recording the data quality as metadata; propagating uncertainty through models for data processing; exploiting uncertainty appropriately in decision support tools; and communicating to the user the uncertainty in the final product. There are shortcomings in the state-of-the-practice in GIS applications in dealing with uncertainty. No single point solution can fully address the problem. Rather, a system-wide approach is necessary. Bayesian reasoning provides a principled and coherent framework for representing knowledge about data quality, drawing inferences from data of varying quality, and assessing the impact of data quality on modeled effects. Use of a Bayesian approach also drives a requirement for appropriate probabilistic information in geospatial data quality metadata. This paper describes our research on data quality for military applications of geospatial reasoning, and describes model views appropriate for model builders, analysts, and end users.

12.
This paper reports on a decision support system for assigning a liver from a donor to a recipient on a waiting list, which maximises the probability of belonging to the graft-survival class one year after transplant and/or minimises the probability of belonging to the graft-non-survival class, in a two-objective framework. This is done with two neural network classification models obtained from the Pareto front built by a multi-objective evolutionary algorithm called MPENSGA2. This type of neural network is a new model of generalised radial basis functions for obtaining optimal values of C (Correctly Classified Rate) and MS (Minimum Sensitivity) in the classifier, and is compared to other competitive classifiers. The decision support system uses, as simply as possible, those models which lead to the correct decision about recipient choice based on efficient and impartial criteria.

13.
Interval fuzzy preference relation is a useful tool to express decision maker’s uncertain preference information. How to derive the priority weights from an interval fuzzy preference relation is an interesting and important issue in decision making with interval fuzzy preference relation(s). In this paper, some new concepts such as additive consistent interval fuzzy preference relation, multiplicative consistent interval fuzzy preference relation, etc., are defined. Some simple and practical linear programming models for deriving the priority weights from various interval fuzzy preference relations are established, and two numerical examples are provided to illustrate the developed models.
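Under the additive-consistency assumption r_ij = 0.5 + (w_i - w_j)/2 with weights summing to one, the priority weights of a crisp fuzzy preference relation even have a closed form; the paper's linear programming models generalise this to interval-valued entries. A crisp sketch with an invented, exactly consistent matrix:

```python
import numpy as np

def priority_weights_additive(R):
    """Priority weights of an n x n fuzzy preference relation under the
    additive-consistency assumption r_ij = 0.5 + (w_i - w_j) / 2:
    summing row i gives w_i = (2 * rowsum_i - n + 1) / n."""
    R = np.asarray(R, dtype=float)
    n = R.shape[0]
    return (2.0 * R.sum(axis=1) - n + 1.0) / n

R = [[0.5, 0.6, 0.8],
     [0.4, 0.5, 0.7],
     [0.2, 0.3, 0.5]]
w = priority_weights_additive(R)   # weights sum to 1
```

For interval relations one would instead solve a small LP over the lower and upper bounds; applying this formula to interval midpoints is only a rough approximation.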

14.
15.
The efforts spent by researchers in recent years on traffic modelling have focused on modelling the dynamic behaviour of the several components making up a transportation system. In the field of traffic assignment, a large number of models and procedures have been proposed to perform Dynamic Network Loading (DNL), that is, the reproduction of within-day variable link performances once a corresponding Origin/Destination (O/D) demand and users' choice model have been given. These models can be used both to evaluate traffic flows and, more importantly, to simulate the effects of regulation strategies on users' behaviour.

In this paper, after a brief review of the state of the art in this field, a continuous dynamic network loading model is proposed; it removes some of the drawbacks of other packet-approach models proposed in the literature and explicitly allows en-route modification of the followed path. An algorithmic development of the model and a set of applications on test networks are also presented.

16.
DEA Models for Identifying Critical Performance Measures
In performance evaluation, it is important to identify both the efficient frontier and the critical measures. Data envelopment analysis (DEA) has proven an effective tool for estimating efficient frontiers, and the optimized DEA weights may be used to identify the critical measures. However, because DEA optimal weights are not unique, a unique set of critical measures may not be obtained for each decision making unit (DMU). Based upon a set of modified DEA models, this paper develops an approach to identify the critical measures for each DMU. Using four of Fortune's standard performance measures (capital market value, profit, revenue and number of employees), we perform a performance comparison between Fortune's e-corporations and 1000 traditional companies. Profit is identified as the critical measure for the performance of e-corporations, while revenue is the critical measure for the Fortune 1000 companies. This finding confirms that high revenue does not necessarily mean profit for e-corporations, whereas revenue translates into a stable proportion of profit for the Fortune 1000 companies.
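The machinery underneath is the classic CCR model in multiplier form, one small linear program per DMU: maximise the weighted outputs of the unit under evaluation with its weighted inputs normalised to one, while no DMU may score above one under the same weights. A sketch with invented data; the optimal weights u, v are what approaches like the paper's mine for critical measures:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (multiplier form):
    max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0."""
    n, m = X.shape            # n DMUs, m inputs
    s = Y.shape[1]            # s outputs; decision vector is [u, v]
    c = np.concatenate([-Y[o], np.zeros(m)])          # linprog minimises
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None), method="highs")
    return -res.fun

# four hypothetical DMUs, two inputs, one unit output
X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 6.0], [3.0, 3.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
scores = [ccr_efficiency(X, Y, o) for o in range(4)]
```

The multiplicity problem the abstract refers to shows up here directly: for an efficient DMU several (u, v) vectors can attain the optimum, so extra criteria are needed to single out one set of critical measures.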

17.
Computing with words in decision making: foundations, trends and prospects
The Computing with Words (CW) methodology has been used in several different environments to narrow the gap between human reasoning and computing. As decision making is a typical human mental process, it seems natural to apply the CW methodology to create and enrich decision models in which the information that is provided and manipulated has a qualitative nature. In this paper we review the developments of CW in decision making. We begin with an overview of the CW methodology and explore different linguistic computational models that have been applied to the decision making field. Then we present a historical perspective of CW in decision making by examining the pioneering papers in the field along with its most recent applications. Finally, some current trends, open questions and prospects in the topic are pointed out.
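A widely used linguistic computational model in this line of work is the Herrera-Martinez 2-tuple representation, which keeps aggregation results symbolic without losing information; the label set and expert ratings below are invented:

```python
def to_2tuple(beta):
    """Convert a value beta in [0, g] into a 2-tuple: the nearest label
    index plus a symbolic translation alpha in [-0.5, 0.5)."""
    i = int(round(beta))
    return i, beta - i

def from_2tuple(t):
    """Inverse transformation: (i, alpha) -> i + alpha."""
    i, alpha = t
    return i + alpha

def mean_2tuple(tuples):
    """Aggregate 2-tuples by averaging their numeric images."""
    beta = sum(from_2tuple(t) for t in tuples) / len(tuples)
    return to_2tuple(beta)

labels = ["none", "low", "medium", "high", "perfect"]   # S = {s_0, ..., s_4}
# three experts rate one alternative as s_3, s_2, s_3
agg = mean_2tuple([(3, 0.0), (2, 0.0), (3, 0.0)])
verdict = labels[agg[0]]   # "high", with a translation of -1/3
```

The translation term records exactly how far the aggregate falls short of the label "high", which a purely symbolic rounding would discard.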

18.
Thermodiffusion in molten metals, also known as thermotransport, is a phenomenon in which the constituent elements of an alloy separate under the influence of a non-uniform temperature field; it is significant in several applications. However, due to the complex inter-particle interactions, there is no theoretical formulation that can model this phenomenon with adequate accuracy. Keeping in mind the severe deficiencies of present-day thermotransport models and the urgent need for a reliable method in engineering applications ranging from crystal growth to integrated circuit design to nuclear reactor design, an engineering approach has been taken in which neurocomputing principles are employed to develop artificial neural network models to study and quantify the thermotransport phenomenon in binary metal alloys. Unlike any other thermotransport model for molten metals, the neural network approach has been validated for several types of binary alloys, viz. concentrated, dilute, isotopic and non-isotopic metals. Additionally, to establish the soundness of the model and to highlight its potential as a unified computational analysis tool, its ability to capture several thermotransport trends is shown. Comparison with other models from the literature has also been made, indicating superior performance of this technique with respect to several other well-established thermotransport models.

19.
Over recent years, psychological research has increasingly used computer-supported tests, especially in the analysis of complex human decision making and problem solving. The approach is to use computer-based test scenarios, evaluate the performance of participants, and correlate it with certain attributes, such as the participant's capacity to regulate emotions. However, two important questions can only be answered with the help of modern optimization methodology. The first concerns an analysis of the exact situations and decisions that led to a bad or good overall performance of test persons. The second concerns performance itself, as the choices made by humans can only be compared to one another, but not to the optimal solution, which is unknown in general. Additionally, these test scenarios have usually been defined on a trial-and-error basis, until certain characteristics became apparent. The more complex models become, the more likely it is that unforeseen and unwanted characteristics emerge in studies. To overcome this important problem, we propose to use mathematical optimization methodology not only as an analysis and training tool, but also in the design stage of the complex problem scenario. We present a novel test scenario, the IWR Tailorshop, with functional relations and model parameters that have been formulated based on optimization results. We also present a tailored decomposition approach to solve the resulting mixed-integer nonlinear programs with nonconvex relaxations and show some promising results of this approach.

20.
In this paper we consider the notion of dynamic risk measures, which we will motivate as a reasonable tool in risk management. It is possible to reformulate an example of such a risk measure in terms of the value functions of a Markov decision model (MDM). Based on this observation the model is generalized to a setting with incomplete information about the risk distribution which can be seen as model uncertainty. This issue can be incorporated in the dynamic risk measure by extending the MDM to a Bayesian decision model. Moreover, it is possible to discuss the effect of model uncertainty on the risk measure in binomial models. All investigations are illustrated by a simple but useful coin tossing game proposed by Artzner and by the classic Cox–Ross–Rubinstein model.
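For the entropic risk measure, one standard time-consistent example, the backward recursion on a coin-tossing tree is particularly short, and time consistency means the recursion reproduces the static value. The risk-aversion parameter gamma, the payoff, and the two-period horizon are chosen only for illustration:

```python
import math

def dynamic_entropic_risk(payoff, T, p=0.5, gamma=1.0):
    """Backward recursion of the entropic risk measure on a binomial
    (coin-tossing) tree: rho_T = -X at the leaves, and at earlier nodes
    rho_t = (1/gamma) * log E_t[exp(gamma * rho_{t+1})]."""
    # leaf values indexed by the number of heads k after T tosses
    rho = [-payoff(k) for k in range(T + 1)]
    for t in range(T - 1, -1, -1):
        rho = [math.log(p * math.exp(gamma * rho[k + 1])
                        + (1.0 - p) * math.exp(gamma * rho[k])) / gamma
               for k in range(t + 1)]
    return rho[0]

# gain 1 per head, lose 1 per tail, two fair tosses: X in {-2, 0, 2}
risk = dynamic_entropic_risk(lambda k: 2 * k - 2, T=2)
```

Although the game's expected payoff is zero, the risk comes out strictly positive, reflecting aversion to the loss branch; uncertainty about p could then be layered on top in the Bayesian spirit of the paper.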
