Similar Documents
20 similar documents found.
1.
In this paper, a variable-precision dominance-based rough set approach (VP-DRSA) is proposed together with several VP-DRSA-based approaches to attribute reduction. The properties of VP-DRSA are shown in comparison to previous dominance-based rough set approaches. An advantage of VP-DRSA over the variable-consistency dominance-based rough set approach in decision rule induction is emphasized. Some relations among the VP-DRSA-based attribute reduction approaches are investigated.
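As a minimal illustration of the variable-precision idea (in the classical indiscernibility-based setting of Ziarko, not the dominance-based VP-DRSA itself), the sketch below computes beta-lower and beta-upper approximations from inclusion degrees; the decision table and threshold are invented for illustration.

```python
# Sketch of Ziarko-style variable-precision approximations (classical,
# indiscernibility-based setting; the paper's VP-DRSA is dominance-based).
from collections import defaultdict

def equivalence_classes(universe, attrs, table):
    """Group objects by their values on `attrs` (the indiscernibility relation)."""
    classes = defaultdict(set)
    for x in universe:
        classes[tuple(table[x][a] for a in attrs)].add(x)
    return list(classes.values())

def vp_approximations(universe, attrs, table, target, beta=0.8):
    """beta-lower / beta-upper approximations of `target` (a set of objects)."""
    lower, upper = set(), set()
    for cls in equivalence_classes(universe, attrs, table):
        inclusion = len(cls & target) / len(cls)  # inclusion degree of the class in target
        if inclusion >= beta:
            lower |= cls
        if inclusion > 1 - beta:
            upper |= cls
    return lower, upper

# Toy decision table: objects described by two condition attributes (illustrative only).
table = {1: {'a': 0, 'b': 1}, 2: {'a': 0, 'b': 1}, 3: {'a': 1, 'b': 0}, 4: {'a': 0, 'b': 1}}
U = set(table)
print(vp_approximations(U, ['a', 'b'], table, target={1, 2}, beta=0.6))
```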

2.
Rough sets are efficient for data pre-processing during data mining. However, some important problems such as attribute reduction in rough sets are NP-hard and the algorithms required to solve them are mostly greedy ones. The transversal matroid is an important part of matroid theory, which provides well-established platforms for greedy algorithms. In this study, we investigate transversal matroids using the rough set approach. First, we construct a covering induced by a family of subsets and we propose the approximation operators and upper approximation number based on this covering. We present a sufficient condition under which a subset is a partial transversal, and also a necessary condition. Furthermore, we characterize the transversal matroid with the covering-based approximation operator and construct some types of circuits. Second, we explore the relationships between closure operators in transversal matroids and upper approximation operators based on the covering induced by a family of subsets. Finally, we study two types of axiomatic characterizations of the covering approximation operators based on the set theory and matroid theory, respectively. These results provide more methods for investigating the combination of transversal matroids with rough sets.

3.
The soft set theory, originally proposed by Molodtsov, can be used as a general mathematical tool for dealing with uncertainty. Since its appearance, there has been some progress concerning practical applications of soft set theory, especially the use of soft sets in decision making. The intuitionistic fuzzy soft set is a combination of an intuitionistic fuzzy set and a soft set. The rough set theory is a powerful tool for dealing with uncertainty, granularity and incompleteness of knowledge in information systems. Using rough set theory, this paper proposes a novel approach to intuitionistic fuzzy soft set based decision making problems. Firstly, by employing an intuitionistic fuzzy relation and a threshold value pair, we define a new rough set model and examine some fundamental properties of this rough set model. Then the concepts of approximate precision and rough degree are given and some basic properties are discussed. Furthermore, we investigate the relationship between intuitionistic fuzzy soft sets and intuitionistic fuzzy relations and present a rough set approach to intuitionistic fuzzy soft set based decision making. Finally, an illustrative example is employed to show the validity of this rough set approach in intuitionistic fuzzy soft set based decision making problems.

4.
Emergency decision-making is still an important issue in the management of unconventional emergency events. Although many studies have been devoted to this topic, they remain political and qualitative, and it is difficult to make them operational in practice. Therefore, this article considers a fuzzy rough set model over two universes, together with an associated approach, for addressing this difficulty. An accurate and scientific prediction of emergency material demand enables a quick and efficient emergency rescue and an optimal outcome. Considering the main characteristics of emergency decision-making (insufficient risk identification, incomplete and inaccurate available information, and an uncertain decision-making environment), fuzzy rough set theory over two universes is applied to emergency material demand prediction. We propose a model and approach for emergency material demand prediction, i.e., the fuzzy rough set model of emergency material demand prediction over two universes. We present decision rules and computing methods for the proposed model using the risk decision-making principle of classical operational research. Finally, the validity of the approach and the application process of the proposed model are tested by a numerical example with the background of earthquake emergency material demand forecasting.

5.
6.
Within the new bank regulatory context, the assessment of the credit risk of financial institutions is an important issue for supervising authorities and investors. This study explores the possibility of developing a risk assessment model for financial institutions using a multicriteria classification method. The analysis is based on publicly available financial data for UK firms. The results indicate that the proposed multicriteria methodology provides promising results compared to well-known statistical methods.

7.
In the financial market, it is important to consider that a proportion of customers settle their debt at time zero, immediately recovering their ability to pay. In this context, this paper proposes a survival analysis methodology that allows times equal to zero to be included in scenarios where credit risk is observed. The proposed model is a zero-inflated cure rate survival model that incorporates the heterogeneity of three subgroups (individuals experiencing the event at the initial time, and individuals not susceptible and susceptible to the event). In our proposal, all available survival data of customers are modeled assuming that the number of competing causes follows a Poisson distribution and the baseline risk function follows a Gompertz distribution. The model parameters are estimated by maximum likelihood, and simulation studies are conducted to evaluate the estimators' performance. The methodology is applied to a credit database provided by a financial institution in Brazil.
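A rough numerical companion to the model description: the sketch below evaluates the population survival function of a promotion-time (Poisson) cure-rate model with a Gompertz baseline and a point mass at time zero. This is one common parameterization assumed for illustration, not necessarily the paper's exact formulation, and the parameter values are made up.

```python
import math

def gompertz_cdf(t, a, b):
    """Baseline Gompertz cdf F0(t); a > 0 is the rate, b > 0 the shape (one common convention)."""
    return 1.0 - math.exp(-(a / b) * math.expm1(b * t))

def zi_cure_survival(t, p0, theta, a, b):
    """Population survival of a zero-inflated promotion-time cure model (assumed form).
    p0    : probability of the event at time zero (the zero-inflated group),
    theta : mean of the Poisson number of competing causes,
    a, b  : Gompertz baseline parameters.
    """
    if t <= 0:
        return 1.0 - p0
    return (1.0 - p0) * math.exp(-theta * gompertz_cdf(t, a, b))

# Illustrative (made-up) parameters: 10% settle at time zero; among the rest,
# the long-run cure fraction is exp(-theta).
p0, theta, a, b = 0.10, 1.5, 0.2, 0.3
for t in (0.0, 1.0, 5.0, 20.0):
    print(t, round(zi_cure_survival(t, p0, theta, a, b), 4))
```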

8.
The development of credit risk assessment models is often considered within a classification context. Recent studies on the development of classification models have shown that a combination of methods often provides improved classification results compared to a single-method approach. Within this context, this study explores the combination of different classification methods in developing efficient models for credit risk assessment. A variety of methods are considered in the combination, including machine learning approaches and statistical techniques. The results illustrate that combined models can outperform individual models for credit risk analysis. The analysis also covers important issues such as the impact of using different parameters for the combined models, the effect of attribute selection, as well as the effects of combining strong or weak models.
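The simplest instance of combining classifiers is a vote over independently trained models. The sketch below (scikit-learn, with a synthetic stand-in for the credit data) illustrates the mechanism only; the actual methods and parameters combined in the study are not reproduced.

```python
# Minimal sketch: combining heterogeneous classifiers by (soft) voting.
# X is a numeric feature matrix, y a binary default indicator; both are synthetic stand-ins.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

combined = VotingClassifier(
    estimators=[('lr', LogisticRegression(max_iter=1000)),
                ('tree', DecisionTreeClassifier(max_depth=4)),
                ('nb', GaussianNB())],
    voting='soft',                                        # average predicted probabilities
)
print(cross_val_score(combined, X, y, cv=5).mean())       # compare against the single models
```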

9.
Applying competency models to identify and develop the capabilities of civil servants is now a leading strategy for every government. However, an ideal competency model usually contains too many intended competencies, impeding implementation. Recently, some scholars and experts have argued that at most eight competencies can be assessed effectively. Hence, how to simplify a set of competencies becomes an important issue. This study is presented as a test case to extend practical applications of rough set theory (RST) in the government human resource field. A well-known data mining technique, RST is a relatively new approach to this problem and is well suited to data reduction in qualitative analysis. Hence, the rough set approach is suitable for the qualitative problem of simplifying a set of competencies. This paper streamlined a set of competencies using RST, thus helping the Taiwan Government to better understand the perceived competency levels of its civil servants. Using rough set analysis, this paper successfully reduced the numerous essential competencies to a more compact set by omitting low-consensus competencies.
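For readers unfamiliar with rough-set data reduction, here is a generic greedy reduct search in the spirit of the well-known QuickReduct algorithm, driven by the dependency degree; it is not the exact procedure of the study, and the toy competency table is invented.

```python
from collections import defaultdict

def partition(objs, table, attrs):
    """Blocks of the indiscernibility relation induced by `attrs`."""
    blocks = defaultdict(list)
    for x in objs:
        blocks[tuple(table[x][a] for a in attrs)].append(x)
    return list(blocks.values())

def dependency(objs, table, cond_attrs, dec_attr):
    """gamma(B, d): fraction of objects whose B-class is consistent on the decision."""
    pos = 0
    for block in partition(objs, table, cond_attrs):
        if len({table[x][dec_attr] for x in block}) == 1:
            pos += len(block)
    return pos / len(objs)

def quick_reduct(objs, table, cond_attrs, dec_attr):
    """Greedy reduct: add the attribute that raises gamma most, until it matches the full set."""
    full = dependency(objs, table, cond_attrs, dec_attr)
    reduct = []
    while dependency(objs, table, reduct, dec_attr) < full:
        best = max((a for a in cond_attrs if a not in reduct),
                   key=lambda a: dependency(objs, table, reduct + [a], dec_attr))
        reduct.append(best)
    return reduct

# Toy competency ratings (invented): c1..c3 are competencies, 'perf' the decision attribute.
table = {1: {'c1': 'H', 'c2': 'L', 'c3': 'M', 'perf': 'good'},
         2: {'c1': 'H', 'c2': 'L', 'c3': 'L', 'perf': 'good'},
         3: {'c1': 'L', 'c2': 'H', 'c3': 'M', 'perf': 'poor'},
         4: {'c1': 'L', 'c2': 'L', 'c3': 'M', 'perf': 'poor'}}
print(quick_reduct(list(table), table, ['c1', 'c2', 'c3'], 'perf'))
```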

10.
We propose a new model for the aggregation of risks that is very flexible and useful in high-dimensional problems. We propose a copula-based model that is both hierarchical and hybrid (HYC for short), because: (i) the dependence structure is modeled as a hierarchical copula, and (ii) it unifies the idea of the clusterized homogeneous copula-based approach (CHC for short) and its limiting version (LHC for short) proposed in Bernardi and Romagnoli (2012, 2013). Based on this, we compute the loss function of a world-wide sovereign debt portfolio which accounts for a systemic dependence of all countries, in line with a global valuation of financial risks. Our approach enables us to take into account the non-exchangeable behavior of a sovereign debt portfolio clustered into several classes with homogeneous risk and to recover a possible hierarchy of risks. A comparison between the HYC loss surface and those computed through a pure limiting approach, which is commonly used in high-dimensional problems, is presented, and the impact of the concentration and granularity errors is assessed. Finally, the impact of an enlargement of the dependence structure is discussed in the context of a geographical area sub-portfolio analysis, which is relevant for determining the risk contributions of subgroups in the presence of a wider dependence structure. This argument is presented in relation to the evaluation of the insurance premium and the collateral related to the designed project of a euro-insurance-bond.

11.
12.
We present an efficient model for the simulation of phase-transformations in polycrystalline materials. As a basis, we use a thermodynamically consistent, one-dimensional phase-transformation model, which is embedded into a micro-sphere formulation in order to be able to simulate three-dimensional boundary value problems. The underlying evolution equations are solved efficiently using a newly developed explicit integration scheme that has been proved to be unconditionally A-stable. A numerical example by means of a deformation in simple shear is additionally provided in this contribution.

13.
The purpose of this paper is to apply the model of stochastic networks (networks of quasi-reversible stations) to customer brand-choice behaviour for studying the market share of service industries such as life insurance. We estimate the market share and the mean unit sales of a specified brand in a static market. The model involves use time, customer loyalty, brand switching, initial purchasing and market share variation.

14.
Ratio analysis is a useful tool of financial analysis. Nevertheless, traditional ratio analysis is subject to several limitations: over-empiricism, assumed certainty, standards of reference that are not useful in all circumstances, etc. Recent research has pointed out that formal decision models can be applied to overcome those limitations. In this article, fuzzy set theory is applied to ratio analysis with respect to one of the major management problems: liquidity. This approach enables the decision maker to combine his own experience and any other type of information with that obtained from the ratio. If all possible decisions are uniform over time, the decision maker can adopt them in each period of analysis in a programmed form through a simple combination of model inputs. The approach provided in this article can be extended to other ratios or ratio sets.
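To make the fuzzy treatment of a ratio concrete, the sketch below maps a liquidity ratio to a membership degree in an "adequate liquidity" fuzzy set via a trapezoidal membership function; the breakpoints are purely illustrative and not taken from the article.

```python
def trapezoidal(x, a, b, c, d):
    """Membership degree of x in a trapezoidal fuzzy set with support [a, d] and core [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Illustrative breakpoints: a current ratio of roughly 1.5-2.5 is "fully adequate";
# below 1.0 it signals illiquidity, above 3.5 it signals idle assets.
adequate = lambda ratio: trapezoidal(ratio, 1.0, 1.5, 2.5, 3.5)
for r in (0.8, 1.2, 2.0, 3.0):
    print(r, adequate(r))
```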

15.
The complete set partitioning (CSP) problem is a special case of the set partitioning problem where the coefficient matrix has 2^m - 1 columns, each column being a binary representation of a unique integer between 1 and 2^m - 1, m >= 1. It has wide applications in the area of corporate tax structuring in operations research. In this paper we propose a dynamic programming approach to solve the CSP problem, which has time complexity O(3^m), where n = 2^m - 1 is the size of the problem space.
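Since the 2^m - 1 columns correspond to the nonempty subsets of an m-element ground set, the dynamic program can be sketched as an optimization over submasks, and enumerating all submasks of all masks costs O(3^m). The cost values below are random stand-ins, and a minimum-cost partition is assumed as the objective (a maximization variant is symmetric).

```python
# Sketch of an O(3^m) dynamic program for complete set partitioning:
# f[S] = best value of partitioning subset S into blocks, where every nonempty
# subset T (encoded as a bitmask 1..2^m-1) carries a cost c[T].
import random

m = 4
full = (1 << m) - 1
random.seed(0)
c = {mask: random.uniform(1, 10) for mask in range(1, full + 1)}  # stand-in costs

INF = float('inf')
f = [INF] * (full + 1)
f[0] = 0.0
for S in range(1, full + 1):
    T = S
    while T:                           # enumerate all nonempty submasks T of S
        if f[S ^ T] + c[T] < f[S]:
            f[S] = f[S ^ T] + c[T]
        T = (T - 1) & S
print(f[full])                         # optimal partition value of the full ground set
```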

16.
The objective of analysing a company's risk exposures is to gain an understanding of the risks that the company faces. Only then can the likely level of future losses be estimated, and decisions about how best to manage these risks be made. To gain a full understanding, we first need to adjust for a number of external factors to ensure that all data are on a consistent basis. The historic data can then be analysed and the level of variability determined. After identifying appropriate probability distributions for the frequency and severity of the risks, simulations can be run to make forecasts. Once forecasts have been made, the best way to manage and finance the risks can be considered. As such decisions typically depend upon many factors, utility theory can be used to summarize the advantage that the company will obtain from each alternative in a given situation. This will involve defining a utility function for the company. Methods of eliciting these utility functions exist, including influence diagrams. Decision theory can consequently be applied to determine the best course of action using the company's utility function and its beliefs about the future. Uncertainty inherent in the information can therefore be incorporated in the decision process rather than be ignored. The decision will also depend upon the ability of the company to sustain a loss from retained risks and regulatory requirements relating to the risks.
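The frequency/severity simulation step can be sketched in a few lines: draw a loss count per year from a frequency distribution, draw that many severities, and repeat to build a forecast distribution. Poisson frequency, lognormal severity, and the parameter values are assumptions chosen only for illustration.

```python
import numpy as np

def simulate_annual_losses(freq_mean=3.0, sev_mu=10.0, sev_sigma=1.2, n_years=10_000, seed=1):
    """Monte Carlo aggregate loss: Poisson frequency, lognormal severity (illustrative choices)."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(freq_mean, size=n_years)              # losses per simulated year
    return np.array([rng.lognormal(sev_mu, sev_sigma, size=n).sum() for n in counts])

totals = simulate_annual_losses()
print('mean annual loss :', totals.mean())
print('99.5% quantile   :', np.quantile(totals, 0.995))        # a tail measure for financing decisions
```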

17.
Credit risk measurement and management are important and current issues in the modern finance world from both the theoretical and practical perspectives. There are two major schools of thought for credit risk analysis, namely the structural models based on the asset value model originally proposed by Merton and the intensity‐based reduced form models. One of the popular credit risk models used in practice is the Binomial Expansion Technique (BET) introduced by Moody's. However, its one‐period static nature and the independence assumption for credit entities' defaults are two shortcomings for the use of BET in practical situations. Davis and Lo provided elegant ways to ease the two shortcomings of BET with their default infection and dynamic continuous‐time intensity‐based approaches. This paper first proposes a discrete‐time dynamic extension to the BET in order to incorporate the time‐dependent and time‐varying behaviour of default probabilities for measuring the risk of a credit risky portfolio. In reality, the ‘true’ default probabilities are unobservable to credit analysts and traders. Here, the uncertainties of ‘true’ default probabilities are incorporated in the context of a dynamic Bayesian paradigm. Numerical studies of the proposed model are provided.
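For context, the one-period BET baseline that the paper extends treats the portfolio as D independent, identical assets (the diversity score), so the number of defaults is Binomial(D, p). The sketch below shows only this static baseline with made-up numbers, not the paper's dynamic Bayesian extension.

```python
from math import comb

def bet_loss_distribution(D, p):
    """One-period BET baseline: P(k defaults) for k = 0..D with D homogeneous independent assets."""
    return [comb(D, k) * p**k * (1 - p)**(D - k) for k in range(D + 1)]

# Illustrative inputs: diversity score 10, per-asset default probability 3%,
# notional 100 per asset with 40% recovery.
D, p, notional, recovery = 10, 0.03, 100.0, 0.4
probs = bet_loss_distribution(D, p)
expected_loss = sum(k * (1 - recovery) * notional * pk for k, pk in enumerate(probs))
print('P(no default):', round(probs[0], 4), ' expected loss:', round(expected_loss, 2))
```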

18.
19.
20.
Merton's model views equity as a call option on the assets of the firm. Thus the assets are partially observed through the equity. Using nonlinear filtering, an explicit expression for the likelihood ratio of the underlying parameters is obtained in terms of the nonlinear filter. As the evolution of the filter itself depends on the parameters in question, this does not permit direct maximum likelihood estimation, but it does pave the way for the 'Expectation-Maximization' method for estimating the parameters.
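The starting point, equity as a call option on the firm's assets, is the standard Black-Scholes-Merton valuation with the debt face value as the strike. The sketch below shows that mapping only (the nonlinear filtering and EM estimation of the paper are not reproduced), with illustrative inputs.

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def merton_equity_value(V, D, r, sigma, T):
    """Equity as a European call on firm assets V with strike equal to the debt face value D."""
    d1 = (log(V / D) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return V * norm_cdf(d1) - D * exp(-r * T) * norm_cdf(d2)

# Illustrative inputs: asset value 120, debt 100 due in 1 year, 20% asset volatility.
print(merton_equity_value(V=120.0, D=100.0, r=0.03, sigma=0.20, T=1.0))
```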
