Similar Documents
20 similar documents found (search time: 15 ms)
1.
The importance of adequately modeling credit risk has once again been highlighted in the recent financial crisis. Defaults tend to cluster around times of economic stress, not only because of poor macro-economic conditions but also because defaults directly trigger each other through contagion. Although credit default swaps have radically altered the dynamics of contagion for more than a decade, models quantifying their impact on systemic risk are still missing. Here, we examine contagion through credit default swaps in a stylized economic network of corporates and financial institutions. We analyse such a system in a stochastic setting, which allows us to exploit limit theorems to solve the contagion dynamics exactly for the entire system. Our analysis shows that, by creating additional contagion channels, CDS can actually lead to greater instability of the entire network in times of economic stress. This is particularly pronounced when CDS are used by banks to expand their loan books, on the argument that CDS offload the additional risks from their balance sheets. Thus, even with complete hedging through CDS, a significant loan book expansion can lead to considerably higher probabilities of very large losses and very high default rates in the system. Our approach adds a new dimension to research on credit contagion, and could feed into a rational underpinning of an improved regulatory framework for credit derivatives.
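Below is a minimal Python sketch, not the authors' stochastic limit-theorem analysis, of how a CDS layer adds contagion channels in a toy bank-corporate network; all balance-sheet sizes, hedging ratios, and capital thresholds are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_banks, n_corps = 20, 200

# Illustrative exposures: sparse loan book of each bank, with assumed capital buffers.
loans = rng.uniform(0.0, 1.0, (n_banks, n_corps)) * (rng.random((n_banks, n_corps)) < 0.1)
capital = 0.08 * loans.sum(axis=1) + 1e-6

# CDS layer: roughly half of the loans are hedged; a randomly chosen other bank sells protection.
hedged = (loans > 0) & (rng.random((n_banks, n_corps)) < 0.5)
seller = rng.integers(0, n_banks, (n_banks, n_corps))

def bank_failures(defaulted_corp, with_cds):
    """Count banks whose credit losses exceed capital in one stress scenario."""
    losses = np.zeros(n_banks)
    for b, c in zip(*np.nonzero(loans)):
        if not defaulted_corp[c]:
            continue
        if with_cds and hedged[b, c]:
            losses[seller[b, c]] += loans[b, c]   # loss migrates to the protection seller
        else:
            losses[b] += loans[b, c]
    return int((losses > capital).sum())

scenario = rng.random(n_corps) < 0.15             # stressed corporate default rate of 15%
print("failed banks, unhedged:", bank_failures(scenario, with_cds=False))
print("failed banks, with CDS:", bank_failures(scenario, with_cds=True))
```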

2.
Background: The credit scoring model is an effective tool for banks and other financial institutions to identify potential default borrowers. Credit scoring models based on machine learning methods such as deep learning perform well in terms of the accuracy of default discrimination, but they also have shortcomings, such as many hyperparameters and a heavy dependence on big data, and there is still much room to improve their interpretability and robustness. Methods: The deep forest, or multi-Grained Cascade Forest (gcForest), is a deep tree-based model built on the random forest algorithm. Using multi-grained scanning and cascade processing, gcForest can effectively identify and process high-dimensional feature information; at the same time, it has few hyperparameters and is strongly robust. This paper therefore constructs a two-stage hybrid default discrimination model based on multiple feature selection methods and the gcForest algorithm, and optimizes its parameters with the lowest type II error as the first criterion and the highest AUC and accuracy as the second and third criteria. gcForest not only retains the advantages of traditional statistical models in terms of interpretability and robustness but also approaches the accuracy of deep learning models. Results: The validity of the hybrid default discrimination model is verified on three real, publicly available credit data sets (Australian, Japanese, and German) from the UCI repository. Conclusions: gcForest outperforms popular single classifiers such as ANNs, and common ensemble classifiers such as LightGBM and CNNs, in type II error, AUC, and accuracy. Moreover, comparison with other similar studies further confirms the robustness and effectiveness of the model.
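As a rough illustration of the evaluation criteria named above (type II error first, then AUC and accuracy), here is a scikit-learn sketch that uses a plain random forest and synthetic data as stand-ins for gcForest and the UCI credit sets; it is not the paper's two-stage pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split

# X: applicant features, y: 1 = default, 0 = non-default (synthetic stand-in data).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 30))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=1000) > 1.0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# Stage 1: feature selection; Stage 2: tree ensemble (random forest as a gcForest stand-in).
selector = SelectKBest(f_classif, k=10).fit(X_tr, y_tr)
clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(selector.transform(X_tr), y_tr)

prob = clf.predict_proba(selector.transform(X_te))[:, 1]
pred = (prob >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
type2_error = fn / (fn + tp)        # a defaulter misclassified as a non-defaulter
print(f"type II error={type2_error:.3f}  AUC={roc_auc_score(y_te, prob):.3f}  "
      f"accuracy={accuracy_score(y_te, pred):.3f}")
```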

3.
Differential replication is a method to adapt existing machine learning solutions to the demands of highly regulated environments by reusing knowledge from one generation to the next. Copying is a technique that allows differential replication by projecting a given classifier onto a new hypothesis space, in circumstances where access to both the original solution and its training data is limited. The resulting model replicates the original decision behavior while displaying new features and characteristics. In this paper, we apply this approach to a use case in the context of credit scoring. We use a private residential mortgage default dataset. We show that differential replication through copying can be exploited to adapt a given solution to the changing demands of a constrained environment such as that of the financial market. In particular, we show how copying can be used to replicate the decision behavior not only of a model, but also of a full pipeline. As a result, we can ensure the decomposability of the attributes used to provide explanations for credit scoring models and reduce the time-to-market delivery of these solutions.
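A minimal sketch of the copying step under simple assumptions: only prediction access to the original model, synthetic samples drawn from an assumed feature distribution, and a decision tree as the new hypothesis space; the mortgage dataset and the paper's pipeline are not reproduced.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# "Original" solution: we assume only prediction access, not access to its training data.
X_train = rng.normal(size=(500, 5))
y_train = (X_train[:, 0] * X_train[:, 1] > 0).astype(int)
original = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Copying: sample synthetic points from an assumed feature distribution,
# label them with the original model, and fit a copy in a simpler hypothesis space.
X_synth = rng.normal(size=(20000, 5))
y_synth = original.predict(X_synth)
copy = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_synth, y_synth)

# Fidelity: how often the copy reproduces the original's decisions on fresh points.
X_test = rng.normal(size=(5000, 5))
fidelity = (copy.predict(X_test) == original.predict(X_test)).mean()
print(f"copy fidelity to the original decision behaviour: {fidelity:.3f}")
```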

4.
Heng-Chih Chou, Physica A, 2007, 385(1): 270-280
We investigate the performance of a default risk model based on the barrier option framework with maximum likelihood estimation. We provide empirical validation of the model by showing that implied default barriers are statistically significant for a sample of construction firms in Taiwan over the period 1994-2004. We find that our model dominates the commonly adopted alternatives: the Merton model, the Z-score model, and the ZETA model. Moreover, we test the n-year-ahead prediction performance of the model and find evidence that its prediction accuracy improves as the forecast horizon shortens. Finally, we assess the effect of estimated default risk on equity returns and find that default risk helps explain equity returns and is a variable worth considering in asset-pricing tests, above and beyond size and book-to-market.
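For orientation, a hedged sketch of a first-passage (barrier-type) default probability for a lognormal asset process; the paper's maximum-likelihood estimation of the implied barrier and its empirical comparisons are not reproduced, and the parameter values below are illustrative.

```python
from math import exp, log, sqrt

from scipy.stats import norm

def first_passage_default_prob(V0, barrier, mu, sigma, T):
    """P(asset value hits the default barrier before T) for a lognormal asset process."""
    nu = mu - 0.5 * sigma ** 2          # drift of log-assets
    b = log(barrier / V0)               # log-distance to the barrier (negative)
    z = sigma * sqrt(T)
    return norm.cdf((b - nu * T) / z) + exp(2.0 * nu * b / sigma ** 2) * norm.cdf((b + nu * T) / z)

# Illustrative numbers (not estimates from the paper): assets 100, barrier 60,
# asset drift 5%, asset volatility 25%, one-year horizon.
print(f"1-year default probability: {first_passage_default_prob(100, 60, 0.05, 0.25, 1.0):.4f}")
```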

5.
This article deals with the problem of probabilistic prediction of the time to default for a firm. To model credit risk, the dynamics of the firm's assets are described as a function of a homogeneous discrete-time Markov chain subject to a catastrophe, the default. The behaviour of the Markov chain is investigated, and the mean time to default is expressed in closed form. A methodology for estimating the parameters is given. Numerical results illustrate the applicability of the proposed model to real data, and their analysis is discussed.
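A minimal sketch of the mean-time-to-default computation when default is treated as an absorbing state of a discrete-time Markov chain; the transition matrix below is an illustrative assumption, not the paper's estimated model, and the generic fundamental-matrix formula is used in place of the paper's specific closed-form expression.

```python
import numpy as np

# Illustrative one-step transition matrix over rating states (A, B, C) plus default (D).
# The last state (D) is absorbing and plays the role of the "catastrophe".
P = np.array([
    [0.90, 0.07, 0.02, 0.01],   # A
    [0.05, 0.85, 0.07, 0.03],   # B
    [0.01, 0.09, 0.80, 0.10],   # C
    [0.00, 0.00, 0.00, 1.00],   # D (default, absorbing)
])

Q = P[:3, :3]                                               # transient-to-transient block
mean_time = np.linalg.solve(np.eye(3) - Q, np.ones(3))      # (I - Q)^{-1} * 1
for state, t in zip("ABC", mean_time):
    print(f"mean time to default starting from {state}: {t:.1f} periods")
```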

6.
7.
This paper investigates the dynamics of credit default swap (CDS) spreads. We first characterize the auto-correlations and cross-correlations of the CDS series and the CDS average by employing detrended cross-correlation analysis (DCCA). We then employ smooth transition autoregressive (STAR) models to characterize the regime-switching behavior of 28 US corporate CDS series from January 2007 through October 2009. In each case, we find clear evidence of transitions between low-price and high-price regimes. The threshold estimates of the STAR model effectively differentiate the price regimes, and the first transition consistently coincides with the eruption of the crisis in late 2008.
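For reference, a generic two-regime logistic STAR specification of the kind used for such regime switching (the autoregressive order and the transition variable actually used in the paper may differ):

```latex
y_t = \left(\phi_0 + \phi_1 y_{t-1}\right)\left[1 - G(s_t;\gamma,c)\right]
    + \left(\theta_0 + \theta_1 y_{t-1}\right) G(s_t;\gamma,c) + \varepsilon_t,
\qquad
G(s_t;\gamma,c) = \frac{1}{1 + e^{-\gamma\,(s_t - c)}}
```

Here c is the threshold separating the low-price and high-price regimes, and γ controls how sharp the transition between them is.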

8.
Recently, there has been growing interest in network research, especially in biology, computer science, and sociology. It is natural to address complex financial issues such as the European sovereign debt crisis from a network perspective. In this article, we construct a network model based on debt-credit relations, rather than the conventional methodology, to measure default risk. Based on the model, a risk index is examined using the quarterly reports of consolidated foreign claims from the Bank for International Settlements (BIS) and the debt/GDP ratios of the reporting countries. The empirical results show that this index can help regulators and practitioners both to assess the degree of interconnectivity and to gauge sovereign debt default risk. Our approach sheds new light on quantifying systemic risk.

9.
We present a framework that allows for a systematic assessment of risk given a specific model and belief about the market. Within this framework the time evolution of risk is modeled in a twofold way. On the one hand, risk is modeled by the time-discrete and nonlinear GARCH(1,1) process, which allows for a (time-)local understanding of its level together with a short-term forecast. On the other hand, via a diffusion approximation, the time evolution of the probability density of risk is modeled by a Fokker-Planck equation. As a final step, using Bayes' theorem, beliefs are conditioned on the stationary probability density function obtained from the Fokker-Planck equation. We believe this to be a highly rigorous framework for integrating subjective judgments of future market behavior with the underlying models. To demonstrate the approach, we apply it to risk assessment of empirical interest rate scenario methodologies, i.e. the application of principal component analysis to the dynamics of bonds.
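A minimal numpy sketch of the GARCH(1,1) recursion that provides the (time-)local risk level and short-term forecast; the parameter values are illustrative assumptions rather than fitted values, and the Fokker-Planck and Bayesian steps of the framework are not reproduced.

```python
import numpy as np

def garch11_filter(returns, omega, alpha, beta):
    """Run the GARCH(1,1) recursion: sigma^2_t = omega + alpha*eps^2_{t-1} + beta*sigma^2_{t-1}."""
    var = np.empty(len(returns) + 1)
    var[0] = omega / (1.0 - alpha - beta)          # start from the stationary variance
    for t, r in enumerate(returns):
        var[t + 1] = omega + alpha * r ** 2 + beta * var[t]
    return var                                     # var[-1] is the one-step-ahead forecast

rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_normal(500)          # stand-in zero-mean return series
var_path = garch11_filter(returns, omega=1e-6, alpha=0.08, beta=0.90)
print(f"one-step-ahead volatility forecast: {np.sqrt(var_path[-1]):.4%}")
```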

10.
We set up a structural model to study credit risk for a portfolio containing several or many credit contracts. The model is based on a jump-diffusion process for the risk factors, i.e. for the company assets, and includes correlations between the companies. We discuss how models of this type have much in common with other problems in statistical physics and in the theory of complex systems. We study a simplified version of our model analytically and perform extensive numerical simulations for the full model. The observables are the loss distribution of the credit portfolio, its moments, and other quantities derived from them. We compile detailed information about the parameter dependence of these observables. In the course of setting up and analyzing our model, we also give a review of credit risk modeling for a physics audience.
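A minimal Monte Carlo sketch in the spirit of such a structural model: correlated lognormal asset values with an added jump component, default when terminal assets fall below a debt threshold, and the portfolio loss distribution read off from the scenarios; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scenarios, n_firms = 20_000, 100
T, mu, sigma, rho = 1.0, 0.05, 0.25, 0.3            # assumed diffusion parameters
lam, jump_mu, jump_sig = 0.2, -0.10, 0.15            # assumed jump intensity and jump sizes
V0, face = 100.0, 70.0                               # initial assets and debt per firm

# One-factor correlated diffusion shocks plus idiosyncratic compound-Poisson jumps.
common = rng.standard_normal((n_scenarios, 1))
idio = rng.standard_normal((n_scenarios, n_firms))
diffusion = sigma * np.sqrt(T) * (np.sqrt(rho) * common + np.sqrt(1 - rho) * idio)
n_jumps = rng.poisson(lam * T, (n_scenarios, n_firms))
jumps = n_jumps * jump_mu + np.sqrt(n_jumps) * jump_sig * rng.standard_normal((n_scenarios, n_firms))

V_T = V0 * np.exp((mu - 0.5 * sigma**2) * T + diffusion + jumps)
loss = np.where(V_T < face, face - V_T, 0.0).sum(axis=1)   # portfolio loss per scenario

print(f"expected loss : {loss.mean():.2f}")
print(f"99.9% quantile: {np.quantile(loss, 0.999):.2f}")
```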

11.
Credit default swaps (CDS) have become one of the most actively traded credit derivatives, and their importance in financial markets has increased since the subprime crisis. In this study, we analyzed the correlation structure of credit risks embedded in CDS and the influence of the subprime crisis on this topological space. We found that the correlation was stronger in clusters constructed according to the location of the CDS reference companies than in those constructed according to their industries. The correlations both within a given cluster and between different clusters became significantly stronger after the subprime crisis. A causality test shows that the lead-lag effect between the portfolios (into which reference companies are grouped by the continent where each is located) has reversed direction because the proportions of non-investable and investable reference companies in each portfolio have changed since then. The effect of a single impulse has increased, and the relaxation time of the response has lengthened after the crisis as well.
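A minimal sketch of the standard correlation-clustering recipe behind such an analysis: convert the correlation matrix of spread changes into a distance matrix and cluster it hierarchically; the synthetic input data and the linkage choice are assumptions, not the paper's CDS dataset or exact method.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
# Stand-in data: daily CDS spread changes for 30 reference entities in 3 latent groups.
group = np.repeat([0, 1, 2], 10)
common = rng.standard_normal((500, 3))
spreads = common[:, group] + 0.8 * rng.standard_normal((500, 30))

corr = np.corrcoef(spreads, rowvar=False)
dist = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))   # correlation distance d = sqrt(2(1 - rho))
np.fill_diagonal(dist, 0.0)

Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=3, criterion="maxclust")
print("recovered cluster labels:", labels)
```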

12.
We build a statistical ensemble representation of two economic models describing, respectively and in simplified terms, a payment system and a credit market. To this end we adopt the Boltzmann-Gibbs distribution, where the role of the Hamiltonian is taken by the total money supply (i.e. including money created from debt) of a set of interacting economic agents. As a result, we can read the main thermodynamic quantities in terms of monetary ones. In particular, for the credit market model we define a work term related to the impact of monetary policy on credit creation. Furthermore, with our formalism we recover and extend results on the temperature of an economic system previously presented in the literature, which considered only the monetary base as a conserved quantity. Finally, we study the statistical ensemble for the Pareto distribution.
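As a loosely related illustration of the statistical-ensemble idea (a simple conservative money-exchange simulation, not the paper's payment-system or credit-market models): random pairwise exchanges of a conserved money supply drive agents toward a Boltzmann-Gibbs (exponential) distribution whose effective temperature is the average money per agent.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, n_steps = 2_000, 400_000
money = np.full(n_agents, 10.0)                    # conserved total money supply

for _ in range(n_steps):
    i, j = rng.choice(n_agents, size=2, replace=False)
    pot = money[i] + money[j]
    share = rng.random()                           # random repartition of the pair's money
    money[i], money[j] = share * pot, (1 - share) * pot

temperature = money.mean()                         # effective "temperature" = average money
print(f"effective temperature: {temperature:.2f}")
print(f"fraction of agents below the mean: {(money < temperature).mean():.2f}  "
      "(about 1 - 1/e = 0.63 for a Boltzmann-Gibbs distribution)")
```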

13.
When analyzing nonlinear stochastic systems, we deal with chains of differential equations for the moments or cumulants of the dynamic variables. To close such chains, the well-known cumulant approach, which corresponds to a quasi-Gaussian expansion of the higher-order moments, is commonly used. However, this method is inefficient for problems of Brownian diffusion in bimodal potential profiles, where the closure problem should be solved on the basis of bimodal probability distributions. To this end, we propose to construct bimodal model distributions, in particular the bi-Gaussian distribution. Cumulants and the expansions of the higher-order moments are derived for symmetric and nonsymmetric bi-Gaussian models. On this basis, we consider the relaxation of the probability characteristics of one-dimensional Brownian motion in a bimodal potential profile. The dependence of the relaxation of the mean value and variance of the particle coordinate on the potential barrier "power," the noise intensity, and the initial distribution of particles is analyzed numerically. In particular, it is shown that relaxation proceeds in stages with different temporal scales in the case of a powerful barrier. Translated from Izvestiya Vysshikh Uchebnykh Zavedenii, Radiofizika, Vol. 49, No. 8, pp. 718-729, August 2006.
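A minimal numerical counterpart to the problem discussed: an Euler-Maruyama simulation of overdamped Brownian motion in a generic double-well potential U(x) = x^4/4 - a*x^2/2, tracking the relaxation of the mean and variance of the particle coordinate; the potential, noise intensity, and initial condition are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
a, D = 2.0, 0.3                      # barrier "power" (well-depth parameter) and noise intensity
dt, n_steps, n_particles = 1e-3, 20_000, 5_000

def drift(x):
    """Force -U'(x) for the double-well potential U(x) = x**4/4 - a*x**2/2."""
    return -(x**3 - a * x)

x = np.full(n_particles, -np.sqrt(a))              # all particles start in the left well
for step in range(1, n_steps + 1):
    x += drift(x) * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n_particles)
    if step % 5_000 == 0:
        print(f"t={step * dt:5.1f}  <x>={x.mean():+.3f}  var(x)={x.var():.3f}")
```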

14.
15.
T.S. Biró, Physica A, 2008, 387(7): 1603-1612
In this paper we study the possible microscopic origin of heavy-tailed probability density distributions for the price variation of financial instruments. We extend the standard log-normal process by including another random component, as in the so-called stochastic volatility models. We study these models under an assumption, akin to the Born-Oppenheimer approximation, in which the volatility has already relaxed to its equilibrium distribution and acts as a background to the evolution of the price process. In this approximation, we show that all stochastic volatility models should exhibit a scaling relation in the time lag of zero-drift modified log-returns. We verify that the Dow Jones Industrial Average index indeed follows this scaling. We then focus on two popular stochastic volatility models, the Heston and Hull-White models. In particular, we show that in the Hull-White model the resulting probability distribution of log-returns in this approximation corresponds to the Tsallis (Student's t) distribution, with the Tsallis parameters given in terms of the microscopic stochastic volatility model. Finally, we show that the log-returns of 30 years of Dow Jones index data are well fitted by a Tsallis distribution, and we obtain the relevant parameters.
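A minimal sketch of the final fitting step, using a Student's t distribution (equivalent, up to reparametrization, to a Tsallis q-Gaussian) and scipy; the data below are synthetic stand-ins, not the 30-year Dow Jones series, and q = (nu + 3)/(nu + 1) is the standard mapping between the tail exponent nu and the Tsallis parameter.

```python
from scipy import stats

# Synthetic heavy-tailed log-returns standing in for a daily index series.
log_returns = 0.01 * stats.t.rvs(df=4, size=7500, random_state=1)

nu, loc, scale = stats.t.fit(log_returns)          # maximum-likelihood Student's t fit
q = (nu + 3.0) / (nu + 1.0)                        # standard q-Gaussian / Student's t mapping
print(f"fitted tail exponent nu = {nu:.2f}  ->  Tsallis parameter q = {q:.2f}")
```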

16.
An efficient surrogate-based method for computing rare failure probability
In this paper, we present an efficient numerical method for evaluating rare failure probabilities. The method builds on the recently developed surrogate-based method of Li and Xiu [J. Li, D. Xiu, Evaluation of failure probability via surrogate models, J. Comput. Phys. 229 (2010) 8966-8980] for failure probability computation. The method by Li and Xiu is of hybrid nature, in the sense that samples of both the surrogate model and the true physical model are used, and its efficiency gain relies on using only very few samples of the true model. Here we extend the capability of the method to rare probability computation by using the idea of importance sampling (IS). In particular, we employ the cross-entropy (CE) method, which is an effective way to determine the biasing distribution in IS. We demonstrate that, by combining with the CE method, a surrogate-based IS algorithm can be constructed that is highly efficient for rare failure probability computation: it requires much less simulation effort than the traditional CE-IS method. In many cases, the new method is capable of capturing failure probabilities as small as 10⁻¹² to 10⁻⁶ with only a few hundred samples.
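A minimal sketch of the cross-entropy importance-sampling step on a toy analytic limit state (no surrogate model involved): a Gaussian proposal is adapted toward the failure region by a multilevel CE iteration, and the failure probability is then estimated with likelihood ratios; the limit state, thresholds, and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, threshold = 10, 14.0                  # failure: g(x) = sum(x) >= 14, rare under N(0, I)
n_per_level, rho, n_final = 2_000, 0.1, 20_000

def g(x):
    return x.sum(axis=1)

def log_weight(x, v):
    """log of f(x; 0, I) / f(x; v, I) for Gaussian proposals with identity covariance."""
    return -x @ v + 0.5 * v @ v

# Multilevel cross-entropy iterations: pull the proposal mean toward the failure region.
v = np.zeros(dim)
for _ in range(50):
    x = v + rng.standard_normal((n_per_level, dim))
    scores = g(x)
    level = min(threshold, np.quantile(scores, 1.0 - rho))
    elite = x[scores >= level]
    w = np.exp(log_weight(elite, v))
    v = (w[:, None] * elite).sum(axis=0) / w.sum()     # CE update of the proposal mean
    if level >= threshold:
        break

# Final importance-sampling estimate under the adapted proposal.
x = v + rng.standard_normal((n_final, dim))
p_fail = np.mean((g(x) >= threshold) * np.exp(log_weight(x, v)))
print(f"estimated failure probability: {p_fail:.3e}")
# For comparison, the exact value P(N(0, 10) >= 14) is about 5e-6.
```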

17.
The influence of slow processes on the probability distribution of fast random processes is investigated. By reviewing four examples, we show that such influence is apparently of a universal character and that, in some cases, this universality takes a multifractal form. As examples we consider, theoretically, stochastic resonance and turbulent jets with acoustic forcing, and, experimentally, two problems studied by Shnoll on the influence of the Earth's slow rotation on the probability distribution of the velocities of model Brownian particles and on alpha decay. In the case of stochastic resonance, the slow process is a low-frequency harmonic external force; in the case of turbulent jets, it is the acoustic forcing. In the models based on Shnoll's experiments, the slow processes are inertial forces arising from the rotation of the Earth, both about its own axis and about the Sun. It is shown that all of these slow processes cause changes in the probability distributions of the velocities of the fast processes interacting with them, and that these changes are similar in form.

18.
Interception probability is an important measure of anti-missile combat effectiveness and is crucial for pre-engagement mission planning. For a given interceptor, its reachable set in space is constructed, and its kill zone is determined for the case of midcourse counter-trajectory (head-on) interception. Since any point within the kill zone may serve as an intercept point, theoretical intercept points are selected along the interceptable arc of an incoming missile's trajectory. By choosing theoretical intercept points at different altitudes and adjusting the interceptor's launch position, the influence of the intercept point's altitude, range, and crossing angle within the kill zone on the interception probability is obtained. The results provide a reference for studies of anti-missile site deployment and engagement strategy.

19.
A family of probability distributions (i.e. a statistical model) is said to be sufficient for another if there exists a transition matrix transforming the probability distributions in the former into the probability distributions in the latter. The Blackwell-Sherman-Stein (BSS) theorem provides necessary and sufficient conditions for one statistical model to be sufficient for another by comparing their information values in statistical decision problems. In this paper we extend the BSS theorem to quantum statistical decision theory, where statistical models are replaced by families of density matrices defined on finite-dimensional Hilbert spaces, and transition matrices are replaced by completely positive, trace-preserving maps (i.e. coarse-grainings). The framework we propose is suitable for unifying results that were previously independent, such as the BSS theorem for classical statistical models and its analogue for pairs of bipartite quantum states, recently proved by Shmaya. An important role in this paper is played by statistical morphisms, namely affine maps whose definition generalizes that of the coarse-grainings given by Petz and induces a corresponding criterion for statistical sufficiency that is weaker, and hence easier to characterize, than Petz's.
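For orientation, the classical sufficiency relation referred to above, stated schematically (the notation here is assumed, not taken from the paper):

```latex
\{p_\theta\}_{\theta\in\Theta}\ \text{is sufficient for}\ \{q_\theta\}_{\theta\in\Theta}
\iff
\exists\ \text{a transition (stochastic) matrix } M\ \text{such that}\ q_\theta = M\,p_\theta
\quad \forall\,\theta\in\Theta .
```

In the quantum extension described above, the families of distributions become families of density matrices {ρ_θ} and {σ_θ}, and the transition matrix is replaced by a completely positive, trace-preserving map Λ with σ_θ = Λ(ρ_θ) for all θ.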

20.
Ventricular tachycardia and fibrillation are potentially lethal cardiac arrhythmias generated by high-frequency, irregular spatio-temporal electrical activity. Re-entrant propagation has been demonstrated as a mechanism generating these arrhythmias in computational and in vitro animal models. Re-entry can be idealised in homogeneous, isotropic virtual cardiac tissues as spiral and scroll wave solutions of reaction-diffusion equations. A spiral wave in a bounded medium can be terminated if its core reaches a boundary. Ventricular tachyarrhythmias in patients are sometimes observed to self-terminate spontaneously. One possible mechanism for self-termination of a spiral wave is meander of its core to an inexcitable boundary. We have previously proposed the hypothesis that the spatial extent of meander of a re-entrant wave in the heart is directly related to its probability of self-termination, and so inversely related to its lethality. Meander in two-dimensional virtual ventricular tissues based on the Oxsoft family of cell models, with membrane excitation parameters simulating the inherited long Q-T syndromes, has been shown to be consistent with this hypothesis: the largest meander is seen in the syndrome with the lowest probability of death per arrhythmic episode. Here we extend our previous results to virtual tissues based on the Luo-Rudy family of models. Consistent with our hypothesis, for both families of models, whose different ionic mechanisms produce different patterns of meander, the LQT virtual tissue with the larger meander simulates the syndrome with the lower probability of death per episode. Further, we search the parameter space of the repolarizing currents to find conductance parameter values that give increased meander of spiral waves. These parameters may provide targets for antiarrhythmic drugs designed to act by increasing the likelihood of self-termination of re-entrant arrhythmias.
