Similar Articles (20 results)
1.
This paper explores the relationship between option markets for the S&P500 (SPX) and the Chicago Board Options Exchange's (CBOE) Volatility Index (VIX). Results are obtained by using the so-called time-spread portfolio to replicate a futures contract on the squared VIX. The time-spread portfolio is interesting because it provides a model-free link between derivative prices for SPX and VIX. Time spreads can be computed from SPX put options with different maturities, which results in a term structure for squared volatility. This term structure can be compared to the VIX-squared term structure backed out from VIX call options. The time-spread portfolio is also used to measure volatility-of-volatility (vol-of-vol) and the volatility leverage effect. Small differences may emerge in these measurements, depending on whether time spreads are computed with options on SPX or options on VIX. A study of 2012 daily options data shows that vol-of-vol estimates utilizing SPX data will reflect the volatility leverage effect, whereas estimates that exclusively utilize VIX options will predominantly reflect the premia in the VIX-futures term structure.

2.
The aim of the paper is twofold. First, it develops a model for risk assessment in a portfolio of life annuities with long-term care benefits. These products are usually represented by a Markovian multi-state model and are affected by both longevity and disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability transition intensities. Data from the Italian National Institute of Social Security (INPS) and from the Human Mortality Database (HMD) are used to estimate the model parameters. Second, it investigates solvency in a portfolio of enhanced pensions. To this aim, a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.

3.
Practically all organizations seek to create value by selecting and executing portfolios of actions that consume resources. Typically, the resulting value is uncertain, and thus organizations must take decisions based on ex ante estimates about what this future value will be. In this paper, we show that the Bayesian modeling of uncertainties in this selection problem serves to (i) increase the expected future value of the selected portfolio, (ii) raise the expected number of selected actions that belong to the optimal portfolio ex post, and (iii) eliminate the expected gap between the realized ex post portfolio value and the estimated ex ante portfolio value. We also propose a new project performance measure, defined as the probability that a given action belongs to the optimal portfolio. Finally, we provide analytic results to determine which actions should be re-evaluated to obtain more accurate value estimates before portfolio selection. In particular, we show that the optimal targeting of such re-evaluations can yield a much higher portfolio value in return for the total resources that are spent on the execution of actions and the acquisition of value estimates.

4.
The regulatory credit value adjustment (CVA) for an outstanding over-the-counter (OTC) derivative portfolio is computed based on the portfolio exposure over its lifetime. Usually, the future portfolio exposure is approximated using Monte Carlo simulation, as the portfolio value can be driven by several market risk factors. For derivatives, such as Bermudan swaptions, that do not have an analytical approximation for their Mark-to-Market (MtM) value, the standard market practice is to use the regression functions from the least squares Monte Carlo method to approximate their MtM along simulated scenarios. However, such approximations have significant bias and noise, resulting in an inaccurate CVA charge. In this paper, we extend the Stochastic Grid Bundling Method (SGBM) for the one-factor Gaussian short rate model to efficiently and accurately compute Expected Exposure, Potential Future Exposure, and CVA for Bermudan swaptions. A novel contribution of the paper is that it demonstrates how different measures, for instance the spot and terminal measures, can be employed simultaneously in the SGBM framework to significantly reduce the variance and bias of the solution.

5.
Index tracking is a passive investment strategy in which a fund manager (e.g., the manager of an exchange-traded fund, ETF) purchases a set of assets to mimic a market index. The tracking error, i.e., the difference between the performance of the index and that of the portfolio, may be minimized by buying all the assets contained in the index. However, this strategy results in considerable transaction costs and, accordingly, decreases the return of the constructed portfolio. On the other hand, a portfolio with a small cardinality may result in poor out-of-sample performance. Of interest, thus, is constructing a portfolio with good out-of-sample performance while keeping the number of assets invested in small (i.e., sparse). In this paper, we develop a tracking portfolio model that addresses these conflicting requirements by using a combination of L0- and L2-norms. The L2-norm regularizes the overdetermined system to impose smoothness (and hence better out-of-sample performance), and it shrinks the solution towards an equally weighted dense portfolio. The L0-norm, on the other hand, imposes a cardinality constraint that achieves sparsity (and hence lower transaction costs). We propose a heuristic method for estimating portfolio weights that combines a greedy search with an analytical formula embedded in it. We demonstrate that the resulting sparse portfolio has good tracking and generalization performance on historical data of weekly and monthly returns on the Nikkei 225 index and its constituent companies.
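A minimal sketch of this general idea, not the authors' exact algorithm: greedy forward selection enforces the cardinality (L0) constraint, while a ridge (L2) penalty regularizes the weights fitted on the selected assets. The data, the penalty lam, and the target cardinality k are all illustrative.

```python
import numpy as np

def ridge_track(R, y, lam):
    """Least-squares tracking weights with an L2 (ridge) penalty."""
    n = R.shape[1]
    return np.linalg.solve(R.T @ R + lam * np.eye(n), R.T @ y)

def greedy_sparse_tracking(R, y, k, lam=1e-3):
    """Greedy forward selection of at most k assets (L0 part), with
    ridge-regularized tracking weights on the chosen subset (L2 part)."""
    chosen = []
    for _ in range(k):
        best_j, best_err = None, np.inf
        for j in range(R.shape[1]):
            if j in chosen:
                continue
            cols = chosen + [j]
            w = ridge_track(R[:, cols], y, lam)
            err = np.mean((R[:, cols] @ w - y) ** 2)   # in-sample tracking error
            if err < best_err:
                best_j, best_err = j, err
        chosen.append(best_j)
    return chosen, ridge_track(R[:, chosen], y, lam)

# toy data: 60 weeks of returns for 30 assets; the index is their equal-weighted average
rng = np.random.default_rng(0)
R = rng.normal(0.001, 0.02, size=(60, 30))
y = R.mean(axis=1)
assets, weights = greedy_sparse_tracking(R, y, k=5)
print(assets, weights.round(3))
```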

6.
The future return of each security cannot be fully captured by past data alone; therefore, experts' judgements and experience should be taken into account when estimating future security returns. In this paper, we propose an interval portfolio selection model in which both the returns and the risks of assets are defined as intervals. By using interval and convex analysis, we solve this model and obtain its noninferior solution. Finally, an example is given to illustrate our results. The interval portfolio selection model improves and generalizes Markowitz's mean-variance model and the results of Deng et al. (Eur J Oper Res 166(1):278–292, 2005).
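As a minimal illustration of the interval representation (not the authors' solution method), with nonnegative weights the portfolio return interval is simply the weighted sum of the per-asset return intervals; the bounds and weights below are made-up numbers.

```python
import numpy as np

# hypothetical expert-adjusted interval return estimates [low, high] for three assets
low  = np.array([0.02, 0.05, 0.01])
high = np.array([0.06, 0.12, 0.04])
w    = np.array([0.5, 0.3, 0.2])       # nonnegative weights summing to 1

# for nonnegative weights, interval arithmetic gives the portfolio return bounds directly
port_low, port_high = w @ low, w @ high
print(f"portfolio return lies in [{port_low:.4f}, {port_high:.4f}]")
```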

7.
For determining an optimal portfolio allocation, parameters representing the underlying market, characterized by expected asset returns and the covariance matrix, are needed. Traditionally, these point estimates for the parameters are obtained from historical data samples, but as experts often have strong opinions about (some of) these values, approaches to combine sample information and experts' views are sought. The focus of this paper is on the two most popular of these frameworks: the Black-Litterman model and the Bayes approach. We prove that, from the point of view of traditional portfolio optimization, the Black-Litterman model is just a special case of the Bayes approach. In contrast to this, we show that the extensions of both models to the robust portfolio framework yield two rather different robustified optimization problems.
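To make the Bayes connection concrete, here is a sketch of the standard Black-Litterman posterior mean, a precision-weighted combination of the equilibrium prior and the investor views; all inputs (Sigma, pi, P, q, Omega, tau) are illustrative and not taken from the paper.

```python
import numpy as np

def black_litterman_mean(Sigma, pi, P, q, Omega, tau=0.05):
    """Posterior expected returns: a precision-weighted average of the
    market-implied prior pi and the investor views q (a Bayes-style update)."""
    prior_prec = np.linalg.inv(tau * Sigma)          # precision of the prior
    view_prec = P.T @ np.linalg.inv(Omega) @ P       # view precision mapped to assets
    rhs = prior_prec @ pi + P.T @ np.linalg.inv(Omega) @ q
    return np.linalg.solve(prior_prec + view_prec, rhs)

# toy example: two assets, one relative view saying asset 1 outperforms asset 2 by 2%
Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])       # covariance matrix
pi    = np.array([0.05, 0.07])                       # equilibrium (prior) returns
P     = np.array([[1.0, -1.0]])                      # view pick matrix
q     = np.array([0.02])                             # view value
Omega = np.array([[0.001]])                          # view uncertainty
print(black_litterman_mean(Sigma, pi, P, q, Omega).round(4))
```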

8.
A Markowitz-type portfolio selection problem is to minimize a deviation measure of the portfolio rate of return subject to constraints on the portfolio budget and on the desired expected return. In this context, the inverse portfolio problem is finding a deviation measure by observing the optimal mean-deviation portfolio that an investor holds. Necessary and sufficient conditions for the existence of such a deviation measure are established. It is shown that if the deviation measure exists, it can be chosen in the form of a mixed CVaR-deviation, and in the case of n risky assets available for investment (to form a portfolio), it is determined by a combination of (n + 1) CVaR-deviations. In the latter case, an algorithm for constructing the deviation measure is presented, and if the number of CVaR-deviations is constrained, an approximate mixed CVaR-deviation is offered as well. The solution of the inverse portfolio problem may not be unique, and the investor can opt for the most conservative one, which has a simple closed-form representation.
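For reference, a small sketch of one common sample convention for the CVaR-deviation (the mean minus the average of the worst alpha-fraction of outcomes) and of a mixture of such deviations; the levels and mixture weights are illustrative, and this is not the paper's inverse-problem algorithm.

```python
import numpy as np

def cvar_deviation(x, alpha):
    """CVaR-deviation at level alpha: mean of x minus the average of the
    worst alpha-fraction of outcomes (one common sample convention)."""
    x = np.sort(np.asarray(x))                  # ascending: worst outcomes first
    k = max(1, int(np.ceil(alpha * len(x))))
    return x.mean() - x[:k].mean()

def mixed_cvar_deviation(x, alphas, weights):
    """Convex mixture of CVaR-deviations at several confidence levels."""
    return sum(w * cvar_deviation(x, a) for a, w in zip(alphas, weights))

rng = np.random.default_rng(1)
returns = rng.normal(0.001, 0.02, size=5000)    # simulated portfolio returns
print(round(cvar_deviation(returns, 0.05), 5))
print(round(mixed_cvar_deviation(returns, alphas=[0.05, 0.25], weights=[0.6, 0.4]), 5))
```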

9.
The expected profit or loss of a non-life insurance company is determined for the whole of its multiple business lines. This implies the study of the claims reserving problem for a portfolio consisting of several correlated run-off triangles. A popular technique to deal with such a portfolio is the multivariate chain-ladder method of Merz and Wüthrich (2008). However, it is well known that the chain-ladder method is very sensitive to outlying data. For the univariate case, we have already developed a robust version of the chain-ladder method. In this article we propose two techniques to detect and correct outlying values in a bivariate situation. The methodologies are illustrated and compared on real examples from practice.
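For orientation, a minimal univariate chain-ladder sketch, the classical (and outlier-sensitive) baseline the article starts from; the run-off triangle is invented, and the bivariate and robust extensions are not shown.

```python
import numpy as np

# cumulative run-off triangle: rows = accident years, columns = development years
# (np.nan marks the not-yet-observed lower-right part)
tri = np.array([
    [100., 160., 180., 190.],
    [110., 170., 195., np.nan],
    [120., 185., np.nan, np.nan],
    [130., np.nan, np.nan, np.nan],
])

n_dev = tri.shape[1]
# age-to-age development factors from the observed part of each pair of columns
factors = []
for j in range(n_dev - 1):
    obs = ~np.isnan(tri[:, j + 1])
    factors.append(tri[obs, j + 1].sum() / tri[obs, j].sum())

# project the unobserved cells with the chain-ladder factors
full = tri.copy()
for i in range(tri.shape[0]):
    for j in range(n_dev - 1):
        if np.isnan(full[i, j + 1]):
            full[i, j + 1] = full[i, j] * factors[j]

# reserve per accident year = projected ultimate minus latest observed cumulative claim
latest = np.array([row[~np.isnan(row)][-1] for row in tri])
reserves = full[:, -1] - latest
print(np.round(factors, 3), np.round(reserves, 1))
```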

10.
Capital allocation models generally assume that the risk portfolio is constructed at a single point in time, when the underwriter has full information about available underwriting opportunities. However, in practice, opportunities are not all known at the beginning but instead arrive over time. Moreover, a commitment to an opportunity is not easy to change as time passes. Thus, to optimize a portfolio, the underwriter must make decisions on opportunities as they arrive while making use of assumptions about what will arrive in the future. This paper studies capital allocation rules in this setting, finding important differences from the static setting. The pricing of an opportunity is based on an expected future marginal cost of risk associated with that opportunity—one that will be fully understood only after the risk portfolio is finalized. The risk charge for today’s opportunity is thus a probability-weighted average of the product of the marginal value of capital in future states of the world and the amount of capital consumed by the opportunity in those future states. Our numerical examples illustrate how the marginal cost of risk for an opportunity is shaped by when it arrives in time, as well as what has arrived before it.

11.
In mean-risk portfolio optimization, it is typically assumed that the assets follow a known distribution P0, which is estimated from observed data. Aiming at an investment strategy which is robust against possible misspecification of P0, the portfolio selection problem is solved with respect to the worst-case distribution within a Wasserstein neighborhood of P0. We review tractable formulations of the portfolio selection problem under model ambiguity, as it is called in the literature. For instance, it is known that high model ambiguity leads to equally-weighted portfolio diversification. However, it often happens that the marginal distributions of the assets can be estimated with high accuracy, whereas the dependence structure between the assets remains ambiguous. This leads to the problem of portfolio selection under dependence uncertainty. We show that in this case portfolio concentration becomes optimal as the uncertainty with respect to the estimated dependence structure increases. Hence, distributionally robust portfolio optimization can have two very distinct implications: diversification on the one hand and concentration on the other.

12.
Wang Hao. Chinese Journal of Applied Probability and Statistics, 2003, 19(3): 267-276
Because sample data on daily or other short-horizon log returns in financial markets mostly exhibit fat-tailed distributions, existing normal or log-normal models fail to varying degrees. To simulate such fat-tailed distributions accurately and to improve investment risk estimation and financial management, this paper introduces a Monte Carlo simulation method that can be adjusted to actual financial market data. The method effectively replicates the fat-tailed distribution of daily log-return data on financial asset prices. Combined with nonparametric estimation, the simulation method also yields accurate estimates of high investment risk values and of the corresponding confidence intervals.
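One simple way to realize this idea, resampling actual fat-tailed log returns rather than assuming (log-)normality, is a bootstrap Monte Carlo with a nonparametric (empirical-quantile) risk read-out; the sketch below uses synthetic Student-t data as a stand-in for market data, and the horizon and confidence level are illustrative, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(2)
# stand-in for observed daily log returns (in practice, load market data);
# a Student-t sample mimics the fat tails discussed in the abstract
hist_log_ret = 0.01 * rng.standard_t(df=3, size=1500)

horizon, n_paths = 10, 100_000
# bootstrap Monte Carlo: resample historical log returns to build horizon paths,
# so the simulated distribution inherits the empirical fat tails
sampled = rng.choice(hist_log_ret, size=(n_paths, horizon), replace=True)
horizon_ret = np.exp(sampled.sum(axis=1)) - 1.0

# nonparametric 99% VaR: empirical 1% quantile of the simulated return distribution
var_99 = -np.quantile(horizon_ret, 0.01)
# bootstrap confidence interval for the VaR estimate itself
boot = [-np.quantile(rng.choice(horizon_ret, horizon_ret.size), 0.01) for _ in range(200)]
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"10-day 99% VaR ~ {var_99:.3%}, bootstrap 95% CI [{lo:.3%}, {hi:.3%}]")
```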

13.
Before applying actuarial techniques to determine different subportfolios and adjusted insurance premiums for contracts that belong to a more or less heterogeneous portfolio, e.g., using credibility theory, it is worthwhile to perform a statistical analysis of the relevant factors influencing the risk in the portfolio. The distributional behaviour of the portfolio should also be examined. In this paper such a programme is presented for car insurance data using logistic regression, correspondence analysis, and statistical techniques from survival analysis. The specific mechanisms governing large claims in such portfolios will also be described. This work is based on a representative sample from Belgian car insurance data from 1989.

14.
A zero-one integer linear programming model is proposed for selecting and scheduling an optimal project portfolio, based on the organisation's objectives and constraints such as resource limitations and interdependence among projects. The model handles some of the issues that frequently arise in real-world applications but are not addressed by previously suggested models, such as situations in which the amount of available and consumed resources varies in different periods. It also allows for interactive adjustment following the optimisation process, to provide decision makers with a method for controlling portfolio selection based on criteria that may be difficult to elicit directly. It is critical for such a system to provide fast evaluation of the alternatives the decision makers may want to examine, and this requirement is addressed. The proposed model not only suggests projects that should be incorporated in the optimal portfolio, but also determines the starting period for each project. Scheduling considerations can have a major impact on the combination of projects that can be incorporated in the portfolio, and may allow the addition of certain projects that could not have been selected otherwise. An example problem is described and solved with the proposed model, and some areas for future research are discussed.
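A toy brute-force sketch of the select-and-schedule idea (a real implementation would pass the equivalent 0-1 formulation to an integer programming solver): each project is either skipped or assigned a start period, subject to per-period resource limits; all values, resource profiles, and capacities are invented.

```python
from itertools import product

# (value, per-period resource usage profile) for each candidate project
projects = [
    (10, [4, 3]),        # project A: consumes 4 then 3 units over two periods
    (8,  [5]),           # project B: one period, 5 units
    (6,  [2, 2, 2]),     # project C: three periods, 2 units each
    (7,  [3, 4]),        # project D
]
horizon = 4
capacity = [6, 6, 6, 6]  # resource available in each period

best_value, best_plan = -1, None
# each decision: None (not selected) or a start period such that the project fits
options = [[None] + list(range(horizon - len(p[1]) + 1)) for p in projects]
for plan in product(*options):
    load = [0] * horizon
    value = 0
    feasible = True
    for (val, usage), start in zip(projects, plan):
        if start is None:
            continue
        value += val
        for t, u in enumerate(usage):
            load[start + t] += u
            if load[start + t] > capacity[start + t]:
                feasible = False
    if feasible and value > best_value:
        best_value, best_plan = value, plan

print("best value:", best_value, "start periods:", best_plan)
```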

15.
Fundamental analysis is an approach for evaluating a firm's investment-worthiness whereby the firm's financial statements are subjected to detailed investigation to predict future stock price performance. In this paper, we propose an approach for combining financial statement data using Data Envelopment Analysis to determine a relative financial strength (RFS) indicator. Such an indicator captures a firm's fundamental strength or competitiveness in comparison to all other firms in the industry/market segment. By analysing the correlation of the RFS indicator with historical stock price returns within the industry, a well-informed assessment can be made about considering the firm in an equity portfolio. We test the proposed indicator with firms from the technology sector, using various US industries, and report correlation analyses. Our preliminary computations using RFS indicator-based stock selection within mean–variance portfolio optimization demonstrate the validity of the proposed approach.

16.
A topic of interest in the recent literature is regulatory capital requirements for consumer loan portfolios. Banks are required to hold regulatory capital for unexpected losses, while expected losses are to be covered by either provisions or future income. In this paper, we show that the set of efficient operating points in the market-share and profit space, for a portfolio manager operating under the Basel II capital requirement and under capital constraints, is a union of single-cutoff-score and double-cutoff-score operating points. For a portfolio manager to increase market share beyond the maximum allowable under a single-cutoff-score policy (e.g., with binding capital constraints) requires granting loans to higher-than-optimal-risk applicants. We show that this results in greater portfolio risk but without an increase in the regulatory capital requirement. The increase in forecasted losses is assumed to be absorbed by provisions or future margin income. Given that portfolio managers take on higher risk under the same regulatory capital amount, our findings call for greater focus on provision amounts and future margin income under the supervisory review pillar of Basel II. This research raises the issue of whether the design of the regulatory formula for consumer loan portfolios is flawed.

17.
Research on financial portfolio optimization was originally developed by Markowitz (1952). It has since been extended in many directions, among them portfolio insurance theory, introduced by Leland and Rubinstein (1976) for “Option Based Portfolio Insurance” (OBPI) and by Perold (1986) for the “Constant Proportion Portfolio Insurance” (CPPI) method. The recent financial crisis has dramatically emphasized the interest of such portfolio strategies. This paper examines the CPPI method when the multiple is allowed to vary over time. To control the risk of such portfolio management, a quantile approach is introduced together with expected shortfall criteria. In this framework, we provide explicit upper bounds on the multiple as a function of past asset returns and volatilities. These values can be statistically estimated from financial data, using, for example, ARCH-type models. We show how the multiple can be chosen in order to satisfy the guarantee condition, at a given level of probability and for various financial market conditions.
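A bare-bones CPPI simulation sketch to fix the mechanics (risky exposure = multiple × cushion above the discounted floor); the time-varying multiple below is capped by the reciprocal of a volatility-based daily loss quantile in the spirit of the quantile approach, but the rule, parameters, and return path are illustrative, not the paper's bounds.

```python
import numpy as np

rng = np.random.default_rng(3)
T, dt = 252, 1 / 252
risky_ret = rng.normal(0.0003, 0.015, size=T)       # toy daily risky-asset returns
r_f = 0.01                                           # annual risk-free rate

V = 100.0                                            # portfolio value
floor = 90.0 * np.exp(-r_f)                          # present value of the guarantee
vol = 0.15                                           # running annualized volatility estimate
path = []
for t in range(T):
    # EWMA volatility update from the latest daily return
    vol = np.sqrt(0.94 * vol**2 + 0.06 * risky_ret[t]**2 / dt)
    # time-varying multiple: cap so a 99%-quantile daily drop keeps V above the floor
    m = min(6.0, 1.0 / (2.33 * vol * np.sqrt(dt)))
    cushion = max(V - floor, 0.0)
    exposure = min(m * cushion, V)                   # amount invested in the risky asset
    V = exposure * (1 + risky_ret[t]) + (V - exposure) * (1 + r_f * dt)
    floor *= np.exp(r_f * dt)                        # floor accretes at the risk-free rate
    path.append(V)

print(f"terminal value {V:.2f}, final floor {floor:.2f}, minimum value {min(path):.2f}")
```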

18.
We apply ideas from stochastic optimization to define universal portfolios. Universal portfolios are the class of portfolios constructed directly from the available observations of stock behavior, without any assumptions about its statistical properties. Cover [7] has shown that one can construct such a portfolio, using only observations of past stock prices, which generates the same asymptotic wealth growth as the best constant rebalanced portfolio, the latter being constructed with full knowledge of the future stock market behavior. In this paper we construct universal portfolios using a different set of ideas drawn from nonstationary stochastic optimization. Our portfolios yield the same asymptotic growth of wealth as the best constant rebalanced portfolio constructed with perfect knowledge of the future, and they are less demanding computationally than previously known universal portfolios. We also present computational evidence using New York Stock Exchange data which shows, among other things, superior performance of portfolios that explicitly take into account possible nonstationary market behavior.
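A compact sketch of the benchmark involved: the wealth of a constant rebalanced portfolio (CRP), the best CRP chosen in hindsight over a weight grid, and Cover-style universal weights obtained by averaging the grid CRPs according to their past wealth. The price relatives are synthetic, and this is not the authors' nonstationary-optimization construction.

```python
import numpy as np

def crp_wealth(X, b):
    """Final wealth of a constant rebalanced portfolio with weights b,
    given price relatives X (rows = periods, columns = assets)."""
    return np.prod(X @ b)

rng = np.random.default_rng(4)
T = 500
# synthetic daily price relatives for two assets
X = np.column_stack([1 + rng.normal(0.0004, 0.01, T),
                     1 + rng.normal(0.0002, 0.02, T)])

grid = np.linspace(0, 1, 101)                      # candidate weights on asset 1
wealth = np.array([crp_wealth(X, np.array([g, 1 - g])) for g in grid])
best = grid[wealth.argmax()]                       # best CRP, chosen in hindsight

# Cover-style universal portfolio: at each step, hold the mixture of grid CRPs
# weighted by the wealth each one has accumulated so far (uniform prior)
W = 1.0
w_grid = np.ones_like(grid)                        # running wealth of each grid CRP
for t in range(T):
    b1 = (w_grid * grid).sum() / w_grid.sum()      # performance-weighted weight on asset 1
    W *= b1 * X[t, 0] + (1 - b1) * X[t, 1]
    w_grid *= grid * X[t, 0] + (1 - grid) * X[t, 1]

print(f"best CRP weight {best:.2f}, hindsight wealth {wealth.max():.3f}, "
      f"universal wealth {W:.3f}")
```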

19.
The situation of limited availability of historical data is frequently encountered in portfolio risk estimation, especially in credit risk estimation. This makes it difficult, for example, to find statistically significant temporal structures in the data on the single-asset level. By contrast, there is often a broader availability of cross-sectional data, i.e. a large number of assets in the portfolio. This paper proposes a stochastic dynamic model which takes this situation into account. The modelling framework is based on multivariate elliptical processes which model portfolio risk via sub-portfolio specific volatility indices called portfolio risk drivers. The dynamics of the risk drivers are modelled by multiplicative error models (MEMs), as introduced by Engle [Engle, R.F., 2002. New frontiers for ARCH models. J. Appl. Econom. 17, 425-446], or by traditional ARMA models. The model is calibrated to Moody’s KMV Credit Monitor asset returns (also known as firm-value returns) given on a monthly basis for 756 listed European companies at 115 time points from 1996 to 2005. This database is used by financial institutions to assess the credit quality of firms. The proposed risk drivers capture the volatility structure of asset returns in different industry sectors. A characteristic cyclical as well as a seasonal temporal structure of the risk drivers is found across all industry sectors. In addition, each risk driver exhibits idiosyncratic developments. We also identify correlations between the risk drivers and selected macroeconomic variables. These findings may improve the estimation of risk measures such as the (portfolio) Value at Risk. The proposed methods are general and can be applied to any series of multivariate asset or equity returns in finance and insurance.
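For concreteness, a sketch of a first-order multiplicative error model of the kind cited (Engle, 2002) for a nonnegative, volatility-index-like series; the parameter values and exponential innovations are illustrative, not the paper's calibrated specification.

```python
import numpy as np

def simulate_mem(omega, alpha, beta, T, rng):
    """MEM(1,1): x_t = mu_t * eps_t with mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1},
    where eps_t is nonnegative with unit mean (exponential innovations here)."""
    x, mu = np.empty(T), np.empty(T)
    mu[0] = omega / (1 - alpha - beta)        # unconditional mean as the starting value
    x[0] = mu[0] * rng.exponential(1.0)
    for t in range(1, T):
        mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
        x[t] = mu[t] * rng.exponential(1.0)
    return x, mu

rng = np.random.default_rng(5)
x, mu = simulate_mem(omega=0.05, alpha=0.2, beta=0.7, T=120, rng=rng)
print(x[:5].round(3), mu.mean().round(3))
```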

20.
Investors inevitably face background risk in practice. Most portfolios built within the mean-variance framework do not take background risk into account, so their utility in real applications is easily affected by it. This paper introduces background risk into a bi-objective model with transaction costs and studies the portfolio selection problem along two dimensions: whether background risk is included, and the degree of background-risk preference. An intelligent algorithm is used to obtain optimal solutions of the model, and an empirical analysis is carried out. The empirical results show that: (1) when the background-risk return is zero, the portfolio that incorporates background risk reflects the real investment environment better than the one that does not; (2) when the background-risk return is nonzero, the portfolio that incorporates background risk achieves a higher return than the one that does not. Therefore, portfolio construction that considers background risk is superior to portfolio construction that ignores it.
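A small sketch of how an additive background-risk term changes a portfolio's mean and variance in a mean-variance setting; the asset data, background-risk moments, and weights are invented for illustration and are not the paper's bi-objective model with transaction costs.

```python
import numpy as np

# illustrative inputs (not from the paper): three assets plus a background-risk term
mu    = np.array([0.06, 0.08, 0.05])          # expected asset returns
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.03]])        # asset covariance matrix
w     = np.array([0.4, 0.3, 0.3])             # candidate portfolio weights

mu_b, var_b = 0.01, 0.02**2                   # background-risk return and variance
cov_b = np.array([0.002, 0.001, 0.000])       # covariance of assets with background risk

# without background risk
m0, v0 = w @ mu, w @ Sigma @ w
# with an additive background component: total return is w'r + r_b
m1 = m0 + mu_b
v1 = v0 + var_b + 2 * w @ cov_b

print(f"no background risk: mean {m0:.4f}, variance {v0:.5f}")
print(f"with background risk: mean {m1:.4f}, variance {v1:.5f}")
```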
