Similar Articles
20 similar articles retrieved (search time: 15 ms)
1.
This paper examines the extent to which financial returns on market indices exhibit mean and volatility asymmetries, as a response to past information from both the U.S. market and the local market itself. In particular, we wish to assess the asymmetric effect of a combination of local and U.S. market news on volatility. To the best of the authors' knowledge, this joint effect has not been considered previously. We propose a double threshold non-linear heteroscedastic model, combined with a GJR-GARCH effect in the conditional volatility equation, to capture jointly both mean and volatility asymmetric behaviours and the interactive effect of U.S. and local market news. In an application to five major international market indices, clear evidence of threshold non-linearity is discovered, supporting the hypothesis of an uneven mean-reverting pattern and volatility asymmetry, both in reaction to U.S. market news and news from the local market itself. Significant, but somewhat different, interactive effects between local and U.S. news are observed in all markets. An asymmetric pattern in the exogenous relationship between the local market and the U.S. market is also found. Copyright © 2005 John Wiley & Sons, Ltd.
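As a rough illustration of the type of specification described above, the following sketch computes a two-regime GJR-GARCH(1,1) conditional-variance path in which the regime is selected by the signs of the lagged local and U.S. returns. It is not the authors' exact double-threshold model: the regime rule, parameter values, and variable names are illustrative assumptions.

```python
import numpy as np

def double_threshold_gjr_garch_variance(r_local, r_us, params_by_regime):
    """Conditional-variance recursion for a two-regime GJR-GARCH(1,1).

    The regime at time t is chosen by the signs of the lagged local and
    U.S. returns (a crude stand-in for a double-threshold rule):
    regime 0 if both lagged returns are non-negative, regime 1 otherwise.
    params_by_regime[k] = (omega, alpha, gamma, beta) for regime k.
    """
    T = len(r_local)
    sigma2 = np.empty(T)
    sigma2[0] = np.var(r_local)                    # simple initialisation
    for t in range(1, T):
        regime = 0 if (r_local[t - 1] >= 0 and r_us[t - 1] >= 0) else 1
        omega, alpha, gamma, beta = params_by_regime[regime]
        neg = 1.0 if r_local[t - 1] < 0 else 0.0   # GJR leverage indicator
        sigma2[t] = (omega
                     + (alpha + gamma * neg) * r_local[t - 1] ** 2
                     + beta * sigma2[t - 1])
    return sigma2

# Toy usage with made-up returns and parameters
rng = np.random.default_rng(0)
r_local = rng.normal(0, 1, 500)
r_us = rng.normal(0, 1, 500)
params = {0: (0.05, 0.03, 0.08, 0.90), 1: (0.10, 0.05, 0.12, 0.85)}
sig2 = double_threshold_gjr_garch_variance(r_local, r_us, params)
print(sig2[:5])
```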

2.
This study proposes a threshold realized generalized autoregressive conditional heteroscedastic (GARCH) model that jointly models daily returns and realized volatility, thereby taking into account the bias and asymmetry of realized volatility. We incorporate this threshold realized GARCH model with skew Student-t innovations as the observation equation, view this model as a sharp transition model, and treat the realized volatility as a proxy for volatility under this nonlinear structure. Through the Bayesian Markov chain Monte Carlo method, the model can jointly estimate the parameters in the return equation, the volatility equation, and the measurement equation. As an illustration, we conduct a simulation study and apply the proposed method to the US and Japan stock markets. Based on quantile forecasting and volatility estimation, we find that the threshold heteroscedastic framework with realized volatility successfully models the asymmetric dynamic structure. We also investigate the predictive ability of volatility by comparing the proposed model with the traditional GARCH model as well as some popular asymmetric GARCH and realized GARCH models. This threshold realized GARCH model with skew Student-t innovations outperforms the competing risk models in out-of-sample volatility and Value-at-Risk forecasting.

3.
One of the issues contributing to the success of any extreme value modeling is the choice of the number of upper order statistics used for inference, or equivalently, the selection of an appropriate threshold. In this paper we propose a Bayesian predictive approach to the peaks over threshold method with the purpose of estimating extreme quantiles beyond the range of the data. In the peaks over threshold (POT) method, we assume that the threshold identifies a model with a specified prior probability, from a set of possible models. For each model, the predictive distribution of a future excess over the corresponding threshold is computed, as well as a conditional estimate for the corresponding tail probability. The unconditional tail probability for a given future extreme observation from the unknown distribution is then obtained as an average of the conditional tail estimates with weights given by the posterior probability of each model.
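A stripped-down numerical sketch of a peaks-over-threshold tail estimate averaged over candidate thresholds is given below. It uses maximum-likelihood GPD fits and an unweighted average in place of the posterior model probabilities described in the abstract, so the weighting scheme and all numerical choices here are placeholders.

```python
import numpy as np
from scipy.stats import genpareto

def pot_tail_prob(data, thresholds, x0):
    """Peaks-over-threshold estimate of P(X > x0), averaged over candidate
    thresholds.  For each threshold u, a generalized Pareto distribution is
    fitted to the exceedances and the tail probability is
    P(X > u) * P(GPD excess > x0 - u).  A plain average over thresholds is
    used here as a stand-in for posterior model weights."""
    data = np.asarray(data)
    estimates = []
    for u in thresholds:
        exc = data[data > u] - u
        if exc.size < 20:                  # too few exceedances for a stable fit
            continue
        c, _, scale = genpareto.fit(exc, floc=0)
        p_u = exc.size / data.size         # empirical exceedance probability
        estimates.append(p_u * genpareto.sf(x0 - u, c, loc=0, scale=scale))
    return float(np.mean(estimates))

rng = np.random.default_rng(1)
x = rng.standard_t(df=4, size=5000)        # heavy-tailed toy data
u_candidates = np.quantile(x, [0.90, 0.95, 0.975])
print(pot_tail_prob(x, u_candidates, x0=6.0))
```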

4.
In this paper, we propose some algorithms for the simulation of the distribution of certain diffusions conditioned on a terminal point. We prove that the conditional distribution is absolutely continuous with respect to the distribution of another diffusion which is easy to simulate, and the formula for the density is given explicitly. An example of parameter estimation for a Duffing–Van der Pol oscillator is given as an application.

5.
The accurate estimation of outstanding liabilities of an insurance company is an essential task, not only to meet regulatory requirements but also to achieve efficient internal capital management. Over recent years, there has been increasing interest in utilising insurance data at a more granular level and in modelling claims using stochastic processes. So far, this so-called ‘micro-level reserving’ approach has mainly focused on the Poisson process. In this paper, we propose and apply a Cox process approach to model the arrival process and reporting pattern of insurance claims. This allows for over-dispersion and serial dependency in claim counts, which are typical features in real data. We explicitly consider risk exposure and reporting delays, and show how to use our model to predict the numbers of Incurred-But-Not-Reported (IBNR) claims. The model is calibrated and illustrated using real data from the AUSI data set.
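The over-dispersion and reporting-delay mechanics mentioned above can be illustrated with a toy simulation. The gamma-mixed daily intensity, the exponential delay, and all numbers below are assumptions chosen for illustration only, not the paper's Cox specification or the AUSI calibration.

```python
import numpy as np

rng = np.random.default_rng(6)

# A simple Cox (doubly stochastic Poisson) claim-arrival model: the daily
# intensity is itself random (gamma distributed with mean one around a base
# rate), which produces over-dispersed claim counts.
n_days, base_rate = 365, 5.0
daily_intensity = base_rate * rng.gamma(shape=2.0, scale=0.5, size=n_days)
daily_counts = rng.poisson(daily_intensity)

# Attach an occurrence day and a reporting delay (here exponential, mean 30
# days) to every claim, then split into reported and IBNR at a valuation date.
occurrence = np.repeat(np.arange(n_days), daily_counts)
delay = rng.exponential(scale=30.0, size=occurrence.size)
valuation_day = n_days
reported = occurrence + delay <= valuation_day
print("reported:", reported.sum(), "IBNR:", (~reported).sum())
print("dispersion of daily counts:", daily_counts.var() / daily_counts.mean())
```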

6.
The gamma distribution arises frequently in Bayesian models, but there is not an easy-to-use conjugate prior for the shape parameter of a gamma. This inconvenience is usually dealt with by using either Metropolis–Hastings moves, rejection sampling methods, or numerical integration. However, in models with a large number of shape parameters, these existing methods are slower or more complicated than one would like, making them burdensome in practice. It turns out that the full conditional distribution of the gamma shape parameter is well approximated by a gamma distribution, even for small sample sizes, when the prior on the shape parameter is also a gamma distribution. This article introduces a quick and easy algorithm for finding a gamma distribution that approximates the full conditional distribution of the shape parameter. We empirically demonstrate the speed and accuracy of the approximation across a wide range of conditions. If exactness is required, the approximation can be used as a proposal distribution for Metropolis–Hastings. Supplementary material for this article is available online.
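A minimal sketch of approximating the shape parameter's full conditional by a gamma density is given below, assuming i.i.d. Gamma(a, rate) data with the rate held fixed and a Gamma(a0, b0) prior on the shape a. It matches the mode and curvature of the log full conditional (a Laplace-style construction), which is in the spirit of, but not necessarily identical to, the article's algorithm; as the abstract notes, the resulting Gamma(A, B) could also serve as a Metropolis–Hastings proposal.

```python
import numpy as np
from scipy.special import digamma, polygamma
from scipy.optimize import brentq

def gamma_approx_shape_conditional(x, rate, a0=1.0, b0=1.0):
    """Approximate the full conditional of a gamma shape parameter `a`
    (data x_i ~ Gamma(a, rate), prior a ~ Gamma(a0, b0)) by a Gamma(A, B)
    density, matching the mode and the curvature of the log density at the
    mode."""
    x = np.asarray(x)
    n, s = len(x), np.sum(np.log(x))
    c = n * np.log(rate) + s - b0          # constant part of d/da log p(a|...)

    def dlogp(a):                          # derivative of log full conditional
        return c - n * digamma(a) + (a0 - 1.0) / a

    def d2logp(a):                         # second derivative
        return -n * polygamma(1, a) - (a0 - 1.0) / a ** 2

    lo, hi = 1e-8, 1.0
    while dlogp(hi) > 0:                   # expand until the bracket covers the mode
        hi *= 2.0
    mode = brentq(dlogp, lo, hi)
    # For Gamma(A, B): mode = (A - 1)/B and curvature at the mode = -(A - 1)/mode^2
    A = 1.0 - d2logp(mode) * mode ** 2
    B = (A - 1.0) / mode
    return A, B

rng = np.random.default_rng(2)
data = rng.gamma(shape=3.0, scale=1.0 / 2.0, size=50)   # true shape 3, rate 2
print(gamma_approx_shape_conditional(data, rate=2.0))
```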

7.
The “leapfrog” hybrid Monte Carlo algorithm is a simple and effective MCMC method for fitting Bayesian generalized linear models with canonical link. The algorithm leads to large trajectories over the posterior and a rapidly mixing Markov chain, and has superior performance over conventional methods in difficult problems such as logistic regression with quasicomplete separation. This method offers a very attractive solution to this common problem, providing a means of identifying datasets that are quasicompletely separated and the covariates that are at the root of the problem. The method is also quite successful in fitting generalized linear models in which the link function is extended to include a feedforward neural network. With a large number of hidden units, however, or when the dataset becomes large, the computations required in calculating the gradient in each trajectory can become very demanding. In this case, it is best to mix the algorithm with multivariate random walk Metropolis–Hastings, which entails very little additional programming work.
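Below is a compact sketch of a leapfrog hybrid (Hamiltonian) Monte Carlo sampler for Bayesian logistic regression with independent normal priors; the step size, trajectory length, and prior variance are illustrative choices rather than the tuning used in the article.

```python
import numpy as np

def hmc_logistic(X, y, n_iter=2000, eps=0.05, n_leapfrog=20, prior_var=100.0, seed=0):
    """Leapfrog hybrid (Hamiltonian) Monte Carlo for Bayesian logistic
    regression with an independent N(0, prior_var) prior on each coefficient."""
    rng = np.random.default_rng(seed)
    n, p = X.shape

    def neg_log_post(beta):                      # potential energy U(beta)
        eta = X @ beta
        return np.sum(np.logaddexp(0.0, eta) - y * eta) + 0.5 * beta @ beta / prior_var

    def grad(beta):                              # gradient of U
        prob = 1.0 / (1.0 + np.exp(-(X @ beta)))
        return X.T @ (prob - y) + beta / prior_var

    beta = np.zeros(p)
    samples = np.empty((n_iter, p))
    for it in range(n_iter):
        mom = rng.normal(size=p)                 # fresh momentum
        beta_new, mom_new = beta.copy(), mom.copy()
        mom_new -= 0.5 * eps * grad(beta_new)    # half step for momentum
        for _ in range(n_leapfrog - 1):
            beta_new += eps * mom_new            # full step for position
            mom_new -= eps * grad(beta_new)      # full step for momentum
        beta_new += eps * mom_new
        mom_new -= 0.5 * eps * grad(beta_new)    # final half step
        # Metropolis accept/reject on the total energy
        h_old = neg_log_post(beta) + 0.5 * mom @ mom
        h_new = neg_log_post(beta_new) + 0.5 * mom_new @ mom_new
        if np.log(rng.uniform()) < h_old - h_new:
            beta = beta_new
        samples[it] = beta
    return samples

rng = np.random.default_rng(3)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
true_beta = np.array([-0.5, 1.0, -2.0])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ true_beta)))
draws = hmc_logistic(X, y)
print(draws[1000:].mean(axis=0))                 # posterior means after burn-in
```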

8.
We discuss the modeling and fitting of count data that exhibit overdispersion. Considering the causes of overdispersion and the structures of several commonly used candidate models, we present a Bayesian method for simultaneous model and variable selection in the case of nested models, and a method for model checking and comparison in the case of non-nested models. Building on these, we further develop a relatively systematic and complete procedure for model and variable selection. A real-data example illustrates the implementation and effectiveness of the method.

9.
This paper aims to provide a practical example of assessment and propagation of input uncertainty for option pricing when using tree-based methods. Input uncertainty is propagated into output uncertainty, reflecting that option prices are as unknown as the inputs they are based on. Option pricing formulas are tools whose validity is conditional not only on how closely the model represents reality, but also on the quality of the inputs they use, and those inputs are usually not observable. We show three different approaches to integrating out the model nuisance parameters and show how this translates into model uncertainty in the tree model space for the theoretical option prices. We compare our method with classical calibration-based results, assuming that there is no established options market and no statistical model linking inputs and outputs. These methods can be applied to the pricing of instruments for which there is no options market, and also serve as a methodological tool to account for parameter and model uncertainty in theoretical option pricing. Copyright © 2008 John Wiley & Sons, Ltd.
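The following sketch shows the basic mechanism of propagating input uncertainty through a tree model: a Cox–Ross–Rubinstein binomial tree prices a European call, and uncertain volatility and interest-rate inputs are drawn from illustrative distributions to produce a distribution of theoretical prices. The priors, instrument, and parameter values are assumptions for demonstration, not those of the paper.

```python
import numpy as np
from scipy.stats import binom

def crr_european_call(S0, K, T, r, sigma, n_steps=200):
    """Cox-Ross-Rubinstein binomial-tree price of a European call."""
    dt = T / n_steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    q = (np.exp(r * dt) - d) / (u - d)            # risk-neutral up probability
    j = np.arange(n_steps + 1)
    terminal = S0 * u ** j * d ** (n_steps - j)   # terminal stock prices
    payoff = np.maximum(terminal - K, 0.0)
    # discounted expectation under the binomial risk-neutral measure
    return np.exp(-r * T) * np.dot(binom.pmf(j, n_steps, q), payoff)

# Propagate uncertainty about the inputs into the price: draw volatility and
# rate from illustrative distributions and look at the induced price spread.
rng = np.random.default_rng(4)
sigmas = rng.lognormal(mean=np.log(0.2), sigma=0.15, size=2000)   # uncertain vol
rates = rng.normal(0.03, 0.005, size=2000)                        # uncertain rate
prices = np.array([crr_european_call(100, 105, 0.5, r, s)
                   for r, s in zip(rates, sigmas)])
print(prices.mean(), np.quantile(prices, [0.05, 0.95]))
```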

10.
The modified mixture model with Markov switching volatility specification is introduced to analyze the relationship between stock return volatility and trading volume. We propose to construct an algorithm based on Markov chain Monte Carlo simulation methods to estimate all the parameters in the model using a Bayesian approach. The series of returns and trading volume of the British Petroleum stock will be analyzed. Copyright © 2009 John Wiley & Sons, Ltd.

11.
We describe algorithms for estimating a given measure π known up to a constant of proportionality, based on a large class of diffusions (extending the Langevin model) for which π is invariant. We show that under weak conditions one can choose from this class in such a way that the diffusions converge at exponential rate to π, and one can even ensure that convergence is independent of the starting point of the algorithm. When convergence is less than exponential we show that it is often polynomial at verifiable rates. We then consider methods of discretizing the diffusion in time, and find methods which inherit the convergence rates of the continuous time process. These contrast with the behavior of the naive or Euler discretization, which can behave badly even in simple cases. Our results are described in detail in one dimension only, although extensions to higher dimensions are also briefly described.
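As a one-dimensional illustration of discretizing such a diffusion, the sketch below implements the Metropolis-adjusted Langevin algorithm (MALA): an Euler step of the Langevin dynamics serves as a proposal, and a Metropolis correction keeps the target exactly invariant, avoiding the pathologies that the naive Euler discretization alone can exhibit. The target and step size are illustrative.

```python
import numpy as np

def mala_1d(log_pi, grad_log_pi, x0=0.0, step=0.1, n_iter=10000, seed=0):
    """Metropolis-adjusted Langevin algorithm in one dimension: an Euler
    discretisation of the Langevin diffusion is used as a proposal and is
    corrected by a Metropolis accept/reject step, so the target pi stays
    exactly invariant."""
    rng = np.random.default_rng(seed)
    x = x0
    out = np.empty(n_iter)
    for t in range(n_iter):
        mean_fwd = x + 0.5 * step * grad_log_pi(x)
        prop = mean_fwd + np.sqrt(step) * rng.normal()
        mean_bwd = prop + 0.5 * step * grad_log_pi(prop)
        # log of the proposal density ratio q(x | prop) / q(prop | x)
        log_q_ratio = (-(x - mean_bwd) ** 2 + (prop - mean_fwd) ** 2) / (2 * step)
        if np.log(rng.uniform()) < log_pi(prop) - log_pi(x) + log_q_ratio:
            x = prop
        out[t] = x
    return out

# Example target: a standard normal, with log pi known only up to a constant
samples = mala_1d(lambda x: -0.5 * x ** 2, lambda x: -x, step=0.5)
print(samples.mean(), samples.var())
```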

12.
We consider a network of sensors that measure the intensities of a complex plume composed of multiple absorption–diffusion source components. We address the problem of estimating the plume parameters, including the spatial and temporal source origins and the parameters of the diffusion model for each source, based on a sequence of sensor measurements. The approach not only leads to multiple-source detection, but also to the characterization and prediction of the combined plume in space and time. The parameter estimation is formulated as a Bayesian inference problem, and the solution is obtained using a Markov chain Monte Carlo algorithm. The approach is applied to a simulation study, which shows that accurate parameter estimation is achievable. Copyright © 2010 John Wiley & Sons, Ltd.

13.
To understand and predict chronological dependence in the second-order moments of asset returns, this paper considers a multivariate hysteretic autoregressive (HAR) model with generalized autoregressive conditional heteroskedasticity (GARCH) specification and time-varying correlations, providing a new method to describe the nonlinear dynamic structure of the target time series. The hysteresis variable governs the nonlinear dynamics of the proposed model, in which the regime switch can be delayed if the hysteresis variable lies in a hysteresis zone. The proposed setup combines three useful model components for modeling economic and financial data: (1) the multivariate HAR model, (2) the multivariate hysteretic volatility models, and (3) a dynamic conditional correlation structure. This research further incorporates an adapted multivariate Student t innovation, based on a scale mixture of normals representation, in the HAR model to allow for dependence and differently shaped innovation components. The study obtains bivariate volatilities, Value at Risk, and marginal expected shortfall through a Bayesian sampling scheme using adaptive Markov chain Monte Carlo (MCMC) methods, which allows all unknown model parameters and forecasts to be estimated simultaneously. Lastly, the proposed methods are illustrated with both simulated and real examples that help to jointly measure industry downside tail risk.

14.
We consider problems whose mathematical model is determined by a Markov chain that terminates with probability one; the goal is to estimate linear functionals of the solution to an integral equation of the second kind with the corresponding substochastic kernel and free term [1]. To construct weighted modifications of numerical statistical models, we supplement the coordinates of the phase space with auxiliary variables whose random values functionally define the transitions in the initial chain. Having implemented each auxiliary random variable, we multiply the weight by the ratio of the corresponding densities of the initial and numerically modeled distributions. We solve the minimization problem for the variances of estimators of linear functionals by choosing the modeled distribution of the first auxiliary random variable.
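A minimal finite-state sketch of the weighted terminating-chain estimator is shown below: it estimates one component of the solution of x = A x + f by simulating a chain whose transitions follow a modeled matrix P and multiplying the weight by A[i, j] / P[i, j] after each transition. The general phase-space setting with auxiliary variables in the abstract is reduced here to a small matrix example.

```python
import numpy as np

def weighted_walk_estimate(A, f, i0, P=None, n_walks=20000, seed=0):
    """Monte Carlo estimate of the i0-th component of the solution of
    x = A x + f (spectral radius of A below one) via a terminating Markov
    chain.  Transitions are drawn from the modeled (sub)stochastic matrix P;
    after each transition the weight is multiplied by A[i, j] / P[i, j], and
    the chain terminates with the leftover probability 1 - sum_j P[i, j]."""
    rng = np.random.default_rng(seed)
    A, f = np.asarray(A, float), np.asarray(f, float)
    n = len(f)
    if P is None:
        P = A.copy()                       # "analogue" simulation: P = A
    total = 0.0
    for _ in range(n_walks):
        i, w, acc = i0, 1.0, 0.0
        while True:
            acc += w * f[i]                # collision-estimator contribution
            stop_prob = 1.0 - P[i].sum()
            if rng.uniform() < stop_prob:
                break
            # next state j is reached with unconditional probability P[i, j]
            j = rng.choice(n, p=P[i] / P[i].sum())
            w *= A[i, j] / P[i, j]
            i = j
        total += acc
    return total / n_walks

A = np.array([[0.2, 0.3], [0.1, 0.4]])
f = np.array([1.0, 2.0])
print(weighted_walk_estimate(A, f, i0=0))
print(np.linalg.solve(np.eye(2) - A, f)[0])    # exact value for comparison
```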

15.
In this paper, we give an even wider, new class of minimax estimators for the location vector of an elliptical distribution (a scale mixture of normal densities) with an unknown scale parameter. Its application to variance reduction in Monte Carlo simulation when control variates are used is considered. The results obtained thus extend (i) Berger's result concerning minimax estimation of location vectors for scale mixtures of normal densities with known scale parameter and (ii) Strawderman's result on the estimation of the normal mean with common unknown variance. Research partially supported by the National Science Foundation, Grant #DMS 8901922.
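For orientation, the sketch below implements the classical James–Stein-type shrinkage estimator for a multivariate normal mean with an unknown common variance (the kind of estimator the abstract generalizes) and compares its risk with the unbiased estimator by simulation. The dimensions and parameter values are illustrative assumptions.

```python
import numpy as np

def james_stein_unknown_variance(x, s2, m):
    """James-Stein-type shrinkage of a p-dimensional normal mean with unknown
    variance: x ~ N_p(theta, sigma^2 I) and s2 is an independent variance
    estimate with m degrees of freedom (m * s2 / sigma^2 ~ chi^2_m).
    Shrinkage factor (p - 2) * m * s2 / ((m + 2) * ||x||^2), truncated at one
    (positive-part version)."""
    p = len(x)
    shrink = (p - 2) * m * s2 / ((m + 2) * np.sum(x ** 2))
    return (1.0 - min(shrink, 1.0)) * x

# Small risk comparison against the unbiased estimator x itself
rng = np.random.default_rng(5)
p, m, sigma2 = 8, 10, 2.0
theta = np.zeros(p)                       # shrinkage gains are largest near the target
mse_js, mse_mle = 0.0, 0.0
for _ in range(5000):
    x = rng.normal(theta, np.sqrt(sigma2))
    s2 = sigma2 * rng.chisquare(m) / m    # independent variance estimate
    mse_js += np.sum((james_stein_unknown_variance(x, s2, m) - theta) ** 2)
    mse_mle += np.sum((x - theta) ** 2)
print(mse_js / 5000, mse_mle / 5000)
```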

16.
The Leontief input-output model is generalized and formulated as a generalized linear complementarity problem. Conditions for existence of solutions are given, and solution techniques are reviewed. An application of the model to choosing new technologies is suggested.
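The classical Leontief system that the abstract generalizes can be sketched as follows: gross output x solves x = A x + d, which has a non-negative solution whenever the technology matrix A is productive (spectral radius below one). The generalized linear complementarity formulation itself is not shown in this sketch.

```python
import numpy as np

def leontief_output(A, d):
    """Gross output x solving the Leontief system x = A x + d, where A is the
    matrix of technical (input) coefficients and d is final demand.  The
    economy is productive when the spectral radius of A is below one, in
    which case (I - A) is invertible with a non-negative inverse."""
    A = np.asarray(A, dtype=float)
    if np.max(np.abs(np.linalg.eigvals(A))) >= 1.0:
        raise ValueError("input-coefficient matrix is not productive")
    return np.linalg.solve(np.eye(A.shape[0]) - A, d)

A = [[0.2, 0.3], [0.1, 0.4]]     # toy two-sector technology
d = [10.0, 20.0]
print(leontief_output(A, d))
```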

17.
This work deals with log-symmetric regression models, which are particularly useful when the response variable is continuous, strictly positive, and follows an asymmetric distribution, with the possibility of modeling atypical observations by means of robust estimation. In these regression models, the distribution of the random errors is a member of the log-symmetric family, which comprises the log-contaminated-normal, log-hyperbolic, log-normal, log-power-exponential, log-slash and log-Student-t distributions, among others. One way to select the best family member in log-symmetric regression models is to use information criteria. In this paper, we formulate log-symmetric regression models and conduct a Monte Carlo simulation study to investigate the accuracy of popular information criteria, such as Akaike, Bayesian, and Hannan-Quinn, and their respective corrected versions, in choosing adequate log-symmetric regression models. As a business application, a movie data set assembled by the authors is analyzed to compare and obtain the best possible log-symmetric regression model for box offices. The results provide relevant information on model selection criteria for log-symmetric regressions and for the movie industry. Economic implications of our study are discussed after the numerical illustrations.

18.
We analyze a semiparametric model for data that suffer from the problems of sample selection, where some of the data are observed for only part of the sample with a probability that depends on a selection equation, and of endogeneity, where a covariate is correlated with the disturbance term. The introduction of nonparametric functions in the model permits great flexibility in the way covariates affect response variables. We present an efficient Bayesian method for the analysis of such models that allows us to consider general systems of outcome variables and endogenous regressors that are continuous, binary, censored, or ordered. Estimation is by Markov chain Monte Carlo (MCMC) methods. The algorithm we propose does not require simulation of the outcomes that are missing due to the selection mechanism, which reduces the computational load and improves the mixing of the MCMC chain. The approach is applied to a model of women's labor force participation and log-wage determination. Data and computer code used in this article are available online.

19.
We calibrate and contrast the recent generalized multinomial logit model and the widely used latent class logit model approaches for studying heterogeneity in consumer purchases. We estimate the parameters of the models on panel data of household ketchup purchases, and find that the generalized multinomial logit model outperforms the best-fitting latent class logit model in terms of the Bayesian information criterion. We compare the posterior estimates of coefficients for individual customers based on the two different models and discuss how the differences could affect marketing strategies (such as pricing) derived from each of the models. We also describe extensions to the scale heterogeneity model that include the effects of state dependence and purchase history. Copyright © 2011 John Wiley & Sons, Ltd.

20.
A prey-predator model is formulated and the global behavior of its solutions is analyzed. In this model, the carrying capacity of the predator depends on the amount of its prey, and a Holling II functional response is involved. The model may have four classes of positive equilibria as well as a limit cycle. A positive equilibrium may be stable, a saddle-node, a saddle, or a degenerate singular point. In the alpine meadow ecosystem, the dynamics of vegetation and plateau pika can be described by this model. Through simulation with virtual parameters, the causes of alpine meadow degradation and effective recovery strategies are investigated. Increasing the grazing rate or decreasing plateau pika mortality may cause alpine meadow degradation. Correspondingly, reducing the grazing rate and increasing plateau pika mortality may recover the degraded alpine meadow effectively.
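A rough numerical sketch of this kind of system is given below: vegetation grows logistically, is removed by livestock grazing and by a Holling II pika functional response, and the pika population follows logistic growth with a carrying capacity proportional to the available vegetation. The functional forms and parameter values are illustrative assumptions, not the paper's exact equations.

```python
import numpy as np
from scipy.integrate import solve_ivp

def vegetation_pika(t, y, r, K, a, h, grazing, s, c, mortality):
    """Illustrative prey-predator system: vegetation V with logistic growth,
    livestock grazing, and Holling II consumption by plateau pika P, whose
    carrying capacity c*V depends on the available vegetation."""
    V, P = y
    dV = r * V * (1 - V / K) - a * V * P / (1 + a * h * V) - grazing * V
    dP = s * P * (1 - P / (c * V)) - mortality * P
    return [dV, dP]

# parameters: r, K, a, h, grazing, s, c, mortality (all made up for the demo)
params = (1.0, 100.0, 0.02, 0.5, 0.1, 0.5, 0.3, 0.2)
sol = solve_ivp(vegetation_pika, (0, 200), [60.0, 10.0], args=params)
print(sol.y[:, -1])      # long-run vegetation and pika densities
```

Varying the grazing and mortality entries of `params` and re-running the integration is a simple way to mimic the degradation and recovery scenarios discussed in the abstract.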
