Similar Literature
20 similar documents found (search time: 142 ms)
1.
Numerous researchers have applied the martingale approach for models driven by Lévy processes to study optimal investment problems. This paper considers an insurer who wants to maximize the expected utility of terminal wealth by selecting optimal investment and proportional reinsurance strategies. The insurer's risk process is modeled by a Lévy process and the capital can be invested in a security market described by the standard Black–Scholes model. By the martingale approach, closed-form solutions to the expected-utility-maximization problems are derived. Numerical examples are presented to show the impact of model parameters on the optimal strategies.
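For orientation, the pure-investment side of such problems in a Black–Scholes market has a classical closed form for CRRA (power) utility: the optimal fraction of wealth held in the risky asset is the Merton proportion. The sketch below is a generic illustration of that standard result, not the paper's Lévy-driven reinsurance solution; all parameter values are hypothetical.

```python
def merton_fraction(mu, r, sigma, gamma):
    """Optimal constant fraction of wealth in the risky asset for CRRA
    utility: (mu - r) / (gamma * sigma**2), where mu is the risky drift,
    r the risk-free rate, sigma the volatility, gamma the risk aversion."""
    return (mu - r) / (gamma * sigma ** 2)

# e.g. 8% drift, 2% risk-free rate, 20% volatility, risk aversion 2
print(merton_fraction(0.08, 0.02, 0.20, 2.0))  # about 0.75
```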

2.
3.
Ina Schmidt, PAMM 2009, 9(1): 409–410
Carbon nanotubes are increasingly gaining importance as a reinforcing material for polymer-based nanocomposites. Hence, new modeling strategies are necessary to calculate the behavior of these materials. In recent years some attempts have been made using and developing classical micromechanical models; on the other hand, numerical homogenization methods are available to tackle this problem. Examples of both types of modeling strategies are presented with a focus on the nanotube geometry. The nanotubes are modeled as hollow tubes as well as isotropic and transversely isotropic cylinders. As expected, the results of numerical and analytical methods are identical for isotropic cylinder inclusions. Small deviations occur for transversely isotropic cylinders in the transverse direction. In the case of hollow-tube inclusions, the analytical models lead to lower stiffness values in the transverse direction and for shear. The largest deviations occur for longitudinal shear, with magnitudes smaller than 10%. In contrast, the effort to obtain numerical results is enormous, so the analytical models remain useful. (© 2009 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim)

4.
Advantages and limitations of the existing models for practical forecasting of stock market volatility are identified. The support vector machine (SVM) is proposed as a complementary volatility model capable of extracting information from multiscale and high-dimensional market data. Results presented for the S&P 500 index suggest that the SVM can efficiently work with high-dimensional inputs to account for volatility long memory and multiscale effects, and is often superior to mainstream volatility models. An SVM-based framework for volatility forecasting is expected to be important in the development of novel strategies for volatility trading, advanced risk management systems, and other applications dealing with multiscale and high-dimensional market data.
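A minimal sketch of the kind of multiscale input such a model would consume: rolling realized volatilities at several horizons, stacked into one feature vector per day. The window lengths below are illustrative, and the SVM fitting step itself is omitted.

```python
import math

def realized_vol(returns, window):
    """Rolling standard deviation of returns over the trailing `window` days."""
    out = []
    for t in range(window, len(returns) + 1):
        chunk = returns[t - window:t]
        m = sum(chunk) / window
        var = sum((r - m) ** 2 for r in chunk) / window
        out.append(math.sqrt(var))
    return out

def multiscale_features(returns, windows=(5, 21, 63)):
    """One feature vector per day: realized volatility at each horizon,
    aligned so every window is fully available."""
    vols = {w: realized_vol(returns, w) for w in windows}
    start = max(windows)  # first day where the longest window is available
    n = len(returns) - start + 1
    return [[vols[w][i + start - w] for w in windows] for i in range(n)]
```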

5.
Agents’ behavior in oligopolistic markets has traditionally been represented by equilibrium models. Recently, several approaches based on conjectural variations equilibrium models have been proposed for representing agents’ behavior in electrical power markets. These models provide insight into the sensitivity of market equilibrium to agents’ strategies and external variables, and are therefore widely applied. Unfortunately, not enough analysis has been done on how these user-supplied parameters, the conjectural variations, should be estimated. This paper proposes a two-stage parameter inference procedure. The first stage infers historical values of the parameter by fitting the models’ results to historical market data. The second stage is based on a statistical time-series model whose objective is to forecast parameter values in future scenarios. Results of applying this procedure to a real-size case are also presented.

6.
Drilling optimization problems in oilfields are usually formulated and solved using deterministic mathematical models, in which uncertain (indeterminate) factors or random issues are not taken into consideration. However, it has been widely observed that random factors (such as those arising from soil layers, drill bits, and surface equipment) greatly affect drilling performance. This paper introduces a new stochastic model for describing such random effects. This model, when used in optimization design, is more practical and provides a better characterization of real oilfield situations than deterministic models, and has been demonstrated to be more efficient in solving real design problems of drilling optimization.

7.
Deterministic mine planning models along a time horizon have proved to be very effective in supporting decisions on sequencing the extraction of material in copper mines. Some of these models have been developed for, and used successfully by, CODELCO, the Chilean state copper company. In this paper, we consider the uncertainty in a very volatile parameter of the problem, namely, the copper price along a given time horizon. We represent the uncertainty by a multistage scenario tree. The resulting stochastic model is then converted into a mixed 0–1 Deterministic Equivalent Model using a compact representation. We first introduce the stochastic model that maximizes the expected profit along the time horizon over all scenarios (i.e., as in a risk-neutral environment). We then present several approaches for risk management in a risk-averse environment. Specifically, we consider the maximization of the Value-at-Risk and several variants of the Conditional Value-at-Risk (one of them new), the maximization of the expected profit minus the weighted probability of having an undesirable scenario in the solution provided by the model, and the maximization of the expected profit subject to stochastic dominance constraints with integer recourse for a set of profiles given by pairs of target profits and bounds on either the probability of failure or the expected profit shortfall. We present extensive computational experience on the actual problem, comparing the risk-neutral approach, the tested risk-averse strategies, and the performance of the traditional deterministic approach that uses the expected value of the uncertain parameters. The results clearly show the advantage of using the risk-neutral strategy over the traditional deterministic approach, as well as the advantage of using any risk-averse strategy over the risk-neutral one.
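For reference, the empirical Value-at-Risk and Conditional Value-at-Risk over a finite scenario set can be sketched in a few lines. This is a generic illustration of the two risk measures named above, not the paper's mixed 0–1 model; the scenario losses are hypothetical.

```python
import math

def var_cvar(losses, alpha=0.95):
    """Empirical VaR and CVaR of a list of scenario losses.

    VaR is the alpha-quantile of the loss distribution; CVaR is the
    average loss at or beyond the VaR (the expected tail loss)."""
    s = sorted(losses)
    k = math.ceil(alpha * len(s)) - 1  # index of the alpha-quantile
    var = s[k]
    tail = s[k:]
    return var, sum(tail) / len(tail)
```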

8.
We present a framework for sequential decision making in problems described by graphical models. The setting is given by dependent discrete random variables with associated costs or revenues. In our examples, the dependent variables are the potential outcomes (oil, gas, or dry) when drilling a petroleum well. The goal is to develop an optimal well-selection strategy that incorporates a chosen utility function within an approximate dynamic programming scheme. We propose and compare different approximations, from naive and myopic heuristics to more complex look-ahead schemes, and discuss their computational properties. We apply these strategies to oil exploration over multiple prospects modeled by a directed acyclic graph, and to a reservoir drilling decision problem modeled by a Markov random field. The results show that the suggested strategies clearly improve on the naive or myopic constructions used in the petroleum industry today. This is useful for decision makers planning petroleum exploration policies.

9.
10.
Verification, validation and testing (VVT) of large systems is an important but complex process. The decisions involved have to consider, on the one hand, the controllable variables associated with investments in appraisal and prevention activities and, on the other hand, the outcomes of these decisions, which are associated with risk impacts and system failures. Typically, quantitative models of such large systems use simulation to generate distributions of possible costs and risk outcomes. Here, by assuming independence of risk impacts, we decompose the decision process into separate decisions for each VVT activity and supersede the simulation technique with simple analytical models. We explore various optimization objectives for VVT strategies, such as minimum total expected cost and minimum uncertainty, as well as a generalized optimization objective expressing Taguchi's expected loss function, and provide explicit solutions. A numerical example based on simplified data from a case study demonstrates the proposed VVT optimization procedure.
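Under the independence assumption, the expected-cost objective decomposes so that each VVT decision reduces to a one-line comparison: perform an activity iff its cost is less than the expected risk it removes. A minimal sketch (activity names and figures hypothetical):

```python
def select_vvt_activities(activities):
    """Per-activity expected-cost rule under independent risk impacts.

    `activities` is a list of (name, cost, p_fail, impact) tuples, where
    p_fail * impact is the expected loss the activity would prevent.
    An activity is worth performing iff cost < p_fail * impact."""
    return [name for name, cost, p_fail, impact in activities
            if cost < p_fail * impact]

# cheap inspection pays off; the expensive audit does not
print(select_vvt_activities([("inspect", 5, 0.1, 100),
                             ("audit", 20, 0.1, 100)]))
```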

11.
This paper considers a stochastic facility location problem in which multiple capacitated facilities serve customers with a single product, and a stockout probabilistic requirement is stated as a chance constraint. Customer demand is assumed to be uncertain and to follow either a normal or an ambiguous distribution. We study robust approximations to the problem in order to incorporate information about the random demand distribution in the best possible, computationally tractable way. We also discuss how a decision maker’s risk preferences can be incorporated in the problem through robust optimization. Finally, we present numerical experiments that illustrate the performance of the different robust formulations. Robust optimization strategies for facility location appear to have better worst-case performance than nonrobust strategies. They also outperform nonrobust strategies in terms of realized average total cost when the actual demand distributions have higher expected values than the expected values used as input to the optimization models.
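In the normal-demand case, a single-facility stockout chance constraint P(demand ≤ capacity) ≥ α has the familiar deterministic equivalent capacity = μ + z_α·σ. A minimal sketch of that reformulation (numbers hypothetical; the paper's multi-facility robust formulations are more involved):

```python
from statistics import NormalDist

def required_capacity(mu, sigma, service_level):
    """Smallest capacity c with P(demand <= c) >= service_level when demand
    is Normal(mu, sigma): c = mu + z * sigma, with z the standard-normal
    quantile of the service level."""
    z = NormalDist().inv_cdf(service_level)
    return mu + z * sigma

# mean demand 100, std 20, 95% service level -> roughly 132.9
print(required_capacity(100.0, 20.0, 0.95))
```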

12.
Computationally efficient methods for updating seemingly unrelated regressions models with new observations are proposed. A recursive algorithm to solve a series of updating problems is developed. The algorithm is based on orthogonal transformations and uses the updated generalized QR decomposition (UGQRD) as its main computational tool. Strategies to compute the orthogonal factorizations by exploiting the block-sparse structure of the matrices are designed. The problems of adding and deleting exogenous variables from the seemingly unrelated regressions model are also investigated; their solution utilizes the strategies for computing the UGQRD.
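The core of row-wise QR updating — folding a new observation into an existing triangular factor with Givens rotations — can be sketched in a few lines. This is a generic dense illustration; the paper's UGQRD additionally exploits the block-sparse structure of the SUR matrices.

```python
import math

def givens(a, b):
    """Rotation (c, s) with [c s; -s c] applied to (a, b) giving (r, 0)."""
    r = math.hypot(a, b)
    if r == 0.0:
        return 1.0, 0.0
    return a / r, b / r

def update_r(R, row):
    """Fold one new observation `row` into the k-by-k triangular factor R.

    Applies a Givens rotation per column to zero out the appended row;
    afterwards R is the triangular factor of the data matrix with the
    new row included (Q is not tracked in this sketch)."""
    k = len(R)
    row = list(row)
    for i in range(k):
        c, s = givens(R[i][i], row[i])
        for j in range(i, k):
            ri, xi = R[i][j], row[j]
            R[i][j] = c * ri + s * xi
            row[j] = -s * ri + c * xi
    return R
```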

13.
A fair comparison of randomization strategies
Many randomized response strategies are now available for sample surveys on sensitive questions, and most of them improve efficiency by choosing appropriate design parameters. However, even when two schemes have the same design-parameter values, they may provide different degrees of privacy protection to individuals. Some earlier efficiency comparisons in the literature were therefore not based on equal degrees of privacy protection, making those comparisons unfair. This paper supplements the theory in this respect: the existing strategies are compared exactly under equal degrees of privacy protection, and the results show that these earlier randomization strategies must be re-evaluated.

14.
Transient-state gas and oil-based mud (OBM) two-phase flow in wellbore annuli will occur during a gas kick. The phase behavior of influx gas and OBM makes a gas kick during OBM drilling more complicated. There are three possible cases in an annulus: liquid-only flow in the entire annulus, gas–liquid two-phase flow in part of the annulus, and gas–liquid two-phase flow in the entire annulus. First, the phase behaviors of gas and OBM in wellbore annuli are studied based on the phase behavior of methane and diesel. A multiphase transient-flow model in annuli during a gas kick with OBM is then established based on gas–liquid two-phase flow theory and flash theory in annuli. The influences of phase behavior in annuli and annular geometry are taken into account. The local flow parameters are predicted by hydrodynamic models and the local thermodynamic parameters by heat-transfer models in the corresponding flow pattern. The proposed model performs better against the published experimental data than two other models. Finally, the variations of pit gain, bottom-hole pressure, and gas void fraction are obtained, leading to a better understanding of the occurrence and evolution mechanism of gas kicks during deepwater drilling.

15.
Extreme value theory has been widely used in analyzing catastrophic risk. The theory states that the generalized Pareto distribution (GPD) can be used to estimate the limiting distribution of excesses over a certain threshold; thus the tail behavior is analyzed. However, the central behavior is also important, because it may affect the estimation of model parameters in the GPD, and the evaluation of catastrophe insurance premiums also depends on the central behavior. This paper proposes four mixture models for earthquake catastrophic loss and Bayesian approaches to estimate the unknown parameters and the threshold in these mixture models. MCMC methods are used to calculate the Bayesian estimates of model parameters, and deviance information criterion values are obtained for model comparison. The earthquake loss of Yunnan province is analyzed to illustrate the proposed methods. The results show that the estimates of the threshold and of the shape and scale of the GPD differ considerably across models. Value-at-risk and expected shortfall for the proposed mixture models are calculated under different confidence levels.
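Given a fitted GPD over a threshold u, the standard extreme-value formulas for VaR and expected shortfall are short enough to state directly. The sketch below uses hypothetical parameter values, not the paper's Bayesian estimates, and assumes shape xi in (0, 1).

```python
def gpd_var_es(u, xi, beta, p_u, q):
    """VaR and expected shortfall at confidence q from a GPD(xi, beta)
    fitted to exceedances over threshold u, with p_u = P(loss > u).

    VaR_q = u + (beta/xi) * (((1-q)/p_u)**(-xi) - 1)
    ES_q  = VaR_q/(1-xi) + (beta - xi*u)/(1-xi)   (requires xi < 1)"""
    var = u + (beta / xi) * (((1 - q) / p_u) ** (-xi) - 1)
    es = var / (1 - xi) + (beta - xi * u) / (1 - xi)
    return var, es

# hypothetical fit: threshold 10, shape 0.5, scale 1, 5% exceedance rate
print(gpd_var_es(10.0, 0.5, 1.0, 0.05, 0.99))
```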

16.
In collective decision making, actors can use different influence strategies to get their way. Differences in influence strategies may, or may not, be connected to differences in collective outcomes. This research studies two influence strategies: the exchange strategy and the challenge strategy. In the existing literature, these strategies are analyzed and compared using simulation models in which actor behavior regarding influence attempts based on one of the strategies is modeled explicitly. Until now, these models have been tested only empirically, on limited data sets. However, a theoretical test is necessary to gain more precise insight into the effect of characteristics of collective decision-making situations on the collective outcomes. In the present research, computer simulations are used in a structured comparison of two competing models (the iterative exchange model and the challenge model). The analyses show that the outcomes of both models are captured to a large extent by the actor characteristics on the issues. Besides this, the expected directions of challenges and exchanges play a major part in explaining the outcomes of the models. This research shows that the use of simulated data allows a structured search of the input space, which led to new insights into the iterative exchange model and the challenge model, and therefore into the exchange strategy and the challenge strategy.

17.
In two recent papers, the authors studied conditions on the relaxation parameters that guarantee the stability or instability of solutions for the Taylor approximations to dual-phase-lag and three-phase-lag heat conduction equations. However, for several limit cases of the parameters, the kind of stability was unclear. Here, we analyze these limit cases and clarify whether exponential or slow decay of the solutions can be expected. Moreover, rather general well-posedness results for three-phase-lag models are presented. Finally, the exponential stability expected from spectral analysis is rigorously proved in exemplary cases.

18.
In model-based analysis for comparative evaluation of strategies for disease treatment and management, the model of the disease is arguably the most critical element. A fundamental challenge in identifying model parameters arises from the limitations of available data, which challenges the ability to uniquely link model parameters to calibration targets. Consequently, the calibration of disease models leads to the discovery of multiple models that are similarly consistent with available data. This phenomenon is known as calibration uncertainty and its effect is transferred to the results of the analysis. Insufficient examination of the breadth of potential model parameters can create a false sense of confidence in the model recommendation, and ultimately cast doubt on the value of the analysis. This paper introduces a systematic approach to the examination of calibration uncertainty and its impact. We begin with a model of the calibration process as a constrained optimization problem and introduce the notion of plausible models which define the uncertainty region for model parameters. We illustrate the approach using a fictitious disease, and explore various methods for interpreting the outputs obtained.
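The notion of plausible models amounts to treating calibration as a feasibility problem rather than a single best fit: keep every parameter vector whose discrepancy against the calibration targets is within a tolerance. A minimal sketch, where the fit-error function, candidates, and tolerance are all hypothetical choices of the analyst:

```python
def plausible_models(candidates, fit_error, tol):
    """Return the uncertainty region: every candidate parameter vector
    whose calibration fit error is within tolerance, rather than only
    the single best-fitting vector."""
    return [theta for theta in candidates if fit_error(theta) <= tol]

# toy example: target value 4.0, model output theta**2, tolerance 0.5
candidates = [1.8, 1.9, 2.0, 2.1, 2.2]
err = lambda theta: abs(theta ** 2 - 4.0)
print(plausible_models(candidates, err, 0.5))  # [1.9, 2.0, 2.1]
```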

19.
This paper investigates a stochastic differential game for DC (defined contribution) pension plans under a Vasicek stochastic interest rate, with the financial market as the hypothetical opponent and the pension investor as the leader of the game. The goal is to obtain, through the game between the pension investor and the financial market, optimal strategies that maximize the expected utility of terminal wealth. Under a power utility function, closed-form solutions for the value function as well as the strategies are obtained using stochastic control theory. Finally, the results are interpreted in economic terms, and numerical calculations illustrate the influence of some parameters on the optimal strategies.

20.
A new Lee–Carter model parameterization is introduced with two advantages. First, the Lee–Carter parameters are normalized such that they have a direct and intuitive interpretation, comparable across populations. Second, the model is stated in terms of the “needed-exposure” (NE). The NE is the number of exposures required to produce one expected death and is closely related to the “needed-to-treat” measure used to communicate risks and benefits of medical treatments. In the new parameterization, the time parameters are directly interpretable as an overall across-age NE. The age parameters are interpretable as age-specific elasticities: percentage changes in the NE at a particular age in response to a percent change in the overall NE. A similar approach can be used to confer interpretability on the parameters of other mortality models.
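The NE itself is just the reciprocal of the central death rate, so under the classical Lee–Carter form log m(x,t) = a_x + b_x·k_t it can be computed directly. A sketch with hypothetical parameter values; the paper's normalization of the parameters differs from this classical form.

```python
import math

def needed_exposure(a_x, b_x, k_t):
    """Needed exposure NE = 1 / m(x,t): person-years of exposure per one
    expected death, under the classical Lee-Carter model
    log m(x,t) = a_x + b_x * k_t."""
    return math.exp(-(a_x + b_x * k_t))

# mortality rate 1% with no period effect -> about 100 people
# of exposure for one expected death
print(needed_exposure(math.log(0.01), 0.1, 0.0))
```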
