Similar Literature
20 similar records found (search time: 62 ms)
1.
The strategy-based approach to portfolio margining has been used for margining customer accounts for more than four decades. The risk-based approach was proposed in the mid-eighties for margining some inventory accounts of brokers, but was permitted for margining customer accounts only in 2005. This paper presents a computational experiment comparing the strategy-based and risk-based approaches, with the purpose of clarifying which one yields lower margin requirements under different scenarios. There is a widespread opinion, cf. (Reuters 2007; Longo 2007; Smith 2008), that the risk-based approach always wins this competition, and that the strategy-based approach must therefore be disqualified as outdated. However, the results of our experiment with portfolios of stock options show that, in many practical situations, the strategy-based approach yields substantially lower margin requirements than the risk-based approach.

2.
In this study, a new approach is developed to solve the initial value problem for interval linear differential equations. In the problem considered, the coefficients and the initial values are constant intervals. In the developed approach, there is no need to define a derivative for interval-valued functions: all derivatives used are classical derivatives of real functions, because the solution of the problem is defined as a bunch of real functions. Such a solution concept is also compatible with the robust stability concept. Sufficient conditions are provided for the solution to be expressible analytically. In addition, in a numerical example, the solution obtained by the proposed approach is compared with the solution obtained under generalized Hukuhara differentiability, and it is shown that the proposed approach gives a new type of solution. The main advantage of the proposed approach is that the solution to the considered interval initial value problem exists and is unique, as in the real case.

3.
We propose a new approach to portfolio optimization that separates asset return distributions into positive and negative half-spaces. The approach minimizes a newly defined Partitioned Value-at-Risk (PVaR) risk measure using half-space statistical information. On simulated data, the PVaR approach always generates better risk–return tradeoffs in the optimal portfolios than the traditional Markowitz mean–variance approach. On real financial data, our approach also outperforms the Markowitz approach in the risk–return tradeoff. Given that PVaR is also a robust risk measure, the new approach can be very useful for optimal portfolio allocation when asset return distributions are asymmetric.
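The Markowitz mean–variance baseline that this abstract compares against can be sketched in closed form for the fully invested minimum-variance portfolio (the PVaR measure itself is specific to the paper and is not reproduced here; the covariance matrix below is illustrative):

```python
import numpy as np

def min_variance_weights(cov):
    """Closed-form unconstrained minimum-variance portfolio:
    w = C^{-1} 1 / (1' C^{-1} 1), so the weights sum to one."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

# Illustrative 3-asset covariance matrix (made-up numbers)
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = min_variance_weights(cov)
print(w, float(w @ cov @ w))
```

By construction this portfolio has variance no larger than any other fully invested portfolio, e.g. the equal-weight one.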

4.
This paper presents a mathematical-programming-based clustering approach applied to a digital platform company's customer segmentation problem involving demographic and transactional attributes of the customers. The clustering problem is formulated as a mixed-integer programming problem with the objective of minimizing the maximum cluster diameter among all clusters. To overcome the computational complexity of the problem, we developed a heuristic approach that improves computation times dramatically without compromising optimality in most of the cases we tested. The performance of this approach is tested on a real problem. The analysis of our results indicates that our approach is computationally efficient and creates a meaningful segmentation of the data.
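The objective this abstract formulates, the maximum cluster diameter, can be evaluated for a given assignment with a short helper (the points and labels below are made up for illustration):

```python
import itertools
import math

def max_cluster_diameter(points, labels):
    """Objective from the abstract: the largest pairwise distance
    between two points assigned to the same cluster."""
    best = 0.0
    for (p, lp), (q, lq) in itertools.combinations(zip(points, labels), 2):
        if lp == lq:
            best = max(best, math.dist(p, q))
    return best

pts = [(0, 0), (1, 0), (10, 0), (11, 0)]
print(max_cluster_diameter(pts, [0, 0, 1, 1]))  # -> 1.0 (good split)
print(max_cluster_diameter(pts, [0, 1, 0, 1]))  # -> 10.0 (bad split)
```

A min-max-diameter formulation prefers the first assignment, which is what the MIP in the abstract optimizes exactly.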

5.
This paper proposes an approach for making the inter-related decisions that constitute a process plan for a series of dual-head placement machines. The goal is to provide a means of rapidly prescribing a process plan that minimizes cycle time (equivalently, maximizes throughput rate) by balancing the workloads assigned to heads when assembling a given type of circuit card. Our approach decomposes process planning into four related problems. This paper explores list-processing heuristics for the first of these problems and adapts an optimizing method for the fourth. It also integrates these two methods with others that optimize the second and third problems, reporting computational tests that evaluate the overall approach to all four problems and assessing how well the decomposition balances the assigned workloads and the run time required to do so. Resulting cycle times are also analyzed. These test results demonstrate that the overall approach prescribes effective process plans within a run time acceptable to process planners.

6.
Finite mixture models are used to fit data from heterogeneous populations in many applications. The Expectation-Maximization (EM) algorithm is the most popular method for estimating the parameters of a finite mixture model; a Bayesian approach is another. However, the EM algorithm often converges to a local maximum and is sensitive to the choice of starting points, while in the Bayesian approach the Markov chain Monte Carlo (MCMC) sampler sometimes becomes trapped in a local mode and finds it difficult to move to another. Hence, in this paper we propose a new method that addresses this limitation of the EM algorithm, so that EM can estimate the parameters in the global maximum region, and a more effective Bayesian approach, so that the MCMC chain moves from one mode to another more easily in the mixture model. Our approach combines simulated annealing (SA) with adaptive rejection Metropolis sampling (ARMS). Although SA is a well-known approach for detecting distinct modes, its limitation is the difficulty of choosing sequences of proper proposal distributions for a target distribution. Since ARMS uses a piecewise linear envelope function as a proposal distribution, we incorporate ARMS into SA so that we can start from a more proper proposal distribution and detect separate modes. As a result, we can detect the global maximum region and estimate the parameters in it. We refer to this approach as ARMS annealing. Combining ARMS annealing with the EM algorithm and with the Bayesian approach, respectively, yields two approaches: an EM-ARMS annealing algorithm and a Bayesian-ARMS annealing approach. We compare these with the traditional EM algorithm alone and the Bayesian approach alone using simulation, showing that our two approaches are comparable to each other but perform better than either baseline. Both detect the global maximum region well and estimate the parameters in this region. We demonstrate the advantage of our approaches on a mixture of two Poisson regression models, used to analyze survey data on the number of charitable donations.
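The vanilla EM baseline that this abstract seeks to improve can be sketched for a simple two-component Poisson mixture (not the Poisson regression mixture of the paper; the data and starting values below are made up, and the ARMS annealing refinement is not included):

```python
import math

def poisson_logpmf(k, lam):
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def em_poisson_mixture(data, lam, w, iters=200):
    """Plain EM for a two-component Poisson mixture."""
    for _ in range(iters):
        # E-step: posterior responsibility of component 0 for each point
        resp = []
        for k in data:
            p0 = w[0] * math.exp(poisson_logpmf(k, lam[0]))
            p1 = w[1] * math.exp(poisson_logpmf(k, lam[1]))
            resp.append(p0 / (p0 + p1))
        # M-step: update mixing weights and Poisson rates
        n0 = sum(resp)
        w = [n0 / len(data), 1 - n0 / len(data)]
        lam = [sum(r * k for r, k in zip(resp, data)) / n0,
               sum((1 - r) * k for r, k in zip(resp, data)) / (len(data) - n0)]
    return lam, w

# Made-up counts drawn from two well-separated regimes
data = [1, 2, 1, 0, 2, 1, 3, 2, 10, 12, 9, 11, 10, 13, 8, 12]
lam, w = em_poisson_mixture(data, lam=[2.0, 8.0], w=[0.5, 0.5])
print(lam, w)
```

With well-separated components EM converges to the group means; the sensitivity to `lam` starting values is exactly the failure mode the ARMS annealing of the paper targets.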

7.
We explore an approach to possibilistic fuzzy clustering that avoids a severe drawback of the conventional approach, namely that the objective function is truly minimized only if all cluster centers are identical. Our approach is based on the idea that this undesired property can be avoided by introducing a mutual repulsion between the clusters, so that they are forced away from each other. We develop this approach for the possibilistic fuzzy c-means algorithm and the Gustafson–Kessel algorithm. In our experiments we found that in this way we can combine the partitioning property of the probabilistic fuzzy c-means algorithm with the advantages of a possibilistic approach with respect to the interpretation of the membership degrees.
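The probabilistic fuzzy c-means algorithm whose partitioning property the abstract wants to retain can be sketched as follows (this is the standard baseline only; the paper's repulsion term is not included, and the data are made up):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Standard probabilistic fuzzy c-means: memberships per point
    sum to one, centers are membership-weighted means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), c, replace=False)]
    for _ in range(iters):
        # distances of every point to every center (small epsilon avoids 0)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))       # unnormalised memberships
        u /= u.sum(axis=1, keepdims=True)    # probabilistic constraint
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
    return centers, u

X = np.array([[0.0, 0.0], [0.2, 0.0], [5.0, 0.0], [5.2, 0.0]])
centers, u = fuzzy_c_means(X)
print(np.sort(centers[:, 0]))
```

The possibilistic variant drops the row-sum-to-one constraint, which is what allows centers to collapse and what the repulsion term in the paper counteracts.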

8.
In this exploratory study, we examined the effects of a quantitative reasoning instructional approach to linear equations in two variables on community college students’ conceptual understanding, procedural fluency, and reasoning ability. This was done in comparison to the use of a traditional procedural approach for instruction on the same topic. Data were gathered from a common unit assessment that included procedural and conceptual questions. Results demonstrate that small changes in instruction focused on quantitative reasoning can lead to significant differences in students’ ability to demonstrate conceptual understanding compared to a procedural approach. The results also indicate that a quantitative reasoning approach does not appear to diminish students’ procedural skills, but that additional work is needed to understand how to best support students’ understanding of linear relationships.

9.
A dynamic programming approach is proposed to select optimally among a given set of products and allocate integer shelf-space units to the selected products in supermarkets. The approach is designed to consider general objective-function specifications that account for space elasticity, costs of sales, and potential demand-related marketing variables. The optimization is subject to constraints due to product supply availability, 'block' product allocation and operational requirements. A primary focus is on the development of a tractable model approach that can effectively be implemented on a microcomputer. A discussion of applications and computational experience on a microcomputer is provided to support the practical applicability of the optimization approach.
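The integer shelf-space allocation at the core of such a model can be illustrated with a textbook dynamic program (the profit table below is hypothetical and omits the paper's elasticity, supply, and 'block' constraints):

```python
def allocate_shelf_space(profit, total_units):
    """profit[i][s]: return from giving product i exactly s shelf units.
    best[u]: maximum total profit achievable with u units of shelf space."""
    best = [0.0] * (total_units + 1)
    for gains in profit:                       # one DP stage per product
        new = [float('-inf')] * (total_units + 1)
        for u in range(total_units + 1):
            for s, g in enumerate(gains[:u + 1]):
                new[u] = max(new[u], best[u - s] + g)
        best = new
    return best[total_units]

profit = [[0, 3, 5, 6],    # product A with 0..3 units (made-up returns)
          [0, 4, 6, 7]]    # product B with 0..3 units
print(allocate_shelf_space(profit, 4))  # -> 11.0 (A gets 2 units, B gets 2)
```

The diminishing marginal returns in each row are a crude stand-in for space elasticity; the DP runs stage by stage over products, which is what made the approach tractable on a microcomputer.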

10.
Applied Mathematical Modelling 2014, 38(7–8): 2141–2150
Zou et al. (2008) [21] presented the weighted-average of all possible choice values approach for soft sets under an incomplete information system in decision making. However, the approach is hard to understand and involves a great amount of computation. To simplify it, we use the simplified probability directly in place of the incomplete information, and demonstrate the equivalence between the weighted-average of all possible choice values approach and the simplified probability approach. Finally, comparison results show that the proposed approach involves relatively less computation and is easier to implement and understand than the weighted-average of all possible choice values approach.

11.
Risk Parity (RP), also called equally weighted risk contribution, is a recent approach to risk diversification for portfolio selection. RP is based on the principle that the fractions of the capital invested in each asset should be chosen so as to make the total risk contributions of all assets equal. We show here that the Risk Parity approach is theoretically dominated by a similar alternative approach that does not actually require equal risk contributions from all assets but only an equal upper bound on all such contributions. This alternative approach, called Equal Risk Bounding (ERB), requires the solution of a nonconvex quadratically constrained optimization problem. The ERB approach, while starting from different requirements, turns out to be strictly linked to the RP approach. Indeed, when short selling is allowed, we prove that an ERB portfolio is actually an RP portfolio with minimum variance. When short selling is not allowed, there is a unique RP portfolio and it contains all assets in the market. In this case, the ERB approach might lead to the RP portfolio, or it might lead to portfolios with smaller variance that do not contain all assets, in which the risk contribution of each included asset is strictly smaller than in the RP portfolio. We define a new riskiness index for assets that allows us to identify those assets that are more likely to be excluded from the ERB portfolio. With these tools we then provide an exact method for small nonconvex ERB models and a very efficient and accurate heuristic for larger problems of this type. In the case of a common constant pairwise correlation among all assets, a closed-form solution to the ERB model is obtained and used to perform a parametric analysis when varying the level of correlation. The practical advantages of the ERB approach over the RP strategy are illustrated with numerical examples. Computational experience on real-world and simulated data confirms the accuracy and efficiency of our heuristic approach to the ERB model, also in comparison with some state-of-the-art local and global optimization codes.
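The equal-risk-contribution condition that defines RP can be checked numerically. A minimal sketch, using an illustrative diagonal covariance matrix, where weights proportional to inverse volatility are known to equalize the contributions:

```python
import numpy as np

def risk_contributions(w, cov):
    """RC_i = w_i (Σw)_i / sqrt(w'Σw): asset i's share of portfolio
    volatility. RP requires all RC_i equal; ERB only imposes a common
    upper bound on them."""
    sigma = np.sqrt(w @ cov @ w)
    return w * (cov @ w) / sigma

vol = np.array([0.1, 0.2, 0.4])        # made-up asset volatilities
cov = np.diag(vol ** 2)                # uncorrelated assets
w = (1 / vol) / (1 / vol).sum()        # inverse-volatility weights
rc = risk_contributions(w, cov)
print(rc)
```

The contributions sum to the portfolio volatility by construction; with correlated assets the RP weights no longer have this closed form and must be found numerically, which is where the nonconvex ERB formulation enters.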

12.
In this article, we study an abstract constrained optimization problem that appears commonly in the optimal control of linear partial differential equations. The main emphasis of the present study is on the case when the ordering cone for the optimization problem has an empty interior. To circumvent this major difficulty, we propose a new conical regularization approach whose main idea is to replace the ordering cone by a family of dilating cones. We devise a general regularization approach and use it to give a detailed convergence analysis for the conical regularization as well as a related regularization approach. We show that the conical regularization approach leads to a family of optimization problems that admit regular multipliers. The approach remains valid in the setting of general Hilbert spaces and does not require any sort of compactness or positivity condition on the operators involved. One of its main advantages is that it is amenable to numerical computation. We consider four different examples, two of them elliptic control problems with state constraints, and present numerical results that fully support our theoretical results and confirm the numerical feasibility of our approach. The motivation for the conical regularization is to overcome the difficulties associated with the lack of a Slater-type constraint qualification, a common hurdle in numerous branches of applied mathematics including optimal control, inverse problems, vector optimization, set-valued optimization, sensitivity analysis, and variational inequalities, among others.

13.
Pengfei Liu & Tiande Guo, Optimization 2016, 65(8): 1641–1650
In 2004, Bertsimas and Sim proposed a robust approach that controls the degree of conservatism by imposing a limit Γ on the maximum number of parameters that are allowed to change. However, the robust approach can become extremely conservative even when Γ is relatively small. In this paper, we provide a theoretical analysis explaining why this extreme conservatism occurs. We further point out that the robust approach does not reach an extremely conservative state when Γ is less than k, where k is the number of nonzero components of the optimal solution of the extremely conservative robust approach. This research also shows that care must be taken when adjusting the value of Γ to control the degree of conservatism, because the approach may result in greater conservatism than was intended. We subsequently apply our analysis to additive combinatorial optimization problems. Finally, we illustrate our results with numerical simulations.
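For a fixed solution x, the Bertsimas–Sim protection term that produces this conservatism can be computed directly: it is the sum of the Γ largest deviation magnitudes |â_j x_j|, with a fractional share of the next one when Γ is not integral. A sketch with made-up deviation values:

```python
def protection(deviations, gamma):
    """Worst-case extra load on one constraint when at most gamma
    coefficients deviate: sum of the floor(gamma) largest deviations
    plus the fractional part of gamma times the next largest."""
    d = sorted(deviations, reverse=True)
    whole, frac = int(gamma), gamma - int(gamma)
    total = sum(d[:whole])
    if frac and whole < len(d):
        total += frac * d[whole]
    return total

# Illustrative deviation magnitudes |a_hat_j * x_j|
print(protection([5.0, 3.0, 2.0], 1.5))  # -> 6.5
print(protection([5.0, 3.0, 2.0], 3))    # -> 10.0
```

At Γ equal to the number of uncertain coefficients the protection reaches the fully conservative (Soyster-style) level, which is the extreme state the paper analyzes.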

14.
Based on the equilibrium efficient frontier data envelopment analysis (EEFDEA) approach, Fang (J Oper Res Soc 67:412–420, 2015a) developed an equivalent linear programming model to improve and strengthen the EEFDEA approach. Furthermore, Fang (2015a) indicated that his secondary goal approach can achieve a unique equilibrium efficient frontier. However, through a simple counterexample we demonstrate that Fang’s secondary goal approach cannot always achieve uniqueness of the equilibrium efficient frontier. In this paper, we propose an algorithm based on the secondary goal approach to address the problem. The proposed algorithm is proven mathematically to be an effective approach to guaranteeing the uniqueness of the equilibrium efficient frontier.

15.
In this paper, a novel approach, namely the linearization-based approach to the homotopy analysis method, is proposed to treat non-linear time-fractional PDEs analytically. The presented approach suggests a new optimized structure of the homotopy series solution based on a linear approximation of the non-linear problem. A comparative study between the proposed approach and the standard homotopy analysis approach is illustrated by solving two examples involving non-linear time-fractional parabolic PDEs. The numerical simulations performed demonstrate that the linearization-based approach reduces the computational complexity and improves the performance of the homotopy analysis method.

16.
This work presents a hybrid approach based on genetic algorithms to solve efficiently the problem of cutting structural beams arising in a local metalwork company. The problem belongs to the class of one-dimensional multiple-stock-size cutting stock problems. The proposed approach handles overproduction and underproduction of beams and embodies the reusability of remnants in the optimization process. Along with genetic algorithms, the approach incorporates other novel refinement algorithms based on different search and clustering strategies. Moreover, a new encoding with a variable number of genes is developed for cutting patterns in order to make the application of genetic operators possible. The approach is tested experimentally on a set of instances similar to those of the local metalwork company. In particular, comparative results show that the proposed approach substantially improves on the performance of previous heuristics.

17.
Banks and other financial institutions try to compute the total capital needed to absorb stochastically dependent losses from different risk types (e.g., credit risk and market risk). Two sophisticated procedures for this so-called integrated risk management are the top-down and the bottom-up approaches. When banks apply a more sophisticated risk integration approach at all, it is usually the top-down approach, in which copula functions are employed to link the marginal distributions of profits and losses resulting from different risk types. However, it is not at all clear how accurate this approach is. Assuming that the bottom-up approach corresponds to the real-world data-generating process and using a comprehensive simulation study, it is shown that the top-down approach can underestimate the necessary amount of total capital for lower credit qualities. Furthermore, the direction and strength of the stochastic dependence between the risk types, the copula function employed, and the loss definitions all have an impact on the performance of the top-down approach. In addition, a goodness-of-fit test shows that, based on time series of loss data of realistic length, it is rather difficult to decide which copula function is the right one.

18.
We analyse a new optimization-based approach to feature selection that uses the nested partitions method for combinatorial optimization as a heuristic search procedure to identify good feature subsets. In particular, we show how to improve the performance of the nested partitions method using random sampling of instances. The new approach uses a two-stage sampling scheme that determines the sample size required to guarantee convergence to a near-optimal solution, and therefore also has attractive theoretical characteristics: when the algorithm terminates in finite time, rigorous statements can be made concerning the quality of the final feature subset. Numerical results are reported to illustrate the key results and show that the new approach is considerably faster than the original nested partitions method and other feature selection methods.

19.
A two-warehouse inventory model for deteriorating items with time-dependent demand is developed. Compared with previous models, this model involves a free-form time-dependent demand and a finite replenishment rate within a finite planning horizon. Rather than the heuristic approach of equal production cycle times adopted by Lee and Ma, an approach that permits variation in production cycle times is adopted to determine the number of production cycles and the replenishment times during a finite planning horizon. Numerical examples illustrate the application of the model, and the results indicate that the performance of the proposed approach is superior to that of the heuristic approach of Lee and Ma.

20.
The paper presents results on the development of a new approach to proving existence theorems for generalized solutions to systems of quasilinear conservation laws. This approach is based on reducing the search for a generalized solution to analyzing the extremal properties of a certain set of functionals, and is referred to as a variational approach. The definition of a generalized solution can be naturally reformulated in terms of the existence of critical points for a set of functionals, which is convenient within the proposed approach. The variational representation of generalized solutions, earlier known for Hopf-type equations, is generalized to systems of quasilinear conservation laws. The extremal properties of the functionals corresponding to systems of conservation laws are described within the variational approach, and a strategy for proving the existence theorem is outlined. In conclusion, it is shown that the variational approach can be generalized to the two-dimensional case.
