Similar Articles
20 similar articles found.
1.
In this paper, I re-examine how mean–variance analysis is consistent with its traditional theoretical foundations, namely stochastic dominance and expected utility theory. Then I propose a simplified version of the coarse utility theory as a new foundation. I prove that, by assuming risk aversion and the normality of asset variables, the simplified model is well behaved: indifference curves are convex and the opportunity set is concave. Therefore, globally optimal portfolios exist in the market. Finally, I prove that decision-making in accordance with the simplified model is consistent with mean–variance analysis.
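As a schematic illustration of why normality plus risk aversion ties expected utility to a mean–variance criterion (a standard textbook example, not taken from the paper), consider CARA utility and normally distributed terminal wealth:

```latex
% For CARA utility u(w) = -e^{-aw} with risk aversion a > 0 and
% W ~ N(\mu, \sigma^2), the normal moment-generating function gives
\[
  \mathbb{E}[u(W)] \;=\; -\exp\!\Big(-a\mu + \tfrac{1}{2}a^{2}\sigma^{2}\Big),
\]
% which is increasing in \mu and decreasing in \sigma^2, so maximizing
% expected utility is equivalent to maximizing the mean--variance objective
\[
  \mu \;-\; \tfrac{a}{2}\,\sigma^{2}.
\]
```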

2.
This article reports a test of theories of payoff allocation in n-person game-theoretic systems. An experimental study was conducted to test the relative predictive accuracy of three solution concepts (imputation set, stable set, core) in the context of 4-person, 2-strategy non-sidepayment games. Predictions from each of the three solution concepts were computed on the basis of both α-effectiveness (von Neumann–Morgenstern) and β-effectiveness (Aumann), making a total of six predictive theories under test. Two important results emerged. First, the data show that the β-imputation set was more accurate than the α-imputation set, the β-stable set was more accurate than the α-stable set, and the β-core was more accurate than the α-core; in other words, for each of the solutions tested, the prediction from any solution concept based on β-effectiveness was more accurate than the prediction from the same solution based on α-effectiveness. Second, the β-core was the most accurate of the six theories tested. Results are interpreted as showing that β-effectiveness is superior to α-effectiveness as a basis for payoff predictions in cooperative non-sidepayment games.

3.
Non-uniform binary linear subdivision schemes with finite masks over uniform grids are studied. A Laurent polynomial representation is suggested and the basic operations required for smoothness analysis are presented. As an example it is shown that the interpolatory 4-point scheme is C^1 with an almost arbitrary non-uniform choice of the free parameter. This revised version was published online in June 2006 with corrections to the Cover Date.
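For orientation, a minimal sketch of one refinement step of the uniform interpolatory 4-point scheme, with w as the free tension parameter (w = 1/16 gives the classical scheme); this is only the uniform baseline, not the non-uniform construction analysed in the paper, and the function and variable names are our own:

```python
import numpy as np

def four_point_step(f, w=1/16):
    """One refinement step of the uniform interpolatory 4-point scheme.

    Old points are retained (the scheme is interpolatory); a new point is
    inserted into each interior interval [f[i], f[i+1]] using the stencil
    (-w, 1/2 + w, 1/2 + w, -w). The two end intervals are simply dropped.
    """
    f = np.asarray(f, dtype=float)
    # inserted values for the intervals between f[i] and f[i+1], i = 1..n-3
    mid = (0.5 + w) * (f[1:-2] + f[2:-1]) - w * (f[:-3] + f[3:])
    refined = np.empty(len(f) - 2 + len(mid))
    refined[0::2] = f[1:-1]   # retained (even-indexed) points
    refined[1::2] = mid       # inserted (odd-indexed) points
    return refined

coarse = np.sin(np.linspace(0.0, np.pi, 8))
fine = four_point_step(coarse)   # apply one subdivision level
print(len(coarse), "->", len(fine))
```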

4.
In this paper, we consider expected value, variance, and worst-case optimization of nonlinear models. We present algorithms for computing optimal expected value and variance policies, based on iterative Taylor expansions. We establish convergence and consider the relative merits of policies based on expected value optimization and on worst-case robustness. The latter is a minimax strategy and ensures optimal cover against the worst-case scenario(s), while the former gives optimal expected performance in a stochastic setting. Both approaches are used with a small macroeconomic model to illustrate relative performance, robustness, and trade-offs between the alternative policies.
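A toy comparison of the two criteria (assumed loss function and scenario set, not the paper's Taylor-expansion algorithm): the expected-value policy minimizes the average loss over scenarios, while the minimax policy minimizes the worst-case loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical loss: quadratic in the policy x with a scenario-dependent
# shock s; each entry of `scenarios` is one realisation of s.
scenarios = rng.normal(loc=1.0, scale=0.5, size=200)

def loss(x, s):
    return (x - s) ** 2 + 0.1 * x ** 2

grid = np.linspace(-1.0, 3.0, 401)                 # candidate policies
losses = np.array([[loss(x, s) for s in scenarios] for x in grid])

x_expected = grid[np.argmin(losses.mean(axis=1))]  # expected-value policy
x_minimax = grid[np.argmin(losses.max(axis=1))]    # worst-case (robust) policy

print(f"expected-value policy: {x_expected:.3f}")
print(f"minimax policy:        {x_minimax:.3f}")
```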

5.
Infrared spectroscopy has been used to study the effect of γ-radiation on the distribution of an applied stress over the individual polymer chains. It has been shown that, as the irradiation dose is increased, the maximum stress on the bonds which are ruptured is attained at a lower value of the mean stress applied to the sample. V. I. Lenin Tadzhik State University, Dushanbe. Translated from Mekhanika Polimerov, No. 2, pp. 214–217, March–April, 1975.

6.
After a discussion of what a non-mathematician is and what an applied mathematician is, aims that have been variously suggested are presented and discussed. Attention is drawn to the importance of an understanding of model building and of mathematics as the language of science, and this leads to a plea for co-operation between the mathematician and the non-mathematician. Reference is made to the relevance of the ‘New Mathematics’ and to the demand for mathematical rigour, and the paper closes with a brief discussion of the importance of detailed objectives.


7.
In this note, we address the problem of the existence and location of periodic solutions of nonlinear differential systems in 3-space. Our main motivation is the study, via bifurcation theory, of periodic solutions (especially limit cycles). We study this problem in two simple polynomial (chaotic) systems: the first, due to Muthuswamy and Chua, is the mathematical model of the simplest chaotic circuit, and the second is due to Sprott et al.

8.
This paper investigates the time-consistent dynamic mean–variance hedging of longevity risk with a longevity security contingent on a mortality index or the national mortality. Using an HJB framework, we solve the hedging problem in which insurance liabilities follow a doubly stochastic Poisson process with an intensity rate that is correlated and cointegrated with the index mortality rate. The derived closed-form optimal hedging policy articulates the important role of cointegration in longevity hedging. We show numerically that a time-consistent hedging policy is a smoother function of time when compared with its time-inconsistent counterpart.

9.
A new C interpolant is presented for the univariate Hermite interpolation problem. It differs from the classical solution in that the interpolant is of non-polynomial nature. Its basis functions are a set of simple, compactly supported, transcendental functions. The interpolant can be regarded as a truncated Multipoint Taylor series. It has essential singularities at the sample points, but is well behaved over the real axis and satisfies the given functional data. The interpolant converges to the underlying real-analytic function when (i) the number of derivatives at each point tends to infinity and the number of sample points remains finite, and when (ii) the spacing between sample points tends to zero and the number of specified derivatives at each sample point remains finite. A comparison is made between the numerical results achieved with the new method and those obtained with polynomial Hermite interpolation. In contrast with the classical polynomial solution, the new interpolant does not suffer from any ill conditioning, so it is always numerically stable. In addition, it is a much more computationally efficient method than the polynomial approach. This revised version was published online in June 2006 with corrections to the Cover Date.
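As a baseline for the comparison mentioned above, a sketch of classical Hermite interpolation from values and derivatives using SciPy; this piecewise-polynomial construction stands in for the classical comparison only and says nothing about the new transcendental interpolant:

```python
import numpy as np
from scipy.interpolate import BPoly

# Sample points and, at each point, [value, first derivative] of f(x) = sin(x).
xi = np.linspace(0.0, np.pi, 5)
yi = [[np.sin(x), np.cos(x)] for x in xi]

# Polynomial Hermite interpolant matching the prescribed data at each xi.
hermite = BPoly.from_derivatives(xi, yi)

xs = np.linspace(0.0, np.pi, 200)
err = np.max(np.abs(hermite(xs) - np.sin(xs)))
print(f"max interpolation error: {err:.2e}")
```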

10.
An initial test of the discrete-time Markov model in the study of educational aspirations throughout high school was carried out. The design of the study permitted testing for sex differences and order effects. The results indicate a good fit between the data and the model across several cohorts of students. Order effects were apparent, but sex differences in the transition probabilities were not found. Future change in aspirations appears least likely for students with a history of stable college plans, while it is most likely for those who start with non-college aspirations and change to college plans.
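A minimal sketch (with invented numbers) of the kind of two-state, discrete-time Markov model being tested, where the states are college and non-college plans and a fixed transition matrix is applied wave by wave:

```python
import numpy as np

# Hypothetical transition probabilities between aspiration states
# (rows: current state, columns: next state); states = [college, non-college].
P = np.array([[0.90, 0.10],
              [0.25, 0.75]])

p0 = np.array([0.60, 0.40])   # hypothetical initial distribution at wave 1

# Predicted distribution after k transitions under the Markov model; observed
# wave-by-wave proportions would be compared against these predictions.
for k in range(1, 4):
    pk = p0 @ np.linalg.matrix_power(P, k)
    print(f"wave {k + 1}: college = {pk[0]:.3f}, non-college = {pk[1]:.3f}")
```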

11.
Let X denote Figiel's reflexive Banach space, which is not isomorphic to its Cartesian square. We show that the K_0-group of the algebra of continuous linear operators on X contains a subgroup isomorphic to the group c_00(Q) of sequences (z_n) of rational numbers with z_n = 0 eventually.

12.
The equal division kernel and the α-power solution are two recently proposed theories of coalition formation in n-person characteristic function games with sidepayments. A closed-form solution for payoff disbursement is derived from the α-power model for n-person games in which only 1-person, (n−1)-person, and n-person coalitions are permissible and there are no weak players. It is shown that for this class of games, which are frequently employed to experimentally test theories of coalition formation, the equal division kernel is a special case of the α-power model with α = 1/2.

13.
The Deepwater Horizon oil spill was an ecologically devastating event in the Gulf of Mexico, with an estimated release of over 4 million barrels of oil over the three months the well flowed in 2010. The US Fish and Wildlife Service provided a data set of 7,229 bird records. We aimed to illustrate the important features of the data set using a blend of analytics and graphics executed through traditional and cloud-based software. The graphs show that the areas containing the greatest concentration of birds were the coasts adjacent to New Orleans and the Deepwater platform. Moreover, the results of the logistic regression confirm what is seen in the bar charts: the Laughing Gull and Brown Pelican, among many others, suffered the greatest mortality from the oil spill. Furthermore, additional investigation into the morbidity impact on birds over time indicates a lagging effect. A similar lag is present in the oiling of the birds, where a separate time series shows the oiled and not-visibly-oiled birds alternating in frequency over the collection period.
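A minimal sketch of the kind of logistic regression described above; the species names are real, but the counts, mortality rates, and column names are invented for illustration and do not come from the USFWS data set:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Toy stand-in for the bird records: made-up sample sizes and mortality rates.
species = rng.choice(
    ["Laughing Gull", "Brown Pelican", "Northern Gannet", "Royal Tern"],
    size=600, p=[0.4, 0.3, 0.2, 0.1])
base_rate = {"Laughing Gull": 0.7, "Brown Pelican": 0.6,
             "Northern Gannet": 0.4, "Royal Tern": 0.3}
dead = rng.random(600) < np.vectorize(base_rate.get)(species)

records = pd.DataFrame({"species": species, "dead": dead.astype(int)})

# Logistic regression of mortality status on species, mirroring the kind of
# model the abstract uses to identify the most heavily impacted species.
model = smf.logit("dead ~ C(species)", data=records).fit(disp=False)
print(model.params)
```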

14.
Mechanics of Composite Materials - The internal friction spectra λ = f(T) and the temperature dependence ν = f(T) of the frequency of free damping in the temperature range from...

15.
Arbitrage theory is used to price forward (futures) contracts in energy markets, where the underlying assets are non-tradeable. The method is based on the so-called ‘fitting of the yield curve’ technique from interest-rate theory. The Schwartz spot price dynamics are generalized to multidimensional correlated stochastic processes with Wiener and Lévy noise. The findings are illustrated with examples from the oil and electricity markets.
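For the one-factor special case that the paper generalizes, the Schwartz mean-reverting spot model admits a closed-form forward price; a sketch with made-up, risk-neutral parameters (the multidimensional Wiener–Lévy setting is not reproduced):

```python
import numpy as np

def schwartz_forward(spot, kappa, alpha, sigma, tau):
    """Forward price in the one-factor Schwartz mean-reverting model.

    The log-spot X = ln S follows dX = kappa * (alpha - X) dt + sigma dW;
    risk-neutral parameters are assumed, so no separate market price of
    risk appears.  F(tau) = E[S_tau] = exp(conditional mean + variance / 2).
    """
    x = np.log(spot)
    decay = np.exp(-kappa * tau)
    mean = decay * x + (1.0 - decay) * alpha
    var = sigma**2 / (2.0 * kappa) * (1.0 - np.exp(-2.0 * kappa * tau))
    return np.exp(mean + 0.5 * var)

# Illustrative (made-up) parameters for, say, an oil spot price of 80.
print(schwartz_forward(spot=80.0, kappa=1.5, alpha=np.log(75.0),
                       sigma=0.35, tau=0.5))
```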

16.

This paper considers the use of hand‐held calculators (HHCs) in schools from the pedagogical and sociological viewpoints. Using arguments based on the observed pattern of mathematical education, we discuss the effect of the use of HHCs, from the learning viewpoint and from the viewpoint of providing students with a ready competence in arithmetical manipulation. We shall also discuss the effects of the adoption of HHCs in undeveloped countries and for deprived minorities.

The topics are developed from philosophical/psychopedagogical reasoning and from developmental policy. Strategies designed to make mathematical education more immediately useful are discussed. An example is provided, and the relationship between Polya's problem-solving approach and the use of HHCs is examined.

17.
Mathematical Modelling, 1987, 8(2): 105–115
The simulation approach to policy analysis usually concentrates on policy multipliers as a measure of the thrust of economic policy. However, this measure is inadequate for one branch of economic policy, namely fiscal policy. The reason is that the effectiveness of fiscal policy depends, via the government budget constraint, on the method of finance. It is argued in this paper that for this very reason the conventional way of calculating simulation-based dynamic multipliers introduces a bias towards the no-crowding-out thesis. This bias arises even in models of monetarist persuasion. Furthermore, it is shown that this bias can be removed by using multipliers based on optimal control. We illustrate this proposition by providing numerical results using a large-scale U.K. econometric model of international monetarist persuasion (the London Business School model, LBS). Section 1 builds a framework through which policy optimization can be compared with and evaluated against policy simulations. In Section 2 we derive and compare policy multipliers obtained through policy simulations and optimal control. Section 3 provides a numerical example, with the findings summarized in Section 4.

18.
Transaction costs with respect to distribution and administration play a crucial role in the performance of participating life insurance products. The aim of this paper is to investigate the impact of such initial and annual transaction costs on policyholder mean–variance preferences depending on the contract features, comparing a point-to-point guarantee, a cliquet-style guarantee, and a money-back guarantee with an annual surplus component. We extend previous work by deriving analytical solutions for the maximum allowed initial transaction costs as well as the risk-aversion parameter that ensure a given customer preference level for different contract types. We further conduct simulation analyses to identify key factors with regard to transaction costs. One main finding is that in the present setting, insurers can indeed charge higher costs for more complex products with cliquet-style features, and that the difference in costs between the various product types increases considerably in a low interest-rate environment. However, these results are heavily impacted and even reversed depending on the risk–return asset characteristics, as insurers with a riskier asset management strategy may no longer be able to charge higher transaction costs for complex products with a strong annual cliquet-style surplus participation component without reducing their attractiveness to customers.
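Schematically, and with notation assumed rather than taken from the paper, the criterion behind the maximum allowed initial transaction costs can be written as a mean–variance preference over the terminal contract payoff:

```latex
% W_T(c): terminal payoff of a contract bought with initial transaction
% costs c; \lambda: policyholder risk aversion.  The policyholder's
% mean--variance preference is
\[
  V(c) \;=\; \mathbb{E}\!\left[W_T(c)\right]
        \;-\; \frac{\lambda}{2}\,\operatorname{Var}\!\left[W_T(c)\right],
\]
% and the maximum allowed initial costs c^{*} for a given reference
% preference level \bar{V} solve
\[
  V(c^{*}) \;=\; \bar{V}.
\]
```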

19.
Acta Mathematicae Applicatae Sinica, English Series - Given a graph G, the adjacency matrix and degree diagonal matrix of G are denoted by A(G) and D(G), respectively. In 2017, Nikiforov [24]...

20.
In the space L^p, 1 ≤ p < 2, on the half-line with power weight, Jackson's inequality between the value of the best approximation of a function by even entire functions of exponential type and its modulus of continuity defined by means of a generalized shift operator is well known. The question of the sharpness of the inequality remained open. For the constant in Jackson's inequality, we obtain a lower bound, which proves its sharpness.
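For orientation, a Jackson-type inequality of the kind referred to here has the schematic form below; the exact weight, the argument of the modulus, and the constants are the paper's and are not reproduced:

```latex
% E_\sigma(f)_p : best approximation of f in the weighted L^p-norm on the
% half-line by even entire functions of exponential type at most \sigma;
% \omega(f,\delta)_p : modulus of continuity defined via the generalized
% shift operator.  A Jackson-type inequality then reads
\[
  E_\sigma(f)_p \;\le\; C\,\omega\!\left(f,\frac{c}{\sigma}\right)_p,
  \qquad 1 \le p < 2,
\]
% and a lower bound on the admissible constant C shows that the known
% constant cannot be improved.
```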
