1.
《International Journal of Approximate Reasoning》2014,55(7):1570-1574
This short paper discusses the contributions made to the featured section on Low Quality Data. We further refine the distinction between the ontic and epistemic views of imprecise data in statistics. We also question the extent to which likelihood functions can be viewed as belief functions. Finally we comment on the data disambiguation effect of learning methods, relating it to data reconciliation problems.
2.
Marek Gągolewski, Przemysław Grzegorzewski 《International Journal of Approximate Reasoning》2011,52(9):1312-1324
A class of arity-monotonic aggregation operators, called impact functions, is proposed. This family of operators forms a theoretical framework for the so-called Producer Assessment Problem, which includes the scientometric task of fair and objective assessment of scientists using the number of citations received by their publications. The impact function output values are analyzed under right-censored and dynamically changing input data. The qualitative possibilistic approach is used to describe this kind of uncertainty. It leads to intuitive graphical interpretations and may be easily applied for practical purposes. The discourse is illustrated by a family of aggregation operators generalizing the well-known Ordered Weighted Maximum (OWMax) and the Hirsch h-index.
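A minimal sketch (not the authors' implementation) of the connection the abstract names: the Hirsch h-index arises as an Ordered Weighted Maximum with the integer weight sequence w_i = i applied to the decreasingly sorted citation counts.

```python
def owmax(x, w):
    """Ordered Weighted Maximum: max_i min(w_i, x_(i)), with x sorted decreasingly."""
    xs = sorted(x, reverse=True)
    return max(min(wi, xi) for wi, xi in zip(w, xs))

def h_index(citations):
    """h-index as an OWMax with weights w_i = i (i = 1, 2, ...)."""
    return owmax(citations, range(1, len(citations) + 1))

print(h_index([10, 8, 5, 4, 3]))  # -> 4: four papers have at least 4 citations each
```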
3.
4.
This paper develops univariate and multivariate measures of risk aversion for correlated risks. We derive Rubinstein's measures of risk aversion from the risk premiums with correlated random initial wealth and risk. It is shown that these measures are not only consistent with those for uncorrelated or independent risks, but also have the corresponding local properties of the Arrow-Pratt measures of risk aversion. Thus Rubinstein's measures of risk aversion are the appropriate extension of the Arrow-Pratt measures of risk aversion in the univariate case. We also derive a risk aversion matrix from the risk premiums with correlated initial wealth and risk vectors. This matrix measure is the multivariate version of Rubinstein's measures and is also the generalization of Duncan's results for non-random initial wealth. The univariate and multivariate measures of risk aversion developed in this paper are applied to portfolio theory in Li and Ziemba [15]. This research was partially supported by the National Research Council of Canada.
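For reference, a sketch of the standard quantities the abstract builds on (textbook definitions, not the paper's derivation):

```latex
% Arrow-Pratt absolute risk aversion at wealth w, with the classical
% small-risk premium approximation for an independent risk of variance \sigma^2:
\[
  r(w) = -\frac{u''(w)}{u'(w)}, \qquad \pi \approx \tfrac{1}{2}\,\sigma^{2}\, r(w).
\]
% Rubinstein's measure replaces non-random wealth w by random initial wealth:
\[
  R(\tilde{w}) = -\,\frac{E[u''(\tilde{w})]}{E[u'(\tilde{w})]},
\]
% which reduces to r(w) when \tilde{w} is degenerate at w.
```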
5.
In this study we present a planning methodology for a firm whose objective is to match the random supply of annual premium fruits and vegetables from a number of contracted farms and the random demand from the retailers during the planning period. The supply uncertainty is due to the uncertainty of the maturation time, harvest time, and yield. The demand uncertainty arises from the random weekly demand of the retailers. We provide a planning methodology to determine the farm areas and the seeding times for annual plants, which survive for only one growing season, in such a way that the expected total profit is maximized. Both the single-period and the multi-period cases are analyzed, depending on the type of the plant. The performance of the solution methodology is evaluated by using numerical experiments. These experiments show that the proposed methodology matches random supply and random demand in a very effective way and improves the expected profit substantially compared to planning approaches in which the uncertainties are not taken into consideration.
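A hypothetical single-period sketch of the matching problem the abstract describes: choose a planted area to maximize expected profit when both yield and demand are random. The parameter names, prices, and distributions are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)
price, cost = 3.0, 1.0                     # unit revenue and unit planting cost (assumed)

# Common random scenarios for yield and demand (illustrative distributions).
n = 100_000
yield_per_area = rng.gamma(shape=20.0, scale=0.05, size=n)   # mean 1.0 per unit area
demand = rng.normal(loc=50.0, scale=10.0, size=n).clip(min=0)

def expected_profit(area):
    sales = np.minimum(area * yield_per_area, demand)        # sell what supply and demand allow
    return np.mean(price * sales - cost * area)

areas = np.linspace(10.0, 120.0, 45)
best = max(areas, key=expected_profit)
print(f"best planted area ~ {best:.1f}, expected profit ~ {expected_profit(best):.1f}")
```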
6.
Teaching experiments with pairs of children have generated several hypotheses about students’ construction of fractions. For example, Steffe (2004) hypothesized that robust conceptions of improper fractions depend on the development of a splitting operation. Results from teaching experiments that rely on scheme theory and Steffe's hierarchy of fraction schemes imply additional hypotheses, such as the idea that the schemes do indeed form a hierarchy. Our study constitutes the first attempt to test these hypotheses and substantiate Steffe's claims using quantitative methods. We analyze data from 84 students’ performances on written tests, in order to measure students’ development of the splitting operation and construction of fraction schemes. Our findings align with many of the hypotheses implied by teaching experiments and, additionally, suggest that students’ construction of a partitive fraction scheme facilitates the development of splitting.
7.
The last few years have seen a significant increase in publicly available software specifically targeted to the analysis of extreme values. This reflects the increase in the use of extreme value methodology by the general statistical community. The software that is available for the analysis of extremes has evolved in essentially independent units, with most forming extensions of larger software environments. An inevitable consequence is that these units are spread about the statistical landscape. Scientists seeking to apply extreme value methods must spend considerable time and effort in determining whether the currently available software can be usefully applied to a given problem. We attempt to simplify this process by reviewing the current state, and suggest future approaches for software development. These suggestions aim to provide a basis for an initiative leading to the successful creation and distribution of a flexible and extensible set of tools for extreme value practitioners and researchers alike. In particular, we propose a collaborative framework for which cooperation between developers is of fundamental importance.
AMS 2000 Subject Classification: Primary 62P99
8.
9.
Data envelopment analysis for environmental assessment: Comparison between public and private ownership in petroleum industry
Environmental assessment has recently become a major policy issue around the world. This study discusses how to apply Data Envelopment Analysis (DEA) to environmental assessment. An important feature of DEA environmental assessment is that it needs to classify outputs into desirable (good) and undesirable (bad) outputs, because private and public entities often produce not only desirable but also undesirable outputs as a result of their production activities. This study proposes three types of unification for DEA environmental assessment using non-radial DEA models. The first unification considers both an increase and a decrease in the input vector along with a decrease in the direction vector of undesirable outputs. This type of unification measures “unified efficiency”. The second unification considers a decrease in the input vector along with a decrease in the vector of undesirable outputs. This type of unification is referred to as “natural disposability” and measures “unified efficiency under natural disposability”. The third unification considers an increase in the input vector but a decrease in the vector of undesirable outputs. This type of unification is referred to as “managerial disposability” and measures “unified efficiency under managerial disposability”. All the unifications increase the vector of desirable outputs. To document their practical implications, this study applies the proposed approach to compare the performance of national oil firms with that of international oil firms. This study identifies two important findings on the petroleum industry. The first is that national oil companies under public ownership outperform international oil companies under private ownership in terms of unified (operational and environmental) efficiency and unified efficiency under natural disposability. However, the performance of international oil companies exhibits an increasing trend in unified efficiency. The second is that national oil companies need to satisfy only the environmental standards of their own countries, while international oil companies need to satisfy international standards that are more restrictive than the national ones. As a consequence, international oil companies outperform national oil companies in terms of unified efficiency under managerial disposability.
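For orientation, a minimal sketch of a standard input-oriented CCR DEA model, not the paper's non-radial unified-efficiency models: the efficiency of DMU o is the optimal theta of min theta subject to X·lam <= theta·x_o, Y·lam >= y_o, lam >= 0. The toy data are made up.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """X: (m inputs, n DMUs), Y: (s outputs, n DMUs); returns theta for DMU o."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                   # variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[:, [o]], X])             # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])     # -Y @ lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

X = np.array([[2.0, 3.0, 6.0], [3.0, 1.0, 4.0]])  # toy inputs (2 inputs, 3 DMUs)
Y = np.array([[3.0, 3.0, 5.0]])                   # toy output (1 output, 3 DMUs)
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
```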
10.
In the renewal risk model, we study the asymptotic behavior of the expected time-integrated negative part of the process. This risk measure was introduced by Loisel (2005) [1]. Both heavy-tailed and light-tailed claim amount distributions are investigated. The time horizon may be finite or infinite. We apply the results to an optimal allocation problem with two lines of business of an insurance company. The asymptotic behavior of the two optimal initial reserves is computed.
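Written out for reference, the risk measure in question (as defined by Loisel, 2005): the expected time-integrated negative part of the surplus process (R_t) up to the horizon T, which may be infinite.

```latex
\[
  I(T) \;=\; E\!\left[\int_{0}^{T} \bigl(R_t\bigr)_{-}\,\mathrm{d}t\right],
  \qquad (x)_{-} = \max(-x,\,0).
\]
```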
11.
We analyze the factors that hinder university students' learning of probability theory and find that they fall into three categories: intuitive ability, the quality of probabilistic thinking, and psychological and affective factors. After analyzing each of these factors, we propose corresponding strategies for overcoming the obstacles, with the aim of fundamentally improving traditional teaching methods and renewing traditional teaching philosophy.
12.
Portfolio risk can be decomposed into two parts, the systematic risk and the nonsystematic risk. It is well known that the nonsystematic risk can be eliminated by diversification, while the systematic risk cannot. Thus, the portfolio risk, except for that of undiversified small portfolios, is always dominated by the systematic risk. In this paper, under the mean–variance framework, we propose a model for actively allocating the systematic risk in portfolio optimization, which can also be interpreted as a model of controlling risk sensitivity in portfolio selection. Although the resulting problem is, in general, a notoriously non-convex quadratically constrained quadratic program, the problem formulation has special structure due to the features of the defined marginal systematic risk contribution and the way the systematic risk is modeled via a factor model. By exploiting these special problem characteristics, we design an efficient and globally convergent branch-and-bound solution algorithm, based on a second-order cone relaxation. While an empirical study demonstrates that the proposed model is a preferred tool for active portfolio risk management, numerical experiments also show that the proposed solution method is more efficient than the commercial solver BARON.
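A small sketch of the decomposition the abstract relies on (standard factor-model algebra, not the paper's optimization model): with returns r = B f + e, Cov(f) = F, and diagonal Cov(e) = D, a portfolio w has total variance w'(B F B' + D)w, split into a systematic and a nonsystematic part. The data below are toy numbers.

```python
import numpy as np

rng = np.random.default_rng(1)
n_assets, n_factors = 6, 2
B = rng.normal(size=(n_assets, n_factors))        # factor loadings (toy data)
F = np.diag([0.04, 0.01])                         # factor covariance
D = np.diag(rng.uniform(0.001, 0.01, n_assets))   # idiosyncratic variances

w = np.full(n_assets, 1.0 / n_assets)             # equally weighted portfolio
systematic = w @ B @ F @ B.T @ w                  # w' B F B' w
nonsystematic = w @ D @ w                         # w' D w
print(f"systematic: {systematic:.5f}, nonsystematic: {nonsystematic:.5f}")
```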
13.
Disasters that occur everywhere in a highly disordered way indicate that disaster entropy has reached its maximum value. Under given constraint conditions, when disaster entropy is at its maximum, the disaster loss series should follow the Pearson type III (P-III) distribution. The recurrence interval of a disaster loss refers to the average time interval at which a disaster loss of a certain magnitude occurs in the future. Using field disaster data and the P-III distribution function, we can calculate the value of a future disaster loss with a given recurrence interval. Conceptually explicit and easy to use, the method is of significant practical value.
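An illustrative sketch (not the paper's procedure) of the calculation described: fit a Pearson type III distribution to a loss series and read off the loss with recurrence interval T, i.e. the (1 - 1/T) quantile. The loss data below are made up.

```python
import numpy as np
from scipy import stats

losses = np.array([12.0, 30.0, 8.0, 55.0, 21.0, 17.0, 40.0, 9.0, 26.0, 33.0])

# scipy's pearson3 is parameterized by (skew, loc, scale); fit() estimates all three.
skew, loc, scale = stats.pearson3.fit(losses)

T = 50.0                                   # 50-year recurrence interval
x_T = stats.pearson3.ppf(1.0 - 1.0 / T, skew, loc=loc, scale=scale)
print(f"loss with {T:.0f}-year recurrence interval ~ {x_T:.1f}")
```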
14.
15.
Mean-risk analysis of risk aversion and wealth effects on optimal portfolios with multiple investment opportunities
In this paper, we first define risk in an axiomatic way and a class of utility functions suitable for the so-called mean-risk analysis. Then we show that, in a portfolio selection problem with multiple risky investments, an investor who is more risk averse in the Arrow-Pratt sense prefers less risk, in the sense defined in this paper, together with a lower mean return, and an investor who displays increasing (decreasing) relative risk aversion becomes more conservative (aggressive) as the initial capital increases. The effect of risk aversion on the diversification of optimal portfolios is also discussed.
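For reference, the standard definition assumed by the relative-risk-aversion claim above (a sketch, not the paper's axiomatic setup):

```latex
% Arrow-Pratt measure of relative risk aversion:
\[
  R(w) \;=\; -\,\frac{w\,u''(w)}{u'(w)},
\]
% increasing (decreasing) R corresponds to the investor becoming more
% conservative (aggressive) as initial capital w grows.
```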
16.
Group decision making using fuzzy sets theory for evaluating the rate of aggregative risk in software development
Huey-Ming Lee 《Fuzzy Sets and Systems》1996,80(3):261-271
The purpose of this study is not only to build a group decision-making structure model of risk in software development but also to propose two algorithms for evaluating the rate of aggregative risk in a fuzzy environment, using fuzzy set theory, during any phase of the life cycle. While evaluating the rate of aggregative risk, one may adjust or improve the weights or grades of the factors until the result is acceptable. Moreover, the result is more objective and unbiased since it is generated by a group of evaluators.
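A hypothetical sketch of the kind of fuzzy aggregation the abstract describes: risk factors are graded with triangular fuzzy numbers (l, m, u), grades are combined by a weighted average, and the aggregate is defuzzified by the centroid. The weights and grades below are made-up examples, not Lee's algorithms.

```python
def weighted_fuzzy_average(weights, grades):
    """Weighted average of triangular fuzzy numbers, computed component-wise."""
    total = sum(weights)
    return tuple(sum(w * g[k] for w, g in zip(weights, grades)) / total
                 for k in range(3))

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number (l, m, u)."""
    l, m, u = tfn
    return (l + m + u) / 3.0

weights = [0.5, 0.3, 0.2]                                   # factor importance (assumed)
grades = [(0.2, 0.3, 0.4), (0.5, 0.6, 0.7), (0.1, 0.2, 0.3)]  # fuzzy risk grades
agg = weighted_fuzzy_average(weights, grades)
print(f"rate of aggregative risk ~ {centroid(agg):.3f}")
```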
17.
A taxonomy and review of the fuzzy data envelopment analysis literature: Two decades in the making
Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague. Many researchers have proposed various fuzzy methods for dealing with the imprecise and ambiguous data in DEA. In this study, we provide a taxonomy and review of the fuzzy DEA methods. We present a classification scheme with four primary categories, namely, the tolerance approach, the α-level based approach, the fuzzy ranking approach and the possibility approach. We discuss each classification scheme and group the fuzzy DEA papers published in the literature over the past 20 years. To the best of our knowledge, this paper appears to be the only review and complete source of references on fuzzy DEA.
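A small sketch of the α-level idea behind one of the four categories in the taxonomy: an α-cut turns a triangular fuzzy datum (l, m, u) into a closed interval, so interval DEA bounds can be computed level by level. Purely illustrative; not tied to any specific paper in the review.

```python
def alpha_cut(l, m, u, alpha):
    """Alpha-cut of a triangular fuzzy number (l, m, u): a closed interval."""
    return (l + alpha * (m - l), u - alpha * (u - m))

for alpha in (0.0, 0.5, 1.0):
    print(alpha, alpha_cut(2.0, 3.0, 5.0, alpha))
# alpha = 1.0 collapses to the modal value; alpha = 0.0 gives the full support.
```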
18.
Concerns about security and congestion arise in security screening, which is used to identify and deter potential threats (e.g., attackers, terrorists, smugglers, spies) among normal applicants wishing to enter an organization, location, or facility. Generally, in-depth screening reduces the risk of being attacked, but creates delays that may deter normal applicants and thus decrease the welfare of the approver (authority, manager, screener). In this paper, we develop a model to determine the optimal screening policy that maximizes the reward from admitting normal applicants net of the penalty from admitting bad applicants. We use an M/M/1 queueing system to capture the impact of security screening policies on system congestion and use game theory to model strategic behavior, in which potential applicants with private information can decide whether to apply based on the observed approver’s screening policy and the submission behavior of other potential applicants. We provide analytical solutions for the optimal non-discriminatory screening policy and numerical illustrations for both the discriminatory and non-discriminatory policies. In addition, we discuss more complex scenarios including imperfect screening, abandonment behavior of normal applicants, and non-zero waiting costs of attackers.
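A toy sketch of the congestion side of this trade-off: in an M/M/1 queue with arrival rate lam and screening (service) rate mu, the expected sojourn time is W = 1/(mu - lam), so slower, more thorough screening (smaller mu) lengthens exactly the delays that deter normal applicants. The rates below are illustrative assumptions.

```python
def sojourn_time(lam, mu):
    """Expected time in an M/M/1 system, W = 1/(mu - lam); requires lam < mu."""
    assert lam < mu, "queue is unstable"
    return 1.0 / (mu - lam)

lam = 1.0                                   # applicant arrival rate (assumed)
for mu in (1.2, 1.5, 2.0, 3.0):             # larger mu = faster, shallower screening
    print(f"mu = {mu:.1f}: expected delay W = {sojourn_time(lam, mu):.2f}")
```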
19.
Runhuan Feng 《Insurance: Mathematics and Economics》2011,48(2):304-313
Recent developments in ruin theory have seen the growing popularity of jump diffusion processes in modeling an insurer’s assets and liabilities. Despite the variations of technique, the analysis of ruin-related quantities mostly relies on solutions to certain differential equations. In this paper, we propose in the context of Lévy-type jump diffusion risk models a solution method for a general class of ruin-related quantities. Then we present a novel operator-based approach to solving a particular type of integro-differential equations. Explicit expressions for resolvent densities for jump diffusion processes killed on exit below zero are obtained as by-products of this work.
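For orientation, the standard textbook form of a jump diffusion risk model and the generator whose integro-differential equations such analyses solve (a reference sketch, not the paper's more general Lévy-type setup):

```latex
% Surplus process: premiums at rate c, Brownian perturbation, compound Poisson claims.
\[
  U_t \;=\; u + c\,t + \sigma W_t - \sum_{i=1}^{N_t} Y_i,
\]
% Infinitesimal generator, where N has rate \lambda and claims Y_i have distribution F:
\[
  (\mathcal{A}f)(x) \;=\; c f'(x) + \tfrac{1}{2}\sigma^{2} f''(x)
    + \lambda \int_{0}^{\infty} \bigl[f(x-y) - f(x)\bigr]\,\mathrm{d}F(y).
\]
```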