Similar Documents
1.
In many applications of data envelopment analysis (DEA), there is often a fixed cost or input resource that must be imposed on all decision making units (DMUs). Cook and Zhu [W.D. Cook, J. Zhu, Allocation of shared costs among decision making units: A DEA approach, Computers and Operations Research 32 (2005) 2171-2178] propose a practical DEA approach for such allocation problems. In this paper, we prove that when certain special constraints are added, Cook and Zhu’s approach may have no feasible solution. The research of this paper focuses on two main aspects: obtaining a new fixed cost or resource allocation approach by improving Cook and Zhu’s approach, and setting fixed targets according to the amount of fixed resources shared by individual DMUs. When such special constraints are attached, our model is proved to achieve a feasible cost or resource allocation. Numerical results for an example from the literature are presented to illustrate our approach.
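As a point of reference for this and the other allocation abstracts below, the common setting can be written compactly. This is a generic, hedged sketch rather than Cook and Zhu's exact model: a total fixed cost R is split into nonnegative shares r_j, and each share enters its DMU's efficiency ratio as an additional input.

$$\sum_{j=1}^{n} r_j = R,\qquad r_j \ge 0,\qquad e_j(r)=\max_{u,v\ge 0}\Bigl\{\frac{\sum_k u_k y_{kj}}{\sum_i v_i x_{ij}+r_j}\;:\;\frac{\sum_k u_k y_{kl}}{\sum_i v_i x_{il}+r_l}\le 1\ \ \forall l\Bigr\}.$$

The allocation approaches discussed here differ mainly in which additional requirements are imposed on the shares r_j (for example, that relative efficiencies are left unchanged, as in abstract 3 below).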

2.
In this paper we show that data envelopment analysis (DEA) can be viewed as maximising the average efficiency of the decision-making units (DMUs) in an organisation. Building upon this we present DEA based models for: (a) allocating fixed costs to DMUs and (b) allocating input resources to DMUs. Simultaneously with allocating input resources, output targets are also decided for each DMU. Numeric results are presented for a number of example problems taken from the literature.
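One way to read the opening claim in symbols, purely as an illustrative paraphrase under a common set of weights (the paper's own formulation may differ), is a programme that maximises the mean ratio efficiency while keeping every ratio at most one:

$$\max_{u,v\ge 0}\ \frac{1}{n}\sum_{j=1}^{n}\frac{\sum_k u_k y_{kj}}{\sum_i v_i x_{ij}}\quad\text{s.t.}\quad \frac{\sum_k u_k y_{kj}}{\sum_i v_i x_{ij}}\le 1\ \ \forall j.$$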

3.
How to allocate a common revenue in an equitable manner across a set of competing entities is an issue of considerable importance. This paper introduces a new approach for allocating a common revenue among all decision making units (DMUs) in such a way that their relative efficiencies are not changed. The method determines the allocation without solving any linear programs. A numerical example is provided to illustrate the results of the analysis.

4.
Network data envelopment analysis (DEA) concerns using the DEA technique to measure the relative efficiency of a system, taking into account its internal structure. The results are more meaningful and informative than those obtained from the conventional black-box approach, where the operations of the component processes are ignored. This paper reviews studies on network DEA by examining the models used and the structures of the network system of the problem being studied. This review highlights some directions for future studies from the methodological point of view, and is inspirational for exploring new areas of application from the empirical point of view.

5.
Data envelopment analysis is a mathematical programming technique for identifying efficient frontiers for peer decision making units with multiple inputs and multiple outputs. These performance factors (inputs and outputs) are classified into two groups: desirable and undesirable. Obviously, undesirable factors in the production process should be reduced to improve performance. In the current paper, we present a data envelopment analysis (DEA) model which can be used to improve relative performance by increasing undesirable inputs and decreasing undesirable outputs.

6.
Efficiency measurement is an important issue for any firm or organization. Efficiency measurement allows organizations to compare their performance with their competitors’ and then develop corresponding plans to improve performance. Various efficiency measurement tools, such as conventional statistical methods and non-parametric methods, have been successfully developed in the literature. Among these tools, the data envelopment analysis (DEA) approach is one of the most widely discussed. However, problems of discrimination between efficient and inefficient decision-making units also exist in the DEA context (Adler and Yazhemsky, 2010). In this paper, a two-stage approach integrating independent component analysis (ICA) and data envelopment analysis (DEA) is proposed to overcome this issue. We suggest first applying ICA to the input variables to generate independent components (ICs), then selecting the ICs that represent the independent sources of the input variables, and finally entering the selected ICs as new variables in the DEA model. A simulated dataset and a hospital dataset provided by the Office of Statistics in Taiwan’s Department of Health are used to demonstrate the validity of the proposed two-stage approach. The results show that the proposed method can not only separate performance differences between the DMUs but also improve the discriminatory capability of the DEA’s efficiency measurement.
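A minimal sketch of the two-stage idea, with standard libraries and several simplifying assumptions (the solver, the toy data, the positive shift applied to the independent components, and the input-oriented CCR model are illustrative choices, not the authors' exact procedure):

```python
import numpy as np
from sklearn.decomposition import FastICA
from scipy.optimize import linprog

def ccr_input_oriented(X, Y):
    """Input-oriented CCR efficiency scores (envelopment form).
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs); values assumed positive."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(1 + n)
        c[0] = 1.0                                  # minimise theta
        A_ub, b_ub = [], []
        for i in range(m):                          # sum_j lambda_j * x_ij <= theta * x_io
            A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
        for r in range(s):                          # sum_j lambda_j * y_rj >= y_ro
            A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
        bounds = [(None, None)] + [(0, None)] * n   # theta free, lambdas nonnegative
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=bounds, method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# toy data: 30 DMUs, 4 deliberately correlated inputs, 2 outputs
rng = np.random.default_rng(0)
X = rng.uniform(1.0, 10.0, size=(30, 4))
X[:, 3] = 0.8 * X[:, 0] + 0.2 * X[:, 3]
Y = rng.uniform(1.0, 10.0, size=(30, 2))

# stage 1: extract independent components from the inputs
Z = FastICA(n_components=2, random_state=0).fit_transform(X)
Z = Z - Z.min(axis=0) + 1.0   # shift ICs to positive values (a simplification; the paper's
                              # handling of sign and scale is not spelled out in the abstract)

# stage 2: run DEA on the selected ICs instead of the raw, correlated inputs
print(ccr_input_oriented(Z, Y))
```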

7.
This paper proposes a dynamic data envelopment analysis (DEA) model to measure the system and period efficiencies at the same time for multi-period systems, where quasi-fixed inputs or intermediate products are the source of inter-temporal dependence between consecutive periods. A mathematical relationship is derived in which the complement of the system efficiency is a linear combination of those of the period efficiencies. The proposed model is also more discriminative than the existing ones in identifying the systems with better performance. Taiwanese forests, where the forest stock plays the role of quasi-fixed input, are used to illustrate this approach. The results show that the method for calculating the system efficiency in the literature produces over-estimated scores when the dynamic nature is ignored. This makes it necessary to conduct a dynamic analysis whenever data is available.
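Paraphrasing the stated relationship in symbols, purely as a reading aid (the period weights are determined by the model): if $E^{\text{sys}}$ denotes the system efficiency and $E^{(t)}$ the efficiency of period $t=1,\dots,T$, then

$$1-E^{\text{sys}}=\sum_{t=1}^{T} w^{(t)}\bigl(1-E^{(t)}\bigr),\qquad w^{(t)}\ge 0,$$

which makes the sources of system inefficiency directly attributable to individual periods.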

8.
One of the most important steps in the application of modeling using data envelopment analysis (DEA) is the choice of input and output variables. In this paper, we develop a formal procedure for a “stepwise” approach to variable selection that involves sequentially maximizing (or minimizing) the average change in the efficiencies as variables are added or dropped from the analysis. After developing the stepwise procedure, applications from classic DEA studies are presented and the new managerial insights gained from the stepwise procedure are discussed. We discuss how this easy-to-understand and intuitively sound method yields useful managerial results and assists in identifying DEA models that include variables with the largest impact on the DEA results.
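A hedged sketch of a forward step of such a procedure. The greedy rule below (add the candidate variable whose inclusion changes average efficiency the most) paraphrases the idea in the abstract rather than reproducing the authors' exact criterion, and the toy scoring function stands in for a real DEA model:

```python
import numpy as np

def stepwise_select(X_all, Y, candidates, score_fn, n_keep):
    """Greedy forward selection of input columns for a DEA-style scorer."""
    selected = []
    while len(selected) < n_keep:
        base = score_fn(X_all[:, selected], Y).mean() if selected else 0.0
        best, best_delta = None, -np.inf
        for c in candidates:
            if c in selected:
                continue
            trial = score_fn(X_all[:, selected + [c]], Y).mean()
            delta = abs(trial - base)                 # change in average efficiency
            if delta > best_delta:
                best, best_delta = c, delta
        selected.append(best)
    return selected

def toy_score(X, Y):
    """Stand-in scorer: normalised output/input ratio; a real study would solve a DEA LP here."""
    ratio = Y.sum(axis=1) / X.sum(axis=1)
    return ratio / ratio.max()

rng = np.random.default_rng(1)
X = rng.uniform(1.0, 10.0, size=(20, 5))              # 20 DMUs, 5 candidate inputs
Y = rng.uniform(1.0, 10.0, size=(20, 2))              # 2 outputs
print(stepwise_select(X, Y, candidates=list(range(5)), score_fn=toy_score, n_keep=3))
```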

9.
Model misspecification has significant impacts on data envelopment analysis (DEA) efficiency estimates. This paper discusses the four most widely-used approaches to guide variable specification in DEA. We analyze efficiency contribution measure (ECM), principal component analysis (PCA-DEA), a regression-based test, and bootstrapping for variable selection via Monte Carlo simulations to determine each approach’s advantages and disadvantages. For a three input, one output production process, we find that: PCA-DEA performs well with highly correlated inputs (greater than 0.8) and even for small data sets (less than 300 observations); both the regression and ECM approaches perform well under low correlation (less than 0.2) and relatively larger data sets (at least 300 observations); and bootstrapping performs relatively poorly. Bootstrapping requires hours of computational time whereas the three other methods require minutes. Based on the results, we offer guidelines for effectively choosing among the four selection methods.

10.
Conventional data envelopment analysis (DEA) models assume real-valued inputs and outputs. On many occasions, however, some inputs and/or outputs can only take integer values. In some cases, rounding the DEA solution to the nearest whole number can lead to misleading efficiency assessments and performance targets. This paper develops the axiomatic foundation for DEA in the case of integer-valued data, introducing new axioms of “natural disposability” and “natural divisibility”. We derive a DEA production possibility set that satisfies the minimum extrapolation principle under our refined set of axioms. We also present a mixed integer linear programming formulation for computing efficiency scores. An empirical application to Iranian university departments illustrates the approach.

11.
Data envelopment analysis (DEA) is a useful tool of efficiency measurement for firms and organizations. Kao and Hwang (2008) take into account the series relationship of the two sub-processes in a two-stage production process, whereby the overall efficiency of the whole process is the product of the efficiencies of the two sub-processes. Kao and Hwang (2008) also propose a solution procedure for finding the largest efficiency of one sub-process while maintaining the maximum overall efficiency of the whole process. Nevertheless, one needs to know the overall efficiency of the whole process before calculating the sub-process efficiency. In this note, we propose a method that is able to find the sub-process and overall efficiencies simultaneously.
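In symbols, and with the standard two-stage layout assumed (stage 1 turns the exogenous inputs X into intermediate products Z, which stage 2 then converts into the final outputs Y), the decomposition referred to above is

$$E^{\text{overall}} = E^{1}\times E^{2},$$

so the procedure criticised here first fixes $E^{\text{overall}}$ and then maximises one of $E^{1}$ or $E^{2}$, whereas this note determines the sub-process and overall efficiencies simultaneously.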

12.
Benefit-cost analysis is required by law and regulation throughout the federal government. Robert Dorfman (1996) declares: ‘Three prominent shortcomings of benefit-cost analysis as currently practiced are (1) it does not identify the population segments that the proposed measure benefits or harms, (2) it attempts to reduce all comparisons to a single dimension, generally dollars and cents, and (3) it conceals the degree of inaccuracy or uncertainty in its estimates.’ The paper develops an approach for conducting benefit-cost analysis derived from data envelopment analysis (DEA) that overcomes each of Dorfman's objections. The models and methodology proposed give decision makers a tool for evaluating alternative policies and projects where there are multiple constituencies who may have conflicting perspectives. This method incorporates multiple incommensurate attributes while allowing for measures of uncertainty. An application is used to illustrate the method. This work was funded by grant N00014-99-1-0719 from the Office of Naval Research.

13.
We propose new efficiency tests which are based on traditional DEA models and take into account portfolio diversification. The goal is to identify the investment opportunities that perform well without specifying our attitude to risk. We use general deviation measures as the inputs and return measures as the outputs. We discuss the choice of the set of investment opportunities, including portfolios with a limited number of assets. We compare the optimal values (efficiency scores) of all proposed tests, leading to relations between the sets of efficient opportunities. The strength of the tests is then discussed. We test the efficiency of 25 world financial indices using the new DEA models with CVaR deviation measures.

14.
Conventional data envelopment analysis (DEA) models only consider the inputs supplied to the system and the outputs produced from the system in measuring efficiency, ignoring the operations of the internal processes. The results thus obtained sometimes are misleading. This paper discusses the efficiency measurement and decomposition of general multi-stage systems, where each stage consumes exogenous inputs and intermediate products (produced from the preceding stage) to produce exogenous outputs and intermediate products (for the succeeding stage to use). A relational model is developed to measure the system and stage efficiencies at the same time. By transforming the system into a series of parallel structures, the system efficiency is decomposed into the product of a modification of the stage efficiencies. Efficiency decomposition enables decision makers to identify the stages that cause the inefficiency of the system, and to effectively improve the performance of the system. An example of an electricity service system is used to explain the idea of efficiency decomposition.

15.
Conventional data envelopment analysis (DEA) for measuring the efficiency of a set of decision making units (DMUs) requires the input/output data to be constant. In reality, however, many observations are stochastic in nature; consequently, the resulting efficiencies are stochastic as well. This paper discusses how to obtain the efficiency distribution of each DMU via a simulation technique. The case of Taiwan commercial banks shows that, firstly, the number of replications in simulation analysis has little effect on the estimation of efficiency means, yet 1000 replications are recommended to produce reliable efficiency means and 2000 replications for a good estimation of the efficiency distributions. Secondly, the conventional way of using average data to represent stochastic variables results in efficiency scores which are different from the mean efficiencies of the presumably true efficiency distributions estimated from simulation. Thirdly, the interval-data approach produces true efficiency intervals yet the intervals are too wide to provide valuable information. In conclusion, when multiple observations are available for each DMU, the stochastic-data approach produces more reliable and informative results than the average-data and interval-data approaches do.
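As a reading aid only: with $B$ simulation replications, each replication $b$ draws one realisation of the stochastic inputs and outputs and yields a score $E_j^{(b)}$ for DMU $j$; the collection $\{E_j^{(1)},\dots,E_j^{(B)}\}$ estimates the efficiency distribution, and its mean

$$\bar{E}_j=\frac{1}{B}\sum_{b=1}^{B}E_j^{(b)}$$

is what the abstract contrasts with the single score obtained by running DEA once on averaged data.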

16.
Data envelopment analysis (DEA) is a technique for evaluating relative efficiencies of peer decision making units (DMUs) which have multiple performance measures. These performance measures have to be classified as either inputs or outputs in DEA. DEA assumes that higher output levels and/or lower input levels indicate better performance. This study is motivated by the fact that there are performance measures (or factors) that cannot be classified as an input or output, because they have target levels that all DMUs strive to achieve in order to attain best practice, and any deviation from the target levels is not desirable and may indicate inefficiency. We show how such performance measures with target levels can be incorporated in DEA. We formulate a new production possibility set by extending the standard DEA production possibility set under the variable returns-to-scale assumption, based on a set of axiomatic properties postulated to suit the case of targeted factors. We develop three efficiency measures by extending the standard radial, slacks-based, and Nerlove–Luenberger measures. We illustrate the proposed model and efficiency measures by applying them to the efficiency evaluation of 36 US universities.

17.
Evaluating the performance of activities or organizations by conventional data envelopment analysis models requires crisp input/output data. However, the precise inputs and outputs of production processes cannot always be measured. Thus, data envelopment analysis with fuzzy data, called “fuzzy data envelopment analysis”, has played an important role in the evaluation of efficiencies in real applications. This paper focuses on the fuzzy CCR model and proposes a new method for determining the lower bounds of fuzzy inputs and outputs. This improves the weak efficiency frontiers of the corresponding production possibility set. A numerical example also illustrates the capability of the proposed method.

18.
The efficiency of decision processes which can be divided into two stages has been measured for the whole process as well as for each stage independently by using the conventional data envelopment analysis (DEA) methodology in order to identify the causes of inefficiency. This paper modifies the conventional DEA model by taking into account the series relationship of the two sub-processes within the whole process. Under this framework, the efficiency of the whole process can be decomposed into the product of the efficiencies of the two sub-processes. In addition to this sound mathematical property, the case of Taiwanese non-life insurance companies shows that some unusual results which have appeared in the independent model do not exist in the relational model. In other words, the relational model developed in this paper is more reliable in measuring the efficiencies and consequently is capable of identifying the causes of inefficiency more accurately. Based on the structure of the model, the idea of efficiency decomposition can be extended to systems composed of multiple stages connected in series.

19.
Data Envelopment Analysis (DEA) is a very effective method for evaluating the relative efficiency of decision-making units (DMUs). Since the data of production processes cannot be precisely measured in some cases, uncertainty theory has played an important role in DEA. This paper extends the traditional DEA models to a fuzzy framework, producing a fuzzy DEA model based on the credibility measure, together with a method for ranking all the DMUs. To solve the fuzzy model, we design a hybrid algorithm that combines fuzzy simulation with a genetic algorithm. When the inputs and outputs are all trapezoidal or triangular fuzzy variables, the model can be transformed into a linear program. Finally, a numerical example is presented to illustrate the fuzzy DEA model and the method of ranking all the DMUs.

20.
This paper considers allocation rules. First, we demonstrate that costs allocated by the Aumann–Shapley and the Friedman–Moulin cost allocation rules are easy to determine in practice using convex envelopment of registered cost data and parametric programming. Second, from the linear programming problems involved it becomes clear that the allocation rules, technically speaking, allocate the non-zero value of the dual variable for a convexity constraint onto the output vector. Hence, the allocation rules can also be used to allocate inefficiencies in non-parametric efficiency measurement models such as Data Envelopment Analysis (DEA). The convexity constraint of the BCC model introduces a non-zero slack in the objective function of the multiplier problem, and we show that the cost allocation rules discussed in this paper can be used as candidates to allocate this slack value onto the input (or output) variables and hence enable a full allocation of the inefficiency onto the input (or output) variables, as in the CCR model.
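For readers less familiar with the multiplier problem referred to above, the standard input-oriented BCC form (after Banker, Charnes and Cooper, 1984; sign conventions for $u_0$ vary across texts) is

$$\max_{u,v,u_0}\ \sum_{r} u_r y_{ro} + u_0 \quad\text{s.t.}\quad \sum_{i} v_i x_{io} = 1,\qquad \sum_{r} u_r y_{rj}-\sum_{i} v_i x_{ij}+u_0 \le 0\ \ \forall j,\qquad u_r,\,v_i \ge 0,\ u_0\ \text{free},$$

where $u_0$ is the dual variable of the convexity constraint $\sum_j \lambda_j = 1$ in the envelopment problem; fixing $u_0=0$ recovers the CCR multiplier model, which is why fully allocating this term makes the BCC treatment comparable to the CCR case, as the abstract notes.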

