Related Articles
20 related articles found.
1.
Data envelopment analysis (DEA) is a method for measuring the efficiency of peer decision making units (DMUs), where the internal structures of DMUs are treated as a black box. Recently, DEA has been extended to examine the efficiency of DMUs that have two-stage network structures or processes, where all the outputs from the first stage are intermediate measures that make up the inputs to the second stage. The resulting two-stage DEA model not only provides an overall efficiency score for the entire process, but also yields an efficiency score for each of the individual stages. The current paper develops a Nash bargaining game model to measure the performance of DMUs that have a two-stage structure. Under Nash bargaining theory, the two stages are viewed as players and the DEA efficiency model is a cooperative game model. It is shown that when only one intermediate measure exists between the two stages, our newly developed Nash bargaining game approach yields the same results as applying the standard DEA approach to each stage separately. Two real-world data sets are used to demonstrate our bargaining game model.
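For reference, a minimal sketch of the standard multiplicative two-stage set-up underlying this line of work (generic notation, not the bargaining model itself): with inputs $x_{io}$, intermediate measures $z_{do}$ and final outputs $y_{ro}$ for the DMU under evaluation, the two stage efficiencies and the overall efficiency are
\[
e_o^{1}=\frac{\sum_{d} w_d z_{do}}{\sum_{i} v_i x_{io}},\qquad
e_o^{2}=\frac{\sum_{r} u_r y_{ro}}{\sum_{d} w_d z_{do}},\qquad
e_o=e_o^{1}\,e_o^{2}=\frac{\sum_{r} u_r y_{ro}}{\sum_{i} v_i x_{io}},
\]
where the same weights $w_d$ are attached to the intermediate measures in both stages, so that under constant returns to scale the overall score decomposes into the product of the stage scores. The bargaining model described in this abstract then treats the two stages as players negotiating over how the overall efficiency is split between them.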

2.
Data envelopment analysis (DEA) is a linear programming based approach for evaluating the relative efficiency of peer decision making units (DMUs) that have multiple inputs and outputs. DMUs can have a two-stage structure in which, in addition to the inputs to the first stage and the outputs from the second stage, all the outputs from the first stage serve as the only inputs to the second stage. The outputs passed from the first stage to the second stage are called intermediate measures. This paper examines the relations and equivalence between two existing DEA approaches that address measuring the performance of such two-stage processes.
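As background for the linear programs referred to throughout these abstracts, a minimal sketch of the standard input-oriented CCR envelopment model (generic, not specific to any paper listed here): for the unit under evaluation, $\mathrm{DMU}_o$, with $n$ DMUs, $m$ inputs $x_{ij}$ and $s$ outputs $y_{rj}$,
\[
\begin{aligned}
\theta^{*}=\min\ &\theta\\
\text{s.t.}\ &\sum_{j=1}^{n}\lambda_j x_{ij}\le\theta x_{io},\quad i=1,\dots,m,\\
&\sum_{j=1}^{n}\lambda_j y_{rj}\ge y_{ro},\quad r=1,\dots,s,\\
&\lambda_j\ge 0,\quad j=1,\dots,n.
\end{aligned}
\]
$\mathrm{DMU}_o$ is efficient when $\theta^{*}=1$ (with all slacks zero); adding the convexity constraint $\sum_{j}\lambda_j=1$ gives the variable returns to scale (BCC) version.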

3.
DEA model with shared resources and efficiency decomposition
Data envelopment analysis (DEA) has proved to be an excellent approach for measuring the performance of decision making units (DMUs) that use multiple inputs to generate multiple outputs. In many real world scenarios, DMUs have a two-stage network process with shared input resources used in both stages of operations. For example, in hospital operations, some of the input resources such as equipment, personnel, and information technology are used in the first stage to generate medical records that track treatments, tests, drug dosages, and costs. The same set of resources used by first-stage activities is used to generate the second-stage patient services. Patient services also use the services generated by the first-stage operations of housekeeping, medical records, and laundry. These DMUs have not only inputs and outputs, but also intermediate measures that exist in between the two stages of operations. The distinguishing characteristic is that some of the inputs to the first stage are shared by both the first and second stages, but some of these shared inputs cannot be conveniently split up and allocated to the operations of the two stages. Recognizing this distinction is critical for these types of DEA applications, because measuring the efficiency of the production of first-stage outputs can be misleading and can understate the efficiency if DEA fails to consider that some of the inputs also generate second-stage outputs. The current paper develops a set of DEA models for measuring the performance of two-stage network processes with non-splittable shared inputs. An additive efficiency decomposition for the two-stage network process is presented. The models are developed under the assumption of variable returns to scale (VRS), but can be readily applied under the assumption of constant returns to scale (CRS). An application is provided.
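A minimal sketch of the kind of additive decomposition and shared-input treatment this describes (generic notation and assumed bounds, not necessarily the exact formulation of the paper): let a proportion $\alpha_i\in[L_i,U_i]$ of a shared input $x_{io}$ be charged to the first stage and $1-\alpha_i$ to the second, with $\alpha_i$ a decision variable rather than a fixed allocation. The overall efficiency is then expressed as a weighted sum of the stage efficiencies,
\[
e_o = w_1\,e_o^{1} + w_2\,e_o^{2},\qquad w_1 + w_2 = 1,\quad w_1,w_2\ge 0,
\]
where $w_1$ and $w_2$ reflect the relative importance (for example, the relative share of total resources) of the two stages, and the stage efficiencies are evaluated with the split inputs $\alpha_i x_{io}$ and $(1-\alpha_i)x_{io}$.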

4.
It is well known that the super-efficiency data envelopment analysis (DEA) approach can be infeasible under the condition of variable returns to scale (VRS). By extending the work of Chen (2005), the current study develops a two-stage process for calculating super-efficiency scores regardless of whether the standard VRS super-efficiency model is feasible. The proposed approach first examines whether the standard VRS super-efficiency DEA model is infeasible. When the model is feasible, our approach yields super-efficiency scores that are identical to those arising from the original model. For efficient DMUs that are infeasible under the super-efficiency model, our approach yields super-efficiency scores that characterize input savings and/or output surpluses. The current study also shows that infeasibility may imply that an efficient DMU does not exhibit super-efficiency in inputs or outputs. When infeasibility occurs, it can be necessary that (i) both inputs and outputs be decreased to reach the frontier formed by the remaining DMUs under the input orientation and (ii) both inputs and outputs be increased to reach the frontier formed by the remaining DMUs under the output orientation. The newly developed approach is illustrated with numerical examples.
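For reference, a minimal sketch of the input-oriented VRS super-efficiency model whose infeasibility is at issue (standard form, generic notation):
\[
\begin{aligned}
\min\ &\theta\\
\text{s.t.}\ &\sum_{j\ne o}\lambda_j x_{ij}\le\theta x_{io},\quad i=1,\dots,m,\\
&\sum_{j\ne o}\lambda_j y_{rj}\ge y_{ro},\quad r=1,\dots,s,\\
&\sum_{j\ne o}\lambda_j=1,\quad \lambda_j\ge 0.
\end{aligned}
\]
The DMU under evaluation is excluded from the reference set. Under VRS the feasible region can be empty, for instance when no convex combination of the remaining DMUs can produce at least the output bundle $y_{ro}$, which is exactly the infeasibility the two-stage procedure is designed to handle.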

5.
In data envelopment analysis (DEA), efficient decision making units (DMUs) are of primary importance as they define the efficient frontier. The current paper develops a new sensitivity analysis approach for a category of DMUs and finds the stability radius for all efficient DMUs. By combining some classic DEA models with the condition that the efficiency scores of efficient DMUs remain unchanged, we are able to determine what perturbations of the data can be tolerated before efficient DMUs become inefficient. Our approach generalizes the conventional sensitivity analysis approach in which the inputs of efficient DMUs increase and their outputs decrease, while the inputs of inefficient DMUs decrease and their outputs increase. We find the maximum magnitude of data perturbations such that all first-level efficient DMUs remain at the same level.

6.
Data envelopment analysis (DEA) is essentially a linear programming based technique for measuring the relative performance of organizational units, referred to as decision-making units (DMUs), where the presence of multiple inputs and outputs makes comparisons difficult. The ability to identify frontier DMUs prior to the DEA calculation is of great importance for an effective and efficient DEA computation. In this paper, a method for identifying the efficient frontier is introduced. Then, the efficiency scores and returns to scale (RTS) characteristics of DMUs are produced by means of the equation of the efficient frontier.

7.
Data envelopment analysis (DEA) is a data-oriented approach for evaluating the performance of a set of peer entities called decision-making units (DMUs), whose performance is determined on the basis of multiple measures. Traditional DEA, which is based on the concept of the efficiency frontier (output frontier), determines the best efficiency score that can be assigned to each DMU. Based on these scores, DMUs are classified as DEA-efficient (optimistic efficient) or DEA-non-efficient (optimistic non-efficient), and the DEA-efficient DMUs determine the efficiency frontier. There is a comparable approach which uses the concept of the inefficiency frontier (input frontier) to determine the worst relative efficiency score that can be assigned to each DMU. DMUs on the inefficiency frontier are specified as DEA-inefficient or pessimistic inefficient, and those that do not lie on the inefficiency frontier are declared to be DEA-non-inefficient or pessimistic non-inefficient. In this paper, we argue that both relative efficiencies should be considered simultaneously, and that any approach that considers only one of them will be biased. For measuring the overall performance of the DMUs, we propose to integrate both efficiencies in the form of an interval, and we call the proposed DEA models for efficiency measurement the bounded DEA models. In this way, the efficiency interval provides the decision maker with all the possible values of efficiency, which reflect various perspectives. A numerical example is presented to illustrate the application of the proposed DEA models.
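A minimal sketch of the two frontiers being contrasted (standard ratio forms, generic notation; the bounded models of the paper combine normalized versions of these two scores into a single interval): the best (optimistic) relative efficiency of $\mathrm{DMU}_o$ is
\[
\theta_o^{*}=\max_{u,v\ge 0}\ \frac{\sum_r u_r y_{ro}}{\sum_i v_i x_{io}}
\quad\text{s.t.}\quad \frac{\sum_r u_r y_{rj}}{\sum_i v_i x_{ij}}\le 1\ \ \forall j,
\]
so $\theta_o^{*}\le 1$, with equality for DMUs on the efficiency frontier; the worst (pessimistic) relative efficiency is
\[
\varphi_o^{*}=\min_{u,v\ge 0}\ \frac{\sum_r u_r y_{ro}}{\sum_i v_i x_{io}}
\quad\text{s.t.}\quad \frac{\sum_r u_r y_{rj}}{\sum_i v_i x_{ij}}\ge 1\ \ \forall j,
\]
so $\varphi_o^{*}\ge 1$, with equality for DMUs on the inefficiency frontier.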

8.
This paper investigates efficiency measurement in a two-stage data envelopment analysis (DEA) setting. Since 1978, the DEA literature has witnessed the expansion of the original concept to encompass a wide range of theoretical and applied research areas. One such area is network DEA, and in particular two-stage DEA. In the conventional closed serial system, the only role played by the outputs from Stage 1 is to behave as inputs to Stage 2. The current paper examines a variation of that system. In particular, we consider settings where the set of final outputs comprises not only those that result from Stage 2, but can also include certain outputs from the previous (first) stage. The difficulty that this situation creates is that such outputs are attempting to play both an input and an output role in the same stage. We develop a DEA-based methodology that is designed to handle what we term ‘time-staged outputs’. We then examine an application of this concept where the DMUs are schools of business.

9.
Efficiency can be defined not only as the ratio of the weighted sum of outputs to the weighted sum of inputs, but also as the ratio of the weighted sum of inputs to the weighted sum of outputs. When the former efficiency measures the best relative efficiency within the range of no more than one, the decision-making units (DMUs) that attain the optimal value of one perform best among all DMUs; if the former efficiency is measured within the range of no less than one, the DMUs that attain the optimal value of one perform worst. Conversely, when the latter efficiency is measured within the range of no more than one, the DMUs that attain the optimal value of one perform worst among all DMUs; if it is measured within the range of no less than one, the DMUs that attain the optimal value of one perform best. This paper mainly studies an interval DEA model based on the latter efficiency, in which efficiency is measured within an interval whose upper bound is set to one and whose lower bound is determined by introducing a virtual ideal DMU whose performance is definitely superior to that of any DMU. The efficiencies obtained from the interval DEA model turn out to be intervals and are referred to as interval efficiencies; they combine the best and the worst relative efficiency in a reasonable manner to give an overall assessment of the performance of all DMUs. The assessor's preference information on input and output weights is also incorporated into the interval DEA model in a reasonable and convenient way. Through an example, some differences are found between the ranking results obtained from the interval DEA model and those obtained from the bounded DEA model when the Hurwicz criterion approach is used to rank the interval efficiencies.

10.
This paper develops a DEA (data envelopment analysis) model to accommodate competition over outputs. In the proposed model, the total output of all decision making units (DMUs) is fixed, and DMUs compete with each other to maximize their self-rated DEA efficiency score. In the presence of competition over outputs, the best-practice frontier deviates from the classical DEA frontier. We also compute the efficiency scores using the proposed fixed-sum output DEA (FSODEA) models, and discuss the competition strategy selection rule. The model is illustrated using a hypothetical data set under the constant returns to scale assumption and medal data from the 2000 Sydney Olympics under the variable returns to scale assumption.
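A minimal sketch of the defining feature of this setting (generic notation, not the full FSODEA formulation): if output $r$ is a fixed-sum output with total $T_r$, then however the DMUs adjust their outputs, the totals must be preserved,
\[
\sum_{j=1}^{n} y_{rj} = T_r \quad\text{and}\quad \sum_{j=1}^{n}\delta_{rj}=0,
\]
where $\delta_{rj}$ denotes the adjustment of output $r$ at $\mathrm{DMU}_j$. One DMU can therefore increase a fixed-sum output (for example, medals won) only if other DMUs decrease theirs, and this coupling across DMUs is what makes the best-practice frontier deviate from the classical DEA frontier.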

11.
Data Envelopment Analysis (DEA) is a technique based on mathematical programming for evaluating the efficiency of homogeneous Decision Making Units (DMUs). In this technique, inefficient DMUs are projected onto the frontier constructed by the best performers. Centralized Resource Allocation (CRA) is a method in which all DMUs are projected onto the efficient frontier by solving just one DEA model. The intent of this paper is to present Stochastic Centralized Resource Allocation (SCRA) in order to allocate centralized resources when inputs and outputs are stochastic. The concepts discussed throughout this paper are illustrated with an example.

12.
In data envelopment analysis (DEA), efficient decision making units (DMUs) are of primary importance as they define the efficient frontier. The current paper develops a new sensitivity analysis approach for the basic DEA models, such as those proposed by Charnes, Cooper and Rhodes (CCR), Banker, Charnes and Cooper (BCC), and additive models, when variations in the data are considered simultaneously for all DMUs. By means of modified DEA models, in which the specific DMU under examination is excluded from the reference set, we are able to determine what perturbations of the data can be tolerated before efficient DMUs become inefficient. Our approach generalises the usual sensitivity analysis approach, in which perturbations of the data are applied only to the test DMU while all the remaining DMUs remain fixed. In our framework, data are allowed to vary simultaneously for all DMUs across different subsets of inputs and outputs. We study the relation between the infeasibility of the modified DEA models employed and the robustness of DEA models, and show that infeasibility implies stability. The empirical applications demonstrate that DEA efficiency classifications are robust with respect to possible data errors, particularly in the convex DEA case.

13.
A modified super-efficiency DEA model for infeasibility
The super-efficiency data envelopment analysis (DEA) model is obtained when the decision making unit (DMU) under evaluation is excluded from the reference set. This model provides a measure of the stability of the “efficient” status of frontier DMUs. Under the assumption of variable returns to scale (VRS), the super-efficiency model can be infeasible for some efficient DMUs, specifically those at the extremities of the frontier. The current study develops an approach to overcome these infeasibility issues. It is shown that when the model is feasible, our approach yields super-efficiency scores that are equivalent to those arising from the original model. For efficient DMUs that are infeasible under the super-efficiency model, our approach yields optimal solutions and scores that characterize the extent of super-efficiency in both inputs and outputs. The newly developed approach is illustrated with two real-world data sets.

14.
We improve the efficiency interval of a DMU by adjusting its given inputs and outputs. The interval DEA model has been formulated to obtain an efficiency interval consisting of evaluations from both the optimistic and the pessimistic viewpoints. DMUs that are not rated as efficient in the conventional sense are improved so that their lower bounds become as large as possible under the condition that their upper bounds attain the maximum value of one. The adjusted inputs and outputs are kept in balance with each other by improving the lower bound of the efficiency interval, since the lower bound becomes small if the inputs and outputs are not well proportioned. In order to improve the lower bound of the efficiency interval, different target points are defined for different DMUs. The target point can be regarded as a kind of benchmark for the DMU. First, a new approach to improvement by adjusting only outputs or only inputs is proposed. Then, a combined approach to improvement by adjusting both inputs and outputs simultaneously is proposed. Lastly, numerical examples are shown to illustrate the proposed approaches.

15.
Variations on the theme of slacks-based measure of efficiency in DEA
In DEA, there are typically two schemes for measuring the efficiency of DMUs: radial and non-radial. Radial models assume proportional changes of inputs/outputs, and remaining slacks are usually not directly accounted for in the efficiency measure. Non-radial models, on the other hand, deal with the slacks of each input/output individually and independently, and integrate them into an efficiency measure, called the slacks-based measure (SBM). In this paper, we point out shortcomings of the SBM and propose four variants of the SBM model. The original SBM model evaluates the efficiency of a DMU with reference to the furthest frontier point within a certain range. This results in the hardest score for the DMU under evaluation, and the projection may go to a remote point on the efficient frontier that may be inappropriate as a reference. In an effort to overcome this shortcoming, we first investigate the frontier (facet) structure of the production possibility set. We then propose Variation I, which evaluates each DMU by the nearest point on the same frontier as that found by the SBM. However, there exist other potential facets for evaluating DMUs. Therefore we propose Variation II, which evaluates each DMU from all facets. We then employ clustering methods to classify DMUs into several groups and apply Variation II within each cluster. This Variation III gives more reasonable efficiency scores with less effort. Lastly, we propose a random search method (Variation IV) for reducing the burden of enumerating facets. The results are approximate but practical in use.
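For context, a sketch of the standard (constant returns to scale, non-oriented) slacks-based measure as it commonly appears in the literature, for $\mathrm{DMU}_o$ with $m$ inputs and $s$ outputs:
\[
\rho^{*}=\min_{\lambda,\,s^{-},\,s^{+}}\ \frac{1-\frac{1}{m}\sum_{i=1}^{m}s_i^{-}/x_{io}}{1+\frac{1}{s}\sum_{r=1}^{s}s_r^{+}/y_{ro}}
\quad\text{s.t.}\quad
x_{io}=\sum_{j}\lambda_j x_{ij}+s_i^{-},\quad
y_{ro}=\sum_{j}\lambda_j y_{rj}-s_r^{+},\quad
\lambda,\,s^{-},\,s^{+}\ge 0,
\]
where $s_i^{-}$ and $s_r^{+}$ are the input and output slacks. Because all slacks enter the objective, the optimal projection tends to the furthest dominating frontier point in the sense described above, which is the behaviour the four variations modify.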

16.
Data envelopment analysis (DEA) is the leading technique for measuring the relative efficiency of decision-making units (DMUs) on the basis of multiple inputs and multiple outputs. In this technique, the weights for inputs and outputs are estimated to the best advantage of each unit so as to maximize its relative efficiency. However, this flexibility in selecting the weights hinders comparison among DMUs on a common basis. To deal with this difficulty, Kao and Hung (2005) proposed a compromise solution approach for generating common weights under the DEA framework. Their multiple criteria decision-making (MCDM) model was derived from the original non-linear DEA model. This paper presents an improvement to Kao and Hung's approach by introducing an MCDM model that is derived from a new linear DEA model.

17.
This article compares two approaches to aggregating multiple inputs and multiple outputs in the evaluation of decision making units (DMUs): data envelopment analysis (DEA) and principal component analysis (PCA). DEA, a non-statistical efficiency technique, employs linear programming to weight the inputs/outputs and rank the performance of DMUs. PCA, a multivariate statistical method, combines multiple new measures defined by the inputs/outputs. Both methods are applied to three real-world data sets that characterize the economic performance of Chinese cities, and they yield consistent and mutually complementary results. Nonparametric statistical tests are employed to validate the consistency between the rankings obtained from DEA and PCA.

18.
This paper discusses and reviews the use of the super-efficiency approach in data envelopment analysis (DEA) sensitivity analysis. It is shown that the super-efficiency score can be decomposed into two data perturbation components: one for a particular test frontier decision making unit (DMU) and one for the remaining DMUs. As a result, DEA sensitivity analysis can be carried out in (1) a general situation where the data for the test DMU and the data for the remaining DMUs are allowed to vary simultaneously and unequally, and (2) the worst-case scenario where the efficiency of the test DMU is deteriorating while the efficiencies of the other DMUs are improving. The sensitivity analysis approach developed in this paper can be applied to DMUs on the entire frontier and to all basic DEA models. Necessary and sufficient conditions for preserving a DMU's efficiency classification are developed when various data changes are applied to all DMUs. Possible infeasibility of super-efficiency DEA models is only associated with extreme-efficient DMUs and indicates efficiency stability to data perturbations in all DMUs.

19.
Based on the minimal reduction strategy, Yang et al. (2011) developed a fixed-sum output data envelopment analysis (FSODEA) approach to evaluate the performance of decision-making units (DMUs) with fixed-sum outputs. However, under such a strategy, all DMUs compete over the fixed-sum outputs with “no memory”, which results in evaluations against differing efficient frontiers. To address this problem, in this study we propose an equilibrium efficiency frontier data envelopment analysis (EEFDEA) approach, by which all DMUs with fixed-sum outputs can be evaluated against a common platform (or equilibrium efficient frontier). The proposed approach can be divided into two stages. Stage 1 constructs a common evaluation platform via two strategies: an extended minimal adjustment strategy and an equilibrium competition strategy. The former ensures that originally efficient DMUs remain efficient, guaranteeing the existence of a common evaluation platform. The latter makes all DMUs reach a common equilibrium efficient frontier. Then, based on the common equilibrium efficient frontier, Stage 2 evaluates all DMUs with their original inputs and outputs. Finally, we illustrate the proposed approach using two numerical examples.

20.
The objective of the present paper is to propose a novel pair of data envelopment analysis (DEA) models for measuring the relative efficiencies of decision-making units (DMUs) in the presence of non-discretionary factors and imprecise data. Compared to traditional DEA, the proposed interval DEA approach measures the efficiency of each DMU relative to the inefficiency frontier, also called the input frontier; this is called the worst relative efficiency or pessimistic efficiency. In traditional DEA, on the other hand, the efficiency of each DMU is measured relative to the efficiency frontier and is called the best relative efficiency or optimistic efficiency. The pair of proposed interval DEA models takes crisp, ordinal, and interval data, as well as non-discretionary factors, into account simultaneously for the measurement of the relative efficiencies of DMUs. Two numerical examples are provided to illustrate the applicability of the interval DEA models.
