Similar documents
20 similar documents found (search time: 15 ms)
1.
An underlying assumption in DEA is that the weights, coupled with the ratio scales of the inputs and outputs, imply linear value functions. In this paper, we present a general modeling approach to deal with outputs and/or inputs that are characterized by nonlinear value functions. To this end, we represent the nonlinear virtual outputs and/or inputs in a piecewise-linear fashion. We give a CCR model that can assess the efficiency of units in the presence of nonlinear virtual inputs and outputs. Further, we extend the models with the assurance-region approach to deal with concave output and convex input value functions. In effect, our formulations transform the original data set into an augmented data set to which standard DEA models can then be applied, thus remaining within the grounds of standard DEA methodology. To underline the usefulness of this development, we revisit a previous work of one of the authors on assessing the human development index in the light of DEA.
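The piecewise-linear representation described in this abstract amounts to passing each raw output through its value function before applying a standard DEA model to the augmented data. A minimal sketch of that transformation step, with made-up breakpoints and without the assurance-region constraints the paper adds:

```python
import numpy as np

# hypothetical concave value function for one output, given as breakpoints
breakpoints = np.array([0.0, 10.0, 25.0, 60.0])   # raw output levels
values      = np.array([0.0,  0.5,  0.8,  1.0])   # value at each breakpoint

def virtual_output(raw):
    """Piecewise-linear (here concave) value of a raw output level."""
    return np.interp(raw, breakpoints, values)

raw = np.array([5.0, 10.0, 42.5, 60.0])
print(virtual_output(raw))   # values: 0.25, 0.5, 0.9, 1.0
```

A standard DEA model is then run on the transformed (virtual) outputs rather than the raw data.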

2.
Traditionally, data envelopment analysis models assume total flexibility in weight selection, though this assumption can lead to several variables being ignored in determining the efficiency score. Existing methods constrain weight selection to a predefined range, thus removing possible feasible solutions. As such, in this paper we propose the symmetric weight assignment technique (SWAT) that does not affect feasibility and rewards decision making units (DMUs) that make a symmetric selection of weights. This allows for a method of weight restrictions that does not require preference constraints on the variables. Moreover, we show that the SWAT method may be used to differentiate among efficient DMUs.

3.
Cook and Zhu (2007) introduced an innovative method to deal with flexible measures. Toloo (2009) found a computational problem in their approach and tackled the issue. Amirteimoori and Emrouznejad (2012) claimed that both the Cook and Zhu (2007) and Toloo (2009) models overestimate efficiency. In this response, we prove that their claim is incorrect and that there is no overestimation in these approaches.

4.
Efficiency measurement is an important issue for any firm or organization: it allows organizations to compare their performance with their competitors' and then develop corresponding plans to improve performance. Various efficiency measurement tools, such as conventional statistical methods and non-parametric methods, have been successfully developed in the literature. Among these tools, the data envelopment analysis (DEA) approach is one of the most widely discussed. However, problems of discrimination between efficient and inefficient decision-making units also exist in the DEA context (Adler and Yazhemsky, 2010). In this paper, a two-stage approach integrating independent component analysis (ICA) and data envelopment analysis (DEA) is proposed to overcome this issue. We suggest first applying ICA to the input variables to generate independent components (ICs), then selecting the ICs that represent the independent sources of the input variables, and finally using the selected ICs as new input variables in the DEA model. A simulated dataset and a hospital dataset provided by the Office of Statistics in Taiwan's Department of Health are used to demonstrate the validity of the proposed two-stage approach. The results show that the proposed method can not only separate performance differences between the DMUs but also improve the discriminatory capability of the DEA's efficiency measurement.

5.
Benefit-cost analysis is required by law and regulation throughout the federal government. Robert Dorfman (1996) declares: ‘Three prominent shortcomings of benefit-cost analysis as currently practiced are (1) it does not identify the population segments that the proposed measure benefits or harms, (2) it attempts to reduce all comparisons to a single dimension, generally dollars and cents, and (3) it conceals the degree of inaccuracy or uncertainty in its estimates.’ This paper develops an approach for conducting benefit-cost analysis, derived from data envelopment analysis (DEA), that overcomes each of Dorfman's objections. The models and methodology proposed give decision makers a tool for evaluating alternative policies and projects where there are multiple constituencies with potentially conflicting perspectives. The method incorporates multiple incommensurate attributes while allowing for measures of uncertainty. An application is used to illustrate the method. This work was funded by grant N00014-99-1-0719 from the Office of Naval Research.

6.
Data envelopment analysis (DEA) is a linear programming methodology to evaluate the relative technical efficiency for each member of a set of peer decision making units (DMUs) with multiple inputs and multiple outputs. It has been widely used to measure performance in many areas. A weakness of the traditional DEA model is that it cannot deal with negative input or output values. There have been many studies exploring this issue, and various approaches have been proposed.
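The linear-programming formulation this abstract refers to can be sketched as the standard input-oriented CCR multiplier model, one LP per DMU. This is a minimal illustration with made-up data (and `scipy` assumed available), not code from any of the papers in this list:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR multiplier model: one LP per DMU.

    X is (m inputs, n DMUs), Y is (s outputs, n DMUs). For DMU k:
    max u'y_k  s.t.  v'x_k = 1,  u'y_j - v'x_j <= 0 for all j,  u, v >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    scores = []
    for k in range(n):
        c = np.concatenate([-Y[:, k], np.zeros(m)])             # maximise u'y_k
        A_eq = np.concatenate([np.zeros(s), X[:, k]])[None, :]  # v'x_k = 1
        A_ub = np.hstack([Y.T, -X.T])                           # u'y_j - v'x_j <= 0
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=(0, None), method="highs")
        scores.append(-res.fun)
    return np.array(scores)

# three DMUs, one input, one output (illustrative data)
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[2.0, 3.0, 4.0]])
print(ccr_efficiency(X, Y))   # approximately [1.0, 0.75, 0.5]
```

Note that the non-negativity bounds are exactly why this basic model breaks down for negative data, which is the issue the abstract raises.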

7.
The Law of One Price (LoOP) states that all firms face the same prices for their inputs and outputs under market equilibrium. Taken here as a normative condition for ‘efficiency prices’, this law has powerful implications for productive efficiency analysis, which have remained unexploited thus far. This paper shows how LoOP-based weight restrictions can be incorporated in Data Envelopment Analysis (DEA). Utilizing the relation between industry-level and firm-level cost efficiency measures, we propose to apply a set of input prices that is common for all firms and that maximizes the cost efficiency of the industry. Our framework allows for firm-specific output weights and for variable returns-to-scale, and preserves the linear programming structure of the standard DEA. We apply the proposed methodology to the evaluation of the research efficiency of economics departments of Dutch Universities. This application shows that the methodology is computationally tractable for practical efficiency analysis, and that it helps in deepening the DEA analysis.

8.
Based on the minimal reduction strategy, Yang et al. (2011) developed a fixed-sum output data envelopment analysis (FSODEA) approach to evaluate the performance of decision-making units (DMUs) with fixed-sum outputs. Under such a strategy, however, all DMUs compete over the fixed-sum outputs with “no memory”, which results in DMUs being evaluated against differing efficient frontiers. To address this problem, we propose an equilibrium efficiency frontier data envelopment analysis (EEFDEA) approach, by which all DMUs with fixed-sum outputs can be evaluated against a common platform (or equilibrium efficient frontier). The proposed approach can be divided into two stages. Stage 1 constructs a common evaluation platform via two strategies: an extended minimal adjustment strategy and an equilibrium competition strategy. The former ensures that originally efficient DMUs remain efficient, guaranteeing the existence of a common evaluation platform; the latter drives all DMUs to a common equilibrium efficient frontier. Based on this common equilibrium efficient frontier, Stage 2 then evaluates all DMUs with their original inputs and outputs. Finally, we illustrate the proposed approach with two numerical examples.

9.
Efficiency in data envelopment analysis (DEA) is defined as the ratio of the weighted sum of outputs to the weighted sum of inputs. To calculate the maximum efficiency score, each decision making unit (DMU) assigns its own weights to its inputs and outputs; the classical DEA model thus allows full weight flexibility. As a result, the inputs or outputs of some DMUs can receive zero weights even when they are important, so they are neglected in the evaluation, and some DMUs may be declared efficient even though they are inefficient. This situation leads to unrealistic results. To eliminate the problem of weight flexibility, weight restrictions are imposed in DEA. In this study, we propose a new model, not previously published in the literature, which we describe as the restricted data envelopment analysis model with correlation coefficients ((ARIII(COR))). The aim of this new model is to take the relations between variables into account using correlation coefficients, which are added as constraints to the CCR and BCC models. For this purpose, correlation coefficients were used in the restrictions for the inputs and outputs, each on its own and in combination. Inputs and outputs are related to one another in production through their degree of correlation, yet previous studies did not take the relationship between input/output variables into account: weight restrictions were based only on expert opinion or an objective method. In our study, the weights for the input and output variables are determined according to the correlations between them. The proposed method differs from others in the literature in that the efficiency scores are calculated on the basis of the correlations between the input and/or output variables.
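The raw ingredient of this abstract, the correlation between input and output variables, is straightforward to compute; how the (ARIII(COR)) model turns these coefficients into weight constraints is specific to the paper and not reproduced here. A sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(7)
n_dmu = 30
inputs = rng.uniform(1.0, 10.0, size=(n_dmu, 2))        # two inputs
outputs = (inputs @ np.array([[0.6], [0.4]])            # one output driven
           + rng.normal(0.0, 0.3, size=(n_dmu, 1)))     # by both inputs

# correlation of each input with the output: the raw material
# for correlation-based weight restrictions
corr = np.array([np.corrcoef(inputs[:, i], outputs[:, 0])[0, 1]
                 for i in range(inputs.shape[1])])
print(corr)
```

In a correlation-restricted model, such coefficients would enter the CCR/BCC linear programs as additional constraints on the weight ratios.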

10.
In a recent paper by Mostafaee and Saljooghi [Mostafaee, A., Saljooghi, F.H., 2010. Cost efficiency in data envelopment analysis with data uncertainty. European Journal of Operational Research, 202, 595–603], the authors extend the classical cost efficiency model to address data uncertainty. They claim that the upper bound of the cost efficiency can be obtained at extreme points when the input prices appear in the form of ranges. In this paper, we present our counterexamples and comments on the contention by Mostafaee and Saljooghi.

11.
Data envelopment analysis (DEA) is a useful tool of efficiency measurement for firms and organizations. Kao and Hwang (2008) take into account the series relationship of the two sub-processes in a two-stage production process, in which the overall efficiency of the whole process is the product of the efficiencies of the two sub-processes. To find the largest efficiency of one sub-process while maintaining the maximum overall efficiency of the whole process, Kao and Hwang (2008) propose a solution procedure. Nevertheless, one needs to know the overall efficiency of the whole process before calculating the sub-process efficiencies. In this note, we propose a method that finds the sub-process and overall efficiencies simultaneously.

12.
A typical finding in the financial literature is that the market tends to be overly pessimistic about value stocks, many of which are past losers. Therefore, over-reactions might be captured by measuring how earnings surprises vary with past return levels. In this paper, we propose a new index for an effective investment strategy to capture the return-reversal effect, using both Data Envelopment Analysis (DEA) and Inverted DEA to account for these characteristics of the market. Our investment strategy using the new index exhibits better performance than the naive return-reversal strategy that uses only past returns or earnings surprises. In addition, the correlations between our new index and commonly used value indices are insignificant, and the value indices cannot perfectly represent over-valued (or under-valued) situations. Hence, by considering both the proposed index and value indices such as the book-to-price ratio, we can select value stocks more effectively than by using only one of these indices.

13.
Model misspecification has significant impacts on data envelopment analysis (DEA) efficiency estimates. This paper discusses the four most widely used approaches to guide variable specification in DEA. We analyze the efficiency contribution measure (ECM), principal component analysis (PCA-DEA), a regression-based test, and bootstrapping for variable selection via Monte Carlo simulations to determine each approach's advantages and disadvantages. For a three-input, one-output production process, we find that PCA-DEA performs well with highly correlated inputs (greater than 0.8), even for small data sets (fewer than 300 observations); that both the regression and ECM approaches perform well under low correlation (less than 0.2) with relatively larger data sets (at least 300 observations); and that bootstrapping performs relatively poorly. Bootstrapping requires hours of computational time, whereas the three other methods require minutes. Based on the results, we offer guidelines for effectively choosing among the four selection methods.
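The PCA step of the PCA-DEA approach mentioned above can be sketched with plain NumPy: standardize the inputs, keep the leading principal components that explain most of the variance, and feed those components to DEA in place of the raw inputs. The variance threshold and data below are illustrative assumptions, and the handling of negative component scores (which standard DEA cannot take directly) is omitted:

```python
import numpy as np

def pca_reduce_inputs(X, var_share=0.95):
    """Replace correlated input columns by their leading principal
    components (a sketch of the PCA step only, not the full PCA-DEA method)."""
    Xc = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardise columns
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(vals)[::-1]                      # descending eigenvalues
    vals, vecs = vals[order], vecs[:, order]
    explained = np.cumsum(vals) / vals.sum()
    k = int(np.searchsorted(explained, var_share) + 1)  # smallest k reaching share
    return Xc @ vecs[:, :k], explained[:k]

rng = np.random.default_rng(0)
base = rng.normal(size=(50, 1))
# three inputs that are nearly copies of one underlying factor
X = np.hstack([base + 0.05 * rng.normal(size=(50, 1)) for _ in range(3)])
Z, expl = pca_reduce_inputs(X)
print(Z.shape)   # (50, 1): three highly correlated inputs collapse to one component
```

This illustrates why PCA-DEA helps precisely in the high-correlation regime the simulations identify.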

14.
Evaluating the performance of activities or organizations by common data envelopment analysis models requires crisp input/output data. However, the precise inputs and outputs of production processes cannot always be measured. Thus, data envelopment analysis with fuzzy data, called “fuzzy data envelopment analysis”, has played an important role in evaluating the efficiency of real applications. This paper focuses on the fuzzy CCR model and proposes a new method for determining the lower bounds of fuzzy inputs and outputs, which improves the weak efficiency frontiers of the corresponding production possibility set. A numerical example illustrates the capability of the proposed method.
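As a generic illustration of the fuzzy-data ingredient (not the paper's specific lower-bound method), a triangular fuzzy input can be reduced to crisp intervals by alpha-cuts, which interval-based fuzzy DEA models then process:

```python
def alpha_cut(tfn, alpha):
    """Alpha-cut interval [lower, upper] of a triangular fuzzy number (l, m, u).

    At alpha = 0 the cut is the full support [l, u]; at alpha = 1 it
    collapses to the peak m. A standard fuzzy-arithmetic sketch.
    """
    l, m, u = tfn
    return (l + alpha * (m - l), u - alpha * (u - m))

# e.g. a fuzzy input "about 4", ranging between 3 and 6
lo, hi = alpha_cut((3.0, 4.0, 6.0), 0.5)
print(lo, hi)   # 3.5 5.0
```

Solving a DEA model at each alpha level over such intervals yields efficiency bounds rather than a single crisp score.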

15.
This paper evaluates the impact of location on hotel efficiency using a sample of 400 Spanish hotels, the novel aspect being that location is considered at the tourist destination level. Moreover, for the first time, the location variables are based on the main theoretical models concerning location in the hotel sector, namely geographical positioning models, agglomeration and urbanization economic models, and competitive environment models. The methodology consists of a four-stage data envelopment analysis (DEA) model that decomposes super-efficiency into the portion attributable to the tourist destination and the portion attributable to hotel management. Then, managerial efficiency is regressed against hotel characteristics, while tourist destination efficiency is explained by the characteristics of each location. The findings highlight the importance of tourist destinations, providing novel empirical support for the propositions of the main location models. Indeed, the tourist destination is the main cause of differences in the level of efficiency among hotels. The occupancy level, degree of seasonality and market concentration are the variables with the greatest impact on efficiency.

16.
Manufacturing decision makers have to deal with a large number of reports and metrics for evaluating the performance of manufacturing systems. Since the metrics provide different and at times conflicting assessments, it is hard for manufacturing decision makers to track and improve overall manufacturing system performance. This research presents a data envelopment analysis (DEA) based approach for performance measurement and target setting of manufacturing systems. The approach is applied to two different manufacturing environments. The performance peer groups identified using DEA are utilized to set performance targets and to guide performance improvement efforts. The DEA scores are checked against past process modifications that led to identified performance changes. Limitations of the DEA-based approach are presented when considering measures that are influenced by factors outside the control of the manufacturing decision makers. Finally, the potential of a DEA-based generic performance measurement approach for manufacturing systems is outlined.

17.
To impose the law of one price (LoOP) restrictions, which state that all firms face the same input prices, Kuosmanen, Cherchye, and Sipiläinen (2006) developed top-down and bottom-up approaches to maximizing industry-level cost efficiency. However, the optimal input shadow prices generated by these approaches need not be unique, which influences the distribution of the efficiency indices at the individual firm level. To solve this problem, we develop a pair of two-level mathematical programming models that calculate the upper and lower bounds of cost efficiency for each firm in the case of non-unique LoOP prices while keeping the industry cost efficiency optimal. Furthermore, a base-enumerating algorithm is proposed to solve the lower-bound models of the cost efficiency measure, which are bi-level linear programs and NP-hard problems. Lastly, a numerical example is used to demonstrate the proposed approach.

18.
Conventional data envelopment analysis (DEA) for measuring the efficiency of a set of decision making units (DMUs) requires the input/output data to be constant. In reality, however, many observations are stochastic in nature; consequently, the resulting efficiencies are stochastic as well. This paper discusses how to obtain the efficiency distribution of each DMU via a simulation technique. The case of Taiwan commercial banks shows that, firstly, the number of replications in simulation analysis has little effect on the estimation of efficiency means, yet 1000 replications are recommended to produce reliable efficiency means and 2000 replications for a good estimation of the efficiency distributions. Secondly, the conventional way of using average data to represent stochastic variables results in efficiency scores which are different from the mean efficiencies of the presumably true efficiency distributions estimated from simulation. Thirdly, the interval-data approach produces true efficiency intervals yet the intervals are too wide to provide valuable information. In conclusion, when multiple observations are available for each DMU, the stochastic-data approach produces more reliable and informative results than the average-data and interval-data approaches do.
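The simulation idea above can be sketched for the simplest case of one input and one output, where the CCR score has a closed form, so no LP solver is needed. The distributions, noise level, and replication count below are illustrative assumptions, not the paper's data:

```python
import numpy as np

def ccr_scores(x, y):
    """CCR efficiency with a single input and output has the closed
    form (y/x) / max(y/x); enough to illustrate the simulation idea."""
    r = y / x
    return r / r.max()

rng = np.random.default_rng(1)
n_dmu, n_rep = 5, 2000
mean_x = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
mean_y = np.array([2.0, 2.5, 3.0, 3.2, 3.5])

# one efficiency distribution per DMU, built from stochastic replications
draws = np.empty((n_rep, n_dmu))
for r in range(n_rep):
    x = rng.normal(mean_x, 0.1)      # assumed normal noise, for illustration
    y = rng.normal(mean_y, 0.1)
    draws[r] = ccr_scores(x, y)

mean_eff = draws.mean(axis=0)        # simulated mean efficiencies
avg_eff = ccr_scores(mean_x, mean_y) # score of the averaged data, for contrast
```

Comparing `mean_eff` with `avg_eff` reproduces the paper's second point: the score computed from averaged data generally differs from the mean of the simulated efficiency distribution.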

19.
Recently, new models of data envelopment analysis (DEA) were introduced that incorporate production trade-offs between inputs and outputs, or weight restrictions based on them. In this paper, we develop a computational procedure suitable for the practical application of such models. We show that the standard two-stage optimisation procedure used in DEA to test the full efficiency of units and identify their efficient targets may work incorrectly in the new models. The modified procedure consists of three stages: the first evaluates the radial efficiency of the unit, the second identifies its efficient target, and the third its reference set of efficient peers. Each stage requires solving one linear program for each unit.

20.
This research attempts to solve the problem of missing data via the interface of Data Envelopment Analysis (DEA) and human behavior. Missing data is under continuing discussion in many research fields, especially those highly dependent on data. In practice and research, some necessary data may not be obtainable, for example because of procedural factors or a lack of needed responses, which raises the question of how to deal with missing data. In this paper, modified DEA models are developed to estimate the appropriate value of missing data within its interval, based on DEA and the Inter-dimensional Similarity Halo Effect; the estimated value of missing data is determined by the General Impression of the original DEA efficiency. To evaluate the effectiveness of this method, an impact factor is proposed. In addition, the advantages of the proposed approach are illustrated in comparison with previous methods.
