Similar Documents
20 similar documents found (search time: 515 ms)
1.
Data envelopment analysis (DEA) is a technique for evaluating the relative efficiencies of peer decision making units (DMUs) that have multiple performance measures. These performance measures have to be classified as either inputs or outputs in DEA. DEA assumes that higher output levels and/or lower input levels indicate better performance. This study is motivated by the fact that there are performance measures (or factors) that cannot be classified as an input or output, because they have target levels that all DMUs strive to achieve in order to attain best practice; any deviation from a target level is undesirable and may indicate inefficiency. We show how such performance measures with target levels can be incorporated in DEA. We formulate a new production possibility set by extending the standard DEA production possibility set under the variable returns-to-scale assumption, based on a set of axiomatic properties postulated to suit the case of targeted factors. We develop three efficiency measures by extending the standard radial, slacks-based, and Nerlove–Luenberger measures. We illustrate the proposed model and efficiency measures by applying them to the efficiency evaluation of 36 US universities.
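As background to the extension described above, the standard radial measure under variable returns-to-scale (the BCC model) reduces to a small linear program per DMU. A minimal sketch using `scipy`; the function name and data layout are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def bcc_input_efficiency(X, Y, j0):
    """Radial input-oriented efficiency of DMU j0 under VRS (BCC model).
    X: (m, n) inputs, Y: (s, n) outputs; columns are DMUs.
    Solves: min theta  s.t.  sum_j lam_j x_j <= theta x_j0,
    sum_j lam_j y_j >= y_j0, sum_j lam_j = 1, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                                  # minimise theta
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, j0]                     # input constraints
    A_ub[:m, 1:] = X
    A_ub[m:, 1:] = -Y                           # output constraints
    b_ub[m:] = -Y[:, j0]
    A_eq = np.zeros((1, 1 + n))
    A_eq[0, 1:] = 1.0                           # VRS convexity: sum lam = 1
    res = linprog(c, A_ub, b_ub, A_eq, [1.0],
                  bounds=[(0, None)] * (1 + n))
    return res.fun
```

A DMU is efficient exactly when the optimal theta equals 1; the targeted-factor models in the abstract extend this baseline production possibility set.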

2.
In data envelopment analysis (DEA), operating units are compared on their outputs relative to their inputs. The identification of an appropriate input–output set is of decisive significance if assessment of the relative performance of the units is not to be biased. This paper reports on a novel approach used for identifying a suitable input–output set for assessing central administrative services at universities. A computer-supported group support system was used with an advisory board to enable the analysts to extract information pertaining to the boundaries of the unit of assessment and the corresponding input–output variables. The approach provides for a more comprehensive and less inhibited discussion of input–output variables to inform the DEA model.

3.
The author's previous paper on this topic showed that the Duckworth/Lewis methodology has considerable potential for providing relevant and objective measures of performance in one-day cricket. The present paper evaluates how these measures can apply in the longer term. This assessment is based upon several series of matches involving the main players from most of the major cricketing nations. Although the data used are not exhaustive, they are of sufficient quantity to provide strong indications of the long-term viability of the measures, leading to the realistic expectation of inclusion of the methodology in the existing player rankings, or the development of an independent set of performance indicators that reflect more validly the value of players' inputs to matches. Such measures are expected to be of value to team management, media commentators and the public.

4.
In conventional data envelopment analysis it is assumed that the input versus output status of each chosen performance measure is known. In some situations, determining whether a variable should be viewed as an input or an output is very difficult; such variables can be treated as both an input and an output and are called flexible measures. This paper proposes a new model based on the translog output distance function for classifying inputs and outputs and evaluating the performance of decision-making units in the presence of flexible measures. Monte Carlo simulation is applied to compare the proposed model with a recent model from the literature. The results show that the efficiencies measured by our model are statistically closer to the true efficiencies and have a higher rank correlation with them. Results obtained from simulated data also show a high correlation between our model and the recent model.

5.
6.
Using data envelopment analysis (DEA) in conjunction with stochastic frontier analysis (SFA), the aim of this study was to measure the relative efficiency of quality management (QM) practices in Turkish public and private universities. Based on the extant literature, a set of nine critical QM factors and seven performance indicators for Turkish universities were identified as input and output variables, respectively. SFA confirmed the existence of significant relationships between QM factors and performance indicators. DEA findings indicated that private universities with higher levels of QM efficiency on stakeholder-focus indicators achieved better performance in terms of fulfilling the expectations of their stakeholders. In contrast, public universities were more successful in managing QM practices for a superior teaching and research performance. Finally, after eliminating the managerial discrepancies, no significant structural efficiency difference was found between these two groups of universities through the stakeholder-focus model, though some significant variation was noted in both the factor-efficiency and total-efficiency models. As for the total-efficiency model, we may infer that the structural differences found in favour of public universities for factor efficiencies are counterbalanced by private universities, which tend to focus more on their stakeholders in managing QM applications.

7.
While Data Envelopment Analysis (DEA) has many attractions as a technique for analysing the efficiency of educational organisations, such as schools and universities, care must be taken in its use whenever its assumption of convexity of the prevailing technology and associated production possibility set may not hold. In particular, if the convexity assumption does not hold, DEA may overstate the scope for improvements in technical efficiency through proportional increases in all educational outputs and understate the importance of improvements in allocative efficiency from changing the educational output mix. The paper therefore examines conditions under which the convexity assumption is not guaranteed, particularly when the performance evaluation includes measures related to the assessed quality of the educational outputs. Under such conditions, there is a need to deploy other educational efficiency assessment tools, including an alternative non-parametric output-orientated technique and a more explicit valuation function for educational outputs, in order to estimate the shape of the efficiency frontier and both technical and allocative efficiency.
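One well-known non-parametric, output-orientated technique that avoids the convexity assumption altogether is the free disposal hull (FDH), which envelops the data using only dominance comparisons rather than convex combinations. A minimal sketch (the paper does not prescribe this particular tool; names and data layout are illustrative):

```python
import numpy as np

def fdh_output_efficiency(X, Y, j0):
    """Output-oriented FDH score of DMU j0: the largest factor by which all
    outputs of j0 could be scaled up while staying dominated by some observed
    DMU that uses no more of any input (no convexity assumed).
    X: (m, n) inputs, Y: (s, n) outputs; columns are DMUs.
    Score >= 1; a score of 1 means FDH-efficient."""
    dominates = np.all(X <= X[:, [j0]], axis=0)   # peers using <= every input
    ratios = Y[:, dominates] / Y[:, [j0]]         # feasible output expansion per peer
    return float(np.max(np.min(ratios, axis=0)))
```

Because the FDH frontier is a staircase rather than a convex surface, it never overstates the expansion potential relative to the observed units, which is the concern raised in the abstract.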

8.
Qualitative factors in data envelopment analysis: A fuzzy number approach
Qualitative factors are difficult to mathematically manipulate when calculating the efficiency in data envelopment analysis (DEA). The existing methods of representing the qualitative data by ordinal variables and assigning values to obtain efficiency measures only superficially reflect the precedence relationship of the ordinal data. This paper treats the qualitative data as fuzzy numbers, and uses the DEA multipliers associated with the decision making units (DMUs) being evaluated to construct the membership functions. Based on Zadeh’s extension principle, a pair of two-level mathematical programs is formulated to calculate the α-cuts of the fuzzy efficiencies. Fuzzy efficiencies contain more information for making better decisions. A performance evaluation of the chemistry departments of 52 UK universities is used for illustration. Since the membership functions are constructed from the opinion of the DMUs being evaluated, the results are more representative and persuasive.

9.
Theoretical consideration of technical efficiency has existed since Koopmans [10] defined it for production possibilities for which it is not possible to increase any output without simultaneously increasing some input, ceteris paribus. The nonparametric approach to efficiency measurement known as Data Envelopment Analysis is based on the index of Farrell [9], which measures radial reduction in all inputs consistent with observed output. Even after Farrell efficiency is achieved, however, there may exist additional slack in individual inputs, suggesting that the Farrell index does not necessarily measure Koopmans inefficiency. To solve this problem, the non-radial Russell measure was introduced. This paper shows that problems may arise with the Russell measure due to restrictive assumptions on the implicit weighting of inputs and outputs. This paper develops a new measure, the Weighted Russell measure, that relaxes this assumption. Using simulated data, the new measure is shown to be preferred to existing methods. In addition, the new method is applied to analyze the performance of New York State school districts.
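For context, the (weighted) Russell input measure assigns each input its own contraction factor and aggregates them, in contrast with Farrell's single radial factor. A sketch of the CRS version as a linear program; the uniform default weights stand in for the implicit equal weighting the abstract criticises, and all names are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def russell_input_measure(X, Y, j0, w=None):
    """Non-radial (weighted) Russell input measure under CRS: each input i
    gets its own contraction factor theta_i in [0, 1]; the objective is the
    weighted mean of the theta_i. X: (m, n) inputs, Y: (s, n) outputs;
    columns are DMUs. Returns (score, per-input factors)."""
    m, n = X.shape
    s = Y.shape[0]
    w = np.ones(m) if w is None else np.asarray(w, float)
    c = np.concatenate([w / w.sum(), np.zeros(n)])   # weighted mean of thetas
    A_ub = np.zeros((m + s, m + n))
    b_ub = np.zeros(m + s)
    for i in range(m):                   # sum_j lam_j x_ij <= theta_i x_ij0
        A_ub[i, i] = -X[i, j0]
        A_ub[i, m:] = X[i]
    A_ub[m:, m:] = -Y                    # sum_j lam_j y_rj >= y_rj0
    b_ub[m:] = -Y[:, j0]
    bounds = [(0, 1)] * m + [(0, None)] * n
    res = linprog(c, A_ub, b_ub, bounds=bounds)
    return res.fun, res.x[:m]
```

Changing `w` away from uniform is the lever the Weighted Russell measure exposes; the equal-weight case recovers the ordinary Russell measure.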

10.
We propose a systematic approach to analyzing academic research performance at universities and research institutes. The analysis is based on identifying a set of decision-relevant (abstract) criteria. The scales for these criteria are defined by means of concrete indicators, not all of which are necessarily quantitative. Qualitative information is quantified using appropriate analytical tools. Once the criteria and indicators have been agreed upon and quantified, data on the research units are collected and a value efficiency analysis is performed. The efficiency of research units is defined in the spirit of data envelopment analysis (DEA), complemented with the decision maker's (DM's) (the rector in the European university system) preference information. This information is obtained by asking the DM to locate a point on the efficient frontier having the most preferred combination of input and output values. Our approach and the accompanying decision support system enable a university to allocate resources to its research units more efficiently than before. Using data from the Helsinki School of Economics, we illustrate how the approach works.

11.
The goal of factor screening is to find the really important inputs (factors) among the many inputs that may be changed in a realistic simulation experiment. A specific method is sequential bifurcation (SB), which is a sequential method that changes groups of inputs simultaneously. SB is most efficient and effective if the following assumptions are satisfied: (i) second-order polynomials are adequate approximations of the input/output functions implied by the simulation model; (ii) the signs of all first-order effects are known; and (iii) if two inputs have no important first-order effects, then they have no important second-order effects either (heredity property). This paper examines SB for random simulation with multiple responses (outputs), called multi-response SB (MSB). This MSB selects groups of inputs such that—within a group—all inputs have the same sign for a specific type of output, so no cancellation of first-order effects occurs. To obtain enough replicates (replications) for correctly classifying a group effect or an individual effect as being important or unimportant, MSB applies Wald’s sequential probability ratio test (SPRT). The initial number of replicates in this SPRT is also selected efficiently by MSB. Moreover, MSB includes a procedure to validate the three assumptions of MSB. The paper evaluates the performance of MSB through extensive Monte Carlo experiments that satisfy all MSB assumptions, and through a case study representing a logistic system in China; the results are very promising.
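A stripped-down, deterministic illustration of the bisection logic at the heart of SB (omitting the SPRT-based replication control and the multi-response grouping that MSB adds): estimate a group's aggregated first-order effect by toggling the whole group from low to high, and recursively split only the groups whose effect is important. The known-signs assumption guarantees effects within a group cannot cancel.

```python
import numpy as np

def seq_bifurcation(simulate, k, threshold):
    """Toy sequential bifurcation for a deterministic first-order response.
    `simulate` maps a vector of coded inputs (-1/+1) to a scalar output.
    Assumes all first-order effects are non-negative (known signs), as SB
    requires. Returns the indices of important inputs."""
    def group_effect(lo, hi):
        base = -np.ones(k)
        high = -np.ones(k)
        high[lo:hi] = 1.0                  # toggle the whole group at once
        return simulate(high) - simulate(base)
    important, stack = [], [(0, k)]
    while stack:
        lo, hi = stack.pop()
        if group_effect(lo, hi) <= threshold:
            continue                       # whole group declared unimportant
        if hi - lo == 1:
            important.append(lo)           # single important input isolated
        else:
            mid = (lo + hi) // 2           # bisect and screen both halves
            stack += [(lo, mid), (mid, hi)]
    return sorted(important)
```

With only a handful of important inputs among many, the number of simulation runs grows roughly logarithmically in `k`, which is the efficiency SB trades against its assumptions.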

12.
In conventional data envelopment analysis it is assumed that the input versus output status of each of the chosen performance measures is known. In some situations, however, certain performance measures can play either input or output roles. We refer to these performance measures as flexible measures. This paper presents a modification of the standard constant returns to scale DEA model to accommodate such flexible measures. Both an individual DMU model and an aggregate model are suggested as methodologies for deriving the most appropriate designations for flexible measures. We illustrate the application of these models in two practical problem settings.

13.
This paper presents the results of a survey of OR group managers, sponsored by the OR Society, examining the success and survival of OR groups in UK industry, commerce and the public sector. The aims of the survey were threefold: first, to gain some understanding of the demographics of UK OR groups in the mid-1990s as compared to the evidence collected 10 years earlier by the Commission; secondly, to establish how OR groups work, the type of projects they carry out, the clients they work for and the organisations they work in; and finally, to gain an understanding of how OR groups are managed, including the factors that OR managers believed to be important in ensuring their group's continuing success. In a nutshell, this paper presents a snapshot of OR group management in the mid-1990s.

14.
In fisheries, capacity analysis has largely been limited to measuring physical capacity, defined as the maximum amount of output that can be produced per unit of time, given existing plant and equipment and unrestricted availability of variable inputs. An economic measure of capacity can be defined as the maximum revenue attainable for the given fixed inputs, using relevant outputs and output prices. This paper examines these two approaches to capacity by applying data envelopment analysis to physical and economic input/output data for Danish North Sea trawlers. The economic and physical measures are compared and contrasted using correlation analysis. An innovative analysis into the composition of possible revenue increments is also presented and reasons for economic inefficiency of vessels are identified.

15.
We consider queueing, fluid and inventory processes whose dynamics are determined by general point processes or random measures that represent inputs and outputs. The state of such a process (the queue length or inventory level) is regulated to stay in a finite or infinite interval – inputs or outputs are disregarded when they would lead to a state outside the interval. The sample paths of the process satisfy an integral equation; the paths have finite local variation and may have discontinuities. We establish the existence and uniqueness of the process based on a Skorohod equation. This leads to an explicit expression for the process on the doubly-infinite time axis. The expression is especially tractable when the process is stationary with stationary input–output measures. This representation is an extension of the classical Loynes representation of stationary waiting times in single-server queues with stationary inputs and services. We also describe several properties of stationary processes: Palm probabilities of the processes at jump times, Little laws for waiting times in the system, finiteness of moments and extensions to tandem and treelike networks.
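The classical special case mentioned at the end is easy to simulate: for a single-server queue, the regulated state obeys Lindley's recursion W_n = max(0, W_{n-1} + S_{n-1} − T_n), whose stationary solution is Loynes' backward-supremum construction. A small sketch with illustrative parameter values (an M/M/1 queue with utilisation 0.8):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
service = rng.exponential(0.8, n)        # S_n: service times, mean 0.8
interarrival = rng.exponential(1.0, n)   # T_n: inter-arrival times, mean 1.0

# Lindley recursion: the waiting time is the input (service) minus the
# output (elapsed inter-arrival time), reflected to stay non-negative --
# a one-dimensional instance of the regulated processes in the abstract.
w = np.zeros(n)
for i in range(1, n):
    w[i] = max(0.0, w[i - 1] + service[i - 1] - interarrival[i])
```

Loynes' representation expresses the same stationary waiting time as the supremum of backward partial sums of S − T over the time-reversed sequence, which is the form the abstract generalises to processes regulated in an interval.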

16.
Performance measurement systems along the lines of the EFQM and the balanced scorecard have developed rapidly in recent years, and now occupy much management time and effort. There is limited evidence that performance improvement has received proportionate attention. Six organisations selected for their success were studied using a grounded theory approach based on interviews with management accountants and operations managers in each of the organisations. It is clear that they are all making strenuous efforts to use their performance measurement systems, but with a focus on the ‘good enough’ rather than the detail. This gave managers in these organisations the time and space to concentrate on using performance measures for forward-looking relevance, understanding and action, rather than retrospective and detailed control. This approach was promoted by senior managers and was based on their ability to see the business in simple terms and their understanding of the key drivers of business performance.

17.
This paper investigates the capacity utilization (CU) of selected large Chinese iron and steel enterprises using the data envelopment analysis (DEA) approach. We first propose a DEA model with an unrestricted capacity directional output distance function by incorporating two joint cost disposability relations and differentiating between non-emission-causing inputs and emission-causing inputs. Second, we define the status of decision-making units (DMUs) with excess capacity and design a method to identify insufficient variable inputs. We use a group of the 48 largest Chinese iron and steel enterprises as our sample to investigate their performance in terms of the CU indicator. The main findings are summarised and can be used to support the corresponding policymaking for managers in both government and enterprises.

18.
Moment-independent importance measures are increasingly used by practitioners to understand how output uncertainty may be shared between a set of stochastic inputs. Computing Borgonovo's sensitivity indices for a large group of inputs is still a challenging problem due to the curse of dimensionality, and it is addressed in this article. An estimation scheme taking advantage of recent developments in copula theory is developed. Furthermore, the concept of the Shapley value is used to derive new sensitivity indices, which makes the interpretation of Borgonovo's indices much easier. The resulting importance measure offers a double advantage compared with other existing methods, since it quantifies the impact exerted by one input variable on the whole output distribution after taking into account all possible dependencies and interactions with other variables. The validity of the proposed methodology is established on several analytical examples, and the benefits in terms of computational efficiency are illustrated with real-life test cases such as the study of water flow through a borehole. In addition, a detailed case study dealing with the atmospheric re-entry of a launcher first stage is presented.
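Borgonovo's index δ_i = ½ E[∫ |f_Y(y) − f_{Y|X_i}(y)| dy] can be estimated crudely by slicing X_i into equal-probability bins and comparing conditional histograms of the output with the unconditional one. A toy numpy sketch on an illustrative linear model (the article's copula-based scheme and Shapley-value refinement are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(size=(n, 3))
y = x[:, 0] + 0.5 * x[:, 1]          # x[:, 2] has no influence on y

def delta_estimate(xi, y, n_slices=20, n_bins=40):
    """Histogram estimate of Borgonovo's delta: half the expected total
    variation between the unconditional density of y and its density
    conditional on an equal-probability slice of x_i."""
    edges = np.quantile(y, np.linspace(0, 1, n_bins + 1))
    f_y, _ = np.histogram(y, bins=edges, density=True)
    widths = np.diff(edges)
    tv = []
    for idx in np.array_split(np.argsort(xi), n_slices):
        f_c, _ = np.histogram(y[idx], bins=edges, density=True)
        tv.append(0.5 * np.sum(np.abs(f_c - f_y) * widths))
    return float(np.mean(tv))

d = [delta_estimate(x[:, i], y) for i in range(3)]
```

The naive estimator already ranks the inputs correctly on this toy model, but its cost and bias grow quickly with dimension, which is the curse-of-dimensionality problem the article's copula-based scheme targets.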

19.
The input set for the Analytic Hierarchy Process consists of the integers less than ten and their reciprocals. A perturbation in the input may cause significant change in the output vector, particularly when the dimension of the input matrix is small (n ≤ 4) and the inputs are close together. We examine that change using numerical computational schemes. The cardinal values of the output vector are seen to suffer significant perturbation due to small integer changes in the inputs. Ordinal perturbations are less pronounced and can be better quantified by a numerical scheme. Known indices of consistency provide an indicator of truly poor input data. For an unskilled modeler, they can provide a false sense of assurance, but these consistency measures do not protect the modeler from consistently inaccurate inputs.
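The output vector in question is the principal eigenvector of the pairwise-comparison matrix, and the standard consistency index compares the principal eigenvalue with the matrix dimension. A small sketch using Saaty's published random indices; the helper name and example weights are illustrative:

```python
import numpy as np

def ahp_priorities(A):
    """Principal-eigenvector priorities and Saaty's consistency ratio for a
    positive reciprocal pairwise-comparison matrix A (entries drawn from
    1..9 and their reciprocals). RI values are Saaty's random indices."""
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                 # principal eigenpair
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                             # normalised priority vector
    lam = vals[k].real
    ci = (lam - n) / (n - 1)                 # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    return w, ci / ri                        # CR < 0.1 is the usual cutoff
```

A perfectly consistent matrix (a_ij = w_i / w_j) yields CR = 0; perturbing a single entry by a small integer step already moves both the priorities and the CR, which is the sensitivity the abstract studies.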

20.
Real-time software systems with tight performance requirements are abundant. These systems frequently use many different algorithms and if any one of these algorithms was to experience behavior that is atypical because of the input, the entire system may not be able to meet its performance requirements. Unfortunately, it is algorithmically intractable, if not unsolvable, to find the inputs which would cause worst-case behavior. If inputs can be identified that make the system take, say, ten times longer compared to the time it usually takes, that information is valuable for some systems. In this paper, we present a method for finding inputs that perform much worse than the average input to different algorithms. We use the simulated annealing heuristic search method and show that this method is successful in finding worst-case inputs to several sorting algorithms, using several measures of an algorithm’s runtime.
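A minimal sketch of the idea: score an input by the comparison count of a sorting algorithm (one deterministic proxy for runtime), and let simulated annealing search the space of permutations for high-cost inputs. Parameters and names are illustrative, not those of the paper:

```python
import math
import random

def insertion_sort_comparisons(a):
    """Comparison count of insertion sort on sequence a (the cost measure)."""
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return comps

def anneal_worst_input(n=30, steps=4000, t0=5.0, seed=0):
    """Simulated annealing over permutations of range(n), maximising the
    comparison count. Neighbour move: swap two positions; geometric cooling
    with occasional downhill acceptance to escape local optima."""
    rng = random.Random(seed)
    cur = list(range(n))
    rng.shuffle(cur)
    cur_cost = insertion_sort_comparisons(cur)
    best, best_cost = cur[:], cur_cost
    t = t0
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]
        cost = insertion_sort_comparisons(cand)
        if cost >= cur_cost or rng.random() < math.exp((cost - cur_cost) / t):
            cur, cur_cost = cand, cost
            if cur_cost > best_cost:
                best, best_cost = cur[:], cur_cost
        t *= 0.999
    return best, best_cost
```

For insertion sort the true worst case (the reversed permutation, n(n−1)/2 comparisons) is known, which makes it a convenient benchmark for checking how close the heuristic search gets; for the opaque real-time systems the abstract targets, no such ground truth is available.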


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号