Similar Documents
20 similar documents found (search time: 15 ms).
1.
Pseudolinearity and efficiency   (cited 9 times: 0 self-citations, 9 by others)
First-order and second-order characterizations of pseudolinear functions are derived. For a nonlinear programming problem involving only pseudolinear functions, it is proved that every efficient solution is properly efficient under some mild conditions.
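For reference, the first-order notion at stake here is standard in the literature (stated generically, not quoted from the paper): a differentiable function f on an open convex set K is pseudolinear when it is both pseudoconvex and pseudoconcave, that is,
\[
\nabla f(x)^{\top}(y-x) \ge 0 \;\Rightarrow\; f(y) \ge f(x)
\quad\text{and}\quad
\nabla f(x)^{\top}(y-x) \le 0 \;\Rightarrow\; f(y) \le f(x)
\qquad \text{for all } x, y \in K.
\]
Linear and linear-fractional functions (with nonvanishing denominator on K) are the typical examples.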

2.
This paper incorporates cones on the virtual multipliers of inputs and outputs into DEA analysis. Cone DEA models are developed to generalize the dual of the BCC models as well as congestion models. Input-output data and/or the number of DMUs for BCC models are often inadequate to capture the many situations where judgments, expert opinions, and other external information should be brought into the analysis. Cone DEA models, on the other hand, offer improved definitions of efficiency over general cone and polyhedral cone structures. The relationships between cone models and BCC models, as well as those between cone models and congestion models, are discussed in the development. Two numerical examples are provided to illustrate our findings.

3.
We consider unconstrained finite-dimensional multi-criteria optimization problems, where the objective functions are continuously differentiable. Motivated by previous work of Brosowski and da Silva (1994), we suggest a number of tests (TESTs 1–4) to detect whether a certain point is a locally (weakly) efficient solution of the underlying vector optimization problem or not. Our aim is to show that the points at which none of TESTs 1–4 can be applied form a nowhere dense set in the state space. TESTs 1 and 2 are exactly those proposed by Brosowski and da Silva. TEST 3 deals with locally constant behavior of at least one of the objective functions. TEST 4 imposes conditions on the gradients of the objective functions that hold locally around the point of interest. It is formulated as a conjecture and is proven under additional assumptions on the objective functions, such as linear independence of the gradients, convexity or directional monotonicity. This work was partially supported by grant 55681 of the CONACyT.
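For orientation, the solution concepts being tested are the standard ones from vector optimization (generic notation, not quoted from the paper): for F = (f_1, ..., f_m) to be minimized, a point x* is locally weakly efficient if there is a neighborhood U of x* containing no x with f_i(x) < f_i(x*) for all i, and locally efficient if U contains no x with f_i(x) \le f_i(x*) for all i and f_j(x) < f_j(x*) for at least one j.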

4.
Bridging radial and non-radial measures of efficiency in DEA   (cited 2 times: 0 self-citations, 2 by others)
Data envelopment analysis (DEA) has been utilized worldwide for measuring the efficiencies of banks, telecommunications, electric utilities and so forth. Yet the existing models have some well-known shortcomings that limit their usefulness. In DEA there are two fundamental approaches to measuring efficiency, with very different characteristics: radial and non-radial. We demonstrate a method for linking these two approaches in a unified framework called Connected-SBM (slacks-based measure). It includes two scalar parameters, and by changing their values we can position the analysis anywhere between the radial and the non-radial models. An appropriate choice of these parameters can overcome the key shortcomings inherent in the two approaches, namely proportionality and mixed patterns of slacks.
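For orientation, the two building blocks being bridged can be written, for a DMU o with input-output data (X, Y), in their standard forms (generic DEA notation; this is not the Connected-SBM model itself):
\[
\text{radial (input-oriented CCR):}\quad
\theta^{*}=\min\ \theta
\quad\text{s.t.}\quad X\lambda \le \theta x_{o},\; Y\lambda \ge y_{o},\; \lambda \ge 0,
\]
\[
\text{non-radial (SBM):}\quad
\rho^{*}=\min\ \frac{1-\tfrac{1}{m}\sum_{i=1}^{m} s_{i}^{-}/x_{io}}{1+\tfrac{1}{s}\sum_{r=1}^{s} s_{r}^{+}/y_{ro}}
\quad\text{s.t.}\quad x_{o}=X\lambda+s^{-},\; y_{o}=Y\lambda-s^{+},\; \lambda, s^{-}, s^{+}\ge 0.
\]
The radial score scales all inputs by a single factor and ignores leftover slacks, while the SBM score is built entirely from the slacks; the Connected-SBM framework described above interpolates between these two behaviors.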

5.
Optimization, 2012, 61(1–4): 369–385
In this paper, we are concerned with global efficiency in multiobjective optimization. After establishing a property of cone-subconvexlike functions, we prove that a local weakly efficient solution, a local efficient solution and a local properly efficient solution are, respectively, a global weakly efficient solution, a global efficient solution and a global properly efficient solution of a multiobjective programming problem, provided cone-subconvexlikeness or cone-pre-invexity is assumed.

6.
This article proposes a new method for measuring the aggregative efficiency of multiple-period production systems. Every organization or firm generates a time series of data that represents its performance in resource utilization and output production over multiple periods, and often desires an aggregated measure of efficiency for several periods of interest. Data envelopment analysis (DEA) has become an accepted and well-known approach to evaluating efficiency performance in a wide range of cases. However, most DEA studies have dealt primarily with ways to gauge the efficiency of production in a single period, so such measures capture only a partial picture of performance over multiple periods. The new method is developed through extensions of the concept of Debreu–Farrell technical efficiency and is applied to evaluating the efficiency of cable TV service units using three years of data.
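For reference, the single-period notion being extended is Debreu–Farrell (input-oriented) technical efficiency, and a simple way to aggregate it over T periods, used here purely as an illustration rather than as the paper's construction, is a weighted average:
\[
TE_{t}(x_{t},y_{t})=\min\{\theta>0 : \theta x_{t}\ \text{can still produce}\ y_{t}\},
\qquad
TE_{\text{agg}}=\sum_{t=1}^{T} w_{t}\,TE_{t},\quad w_{t}\ge 0,\ \sum_{t=1}^{T} w_{t}=1,
\]
with weights w_t chosen, for example, in proportion to each period's input cost.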

7.
This paper studies the sale of a single indivisible object where bidders have continuous valuations. In Grigorieva et al. [14] it was shown that, in this setting, query auctions necessarily allocate inefficiently in equilibrium. In this paper we propose a new sequential auction, called the c-fraction auction. We show the existence of an ex-post equilibrium, called the bluff equilibrium, in which bidders behave truthfully except in particular constellations of observed bids at which it is optimal to pretend a slightly higher valuation. We show that c-fraction auctions guarantee approximate efficiency at any desired level of accuracy, independent of the number of bidders, when bidders play the bluff equilibrium. We discuss the running time and the efficiency in the bluff equilibrium and show that, by changing the parameter c of the auction, efficiency can be traded off against running time.

8.
9.
Data envelopment analysis (DEA) has proven to be a useful tool for assessing the efficiency or productivity of organizations, which is of vital practical importance in managerial decision making. While DEA assumes exact input and output data, the development of imprecise DEA (IDEA) broadens the scope of applications to efficiency evaluations involving imprecise information, which covers various forms of ordinal and bounded data that often occur in practice. The primary purpose of this article is to characterize the variable efficiency in IDEA. Since DEA describes a pair of primal and dual models, also called envelopment and multiplier models, we can basically consider two IDEA models: one incorporates imprecise data into the envelopment model and the other includes the same imprecise data in the multiplier model. The issues of rising importance are thus the relationships between the two models and how to solve them. The groundwork we lay includes a duality study, which makes it possible to characterize the efficiency solutions from the two models and to link them with the efficiency bounds and classifications obtained in some of the published IDEA studies. The other purposes are to present computational aspects of the efficiency bounds and how to interpret the efficiency solutions. The computational method developed here extends the previous IDEA method to effectively incorporate a more general form of strict ordinal data and partial orders in its framework, which in turn overcomes some drawbacks of the previous approaches. The interpretation of the resulting efficiency is also important but has not been addressed before.

10.
Quantile regression for robust bank efficiency score estimation   (cited 1 time: 0 self-citations, 1 by others)
We discuss quantile regression techniques as a robust and easy-to-implement alternative for estimating Farrell technical efficiency scores. The quantile regression approach estimates the production process for benchmark banks located at top conditional quantiles. Monte Carlo simulations reveal that even when data are generated according to the assumptions of the stochastic frontier model (SFA), efficiency estimates obtained from quantile regressions resemble SFA efficiency estimates. We apply the SFA and the quantile regression approach to German bank data for three banking groups (commercial banks, savings banks and cooperative banks) to estimate efficiency scores based on a simple value added function and a multiple-input, multiple-output cost function. The results reveal that the efficient (benchmark) banks have production and cost elasticities which differ considerably from elasticities obtained from conditional mean functions and stochastic frontier functions.
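A minimal sketch of the quantile-regression step, assuming synthetic data and a simple Cobb-Douglas production relation; the variable names and the 0.95 quantile are illustrative choices, not the authors' specification:

# Illustrative sketch (not the authors' code): fit a log-linear frontier at a
# high conditional quantile with statsmodels, then read off Farrell-style
# efficiency scores as the ratio of observed to frontier output.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
labor = rng.lognormal(mean=1.0, sigma=0.3, size=n)
capital = rng.lognormal(mean=1.5, sigma=0.4, size=n)
ineff = rng.exponential(scale=0.2, size=n)          # one-sided inefficiency
noise = rng.normal(scale=0.05, size=n)               # symmetric noise
output = np.exp(0.5 * np.log(labor) + 0.4 * np.log(capital) - ineff + noise)

df = pd.DataFrame({"ly": np.log(output),
                   "ll": np.log(labor),
                   "lk": np.log(capital)})

# Benchmark banks are approximated by the 95th conditional quantile.
frontier = smf.quantreg("ly ~ ll + lk", df).fit(q=0.95)
df["ly_hat"] = frontier.predict(df)

# Output-oriented efficiency score in (0, 1]: observed over frontier output.
df["efficiency"] = np.clip(np.exp(df["ly"] - df["ly_hat"]), a_min=None, a_max=1.0)
print(frontier.params)
print(df["efficiency"].describe())

Here the 95th conditional quantile stands in for the benchmark banks, and each bank's score is the ratio of its observed output to the predicted frontier output, capped at 1.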

11.
Additive efficiency decomposition in two-stage DEA   (cited 1 time: 0 self-citations, 1 by others)
Kao and Hwang (2008) [Kao, C., Hwang, S.-N., 2008. Efficiency decomposition in two-stage data envelopment analysis: An application to non-life insurance companies in Taiwan. European Journal of Operational Research 185 (1), 418–429] develop a data envelopment analysis (DEA) approach for measuring the efficiency of decision processes which can be divided into two stages. The first stage uses inputs to generate outputs which become the inputs to the second stage. These first-stage outputs are referred to as intermediate measures. The second stage then uses the intermediate measures to produce outputs. Kao and Hwang represent the efficiency of the overall process as the product of the efficiencies of the two stages. A major limitation of this model is its applicability to only constant returns to scale (CRS) situations. The current paper develops an additive efficiency decomposition approach wherein the overall efficiency is expressed as a (weighted) sum of the efficiencies of the individual stages. This approach can be applied under both CRS and variable returns to scale (VRS) assumptions. The case of Taiwanese non-life insurance companies is revisited using this newly developed approach.
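The two decompositions can be contrasted directly (illustrative notation): with stage efficiencies e_1 (inputs to intermediate measures) and e_2 (intermediate measures to outputs),
\[
e_{\text{overall}} = e_{1}\cdot e_{2} \quad\text{(multiplicative, CRS only)},
\qquad
e_{\text{overall}} = w_{1}e_{1} + w_{2}e_{2},\ \ w_{1}+w_{2}=1 \quad\text{(additive, CRS or VRS)},
\]
where one common choice, used here only as an illustration, takes w_1 and w_2 proportional to the share of total resources devoted to each stage.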

12.
During the late 1970s and 1980s, volatility in the demand for natural gas in the United States created havoc in the industry's transmission sector. Managers of firms were presented with the task of improving productive efficiency in an uncertain and regulated industry. We measure the relative performance of 20 pipelines, using Data Envelopment Analysis (DEA) to produce a regulation-constrained minimum cost frontier. With the results we construct a Fisher Productivity Index (FPI). The results show that, despite a decline in productive efficiency in most years, most firms experienced improvement in technical efficiency and technical progress in most years. Most of the productive efficiency decline was due to scale diseconomies.
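For reference, the Fisher index underlying the FPI is, in its standard textbook form (not quoted from the paper), the geometric mean of the Laspeyres and Paasche indexes:
\[
F_{0,1}=\sqrt{L_{0,1}\,P_{0,1}},
\qquad
L_{0,1}=\frac{p_{0}\cdot q_{1}}{p_{0}\cdot q_{0}},
\qquad
P_{0,1}=\frac{p_{1}\cdot q_{1}}{p_{1}\cdot q_{0}},
\]
where p_t and q_t denote the period-t price and quantity vectors.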

13.
The advent of Internet banking and phone banking is changing the role of bank branches from a predominantly transaction-based one to a sales-oriented role. This paper reports on an assessment of the branches of a Portuguese bank in terms of their performance in their new roles in three different areas: their efficiency in fostering the use of new transaction channels, their efficiency in increasing sales and their customer base, and their efficiency in generating profits. Service quality is also a major issue in service organisations like bank branches, and therefore we analyse the way this dimension of performance has been accounted for in the literature and take it into account in our empirical application. We have used data envelopment analysis (DEA) for the different performance assessments, but we depart from traditional DEA models in some cases. Performance comparisons on each dimension allowed us to identify benchmark bank branches and also problematic bank branches. In addition, we found positive links between operational and profit efficiency and also between transactional and operational efficiency. Service quality is positively related with operational and profit efficiency.

14.
Data envelopment analysis (DEA) is a useful tool of efficiency measurement for firms and organizations. Kao and Hwang (2008) take into account the series relationship of the two sub-processes in a two-stage production process, where the overall efficiency of the whole process is the product of the efficiencies of the two sub-processes. To find the largest efficiency of one sub-process while maintaining the maximum overall efficiency of the whole process, Kao and Hwang (2008) propose a solution procedure for this purpose. Nevertheless, one needs to know the overall efficiency of the whole process before calculating the sub-process efficiencies. In this note, we propose a method that finds the sub-process and overall efficiencies simultaneously.

15.
This paper uses the nonparametric DEA methodology to estimate the cost and profit efficiency of Indian banks during the post-reform period. The results show considerable variation in average levels of profit efficiency across various ownership categories of banks. In general, state-owned banks are found to be more efficient than their private counterparts. Further, efficiency tends to be low among the small banks (assets up to Rs. 50 billion), indicating that at the existing scale of operations these banks are operating far below the efficient frontier. We also examine the distribution of efficiency using nonparametric kernel density estimates. The analysis reveals a rightward shift of the efficiency distribution over the years. A major part of this shift comes from the state-owned banks. Based on the conditional distribution, the study finds strong evidence of ownership explaining the efficiency differential of banks. Additionally, bank size and product mix are also found to be important, although to a lesser extent.

16.
The current non-parametric method of measuring the productive efficiency of input-output systems is generalized here to the stochastic case in terms of an information theory approach based on the concept of entropy. The use of maximum entropy as a method of finding the most probable distribution of the input-output data set and as a predictive criterion is illustrated for production systems with multiple inputs and outputs.
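For reference, the generic maximum-entropy program behind this kind of approach (standard form, in illustrative notation) chooses the distribution p over observed configurations x_1, ..., x_n that solves
\[
\max_{p}\ -\sum_{i=1}^{n} p_{i}\ln p_{i}
\quad\text{s.t.}\quad
\sum_{i=1}^{n} p_{i}=1,\qquad
\sum_{i=1}^{n} p_{i}\,g_{k}(x_{i})=\mu_{k}\ (k=1,\dots,K),\qquad p_{i}\ge 0,
\]
whose solution takes the exponential form p_{i}\propto\exp\bigl(-\sum_{k}\lambda_{k}g_{k}(x_{i})\bigr), with Lagrange multipliers \lambda_{k} attached to the moment constraints.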

17.
Conventional two-stage data envelopment analysis (DEA) models measure the overall performance of a production system composed of two stages (processes) in a specified period of time, where variations across different periods are ignored. This paper takes the operations of individual periods into account to develop a multi-period two-stage DEA model, which is able to measure the overall and period efficiencies at the same time, with the former expressed as a weighted average of the latter. Since the efficiency of a two-stage system in a period is the product of the two process efficiencies, the overall efficiency of a decision making unit (DMU) in the specified period of time can be decomposed into the process efficiencies of each period. Based on this decomposition, the sources of inefficiency in a DMU can be identified. The efficiencies measured from the model can also be used to calculate a common-weight global Malmquist productivity index (MPI) between two periods, such that the overall MPI is the product of the two process MPIs. The non-life insurance industry in Taiwan is used to verify the proposed model, and to explain why some companies performed unsatisfactorily in the specified period of time.
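Schematically (illustrative notation, not the paper's exact formulation), the decompositions described above read
\[
E_{\text{overall}}=\sum_{t=1}^{T} w_{t}E^{t},\quad \sum_{t=1}^{T} w_{t}=1,
\qquad
E^{t}=E^{t}_{1}\cdot E^{t}_{2},
\qquad
MPI_{\text{overall}}=MPI_{1}\cdot MPI_{2},
\]
so that overall inefficiency can be traced back to a specific process in a specific period.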

18.
DEA model with shared resources and efficiency decomposition   (cited 2 times: 0 self-citations, 2 by others)
Data envelopment analysis (DEA) has proved to be an excellent approach for measuring the performance of decision making units (DMUs) that use multiple inputs to generate multiple outputs. In many real-world scenarios, DMUs have a two-stage network process with shared input resources used in both stages of operations. For example, in hospital operations, some of the input resources such as equipment, personnel, and information technology are used in the first stage to generate medical records that track treatments, tests, drug dosages, and costs. The same set of resources used by first-stage activities is used to generate the second-stage patient services. Patient services also use the services generated by the first-stage operations of housekeeping, medical records, and laundry. These DMUs have not only inputs and outputs, but also intermediate measures that exist in between the two-stage operations. The distinguishing characteristic is that some of the inputs to the first stage are shared by both the first and second stage, but some of the shared inputs cannot be conveniently split up and allocated to the operations of the two stages. Recognizing this distinction is critical for these types of DEA applications, because measuring the efficiency of the production of first-stage outputs can be misleading and can understate the efficiency if DEA fails to consider that some of the inputs also generate second-stage outputs. The current paper develops a set of DEA models for measuring the performance of two-stage network processes with non-splittable shared inputs. An additive efficiency decomposition for the two-stage network process is presented. The models are developed under the assumption of variable returns to scale (VRS), but can be readily applied under the assumption of constant returns to scale (CRS). An application is provided.

19.
A new concentration factor for computing the jump values of a function is introduced. The results obtained are related to the classical Lukács theorem and to some results of A. Gelb and E. Tadmor.
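For orientation, the Gelb–Tadmor framework referred to here detects jumps through a generalized conjugate partial sum of the form below (general form from the literature, with normalization conventions that vary; not quoted from this paper):
\[
\tilde S_{N}^{\sigma}[f](x)
= i\sum_{0<|k|\le N}\operatorname{sgn}(k)\,\sigma\!\Bigl(\tfrac{|k|}{N}\Bigr)\hat f(k)\,e^{ikx}
\ \longrightarrow\ [f](x):=f(x^{+})-f(x^{-})\qquad (N\to\infty),
\]
for a suitably normalized concentration factor \sigma; the classical Lukács theorem corresponds to the logarithmically normalized conjugate Fourier sum itself.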

20.
1. Main Result. The SK-model is one of the well-known mean field models for spin glasses, and Talagrand [1] reported some advances on this model. Concretely, the SK-model can be described as follows. SK-model (without external field): for a natural number N, let … where (g_{ij}) …
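The defining formula is cut off in the excerpt above. For reference, the standard SK Hamiltonian without external field, as it appears in the general spin-glass literature (not recovered from this excerpt), is
\[
H_{N}(\sigma)=\frac{1}{\sqrt{N}}\sum_{1\le i<j\le N} g_{ij}\,\sigma_{i}\sigma_{j},
\qquad \sigma=(\sigma_{1},\dots,\sigma_{N})\in\{-1,+1\}^{N},
\]
where the couplings (g_{ij})_{1\le i<j\le N} are i.i.d. standard Gaussian random variables.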
