In recent years, the number of publications on the theory and applications of Data Envelopment Analysis (DEA) has grown exponentially. Charnes, Cooper, and Rhodes (1978) introduced DEA as a tool for measuring the efficiency and productivity of decision-making units, and it was quickly recognized as a modern tool for performance measurement. Since then, a large body of articles has appeared, including significant theoretical breakthroughs and a substantial share of applied work assessing the efficiency and productivity of activities in both the public and private sectors. Although several bibliographic collections have been reported, a comprehensive analysis and listing of DEA-related articles covering the field's first four decades is still missing. This paper therefore reports an extensive listing of DEA-related articles, including theory and methodology developments as well as "real" applications in diverse settings, from 1978 to the end of 2016. Summary statistics on publication growth, the most utilized academic journals, authorship, and keywords are also provided.
Optimization modeling has become a powerful tool for tackling emergency logistics problems since its first adoption in maritime disaster situations in the 1970s. Using content analysis techniques, this paper reviews optimization models utilized in emergency logistics. Disaster operations can be performed before or after a disaster occurs. Short-notice evacuation, facility location, and stock pre-positioning are identified as the main pre-disaster operations, while relief distribution and casualty transportation are categorized as post-disaster operations. Following these operations, the literature is broken down into three parts: facility location, relief distribution and casualty transportation, and other operations. For the first two parts, the literature is structured and analyzed by model type, decisions, objectives, and constraints. Finally, through the content-analysis framework, several research gaps are identified and future research directions are proposed.
Since the original Data Envelopment Analysis (DEA) study by Charnes et al. [Measuring the efficiency of decision-making units. European Journal of Operational Research 1978;2(6):429–44], there has been rapid and continuous growth in the field. As a result, a considerable amount of published research has appeared, with a significant portion focused on DEA applications of efficiency and productivity in both public and private sector activities. While several bibliographic collections have been reported, a comprehensive listing and analysis of DEA research covering its first 30 years of history is not available. This paper thus presents an extensive, if not nearly complete, listing of DEA research covering theoretical developments as well as “real-world” applications from inception to the year 2007. A listing of the most utilized/relevant journals, a keyword analysis, and selected statistics are presented.
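The Charnes et al. model cited above can be sketched as an input-oriented envelopment linear program, solved once per decision-making unit (DMU). The sketch below uses scipy and a three-DMU toy data set that is purely illustrative, not drawn from any study in this collection.

```python
# Minimal sketch of the input-oriented CCR (constant returns-to-scale)
# envelopment model of Charnes, Cooper, and Rhodes (1978):
#   min theta  s.t.  X @ lam <= theta * x_k,  Y @ lam >= y_k,  lam >= 0.
# The three-DMU data set below is a made-up illustration.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Efficiency of DMU k. X is (m, n) inputs, Y is (s, n) outputs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.concatenate(([1.0], np.zeros(n)))          # minimize theta
    A_ub = np.vstack([
        np.hstack([-X[:, [k]], X]),                   # X@lam - theta*x_k <= 0
        np.hstack([np.zeros((s, 1)), -Y]),            # -Y@lam <= -y_k
    ])
    b_ub = np.concatenate([np.zeros(m), -Y[:, k]])
    bounds = [(0, None)] * (n + 1)                    # theta, lam >= 0
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds,
                   method="highs").fun

X = np.array([[2.0, 4.0, 8.0]])   # one input, three DMUs
Y = np.array([[2.0, 4.0, 4.0]])   # one output
scores = [ccr_efficiency(X, Y, k) for k in range(3)]
```

Under constant returns, DMUs 0 and 1 (output/input ratio of 1) define the frontier, so DMU 2, with ratio 0.5, receives an efficiency score of 0.5.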
Radial measures of efficiency estimated using linear programming (LP) methods can be biased since slack in the constraints defining the technology suggests that at least one input can be reduced, or one output can be expanded, even though a firm is deemed to be “technically efficient.” In this paper, we propose a directional slacks-based measure of technical inefficiency to account for the potential of slack in technological constraints. When no such slacks exist, directional slacks-based inefficiency collapses to the directional technology distance function. Our proposed measure helps to generalize some of the existing slacks-based measures of inefficiency. We examine the financial services provided by Japanese cooperative Shinkin banks, and estimate their inefficiency during the period 2002–2005. This inefficiency declined slightly during the period. We thus propose that slack is an important source of inefficiency which is often not captured by the directional technology distance function.
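As a reference point for the slacks-based measure, the directional technology distance function it generalizes can be sketched as a linear program. The direction vector chosen here, (gx, gy) = (x_k, y_k), and the toy CRS data are assumptions for illustration, not the Shinkin bank data.

```python
# Directional technology distance function sketch: maximize beta such that
# (x_k - beta*gx, y_k + beta*gy) remains inside the CRS technology.
import numpy as np
from scipy.optimize import linprog

def directional_distance(X, Y, k, gx, gy):
    """Returns beta for DMU k; beta = 0 means the unit is on the frontier."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.concatenate(([-1.0], np.zeros(n)))         # maximize beta
    A_ub = np.vstack([
        np.hstack([gx.reshape(-1, 1), X]),            # X@lam + beta*gx <= x_k
        np.hstack([gy.reshape(-1, 1), -Y]),           # beta*gy - Y@lam <= -y_k
    ])
    b_ub = np.concatenate([X[:, k], -Y[:, k]])
    bounds = [(0, None)] * (n + 1)
    return -linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds,
                    method="highs").fun

X = np.array([[2.0, 4.0, 8.0]])   # one input, three units
Y = np.array([[2.0, 4.0, 4.0]])   # one output
beta = directional_distance(X, Y, 2, gx=X[:, 2], gy=Y[:, 2])
```

With this direction, unit 2 can simultaneously shrink its input and expand its output by a factor beta = 1/3 before hitting the CRS frontier; the slacks-based measure of the paper augments this value when residual slacks remain at the projected point.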
Disaster relief presents many unique logistics challenges, including damaged transportation infrastructure, limited communication, and the coordination of multiple agents. Central to disaster relief logistics is the distribution of life-saving commodities to beneficiaries. Operations research models have the potential to help relief agencies save lives and money, maintain standards of humanitarianism and fairness, and maximize the use of limited resources amid post-disaster chaos. Through interviews with aid organizations, reviews of their publications, and a literature review of operations research models for the transportation of relief goods, this paper analyzes the use of such models from the perspectives of both practitioners and academics. Given the complexity of disaster relief distribution and the relatively small number of journal articles written on it, this is an area with potential both for helping relief organizations and for tremendous growth in operations research.
This paper proposes a logistics model for the delivery of prioritized items in disaster relief operations. It considers a scenario with multiple items, multiple vehicles, multiple periods, soft time windows, and a split-delivery strategy, and is formulated as a multi-objective integer programming model. To solve this model effectively, we limit the number of available tours, introducing two heuristic approaches for this purpose: the first is based on a genetic algorithm, while the second is developed by decomposing the original problem. We compare these two approaches in a computational study. The multi-objective problem is converted to a single-objective problem by the weighted-sum method. A case study is presented to illustrate the potential applicability of our model, along with a comparison against the model proposed in a recent paper by Balcik et al. The results show that our proposed model outperforms theirs in terms of delivering prioritized items over several time periods.
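The weighted-sum conversion mentioned above can be sketched as follows. The two objectives, their normalization bounds, and the weights are hypothetical stand-ins, not values from the paper.

```python
# Hedged sketch of the weighted-sum method for scalarizing a
# multi-objective minimization problem; objectives are normalized to
# [0, 1] first so the weights are comparable. All numbers are made up.
def weighted_sum(values, bounds, weights):
    """values[i] is objective i's raw value; bounds[i] = (best_i, worst_i)
    are its ideal and worst values; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    total = 0.0
    for v, (lo, hi), w in zip(values, bounds, weights):
        total += w * (v - lo) / (hi - lo)   # normalized contribution
    return total

# e.g. unmet prioritized demand (0..100) traded off against travel time (0..60)
score = weighted_sum([20.0, 30.0], [(0.0, 100.0), (0.0, 60.0)], [0.7, 0.3])
```

Normalizing before weighting keeps an objective measured in large units from silently dominating the scalarized objective.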
The goal of this research is to develop a comprehensive model that describes integrated logistics operations in response to natural disasters. We propose a mathematical model that controls the flow of several relief commodities from their sources through the supply chain until they are delivered into the hands of recipients. The network structure complies with FEMA's complex logistics structure. The proposed model not only considers details such as vehicle routing and pick-up and delivery schedules, but also finds optimal locations for several layers of temporary facilities while respecting capacity constraints on each facility and on the transportation system. Such an integrated model enables a centralized operations plan that can eliminate delays and assign limited resources to their best possible use. A set of numerical experiments is designed to test the proposed formulation and evaluate the properties of the optimization problem. The numerical analysis shows that the model can handle large-scale relief operations in adequate detail. However, problem size and difficulty grow rapidly as the length of the operations is extended or when equity among recipients is considered; for such cases, fast solution algorithms and heuristic methods are suggested for future research.
Natural disasters often result in large numbers of evacuees being temporarily housed in schools, churches, and other shelters. The sudden influx of people seeking shelter creates demand for emergency supplies, which must be delivered quickly. A dynamic allocation model is constructed to optimize pre-event planning for meeting short-term demand (over approximately the first 72 h) for emergency supplies under uncertainty about what demands will have to be met and where they will occur. The model also includes reliability requirements, i.e., the solution must ensure that all demands are met in a set of scenarios comprising at least a specified percentage of all outcomes. A case study using shelter locations in North Carolina and a set of hurricane threat scenarios illustrates the model and how it supports an emergency relief strategy.
This paper proposes a new framework that combines dynamic DEA, meta-frontier analysis, and a truncated regression model, and applies it to the efficiency evaluation of regional high-tech industries in China. In terms of overall technical efficiency, technical efficiency, and scale efficiency scores, the east area is always in the lead, with the central and west areas obviously lagging behind. The eastern area has the highest technology level, with the west and central areas falling behind in turn. However, the meta-technology ratio of the west area has rapidly increased and shows a trend of catching up with the east. GRP per capita, total exports and imports, highway mileage per capita, and the ratio of tertiary industry to GRP are positively related to technical efficiency, while the time trend exhibits a negative coefficient.
This paper provides the first taxonomy of hospital efficiency studies that use data envelopment analysis (DEA) and related techniques. We provide a systematic review of 79 such studies published from 1984 to 2004, representing 12 countries; only studies written in English are considered. A cross-national comparison reveals significant differences with respect to important study characteristics such as the type of DEA model selected and the choice of input and output categories. Compared with US studies, European efforts are more likely to measure allocative rather than technical efficiency, use longitudinal data, and use fewer observations. We take a longitudinal perspective that illustrates the evolution of this research, as well as its diffusion across disciplines. Our taxonomy can be used by policy makers and researchers to review past DEA models and assemble new ones.
In the event of a catastrophic bio-terror attack, major urban centers need to efficiently distribute large amounts of medicine to the population. In this paper, we consider a facility location problem to determine the points in a large city where medicine should be handed out to the population. We consider locating capacitated facilities in order to maximize coverage, taking into account a distance-dependent coverage function and demand uncertainty. We formulate a special case of the maximal covering location problem (MCLP) with a loss function, to account for the distance-sensitive demand, and chance-constraints to address the demand uncertainty. This model decides the locations to open, and the supplies and demand assigned to each location. We solve this problem with a locate-allocate heuristic. We illustrate the use of the model by solving a case study of locating facilities to address a large-scale emergency of a hypothetical anthrax attack in Los Angeles County.
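The locate-allocate idea can be illustrated with a greedy sketch for a capacitated covering problem. The exponential distance-decay coverage function, the capacity, and all data below are hypothetical stand-ins, and the chance constraints on demand in the actual model are not represented.

```python
# Greedy locate-allocate sketch for a capacitated covering problem.
import math

def coverage(dist, d0=5.0):
    """Coverage quality decays with distance (1.0 at distance 0)."""
    return math.exp(-dist / d0)

def greedy_locate(demand, sites, dists, capacity, p):
    """Open up to p sites; each step opens the site whose capacity-feasible
    allocation adds the most decay-weighted covered demand."""
    opened, covered = [], 0.0
    remaining = dict(demand)                  # demand node -> uncovered amount
    for _ in range(p):
        best, best_gain, best_plan = None, 0.0, None
        for j in sites:
            if j in opened:
                continue
            cap, gain, plan = capacity, 0.0, []
            for i in sorted(remaining, key=lambda node: dists[node][j]):
                amt = min(remaining[i], cap)  # serve nearest demand first
                gain += amt * coverage(dists[i][j])
                plan.append((i, amt))
                cap -= amt
                if cap <= 0:
                    break
            if gain > best_gain:
                best, best_gain, best_plan = j, gain, plan
        if best is None:                      # no site adds coverage
            break
        opened.append(best)
        covered += best_gain
        for i, amt in best_plan:
            remaining[i] -= amt
    return opened, covered

demand = {0: 10.0, 1: 10.0}
dists = {0: {0: 0.0, 1: 10.0}, 1: {0: 10.0, 1: 0.0}}
opened, covered = greedy_locate(demand, [0, 1], dists, capacity=10.0, p=1)
```

Each iteration alternates a location decision (which site to open) with an allocation decision (which demand it serves), mirroring the locate-allocate structure described in the abstract.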
This paper describes a software package for computing non-parametric efficiency estimates, making inference, and testing hypotheses in frontier models. Commands are provided for bootstrapping as well as computation of some new, robust estimators of efficiency, etc.
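The package described above is an R library; as a language-neutral illustration of the resampling idea it implements far more carefully (including smoothed bootstrap procedures for frontier estimators), here is a naive percentile bootstrap in Python on a made-up sample of efficiency scores. This toy version does not reproduce the package's methods.

```python
# Naive percentile-bootstrap sketch: resample with replacement, recompute
# the statistic, and read off empirical percentiles. Illustrative only.
import random

def bootstrap_ci(sample, stat, reps=1000, alpha=0.05, seed=0):
    """Percentile confidence interval for stat(sample) by resampling."""
    rng = random.Random(seed)                 # fixed seed for reproducibility
    n = len(sample)
    stats = sorted(stat([rng.choice(sample) for _ in range(n)])
                   for _ in range(reps))
    return (stats[int(reps * alpha / 2)],
            stats[int(reps * (1 - alpha / 2)) - 1])

mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci([0.6, 0.7, 0.8, 0.9, 1.0], mean)  # CI for the mean score
```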
Disasters are extraordinary situations that require significant logistical deployment to transport equipment and humanitarian goods in order to help and provide relief to victims. An efficient response helps to reduce the social, economic and environmental impacts. In this paper, we define and formulate a practical transportation problem often encountered by crisis managers in emergency situations. Since optimal solutions to such a formulation may be achieved only for very small-size instances, we developed an efficient genetic algorithm to deal with realistic situations. This algorithm produces near optimal solutions in relatively short computation times and is fast enough to be used interactively in a decision-support system, providing high-quality transportation plans to emergency managers.
This study focuses on the allocation of R&D resources in R&D-active firms. We utilize input-oriented constant (CRS) and variable (VRS) returns-to-scale data envelopment analysis (DEA) models to evaluate firm efficiency. Scale efficiency and the respective types of returns to scale are examined using DEA models with ratio inputs and outputs. We pay attention to the global frontier as well as each firm's own sector and size frontiers. We highlight the sources of inefficiency and propose ways to improve the efficiency of R&D resource allocation. The analysis is based on a representative set of (quasi-)permanent R&D-active firms in Belgium. We consider R&D-related inputs in the year 2009 and measure firm performance in terms of turnover and net added value per employee over a four-year span. The paper highlights that, on average, R&D-active firms suffer from both technical inefficiency and scale-size problems, since average CRS and VRS efficiency scores are low and average scale efficiency is modest. By firm size, small firms suffer from both scale and technical inefficiency; medium-sized firms endure scale inefficiency rather than technical inefficiency; and large firms present higher average scale and technical efficiency. By sector of activity, firms in specialized-supplier industries tend to outperform other firms in terms of average scale and technical efficiency, while firms in science-based industries underperform on average in terms of VRS and scale efficiency.
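The CRS/VRS comparison can be sketched as follows: the VRS (BCC) envelopment LP adds the convexity constraint sum(lambda) = 1 to the standard CRS model, and scale efficiency is the ratio of the CRS score to the VRS score. The one-input, one-output toy data are illustrative only, not the Belgian firm data.

```python
# Input-oriented envelopment LP with optional convexity constraint (VRS);
# scale efficiency = CRS score / VRS score. Toy data only.
import numpy as np
from scipy.optimize import linprog

def input_efficiency(X, Y, k, vrs=False):
    """Efficiency of DMU k; vrs=True adds the BCC convexity constraint."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.concatenate(([1.0], np.zeros(n)))          # minimize theta
    A_ub = np.vstack([np.hstack([-X[:, [k]], X]),     # X@lam <= theta*x_k
                      np.hstack([np.zeros((s, 1)), -Y])])  # Y@lam >= y_k
    b_ub = np.concatenate([np.zeros(m), -Y[:, k]])
    A_eq = np.array([[0.0] + [1.0] * n]) if vrs else None  # sum(lam) = 1
    b_eq = np.array([1.0]) if vrs else None
    bounds = [(0, None)] * (n + 1)
    return linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                   bounds=bounds, method="highs").fun

X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[2.0, 4.0, 5.0]])
crs = input_efficiency(X, Y, 2)               # inefficient under CRS
vrs = input_efficiency(X, Y, 2, vrs=True)     # on the VRS frontier
scale = crs / vrs                             # scale efficiency < 1
```

Here DMU 2 is VRS-efficient but CRS-inefficient, so its entire inefficiency (scale = 0.625) is attributable to scale size rather than to technical inefficiency, the distinction the abstract draws between small, medium, and large firms.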
This study investigates the capacity utilization (CU) of Chinese manufacturing industries using a CU indicator based on data envelopment analysis and directional distance functions (DDFs). The inputs are separated into variable inputs and a quasi-fixed input to measure the gap between DDFs, which indicates either under-utilization of inputs or overcapacity. Moreover, we define an indicator of CU change over time and introduce the corresponding decomposition. During the study period (2007–2010), the CU of Chinese manufacturing industries improved, implying that these industries expanded their production and moved closer to capacity; the driving force of this improvement was technical change. The higher average CU values of light manufacturing industries relative to heavy industries, together with the extremely high CU values of two light industries, reveal a severe overcapacity problem in the light industries. We also provide methods and analysis for determining the optimal variable inputs and the type of overcapacity for specific DMUs. Bootstrap regression procedures are employed to test the influence of environmental variables on CU values. Finally, we provide policy implications and suggestions for policymakers who oversee the development of Chinese manufacturing industries.
This paper examines the efficiency of producing quality in hospitals between 2009 and 2013 using Dynamic Network Data Envelopment Analysis (DEA), along with the hospital characteristics that contribute to this efficiency. Dynamic Network DEA was used to compute efficiency scores for hospital sub-divisions, i.e., medical/surgical care (patient visits, surgeries, and discharges) and quality. Pearson's correlation test was performed to assess whether there are trade-offs or synergies between the efficiency of producing quality and the efficiency of producing medical/surgical care. Multinomial logistic regression was performed to determine the hospital and market characteristics that contribute to efficiency in the production of quality outputs. The efficiency of quality production improved significantly between 2009 and 2013, with no trade-off between the efficiency of producing quality outputs and the efficiency of producing medical care. Urban and teaching hospitals were less likely to improve the efficiency of quality production.
Different types of plants are used to generate electricity in the US: single-, multi-, and mixed-electricity plants. In this paper, we ask which type/design of plant is best for both renewable and non-renewable electricity. To do so, we suggest a new index that takes the form of a Malmquist productivity index. The specificity of our new index is that it can investigate the performance, and the causes of performance change, of each type of electricity separately, which is not possible with more standard indexes. Moreover, our new index takes the links between inputs and outputs into account and is nonparametric in nature. Using this index, we study the performance of more than 5000 plants over the period 2000–2012. Our findings reveal that single-electricity plants perform better for renewable electricity, while multi-electricity plants perform better for non-renewable electricity. This is consistent with the decreasing importance of multi-electricity plants in the US and the increasing importance of single-electricity plants producing renewable electricity. Furthermore, our results do not suggest that combining renewable and non-renewable electricity generation within a plant improves plant performance. Finally, we show that the reasons for performance change differ for each type of electricity and plant.
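The paper's index builds on the standard Malmquist productivity index, which can be computed from four distance-function values as sketched below. The numbers are made up, and the paper's extension to separate electricity types is not reproduced.

```python
# Standard Malmquist productivity index from period-t and period-(t+1)
# distance-function values; illustrative numbers only.
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """M = sqrt[(D_t(x_{t+1}, y_{t+1}) / D_t(x_t, y_t))
              * (D_{t+1}(x_{t+1}, y_{t+1}) / D_{t+1}(x_t, y_t))].
    The geometric mean averages the two base-period choices; values above 1
    are conventionally read as productivity growth."""
    return math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))

m = malmquist(d_t_t=1.0, d_t_t1=1.2, d_t1_t=0.9, d_t1_t1=1.0)
```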
One of the key objectives of humanitarian logistics is to guarantee the timely delivery of supplies to people affected by disasters during the response phase. In this regard, it is fundamental to design appropriate models for minimizing the social costs of the response operations that distribute essential supplies to populations in need. Beyond logistics costs alone, social costs include deprivation costs, an increasing function of deprivation time that captures the human suffering caused by the lack of access to a good or service. This research uses discrete choice theory to assess the deprivation costs arising from the time spent waiting for the delivery of a basket of basic supplies, defined as the change in welfare of people affected by disasters. To this end, we designed a stated-choice survey and applied it to people living in areas affected by floods and earthquakes in Colombia. The estimated models consider the influence of individuals' socioeconomic characteristics and random effects on the deprivation cost functions. The functions are nonlinear, strictly increasing, and convex in deprivation time. The results are useful for estimating the social costs of humanitarian relief operations.