943 results for Efficiency models


Relevância: 30.00%

Resumo:

Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. Such imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA; (2) we propose a new fuzzy additive DEA model derived from the α-level approach; and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature. Copyright © 2011 Inderscience Enterprises Ltd.
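The α-level approach mentioned above replaces each fuzzy observation by the interval of values whose membership is at least α, so a fuzzy DEA program becomes a family of interval programs indexed by α. A minimal sketch for triangular fuzzy numbers (the function name and example values are illustrative, not taken from the paper):

```python
def alpha_cut(low, mode, high, alpha):
    """Return the alpha-cut interval [L, U] of a triangular fuzzy
    number (low, mode, high): all x with membership >= alpha."""
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    lower = low + alpha * (mode - low)
    upper = high - alpha * (high - mode)
    return lower, upper

# A fuzzy input "about 5", ranging between 4 and 7:
print(alpha_cut(4.0, 5.0, 7.0, 0.5))  # -> (4.5, 6.0)
```

At α = 1 the interval collapses to the crisp mode, and at α = 0 it spans the full support, which is how a crisp DEA model is recovered as a special case.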

Relevância: 30.00%

Resumo:

Emrouznejad et al. (2010) proposed a Semi-Oriented Radial Measure (SORM) model for assessing the efficiency of Decision Making Units (DMUs) by Data Envelopment Analysis (DEA) with negative data. This paper provides a necessary and sufficient condition for boundedness of the input- and output-oriented SORM models.
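The SORM idea for handling negative data is to split each variable that takes both signs into two non-negative parts and treat them semi-orientedly in the radial model. A sketch of that decomposition, under the usual x = x⁺ − x⁻ convention (the function name is illustrative):

```python
def sorm_split(values):
    """Decompose a semi-oriented variable x into x = x_pos - x_neg,
    with x_pos = max(x, 0) and x_neg = max(-x, 0), so that both
    parts are non-negative and can enter a radial DEA model."""
    pos = [max(0.0, v) for v in values]
    neg = [max(0.0, -v) for v in values]
    return pos, neg

profits = [12.0, -3.5, 0.0, 7.2]
print(sorm_split(profits))  # -> ([12.0, 0.0, 0.0, 7.2], [0.0, 3.5, 0.0, 0.0])
```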

Relevância: 30.00%

Resumo:

Guest editorial

Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His research interests include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and has been Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Sector Management. He serves on the editorial boards of several international journals and is co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.
Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector

This special issue focuses on holistic, applied research on performance measurement in energy sector management, publishing relevant applied work that bridges the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate changes in the relative efficiency and total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate over the period studied. Ioannis E. Tsolas applies DEA to assess the performance of Greek fossil-fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, bootstrapping is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimates and confidence intervals. The author finds from the sample that the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare three DEA-based measures that estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors find that refineries in the Rocky Mountain region performed best and that about 60 percent of the oil refineries in the sample could improve their efficiency further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA) to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first is a traditional DEA model for analyzing cost-only efficiency. The second includes (inverse) quality by modelling total customer minutes lost as an input. The third is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach to evaluating energy and climate policy.
The proposed approach is akin to the analytic hierarchy process, consisting of pairwise comparisons, consistency verification, and criteria prioritization. Stakeholders and experts in the energy policy field are incorporated in the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered, classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic impact, and competitiveness and technology. Finally, Borge Hess applies stochastic frontier analysis to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on firm efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in a firm's efficiency following an acquisition and only weak evidence of efficiency improvements brought by the new shareholder. The author also finds that parent companies do not appear to influence a subsidiary's efficiency positively, and the analysis shows a negative impact of joint ventures on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.

Relevância: 30.00%

Resumo:

Latent topics derived by topic models such as Latent Dirichlet Allocation (LDA) are the result of hidden thematic structures which provide further insights into the data. The automatic labelling of such topics derived from social media, however, poses new challenges, since topics may characterise novel events happening in the real world. Existing automatic topic labelling approaches which depend on external knowledge sources become less applicable here, since relevant articles or concepts for the extracted topics may not exist in external sources. In this paper we propose to address the problem of automatically labelling latent topics learned from Twitter as a summarisation problem. We introduce a framework which applies summarisation algorithms to generate topic labels. These algorithms are independent of external sources and rely only on the identification of dominant terms in documents related to the latent topic. We compare the performance of existing state-of-the-art summarisation algorithms. Our results suggest that summarisation algorithms generate better topic labels, capturing event-related context, than the top-n terms returned by LDA. © 2014 Association for Computational Linguistics.
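The dominant-term idea can be illustrated with a much-simplified sketch: build a label from the most frequent non-stopword terms in the documents associated with a topic. This is only a frequency baseline, not the full summarisation algorithms the paper evaluates; the stopword list and example tweets are invented:

```python
from collections import Counter

STOPWORDS = {"the", "a", "of", "in", "to", "and", "is", "on"}

def dominant_term_label(documents, n_terms=3):
    """Label a latent topic by the most frequent non-stopword terms
    across the documents associated with that topic."""
    counts = Counter()
    for doc in documents:
        counts.update(w for w in doc.lower().split() if w not in STOPWORDS)
    return " ".join(term for term, _ in counts.most_common(n_terms))

tweets = ["earthquake hits the city centre",
          "city rescue teams respond to earthquake",
          "earthquake damage in city centre"]
print(dominant_term_label(tweets))  # -> "earthquake city centre"
```

Unlike knowledge-based labelling, nothing here depends on an external source, which is the property the paper needs for novel Twitter events.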

Relevância: 30.00%

Resumo:

The notion of a model of development and distribution of software (MDDS) is introduced, and its role in the efficiency of software products is stressed. Two classical MDDSs are presented, and some attempts to adapt them to contemporary trends in web-based software design are described. Advantages and shortcomings of the resulting models are outlined. In conclusion, the desired features of a better MDDS for web-based solutions are given.

Relevância: 30.00%

Resumo:

Data envelopment analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). For large-scale data sets, especially those with negative measures, DEA inevitably demands substantial computer resources in terms of memory and CPU time. In recent years, a wide range of studies has been conducted on combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, in contrast to the corresponding DEA method. Results indicate that the proposed network has computational advantages over the corresponding DEA models; it can therefore be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.
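A supervised feed-forward network of the kind described maps a DMU's input/output measures to a score learned from DEA results, avoiding one linear program per DMU. A minimal forward pass with a single hidden layer; the weights below are illustrative placeholders, not a trained model:

```python
import math

def forward(features, w_hidden, b_hidden, w_out, b_out):
    """One forward pass of a single-hidden-layer network:
    tanh hidden units, sigmoid output squashed into (0, 1) so it
    can be read as an efficiency score."""
    hidden = [math.tanh(sum(w * x for w, x in zip(ws, features)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    z = sum(w * h for w, h in zip(w_out, hidden)) + b_out
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative weights; a real model would be trained on DEA scores.
score = forward([0.4, -1.2],
                w_hidden=[[0.5, -0.3], [0.8, 0.1]],
                b_hidden=[0.0, 0.1],
                w_out=[1.2, -0.7], b_out=0.05)
print(round(score, 3))
```

Note that tanh hidden units accept negative feature values directly, which is one reason a network is convenient for the negative-data setting the abstract describes.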

Relevância: 30.00%

Resumo:

In non-linear random effects models, some attention has recently been devoted to the analysis of suitable transformations of the response variable, either separately from (Taylor 1996) or together with (Oberg and Davidian 2000) transformations of the covariates, and, as far as we know, no investigation has been carried out on the choice of link function in such models. In our study we consider the use of a random effect model in which a parameterized family of links (Aranda-Ordaz 1981, Prentice 1996, Pregibon 1980, Stukel 1988 and Czado 1997) is introduced. We point out the advantages and drawbacks associated with this data-driven kind of modeling. Difficulties in the interpretation of regression parameters, and therefore in understanding the influence of covariates, as well as problems related to loss of efficiency of estimates and overfitting, are discussed. A case study on radiotherapy usage in breast cancer treatment is presented.
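For reference, one member of the parameterized link families cited above is the asymmetric Aranda-Ordaz (1981) family, which nests two standard links in a single shape parameter:

```latex
g(\mu;\lambda) \;=\; \log\!\left(\frac{(1-\mu)^{-\lambda}-1}{\lambda}\right),
\qquad 0 < \mu < 1,\; \lambda > 0,
```

which reduces to the logit link at \(\lambda = 1\) and to the complementary log-log link as \(\lambda \to 0\); estimating \(\lambda\) from the data is what makes the modeling "data-driven" in the sense discussed above.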

Relevância: 30.00%

Resumo:

Incorporating the Material Balance Principle (MBP) in industrial and agricultural performance measurement systems with pollutant factors has been on the rise in recent years, and many conventional methods of performance measurement have proven incompatible with material flow conditions. This study addresses pollution-adjusted eco-efficiency measurement, taking into account material flow conditions and the MBP requirements, in order to provide 'real' measures of performance that can serve as guides when making policies. We develop a new approach that integrates a slacks-based measure into the Malmquist-Luenberger index, enhanced by a material balance condition reflecting the conservation of matter. This model is compared with a similar model that incorporates the MBP through the trade-off approach, to measure the productivity and eco-efficiency trends of power plants. Results reveal similar findings for both models, substantiating the robustness and applicability of the model proposed in this paper.
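A material balance condition of the kind invoked above is usually written as a mass identity per production unit; in generic notation (the symbols here are illustrative, not the paper's):

```latex
\sum_{i} a_i x_i \;-\; \sum_{r} b_r y_r \;=\; z,
```

where \(x_i\) are material-bearing inputs with material (e.g. carbon) contents \(a_i\), \(y_r\) are desirable outputs with contents \(b_r\), and \(z\) is the pollutant generated. Conservation of matter then requires that projected points on the efficient frontier satisfy the same identity, which is the constraint that many conventional efficiency models violate.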

Relevância: 30.00%

Resumo:

Performance analysis has become a vital part of management practice in the banking industry. There are numerous applications of DEA models to estimate efficiency in banking, and most of them assume that inputs and outputs are known with absolute precision. Here, we propose new Fuzzy-DEA α-level models to assess the underlying uncertainty, and bootstrap truncated regressions with fixed factors are used to measure the impact of each model on the efficiency scores and to identify the contextual variables most relevant to efficiency. The proposed models are demonstrated in an application to Mozambican banks. Findings reveal that fuzziness predominates over randomness in interpreting the results; fuzziness can also be used by decision-makers to identify missing variables that help in interpreting the results. Price of labor, price of capital, and market share were found to be significant factors in measuring bank efficiency. Managerial implications are addressed.

Relevância: 30.00%

Resumo:

One of the main objectives in restructuring the power industry is to enhance the efficiency of power facilities. The power generation industry, which plays a key role in the sector, nevertheless accounts for a noticeable share of emissions among all emission-generating sectors. In this study, we develop new data envelopment analysis models to identify efficient power plants on the basis of lower fuel consumption, combustion of less-polluting fuel types, and incorporation of emission factors, in order to measure the eco-efficiency trend. We then apply these models to measure eco-efficiency over an eight-year period of power industry restructuring in Iran. Results reveal a significant improvement in the eco-efficiency, cost efficiency and allocative efficiency of the power plants during the restructuring period. It is also shown that, although the hydro power plants look eco-efficient, the combined-cycle plants have been more allocatively efficient than the other power generation technologies used in Iran.

Relevância: 30.00%

Resumo:

This study presents a categorization of the different structural models of public-private partnership (PPP) projects, with a brief systematic overview of each type. The typologies are useful for weighing the available options when decisions on PPP project structures are made. The models can be categorized from several perspectives. Based on the key purpose of the partnership, efficiency-, quality- and financing-oriented models are the most widespread; from a risk-sharing point of view, BOT, DBFO and concession models are most typical; and regarding the regulation of returns, price-cap, 'open book', fixed-price and shadow-pricing models are most common. Based on an analytical assessment of these, the study concludes that actual projects are in most cases a combination of the theoretical types, as required by the conditions of the given case. In practice it is therefore more appropriate to approach PPP as a constantly evolving concept that adjusts to local needs, rather than through fixed typologies. In connection with the return-regulation typology, an appendix of the study summarizes possible methods for estimating fair returns, a critical issue in PPP projects.

Relevância: 30.00%

Resumo:

Generalizing from his experience in solving practical problems, Koopmans set about devising the linear activity analysis model. To his surprise, he found that the economics of his day possessed no uniform, sufficiently exact theory of production or system of concepts for it. In his pioneering paper he therefore also laid down, as a theoretical framework for the linear activity analysis model, the foundations of an axiomatic production theory resting on the concept of technological sets. He is credited with the exact definition of the concepts of production efficiency and efficiency prices, and with proving their mutually conditioning relationship within the linear activity analysis model. Koopmans treated the present, purely technical definition of efficiency only as a special case; his aim was to introduce and analyse the concept of economic efficiency. This paper uses the duality theorems of linear programming to reconstruct his results on the latter. It is shown, first, that his proofs are equivalent to proving the duality theorems of linear programming and, secondly, that the economic efficiency prices are in fact shadow prices in today's sense. It is also pointed out that the model he formulated to interpret economic efficiency can be regarded as a direct predecessor of the Arrow-Debreu-McKenzie models of general equilibrium theory, containing almost every essential element and concept of them: the equilibrium prices are nothing other than Koopmans' efficiency prices. Finally, Koopmans' model is reinterpreted as a possible tool for the microeconomic description of enterprise technology. Journal of Economic Literature (JEL) codes: B23, B41, C61, D20, D50.
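The duality connection invoked above can be stated compactly. For the standard primal-dual pair of linear programs

```latex
\max_{x}\; c^{\top}x \quad \text{s.t.} \quad Ax \le b,\; x \ge 0,
\qquad\qquad
\min_{y}\; b^{\top}y \quad \text{s.t.} \quad A^{\top}y \ge c,\; y \ge 0,
```

strong duality gives \(c^{\top}x^{*} = b^{\top}y^{*}\) at the optima, and the optimal dual variables \(y^{*}\) are the shadow prices of the primal resource constraints. This is the sense, noted in the abstract, in which Koopmans' efficiency prices coincide with shadow prices.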

Relevância: 30.00%

Resumo:

The article analyses the technical efficiency of Hungarian crop farms between 2001 and 2009, using panel data and employing both standard stochastic frontier analysis (SFA) and the latent class model (LCM), which accounts for technological differences, to estimate technical efficiency. The findings suggest that technological heterogeneity can be important even in a sector such as arable crop production, where a relatively homogeneous technology is employed. A comparison of standard SFA models, which assume a technology common to all farms, with LCM estimates shows that traditional SFA models may underestimate the technical efficiency of crop farms.
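In the standard SFA specification underlying both approaches, farm output is explained by a frontier plus a composed error:

```latex
\ln y_{it} \;=\; f(x_{it};\beta) \;+\; v_{it} \;-\; u_{it},
\qquad v_{it} \sim N(0,\sigma_v^2),\quad u_{it} \ge 0,
```

where \(v_{it}\) is statistical noise and \(u_{it}\) is the one-sided inefficiency term. The latent class model additionally lets \(\beta\) differ across unobserved technology classes, with class membership probabilities estimated from the data, which is how it captures the technological heterogeneity that a common-frontier SFA model forces into the inefficiency term.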

Relevância: 30.00%

Resumo:

The main objective of physics-based modeling of power converter components is to design the whole converter with respect to physical and operational constraints. Therefore, all the elements and components of the energy conversion system are modeled numerically and combined to form a behavioral model of the whole system. Previously proposed high-frequency (HF) models of power converters are based on circuit models that reflect only the parasitic inner parameters of the power devices and the connections between the components. This dissertation aims to obtain physics-based models for power conversion systems that not only represent the steady-state behavior of the components but can also predict their high-frequency characteristics. The developed physics-based model represents the physical device with a high level of accuracy in predicting its operating condition, enabling the accurate development of components such as effective EMI filters, switching algorithms and circuit topologies [7]. One application of the developed modeling technique is the design of new topologies for high-frequency, high-efficiency converters for variable speed drives. The main advantage of the modeling method presented in this dissertation is the practical design of an inverter for high-power applications able to overcome the blocking voltage limitations of available power semiconductor devices. Another advantage is the selection of the best matching topology, with an inherent reduction of switching losses that can be exploited to improve overall efficiency. The physics-based modeling approach in this dissertation makes it possible to design any power electronic conversion system to meet electromagnetic standards and design constraints.
This includes physical characteristics such as reduced package size and weight, optimized interactions with neighboring components, and higher power density. In addition, electromagnetic behaviors and signatures can be evaluated, including the study of conducted and radiated EMI interactions as well as the design of attenuation measures and enclosures.

Relevância: 30.00%

Resumo:

Despite the importance of mangrove ecosystems in the global carbon budget, the relationships between environmental drivers and carbon dynamics in these forests remain poorly understood. This limited understanding is partly a result of the challenges associated with in situ flux studies. Tower-based CO2 eddy covariance (EC) systems are installed in only a few mangrove forests worldwide, and the longest EC record, from the Florida Everglades, contains less than 9 years of observations. A primary goal of the present study was to develop a methodology to estimate canopy-scale photosynthetic light use efficiency in this forest. These tower-based observations provide a basis for associating CO2 fluxes with canopy light use properties, and thus a means of utilizing satellite-based reflectance data for larger-scale investigations. We present a model of mangrove canopy light use efficiency, utilizing the enhanced vegetation index (EVI) derived from the Moderate Resolution Imaging Spectroradiometer (MODIS), that is capable of predicting changes in mangrove forest CO2 fluxes caused by a hurricane disturbance and by changes in regional environmental conditions, including temperature and salinity. Model parameters are solved for in a Bayesian framework. The model structure requires estimates of ecosystem respiration (RE), and we present the first tower-based estimates of mangrove forest RE derived from nighttime CO2 fluxes. Our investigation is also the first to show the effect of salinity on mangrove forest CO2 uptake, which declines by 5% for each 10 parts per thousand (ppt) increase in salinity. Light use efficiency in this forest declines with increasing daily photosynthetically active radiation, an important departure from the assumption of constant light use efficiency typically applied in satellite-driven models.
The model developed here provides a framework for estimating CO2 uptake by these forests from reflectance data and information about environmental conditions.
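The reported salinity response can be expressed as a multiplicative scalar on CO2 uptake. A sketch assuming the 5% per 10 ppt decline applies linearly above a reference salinity; the function, the zero reference, and the floor at zero are illustrative assumptions, not the paper's fitted model:

```python
def salinity_scalar(salinity_ppt, reference_ppt=0.0, decline_per_10ppt=0.05):
    """Fractional down-scaling of CO2 uptake: a 5% decline for each
    10 ppt of salinity above the reference, floored at zero."""
    excess = max(salinity_ppt - reference_ppt, 0.0)
    return max(1.0 - decline_per_10ppt * excess / 10.0, 0.0)

# Gross uptake at 30 ppt relative to the fresh-water reference:
print(salinity_scalar(30.0))  # -> 0.85
```

In a light-use-efficiency framework this scalar would multiply the GPP term (GPP = LUE x APAR), alongside the EVI- and PAR-dependent terms described above.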