922 results for Data Envelopment Analysis (DEA), scale efficiency, technical efficiency
Abstract:
This special issue of the Journal of the Operational Research Society is dedicated to papers on the related subjects of knowledge management and intellectual capital. These subjects continue to generate considerable interest amongst both practitioners and academics. This issue demonstrates that operational researchers have many contributions to offer to the area, especially by bringing multi-disciplinary, integrated and holistic perspectives. The papers included are both theoretical and practical, and include a number of case studies showing how knowledge management has been implemented in practice, which may assist other organisations in their search for better means of managing what is now recognised as a core organisational activity. A growing number of organisations have accepted that the careful handling of information and knowledge is a significant factor in their success, but that implementing a strategy and processes for this handling remains a challenge. It is here, in the particular area of knowledge process handling, that the contributions of operational researchers can be seen most clearly, as the papers in this edition illustrate. The issue comprises nine papers, contributed by authors based in eight different countries on five continents. Lind and Seigerroth describe an approach that they call team-based reconstruction, intended to help articulate knowledge in a particular organisational context. They illustrate the use of this approach with three case studies, two in manufacturing and one in public sector health care. Different ways of carrying out reconstruction are analysed, and the benefits of team-based reconstruction are established. Edwards and Kidd, and Connell, Powell and Klein both concentrate on knowledge transfer.
Edwards and Kidd discuss the issues involved in transferring knowledge across borders of various kinds, from those within organisations to those between countries. They present two examples, one in distribution and the other in manufacturing. They conclude that trust and culture both play an important part in facilitating such transfers, that IT should be kept in a supporting role in knowledge management projects, and that a staged approach to this IT support may be the most effective. Connell, Powell and Klein consider the oft-quoted distinction between explicit and tacit knowledge, and argue that such a distinction is sometimes unhelpful. They suggest that knowledge should rather be regarded as a holistic systemic property. The consequences of this for knowledge transfer are examined, with a particular emphasis on what this might mean for the practice of OR. Their view of OR in the context of knowledge management very much echoes Lind and Seigerroth's focus on knowledge for human action. This is an interesting convergence of views given that, broadly speaking, one set of authors comes from within the OR community and the other from outside it. Hafeez and Abdelmeguid present the nearest to a 'hard' OR contribution of the papers in this special issue. In their paper they construct and use system dynamics models to investigate alternative ways in which an organisation might close a knowledge gap or skills gap. The methods they use have the potential to be generalised to other quantifiable aspects of intellectual capital. The contribution by Revilla, Sarkis and Modrego is also at the 'hard' end of the spectrum. They evaluate the performance of public–private research collaborations in Spain using an approach based on data envelopment analysis. They found that larger organisations tended to perform relatively better than smaller ones, even though the approach used takes scale effects into account.
Perhaps more interesting was that many factors that might have been thought relevant, such as the organisation's existing knowledge base or how widely applicable the results of the project would be, had no significant effect on performance. It may be that how well the partnership between the collaborators works (not a factor it was possible to take into account in this study) is more important than most other factors. Mak and Ramaprasad introduce the concept of a knowledge supply network. This builds on existing ideas of supply chain management, but also integrates the design chain and the marketing chain, to address all the intellectual property connected with the network as a whole. The authors regard the knowledge supply network as the natural focus for considering knowledge management issues. They propose seven criteria for evaluating knowledge supply network architecture, and illustrate their argument with an example from the electronics industry: integrated circuit design and fabrication. Hasan and Crawford's interest lies in the holistic approach to knowledge management. They demonstrate their argument—that there is no simple IT solution for organisational knowledge management efforts—through two case study investigations. These case studies, in Australian universities, are investigated through cultural-historical activity theory, which focuses the study on the activities carried out by people in support of their interpretations of their role, the opportunities available and the organisation's purpose. Human activities, it is argued, are mediated by the available tools, including IT and IS and, in this particular context, KMS. It is this argument that places the available technology within the knowledge activity process and permits the future design of KMS to be improved through the lessons learnt by studying these knowledge activity systems in practice.
Wijnhoven concentrates on knowledge management at the operational level of the organisation. He is concerned with studying the transformation of certain inputs to outputs—the operations function—and the consequent realisation of organisational goals via the management of these operations. He argues that the inputs and outputs of this process in the context of knowledge management are different types of knowledge, names the operation method 'knowledge logistics', and calls the method of transformation learning. This theoretical paper discusses the operational management of four types of knowledge objects—explicit understanding, information, skills, and norms and values—and shows how, through the proposed framework, learning can transfer these objects to clients in a logistical process without a major transformation in content. Millie Kwan continues this theme with a paper about process-oriented knowledge management. In her case study she discusses an implementation of knowledge management where the knowledge is centred around an organisational process, and the mission, rationale and objectives of the process define the scope of the project. Her case concerns the effective use of real estate (property and buildings) within a Fortune 100 company. In order to manage the knowledge about this property, and the process by which the best 'deal' for internal customers and the overall company was reached, a KMS was devised. She argues that process knowledge is a source of core competence and thus needs to be strategically managed. Finally, you may also wish to read a related paper originally submitted for this Special Issue, 'Customer knowledge management' by Garcia-Murillo and Annabi, which was published in the August 2002 issue of the Journal of the Operational Research Society, 53(8), 875–884.
Abstract:
Several indices of plant capacity utilization based on the concept of a best-practice frontier have been proposed in the literature (Färe et al., 1992; De Borger and Kerstens, 1998). This paper suggests an alternative measure of capacity utilization change based on the generalized Malmquist index proposed by Grifell-Tatjé and Lovell (1998). The advantage of this specification is that it allows productivity growth to be measured without regard to the nature of scale economies. The index is then used to measure the capacity change of a panel of Italian firms over the period 1989–94 using Data Envelopment Analysis, and its ability to explain short-run movements of output is assessed.
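The DEA machinery behind such frontier-based capacity and efficiency measures can be illustrated with a minimal input-oriented, constant-returns-to-scale (CCR) model, solved as one small linear program per decision-making unit. This is a generic sketch with toy data, not the paper's generalized Malmquist specification; the function name and data are illustrative.

```python
# Input-oriented CCR (constant returns to scale) DEA efficiency scores,
# one linear program per decision-making unit (DMU), solved with scipy.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns scores in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision vector z = [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]                       # minimize theta
        # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # outputs: -sum_j lambda_j * y_rj <= -y_ro  (i.e. at least y_ro)
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.fun)
    return np.array(scores)

# toy data: single input, single output
X = np.array([[2.0], [4.0], [3.0]])
Y = np.array([[2.0], [2.0], [3.0]])
print(dea_ccr_input(X, Y))  # DMUs 0 and 2 are efficient (score 1.0), DMU 1 is 0.5
```

A Malmquist-type index would then compare such scores for the same firm against the frontiers of two adjacent periods.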
Abstract:
The paper reports preliminary results of ongoing research aimed at developing an automatic procedure for recognising the discourse-compositional structure of scientific and technical texts, which is required in many NLP applications. The procedure exploits as discourse markers various domain-independent words and expressions that are specific to scientific and technical texts and organize scientific discourse. The paper discusses features of scientific discourse and the common scientific lexicon comprising such words and expressions. Methodological issues in developing a computer dictionary of the common scientific lexicon are considered, and the basic principles of its organization are described. The main steps of the discourse-analysis procedure, based on the dictionary and surface syntactic analysis, are outlined.
* The research is supported partly by the INTAS 04-77-7173 project, http://www.intas.be
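The dictionary-driven marker-matching step described above can be sketched as a simple lexicon lookup over sentences; the marker inventory and discourse-move labels here are invented for illustration and are not the authors' actual dictionary.

```python
# Toy sketch of dictionary-based discourse-marker tagging: each sentence is
# assigned a discourse move when it contains a marker from a small lexicon.
# Markers and move labels are illustrative, not the authors' dictionary.
MARKERS = {
    "in this paper": "GOAL",
    "we propose": "CONTRIBUTION",
    "in contrast": "COMPARISON",
    "as a result": "RESULT",
}

def tag_sentences(sentences):
    tagged = []
    for sent in sentences:
        low = sent.lower()
        # first matching marker decides the move; unmatched sentences get OTHER
        move = next((m for k, m in MARKERS.items() if k in low), "OTHER")
        tagged.append((sent, move))
    return tagged

text = ["In this paper we study discourse structure.",
        "As a result, segmentation accuracy improves."]
print(tag_sentences(text))
```

A real procedure would combine such lexicon hits with surface syntactic analysis rather than bare substring matching.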
Abstract:
In the absence of a precise and unique measure of efficiency for hockey players, this study aims to evaluate the efficiency of players in the National Hockey League (NHL) and to show how it can affect the decision to buy out a player's contract. To do so, individual statistics of NHL players for the 2007-2008 to 2010-2011 seasons are used. Efficiency is estimated using data envelopment analysis (DEA) with bootstrap. Inputs include salary and minutes played, while outputs include each player's defensive and offensive contributions. A logistic regression is used to estimate the association between individual efficiency and the probability of a contract buyout. Analysis of the data shows that, across 3,159 observations, mean efficiency is 0.635. Mean efficiency is similar across all positions and seasons. A strong positive association is found between a team's points in the overall standings and the mean efficiency of its players (correlation coefficient = 0.43, p-value < 0.01). Players with higher efficiency have a lower probability of having their contract bought out (odds ratio = 0.01, p-value < 0.01). The study therefore concludes that most NHL players exhibit a non-negligible degree of inefficiency, that higher efficiency is associated with better team-level performance, and that efficient players are less likely to have their contracts bought out.
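The second-stage analysis described here, a logistic regression of buyout probability on a player's DEA efficiency score, can be sketched as follows. The data are synthetic and fitted by plain gradient descent; the coefficients and sample construction are illustrative, not the study's actual dataset or estimates.

```python
# Sketch of a second-stage logistic regression: buyout probability as a
# function of DEA efficiency, on synthetic data (numbers are illustrative).
import numpy as np

rng = np.random.default_rng(0)
n = 3159
efficiency = rng.uniform(0.2, 1.0, n)              # DEA scores in (0, 1]
# generate buyouts so that lower efficiency means higher buyout probability
true_logit = 2.0 - 6.0 * efficiency
buyout = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# fit logit(P(buyout)) = b0 + b1 * efficiency by gradient descent on log-loss
b0, b1 = 0.0, 0.0
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * efficiency)))
    b0 -= 0.1 * np.mean(p - buyout)
    b1 -= 0.1 * np.mean((p - buyout) * efficiency)

odds_ratio = np.exp(b1)   # < 1 means efficient players are bought out less often
print(round(odds_ratio, 3))
```

An odds ratio well below 1, as in the study, indicates that each unit increase in efficiency multiplies the buyout odds by that factor.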
Abstract:
Master's dissertation—Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2015.
Abstract:
Master's dissertation—Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Matemática, Programa de Mestrado Profissional em Matemática em Rede Nacional, 2016.
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, visualization and the resource management of such services are becoming increasingly important to deliver user-desired Quality of Service. First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source online indexing and querying system for big geospatial data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running upon the TerraFly map that can efficiently support many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also enables the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand.
v-TerraFly is a set of techniques to predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly can predict workload demands 18.91% more accurately and allocate resources efficiently to meet the QoS target, improving QoS by 26.19% and saving 20.83% in resource usage compared to traditional peak-load-based resource allocation.
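The semantics of a Top-k Spatial Boolean Query, as supported by sksOpen, can be illustrated with a naive linear scan: return the k nearest points whose keyword sets satisfy a Boolean (here, conjunctive) condition. This is a minimal sketch of the query semantics only, not sksOpen's actual index structure; all names and data are illustrative.

```python
# Naive top-k spatial Boolean query: the k nearest points that contain
# all required keywords. Linear scan for clarity, not an actual index.
import math

def topk_spatial_boolean(points, query_xy, required, k):
    """points: list of (x, y, set_of_keywords). Returns k nearest matches."""
    qx, qy = query_xy
    matches = [(math.hypot(x - qx, y - qy), (x, y)) for x, y, kw in points
               if required <= kw]                 # Boolean AND over keywords
    return [p for _, p in sorted(matches)[:k]]    # sort by distance, keep k

pois = [(0, 0, {"cafe", "wifi"}), (1, 1, {"cafe"}), (2, 0, {"cafe", "wifi"})]
print(topk_spatial_boolean(pois, (0, 0), {"cafe", "wifi"}, k=1))  # [(0, 0)]
```

A production engine replaces the scan with a spatial index (e.g. an R-tree variant) augmented with keyword signatures so that distance ordering and Boolean filtering are evaluated together.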
Abstract:
Background: To identify those characteristics of self-management interventions in patients with heart failure (HF) that are effective in influencing health-related quality of life, mortality, and hospitalizations. Methods and Results: Randomized trials on self-management interventions conducted between January 1985 and June 2013 were identified, and individual patient data were requested for meta-analysis. Generalized mixed-effects models and Cox proportional hazards models including frailty terms were used to assess the relation between characteristics of interventions and health-related outcomes. Twenty randomized trials (5624 patients) were included. Longer intervention duration reduced mortality risk (hazard ratio 0.99, 95% confidence interval [CI] 0.97–0.999 per month increase in duration), risk of HF-related hospitalization (hazard ratio 0.98, 95% CI 0.96–0.99), and HF-related hospitalization at 6 months (risk ratio 0.96, 95% CI 0.92–0.995). Although results were not consistent across outcomes, interventions comprising standardized training of interventionists, peer contact, log keeping, or goal-setting skills appeared less effective than interventions without these characteristics. Conclusion: No specific program characteristics were consistently associated with better effects of self-management interventions, but longer duration seemed to improve the effect of self-management interventions on several outcomes. Future research using factorial trial designs and process evaluations is needed to understand the working mechanism of specific program characteristics of self-management interventions in HF patients.
Abstract:
The Nature-Based Solutions (NBS) concept and approach were developed to face challenges such as risk mitigation and biodiversity conservation and restoration simultaneously. NBSs have been endorsed by major international organizations such as the EU, the FAO and the World Bank, which are pushing to enable a mainstreaming process. However, a shift from traditional engineering "grey" solutions to wider and standard adoption of NBS encounters technical, social, cultural, and normative barriers, which were identified through a qualitative content analysis of policy documents, reports and expert interviews. The case of the Emilia-Romagna region was studied by developing an analytical framework that brought together the social-ecological context, the governance system and the characteristics of specific NBSs.
Abstract:
Dissertation for the degree of Master in Environmental Engineering, Ecological Engineering profile.
Abstract:
In a series of papers (Tang, Chin and Rao, 2008; Tang, Petrie and Rao, 2006 and 2007), we have tried to improve on a mortality-based health status indicator, namely age-at-death (AAD), and its associated health inequality indicators that measure the distribution of AAD. The main contribution of these papers is to propose a frontier method to separate avoidable and unavoidable mortality risks. This has facilitated the development of a new indicator of health status, namely the Realization of Potential Life Years (RePLY). The RePLY measure is based on the concept of a "frontier country" that, by construction, has the lowest mortality risk for each age-sex group amongst all countries. The mortality rates of the frontier country are used as a proxy for the unavoidable mortality rates, and the residual between the observed mortality rates and the unavoidable mortality rates is considered the avoidable mortality rate. In this approach, however, countries at different levels of development are benchmarked against the same frontier country without considering their heterogeneity. The main objective of the current paper is to control for national resources in estimating (conditional) unavoidable and avoidable mortality risks for individual countries. This allows us to construct a new indicator of health status, the Realization of Conditional Potential Life Years (RCPLY). The paper presents empirical results from a dataset of life tables for 167 countries for the year 2000, compiled and updated by the World Health Organization. Measures of national average health status and health inequality based on RePLY and RCPLY are presented and compared.
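The frontier-country construction described above is simple to state numerically: for each age-sex group, take the minimum mortality rate across countries as the unavoidable component, and treat the excess over that minimum as avoidable. The sketch below uses made-up rates purely to show the mechanics.

```python
# Sketch of the RePLY frontier idea: the "frontier country" takes, for each
# age-sex group, the minimum observed mortality rate across countries; the
# excess over the frontier is treated as avoidable. Data are illustrative.
import numpy as np

# rows: countries, columns: age-sex groups (mortality rates)
mortality = np.array([[0.010, 0.020, 0.050],
                      [0.008, 0.030, 0.045],
                      [0.012, 0.018, 0.060]])

frontier = mortality.min(axis=0)     # proxy for unavoidable rates per group
avoidable = mortality - frontier     # residual = avoidable rates per country
print(frontier)                      # [0.008 0.018 0.045]
print(avoidable)
```

The RCPLY extension conditions this benchmark on national resources, so each country is compared against a frontier appropriate to its level of development rather than a single global one.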
Abstract:
The empirical analysis in this study seeks to evaluate the outcomes attributable to Spanish health expenditure and, as a consequence, to calibrate the different orientations for the future growth of public health spending. In view of the potential impacts of marginal variations of spending over current resource levels, the following objectives are assessed: a) efficiency (performance) improvements of the health system as a whole; b) better attainment of equity levels (both in financial contributions and in access to services); c) an increase in the responsiveness of the supply side and of the system's care network to perceived needs (responsiveness index). The aim is thereby to approximate how many resources would be required to close the gap between the spending levels observed in Spain and the best practices observed in the estimated sample. This is equivalent to quantifying the additional health expenditure needed per marginal point of gain in the value of the observed indicators, according to the values estimated in the empirical analysis. To define these potential outcomes, the non-parametric approach of Data Envelopment Analysis (DEA) is particularly appropriate. The objective is to evaluate the distances of the Spanish values from the benchmark derived from the estimation, and thereby to quantify the optimal cost of closing the gap (between the multi-dimensional outputs considered and the various resources made available to the health system).