811 results for Benchmarking Tool
Abstract:
Today's economic environment unfolds within the framework of globalization, which has driven greater internationalization and competitiveness in every market in the world. This research analyzes the best practices of the world's leading business schools. Most of these hold at least one of the three most important international accreditations in the field of management, which provide stronger positioning in the higher-education market. Applying the Benchmarking tool will therefore yield a better understanding of the variables and best practices that have led these universities to succeed in obtaining such accreditations, and will also allow the Facultad de Administración of the Universidad del Rosario to understand its own strengths and weaknesses and set out on a path of continuous improvement in pursuit of these international accreditations and of stronger market positioning.
Abstract:
Purpose – The purpose of this paper is to analyze the way in which the knowledge competitiveness of regions is measured and to introduce the World Knowledge Competitiveness Index (WKCI) benchmarking tool. Design/methodology/approach – The methodology consists of an econometric analysis of key indicators relating to the concept of knowledge competitiveness for 125 regions from across the globe: 55 from North America, 45 from Europe and 25 from Asia and Oceania. Findings – The key to winning the super-competitive race in the knowledge-based economy is investment in the future: research and development, and education and training. The majority of the high-performing regional economies in the USA are found to have a knowledge-competitive edge over their counterparts in Europe and Asia. Research limitations/implications – To an extent, the research is limited by the availability of comparable indicators and metrics at the regional level that extend across the globe. Whilst comparative data are often accessible at the national level, regional data sources remain underdeveloped. Practical implications – The WKCI has become internationally recognized as an important instrument for economic development policymakers and regional investment promotion agents as they create and refine their strategies and targets. In particular, it provides a benchmark that allows regions to compare their knowledge competitiveness with other regions from around the world, not only within their own nation or continent. Originality/value – The WKCI is the first composite and relative measure of the knowledge competitiveness of the globe's best-performing regions.
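The abstract does not detail how a composite index such as the WKCI is assembled, but a common construction is to min-max normalize each indicator across regions and then take a weighted average. The sketch below illustrates that generic recipe only; the indicator names, weights and figures are invented assumptions, not the WKCI's actual components.

```python
# Illustrative sketch of a composite competitiveness index: min-max
# normalize each indicator across regions, then take a weighted average.
# Indicator names, weights and values are hypothetical, not the WKCI's.

def normalize(values):
    """Scale raw indicator values to [0, 1] via min-max normalization."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite_index(regions, indicators, weights):
    """regions: {name: {indicator: raw value}} -> {name: composite score}."""
    names = list(regions)
    normalized = {r: {} for r in names}
    for ind in indicators:
        col = normalize([regions[r][ind] for r in names])
        for r, v in zip(names, col):
            normalized[r][ind] = v
    return {
        r: sum(weights[ind] * normalized[r][ind] for ind in indicators)
        for r in names
    }

regions = {
    "Region A": {"rd_spend": 3.2, "grads_per_1000": 45},
    "Region B": {"rd_spend": 1.1, "grads_per_1000": 30},
    "Region C": {"rd_spend": 2.0, "grads_per_1000": 60},
}
scores = composite_index(regions, ["rd_spend", "grads_per_1000"],
                         {"rd_spend": 0.6, "grads_per_1000": 0.4})
```

Because each indicator is rescaled before weighting, the resulting score is a relative measure: a region's position depends on the sample it is compared against, which is exactly why such indices are used for benchmarking rather than absolute measurement.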
Abstract:
To assess the quality of care of women with severe maternal morbidity and to identify associated factors. This is a national multicenter cross-sectional study performing surveillance for severe maternal morbidity, using the World Health Organization criteria. The expected number of maternal deaths was calculated with the maternal severity index (MSI), based on the severity of complications, and the standardized mortality ratio (SMR) for each center was estimated. Analyses of the adequacy of care were performed. Seventeen hospitals were classified as providing adequate care and 10 as nonadequate. Besides an almost twofold increase in the maternal mortality ratio, the main factors associated with nonadequate performance were geographic difficulty in accessing health services (P < 0.001), delays related to quality of medical care (P = 0.012), absence of blood derivatives (P = 0.013), difficulties of communication between health services (P = 0.004), and any delay during the whole process (P = 0.039). This is an example of how evaluation of the performance of health services is possible using a benchmarking tool specific to obstetrics. In this study the MSI was a useful tool for identifying differences in maternal mortality ratios and factors associated with nonadequate performance of care.
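The arithmetic behind a severity-adjusted SMR, as used with indices like the MSI, is simple: expected deaths are the sum of each case's predicted probability of death, and the SMR is observed over expected. The sketch below shows only that arithmetic; the case-level probabilities are hypothetical illustrations, not MSI outputs.

```python
# Hedged sketch of standardized mortality ratio (SMR) arithmetic as used
# with severity indices such as the MSI: expected deaths are the sum of
# case-level predicted death probabilities; SMR = observed / expected.
# The probabilities below are invented, not produced by the actual MSI.

def expected_deaths(predicted_probs):
    """Expected deaths: sum of each case's predicted probability of death."""
    return sum(predicted_probs)

def smr(observed_deaths, predicted_probs):
    """SMR for one center; a value above 1 means worse than expected."""
    expected = expected_deaths(predicted_probs)
    if expected == 0:
        raise ValueError("expected deaths is zero; SMR undefined")
    return observed_deaths / expected

# A hypothetical center with five severe-morbidity cases:
probs = [0.05, 0.10, 0.40, 0.20, 0.25]  # expected deaths ~ 1.0
center_smr = smr(2, probs)  # 2 observed vs ~1.0 expected -> SMR ~ 2.0
```

An SMR computed this way is what allows centers treating very different case mixes to be compared on a common footing, which is the benchmarking use described above.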
Abstract:
Performance indicators in the public sector have often been criticised for being inadequate and not conducive to analysing efficiency. The main objective of this study is to use data envelopment analysis (DEA) to examine the relative efficiency of Australian universities. Three performance models are developed, namely, overall performance, performance on delivery of educational services, and performance on fee-paying enrolments. The findings based on 1995 data show that the university sector was performing well on technical and scale efficiency but there was room for improving performance on fee-paying enrolments. There were also small slacks in input utilisation. More universities were operating at decreasing returns to scale, indicating a potential to downsize. DEA helps in identifying the reference sets for inefficient institutions and objectively determines productivity improvements. As such, it can be a valuable benchmarking tool for educational administrators and assist in more efficient allocation of scarce resources. In the absence of market mechanisms to price educational outputs, which renders traditional production or cost functions inappropriate, universities are particularly obliged to seek alternative efficiency analysis methods such as DEA.
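Full DEA solves a linear program per decision-making unit, but in the special case of a single input and a single output the CCR (constant returns to scale) efficiency score reduces to each unit's output/input ratio divided by the best observed ratio. The sketch below uses that simplification; the university figures are invented, not the study's 1995 data.

```python
# Minimal DEA sketch: with one input and one output, the CCR efficiency
# score is each unit's output/input ratio divided by the best ratio in
# the sample. Real multi-input/multi-output DEA solves a linear program
# per unit. The figures below are hypothetical, not the 1995 study data.

def ccr_efficiency(units):
    """units: {name: (input, output)} -> {name: efficiency in (0, 1]}."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

universities = {
    "Uni A": (100.0, 80.0),   # (resource input, teaching output) - illustrative
    "Uni B": (120.0, 120.0),
    "Uni C": (90.0, 45.0),
}
scores = ccr_efficiency(universities)
# "Uni B" attains the best ratio, scores 1.0 and defines the frontier;
# the others' scores show how far inside the frontier they operate.
```

The efficient units (score 1.0) form the reference set mentioned in the abstract: an inefficient unit's score tells administrators what proportional improvement would move it onto the frontier.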
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics.
Abstract:
The aim of this work is to examine the structure of indirect maintenance costs, to clarify the selling of the maintenance concept, and to create a benchmarking tool for the continuous development of the maintenance business. In connection with offering the maintenance service concept, the study surveys maintenance customers' views on the effects of maintenance within the customer's organization, in order to obtain tools for further developing maintenance products and maintenance concepts toward better satisfying customer needs, thereby raising the quality of the service the customer buys and producing added value for the customer. Based on the study, the effect of maintenance on a company's finances and success can be stated to be considerable. When maintenance is examined only in terms of direct costs, the company's operations cannot be optimized in a way that enables the most efficient possible operation. Maintenance can thus be developed into a genuinely value-adding service once indirect costs are gradually reduced.
Abstract:
This thesis evaluates methods for obtaining high performance in applications running on the mobile Java platform. Based on the evaluated methods, an optimization was made to a Java extension API running on top of the Symbian operating system. The API provides location-based services for mobile Java applications. As part of this thesis, the JNI implementation in Symbian OS was also benchmarked. A benchmarking tool was implemented in the analysis phase in order to run an extensive set of performance tests. Based on the benchmark results, it was noted that the landmarks implementation of the API performed very slowly with large amounts of data. The existing implementation proved very inconvenient to optimize because the early implementers had not taken performance and design issues into consideration. A completely new architecture was implemented for the API in order to provide scalable landmark initialization and data extraction using lazy initialization methods. Runtime memory consumption was also an important part of the optimization. Measurements after the optimization showed the improvement to be very effective: most of the common API use cases performed extremely well compared to the old implementation. Performance is an important quality attribute of any piece of software, especially in embedded mobile devices. Typically, projects get into trouble with performance because there are no clear performance targets and no knowledge of how to achieve them. Well-known guidelines and performance models help to achieve good overall performance in Java applications and programming interfaces.
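The thesis's actual Symbian benchmarking tool is not shown in the abstract, but the general pattern such tools follow is repeated measurement with best-of-N reporting, which filters out scheduler and garbage-collection noise. A generic sketch of that pattern (the workloads being compared are illustrative, not the thesis's API calls):

```python
# Generic micro-benchmark sketch: time a callable over several repetitions
# and report the best per-call time, which suppresses transient OS noise.
# This mirrors the shape of a benchmarking tool, not the thesis's actual
# Symbian/JNI code; the compared workloads are illustrative.
import time

def benchmark(func, repetitions=5, inner_loops=1000):
    """Return the best observed wall-clock time (seconds) per call."""
    best = float("inf")
    for _ in range(repetitions):
        start = time.perf_counter()
        for _ in range(inner_loops):
            func()
        elapsed = (time.perf_counter() - start) / inner_loops
        best = min(best, elapsed)
    return best

# Example: compare two ways of building a list of squares.
t_comprehension = benchmark(lambda: [i * i for i in range(100)])
t_map = benchmark(lambda: list(map(lambda i: i * i, range(100))))
```

Reporting the best run rather than the mean is a deliberate choice for micro-benchmarks: interference can only make a run slower, so the fastest observation is the closest estimate of the code's intrinsic cost.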
Abstract:
The goal of this research is to critically analyze current theories and methods of intangible asset valuation and to develop and test a new methodology based on practical examples from the IT industry. With this goal in mind, the main research questions in this paper are: What are the advantages and disadvantages of current practices for measuring intellectual capital or valuing intangible assets? How can intellectual capital in IT be measured properly? The resulting method exhibits a new, unique approach to IC measurement and potentially an even larger field of application. Although this particular research focuses on IT (the software and Internet services cluster, to be exact), the logic behind the method is applicable in any industry, since the method is designed to be fully compliant with measurement theory and can therefore be properly scaled for any application. Building a new method is a difficult and iterative process: in the current iteration the method stands as a theoretical concept rather than a business tool; however, even the current concept fulfills its purpose as a benchmarking tool for measuring intellectual capital in the IT industry.
Abstract:
Information technologies have become an important factor in each of the processes carried out along the supply chain. Their implementation and correct use give companies advantages that favor operational performance throughout the chain. The development and application of software have contributed to integrating the different members of the chain, so that everyone from suppliers to the end customer perceives benefits in operational performance and satisfaction, respectively. On the other hand, it is important to consider that implementation does not always yield positive results; on the contrary, the implementation process can be seriously affected by barriers that prevent the benefits of ICT from being maximized.
Abstract:
The idea of Sustainable Intensification comes as a response to the challenge of avoiding the overexploitation of resources such as land, water and energy while increasing food production to meet the demand of a growing global population. Sustainable Intensification means that farmers need to simultaneously increase yields and sustainably use limited natural resources, such as water. Within the agricultural sector water has a number of uses, including irrigation, spraying, drinking for livestock and washing (vegetables, livestock buildings). To achieve Sustainable Intensification, measures are needed that inform policy makers and managers about the relative performance of farms and about possible ways to improve it. We provide a benchmarking tool to assess relative water-use efficiency at the farm level and suggest pathways to improve farm-level productivity by identifying best practices for reducing excessive use of water for irrigation. Data Envelopment Analysis techniques, including analysis of returns to scale, were used to evaluate any excess in agricultural water use across 66 horticulture farms based in different River Basin Catchments across England. We found that farms in the sample can on average reduce water requirements by 35% and achieve the same output (Gross Margin) when compared to their peers on the frontier. In addition, 47% of the farms operate under increasing returns to scale, indicating that they will need to develop economies of scale to achieve input cost savings. Regarding the adoption of specific water-use-efficiency management practices, we found that the use of a decision support tool, the recycling of water, and the installation of trickle/drip/spray-line irrigation systems have a positive impact on water-use efficiency at the farm level, whereas other irrigation systems, such as overhead irrigation, were found to have a negative effect.
Abstract:
Advanced building energy data visualization is a way to detect performance problems in commercial buildings. By placing sensors in a building that collect data such as air temperature and electrical power, the data can then be processed in data visualization software. This software generates visual diagrams so the building manager or building operator can see whether, for example, power consumption is too high. A first step (before sensors are installed in a building) to gauge a building's energy consumption can be to use a benchmarking tool. A number of benchmarking tools are available for free on the Internet. Each tool takes a slightly different approach, but they all show how a building's energy consumption compares with that of other similar buildings. In this study a new web design for the benchmarking tool CalARCH was developed. CalARCH is developed at the Berkeley Lab in Berkeley, California, USA. It uses data collected only from buildings in California and is intended only for comparing California buildings with other similar buildings in the state. Five different versions of the web site were made, and a web survey was then conducted to determine which version would be best for CalARCH. The results showed that Version 5 and Version 3 were the best, so a new version was made based on these two. This study was made at the Lawrence Berkeley Laboratory.
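At its core, a benchmarking tool of this kind places a building's energy-use intensity (EUI) within the distribution of similar buildings. A minimal sketch of that peer-group comparison follows; the peer values are made up for illustration and are not CalARCH's California dataset.

```python
# Minimal sketch of the comparison an energy benchmarking tool performs:
# the fraction of peer buildings whose energy-use intensity (EUI, e.g.
# kWh/m2/year) is at least as high as the building being assessed.
# Lower EUI is better. The peer values are invented, not CalARCH data.

def eui_percentile(building_eui, peer_euis):
    """Fraction of peers using at least as much energy per unit area."""
    worse_or_equal = sum(1 for e in peer_euis if e >= building_eui)
    return worse_or_equal / len(peer_euis)

peers = [120, 135, 150, 160, 175, 190, 210, 230]  # hypothetical peer EUIs
rank = eui_percentile(155, peers)
# 5 of the 8 peers use as much or more energy, so this building sits in
# the better half of its peer group before any sensors are installed.
```

This is why such tools are a useful first step: a building that already ranks poorly against its peers is a strong candidate for the sensor-based visualization described above.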
Abstract:
BACKGROUND Due to the implementation of the diagnosis-related groups (DRG) system, competitive pressure on German hospitals has increased. In this context it has been shown that acute pain management offers economic benefits for hospitals. The aim of this study was to analyze the impact of the competitive situation, the form of ownership and the required economic resources on structures and processes for acute pain management. MATERIAL AND METHODS A standardized questionnaire on structures and processes of acute pain management was mailed to the 885 directors of German departments of anesthesiology listed as members of the German Society of Anesthesiology and Intensive Care Medicine (DGAI, Deutsche Gesellschaft für Anästhesiologie und Intensivmedizin). RESULTS For most hospitals strong regional competition existed; however, this parameter affected neither the implementation of structures nor the recommended treatment processes for pain therapy. In contrast, privately owned hospitals showed a clear preference for using the benchmarking tool QUIPS (quality improvement in postoperative pain therapy). These hospitals also more often presented information on coping with pain in the corporate clinic mission statement and more frequently published information about the quality of acute pain management in their quality reports. No differences were found between hospitals with different forms of ownership in the implementation of acute pain services, quality circles, the expert standard for pain management or the recommended processes. Hospitals with a higher case mix index (CMI) more often had certified acute pain management. The corporate mission statements of these hospitals also more frequently contained information on coping with pain, presentation of the quality of pain management in the quality report, implementation of quality circles and implementation of the expert standard for pain management. There were no differences in the frequency of using the benchmarking tool QUIPS or in the implementation of recommended treatment processes with respect to the CMI. CONCLUSION In this survey no effect of the competitive situation of hospitals on acute pain management could be demonstrated. Private ownership and a higher CMI were more often associated with structures of acute pain management that were publicly accessible in terms of hospital marketing.
Abstract:
This sustained longitudinal study, carried out in a single local authority, investigates the implementation of a Total Quality Management (TQM) philosophy in professional local government services. At the start of this research, the large majority of what had been written about TQM was polemical and based on limited empirical evidence. This thesis seeks to provide a significant piece of work, making a considerable contribution to the current state of knowledge in this area. Teams from four professional services within a single local authority participated in this research, providing the main evidence on how the quality management agenda in a local authority can be successfully implemented. To supplement this rich source of data, various other sources and methods of data collection were used: 1) interviews with senior managers from within the authority; 2) customer focus groups and questionnaires; 3) interviews with other organisations, all of which were proponents of a TQM philosophy. A number of tools were developed to assist in gathering data: 1) the CSFs (critical success factors) benchmarking tool; 2) the Five Stages of Quality Improvement Model. A Best Practice Quality Improvement Model, arising from an analysis of the literature and the researcher's own experience, is proposed and tested. From the results a number of significant conclusions have been drawn relating to: 1) triggers for change; 2) resistance of local government professionals to change; 3) critical success factors and barriers to quality improvement in professional local government services; 4) the problems associated with participant observation and other methodological issues.
Abstract:
The article investigates the divide between member states of the European Union with respect to their level of information and communication technology (ICT) development, focusing on e-learning. Using discriminant analysis, the countries are categorized into groups based on their ICT maturity and the development level of their e-learning literacy. A comparison with a benchmarking tool, the ITU (International Telecommunication Union) ICT Development Index (IDI), partly confirms the results. The article seeks economic explanations for the re-grouping of the country rankings. Finally, the author examines the reliability of Hungary's ranking and the factors that may cause its divergence from the real picture.
Abstract:
This paper assesses the status of pre-disaster risk management in Turkey. Focusing on the period following the catastrophic August 17, 1999 earthquake, the study draws on USAID's Disaster Risk Management Benchmarking Tool (DRMBT). In line with the benchmarking tool, the paper covers key developments in the four components of pre-disaster risk management, namely risk identification, risk mitigation, risk transfer and disaster preparedness. It presents three major conclusions: (i) although post-1999 Turkey has made important progress in the pre-disaster phase of DRM, particularly with the enactment of obligatory earthquake insurance and tightened standards for building construction, the country remains far from substantial success in DRM; (ii) in recent years local governments have been given more authority in the realm of DRM, yet Turkey's approach to DRM is still predominantly centralized, at the expense of successful DRM practices at the local level; (iii) while the devastating 1999 earthquake led to advances in the pre-disaster components of DRM, progress has been made mostly in the realm of earthquakes; Turkey's other major disasters (e.g. landslides, floods, wildfires) also require similar attention from local and central authorities.