43 results for Utility-based performance measures
Abstract:
There is an increasing need for a model for the process-based performance measurement of multispecialty tertiary care hospitals for quality improvement. The analytic hierarchy process (AHP) is utilized in this study to evolve such a model. Each step in the model was derived through group discussions and brainstorming sessions among experienced clinicians and managers. The tool was applied to two tertiary care teaching hospitals, in Barbados and India. The model enabled identification of specific areas where neither hospital performed very well, and helped to suggest recommendations for improving those areas. AHP is recommended as a valuable tool for measuring the process-based performance of multispecialty tertiary care hospitals. © Emerald Group Publishing Limited.
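As a sketch of how AHP derives priority weights in such a model (the criteria and pairwise judgments below are hypothetical placeholders, not taken from the study), the geometric-mean approximation works as follows:

```python
import numpy as np

# Hypothetical AHP step: derive priority weights for three illustrative
# hospital-process criteria from a pairwise comparison matrix (Saaty scale).
criteria = ["clinical process", "waiting time", "staffing"]
# A[i][j] = how much more important criterion i is judged than criterion j
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean approximation of the principal eigenvector
geo_means = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = geo_means / geo_means.sum()

# Consistency check: lambda_max from A @ w, random index RI = 0.58 for n = 3;
# a consistency ratio below 0.1 is conventionally acceptable
lambda_max = float((A @ weights / weights).mean())
ci = (lambda_max - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58
```

With these illustrative judgments the first criterion receives the largest weight and the judgments are consistent (CR well below 0.1).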
Abstract:
This paper considers the use of general performance measures in evaluating specific planning and design decisions in higher education, and reflects on the students' learning process. Specifically, it concerns the use of the MENTOR multimedia computer-aided learning package for helping students learn about OR as part of a general business degree. The study includes the transfer of responsibility for a learning module to a new staff member and a change from a single tutor to a system involving multiple tutors. Student satisfaction measures, learning outcome measures and MENTOR usage patterns are examined in monitoring the effects of the changes in course delivery. The results raise some questions about the effectiveness of general performance measures in supporting specific decisions relating to course design and planning.
Abstract:
Purpose: Short product life cycles and/or mass customization necessitate reconfiguration of the operational enablers of a supply chain (SC) from time to time in order to harness high levels of performance. The purpose of this paper is to identify the key operational enablers under a stochastic environment on which practitioners should focus while reconfiguring a SC network. Design/methodology/approach: The paper used the interpretive structural modeling (ISM) approach, which presents a hierarchy-based model and the mutual relationships among the enablers. The contextual relationships needed for developing the structural self-interaction matrix (SSIM) among the various enablers were realized by conducting experiments through simulation of a hypothetical SC network. Findings: The research identifies various operational enablers having high driving power towards the assumed performance measures. These enablers require maximum attention and are of strategic importance while reconfiguring the SC. Practical implications: ISM provides SC managers with a useful tool to strategically adopt and focus on the key enablers which have comparatively greater potential for enhancing SC performance under given operational settings. Originality/value: The present research recognizes the importance of SC flexibility under the premise of reconfiguring the operational units in order to harness high SC performance. Given the resulting digraph from ISM, the decision maker can focus on the key enablers for effective reconfiguration. The study is one of the first efforts to develop contextual relations among operational enablers for the SSIM through the integration of discrete event simulation with ISM. © Emerald Group Publishing Limited.
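The ISM step described above, from binary contextual relations to a driving-power ranking, can be sketched as follows (the enablers and their relations are hypothetical placeholders, not the paper's actual SSIM):

```python
import numpy as np

# Hypothetical ISM sketch: compute the final reachability matrix from an
# initial adjacency matrix via transitive closure, then rank enablers by
# driving power (row sums) and dependence (column sums).
enablers = ["lead-time flexibility", "capacity flexibility",
            "information sharing", "sourcing flexibility"]
# adj[i][j] = 1 if enabler i influences enabler j (derived from the SSIM)
adj = np.array([
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 0, 0, 1],
], dtype=int)

# Warshall-style transitive closure: i reaches j directly or via k
reach = adj.copy()
for k in range(len(enablers)):
    reach |= np.outer(reach[:, k], reach[k, :])

driving_power = reach.sum(axis=1)   # how many enablers each one reaches
dependence = reach.sum(axis=0)      # how many enablers reach each one
```

In this toy instance "information sharing" has the highest driving power, which in ISM terms would place it near the base of the hierarchy as a key enabler.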
Abstract:
Purpose – The purpose of the paper is to develop an integrated framework for the performance management of healthcare services. Design/methodology/approach – This study develops a performance management framework for healthcare services using a combined analytic hierarchy process (AHP) and logical framework (LOGFRAME). The framework is then applied to the intensive care units of three different hospitals in developing nations. Numerous focus group discussions were undertaken, involving experts from the specific area under investigation. Findings – The study reveals that a combination of outcome-, structure- and process-based critical success factors, together with a combined AHP and LOGFRAME-based performance management framework, helps manage the performance of healthcare services. Practical implications – The proposed framework could be applied in hospital-based healthcare services. Originality/value – The conventional approaches to healthcare performance management are either outcome-based or process-based, and cannot reveal improvement measures appropriately in order to assure superior performance. Additionally, they lack mechanisms for planning, implementing and evaluating the improvement projects identified through performance measurement. This study presents an integrated approach to performance measurement together with a framework for implementing improvement projects.
Abstract:
Subjective measures of company performance are widely used in research and are typically interpreted as equivalent to objective measures. Yet the assumption of equivalence is open to challenge. We compared the use of both types of measure in three separate samples. Findings were consistent in showing that: (a) subjective and objective measures of company performance were positively associated (convergent validity); (b) those relationships were stronger than the relationships between measures of differing aspects of performance using the same method (discriminant validity); and (c) the relationships of subjective and objective company performance measures with a range of independent variables were equivalent (construct validity).
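The convergent/discriminant logic described above can be illustrated with simulated data (purely hypothetical; these are not the study's samples): the subjective and objective measures of the *same* aspect should correlate more strongly than two *different* aspects measured with the same subjective method.

```python
import numpy as np

# Simulated illustration of convergent vs discriminant validity checks
rng = np.random.default_rng(0)
n = 200
profitability = rng.normal(size=n)   # latent performance aspect 1
growth = rng.normal(size=n)          # latent performance aspect 2, independent

obj_profit = profitability + 0.3 * rng.normal(size=n)   # objective measure
subj_profit = profitability + 0.5 * rng.normal(size=n)  # subjective measure
subj_growth = growth + 0.5 * rng.normal(size=n)         # same method, other aspect

# Same aspect, different methods: should be strongly correlated
convergent = np.corrcoef(subj_profit, obj_profit)[0, 1]
# Different aspects, same method: should be weakly correlated
discriminant = np.corrcoef(subj_profit, subj_growth)[0, 1]
```

With these settings the convergent correlation comes out strong while the same-method, different-aspect correlation stays near zero, mirroring findings (a) and (b).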
Abstract:
This thesis proposes that, despite many experimental studies of thinking and the development of models of thinking, such as Bruner's (1966) enactive, iconic and symbolic developmental modes, the imagery and inner verbal strategies used by children need further investigation to establish a coherent theoretical basis from which to create experimental curricula for the direct improvement of those strategies. Five hundred and twenty-three first-, second- and third-year comprehensive school children were tested on 'recall' imagery, using a modified Betts Imagery Test, and on dual-coding processes (Paivio, 1971, p.179), using the P/W Visual/Verbal Questionnaire, which measures 'applied imagery' and inner verbalising. Three lines of investigation were pursued: 1. An investigation a. of hypothetical representational strategy differences between boys and girls; and b. of the extent to which strategies change with increasing age. 2. The second- and third-year children's use of representational processes was examined separately and compared with performance measures of perception, field independence, creativity, self-sufficiency and self-concept. 3. The second- and third-year children were categorised into four dual-coding strategy groups: a. High Visual/High Verbal b. Low Visual/High Verbal c. High Visual/Low Verbal d. Low Visual/Low Verbal These groups were compared on the same performance measures. The main result indicates that: 1. A hierarchy of dual-coding strategy use can be identified that is significantly related (.01, Binomial Test) to success or failure on the performance measures: the High Visual/High Verbal group registering the highest scores, the Low Visual/High Verbal and High Visual/Low Verbal groups registering intermediate scores, and the Low Visual/Low Verbal group registering the lowest scores. Subsidiary results indicate that: 2. Boys' use of visual strategies declines, and their use of verbal strategies increases, with age; girls' use of recall imagery strategies increases with age. Educational implications of the main result are discussed, the establishment of experimental curricula is proposed, and further research is suggested.
Abstract:
Short text messages, a.k.a. Microposts (e.g. Tweets), have proven to be an effective channel for revealing information about trends and events, ranging from those related to Disaster (e.g. hurricane Sandy) to those related to Violence (e.g. the Egyptian revolution). Being informed about such events as they occur can be extremely important to authorities and emergency professionals, allowing such parties to respond immediately. In this work we study the problem of topic classification (TC) of Microposts, which aims to automatically classify short messages based on the subject(s) discussed in them. Accurate TC of Microposts, however, is a challenging task, since the limited number of tokens in a post often implies a lack of sufficient contextual information. In order to provide contextual information to Microposts, we present and evaluate several graph structures surrounding concepts present in linked knowledge sources (KSs). Traditional TC techniques enrich the content of Microposts with features extracted only from the Micropost content. In contrast, our approach relies on the generation of different weighted semantic meta-graphs extracted from linked KSs. We introduce a new semantic graph, called the category meta-graph. This novel meta-graph provides a finer-grained categorisation of concepts, yielding a set of novel semantic features. Our findings show that such category meta-graph features effectively improve the performance of a topic classifier of Microposts. Furthermore, our goal is also to understand which semantic features contribute to the performance of a topic classifier. For this reason we propose an approach for the automatic estimation of the accuracy loss of a topic classifier on new, unseen Microposts. We introduce and evaluate novel topic similarity measures, which capture the similarity between the KS documents and Microposts at a conceptual level, considering the enriched representation of these documents.
Extensive evaluation in the context of Emergency Response (ER) and Violence Detection (VD) revealed that our approach outperforms previous approaches that use a single KS without linked data, or Twitter data only, by up to 31.4% in terms of F1 measure. Our main findings indicate that the new category graph contains useful information for TC and achieves results comparable to previously used semantic graphs. Furthermore, our results indicate that the accuracy of a topic classifier can be accurately predicted using the enhanced text representation, outperforming previous approaches based on content-based similarity measures. © 2014 Elsevier B.V. All rights reserved.
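The F1 measure used above to compare classifiers can be sketched as follows (the labels are hypothetical; "violence" stands in for the VD positive class):

```python
# Minimal F1 sketch: harmonic mean of precision and recall for one class
def f1_score(y_true, y_pred, positive="violence"):
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

y_true = ["violence", "violence", "other", "other", "violence"]
y_pred = ["violence", "other", "other", "violence", "violence"]
# precision = 2/3, recall = 2/3, so F1 = 2/3
```

A reported "improvement of up to 31.4% in F1" is a relative gain on exactly this kind of score.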
Abstract:
The globally increasing number of disaster victims poses a complex challenge for disaster management authorities. Moreover, to accomplish a successful transition between preparedness and response, it is important to consider the different features inherent to each type of disaster. Floods are among the most frequent and harmful disasters, hence the need to develop a disaster-preparedness tool for efficient and effective flood management. The purpose of this article is to introduce a method to simultaneously define the proper location of shelters and distribution centers, along with the allocation of prepositioned goods and the distribution decisions required to satisfy flood victims. The tool combines a raster geographical information system (GIS) with an optimization model. The GIS determines the flood hazard of the city areas in order to assess the flood situation and to discard floodable facilities. Then, the multi-commodity multimodal optimization model is solved to obtain the Pareto frontier of two criteria: distance and cost. The methodology was applied to a case study of the 2007 flood in Villahermosa, Mexico, and the results were compared to an optimized scenario of the guidelines followed by Mexican authorities, concluding that the value of the performance measures was improved using the developed method. Furthermore, the results showed that adequate care can be provided for affected people with fewer facilities than the current approach requires, and demonstrated the advantages of considering more than one distribution center for relief prepositioning.
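Extracting a Pareto frontier over the two criteria mentioned (distance and cost, both minimised) can be sketched as follows, using hypothetical candidate solutions rather than the model's actual output:

```python
# Pareto frontier sketch: keep every solution that no other solution
# dominates, where s dominates t if s is no worse on both criteria.
def pareto_frontier(solutions):
    """solutions: list of (distance, cost) tuples, both to be minimised."""
    frontier = []
    for s in solutions:
        dominated = any(o[0] <= s[0] and o[1] <= s[1] and o != s
                        for o in solutions)
        if not dominated:
            frontier.append(s)
    return sorted(frontier)

# Hypothetical (distance, cost) outcomes of candidate shelter/DC configurations
candidates = [(10, 500), (12, 420), (15, 400), (11, 480), (14, 450)]
# (14, 450) is dominated by (12, 420); the other four are Pareto-efficient
```

The decision maker then picks a point on the frontier according to the preferred trade-off between victim proximity and operating cost.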
Abstract:
The amplification of demand variation up a supply chain, widely termed ‘the Bullwhip Effect’, is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies, organisation structure and delays in material and information flow all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exclusive, as academic and operational studies have since shown that orders and/or inventories can exhibit significant variability even when customer demand and lead time are deterministic. This increase in the range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system the measures were intended to assess. In order to explore this, a plausible abstraction of the operational responses to the Supply Chain Council’s SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation the five SCOR Supply Chain Performance Attributes: Reliability, Responsiveness, Flexibility, Cost and Utilisation were continuously monitored and compared to established targets.
Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) were made for three independent supply chain roles, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand to the standard deviation of downstream demand. Factors employed to build the detailed model include: variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles, and operational settings were adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes with modelled operational responses in a dynamic discrete event simulation model. This project makes its primary contribution to knowledge by measuring the impact, on supply chain dynamics, of applying a representative performance measurement system.
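The dynamic-behaviour measure described above, the ratio of upstream to downstream demand variability, can be sketched as follows (the demand series are hypothetical, not the simulation's data):

```python
import statistics

# Bullwhip measure: ratio of the standard deviation of upstream orders to
# the standard deviation of downstream (customer) demand. A ratio above 1
# indicates amplification of demand variation up the chain.
def bullwhip_ratio(upstream_orders, downstream_demand):
    return statistics.stdev(upstream_orders) / statistics.stdev(downstream_demand)

# Hypothetical series: stable customer demand, amplified factory orders
customer_demand = [100, 102, 98, 101, 99, 103, 97, 100]
factory_orders = [100, 110, 85, 108, 92, 115, 80, 105]
ratio = bullwhip_ratio(factory_orders, customer_demand)
```

Computing this ratio at each echelon boundary shows where in the chain the performance-measure-driven adjustments amplify or dampen variability.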
Abstract:
This article reports the results of a web-based survey of real estate portfolio managers in the pension fund industry. The study focused on ascertaining the real estate research interests of the respondents as well as whether or not research funding should be allocated to various research topics. Performance measures of real estate assets and portfolios, microeconomic factors affecting real estate, and the role of real estate in a mixed-asset portfolio were the top three real estate research interests. There was some variation by the type and size of fund, providing evidence that segmentation is important within the money management industry. Respondents were also queried on more focused research subtopics, and additional questions in the survey addressed satisfaction with existing real estate benchmarks and perceptions of the usefulness of published research. The findings should help guide research practitioners and academics as to the most important research interests of plan sponsor real estate investment managers.
Abstract:
Strategic group research originated in the 1970s and a number of notable studies centered on the U.S. pharmaceutical industry. Results were, however, conflicting. This paper explores the nature of strategic groups in the U.K. pharmaceutical industry. The study confirms the presence of between six and eight strategic groups across the period studied, 1998-2002. The study also demonstrates a statistically significant relationship between these strategic groups and performance using three performance measures. The paper then compares strategic groups with competitive groups and concludes that the distinction is important and may explain the contradictory findings in earlier strategic group research. Copyright © 2007 John Wiley & Sons, Ltd.
Abstract:
This article presents some evidence on an aspect of the design of a strategic control system, at the microlevel, within a single organization. The research we report used an ethnographic approach to provide an understanding of strategy formulation. Our aim is to contribute to an area of literature which is of increasing significance, but relatively underdeveloped in terms of the application of in-depth, field-research techniques. We take an intensive look at the manner in which performance measures are formulated, at the microlevel, within a single organization. The article presents, as an in-depth case analysis, the experience of a fisheries holding company in New Zealand. The article recounts the experiences of managers within the organization of the process of identification of such things as critical success factors and key performance indicators (KPIs) and, more broadly, the formulation of a strategic performance measurement system.
Abstract:
A local area network that can support both voice and data packets offers economic advantages, since only a single network is needed for both types of traffic; greater flexibility in responding to changing user demands; and more efficient use of the transmission capacity. The latter aspect is very important in local broadcast networks where capacity is a scarce resource, for example mobile radio. This research has examined two types of local broadcast network: the Ethernet-type bus local area network and a mobile radio network with a central base station. With such contention networks, medium access control (MAC) protocols are required to gain access to the channel. MAC protocols must provide efficient scheduling of the channel among the distributed population of stations that want to transmit. No access scheme can exceed the performance of a single server queue, due to the spatial distribution of the stations: stations cannot in general form a queue without using part of the channel capacity to exchange protocol information. In this research, several medium access protocols have been examined and developed in order to increase channel throughput compared to existing protocols. However, the established performance measures of average packet time delay and throughput cannot adequately characterise protocol performance for packet voice; rather, the percentage of bits delivered within a given time bound becomes the relevant performance measure. Performance evaluation of the protocols has been carried out using discrete event simulation and, in some cases, also by mathematical modelling. All the protocols use either implicit or explicit reservation schemes, with their efficiency dependent on the fact that many voice packets are generated periodically within a talkspurt. Two of the protocols are based on the existing 'Reservation Virtual Time CSMA/CD' protocol, which forms a distributed queue through implicit reservations.
This protocol has been improved, firstly, by utilising two channels: a packet transmission channel and a packet contention channel. Packet contention is then performed in parallel with packet transmission to increase throughput. The second protocol uses variable-length packets to reduce the contention time between transmissions on a single channel. A third protocol developed is based on contention for explicit reservations. Once a station has achieved a reservation, it maintains this effective queue position for the remainder of the talkspurt and transmits after it has sensed the transmission from the preceding station in the queue. In the mobile radio environment, adaptations to the protocols were necessary so that their operation was robust to signal fading. This was achieved through centralised control at a base station, unlike the local area network versions, where control was distributed among the stations. The results show an improvement in throughput compared to some previous protocols. Further work includes subjective testing to validate the protocols' effectiveness.
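The voice-oriented performance measure described above, the percentage of bits delivered within a given time bound, can be sketched as follows (packet delays and sizes are hypothetical, not simulation output):

```python
# Voice-quality measure: fraction of delivered bits whose packet delay
# stayed within the bound. Late voice packets are effectively useless,
# so this captures quality better than average delay or raw throughput.
def fraction_on_time(packets, bound_ms):
    """packets: list of (delay_ms, bits) pairs for delivered packets."""
    total = sum(bits for _, bits in packets)
    on_time = sum(bits for delay, bits in packets if delay <= bound_ms)
    return on_time / total

# Hypothetical delivered packets: (delay in ms, payload in bits)
packets = [(12, 512), (35, 512), (60, 512), (20, 1024)]
# With a 50 ms bound, 2048 of 2560 bits arrive in time -> 0.8
```

Evaluating each candidate MAC protocol against the same bound makes their suitability for packet voice directly comparable.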