797 results for Performance Measurement System, PMS, review PMS, KPIs
Abstract:
Dashboards are expected to improve decision making by amplifying cognition and capitalizing on human perceptual capabilities. Hence, interest in dashboards has increased recently, which is also evident from the proliferation of dashboard solution providers in the market. Despite dashboards' popularity, little is known about the extent of their effectiveness, i.e. what types of dashboards work best for different users or tasks. In this paper, we conduct a comprehensive multidisciplinary literature review with the aim of identifying the critical issues organizations might need to consider when implementing dashboards. Dashboards are likely to succeed and solve the problems of presentation format and information load when certain visualization principles and features are present (e.g. a high data-ink ratio and drill-down features). We recommend that dashboards come with some level of flexibility, i.e. allowing users to switch between alternative presentation formats. Also, some theory-driven guidance through pop-ups and warnings can help users to select an appropriate presentation format. Given the dearth of research on dashboards, we conclude the paper with a research agenda that could guide future studies in this area.
Abstract:
While academic interest in destination branding has been gathering momentum since the field commenced in the late 1990s, one important gap in this literature that has received relatively little attention to date is the measurement of destination brand performance. This paper sets out one method for assessing the performance of a destination brand over time. The intent is to present an approach that will appeal to marketing practitioners and which is also conceptually sound. The method is underpinned by Decision Set Theory and the concept of Consumer-Based Brand Equity (CBBE), while the key variables mirror the branding objectives used by many destination marketing organisations (DMOs). The approach is demonstrated in this paper by measuring brand performance for Australia in the New Zealand market. It is suggested that the findings provide indicators of both i) the success of previous marketing communications and ii) future performance, which can be easily communicated to a DMO's stakeholders.
Abstract:
The paper analyses the technical efficiency of Japanese banks from 2000 to 2007. The estimation technique is based on the Russell directional distance function, which takes into consideration not only desirable outputs but also an undesirable output represented by non-performing loans (NPLs). The results indicate that NPLs remain a significant burden on banks' performance. We show that banks' inputs have to be utilised more efficiently, particularly labour and premises. We also argue that a further restructuring process is needed in the segment of Regional Banks. We conclude that the Japanese banking system is still far from being fully consolidated and restructured.
Abstract:
Balance and stability are very important for everyone, especially for sportspersons who undergo extreme physical activities. Balance and stability exercises not only have a great impact on the performance of the sportsperson but also play a pivotal role in their rehabilitation. Therefore, it is essential to have knowledge of a sportsperson's balance and also to quantify it. In this work, we propose a system consisting of a wobble board, with a gyro-enhanced orientation sensor and a motion display for visual feedback, to help sportspersons improve their stability. The display unit shows the orientation of the wobble board in real time, which can help the sportsperson apply the corrective forces necessary to maintain a neutral position. The system is compact and portable. We also quantify balance and stability using power spectral density. The sportsperson is made to stand on the wobble board, and the angular orientation of the wobble board is recorded at 0.1-second intervals. The signal is analyzed using discrete Fourier transforms. The power of this signal is related to the stability of the subject. This procedure is used to measure the balance and stability of an elite cricket team. Representative results are shown below: Table 1 presents a power comparison of two subjects, and Table 2 presents a power comparison of the left leg and right leg of one subject. This procedure can also be used in clinical practice to monitor improvement in the stability dysfunction of sportspersons with injuries or other related problems undergoing rehabilitation.
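The power-based stability measure described in this abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 10 Hz sampling rate follows from the stated 0.1 s interval, while the mean removal and the use of Parseval's theorem to recover signal power from the DFT are assumptions for illustration.

```python
import numpy as np

def sway_power(theta):
    """Signal power of a wobble-board angle record via the DFT (Parseval).

    `theta` holds the board's angular orientation sampled every 0.1 s,
    as in the protocol described above.  The mean (static lean) is
    removed so that only the sway about the neutral position counts;
    a lower power indicates a steadier subject.
    """
    theta = np.asarray(theta, dtype=float)
    theta = theta - theta.mean()            # keep only the sway component
    spectrum = np.fft.fft(theta)
    # Parseval's theorem: the mean-square value of the signal equals
    # the sum of squared spectral magnitudes divided by N**2.
    return np.sum(np.abs(spectrum) ** 2) / len(theta) ** 2

# 10 s of a purely sinusoidal 1 Hz sway of 2-degree amplitude at 10 Hz
t = np.arange(100) * 0.1
theta = 2.0 * np.sin(2 * np.pi * 1.0 * t)
print(sway_power(theta))  # → 2.0 (= amplitude**2 / 2)
```

A steadier subject produces smaller angular excursions, so this scalar drops toward zero, which is consistent with the between-subject and left-leg/right-leg comparisons the abstract mentions.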
Abstract:
Measurement of in-plane motion with high resolution and large bandwidth enables model-identification and real-time control of motion-stages. This paper presents an optical beam deflection based system for measurement of in-plane motion of both macro- and micro-scale motion stages. A curved reflector is integrated with the motion stage to achieve sensitivity to in-plane translational motion along two axes. Under optimal settings, the measurement system is shown to theoretically achieve sub-angstrom measurement resolution over a bandwidth in excess of 1 kHz and negligible cross-sensitivity to linear motion. Subsequently, the proposed technique is experimentally demonstrated by measuring the in-plane motion of a piezo flexure stage and a scanning probe microcantilever. For the former case, reflective spherical balls of different radii are employed to measure the in-plane motion and the measured sensitivities are shown to agree with theoretical values, on average, to within 8.3%. For the latter case, a prototype polydimethylsiloxane micro-reflector is integrated with the microcantilever. The measured in-plane motion of the microcantilever probe is used to identify nonlinearities and the transient dynamics of the piezo-stage upon which the probe is mounted. These are subsequently compensated by means of feedback control. (C) 2013 AIP Publishing LLC.
Abstract:
Performance measurement and management (PMM) is a management and research paradox. On one hand, it provides management with many critical, useful, and needed functions. Yet, there is evidence that it can adversely affect performance. This paper attempts to resolve this paradox by focusing on the issue of "fit". That is, in today's dynamic and turbulent environment, changes in either the business environment or the business strategy can lead to the need for new or revised measures and metrics. Yet, if these measures and metrics are either not revised or incorrectly revised, then we can encounter situations where what the firm wants to achieve (as communicated by its strategy) and what the firm measures and rewards are not synchronised with each other (i.e., there is a lack of "fit"). This situation can adversely affect the ability of the firm to compete. The issue of fit is explored using a three-phase Delphi approach. Initially intended to resolve this first paradox, the Delphi study identified another paradox - one in which the researchers found that in a dynamic environment, firms do revise their strategies, yet often the PMM system is not changed. To resolve this second paradox, the paper proposes a new framework - one that shows that under certain conditions, the observed metrics "lag" is not only explainable but also desirable. The findings suggest a need to recast the accepted relationship between strategy and the PMM system, and the output included the Performance Alignment Matrix, which has utility for managers.
Abstract:
We present a fiber-optic interferometric system for measuring depth-resolved scattering in two angular dimensions using Fourier-domain low-coherence interferometry. The system is a unique hybrid of the Michelson and Sagnac interferometer topologies. The collection arm of the interferometer is scanned in two dimensions to detect angular scattering from the sample, which can then be analyzed to determine the structure of the scatterers. A key feature of the system is the full control of polarization of both the illumination and the collection fields, allowing for polarization-sensitive detection, which is essential for two-dimensional angular measurements. System performance is demonstrated using a double-layer microsphere phantom. Experimental data from samples with different sizes and acquired with different polarizations show excellent agreement with Mie theory, producing structural measurements with subwavelength accuracy.
Abstract:
This paper reports on a Field Programmable Gate Array (FPGA) implementation, prototyping, and real-time testing of a low-complexity, high-efficiency decimation filter processor deployed in conjunction with a custom-built, low-power, jitter-insensitive Continuous Time (CT) Sigma-Delta (Σ-Δ) modulator in order to measure and assess its performance. The CT Σ-Δ modulator/decimation filter cascade can be used in integrated all-digital microphone interfaces for a variety of applications, including mobile phone handsets and wireless handsets, as well as other applications requiring all-digital microphones. The work reported here concentrates on the design, implementation, and prototyping on a Xilinx Spartan 3 FPGA development system, together with real-time testing, of the decimation processing part, which deploys all-pass-based structures to process the bit stream coming from the CT Σ-Δ modulator, thereby measuring and fully assessing the modulator's performance in real time.
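The role of the decimation stage in such a cascade can be sketched with a toy model. The boxcar (moving-average) filter below is a deliberately simplified stand-in for the paper's all-pass-based structures, and the oversampling ratio of 64 is a hypothetical choice; the sketch only illustrates how a 1-bit Σ-Δ stream is filtered and downsampled to recover the encoded value.

```python
import numpy as np

def decimate_bitstream(bits, osr):
    """Low-pass filter and downsample a 1-bit sigma-delta stream.

    A boxcar (moving-average) filter over each block of `osr` samples
    is used here as a simplified stand-in for the all-pass-based
    decimation structures described in the paper.
    """
    x = 2.0 * np.asarray(bits, dtype=float) - 1.0   # map {0,1} -> {-1,+1}
    n = len(x) // osr
    # Average each block of `osr` samples, keeping one output per block.
    return x[:n * osr].reshape(n, osr).mean(axis=1)

# A constant input of 0.5 encoded as a bitstream with 75% ones density:
bits = np.tile([1, 1, 1, 0], 64)     # density 0.75 maps to the value 0.5
out = decimate_bitstream(bits, osr=64)
print(out)  # → every output sample is 0.5
```

A real all-pass-based decimator would provide far better stopband attenuation than this boxcar, but the filter-then-downsample structure is the same.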
Abstract:
Although there is a large body of research on brand equity, little in terms of a literature review has been published on this since Feldwick's (1996) paper. To address this gap, this paper brings together the scattered literature on consumer-based brand equity's conceptualisation and measurement. Measures of consumer-based brand equity are classified as either direct or indirect. Indirect measures assess consumer-based brand equity through its demonstrable dimensions and are superior at a diagnostic level. The paper concludes with directions for future research and managerial pointers for setting up a brand equity measurement system.
Abstract:
The intention of this paper was to review and discuss some of the current quantitative analytical procedures used for quality control of pharmaceutical products. The selected papers were organized according to the analytical technique employed. Techniques such as ultraviolet/visible spectrophotometry, fluorimetry, titrimetry, electroanalytical techniques, chromatographic methods (thin-layer chromatography, gas chromatography and high-performance liquid chromatography), capillary electrophoresis and vibrational spectroscopies are the main techniques that have been used for the quantitative analysis of pharmaceutical compounds. In conclusion, although simple techniques such as UV/VIS spectrophotometry and TLC are still extensively employed, HPLC is the most popular instrumental technique used for the analysis of pharmaceuticals. Moreover, a review of recent work in the area of pharmaceutical analysis showed a trend towards the application of increasingly rapid techniques, such as ultra-performance liquid chromatography, and the use of sensitive and specific detectors such as mass spectrometers.
Abstract:
The research activity carried out during the PhD course in Electrical Engineering belongs to the branch of electric and electronic measurements. The main subject of the present thesis is a distributed measurement system to be installed in Medium Voltage power networks, as well as the method developed to analyze the data acquired by the measurement system itself and to monitor power quality. In chapter 2 the increasing interest towards power quality in electrical systems is illustrated, by reporting the international research activity inherent to the problem and the relevant standards and guidelines issued. The aspect of the quality of the voltage provided by utilities and influenced by customers at the various points of a network emerged only in recent years, in particular as a consequence of the energy market liberalization. Usually, the concept of the quality of the delivered energy has been associated mostly with its continuity. Hence reliability was the main characteristic to be ensured for power systems. Nowadays, the number and duration of interruptions are the “quality indicators” commonly perceived by most customers; for this reason, a short section is dedicated also to network reliability and its regulation. In this context it should be noted that although the measurement system developed during the research activity belongs to the field of power quality evaluation systems, the information registered in real time by its remote stations can be used to improve system reliability too. Given the vast scenario of power-quality-degrading phenomena that can occur in distribution networks, the study has been focused on electromagnetic transients affecting line voltages.
The outcome of this study has been the design and realization of a distributed measurement system which continuously monitors the phase signals at different points of a network, detects the occurrence of transients superposed on the fundamental steady-state component, and registers the time of occurrence of such events. The data set is finally used to locate the source of the transient disturbance propagating along the network lines. Most of the oscillatory transients affecting line voltages are due to faults occurring at any point of the distribution system and have to be detected before protection equipment intervenes. An important conclusion is that the method can improve the monitored network's reliability, since knowledge of the location of a fault allows the energy manager to reduce as much as possible both the area of the network to be disconnected for protection purposes and the time spent by technical staff to recover from the abnormal condition and/or the damage. The part of the thesis presenting the results of this study and activity is structured as follows: chapter 3 deals with the propagation of electromagnetic transients in power systems by defining the characteristics and causes of the phenomena and briefly reporting the theory and approaches used to study transient propagation. Then the state of the art concerning methods to detect and locate faults in distribution networks is presented. Finally, attention is paid to the particular technique adopted for this purpose during the thesis, and to the methods developed on the basis of that approach. Chapter 4 reports the configuration of the distribution networks on which the fault location method has been applied by means of simulations, as well as the results obtained case by case. In this way the performance of the location procedure is tested first under ideal and then under realistic operating conditions.
In chapter 5 the measurement system designed to implement the transient detection and fault location method is presented. The hardware belonging to the measurement chain of every acquisition channel in the remote stations is described. Then the global measurement system is characterized by considering the non-ideal aspects of each device that can contribute to the final combined uncertainty on the estimated position of the fault in the network under test. Finally, this parameter is computed according to the Guide to the Expression of Uncertainty in Measurement, by means of a numerical procedure. The last chapter describes a device designed and realized during the PhD activity to replace the commercial capacitive voltage divider belonging to the conditioning block of the measurement chain. This study was carried out to provide an alternative to the transducer used, one featuring equivalent performance at lower cost. In this way, the economic impact of the investment associated with the whole measurement system would be significantly reduced, making the method's application much more feasible.
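The core idea of locating a transient source from arrival times registered at remote stations can be sketched for the simplest two-station case. This is a generic travelling-wave illustration, not the thesis's actual algorithm: the two-ended line geometry, the station placement, and the propagation speed value are all assumptions made for the example.

```python
def locate_fault(t_a, t_b, line_length_km, wave_speed_km_s=2.9e5):
    """Estimate fault position from transient arrival times at two stations.

    Stations A and B sit at opposite ends of a line of length L.
    A transient born at distance d from A arrives at A after d/v and
    at B after (L - d)/v, so the arrival-time difference
        dt = t_a - t_b = (2d - L) / v
    gives d = (L + v * dt) / 2.  The wave speed is an assumed
    propagation velocity, roughly that of overhead lines.
    """
    dt = t_a - t_b
    return (line_length_km + wave_speed_km_s * dt) / 2.0

# Synthetic check: a fault 12 km from station A on a 30 km line
v = 2.9e5
d_true = 12.0
t_a = d_true / v
t_b = (30.0 - d_true) / v
print(locate_fault(t_a, t_b, 30.0))  # → 12.0
```

In practice the timing uncertainty of each station propagates into the position estimate, which is exactly why the thesis evaluates the combined uncertainty of the measurement chain per the GUM.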
Abstract:
"June 1997."
Abstract:
As the Intensive Care Unit (ICU) is one of the vital areas of a hospital providing clinical care, the quality of service rendered must be monitored and measured quantitatively. It is therefore essential to know the performance of an ICU in order to identify any deficits and enable the service providers to improve the quality of service. Although there have been many attempts to do this with the help of illness-severity scoring systems, the relative lack of success of these methods has led to the search for a form of measurement which would encompass all the different aspects of an ICU in a holistic manner. The Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, is utilised in this study to evolve a system to measure the performance of ICU services reliably. This tool has been applied to a surgical ICU in Barbados; we recommend AHP as a valuable tool to quantify the performance of an ICU. Copyright © 2004 Inderscience Enterprises Ltd.
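The AHP step of deriving priority weights from pairwise judgements can be sketched in a few lines. This follows the standard principal-eigenvector method; the three-criteria comparison matrix below is entirely hypothetical and is not taken from the Barbados study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix.

    pairwise[i][j] holds the judged importance of criterion i over j,
    with the reciprocal property pairwise[j][i] = 1 / pairwise[i][j].
    The weights are the principal eigenvector of the matrix,
    normalised to sum to 1 (Saaty's eigenvector method).
    """
    a = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(a)
    # Take the eigenvector of the largest (principal) eigenvalue.
    principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    return principal / principal.sum()

# Hypothetical comparison of three ICU quality criteria: the first is
# judged moderately more important than the second, strongly more
# important than the third.
a = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   3.0],
     [1/5.0, 1/3.0, 1.0]]
w = ahp_weights(a)
print(np.round(w, 3))  # weights sum to 1; the first criterion dominates
```

In a full AHP study these criterion weights would then be combined with scores of the ICU on each criterion, and a consistency ratio would be checked before the judgements are accepted.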
Abstract:
This paper discusses the use of comparative performance measurement by means of Data Envelopment Analysis in the context of the regulation of English and Welsh water companies. Specifically, the use of Data Envelopment Analysis to estimate potential cost savings in sewerage is discussed as it fed into the price review of water companies carried out by the regulator of water companies in 1994. The application is used as a vehicle for highlighting generic issues: assessing the impact of factors on the ranking of units on performance, the insights gained from using alternative methods to assess comparative performance, and the problem of assessing comparative performance when the entities involved are few in number but highly complex. The paper should prove of interest to those interested in regulation and, more generally, in the use of methods of comparative performance measurement.
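The flavour of DEA efficiency scoring can be shown in its degenerate single-input, single-output case, where no linear programming is needed. This is a deliberate simplification: the multi-factor DEA models used in regulatory cost assessments solve one linear programme per unit, and the cost and output figures below are invented for illustration.

```python
def dea_efficiency(inputs, outputs):
    """CCR-style efficiency scores for the single-input, single-output case.

    With one input and one output, each unit's constant-returns DEA
    efficiency reduces to its output/input ratio divided by the best
    ratio observed in the sample, so the frontier is a single ray and
    no linear programming is required.
    """
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Four hypothetical sewerage units: operating cost (input) against
# properties served (output).
cost = [10.0, 20.0, 30.0, 50.0]
served = [5.0, 15.0, 18.0, 25.0]
print(dea_efficiency(cost, served))  # → [0.666..., 1.0, 0.8, 0.666...]
```

The unit with the best output-to-cost ratio defines the frontier and scores 1.0; every other unit's score is the proportional input contraction that would bring it onto that frontier, which is the quantity a regulator reads as "potential cost saving".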