53 results for Satisfaction Measurement Methods


Relevance: 30.00%

Abstract:

Taking issue with the prevalent practice of measuring customer satisfaction with a single global measurement item, this article stresses the importance of measuring customer satisfaction through its underlying dimensions, especially in retail settings. Empirical results of a survey of 351 consumers demonstrate that (a) consumer satisfaction with retail stores has 6 key dimensions, (b) the suggested dimensions of retail satisfaction predict overall satisfaction, and (c) the dimensions of retail satisfaction have a greater effect on overall satisfaction than SERVQUAL dimensions. However, the predictive power of the dimensions of retail satisfaction is still fairly low. Implications for retail management as well as academic research are outlined.
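
The comparison at the heart of this abstract, dimension-level scores versus a single global item as predictors of overall satisfaction, is essentially a multiple-regression comparison. A minimal sketch of that kind of comparison follows; the simulated data, dimension weights and use of R² are illustrative assumptions, not the authors' instrument or analysis.

```python
# Illustrative sketch: compare how well (a) six retail-satisfaction dimension
# scores and (b) a single global item explain overall satisfaction.
# Data, dimension weights, and variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 351                                    # sample size reported in the abstract
dims = rng.normal(size=(n, 6))             # six dimension scores (standardised)
overall = dims @ np.array([0.4, 0.3, 0.2, 0.2, 0.1, 0.1]) + rng.normal(scale=1.0, size=n)
single_item = overall + rng.normal(scale=1.5, size=n)   # noisy single global item

def r_squared(X, y):
    """R^2 of an OLS fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

print("R^2, six dimensions :", round(r_squared(dims, overall), 3))
print("R^2, single item    :", round(r_squared(single_item.reshape(-1, 1), overall), 3))
```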

Relevance: 30.00%

Abstract:

Background: Heterochromatic flicker photometry (HFP) is a psychophysical technique used to measure macular pigment optical density (MPOD). We used the MPS 9000 (MPS) HFP device. Our aim was to determine whether the repeatability of the MPS could be improved to make it more suitable for monitoring MPOD over time. Methods: Intra-session repeatability was assessed in 25 participants (aged 20-50 years). The resulting data were explored in detail, for example by examining the effect of removing or adjusting data with less than optimal quality parameters. A protocol was developed for improved overall reliability, which was then tested in terms of inter-session repeatability in a separate group of 27 participants (aged 19-52 years). Results: Removal and adjustment of data reduced the intra-session coefficient of repeatability (CR) by 0.04, on average, and the mean individual standard deviation by 0.004. Inspection of the raw data offered further insight into ways of improving repeatability. The proposed protocol resulted in an inter-session CR of 0.08. Conclusions: Removal and adjustment of less than optimal data improved repeatability and is therefore recommended. To further improve repeatability, we propose, in brief, that each patient perform each part of the test twice, and a third time where necessary (described in detail in the protocol). Doing so will make the MPS more useful in research and clinical settings. © 2012 Springer-Verlag.
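
The repeatability statistics reported here (coefficient of repeatability and mean individual standard deviation) follow standard test-retest conventions. A minimal sketch of how such figures can be computed from paired MPOD readings is given below, assuming a Bland-Altman style CR of 1.96 × the SD of within-subject differences; the readings are hypothetical and the MPS 9000 protocol itself is not reproduced.

```python
# Illustrative sketch: intra-session repeatability statistics for paired MPOD
# readings (two measurements per participant). Values are hypothetical.
import numpy as np

# first and second MPOD readings for each participant (hypothetical data)
mpod_1 = np.array([0.42, 0.35, 0.51, 0.28, 0.60, 0.33, 0.47])
mpod_2 = np.array([0.45, 0.31, 0.49, 0.30, 0.55, 0.36, 0.44])

diffs = mpod_1 - mpod_2

# Bland-Altman style coefficient of repeatability: 1.96 x SD of the differences
cr = 1.96 * diffs.std(ddof=1)

# mean individual (within-subject) standard deviation across the two readings
individual_sd = np.std(np.column_stack([mpod_1, mpod_2]), axis=1, ddof=1)
mean_individual_sd = individual_sd.mean()

print(f"coefficient of repeatability: {cr:.3f}")
print(f"mean individual SD:          {mean_individual_sd:.3f}")
```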

Relevance: 30.00%

Abstract:

The rationale for carrying out this research was to address the clear lack of knowledge surrounding the measurement of public hospital performance in Ireland. The objectives of this research were to develop a comprehensive model for measuring hospital performance and to use this model to measure the performance of public acute hospitals in Ireland in 2007. After the advantages and disadvantages of various measurement models were assessed, the Data Envelopment Analysis (DEA) model was chosen for this research. DEA was initiated by Charnes, Cooper and Rhodes in 1978 and further developed by Fare et al. (1983) and Banker et al. (1984). The method used to choose relevant inputs and outputs for the model followed that adopted by Casu et al. (2005), which included the use of focus groups. The main conclusions of the research are threefold. Firstly, it is clear that each stakeholder group has differing opinions on what constitutes good performance. It is therefore imperative that any performance measurement model be designed within parameters that are clearly understood by its intended audience. Secondly, there is a lack of publicly available qualitative information in Ireland, which inhibits detailed analysis of hospital performance. Thirdly, based on the available qualitative and quantitative data, the results indicated a high level of efficiency among the public acute hospitals in Ireland in their staffing and non-pay costs, averaging 98.5%. As DEA scores are sensitive to the number of input and output variables as well as the size of the sample, it should be borne in mind that a high level of efficiency could be a result of using DEA with too many variables relative to the number of hospitals. No hospital was deemed scale efficient in any of the models, even though the average scale efficiency across all hospitals was relatively high at 90.3%. Arising from this research, the main recommendations are that information on medical outcomes, survival rates and patient satisfaction should be made publicly available in Ireland; that, despite the high average efficiency level, many individual hospitals need to focus on improving their technical and scale efficiencies; and that performance measurement models should be developed to include more qualitative data.
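
For readers unfamiliar with DEA, the efficiency scores discussed here come from solving one linear programme per hospital. A minimal sketch of the standard input-oriented, constant-returns-to-scale (CCR) formulation is shown below; the hospital inputs, outputs and figures are hypothetical, and the thesis's actual variable set (chosen through focus groups) and model specification are not reproduced.

```python
# Minimal sketch of an input-oriented, constant-returns-to-scale (CCR) DEA
# model solved as a linear programme. The hospital input/output figures are
# hypothetical placeholders.
import numpy as np
from scipy.optimize import linprog

# rows = hospitals (DMUs); columns = inputs (e.g. staffing, non-pay costs)
X = np.array([[120.0, 4.0],
              [150.0, 6.5],
              [ 90.0, 3.0],
              [200.0, 9.0]])
# columns = outputs (e.g. inpatient discharges, outpatient attendances)
Y = np.array([[5000.0, 20000.0],
              [5200.0, 21000.0],
              [4100.0, 15000.0],
              [6000.0, 26000.0]])

n, m = X.shape          # number of DMUs, number of inputs
s = Y.shape[1]          # number of outputs

def ccr_efficiency(k):
    """Input-oriented CCR efficiency of DMU k: minimise theta over [theta, lambdas]."""
    c = np.r_[1.0, np.zeros(n)]                     # objective: minimise theta
    # inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.c_[-X[k].reshape(-1, 1), X.T]
    b_in = np.zeros(m)
    # outputs: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

for k in range(n):
    print(f"hospital {k}: efficiency = {ccr_efficiency(k):.3f}")
```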

Relevance: 30.00%

Abstract:

Leadership categorisation theory suggests that followers rely on a hierarchical cognitive structure in perceiving leaders and the leadership process, consisting of three levels: superordinate, basic and subordinate. The predominant view is that followers rely on Implicit Leadership Theories (ILTs) at the basic level in making judgments about managers. The thesis examines whether this presumption holds by proposing and testing two competing conceptualisations: the congruence between basic-level ILTs (general leader) and perceptions of the actual manager, and the congruence between subordinate-level ILTs (job-specific leader) and perceptions of the actual manager. The conceptualisation at the job-specific level builds on context-related assertions of the ILT explanatory models: leadership categorisation, information processing and connectionist network theories. Further, the thesis addresses the effects of ILT congruence at the group level. The hypothesised model suggests that Leader-Member Exchange (LMX) acts as a mediator between ILT congruence and outcomes. Three studies examined the proposed model. The first was cross-sectional, with 175 students reporting on work experience during a 1-year industrial placement. The second was longitudinal, with a sample of 343 students engaging in a business simulation in groups with formal leadership. The final study was a cross-sectional survey in several organisations with a sample of 178. A novel approach was taken to congruence analysis: the hypothesised models were tested using Latent Congruence Modelling (LCM), which accounts for measurement error and overcomes most of the limitations of traditional approaches. The first two studies confirm the traditionally theorised view that employees rely on basic-level ILTs in making judgments about their managers, a finding with important implications, and show that LMX mediates the relationship between ILT congruence and work-related outcomes (performance, job satisfaction, well-being, task satisfaction, intragroup conflict, group satisfaction, team realness, team-member exchange, group performance). The third study confirms this with conflict, well-being, self-rated performance and commitment as outcomes.

Relevance: 30.00%

Abstract:

Purpose: To analyse the relationship between measured intraocular pressure (IOP) and central corneal thickness (CCT), corneal hysteresis (CH) and corneal resistance factor (CRF) in ocular hypertension (OHT), primary open-angle glaucoma (POAG) and normal tension glaucoma (NTG) eyes using multiple tonometry devices. Methods: Right eyes of patients diagnosed with OHT (n=47), NTG (n=17) and POAG (n=50) were assessed. IOP was measured in random order with four devices: Goldmann applanation tonometry (GAT); the Pascal(R) dynamic contour tonometer (DCT); the Reichert(R) ocular response analyser (ORA); and the Tono-Pen(R) XL. CCT was then measured using a hand-held ultrasonic pachymeter. CH and CRF were derived from the air pressure to corneal reflectance relationship of the ORA data. Results: Compared with GAT, the Tono-Pen and the ORA Goldmann-equivalent (IOPg) and corneal-compensated (IOPcc) measures gave higher IOP readings (F=19.351, p<0.001), particularly in NTG (F=12.604, p<0.001). DCT was closest to Goldmann IOP and had the lowest variance. CCT differed significantly (F=8.305, p<0.001) between the three conditions, as did CH (F=6.854, p=0.002) and CRF (F=19.653, p<0.001). IOPcc measures were not affected by CCT, and the DCT was generally not affected by corneal biomechanical factors. Conclusion: This study suggests that, as the true pressure of the eye cannot be determined non-invasively, measurements from any tonometer should be interpreted with care, particularly when alterations in the corneal tissue are suspected.
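
The device comparisons reported here (F statistics) come from repeated measurements of the same eyes with each tonometer. A minimal sketch of that kind of paired comparison, reduced to device-versus-GAT differences tested with paired t-tests rather than the full repeated-measures ANOVA, is shown below; the IOP values are hypothetical.

```python
# Illustrative sketch: paired comparison of tonometer readings against GAT.
# Each row is one eye measured with every device; the values are hypothetical.
import numpy as np
from scipy import stats

iop = {
    "GAT":      np.array([14.0, 16.5, 21.0, 18.0, 15.5, 19.0]),
    "DCT":      np.array([14.5, 17.0, 21.5, 18.5, 16.0, 19.5]),
    "Tono-Pen": np.array([16.0, 18.5, 23.0, 20.0, 17.5, 21.0]),
    "IOPcc":    np.array([15.5, 18.0, 22.5, 19.5, 17.0, 20.5]),
}

for device in ("DCT", "Tono-Pen", "IOPcc"):
    diff = iop[device] - iop["GAT"]
    t, p = stats.ttest_rel(iop[device], iop["GAT"])   # paired t-test vs GAT
    print(f"{device:8s} mean difference vs GAT: {diff.mean():+.2f} mmHg "
          f"(t = {t:.2f}, p = {p:.3f})")
```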

Relevance: 30.00%

Abstract:

There is increasing evidence that non-enzymatic post-translational protein modifications may play key roles in various diseases. These protein modifications can be caused by free radicals generated during oxidative stress or by their products generated during lipid peroxidation. 4-Hydroxynonenal (HNE), a major biomarker of oxidative stress and lipid peroxidation, has been recognized as an important molecule in the pathology as well as the physiology of living organisms. Therefore, its detection and quantification can be considered a valuable tool for evaluating various pathophysiological conditions. The HNE-protein adduct ELISA is a method to detect HNE bound to proteins, which is considered the most likely form in which HNE occurs in living systems. Since the earlier described ELISA was validated for cell lysates and the antibody used for detection of HNE-protein adducts is non-commercial, the aim of this work was to adapt the ELISA to a commercial antibody and to apply it to the analysis of human plasma samples. After modification and validation of the protocol for both antibodies, samples from two groups were analyzed: apparently healthy obese participants (n=62) and non-obese controls (n=15). Although the absolute values of HNE-protein adducts detected differed depending on the antibody used, both ELISA methods showed significantly higher values of HNE-protein adducts in the obese group. © 2013 The Authors.
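
Quantification by ELISA of the kind described here normally rests on a calibration curve fitted to standards before group comparisons of absolute adduct values are made. A minimal sketch of a four-parameter logistic (4PL) calibration fit and back-calculation of unknowns is shown below; the standard concentrations, absorbances and the choice of a 4PL model are illustrative assumptions rather than details of the validated protocol.

```python
# Illustrative sketch: four-parameter logistic (4PL) calibration for ELISA
# absorbance data and back-calculation of unknown sample concentrations.
# Standard concentrations and optical densities are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """4PL curve: absorbance as a function of analyte concentration."""
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** hill)

std_conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])      # e.g. pmol/mg protein
std_od   = np.array([1.85, 1.70, 1.35, 0.90, 0.45, 0.20])  # absorbance (hypothetical)

params, _ = curve_fit(four_pl, std_conc, std_od,
                      p0=[0.1, 2.0, 2.0, 1.0], maxfev=10000)
bottom, top, ec50, hill = params

def od_to_conc(od):
    """Invert the fitted 4PL curve to estimate concentration from absorbance."""
    return ec50 * ((top - bottom) / (od - bottom) - 1.0) ** (1.0 / hill)

sample_od = np.array([1.10, 0.65])           # unknown plasma samples (hypothetical)
print("estimated concentrations:", np.round(od_to_conc(sample_od), 2))
```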

Relevance: 30.00%

Abstract:

Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. These imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect the decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA, (2) we propose a new fuzzy additive DEA model derived from the α-level approach and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature. Copyright © 2011 Inderscience Enterprises Ltd.
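
The α-level approach referred to here replaces each fuzzy input and output with the interval obtained by cutting its membership function at a chosen α. A minimal sketch of that step for triangular fuzzy numbers is shown below; the numbers are illustrative, and assembling the resulting interval data into the full fuzzy additive DEA programme is not reproduced.

```python
# Illustrative sketch: alpha-cuts of triangular fuzzy inputs/outputs, the
# building block of alpha-level fuzzy DEA formulations. Data are hypothetical.
from typing import Tuple

def alpha_cut(tfn: Tuple[float, float, float], alpha: float) -> Tuple[float, float]:
    """Interval [lower, upper] of a triangular fuzzy number (l, m, u) at level alpha."""
    l, m, u = tfn
    return (l + alpha * (m - l), u - alpha * (u - m))

# a DMU's fuzzy input (e.g. "about 100 staff") and fuzzy output ("about 500 units")
fuzzy_input = (90.0, 100.0, 115.0)
fuzzy_output = (450.0, 500.0, 540.0)

for alpha in (0.0, 0.5, 1.0):
    x_lo, x_hi = alpha_cut(fuzzy_input, alpha)
    y_lo, y_hi = alpha_cut(fuzzy_output, alpha)
    print(f"alpha = {alpha:.1f}: input in [{x_lo:.1f}, {x_hi:.1f}], "
          f"output in [{y_lo:.1f}, {y_hi:.1f}]")
```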

Relevance: 30.00%

Abstract:

Guest editorial

Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His research interests include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Sector Management. He serves on the editorial boards of several international journals and is a co-founder of Performance Improvement Management Software.

William Ho is a Senior Lecturer at Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. He is currently a Scholar of the Advanced Institute of Management Research.

Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector

This special issue focuses on holistic, applied research on performance measurement in energy sector management, with the aim of bridging the gap between industry and academia. After a rigorous refereeing process, seven papers were included in the special issue. The volume opens with five papers based on data envelopment analysis (DEA).

Wu et al. apply the DEA-based Malmquist index to evaluate changes in the relative efficiency and total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors find that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate over the period studied.

Ioannis E. Tsolas applies DEA to assess the performance of Greek fossil fuel-fired power stations while taking undesirable outputs, such as carbon dioxide and sulphur dioxide emissions, into consideration. Bootstrapping is deployed to address the uncertainty surrounding the DEA point estimates and to provide bias-corrected estimates and confidence intervals. The author finds that, in the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations.

Maethee Mekaroonreung and Andrew L. Johnson compare three DEA-based measures that estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs. Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors find that refineries in the Rocky Mountain region performed best, and that about 60 percent of the refineries in the sample could improve their efficiency further.

H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA) to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Both DEA and COLS are used to check three internal consistency conditions, while PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model.

Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first is a traditional DEA model analyzing cost-only efficiency. The second includes (inverse) quality by modelling total customer minutes lost as an input. The third uses total social costs, comprising the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and the number of consumers are treated as outputs in the models.

After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach for evaluating energy and climate policy. The proposed approach is akin to the analytic hierarchy process, consisting of pairwise comparisons, consistency verification, and criteria prioritization. Stakeholders and experts in the energy policy field are incorporated in the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic, and competitiveness and technology.

Finally, Borge Hess applies stochastic frontier analysis to examine the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on firm efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in a firm's efficiency following an acquisition, and only weak evidence of efficiency improvements caused by the new shareholder. Parent companies appear not to influence a subsidiary's efficiency positively, and the analysis shows a negative impact of joint ventures on the technical efficiency of the pipeline company.

To conclude, we are grateful to all the authors for their contributions and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.

Relevance: 30.00%

Abstract:

For a Switched Reluctance Motor (SRM), the flux linkage characteristic is the most basic magnetic characteristic, and many other quantities, including the incremental inductance, back emf, and electromagnetic torque, can be determined indirectly from it. In this paper, two methods of measuring the flux linkage profile of an SRM from the phase winding voltage and current measurements, with and without rotor locking devices, are presented. Torque, incremental inductance and back emf characteristics of the SRM are then obtained from the flux linkage measurements. The torque of the SRM is also measured directly as a comparison, and the closeness of the calculated and directly measured torque curves supports the validity of the method for obtaining the SRM torque, incremental inductance and back emf profiles from the flux linkage measurements. © 2013 IEEE.
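
Flux linkage measured at the phase terminals is conventionally obtained by integrating the voltage behind the winding resistance, ψ(t) = ∫(v − Ri) dt, at a fixed rotor position. A minimal sketch of that integration and a finite-difference estimate of incremental inductance follows; the waveforms, winding resistance and inductance are hypothetical, and the paper's locked-rotor and rotor-free procedures are not reproduced.

```python
# Illustrative sketch: flux linkage of one SRM phase from sampled voltage and
# current, psi(t) = integral of (v - R*i) dt, plus an incremental-inductance
# estimate d(psi)/d(i). Waveforms, resistance, and inductance are hypothetical.
import numpy as np

fs = 50_000.0                        # sampling frequency, Hz (assumed)
R = 0.8                              # phase winding resistance, ohm (assumed)
L_true = 0.040                       # constant inductance used to synthesise data, H
t = np.arange(0.0, 0.01, 1.0 / fs)   # 10 ms voltage pulse at a fixed rotor position

V = 60.0
v = np.full_like(t, V)                           # applied phase voltage
i = (V / R) * (1.0 - np.exp(-t * R / L_true))    # resulting RL current rise

# flux linkage by cumulative trapezoidal integration of (v - R*i)
emf = v - R * i
psi = np.concatenate(([0.0], np.cumsum(0.5 * (emf[1:] + emf[:-1]) / fs)))

# incremental inductance L_inc = d(psi)/d(i); should recover ~40 mH here
L_inc = np.gradient(psi, i)

print(f"flux linkage at {i[-1]:.1f} A: {psi[-1]:.4f} Wb-turns")
print(f"estimated incremental inductance: {L_inc[len(t) // 2] * 1e3:.1f} mH")
```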

Relevance: 30.00%

Abstract:

PURPOSE. The purpose of this study was to evaluate the potential of the portable Grand Seiko FR-5000 autorefractor to allow objective, continuous, open-field measurement of accommodation and pupil size for the investigation of the visual response to real-world environments and changes in the optical components of the eye. METHODS. The FR-5000 projects a pair of infrared horizontal and vertical lines on either side of fixation and analyses the separation of the bars in the reflected image. The measurement bars were turned on permanently and the video output of the FR-5000 was fed into a PC for real-time analysis. The calibration between infrared bar separation and refractive error was assessed over a range of 10.0 D with a model eye. Tolerance to longitudinal shift of the instrument head was investigated over a ±15 mm range, and tolerance to eye alignment away from the visual axis over eccentricities up to 25.0°. The minimum pupil size for measurement was determined with a model eye. RESULTS. The separation of the measurement bars changed linearly with refractive error (r = 0.99), allowing continuous online analysis of the refractive state at 60 Hz with approximately 0.01 D system resolution for pupils >2 mm. The pupil edge could be analysed on the diagonal axes at the same rate with a system resolution of approximately 0.05 mm. Measurements of accommodation and pupil size were affected by eccentricity of viewing and by instrument focusing inaccuracies. CONCLUSIONS. The small size of the instrument, together with its resolution and temporal properties and its ability to measure through a 2 mm pupil, makes it useful for the measurement of dynamic accommodation and pupil responses in confined environments, although good eye alignment is important. Copyright © 2006 American Academy of Optometry.
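
Converting the measured bar separation into a refractive value requires a calibration against a model eye of known refractive error, which the abstract reports to be linear. A minimal sketch of such a linear calibration and its application to a stream of separations follows; the separation values and coefficients are hypothetical, not the FR-5000's internal calibration.

```python
# Illustrative sketch: linear calibration between infrared bar separation
# (pixels) and model-eye refractive error (dioptres), then conversion of a
# live stream of separations to dioptres. All values are hypothetical.
import numpy as np

# calibration sweep with a model eye of known refractive error
model_eye_rx = np.array([-5.0, -2.5, 0.0, 2.5, 5.0])            # dioptres
bar_separation = np.array([118.0, 131.0, 144.5, 157.5, 171.0])  # pixels

slope, intercept = np.polyfit(bar_separation, model_eye_rx, 1)
r = np.corrcoef(bar_separation, model_eye_rx)[0, 1]
print(f"calibration: Rx = {slope:.4f} * separation + {intercept:.2f}  (r = {r:.3f})")

# apply the calibration to a short stream of live measurements (60 Hz frames)
live_separation = np.array([150.2, 149.8, 148.9, 147.5, 146.8])
live_rx = slope * live_separation + intercept
print("refractive state per frame (D):", np.round(live_rx, 2))
```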

Relevance: 30.00%

Abstract:

Purpose - The UK Prospective Diabetes Study has confirmed the importance of blood pressure (BP) as a major risk factor for diabetic retinopathy (DR). We wanted to investigate whether measuring BP in the diabetic eye clinic could identify new hypertensive patients and monitor control in existing ones. Patients and methods - We compared BP measured in patients attending the diabetic eye clinic with home blood pressure measurement (HBPM) and ambulatory BP measurement (ABPM). In all, 106 patients attending a diabetic eye clinic were selected at random from clinic attendees. BP was measured in the eye clinic with an Omron 705 CP device and compared with HBPM performed three times per day on the same model, and with measurements taken in the diabetic clinic. In addition, 11 randomly chosen patients had 24 h ABPM to validate these techniques. Results - The 106 patients recruited (70 male and 36 female) included 71 known to be hypertensive and on antihypertensive medication. Of the total, 75 patients (70.8%) had BP >140/85 in the eye clinic, of whom 51 (68%) were known to be hypertensive on treatment; this was confirmed in 46 (90%) on HBPM. A total of 24 patients (22.6%) were newly diagnosed as hypertensive in the eye clinic, which was confirmed by HBPM in 22 patients (92%). The mean BP measured in the eye clinic was significantly higher than that measured in the diabetic clinic (P<0.01). Instillation of tropicamide 1% and phenylephrine 2.5% eye drops had no effect on BP. In the 11 randomly chosen patients, 24 h ABPM validated both the diabetic eye clinic and home BP measurements. Conclusion - Attendance at the diabetic eye clinic is an important opportunity to detect both new patients with systemic hypertension and those with inadequate BP control. Ophthalmologists should be encouraged to measure BP in diabetic patients attending eye clinics, as it is an important risk factor for DR. On the basis of our findings, good BP control is a goal yet to be achieved in diabetic patients with retinopathy.

Relevance: 30.00%

Abstract:

The phagocytic clearance of apoptotic cells is a highly efficient and nonphlogistic process in vivo. Research in this area has been limited, at least in part, by technical difficulties associated with the techniques used in the detailed study of apoptotic cell clearance mechanisms. This chapter provides details of methods that may be used to study apoptotic cell clearance in vitro. Such methods have been used successfully to identify phagocyte-associated or apoptotic cell-associated molecular players in the recognition process.

Relevance: 30.00%

Abstract:

Protein carbonyls are widely analysed as a measure of protein oxidation, and several different methods exist for their determination. A previous study described orders-of-magnitude variance when protein carbonyls were analysed in a single laboratory by ELISA using different commercial kits. We have further explored the potential causes of variance in carbonyl analysis in a ring study. A soluble protein fraction was prepared from rat liver and exposed to 0, 5 and 15 min of UV irradiation. Lyophilised preparations were distributed to six laboratories across Europe that routinely undertake protein carbonyl analysis. ELISA and Western blotting techniques detected an increase in protein carbonyl formation between 0 and 5 min of UV irradiation irrespective of the method used. After irradiation for 15 min, half of the laboratories detected less oxidation than after 5 min of irradiation. Three of the four ELISA carbonyl results fell within 95% confidence intervals. Likely errors in calculating absolute carbonyl values may be attributed to differences in standardisation. Of up to 88 proteins identified as containing carbonyl groups after tryptic cleavage of irradiated and control liver proteins, only seven were common to all three liver preparations. Lysine and arginine residues modified by carbonyls are likely to be resistant to tryptic proteolysis, so use of a cocktail of proteases may increase the recovery of oxidised peptides. In conclusion, standardisation is critical for carbonyl analysis, and heavily oxidised proteins may not be effectively analysed by any existing technique.

Relevance: 30.00%

Abstract:

A new experimental technique is presented for making measurements of biaxial residual stress using load and depth sensing indentation (nanoindentation). The technique is based on spherical indentation, which, in certain deformation regimes, can be much more sensitive to residual stress than indentation with sharp pyramidal indenters like the Berkovich. Two different methods of analysis were developed: one requiring an independent measure of the material's yield strength and the other a reference specimen in the unstressed state or other known reference condition. Experiments conducted on aluminum alloys to which controlled biaxial bending stresses were applied showed that the methods are capable of measuring the residual stress to within 10-20% of the specimen yield stress. Because the methods do not require imaging of the hardness impressions, they are potentially useful for making localized measurements of residual stress, as in thin films or small volumes, or for characterization of point-to-point spatial variations of the surface stress.

Relevance: 30.00%

Abstract:

Defining 'effectiveness' in the context of community mental health teams (CMHTs) has become increasingly difficult under the current pattern of provision required in National Health Service mental health services in England. The aim of this study was to establish the characteristics of multi-professional team working effectiveness in adult CMHTs to develop a new measure of CMHT effectiveness. The study was conducted between May and November 2010 and comprised two stages. Stage 1 used a formative evaluative approach based on the Productivity Measurement and Enhancement System to develop the scale with multiple stakeholder groups over a series of qualitative workshops held in various locations across England. Stage 2 analysed responses from a cross-sectional survey of 1500 members in 135 CMHTs from 11 Mental Health Trusts in England to determine the scale's psychometric properties. Based on an analysis of its structural validity and reliability, the resultant 20-item scale demonstrated good psychometric properties and captured one overall latent factor of CMHT effectiveness comprising seven dimensions: improved service user well-being, creative problem-solving, continuous care, inter-team working, respect between professionals, engagement with carers and therapeutic relationships with service users. The scale will be of significant value to CMHTs and healthcare commissioners both nationally and internationally for monitoring, evaluating and improving team functioning in practice.
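
Scale development of this kind typically reports internal-consistency reliability alongside structural validity. A minimal sketch of a Cronbach's alpha calculation for a multi-item scale is shown below; the simulated item responses and the choice of alpha as the reliability index are illustrative assumptions, not the study's reported psychometric analysis.

```python
# Illustrative sketch: Cronbach's alpha for a k-item effectiveness scale,
# computed from simulated Likert-type responses (1-5). Data are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_respondents, n_items = 200, 20

# simulate correlated item responses: a common "effectiveness" factor plus noise
common = rng.normal(size=(n_respondents, 1))
items = np.clip(np.rint(3 + common + 0.8 * rng.normal(size=(n_respondents, n_items))), 1, 5)

def cronbach_alpha(x):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
```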