50 results for model quality
in Aston University Research Archive
Abstract:
Purpose: To evaluate lenses produced by excimer laser ablation of poly(methyl methacrylate) (PMMA) plates. Setting: University research laboratory. Methods: Two Nidek EC-5000 scanning-slit excimer laser systems were used to ablate plane-parallel plates of PMMA. The ablated lenses were examined by focimetry, interferometry, and mechanical surface profiling. Results: The spherical optical powers of the lenses matched the expected values, but the cylindrical powers were generally lower than intended. Interferometry revealed marked irregularity in the surface of negative corrections, which often had a positive “island” at their center. Positive corrections were generally smoother. These findings were supported by the results of mechanical profiling. Contrast sensitivity measurements carried out when observing through ablated lenses whose power had been neutralized with a suitable spectacle lens of opposite sign confirmed that the surface irregularities of the ablated lenses markedly reduced contrast sensitivity over a range of spatial frequencies. Conclusion: Improvements in beam delivery systems seem desirable.
Abstract:
Video streaming over Transmission Control Protocol (TCP) networks has become a popular and highly demanded service, but its quality assessment in both objective and subjective terms has not been properly addressed. In this paper, based on statistical analysis, a full analytic model of a no-reference objective metric, namely pause intensity (PI), for video quality assessment is presented. The model characterizes the video playout buffer behavior in connection with the network performance (throughput) and the video playout rate. This allows for instant quality measurement and control without requiring a reference video. PI specifically addresses the need to assess quality in terms of the continuity of playout of TCP streaming videos, which cannot be properly measured by other objective metrics such as peak signal-to-noise ratio, structural similarity, and buffer underrun or pause frequency. The performance of the analytical model is rigorously verified by simulation results and subjective tests using a range of video clips. It is demonstrated that PI is closely correlated with viewers' opinion scores regardless of the vastly different composition of individual elements, such as pause duration and pause frequency, which jointly constitute this new quality metric. It is also shown that the correlation performance of PI is consistent and content independent. © 2013 IEEE.
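The abstract does not reproduce the PI formula, so the sketch below is only a hedged illustration of the buffer behavior it describes: a playout buffer is fed by per-step network throughput and drained at the playout rate, and the fraction of time steps spent paused serves as a crude stand-in pause measure. All parameter values are hypothetical.

```python
# Toy playout-buffer model: network throughput fills the buffer, playback
# drains it; an underrun pauses playback until a restart threshold is reached.

def simulate_playout(arrivals, playout_rate, start_threshold):
    """arrivals: frames received per time step (network throughput).
    Returns (pause_steps, total_steps)."""
    buffer = 0.0
    playing = False
    pause_steps = 0
    for received in arrivals:
        buffer += received
        if not playing and buffer >= start_threshold:
            playing = True                  # (re)start once buffered enough
        if playing:
            if buffer >= playout_rate:
                buffer -= playout_rate      # normal playback
            else:
                playing = False             # underrun: playback pauses
                pause_steps += 1
        else:
            pause_steps += 1                # still buffering
    return pause_steps, len(arrivals)

def pause_intensity(pause_steps, total_steps):
    # Illustrative stand-in only: fraction of session time spent paused.
    return pause_steps / total_steps

steady = [2.0] * 50   # throughput comfortably above the playout rate
p, n = simulate_playout(steady, playout_rate=1.0, start_threshold=4.0)
```

Feeding the same clip a throughput trace below the playout rate drives the paused fraction up, which is the continuity effect the PI metric is designed to capture.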
Abstract:
Using survey data from 358 online customers, the study finds that the e-service quality construct conforms to the structure of a third-order factor model that links online service quality perceptions to distinct and actionable dimensions, including (1) website design, (2) fulfilment, (3) customer service, and (4) security/privacy. Each dimension is found to consist of several attributes that define the basis of e-service quality perceptions. A comprehensive specification of the construct, which includes attributes not covered in existing scales, is developed. The study contrasts a formative model consisting of 4 dimensions and 16 attributes against a reflective conceptualization. The results of this comparison indicate that studies using an incorrectly specified model overestimate the importance of certain e-service quality attributes. Global fit criteria are also found to support the detection of measurement misspecification. Meta-analytic data from 31,264 online customers are used to show that the developed measurement predicts customer behavior better than widely used scales, such as WebQual and E-S-Qual. The results show that the new measurement enables managers to assess e-service quality more accurately and predict customer behavior more reliably.
Abstract:
The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes a greater dependency on the potential pollution sources, rather than the traditional approach in which assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated according to the same distribution as historically occurring events or an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as a risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch-processing file. GIS software was employed to produce the input files and to present the results. The functionality of the method, as well as its sensitivity to model grid sizes, contaminant loading rates, length of stress periods, and the historical frequency of occurrence of pollution events, was evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input into the transport model. The results of the applications of the method are presented in the form of tables, graphs and spatial maps.
Varying the model grid sizes indicates no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid size. The migration of the contaminant plume also advances faster with coarse grids than with finer grids. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases in nonlinear proportion to the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method and a major advantage over contemporary risk and vulnerability methods.
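The Bernoulli source-term generation and exceedance counting described in this abstract can be sketched in miniature. A trivial first-order decay model stands in for the MODFLOW-2000/MT3DMS transport step, and all numbers below are hypothetical.

```python
import random

# One realisation: on each day a pollution event fires with probability
# p_event (random draw < p_event, as in the abstract); a simple decay factor
# stands in for advection/dispersion losses in the real transport model.

def simulate_realisation(n_days, p_event, source_mass, decay=0.9, rng=random):
    conc = 0.0
    series = []
    for _ in range(n_days):
        if rng.random() < p_event:      # synthetic contaminant source term
            conc += source_mass
        conc *= decay                   # stand-in for transport losses
        series.append(conc)
    return series

def exceedance_risk(n_realisations, n_days, p_event, source_mass,
                    threshold, seed=42):
    """Risk = fraction of realisations whose concentration ever exceeds
    the user-defined threshold at the (single) monitoring point."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_realisations):
        series = simulate_realisation(n_days, p_event, source_mass, rng=rng)
        if max(series) > threshold:
            hits += 1
    return hits / n_realisations

risk = exceedance_risk(500, 365, p_event=0.01, source_mass=5.0, threshold=10.0)
```

Repeating this count per model cell, rather than for a single point, would yield the kind of spatial risk map the thesis describes.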
Abstract:
This paper examines the impact that the introduction of a closing call auction had on market quality at the London Stock Exchange. Using estimates from the partial adjustment with noise model of Amihud and Mendelson [Amihud, Y., Mendelson, H., 1987. Trading mechanisms and stock returns: An empirical investigation. Journal of Finance 42, 533–553], we show that opening and closing market quality improved for participating stocks. When we stratify our sample securities into five groups based on trading activity, we find that the least active securities experience the greatest improvements in market quality. A control sample of stocks is not characterized by discernible changes in market quality.
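The partial adjustment with noise idea can be illustrated with a small simulation: observed prices move a fraction g toward intrinsic value each period, plus pricing noise, and g = 1 with zero noise corresponds to prices tracking value exactly. This is only a sketch of the model's mechanics, not the paper's estimation procedure, and the parameter values are hypothetical.

```python
import random

# Partial adjustment with noise: P_t = P_{t-1} + g * (V_t - P_{t-1}) + u_t,
# where V_t is the (log) intrinsic value, g the adjustment coefficient,
# and u_t Gaussian pricing noise.

def simulate_prices(values, g, noise_sd, seed=0):
    rng = random.Random(seed)
    prices = [values[0]]
    for v in values[1:]:
        last = prices[-1]
        prices.append(last + g * (v - last) + rng.gauss(0.0, noise_sd))
    return prices

# A random-walk intrinsic value series; with g = 1 and no noise the observed
# prices coincide with value, i.e. ideal market quality in this framework.
rng = random.Random(1)
values = [0.0]
for _ in range(100):
    values.append(values[-1] + rng.gauss(0.0, 0.01))
prices = simulate_prices(values, g=1.0, noise_sd=0.0)
```

Lower g or higher noise would widen the gap between prices and values, which is the sense in which the model quantifies market quality.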
Abstract:
Offshore oil and gas pipelines pose environmental risks, as any leak or burst causes an oil/gas spill with huge negative impacts on marine life. Breakdown maintenance of these pipelines is also cost-intensive and time-consuming, resulting in huge tangible and intangible losses to pipeline operators. Pipeline health monitoring and integrity analysis have been researched extensively to support successful pipeline operations, and risk-based maintenance modelling is one outcome of that research. This study develops a risk-based maintenance model using a combined multiple-criteria decision-making and weight method for offshore oil and gas pipelines in Thailand, with the active participation of experienced executives. The model's effectiveness has been demonstrated through real-life application to oil and gas pipelines in the Gulf of Thailand. Practical implications: a risk-based inspection and maintenance methodology is particularly important for oil pipeline systems, as any failure not only affects productivity negatively but also has a tremendous negative environmental impact. The proposed model helps pipeline operators analyze the health of pipelines dynamically and select specific inspection and maintenance methods for specific sections in line with their probability and severity of failure.
Abstract:
The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-consuming and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology and logical insurance plans. The risk-based model uses the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probability of risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.
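The AHP weighting step named in the abstract can be sketched as follows. The 3x3 pairwise comparison matrix below is hypothetical (the paper's actual factors and judgements are not reproduced here), and the weights are estimated with the common column-normalisation/row-average approximation of the principal eigenvector.

```python
# AHP priority weights from a pairwise comparison matrix.
# matrix[i][j] states how much more important factor i is than factor j.

def ahp_weights(matrix):
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    # Normalise each column, then average across each row.
    normalised = [[matrix[i][j] / col_sums[j] for j in range(n)]
                  for i in range(n)]
    return [sum(row) / n for row in normalised]

# Hypothetical comparisons of three failure factors, e.g. corrosion vs.
# external impact vs. construction defect.
comparisons = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = ahp_weights(comparisons)
```

In a risk-based model of this kind, the resulting weights would then be combined with consequence estimates to rank pipeline segments for inspection.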
Abstract:
Purpose - The purpose of this paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation and helps to evaluate healthcare service performance dynamically. Design/methodology/approach - This study used logical framework analysis (LFA) to improve the performance of healthcare service processes. LFA has three major steps - problem identification, solution derivation, and formation of a planning matrix for implementation. LFA has been applied in a case-study environment to three acute healthcare services (Operating Room utilisation, Accident and Emergency, and Intensive Care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This study shows LFA application in three service processes in one hospital. This very limited population sample needs to be extended. Practical implications - The proposed model can be implemented in hospital-based healthcare services in order to improve performance. It may also be applied to other services. Originality/value - Quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, they are not without flaws. There is an absence of an integrated approach that can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement those solutions. This study introduces an integrated and uniform quality management tool for healthcare services. © Emerald Group Publishing Limited.
Abstract:
Purpose - The purpose of the paper is to develop an integrated quality management model which identifies problems, suggests solutions, develops a framework for implementation and helps evaluate the performance of healthcare services dynamically. Design/methodology/approach - This paper uses logical framework analysis (LFA), a matrix approach to project planning for managing quality. This has been applied to three acute healthcare services (Operating room utilization, Accident and emergency, and Intensive care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This paper shows LFA application in three service processes in one hospital. Ideally, however, it should be tested in several hospitals and on other services as well. Practical implications - The proposed model can be applied in hospital-based healthcare services to improve performance. Originality/value - The paper shows that quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery and corrective measures are taken for superior performance, there is an absence of an integrated approach that can identify and analyze issues, provide solutions to resolve those issues, and develop a project management framework (planning, monitoring, and evaluating) to implement those solutions in order to improve process performance. This study introduces an integrated and uniform quality management tool. It integrates operations with organizational strategies. © Emerald Group Publishing Limited.
A model of service performance enhancement: the role of transactional and transformational leadership
Abstract:
This paper is concerned with the ways in which transactional and transformational leadership styles can improve the service performance of front-line staff. Past literature on services marketing has indicated the importance of leadership but has largely ignored the parallel literature in which leadership styles have been conceptualized and operationalized (e.g., sales management, organizational psychology). This paper seeks to build upon existing services marketing theory by introducing the role of leadership styles in enhancing service performance. Consequently, a conceptual framework of the effect of transactional and transformational leadership styles on service performance, anchored in a cross-disciplinary literature review, is developed. Managerial implications and future research directions are also discussed.
Abstract:
This paper is concerned with the effects that leadership styles (i.e., transactional and transformational) can have upon the level of front-line employees' service delivery quality. Previous literature has mostly looked at leadership and its effects upon subordinates within a sales, psychology, or human resources context. However, due to the idiosyncrasies inherent in services (i.e., intangibility, heterogeneity, perishability, and inseparability), it is likely that, in such a context, leadership styles will affect performance outcomes differently. Consequently, this paper seeks to expand the services marketing literature by developing a conceptual framework of leadership style effects adapted to the field of services marketing. Of particular importance are the effects that leadership styles have upon front-line employee "motivators" and service-related job outcomes. Specific hypotheses are developed and future research directions are also presented for consideration.
Abstract:
This paper reports the construction of an 'efficient frontier' of the perceived quality attributes of academic accounting journals. The analysis is based on perception data from two web-based surveys of Australasian and British academics. The research reported here contributes to the existing literature by augmenting the commonly supported single dimension of quality with an additional measure indicating the variation of perceptions of journal quality. The result of combining these factors is depicted diagrammatically in a manner that reflects the risk and return trade-off as conceptualised in the capital market model of an efficient frontier of investment opportunities. This conceptualisation of a 'market' for accounting research provides a context in which to highlight the complex issues facing academics in their roles as editors, researchers and authors. The analysis indicates that the perceptions of the so-called 'elite' US accounting journals have become unsettled particularly in Australasia, showing high levels of variability in perceived quality, while other traditionally highly ranked journals (ABR, AOS, CAR) have a more 'efficient' combination of high-quality ranking and lower dispersion of perceptions. The implications of these results for accounting academics in the context of what is often seen as a market for accounting research are discussed. © 2006 Elsevier Ltd. All rights reserved.
Abstract:
Purpose – Role clarity of frontline staff is critical to their perceptions of service quality in call centres. The purpose of this study is to examine the effects of role clarity and its antecedents and consequences on employee-perceived service quality. Design/methodology/approach – A conceptual model, based on the job characteristics model and cognitive theories, is proposed. Key antecedents of role clarity considered here are feedback, autonomy, participation, supervisory consideration, and team support; key consequences are organizational commitment, job satisfaction and service quality. An internal marketing approach is adopted and all variables are measured from the frontline employee's perspective. A structural equation model is developed and tested on a sample of 342 call centre representatives of a major commercial bank in the UK. Findings – The research reveals that role clarity plays a critical role in explaining employee perceptions of service quality. Further, the findings indicate that feedback, participation and team support significantly influence role clarity, which in turn influences job satisfaction and organizational commitment. Research limitations/implications – The research suggests that boundary personnel in service firms should strive for greater role clarity in order to deliver better service quality. The sample is limited to the in-house transaction call centres of a single bank. Originality/value – The contributions of this study are untangling the confusing research evidence on the effect of role clarity on service quality, using service quality as a performance variable as opposed to productivity estimates, adopting an internal marketing approach to understanding the phenomenon, and introducing teamwork along with job-design and supervisory factors as antecedents of role clarity.
Abstract:
Advances in technology coupled with increasing labour costs have caused service firms to explore self-service delivery options. Although some studies have focused on self-service and the use of technology in service delivery, few have explored the role of service quality in consumer evaluation of technology-based self-service options. By integrating and extending the self-service quality framework, the service evaluation model, and the Technology Acceptance Model, the authors address this emerging issue by empirically testing a comprehensive model that captures the antecedents and consequences of perceived service quality to predict continued customer interaction in the technology-based self-service context of Internet banking. Important service evaluation constructs like perceived risk, perceived value and perceived satisfaction are modelled in this framework. The results show that perceived control has the strongest influence on service quality evaluations. Perceived speed of delivery, reliability and enjoyment also have a significant impact on service quality perceptions. The study also found that even though perceived service quality, perceived risk and satisfaction are important predictors of continued interaction, perceived customer value plays a pivotal role in influencing continued interaction.
Abstract:
Although techniques such as biopanning rely heavily upon the screening of randomized gene libraries, there is surprisingly little information available on the construction of those libraries. In general, construction is based on the cloning of 'randomized' synthetic oligonucleotides, in which given position(s) contain an equal mixture of all four bases. Yet many supposedly 'randomized' libraries contain significant elements of bias and/or omission. Here, we report the development and validation of a new, PCR-based assay that enables rapid examination of library composition both prior to and after cloning. By using our assay to analyse model libraries, we demonstrate that the cloning of a given distribution of sequences does not necessarily result in a similarly composed library of clones. Thus, while bias in randomized synthetic oligonucleotide mixtures can be virtually eliminated by using unequal ratios of the four phosphoramidites, the use of such mixtures does not ensure retrieval of a truly randomized library. We propose that in the absence of a technique to control cloning frequencies, the ability to analyse the composition of libraries after cloning will significantly enhance the quality of information derived from those libraries. © 2000 Published by Elsevier Science B.V. All rights reserved.
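The kind of composition check this assay enables can be illustrated in outline: tally the base observed at an intended randomized position across a set of clone sequences and compare the frequencies with the ideal 25% each. The sequences below are made-up examples, not data from the study.

```python
from collections import Counter

# Per-position base frequencies across clone sequences; deviations from
# 0.25 per base at a 'randomized' position indicate bias in the library.

def base_frequencies(sequences, position):
    counts = Counter(seq[position] for seq in sequences)
    total = sum(counts.values())
    return {base: counts.get(base, 0) / total for base in "ACGT"}

# Hypothetical clone sequences; position 1 is the intended randomized site.
clones = ["ATGA", "ACGA", "ATGA", "AGGA", "ATGA", "ACGA", "AAGA", "ATGA"]
freqs = base_frequencies(clones, position=1)
```

Here the over-representation of T at the randomized position is exactly the sort of post-cloning bias the authors argue must be measured rather than assumed away.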