50 results for model quality


Relevance:

30.00%

Abstract:

Purpose – The purpose of this paper is to develop a comprehensive framework for improving intensive care unit performance. Design/methodology/approach – The study introduces a quality management framework combining a cause and effect diagram and a logical framework. An intensive care unit was identified for the study on the basis of its performance. The reasons for not achieving the desired performance were identified using a cause and effect diagram with stakeholder involvement. A logical framework was developed using information from the cause and effect diagram, and a detailed project plan was drawn up. The improvement projects were implemented and evaluated. Findings – Stakeholders identified various intensive care unit issues. Managerial performance, organizational processes and insufficient staff were considered major issues. A logical framework was developed to plan an improvement project to resolve issues raised by clinicians and patients. Improved infrastructure, state-of-the-art equipment, well-maintained facilities, IT-based communication, motivated doctors, nurses and support staff, improved patient care and improved drug availability were considered the main project outputs for improving performance. The proposed framework is currently being used as a continuous quality improvement tool, providing a planning, implementing, monitoring and evaluating framework for quality improvement measures on a sustainable basis. Practical implications – The combined cause and effect diagram and logical framework analysis is a novel and effective approach to improving intensive care performance. Similar approaches could be adopted in any intensive care unit. Originality/value – The paper presents a uniform model that can be applied to most intensive care units.

Relevance:

30.00%

Abstract:

Particle breakage due to fluid flow through various geometries can have a major influence on the performance of particle/fluid processes and on the product quality characteristics of particle/fluid products. In this study, whey protein precipitate dispersions were used as a case study to investigate the effect of flow intensity and exposure time on the breakage of these precipitate particles. Computational fluid dynamic (CFD) simulations were performed to evaluate the turbulent eddy dissipation rate (TED) and associated exposure time along various flow geometries. The focus of this work is on the predictive modelling of particle breakage in particle/fluid systems. A number of breakage models were developed to relate TED and exposure time to particle breakage. The suitability of these breakage models was evaluated for their ability to predict the experimentally determined breakage of the whey protein precipitate particles. A "power-law threshold" breakage model was found to provide a satisfactory capability for predicting the breakage of the whey protein precipitate particles. The whey protein precipitate dispersions were propelled through a number of different geometries such as bends, tees and elbows, and the model accurately predicted the mean particle size attained after flow through these geometries. © 2005 Elsevier Ltd. All rights reserved.
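The "power-law threshold" idea described above can be sketched in code: no breakage below a critical dissipation rate, and breakage growing as a power law of the excess dissipation rate scaled by exposure time. The functional form, threshold and constants below are hypothetical illustrations, not the fitted model from the study:

```python
def breakage_fraction(ted, exposure_time, ted_threshold=1e3, k=1e-4, n=0.8):
    """Illustrative power-law threshold breakage model (assumed form).

    ted: turbulent eddy dissipation rate (W/kg)
    exposure_time: time spent at that dissipation rate (s)
    Returns the predicted fraction of particles broken, capped at 1.
    """
    if ted <= ted_threshold:
        # below the threshold dissipation rate, no breakage is predicted
        return 0.0
    frac = k * (ted - ted_threshold) ** n * exposure_time
    return min(frac, 1.0)  # cap at complete breakage
```

The threshold term captures the experimental observation that gentle flows leave particle size unchanged, while the power law captures the accelerating breakage in intense flow regions such as bends and tees.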

Relevance:

30.00%

Abstract:

PURPOSE: To show that the limited surface quality produced by one model of excimer laser system can degrade visual performance, using a polymethylmethacrylate (PMMA) model. METHODS: A range of lenses of different powers was ablated in PMMA sheets using five DOS-based Nidek EC-5000 laser systems (Nidek Technologies, Gamagori, Japan) from different clinics. Surface quality was objectively assessed using profilometry. Contrast sensitivity and visual acuity were measured through the lenses when their powers were neutralized with suitable spectacle trial lenses. RESULTS: Average surface roughness was found to increase with lens power, roughness values being higher for negative lenses than for positive lenses. Losses in visual contrast sensitivity and acuity measured in two subjects were found to follow a similar pattern. Findings are similar to those previously published for other excimer laser systems. CONCLUSIONS: Levels of surface roughness produced by some laser systems may be sufficient to degrade visual performance under some circumstances.
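The average surface roughness reported by profilometry has a standard definition, Ra: the mean absolute deviation of the height profile from its mean line. A minimal computation from a sampled profile:

```python
def average_roughness(profile):
    """Arithmetic average roughness Ra: the mean absolute deviation of the
    sampled surface-height profile from its mean line (the standard
    profilometry definition)."""
    mean_line = sum(profile) / len(profile)
    return sum(abs(z - mean_line) for z in profile) / len(profile)
```

A perfectly flat profile gives Ra = 0; larger deviations from the mean line raise Ra, which is why rougher ablated surfaces scatter more light and degrade contrast sensitivity.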

Relevance:

30.00%

Abstract:

Due to the dynamic and multihop nature of the Mobile Ad-hoc Network (MANET), voice communication over MANET may encounter many challenges. We set up a subjective quality evaluation model using an extended ITU-T E-model. Through simulation in NS-2, we evaluate how the following factors impact voice quality in MANET: the number of hops, the number of route breakages, the number of communication pairs and the background traffic. Using AODV as the underlying routing protocol, and with the MAC layer changed from 802.11 DCF to 802.11e EDCF, we observe that 802.11e is more suitable for implementing voice communication over MANET. © 2005 IEEE.
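The ITU-T E-model maps an overall transmission rating factor R to an estimated mean opinion score via the standard formula in ITU-T G.107. A minimal sketch, with the R computation simplified to the default R0 = 93.2 minus delay and equipment impairments (a simplifying assumption; the paper's extension of the E-model is not reproduced here):

```python
def r_factor(delay_impairment, equipment_impairment):
    """Simplified E-model rating: R = R0 - Id - Ie_eff with default
    R0 = 93.2 and the remaining impairment terms neglected
    (an assumption for illustration)."""
    return 93.2 - delay_impairment - equipment_impairment

def e_model_mos(r):
    """Standard ITU-T G.107 mapping from rating factor R to estimated MOS."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6
```

For example, an unimpaired default R of 93.2 maps to a MOS around 4.4, while growing delay and loss impairments (more hops, route breakages, background traffic) push R and hence MOS downward.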

Relevance:

30.00%

Abstract:

To examine the detailed operation of the power distribution network in a future more electric aircraft that employs electric actuation systems, a Micro-Cap SPICE simulation is developed for one of the essential buses. Particular attention is paid to accurately modelling the most important effects that influence system power quality. Representative system and flight data are used to illustrate the operation of the simulation and to assess the power quality conditions within the network as the flight control surfaces are deployed. The results illustrate the importance of correct cable sizing to ensure stable operation of actuators during transient conditions.

Relevance:

30.00%

Abstract:

The target of no-reference (NR) image quality assessment (IQA) is to establish a computational model to predict the visual quality of an image. The most prominent existing method is based on natural scene statistics (NSS). It uses the joint and marginal distributions of wavelet coefficients for IQA. However, this method is only applicable to JPEG2000 compressed images. Since the wavelet transform fails to capture the directional information of images, an improved NSS model is established using contourlets. In this paper, the contourlet transform is applied to the NSS of images, and the relationship between contourlet coefficients is represented by their joint distribution. The statistics of contourlet coefficients are suitable indicators of variations in image quality. In addition, an image-dependent threshold is adopted to reduce the effect of image content on the statistical model. Finally, image quality is evaluated by combining the extracted features in each subband nonlinearly. Our algorithm is trained and tested on the LIVE database II. Experimental results demonstrate that the proposed algorithm is superior to the conventional NSS model and can be applied to different distortions. © 2009 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Abstract:

This research is focused on the optimisation of resource utilisation in wireless mobile networks with consideration of the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated. These include video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems. A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment were used in the validation tests. It has been shown that Pause Intensity is closely correlated with subjective quality measurement in terms of the Mean Opinion Score, and this correlation property is content independent. Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach concerns both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with consideration of the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness.
The 3GPP Long Term Evolution (LTE) system is used as the main application environment, where the proposed research framework is examined and the results are compared with existing scheduling methods in terms of achievable fairness, efficiency and correlation. Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritization of users by considering their perceived quality for the services received. Meanwhile, a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst the users for different scheduling policies, have been demonstrated in the context of LTE. Finally, the work on interworking between the mobile communication system at the macro-cell level and different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism of the user's data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fair-efficient spectrum. The associated offloading mechanism can properly control the number of users within the coverage areas of the macro-cell base station and each of the WiFi access points involved. The performance of the non-seamless and user-controlled mobile traffic offloading (through the mobile WiFi devices) has been evaluated and compared with that of the standard operator-controlled WiFi hotspots.
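The efficiency/fairness trade-off in such schedulers is commonly expressed through a generalized proportional-fair metric, where a tunable exponent moves the policy between pure efficiency and stronger fairness. The sketch below shows this standard form, not the thesis's specific QoE-driven algorithm:

```python
def schedule_user(instant_rates, avg_throughputs, alpha=1.0):
    """Generalized proportional-fair scheduling: pick the user maximising
    r_i / T_i**alpha, where r_i is the instantaneous achievable rate and
    T_i the user's average throughput so far.

    alpha = 0 is max-rate (pure efficiency); alpha = 1 is classic
    proportional fair; larger alpha weights fairness more heavily.
    """
    metrics = [r / (t ** alpha) for r, t in zip(instant_rates, avg_throughputs)]
    return max(range(len(metrics)), key=metrics.__getitem__)
```

With alpha = 0 the scheduler always serves the user with the best channel; as alpha grows, users who have been starved of throughput win the resource instead, which is the knob an online parameter adjustment can turn.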

Relevance:

30.00%

Abstract:

Data fluctuation in multiple measurements of Laser Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on the Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance analysis accuracy is to improve the quality and consistency of the emission signal, for example by averaging the spectral signals or standardizing the spectrum over a number of laser shots. The proposed method focuses instead on enhancing the robustness of the quantitative analysis regression model. The proposed RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual error calculation based on the statistical distribution of the measured spectral data. Through the improved segmented weighting function, information on spectral data within the normal distribution is retained in the regression model while information on outliers is restrained or removed. Experiments analysing the copper concentration of 16 certified standard brass samples were carried out. The average relative standard deviation obtained from the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and better modeling robustness compared with quantitative analysis methods based on Partial Least Squares (PLS) regression, the standard Support Vector Machine (SVM) and WLS-SVM. It was also demonstrated that the improved weighting function had better overall performance in model robustness and convergence speed than the four known weighting functions.
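For context, the classical segmented weighting function used in WLS-SVM (due to Suykens et al.), which the proposed improved function refines, assigns full weight to small standardized residuals, linearly decreasing weight in a transition band, and a small floor weight to outliers. The band limits c1 = 2.5 and c2 = 3.0 are the commonly quoted defaults; the paper's own segmented function is not reproduced here:

```python
def wls_svm_weight(residual, s_hat, c1=2.5, c2=3.0, floor=1e-4):
    """Classical WLS-SVM segmented weighting function.

    residual: the regression residual e_k for one sample
    s_hat: a robust scale estimate of the residual distribution
    Returns a weight in (0, 1]: 1 for inliers, linearly decaying in the
    transition band, and a small floor value for outliers.
    """
    z = abs(residual / s_hat)
    if z <= c1:
        return 1.0          # inlier: keep full influence
    if z <= c2:
        return (c2 - z) / (c2 - c1)  # transition band: linear decay
    return floor            # outlier: nearly removed from the fit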

Relevance:

30.00%

Abstract:

Surface quality is important in engineering, and a vital aspect of it is surface roughness, since it plays an important role in the wear resistance, ductility, and tensile and fatigue strength of machined parts. This paper reports on a research study on the development of a geometrical model for surface roughness prediction when face milling with square inserts. The model is based on a geometrical analysis of the recreation of the tool trail left on the machined surface. The model has been validated with experimental data obtained for high-speed milling of aluminum alloy (Al 7075-T7351) using a wide range of cutting speeds, feeds per tooth, axial depths of cut and different values of tool nose radius (0.8 mm and 2.5 mm), with the Taguchi method as the design of experiments. The experimental roughness was obtained by measuring the surface roughness of the milled surfaces with a non-contact profilometer. The developed model can be used for any combination of workpiece material and tool when tool flank wear is not considered, and is suitable for any tool diameter with any number of teeth and any tool nose radius. The results show that the developed model achieved excellent performance, with almost 98% accuracy in predicting the surface roughness when compared to the experimental data. © 2014 The Society of Manufacturing Engineers.
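As a point of reference, the classical kinematic approximation relates peak-to-valley roughness to feed per tooth and tool nose radius. This textbook formula (not the paper's full geometrical model, which accounts for the complete tool trail) already shows why a larger nose radius lowers roughness:

```python
def kinematic_roughness_peak(feed_per_tooth, nose_radius):
    """Classical kinematic peak-to-valley roughness approximation
    Rt ≈ f_z**2 / (8 * r), valid when the nose radius dominates surface
    generation. Units are consistent: mm in gives mm out."""
    return feed_per_tooth ** 2 / (8.0 * nose_radius)
```

For a feed per tooth of 0.2 mm and the paper's 0.8 mm nose radius, this predicts Rt ≈ 6.25 µm; switching to the 2.5 mm radius at the same feed cuts the predicted roughness by a factor of about three.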

Relevance:

30.00%

Abstract:

Measurement assisted assembly (MAA) has the potential to facilitate a step change in assembly efficiency for large structures such as airframes, through the reduction of rework, manually intensive processes and expensive monolithic assembly tooling. It is shown how MAA can enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved aerodynamic tolerances. These advances will require the development of automated networks of measurement instruments, model-based thermal compensation, the automatic integration of 'live' measurement data into variation simulation, and algorithms to generate cutting paths for predictive shimming and drilling processes. This paper sets out an architecture for digital systems which will enable this integrated approach to variation management. © 2013 The Authors.

Relevance:

30.00%

Abstract:

Decision-making in product quality is an indispensable stage in product development, in order to reduce product development risk. Based on the identification of the deficiencies of quality function deployment (QFD) and failure modes and effects analysis (FMEA), a novel decision-making method is presented that draws upon a knowledge network of failure scenarios. An ontological expression of failure scenarios is presented together with a framework of a failure knowledge network (FKN). According to the roles of quality characteristics (QCs) in failure processing, QCs are set into three categories, namely perceptible QCs, restrictive QCs and controllable QCs, which represent the monitoring targets, control targets and improvement targets, respectively, for quality management. A mathematical model and algorithms based on the analytic network process (ANP) are introduced for calculating the priority of QCs with respect to different development scenarios. A case study is provided according to the proposed decision-making procedure based on the FKN. This methodology is applied in the propeller design process to solve the problem of prioritising QCs. This paper provides a practical approach for decision-making in product quality. Copyright © 2011 Inderscience Enterprises Ltd.
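The ANP priority calculation ultimately reduces to finding the limiting priorities of a column-stochastic supermatrix. A minimal power-iteration sketch of that limit-supermatrix step (assuming an irreducible, primitive matrix; this is the generic ANP mechanism, not the paper's specific algorithm):

```python
def anp_priorities(supermatrix, iters=200):
    """Compute limiting ANP priorities by power iteration on a
    column-stochastic supermatrix (entries supermatrix[i][j] give the
    influence of element j on element i; each column sums to 1).
    Returns a normalised priority vector."""
    n = len(supermatrix)
    v = [1.0 / n] * n  # uniform starting priorities
    for _ in range(iters):
        v = [sum(supermatrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        v = [x / total for x in v]  # renormalise each step
    return v
```

The converged vector ranks the quality characteristics; in the paper's setting, those priorities are what single out which perceptible, restrictive or controllable QCs deserve attention in a given development scenario.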

Relevance:

30.00%

Abstract:

This paper explores the sharing of value in business transactions. Although there is increased usage of the terminology of value in marketing (concepts such as value-based selling and pricing) as well as in purchasing (value-based purchasing), the definition of the term is still vague. In order to better understand the definition of value, the authors argue that it is important to understand the sharing of value in general, and the element of power in the sharing of value in particular. The aim of this paper is to add to this debate, and this requires us to critique the current models. The key process that the analysis of power will help to explain is the division of the available revenue stream flowing up the chain from the buyer's customers. If the buyer and supplier do not cooperate, then power will be key in the sharing of that money flow. If buyers and suppliers fully cooperate, they may be able to reduce their costs and/or increase the quality of the sales offering the buyer makes to their customer.

Relevance:

30.00%

Abstract:

Local air quality was one of the main stimulants for low carbon vehicle development during the 1990s. Issues of national fuel security and global air quality (climate change) have added pressure for their development, stimulating schemes to facilitate their deployment in the UK. In this case study, Coventry City Council aimed to adopt an in-house fleet of electric and hybrid-electric vehicles to replace business mileage paid for in employees' private vehicles. This study made comparisons between the proposed vehicle technologies, in terms of costs and air quality, over projected scenarios of typical use. The study found that under 2009 conditions, the electric and hybrid fleet could not compete on cost with the current business model because of untested assumptions, although certain emissions were significantly reduced (by more than 50%). Climate change gas emissions were most drastically reduced where electric vehicles were adopted, because the electricity supply was generated by renewable energy sources. The study identified the key cost barriers and benefits to adoption of low-emission vehicles under current conditions in the Coventry fleet. Low-emission vehicles achieved significant air pollution-associated health cost and atmospheric emission reductions per vehicle, and widespread adoption in cities could deliver significant change. © The Author 2011. Published by Oxford University Press. All rights reserved.

Relevance:

30.00%

Abstract:

In this paper a full analytic model for pause intensity (PI), a no-reference metric for video quality assessment, is presented. The model is built upon the video playout buffer behavior at the client side and also encompasses the characteristics of a TCP network. Video streaming via TCP produces impairments in play continuity, which are not typically reflected in current objective metrics such as PSNR and SSIM. Recently the buffer underrun frequency/probability has been used to characterize the buffer behavior and as a measurement for performance optimization. But we show, using subjective testing, that underrun frequency cannot reflect the viewers' quality of experience for TCP-based streaming. We also demonstrate that PI is a comprehensive metric made up of a combination of phenomena observed in the playout buffer. The analytical model in this work is verified with simulations carried out on ns-2, showing that the two results are closely matched. The effectiveness of the PI metric has also been proved by subjective testing on a range of video clips, where PI values exhibit a good correlation with the viewers' opinion scores. © 2012 IEEE.
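One simple way to combine the two phenomena that PI captures, pause duration and pause frequency, is sketched below. This formula is an illustrative assumption: the paper derives PI analytically from playout-buffer behaviour, not from this expression.

```python
def pause_intensity(pause_durations, playback_time):
    """Illustrative continuity metric combining pause duration and pause
    frequency over a playback session (an assumed simplification, not the
    paper's analytic PI definition).

    pause_durations: list of individual pause lengths (s)
    playback_time: total session duration (s)
    """
    total_pause = sum(pause_durations)          # aggregate stall time
    frequency = len(pause_durations) / playback_time  # pauses per second
    return (total_pause / playback_time) * frequency
```

Under this form, a session with many short stalls and a session with one long stall can score differently even at equal total pause time, which is the distinction underrun frequency alone misses.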

Relevance:

30.00%

Abstract:

This paper describes a process to enhance the quality of higher education. At the heart of the process is a cross-sparring collaborative model, whereby institutions act as critical friends. This is based on a prior self-evaluation, in which the institution/programme identifies the quality criteria it wants to improve. Part of the process is to ensure the documentation of best practices so that they can be shared with others in a so-called marketplace. Linking the best practices to a criterion makes them searchable on a large scale. Optimal pairings of institutions can then take place for the cross-sparring activities.