483 results for Quality models


Relevance:

70.00%

Publisher:

Abstract:

Managing service quality is of primary importance for organizations that are increasingly service oriented and offer a growing range of services to external and internal customers. Managing service quality requires the capacity to measure service quality, which in turn requires explicit conceptions of ‘service’ and ‘service quality’. This white paper explores three key areas of the service and service marketing literature: service definition and conceptualisation, service classifications, and service quality models, and makes the following observations and proposals.

Relevance:

70.00%

Publisher:

Abstract:

Service bundles, in the context of e-government, are used to group together services that relate to a certain citizen need. These bundles can then be presented on a governmental one-stop portal to structure the available service offerings according to citizen expectations. In order to ensure that citizens use the one-stop portal and the service bundles it comprises for future transactions, the quality of these service bundles needs to be managed and maximised accordingly. Consequently, models and tools that focus on assessing service bundle quality play an important role when it comes to increasing or retaining citizens' usage behaviour. This study provides a rigorous and structured literature review of e-government outlets with regard to their coverage of service bundle quality and e-service quality themes. The study contributes to academia and practice by providing a framework for structuring and classifying existing studies relevant to the assessment of quality for government portals. Furthermore, it provides insights into the status quo of quality models that governments can use to assess the quality of their service bundles. Directions for future research and limitations of the present study are provided as well.

Relevance:

70.00%

Publisher:

Abstract:

Process variability in pollutant build-up and wash-off generates inherent uncertainty that affects the outcomes of stormwater quality models. Poor characterisation of process variability constrains accurate accounting of the uncertainty associated with pollutant processes, which is a significant limitation to effective decision making in relation to stormwater pollution mitigation. The study developed three theoretical scenarios, based on research findings that variations in the particle size fractions <150µm and >150µm during pollutant build-up and wash-off primarily determine the variability associated with these processes. These scenarios, which combine pollutant build-up and wash-off processes that take place on a continuous timeline, are able to explain process variability under different field conditions. Given the variability characteristics of a specific build-up or wash-off event, the theoretical scenarios help to infer the variability characteristics of the associated pollutant process that follows. Mathematical formulation of the theoretical scenarios enables the incorporation of the variability characteristics of pollutant build-up and wash-off processes in stormwater quality models. The research outcomes will contribute to the quantitative assessment of uncertainty as an integral part of the interpretation of stormwater quality modelling outcomes.
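The abstract does not give the functional forms behind the three scenarios, but a minimal sketch of the idea, assuming the commonly used exponential build-up and wash-off equations applied separately to the <150µm and >150µm fractions (all parameter values and the fraction split are illustrative assumptions, not the study's), might look like this:

```python
import numpy as np

# Minimal sketch (not the paper's formulation): exponential build-up and
# wash-off applied separately to the <150 µm and >150 µm particle fractions.
# All parameter values below are illustrative assumptions.

def build_up(t_days, b_max, k_b):
    """Asymptotic (exponential) build-up load (g/m^2) after t_days of dry weather."""
    return b_max * (1.0 - np.exp(-k_b * t_days))

def wash_off(b0, rain_intensity, duration_hr, k_w):
    """Exponential wash-off of the initially available load b0 during a storm."""
    return b0 * (1.0 - np.exp(-k_w * rain_intensity * duration_hr))

# Scenario: 7 dry days of build-up followed by a 1 h, 20 mm/h storm event.
fractions = {"<150um": dict(b_max=1.2, k_b=0.4, k_w=0.08),
             ">150um": dict(b_max=0.8, k_b=0.4, k_w=0.03)}

for name, p in fractions.items():
    b0 = build_up(7, p["b_max"], p["k_b"])
    w = wash_off(b0, rain_intensity=20.0, duration_hr=1.0, k_w=p["k_w"])
    print(f"{name}: build-up {b0:.2f} g/m2, washed off {w:.2f} g/m2 "
          f"({100 * w / b0:.0f}%), remaining {b0 - w:.2f} g/m2")
```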

Relevance:

60.00%

Publisher:

Abstract:

Stormwater quality modelling results are subject to uncertainty. The variability of input parameters is an important source of overall model error. An in-depth understanding of the variability associated with input parameters can provide knowledge of the uncertainty associated with these parameters and consequently assist in the uncertainty analysis of stormwater quality models and in decision making based on modelling outcomes. This paper discusses the outcomes of a research study undertaken to analyse the variability of pollutant build-up parameters in stormwater quality modelling. The study was based on the analysis of pollutant build-up samples collected from 12 road surfaces in residential, commercial and industrial land uses. It was found that build-up characteristics vary appreciably even within the same land use; using land use as a lumped parameter would therefore introduce significant uncertainty into stormwater quality modelling. Additionally, the variability in pollutant build-up can be significant depending on the pollutant type. This underlines the importance of taking into account specific land use characteristics and targeted pollutant species when undertaking uncertainty analysis of stormwater quality models or when interpreting modelling outcomes.
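As an illustration of the kind of variability analysis described, a minimal sketch that summarises within-land-use variability of build-up loads with a coefficient of variation could look like the following; the load values are invented for illustration and are not the study's data:

```python
import numpy as np

# Illustrative sketch only: the build-up loads below are invented to show how
# within-land-use variability might be summarised; they are not the study's data.
build_up_loads = {                      # g/m^2 per sampled road surface
    "residential": [0.9, 1.4, 2.1, 0.7],
    "commercial":  [1.8, 3.2, 1.1, 2.6],
    "industrial":  [2.4, 4.9, 1.9, 3.8],
}

for land_use, loads in build_up_loads.items():
    loads = np.asarray(loads)
    cv = loads.std(ddof=1) / loads.mean()   # coefficient of variation
    print(f"{land_use:12s} mean={loads.mean():.2f} g/m2  CV={cv:.2f}")

# A large CV within a single land use suggests that treating land use as a
# lumped model parameter hides appreciable site-to-site uncertainty.
```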

Relevance:

60.00%

Publisher:

Abstract:

Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and this form of uncertainty is often deemed largely unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The results confirmed that WLSR, which assumes probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly used in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes primarily influenced by surface characteristics and rainfall intensity.
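A minimal sketch of the OLSR versus WLSR contrast on synthetic heteroscedastic wash-off data is given below; the Bayesian/Gibbs sampling layer used in the study is not reproduced, and the weighting scheme and synthetic data are illustrative assumptions:

```python
import numpy as np
import statsmodels.api as sm

# Sketch contrasting OLS and WLS fits of a wash-off regression on synthetic data.
rng = np.random.default_rng(1)
rain = rng.uniform(5, 60, 40)                  # rainfall intensity (mm/h), synthetic
true_washoff = 0.05 * rain
# Heteroscedastic noise: higher-intensity events are measured less precisely.
washoff = true_washoff + rng.normal(0, 0.002 * rain, size=rain.size)

X = sm.add_constant(rain)
ols = sm.OLS(washoff, X).fit()
wls = sm.WLS(washoff, X, weights=1.0 / rain**2).fit()   # weights ~ 1/variance

print("OLS slope: %.4f +/- %.4f" % (ols.params[1], ols.bse[1]))
print("WLS slope: %.4f +/- %.4f" % (wls.params[1], wls.bse[1]))
# WLS down-weights the noisier observations, giving uncertainty estimates that
# better reflect the unequal measurement precision.
```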

Relevance:

60.00%

Publisher:

Abstract:

Variability in the pollutant wash-off process is a concept which needs to be understood in depth in order to better assess the outcomes of stormwater quality models, and thereby strengthen stormwater pollution mitigation strategies. Current knowledge about the wash-off process does not extend to a clear understanding of the influence of the initially available pollutant build-up on the variability of the pollutant wash-off load and composition. Consequently, pollutant wash-off process variability is poorly characterised in stormwater quality models, which can result in inaccurate stormwater quality predictions. Mathematical simulation of particulate wash-off from three urban road surfaces confirmed that the wash-off loads of the particle size fractions <150µm and >150µm after a storm event vary with the build-up of the respective particle size fractions available at the beginning of the event. Furthermore, the pollutant load and composition associated with the initially available build-up of <150µm particles predominantly influence the variability in washed-off pollutant load and composition. The influence of the build-up of pollutants associated with >150µm particles on wash-off process variability is significant only for relatively short-duration storm events.

Relevance:

60.00%

Publisher:

Abstract:

Assessing build-up and wash-off process uncertainty is important for accurate interpretation of model outcomes and facilitates informed decision making for developing effective stormwater pollution mitigation strategies. Uncertainty inherent in pollutant build-up and wash-off processes influences the variations in pollutant loads entrained in stormwater runoff from urban catchments. However, build-up and wash-off predictions from stormwater quality models do not adequately represent such variations due to poor characterisation of the variability of these processes in mathematical models. Changing the mathematical form of current models to incorporate process variability facilitates accounting for process uncertainty without significantly affecting model prediction performance. Moreover, the investigation of uncertainty propagation from build-up to wash-off confirmed that uncertainty in the build-up process significantly influences wash-off process uncertainty. Specifically, the behaviour of particles <150µm during build-up primarily influences uncertainty propagation, resulting in appreciable variations in pollutant load and composition during a wash-off event.
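One way to picture the uncertainty propagation described is a simple Monte Carlo sketch, assuming lognormal variability in the initial build-up of each particle fraction and an exponential wash-off step; all distributions and parameter values are illustrative assumptions rather than the study's calibrated models:

```python
import numpy as np

# Sketch of uncertainty propagation by Monte Carlo: variability in the initial
# build-up (especially the <150 µm fraction) is propagated through a simple
# exponential wash-off step. All distributions and parameters are illustrative.
rng = np.random.default_rng(0)
n = 10_000

# Uncertain initial build-up loads (g/m^2) for the two particle fractions.
b_fine   = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)   # <150 µm
b_coarse = rng.lognormal(mean=np.log(0.7), sigma=0.3, size=n)   # >150 µm

def wash_off(b0, k, intensity=20.0, duration=1.0):
    return b0 * (1.0 - np.exp(-k * intensity * duration))

w_total = wash_off(b_fine, k=0.08) + wash_off(b_coarse, k=0.03)
fine_share = wash_off(b_fine, k=0.08) / w_total

print(f"wash-off load: median {np.median(w_total):.2f} g/m2, "
      f"90% interval [{np.percentile(w_total, 5):.2f}, {np.percentile(w_total, 95):.2f}]")
print(f"<150 um share of load: median {np.median(fine_share):.2f}")
```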

Relevance:

40.00%

Publisher:

Abstract:

Process modeling can be regarded as the currently most popular form of conceptual modeling. Research evidence illustrates how process modeling is applied across the different information system life cycle phases for a range of applications, such as configuration of Enterprise Systems, workflow management, or software development. However, a detailed discussion of the critical factors determining the quality of process models is still missing. This paper proposes a framework consisting of six quality factors, derived from a comprehensive literature review. It then presents a case study of a utility provider that had designed various business process models for the selection of an Enterprise System. The paper summarizes potential means of conducting a successful process modeling initiative and evaluates the described modeling approach within the Guidelines of Modeling (GoM) framework. An outlook summarizes the lessons learnt and concludes with insights into the next phases of this study.

Relevance:

40.00%

Publisher:

Abstract:

The indoor air quality (IAQ) in buildings is currently assessed by measuring pollutants during building operation and comparing the results with air quality standards. Current practice at the design stage tries to minimise the potential indoor air quality impacts of new building materials and contents by selecting low-emission materials. However, low-emission materials are not always available, and even when they are used, the aggregated pollutant concentrations from such materials are generally overlooked. This paper presents an innovative tool for estimating indoor air pollutant concentrations at the design stage, based on emissions over time from large-area building materials, furniture and office equipment. The estimator considers volatile organic compounds, formaldehyde and airborne particles from indoor materials and office equipment, together with the contribution of outdoor urban air pollutants as affected by urban location and ventilation system filtration. The estimated pollutant concentrations are for a single, fully mixed and ventilated zone in an office building, with acceptable levels derived from Australian and international health-based standards. The model acquires its dimensional data for the indoor spaces from a 3D CAD model via IFC files and its emission data from a building products/contents emissions database. This paper describes the underlying approach to estimating indoor air quality and discusses the benefits of such an approach for designers and the occupants of buildings.
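A minimal sketch of the underlying single-zone, fully mixed, steady-state mass balance is shown below; the emission rates, air-change rate and filter efficiency are illustrative assumptions, whereas the tool described above obtains these inputs from a 3D CAD model (via IFC) and an emissions database:

```python
# Minimal steady-state mass-balance sketch for a single, fully mixed,
# ventilated zone. All numeric values are illustrative assumptions.

volume_m3 = 300.0          # zone volume
ach = 2.0                  # air changes per hour
q = ach * volume_m3        # ventilation flow (m^3/h)
filter_eff = 0.6           # fraction of outdoor particles removed by filtration

# Emission rates summed over materials/equipment (ug/h) and outdoor levels (ug/m^3).
emissions_ug_h = {"TVOC": 8000.0, "formaldehyde": 1500.0, "PM2.5": 600.0}
outdoor_ug_m3 = {"TVOC": 50.0, "formaldehyde": 2.0, "PM2.5": 15.0}
filtered = {"PM2.5"}       # only particles are removed by the supply-air filter

for pollutant, e in emissions_ug_h.items():
    c_out = outdoor_ug_m3[pollutant]
    penetration = (1.0 - filter_eff) if pollutant in filtered else 1.0
    # Steady state: supply of outdoor pollutant + indoor emission = removal by ventilation.
    c_indoor = (e + q * penetration * c_out) / q
    print(f"{pollutant:12s} estimated indoor concentration: {c_indoor:.1f} ug/m3")
```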

Relevance:

40.00%

Publisher:

Abstract:

A pragmatic method is proposed for assessing the accuracy and precision of a given processing pipeline for converting computed tomography (CT) image data of bones into representative three-dimensional (3D) models of bone shapes. The method is based on co-processing a control object with known geometry, which enables the assessment of the quality of the resulting 3D models. At three stages of the conversion process, distance measurements were obtained and statistically evaluated. For this study, 31 CT datasets were processed. The final 3D model of the control object showed an average deviation from reference values of −1.07±0.52 mm standard deviation (SD) for edge distances and −0.647±0.43 mm SD for parallel side distances of the control object. Co-processing a reference object enables the assessment of the accuracy and precision of a given processing pipeline for creating CT-based 3D bone models and is suitable for detecting most systematic or human errors when processing a CT scan. Typical errors are about the same size as the scan resolution.
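A minimal sketch of how deviations from the control object's reference dimensions could be summarised into accuracy (mean deviation) and precision (SD) figures is shown below; the measurement values are invented for illustration:

```python
import numpy as np

# Sketch: summarising accuracy (mean deviation) and precision (SD) of distance
# measurements taken on the control object at the end of the processing
# pipeline. The measured values below are hypothetical.
reference_edge_mm = 50.0
measured_edge_mm = np.array([48.8, 49.1, 48.7, 49.3, 48.9])

deviations = measured_edge_mm - reference_edge_mm
print(f"edge distances: mean deviation {deviations.mean():.2f} mm, "
      f"SD {deviations.std(ddof=1):.2f} mm")
# A mean deviation of the order of the scan resolution, with a small SD,
# indicates the pipeline introduces no gross systematic or human error.
```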

Relevance:

40.00%

Publisher:

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is a key factor for an organization and even for government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of this work appears to have evolved independently of the development of industrial statistical process control methods. Analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors therefore presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses.

This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flows; then a data-capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle. After ensuring clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor the radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time.

A Bayesian approach is also proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. The approach yields highly informative estimates of change point parameters because the results are based on probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared with a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The Bayesian change point estimators are then developed further for healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts; variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention monitored by risk-adjusted survival time control charts; in this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is reinforced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in this general quality control context may also be extended to the industrial and business domains where quality monitoring was initially developed.
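As a simplified illustration of Bayesian change point estimation for a step change in a Poisson process, the sketch below computes the posterior over the change point on a grid using the closed-form Gamma-Poisson marginal likelihood; the thesis develops richer hierarchical MCMC (Gibbs) estimators, and the counts, priors and parameters here are simulated assumptions:

```python
import numpy as np
from scipy.special import gammaln

# Sketch of Bayesian change-point estimation for a step change in a Poisson
# rate (e.g., monthly counts of an adverse event). With Gamma(a, b) priors on
# the rates before and after the change, the marginal likelihood of each
# candidate change point tau is available in closed form, so the posterior over
# tau can be computed on a grid. Data are simulated; a step change is placed at t=30.
rng = np.random.default_rng(42)
counts = np.concatenate([rng.poisson(4.0, 30), rng.poisson(7.0, 20)])
a, b = 1.0, 0.25                                   # Gamma prior hyperparameters

def log_marginal(y):
    """Log Gamma-Poisson marginal likelihood of a count segment (up to a constant)."""
    s, n = y.sum(), y.size
    return gammaln(a + s) - (a + s) * np.log(b + n)

taus = np.arange(1, counts.size)                   # candidate change points
log_post = np.array([log_marginal(counts[:t]) + log_marginal(counts[t:]) for t in taus])
post = np.exp(log_post - log_post.max())
post /= post.sum()                                 # posterior over tau (uniform prior)

print("posterior mode of change point:", taus[post.argmax()])
print("size of 95% highest-posterior set:", (np.sort(post)[::-1].cumsum() < 0.95).sum() + 1)
```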

Relevance:

40.00%

Publisher:

Abstract:

In today's business world, where competition among companies is intense, quality in manufacturing and in providing products and services can be considered a means by which companies seek excellence and success in this competitive arena. With the advent of e-commerce and the emergence of new production systems and new organizational structures, traditional management and quality assurance systems have been challenged. Consequently, the quality information system has gained a special place as one of the new tools of quality management. In this paper, the quality information system is studied through a review of the quality information system literature, and the role and position of the Quality Information System (QIS) among the other information systems of an organization is investigated. Existing quality information system models are analyzed and assessed, and on that basis a conceptual, hierarchical model of the quality information system is proposed and studied. As a case study, the hierarchical model is developed by evaluating hierarchical models presented in the quality information system field, based on Shetabkar Co.