942 results for Quality models


Relevance:

30.00%

Abstract:

BACKGROUND: The respiratory tract is a major target of exposure to air pollutants, and respiratory diseases are associated with both short- and long-term exposures. We hypothesized that improved air quality in North Carolina was associated with reduced rates of death from respiratory diseases in local populations. MATERIALS AND METHODS: We analyzed trends in emphysema, asthma, and pneumonia mortality and changes in the levels of ozone, sulfur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), and particulate matter (PM2.5 and PM10), using monthly measurements from air-monitoring stations in North Carolina in 1993-2010. A log-linear model was used to evaluate associations between air-pollutant levels and age-adjusted death rates (per 100,000 of population) calculated for 5-year age groups and for the standard 2000 North Carolina population. The associations were adjusted for age group-specific smoking prevalence and for seasonal fluctuations in disease-specific respiratory deaths. RESULTS: The decline in emphysema deaths was associated with decreasing levels of SO2 and CO in the air, the decline in asthma deaths with lower SO2, CO, and PM10 levels, and the decline in pneumonia deaths with lower levels of SO2. Sensitivity analyses were performed to study the potential effects of the change from International Classification of Diseases (ICD)-9 to ICD-10 codes, the effects of air pollutants on mortality during summer and winter, the impact of using only the underlying causes of death, and of analyzing mortality and air-quality data at the county level. In each case, the results of the sensitivity analyses were stable. The importance of analyzing pneumonia as an underlying cause of death was also highlighted. CONCLUSION: Significant associations were observed between decreasing death rates from emphysema, asthma, and pneumonia and decreases in levels of ambient air pollutants in North Carolina.
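As a rough illustration of the log-linear approach described above, the sketch below regresses log monthly age-adjusted death rates on pollutant levels with smoking-prevalence and seasonal adjustment. The file name, column names and data are hypothetical; this is not the authors' code.

```python
# Hedged sketch: log-linear regression of monthly age-adjusted death rates on
# ambient pollutant levels, adjusted for smoking prevalence and season.
# Column names (rate, so2, co, pm10, smoking_prev, month) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nc_monthly_mortality_air_quality.csv")  # hypothetical input file
df["log_rate"] = np.log(df["rate"])  # age-adjusted deaths per 100,000

# Month dummies absorb seasonal fluctuation; pollutants enter linearly.
model = smf.ols("log_rate ~ so2 + co + pm10 + smoking_prev + C(month)", data=df)
result = model.fit()
print(result.summary())  # each coefficient approximates the relative change in rate per unit of pollutant
```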

Relevance:

30.00%

Abstract:

BACKGROUND: Anticoagulation can reduce quality of life, and different models of anticoagulation management might have different impacts on satisfaction with this component of medical care. Yet, to our knowledge, there are no scales measuring quality of life and satisfaction with anticoagulation that can be generalized across different models of anticoagulation management. We describe the development and preliminary validation of such an instrument - the Duke Anticoagulation Satisfaction Scale (DASS). METHODS: The DASS is a 25-item scale addressing (a) the negative impacts of anticoagulation (limitations, hassles and burdens); and (b) the positive impacts of anticoagulation (confidence, reassurance, satisfaction). Each item has 7 possible responses. The DASS was administered to 262 patients currently receiving oral anticoagulation. Scales measuring generic quality of life, satisfaction with medical care, and tendency to provide socially desirable responses were also administered. Statistical analysis included assessment of item variability, internal consistency (Cronbach's alpha), scale structure (factor analysis), and correlations between the DASS and demographic variables, clinical characteristics, and scores on the above scales. A follow-up study of 105 additional patients assessed test-retest reliability. RESULTS: 220 subjects answered all items. Ceiling and floor effects were modest, and 25 of the 27 proposed items grouped into 2 factors (positive impacts and negative impacts, with the latter potentially subdivided into limitations versus hassles and burdens). Each factor had a high degree of internal consistency (Cronbach's alpha 0.78-0.91). The limitations and hassles factors consistently correlated with the SF-36 scales measuring generic quality of life, while the positive psychological impact scale correlated with age and time on anticoagulation. The intra-class correlation coefficient for test-retest reliability was 0.80. CONCLUSIONS: The DASS has demonstrated reasonable psychometric properties to date. Further validation is ongoing. To the degree that dissatisfaction with anticoagulation leads to decreased adherence, poorer INR control, and poor clinical outcomes, the DASS has the potential to help identify reasons for dissatisfaction (and positive satisfaction), and thus to help develop interventions to break this cycle. As an instrument designed to be applicable across multiple models of anticoagulation management, the DASS could be crucial in the scientific comparison of those models of care.
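For context, a minimal sketch of one of the psychometric checks named above, Cronbach's alpha, computed here on simulated 7-point item responses rather than the DASS data:

```python
# Hedged sketch: Cronbach's alpha for a block of Likert-type items (responses 1-7).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of numeric responses."""
    k = items.shape[1]
    item_variance_sum = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_variance = items.sum(axis=1).var(ddof=1)       # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_variance_sum / total_variance)

rng = np.random.default_rng(0)
simulated = rng.integers(1, 8, size=(220, 9)).astype(float)  # 220 respondents, 9 items
print(round(cronbach_alpha(simulated), 3))  # near zero for independent random items; higher when items correlate
```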

Relevance:

30.00%

Abstract:

Mathematical models of straight-grate pellet induration processes have been developed and carefully validated by a number of workers over the past two decades. However, the subsequent exploitation of these models in process optimization is less well documented, and it requires a sound understanding of how the key factors control the operation. In this article, we show how a thermokinetic model of pellet induration, validated against operating data from one of the Iron Ore Company of Canada (IOCC) lines in Canada, can be exploited in process optimization from the perspective of fuel efficiency, production rate, and product quality. Most existing processes are restricted in the options available for process optimization. Here, we review the role of each of the drying (D), preheating (PH), firing (F), after-firing (AF), and cooling (C) phases of the induration process. We then use the induration process model to evaluate whether the first drying zone is best operated with an up-draft or a down-draft gas-flow stream, and we optimize the on-gas temperature profile in the hood of the PH, F, and AF zones to reduce the burner fuel by at least 10 pct over the long term. Finally, we consider how efficient and flexible the process could be if some of the structural constraints were removed (i.e., addressed at the design stage). The analysis suggests it should be possible to reduce the burner fuel load by 35 pct, easily increase production by 5+ pct, and improve pellet quality.

Relevance:

30.00%

Abstract:

Most air quality modelling work to date has been oriented towards deterministic simulations of ambient pollutant concentrations. This traditional approach, which is based on the use of one selected model and one data set of discrete input values, does not reflect the uncertainties due to errors in model formulation and input data. Given the complexities of urban environments and the inherent limitations of mathematical modelling, it is unlikely that a single model based on routinely available meteorological and emission data will give satisfactory short-term predictions. In this study, different methods involving the use of more than one dispersion model, in association with different emission simulation methodologies and meteorological data sets, were explored for deriving best estimates of CO and benzene concentrations, together with related confidence bounds. The different approaches were tested using experimental data obtained during intensive monitoring campaigns in busy street canyons in Paris, France. Three relatively simple dispersion models (STREET, OSPM and AEOLIUS) that are likely to be used for regulatory purposes were selected for this application. A sensitivity analysis was conducted in order to identify internal model parameters that might significantly affect results. Finally, a probabilistic methodology for assessing urban air quality was proposed.
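A minimal sketch of the multi-model idea outlined above: pooling predictions from several dispersion model and input-data combinations into a best estimate with empirical confidence bounds. The parametrisations below are placeholder stand-ins for STREET, OSPM and AEOLIUS, not the actual codes, and all numbers are illustrative.

```python
# Hedged sketch: ensemble of model/emission/meteorology combinations giving a
# best concentration estimate and confidence bounds.
import numpy as np

street_like  = lambda q, u: 0.9 * q / max(u, 0.5)   # stand-in for STREET
ospm_like    = lambda q, u: 1.1 * q / max(u, 0.5)   # stand-in for OSPM
aeolius_like = lambda q, u: 1.0 * q / max(u, 0.5)   # stand-in for AEOLIUS

emission_rates = [10.0, 12.0]   # two emission simulation methodologies (illustrative units)
wind_speeds = [2.0, 3.5]        # two meteorological data sets (wind speed, m/s)

predictions = np.array([model(q, u)
                        for model in (street_like, ospm_like, aeolius_like)
                        for q in emission_rates
                        for u in wind_speeds])

best = np.median(predictions)
lower, upper = np.percentile(predictions, [5, 95])
print(f"best estimate {best:.2f}, 90% bounds [{lower:.2f}, {upper:.2f}]")
```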

Relevance:

30.00%

Abstract:

High pollution levels have been often observed in urban street canyons due to the increased traffic emissions and reduced natural ventilation. Microscale dispersion models with different levels of complexity may be used to assess urban air quality and support decision-making for pollution control strategies and traffic planning. Mathematical models calculate pollutant concentrations by solving either analytically a simplified set of parametric equations or numerically a set of differential equations that describe in detail wind flow and pollutant dispersion. Street canyon models, which might also include simplified photochemistry and particle deposition–resuspension algorithms, are often nested within larger-scale urban dispersion codes. Reduced-scale physical models in wind tunnels may also be used for investigating atmospheric processes within urban canyons and validating mathematical models. A range of monitoring techniques is used to measure pollutant concentrations in urban streets. Point measurement methods (continuous monitoring, passive and active pre-concentration sampling, grab sampling) are available for gaseous pollutants. A number of sampling techniques (mainly based on filtration and impaction) can be used to obtain mass concentration, size distribution and chemical composition of particles. A combination of different sampling/monitoring techniques is often adopted in experimental studies. Relatively simple mathematical models have usually been used in association with field measurements to obtain and interpret time series of pollutant concentrations at a limited number of receptor locations in street canyons. On the other hand, advanced numerical codes have often been applied in combination with wind tunnel and/or field data to simulate small-scale dispersion within the urban canopy.

Relevance:

30.00%

Abstract:

A study of the external, loaded and unloaded quality factors of frequency selective surfaces (FSSs) is presented. The study is focused on THz frequencies between 5 and 30 THz, where ohmic losses arising from the conductors become important. The influence of material properties, such as metal thickness, conductivity dispersion and surface roughness, is investigated. An equivalent circuit that models the FSS in the presence of ohmic losses is introduced and validated by means of full-wave results. Using both full-wave methods and a circuit model, the reactive energy stored in the vicinity of the FSS at resonance under plane-wave incidence is presented. A study of a doubly periodic array of aluminium strips reveals that the reactive power stored at resonance increases rapidly with increasing periodicity. Moreover, it is demonstrated that arrays with larger periodicity, and therefore less metallisation per unit area, exhibit stronger thermal absorption. Despite this absorption, arrays with higher periodicities produce higher unloaded quality factors. Finally, experimental results for a fabricated prototype operating at 14 THz are presented.
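For reference, the external, loaded and unloaded quality factors discussed above are linked by the standard resonator relations (stated here for context, in our own notation; not reproduced from the paper):

\[
\frac{1}{Q_\mathrm{L}} \;=\; \frac{1}{Q_\mathrm{U}} + \frac{1}{Q_\mathrm{ext}},
\qquad
Q_\mathrm{U} \;=\; \omega_0\,\frac{W_\mathrm{stored}}{P_\mathrm{ohmic}},
\]

where \(W_\mathrm{stored}\) is the reactive energy stored in the vicinity of the FSS at resonance and \(P_\mathrm{ohmic}\) is the power dissipated in the conductors, so larger stored energy for the same loss corresponds to a higher unloaded quality factor.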

Relevance:

30.00%

Abstract:

Electricity systems models are software tools used to manage electricity demand and the electricity system, to trade electricity, and for generation expansion planning. Various portfolios and scenarios are modelled in order to compare the effects of decision making in policy and in business development plans on electricity systems, so as to best advise governments and industry on the least-cost economic and environmental approach to electricity supply, while maintaining a secure supply of sufficient quality. The modelling techniques developed to study vertically integrated state monopolies are now applied in liberalised markets, where the issues and constraints are more complex. This paper reviews the changing role of electricity systems modelling in a strategic manner, focussing on the modelling response to key developments: the move away from monopoly towards liberalised market regimes and the increasing complexity brought about by policy targets for renewable energy and emissions. The paper provides an overview of electricity systems modelling techniques, discusses a number of key proprietary electricity systems models used in the USA and Europe, and provides the electricity analyst with an information resource, not currently readily available in the literature, on the choice of model for investigating different aspects of the electricity system.

Relevance:

30.00%

Abstract:

Background: Health care organisations worldwide are faced with the need to develop and implement strategic organisational plans to meet the challenges of modern health care. There is a need for models for developing, implementing and evaluating strategic plans that engage practitioners, and make a measurable difference to the patients that they serve. These presentations describe the development, implementation and evaluation of such a model by a team of senior nurses and practice developers, to underpin a strategy for nursing and midwifery in an acute hospital trust. Developing a strategy: the PARIHS (Promoting Action on Research Implementation in Health Services) conceptual framework (Kitson et al, 1998) proposes that successful implementation of change in practice is a function of the interplay of three core elements: the level of evidence supporting the proposed change; the context or environment in which the change takes place; and the way in which change is facilitated. We chose to draw on this framework to develop our strategy and implementation plan (O'Halloran, Martin and Connolly, 2005). At the centre of the plan are ward managers. These professionals provide leadership for the majority of staff in the trust and so were seen to be a key group in the implementation process.

Relevance:

30.00%

Abstract:

This study uses a discrete choice experiment (DCE) to elicit willingness-to-pay estimates for changes in the water quality of three rivers. Like many regions, the metropolitan region of Berlin-Brandenburg struggles to achieve the objectives of the Water Framework Directive by 2015. A major problem is the high load of nutrients. As the region spans two states (Länder) and the river sections run throughout the whole region, we account for the spatial context in two ways. Firstly, we incorporate the distance between each respondent and all river stretches in all MNL and RPL models, and, secondly, we consider whether respondents reside in the state of Berlin or Brandenburg. The compensating variation (CV) calculated for various scenarios shows that overall people would benefit significantly from improved water quality. The CV measures, however, also reveal that not considering the spatial context would result in severely biased welfare measures. While the distance-decay effect lowers CV, state residency is connected to the frequency of status quo choices, and not accounting for residency would underestimate possible welfare gains in one state. Another finding is that the extent of the market varies with respect to attributes (river stretches) and attribute levels (water quality levels).
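A minimal sketch of the kind of welfare calculation described above: compensating variation for a water-quality scenario computed from MNL coefficients with a distance-decay interaction. All coefficient values, distances and quality levels are hypothetical illustrations, not the study's estimates.

```python
# Hedged sketch: compensating variation for a water-quality improvement scenario.
beta_quality = 0.8      # utility per one-step water-quality improvement on a stretch
beta_distance = -0.02   # decay of that utility per km between respondent and stretch
beta_cost = -0.05       # marginal disutility of cost (per EUR)

def scenario_utility(quality_levels, distances_km):
    """Utility of a state of the world: quality level on each river stretch,
    weighted by how far the respondent lives from that stretch."""
    return sum((beta_quality + beta_distance * d) * q
               for q, d in zip(quality_levels, distances_km))

distances = [3.0, 12.0, 25.0]   # respondent's distance to the three stretches (km)
v_status_quo = scenario_utility([1, 1, 1], distances)
v_improved = scenario_utility([2, 3, 2], distances)

# Compensating variation: the utility gain converted to money via the cost coefficient.
cv = (v_improved - v_status_quo) / (-beta_cost)
print(f"CV ≈ {cv:.2f} EUR")
```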

Relevance:

30.00%

Abstract:

This paper will consider the inter-relationship of a number of overlapping disciplinary theoretical concepts relevant to a strengths-based orientation, including well-being, salutogenesis, sense of coherence, quality of life and resilience. Psychological trauma will be referenced and the current evidence base for interventions with children and young people outlined and critiqued. The relational impact of trauma on family relationships is emphasised, providing a rationale for systemic psychotherapeutic interventions as part of a holistic approach to managing the effects of trauma. The congruence between second-order systemic psychotherapy models and a strengths-based philosophy is noted, with particular reference to solution-focused brief therapy and narrative therapy, and is illustrated via a description of the process of helping someone move from a victim position to a survivor identity using solution-focused brief therapy, and through a case example applying a narrative therapy approach to a teenage boy who suffered a serious assault. The benefits of a strengths-based approach to psychological trauma for clients and therapists will be summarised and a number of potential pitfalls articulated.

Relevance:

30.00%

Abstract:

Polymer extrusion, in which a polymer is melted and conveyed to a mould or die, forms the basis of most polymer processing techniques. Extruders frequently run at non-optimised conditions and can account for 15-20% of overall process energy losses. At a time of increasing emphasis on energy efficiency, such losses are a major concern for the industry. Product quality depends on the homogeneity and stability of the melt flow, which in turn depend on melt temperature and screw speed, and is also a concern for processors. Gear pumps can be used to improve the stability of the production line, but the cost is usually high. Likewise, it is possible to introduce energy meters, but they also add to the capital cost of the machine. Advanced control incorporating soft-sensing capabilities offers this industry opportunities to improve both quality and energy efficiency. Due to strong correlations between the critical variables, such as melt temperature and melt pressure, traditional decentralized PID (Proportional-Integral-Derivative) control is incapable of handling such processes if stricter product specifications are imposed or the material is changed from one batch to another. In this paper, new real-time energy monitoring methods are introduced without the need to install power meters or develop data-driven models. The effects of process settings on energy efficiency and melt quality are then studied based on the developed monitoring methods. Process variables include barrel heating temperature, water cooling temperature, and screw speed. Finally, a fuzzy logic controller is developed for a single screw extruder to achieve high melt quality. The performance of the developed controller has shown it to be a satisfactory alternative to the expensive gear pump. The energy efficiency of the extruder can be further improved by optimising the temperature settings. Experimental results from open-loop control and fuzzy control on a Killion 25 mm single screw extruder are presented to confirm the efficacy of the proposed approach.
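As a rough illustration of the control idea mentioned above, the sketch below implements a tiny fuzzy controller mapping melt-temperature error to a screw-speed adjustment. The membership functions, rule base and output values are illustrative assumptions, not the controller developed in the paper.

```python
# Hedged sketch: minimal fuzzy controller for a single screw extruder.
# temp_error = setpoint minus measured melt temperature (degrees C).
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def screw_speed_adjustment(temp_error):
    # Fuzzify: melt too hot (negative error), on target, or too cold (positive error).
    mu = {
        "too_hot":   tri(temp_error, -20.0, -10.0, 0.0),
        "on_target": tri(temp_error, -5.0, 0.0, 5.0),
        "too_cold":  tri(temp_error, 0.0, 10.0, 20.0),
    }
    # Rule base: more shear heating when the melt is too cold, less when too hot.
    consequents = {"too_hot": -5.0, "on_target": 0.0, "too_cold": +5.0}  # rpm
    weight_sum = sum(mu.values())
    if weight_sum == 0.0:
        return 0.0
    # Weighted-average defuzzification.
    return sum(mu[k] * consequents[k] for k in mu) / weight_sum

print(screw_speed_adjustment(7.5))   # melt cooler than setpoint -> raise screw speed
```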

Relevance:

30.00%

Abstract:

Quality Management and Managerialism in Healthcare creates a comprehensive and systematic international survey of various perspectives on healthcare quality management, together with some of their most pertinent critiques. Chapter one starts with a general discussion of the factors that drove the introduction of management paradigms into public sector and health management contexts in the mid-to-late 1980s. Chapter two explores the rise of risk awareness in medicine, which, prior to the 1980s, stood largely in isolation from the implementation of managerial performance targets. Chapter three investigates the widespread adoption of performance management and clinical governance frameworks during the 1980s and 1990s. This is followed by Chapters four and five, which examine systems-based models of patient safety and the evidence-based medicine movement as exemplars of managerial perspectives on healthcare quality. Chapter six discusses potential future avenues for the development of alternative perspectives on quality of care which emphasise workforce involvement. The book concludes by reviewing the factors which have underpinned the managerialist trajectory of healthcare management over the past decades and explores the potential impact of nascent technologies such as 'connected health' and 'telehealth' on future developments.

Relevance:

30.00%

Abstract:

The area of mortality modelling has received significant attention over the last 20 years owing to the need to quantify and forecast improving mortality rates. This need is driven primarily by the concern of governments, insurance and actuarial professionals, and individuals to be able to fund old age. In particular, to quantify the costs of increasing longevity we need suitable models of mortality rates that capture the dynamics of the data and forecast them with sufficient accuracy to be useful. In this paper we test several of those models by considering fitting quality and, in particular, by testing the residuals of those models for normality. In a wide-ranging study covering 30 countries we find that almost exclusively the residuals do not demonstrate normality. Further, in Hurst tests of the residuals we find evidence that structure remains that is not captured by the models.
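A minimal sketch of the residual diagnostics named above, a normality test and a crude rescaled-range (Hurst) estimate, run here on simulated residuals rather than fitted mortality-model residuals:

```python
# Hedged sketch: normality and Hurst diagnostics on a residual series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
residuals = rng.standard_t(df=4, size=500)   # heavy-tailed, i.e. non-normal by construction

# Jarque-Bera test: rejects normality when skewness/kurtosis deviate from Gaussian values.
jb_stat, jb_p = stats.jarque_bera(residuals)
print(f"Jarque-Bera p-value: {jb_p:.4f}")

def hurst_rs(x):
    """Crude rescaled-range Hurst estimate; H near 0.5 suggests no long-range structure."""
    n = len(x)
    block_sizes = [n // k for k in (1, 2, 4, 8, 16)]
    rs_values, sizes = [], []
    for m in block_sizes:
        ratios = []
        for i in range(0, n - m + 1, m):
            block = x[i:i + m]
            z = np.cumsum(block - block.mean())
            spread, scale = z.max() - z.min(), block.std(ddof=1)
            if scale > 0:
                ratios.append(spread / scale)
        rs_values.append(np.mean(ratios))
        sizes.append(m)
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)
    return slope

print(f"Hurst estimate: {hurst_rs(residuals):.2f}")
```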

Relevance:

30.00%

Abstract:

Since the earliest days of cystic fibrosis (CF) treatment, patient data have been recorded and reviewed in order to identify the factors that lead to more favourable outcomes. Large data repositories, such as the US Cystic Fibrosis Registry, which was established in the 1960s, enabled successful treatments and patient outcomes to be recognized and improvement programmes to be implemented in specialist CF centres. Over the past decades, the greater volumes of data becoming available through Centre databases and patient registries led to the possibility of making comparisons between different therapies, approaches to care and indeed data recording. The quality of care for individuals with CF has become a focus at several levels: patient, centre, regional, national and international. This paper reviews the quality management and improvement issues at each of these levels with particular reference to indicators of health, the role of CF Centres, regional networks, national health policy, and international data registration and comparisons. 

Relevance:

30.00%

Abstract:

A new technological approach to the analysis and forensic interpretation of Total Hydrocarbons in soils and waters, using a two-dimensional gas chromatography (GC-GC) method, was developed alongside environmental forensic and assessment models to provide better products for customers in the environmental industry.
The objective was to develop an analytical methodology for TPH CWG. Raw data from this method are then evaluated for forensic interpretation and risk assessment modelling. Access will be made available to expertise in methods for forensically tracing contaminant sources, transport modelling, human health risk modelling and detailed quantitative risk assessment.
The quantification of internal standards was key to the development of this method. As the laboratory does not test for TPH in 1D, it was requested during an INAB ISO 17025 audit to individually map out where each compound falls chromatographically in 2D. This was done by comparing carbon equivalent numbers to the n-alkane carbons. It showed, for example, that 2-methylnaphthalene, which has 11 carbons in its structure, has a carbon equivalent of 12.84 and therefore falls within the aromatic eC12-eC16 band rather than the expected eC10-eC12 band. This was carried out for all 16 PAHs (polyaromatic hydrocarbons) and for BTEX (benzene, toluene, ethylbenzene and o-, m- and p-xylenes). The n-alkanes were also assigned to their corresponding aliphatic bands, e.g. nC8 would be expected to fall in the nC8-nC10 band.
The method was validated through a designated systematic experimental protocol and was challenged with spikes of known hydrocarbon concentration to assess parameters such as recovery, precision, bias and linearity. The method was verified by testing a certified reference material that had been used as a proficiency testing round for numerous laboratories.
It is hoped that the method will be used in conjunction with analysis carried out under the Bonn Agreement with its OSINet group, a panel of experts and laboratories (including CLS) who forensically identify oil spill contamination from a water source.
This method can prove to be robust and benefit the contaminated land and water industry, but it needs to be seen as separate from regular 1D chromatography. It will help identify contaminants and provide consultants, regulators, clients and scientists with valuable information not seen in 1D.
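A minimal sketch of the band-assignment step described above: mapping a compound's carbon-equivalent number onto an aromatic fraction, as in the 2-methylnaphthalene example (11 structural carbons, carbon equivalent 12.84, hence the aromatic eC12-eC16 band). The band edges follow the commonly used TPH CWG aromatic fractions and should be treated as illustrative rather than as the laboratory's exact scheme.

```python
# Hedged sketch: assign a compound to an aromatic band by carbon-equivalent number.
AROMATIC_BANDS = [
    ("eC5-eC7", 5, 7), ("eC7-eC8", 7, 8), ("eC8-eC10", 8, 10),
    ("eC10-eC12", 10, 12), ("eC12-eC16", 12, 16),
    ("eC16-eC21", 16, 21), ("eC21-eC35", 21, 35),
]

def assign_band(carbon_equivalent: float) -> str:
    """Return the aromatic band whose half-open range [lower, upper) contains the value."""
    for name, lower, upper in AROMATIC_BANDS:
        if lower <= carbon_equivalent < upper:
            return name
    return "outside banded range"

print(assign_band(12.84))   # -> eC12-eC16, matching the 2-methylnaphthalene example
```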