21 results for Quality models
Abstract:
Purpose – The aim of this paper is to analyse how critical incidents or organisational crises can be used to check and legitimise quality management change efforts in relation to the fundamental principles of quality. Design/methodology/approach – Multiple case studies analyse critical incidents that demonstrate the importance of legitimisation, normative evaluation and conflict constructs in this process. A theoretical framework composed of these constructs is used to guide the analysis. Findings – The cases show that the critical incidents leading to the legitimisation of continuous improvement (CI) were diverse. However, all resulted in the need for significant ongoing cost reduction to achieve or retain competitiveness. In addition, attempts at legitimising CI were coupled with attempts at destabilising the existing normative practice. This destabilisation process, in some cases, advocated supplementing the existing approaches and, in others, replacing them. In all cases, significant conflict arose in these legitimising and normative evaluation processes. Research limitations/implications – It is suggested that further research could involve a critical analysis of existing quality models, tools and techniques in relation to how they incorporate, and are built upon, fundamental quality management principles. Furthermore, such studies could probe the dangers of the quality curriculum becoming divorced from business and market reality and thus creating a parallel existence. Practical implications – As demonstrated by the case studies, models, tools and techniques are not valued for their intrinsic merit but rather for what they will contribute to addressing business needs. Thus, in addition to being an opportunity for quality management, critical incidents present a challenge to the field. Quality management must be shown to make a contribution in these circumstances. Originality/value – This paper is of value to both academics and practitioners.
Abstract:
In this paper, we investigate the remanufacturing problem of pricing a single class of used products (cores) in the face of random price-dependent returns and random demand. Specifically, we propose a dynamic pricing policy for the cores and then model the problem as a continuous-time Markov decision process. Our models are designed to address three objectives: finite-horizon total cost minimization, infinite-horizon discounted cost minimization, and average cost minimization. Besides proving the uniqueness of the optimal policy and establishing monotonicity results for the infinite-horizon problem, we also characterize the structures of the optimal policies, which can greatly simplify the computational procedure. Finally, we use computational examples to assess the impact of specific parameters on the optimal price and to illustrate the benefits of a dynamic pricing policy. © 2013 Elsevier B.V. All rights reserved.
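As a rough illustration of the kind of computation such a model leads to, the sketch below runs value iteration on a heavily simplified, discretized core-acquisition pricing problem. The state space, cost parameters and the price-to-return-rate relationship are assumptions for demonstration only, not the model or data from the paper.

```python
# Illustrative sketch (not the paper's model): value iteration for a
# simplified core-acquisition pricing problem.  State = cores on hand;
# the acquisition price chosen in each state raises the Poisson return
# rate of cores, while demand for remanufactured units arrives at a
# fixed rate.  Continuous time is handled by uniformization.
import numpy as np

MAX_INV = 20                            # inventory truncation (assumption)
PRICES = np.linspace(0.5, 5.0, 10)      # candidate acquisition prices (assumption)
DEMAND_RATE = 1.0                       # demand arrival rate
HOLD_COST = 0.1                         # holding cost per core per unit time
LOST_SALE = 8.0                         # penalty when demand finds no core
BETA = 0.05                             # continuous-time discount rate

def return_rate(price):
    """Higher acquisition price attracts more cores (illustrative form)."""
    return 0.4 * price

LAMBDA = DEMAND_RATE + return_rate(PRICES.max())  # uniformization constant
GAMMA = LAMBDA / (LAMBDA + BETA)                  # effective discount factor

V = np.zeros(MAX_INV + 1)
for _ in range(2000):
    V_new = np.empty_like(V)
    for x in range(MAX_INV + 1):
        best = np.inf
        for p in PRICES:
            r = return_rate(p)
            up = min(x + 1, MAX_INV)    # a core is acquired
            down = max(x - 1, 0)        # a demand consumes a core
            lost = LOST_SALE if x == 0 else 0.0
            # one-stage cost rate + expected discounted continuation value
            cost = (HOLD_COST * x + r * p + DEMAND_RATE * lost) / (LAMBDA + BETA)
            cont = (r * V[up] + DEMAND_RATE * V[down]
                    + (LAMBDA - r - DEMAND_RATE) * V[x]) / LAMBDA
            best = min(best, cost + GAMMA * cont)
        V_new[x] = best
    diff = np.max(np.abs(V_new - V))
    V = V_new
    if diff < 1e-8:
        break

print(V)  # cost-to-go per starting inventory level
```

Uniformization turns the continuous-time chain into an equivalent discrete-time problem, which is why a single uniformization constant and an effective discount factor appear in the update; the price that attains the minimum in each state would give a stationary pricing policy.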
Abstract:
Background: Implementing effective antenatal care models is a key global policy goal. However, the mechanisms of action of these multi-faceted models that would allow widespread implementation are seldom examined and poorly understood. In existing care model analyses there is little distinction between what is done, how it is done, and who does it. A new evidence-informed quality maternal and newborn care (QMNC) framework identifies key characteristics of quality care. This offers the opportunity to identify systematically the characteristics of care delivery that may be generalizable across contexts, thereby enhancing implementation. Our objective was to map the characteristics of antenatal care models tested in Randomised Controlled Trials (RCTs) to a new evidence-based framework for quality maternal and newborn care; thus facilitating the identification of characteristics of effective care.
Methods: A systematic review of RCTs of midwifery-led antenatal care models. Mapping and evaluation of these models’ characteristics to the QMNC framework using data extraction and scoring forms derived from the five framework components. Paired team members independently extracted data and conducted quality assessment using the QMNC framework and standard RCT criteria.
Results: From 13,050 citations initially retrieved we identified 17 RCTs of midwifery-led antenatal care models from Australia (7), the UK (4), China (2), and Sweden, Ireland, Mexico and Canada (1 each). QMNC framework scores ranged from 9 to 25 (possible range 0–32), with most models reporting fewer than half the characteristics associated with quality maternity care. Description of care model characteristics was lacking in many studies, but was better reported for the intervention arms. Organisation of care was the best-described component. Underlying values and philosophy of care were poorly reported.
Conclusions: The QMNC framework facilitates assessment of the characteristics of antenatal care models. It is vital to understand all the characteristics of multi-faceted interventions such as care models: not only what is done, but why it is done, by whom, and how this differs from the standard care package. By applying the QMNC framework we have established a foundation for future reports of intervention studies so that the characteristics of individual models can be evaluated and the impact of any differences appraised.
Abstract:
A study of the external, loaded and unloaded quality factors for frequency selective surfaces (FSSs) is presented. The study is focused on THz frequencies between 5 and 30 THz, where ohmic losses arising from the conductors become important. The influence of material properties, such as metal thickness, conductivity dispersion and surface roughness, is investigated. An equivalent circuit that models the FSS in the presence of ohmic losses is introduced and validated by means of full-wave results. Using both full-wave methods as well as a circuit model, the reactive energy stored in the vicinity of the FSS at resonance upon plane-wave incidence is presented. By studying a doubly periodic array of aluminium strips, it is revealed that the reactive power stored at resonance increases rapidly with increasing periodicity. Moreover, it is demonstrated that arrays with larger periodicity, and therefore less metallisation per unit area, exhibit stronger thermal absorption. Despite this absorption, arrays with higher periodicities produce higher unloaded quality factors. Finally, experimental results of a fabricated prototype operating at 14 THz are presented.
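For readers less familiar with the terminology, the standard resonator relations below connect the external, loaded and unloaded quality factors discussed in the abstract; the notation is generic and not taken from the paper.

```latex
% Standard resonator relations (generic notation, not taken from the paper)
\[
  Q \;=\; \omega_0 \,\frac{W_{\text{stored}}}{P_{\text{diss}}},
  \qquad
  \frac{1}{Q_L} \;=\; \frac{1}{Q_U} \;+\; \frac{1}{Q_{\text{ext}}},
\]
% where $Q_U$ accounts for ohmic (and dielectric) losses in the FSS itself,
% $Q_{\text{ext}}$ for coupling to the incident plane wave, and $Q_L$ is the
% loaded quality factor observed in the transmission response.
```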
Abstract:
Electricity systems models are software tools used to manage electricity demand and electricity systems, to trade electricity, and for generation expansion planning. Various portfolios and scenarios are modelled in order to compare the effects of policy decisions and business development plans in electricity systems, so as to best advise governments and industry on the least-cost economic and environmental approach to electricity supply while maintaining a secure supply of sufficient-quality electricity. The modelling techniques developed to study vertically integrated state monopolies are now applied in liberalised markets where the issues and constraints are more complex. This paper reviews the changing role of electricity systems modelling in a strategic manner, focussing on the modelling response to key developments: the move away from monopoly towards liberalised market regimes and the increasing complexity brought about by policy targets for renewable energy and emissions. The paper provides an overview of electricity systems modelling techniques, discusses a number of key proprietary electricity systems models used in the USA and Europe, and provides the electricity analyst with an information resource, not currently readily available in the literature, on the choice of model for investigating different aspects of the electricity system.
Abstract:
Background: Health care organisations worldwide are faced with the need to develop and implement strategic organisational plans to meet the challenges of modern health care. There is a need for models for developing, implementing and evaluating strategic plans that engage practitioners and make a measurable difference to the patients that they serve. These presentations describe the development, implementation and evaluation of such a model by a team of senior nurses and practice developers, to underpin a strategy for nursing and midwifery in an acute hospital trust.
Developing a Strategy: The PARIHS (Promoting Action on Research Implementation in Health Services) conceptual framework (Kitson et al, 1998) proposes that successful implementation of change in practice is a function of the interplay of three core elements: the level of evidence supporting the proposed change; the context or environment in which the change takes place; and the way in which change is facilitated. We chose to draw on this framework to develop our strategy and implementation plan (O’Halloran, Martin and Connolly, 2005). At the centre of the plan are ward managers. These professionals provide leadership for the majority of staff in the trust and so were seen to be a key group in the implementation process.
Abstract:
This study uses a discrete choice experiment (DCE) to elicit willingness-to-pay estimates for changes in the water quality of three rivers. Like many regions, the metropolitan region of Berlin-Brandenburg is struggling to achieve the objectives of the Water Framework Directive by 2015. A major problem is the high load of nutrients. As the region spans two states (Länder) and the river stretches are common to the whole region, we account for the spatial context in two ways. Firstly, we incorporate the distance between each respondent and all river stretches in all MNL and RPL models, and, secondly, we consider whether respondents reside in the state of Berlin or Brandenburg. The compensating variation (CV) calculated for various scenarios shows that, overall, people would significantly benefit from improved water quality. The CV measures, however, also reveal that not considering the spatial context would result in severely biased welfare measures. While the distance-decay effect lowers CV, state residency is connected to the frequency of status quo choices, and not accounting for residency would underestimate possible welfare gains in one state. Another finding is that the extent of the market varies with respect to attributes (river stretches) and attribute levels (water quality levels).
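For context, a generic conditional logit specification with distance-interacted attributes, together with the usual log-sum compensating variation measure, is sketched below. The notation and functional form are illustrative assumptions; the paper's RPL models additionally allow random coefficients.

```latex
% Generic conditional (multinomial) logit with distance interactions and the
% log-sum welfare measure (illustrative notation, not the paper's exact model)
\[
  P_{nj} \;=\; \frac{\exp(V_{nj})}{\sum_{k} \exp(V_{nk})},
  \qquad
  V_{nj} \;=\; \beta_c\,\mathrm{cost}_j
        \;+\; \sum_{r} \beta_r\, x_{rj}
        \;+\; \sum_{r} \delta_r\,(x_{rj}\times \mathrm{dist}_{nr}),
\]
\[
  CV_n \;=\; -\frac{1}{\beta_c}\Bigl[
        \ln \sum_{k} \exp\bigl(V_{nk}^{1}\bigr)
        \;-\; \ln \sum_{k} \exp\bigl(V_{nk}^{0}\bigr)\Bigr],
\]
% where x_{rj} is the water-quality level of river stretch r in alternative j,
% dist_{nr} is the distance from respondent n to stretch r (distance decay),
% beta_c is the cost coefficient, and superscripts 0/1 denote the status quo
% and the improvement scenario.
```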
Abstract:
This paper will consider the inter-relationship of a number of overlapping disciplinary theoretical concepts relevant to a strengths-based orientation, including well-being, salutogenesis, sense of coherence, quality of life and resilience. Psychological trauma will be referenced, and the current evidence base for interventions with children and young people outlined and critiqued. The relational impact of trauma on family relationships is emphasised, providing a rationale for systemic psychotherapeutic interventions as part of a holistic approach to managing the effects of trauma. The congruence between second-order systemic psychotherapy models and a strengths-based philosophy is noted, with particular reference to solution-focused brief therapy and narrative therapy, and illustrated via a description of the process of helping someone move from a victim position to a survivor identity using solution-focused brief therapy, and through a case example applying a narrative therapy approach to a teenage boy who suffered a serious assault. The benefits of a strengths-based approach to psychological trauma for clients and therapists will be summarised and a number of potential pitfalls articulated.
Abstract:
Polymer extrusion, in which a polymer is melted and conveyed to a mould or die, forms the basis of most polymer processing techniques. Extruders frequently run at non-optimised conditions and can account for 15–20% of overall process energy losses. With the growing emphasis on energy efficiency, such losses are a major concern for the industry. Product quality, which depends on the homogeneity and stability of the melt flow, which in turn depends on melt temperature and screw speed, is also a concern for processors. Gear pumps can be used to improve the stability of the production line, but their cost is usually high. Likewise, it is possible to install energy meters, but these also add to the capital cost of the machine. Advanced control incorporating soft-sensing capabilities offers this industry opportunities to improve both quality and energy efficiency. Owing to strong correlations between the critical variables, such as melt temperature and melt pressure, traditional decentralized PID (Proportional–Integral–Derivative) control is incapable of handling such processes when stricter product specifications are imposed or the material is changed from one batch to another. In this paper, new real-time energy monitoring methods are introduced that require neither power meters nor data-driven models. The effects of process settings on energy efficiency and melt quality are then studied using the developed monitoring methods. The process variables include barrel heating temperature, water cooling temperature and screw speed. Finally, a fuzzy logic controller is developed for a single-screw extruder to achieve high melt quality. The resultant performance of the developed controller has shown it to be a satisfactory alternative to the expensive gear pump. Energy efficiency of the extruder can be further improved by optimising the temperature settings. Experimental results from open-loop control and fuzzy control on a Killion 25 mm single-screw extruder are presented to confirm the efficacy of the proposed approach.
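As a purely illustrative sketch of what a fuzzy logic controller of this kind can look like (not the controller developed in the paper), the following zero-order Sugeno-style rule base adjusts screw speed from the melt-temperature error and its rate of change; all labels, ranges and output values are assumptions for demonstration.

```python
# Illustrative sketch only (not the paper's controller): a zero-order
# Sugeno-style fuzzy rule base that adjusts screw speed from the
# melt-temperature error (measured - setpoint) and its rate of change.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Fuzzy sets over the temperature error (deg C) and its rate (deg C/s)
ERROR_SETS = {"neg": (-20, -10, 0), "zero": (-5, 0, 5), "pos": (0, 10, 20)}
RATE_SETS = {"falling": (-2, -1, 0), "steady": (-0.5, 0, 0.5), "rising": (0, 1, 2)}

# Rule base: (error label, rate label) -> screw-speed change in rpm (singleton)
RULES = {
    ("neg", "falling"): +6.0, ("neg", "steady"): +4.0, ("neg", "rising"): +1.0,
    ("zero", "falling"): +2.0, ("zero", "steady"): 0.0, ("zero", "rising"): -2.0,
    ("pos", "falling"): -1.0, ("pos", "steady"): -4.0, ("pos", "rising"): -6.0,
}

def fuzzy_speed_adjustment(error, error_rate):
    """Weighted average of rule outputs (zero-order Sugeno defuzzification)."""
    num, den = 0.0, 0.0
    for (e_lbl, r_lbl), delta in RULES.items():
        w = min(tri(error, *ERROR_SETS[e_lbl]), tri(error_rate, *RATE_SETS[r_lbl]))
        num += w * delta
        den += w
    return num / den if den > 0 else 0.0

# Example: melt 6 deg C below setpoint and still falling -> speed up the screw
print(fuzzy_speed_adjustment(error=-6.0, error_rate=-0.4))
```

In practice the membership ranges and rule outputs would be tuned to the specific extruder and material, and the computed adjustment would be rate-limited before being applied to the screw drive.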
Abstract:
Quality Management and Managerialism in Healthcare creates a comprehensive and systematic international survey of various perspectives on healthcare quality management, together with some of their most pertinent critiques. Chapter one starts with a general discussion of the factors that drove the introduction of management paradigms into public sector and health management contexts in the mid to late 1980s. Chapter two explores the rise of risk awareness in medicine, which, prior to the 1980s, stood largely in isolation from the implementation of managerial performance targets. Chapter three investigates the widespread adoption of performance management and clinical governance frameworks during the 1980s and 1990s. This is followed by chapters four and five, which examine systems-based models of patient safety and the evidence-based medicine movement as exemplars of managerial perspectives on healthcare quality. Chapter six discusses potential future avenues for the development of alternative perspectives on quality of care which emphasise workforce involvement. The book concludes by reviewing the factors which have underpinned the managerialist trajectory of healthcare management over the past decades and explores the potential impact of nascent technologies such as 'connected health' and 'telehealth' on future developments.
Abstract:
The area of mortality modelling has received significant attention over the last 20 years owing to the need to quantify and forecast improving mortality rates. This need is driven primarily by the concern of governments, insurance and actuarial professionals, and individuals to be able to fund old age. In particular, to quantify the costs of increasing longevity we need suitable models of mortality rates that capture the dynamics of the data and forecast them with sufficient accuracy to be useful. In this paper we test several of those models by considering fitting quality and, in particular, by testing the residuals of those models for normality. In a wide-ranging study covering 30 countries we find that, almost without exception, the residuals do not demonstrate normality. Further, in Hurst tests of the residuals we find evidence that structure remains that is not captured by the models.
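The following sketch illustrates the kind of residual diagnostics described: a Jarque-Bera (and Shapiro-Wilk) normality test and a simple rescaled-range Hurst estimate. The residual series here is a simulated placeholder, not fitted mortality-model residuals from the paper.

```python
# Illustrative residual diagnostics: normality tests and a rescaled-range
# (R/S) Hurst-exponent estimate.  The residuals are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
residuals = rng.standard_t(df=5, size=500)   # placeholder residual series

# Normality: Jarque-Bera, with Shapiro-Wilk as a cross-check
jb_stat, jb_p = stats.jarque_bera(residuals)
sw_stat, sw_p = stats.shapiro(residuals)
print(f"Jarque-Bera p={jb_p:.4f}, Shapiro-Wilk p={sw_p:.4f}")

def hurst_rs(x, min_chunk=8):
    """Simple rescaled-range estimate of the Hurst exponent."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())
            r = dev.max() - dev.min()
            s = chunk.std(ddof=1)
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(size)
            rs_vals.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope   # H close to 0.5 suggests no remaining long-range structure

print(f"Hurst exponent estimate: {hurst_rs(residuals):.2f}")
```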
Abstract:
Since the earliest days of cystic fibrosis (CF) treatment, patient data have been recorded and reviewed in order to identify the factors that lead to more favourable outcomes. Large data repositories, such as the US Cystic Fibrosis Registry, which was established in the 1960s, enabled successful treatments and patient outcomes to be recognized and improvement programmes to be implemented in specialist CF centres. Over the past decades, the greater volumes of data becoming available through Centre databases and patient registries led to the possibility of making comparisons between different therapies, approaches to care and indeed data recording. The quality of care for individuals with CF has become a focus at several levels: patient, centre, regional, national and international. This paper reviews the quality management and improvement issues at each of these levels with particular reference to indicators of health, the role of CF Centres, regional networks, national health policy, and international data registration and comparisons.
Abstract:
A new technological approach to the analysis and forensic interpretation of Total Hydrocarbons in soils and waters, using a two-dimensional gas chromatography (GC-GC) method, was developed alongside environmental forensic and assessment models to provide better products for customers in the environmental industry.
The objective was to develop an analytical methodology for TPH CWG. Raw data from this method are then evaluated for forensic interpretation and risk assessment modelling. Access will be made available to expertise in methods of forensically tracing contaminant sources, transport modelling, human health risk modelling and detailed quantitative risk assessment.
The quantification of internal standards was key to the development of this method. As the laboratory does not test for TPH in 1D, it was requested during an INAB ISO 17025 audit that each compound be individually mapped to where it falls chromatographically in 2D. This was done by comparing carbon-equivalent numbers to the n-alkane carbon numbers. For example, 2-methylnaphthalene has 11 carbons in its structure but a carbon equivalent of 12.84, which places it in the aromatic eC12-eC16 band rather than the expected eC10-eC12 band. This was carried out for all 16 PAHs (polyaromatic hydrocarbons) and for BTEX (benzene, toluene, ethylbenzene and o-, m- and p-xylenes). The n-alkanes were also assigned to their corresponding aliphatic bands, e.g. nC8 would be expected to fall in nC8-nC10.
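A minimal sketch of the band-assignment step described above is given below; the band boundaries are illustrative TPH CWG-style aromatic fractions and should be replaced by the laboratory's validated banding.

```python
# Tiny illustration of the band-assignment step.  The band boundaries below
# are illustrative TPH CWG-style aromatic fractions, not the validated method.
AROMATIC_BANDS = [(5, 7), (7, 8), (8, 10), (10, 12), (12, 16), (16, 21), (21, 35)]

def aromatic_band(carbon_equivalent):
    """Return the eCx-eCy label whose range contains the carbon equivalent."""
    for lo, hi in AROMATIC_BANDS:
        if lo < carbon_equivalent <= hi:
            return f"eC{lo}-eC{hi}"
    return "outside banded range"

# 2-methylnaphthalene: 11 carbons in its structure, carbon equivalent 12.84
print(aromatic_band(12.84))   # -> eC12-eC16, not the eC10-eC12 band
```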
The method was validated through a designated systematic experimental protocol and was challenged with spikes of known hydrocarbon concentration, assessing parameters such as recovery, precision, bias and linearity. The method was verified by testing a certified reference material that had been used as a proficiency testing round for numerous laboratories.
It is hoped that the method will be used in conjunction with analysis carried out under the Bonn Agreement through its OSINet group, a panel of experts and laboratories (including CLS) who forensically identify oil spill contamination from a water source.
The method can prove to be robust and can benefit the contaminated land and water industry, but it needs to be seen as separate from regular 1D chromatography. It will help identify contaminants and provide consultants, regulators, clients and scientists with valuable information not seen in 1D.
Abstract:
Executive Summary
The Pathways Project field studies were targeted at improving the understanding of contaminant transport along different hydrological pathways in Irish catchments, including their associated impacts on water quality and river ecology. The contaminants of interest were phosphorus, nitrogen and sediment. The working Pathways conceptual model included overland flow, interflow, shallow groundwater flow, and deep groundwater flow. This research informed the development of a set of Catchment Management Support Tools (CMSTs) comprising an Exploratory Tool, Catchment Characterization Tool (CCT) and Catchment Modelling Tool (CMT) as outlined in Pathways Project Final Reports Volumes 3 and 4.
In order to inform the CMST, four suitable study catchments were selected following an extensive selection process, namely the Mattock catchment, Co. Louth/Meath; Gortinlieve catchment, Co. Donegal; Nuenna catchment, Co. Kilkenny and the Glen Burn catchment, Co. Down. The Nuenna catchment is well drained as it is underlain by a regionally important karstified limestone aquifer with permeable limestone tills and gravels, while the other three catchments are underlain by poorly productive aquifers and low permeability clayey tills, and are poorly drained.
All catchments were instrumented, and groundwater, surface and near-surface water and aquatic ecology were monitored for a period of two years. Intensive water quality sampling during rainfall events was used to investigate the pathways delivering nutrients. The proportion of flow along each pathway was determined using chemical and physical hydrograph separation techniques, supported by numerical modelling.
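For reference, chemical hydrograph separation of the kind mentioned typically rests on a two-component mixing (mass-balance) argument; the generic form is shown below and is not the project's specific formulation.

```latex
% Generic two-component mixing (mass-balance) form behind chemical
% hydrograph separation (not the project's specific formulation)
\[
  Q_t = Q_e + Q_p,
  \qquad
  Q_t C_t = Q_e C_e + Q_p C_p
  \;\;\Longrightarrow\;\;
  \frac{Q_e}{Q_t} = \frac{C_t - C_p}{C_e - C_p},
\]
% where Q_t is total streamflow, Q_e and Q_p are the pre-event (e.g.
% groundwater) and event (e.g. overland flow/interflow) components, and C is
% the tracer concentration (e.g. silica or specific conductance) of each.
```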
The outcome of the field studies broadly supported the use of the initial four-pathway conceptual model used in the Pathways CMT (time-variant model). The artificial drainage network was found to be a significant contributing pathway in the poorly drained catchments, at low flows and during peak flows in wet antecedent conditions. The transition zone (TZ), i.e. the broken up weathered zone at the top of the bedrock, was also found to be an important pathway. It was observed to operate in two contrasting hydrogeological scenarios: in groundwater discharge zones the TZ can be regarded as being part of the shallow groundwater pathway, whereas in groundwater recharge zones it behaves more like interflow.
In the catchments overlying poorly productive aquifers, only a few fractures or fracture zones were found to be hydraulically active and the TZ, where present, was the main groundwater pathway. In the karstified Nuenna catchment, the springs, which are linked to conduits as well as to a diffuse fracture network, delivered the majority of the flow. These findings confirm the two-component groundwater contribution from bedrock but suggest that the size and nature of the hydraulically active fractures and the nature of the TZ are the dominant factors at the scale of a stream flow event.
Diffuse sources of nitrate were found to be typically delivered via the subsurface pathways, especially in the TZ and land drains in the poorly productive aquifer catchments, and via the bedrock groundwater in the Nuenna. Phosphorus was primarily transported via overland flow in both particulate and soluble forms. Where preferential flow paths existed in the soil and subsoil, soluble P, and to a lesser extent particulate P, were also transported via the TZ and in drains and ditches. Arable land was found to be the most important land use for
the delivery of sediment, although channel bank and in-stream sources were the most significant in the Glen Burn catchment. Overland flow was found to be the predominant sediment transport pathway in the poorly productive catchments. These findings informed the development of the transport and attenuation equations used in the CCT and CMT. From an assessment of the relationship between physico-chemical and biological conditions, it is suggested that in the Nuenna, Glen Burn and Gortinlieve catchments a relationship may exist between biological water quality and nitrogen concentrations, as well as with P. In the Nuenna, there was also a relationship between macroinvertebrate status and alkalinity.
Further research is recommended on the transport and delivery of phosphorus in groundwater, the transport and attenuation dynamics in the TZ in different hydrogeological settings, and the relationship between macroinvertebrates and co-limiting factors. High-resolution temporal and spatial sampling was found to be important for constraining the conceptual understanding of nutrient and sediment dynamics, and should also be considered in future studies.