903 results for cost-informed process improvement
Abstract:
Cell culture and direct fluorescent antibody (DFA) assays have traditionally been used for the laboratory diagnosis of respiratory viral infections. Multiplex reverse transcriptase polymerase chain reaction (m-RT-PCR) is a sensitive, specific, and rapid method for detecting several DNA and RNA viruses in a single specimen. We developed a m-RT-PCR assay that utilizes multiple virus-specific primer pairs in a single reaction mix combined with an enzyme-linked amplicon hybridization assay (ELAHA) using virus-specific probes targeting unique gene sequences for each virus. Using this m-RT-PCR-ELAHA, we examined the presence of seven respiratory viruses in 598 nasopharyngeal aspirate (NPA) samples from patients with suspected respiratory infection. The specificity of each assay was 100%. The sensitivity of the DFA was 79.7% and the combined DFA/culture amplified-DFA (CA-DFA) was 88.6% when compared to the m-RT-PCR-ELAHA. Of the 598 NPA specimens screened by m-RT-PCR-ELAHA, 3% were positive for adenovirus (AdV), 2% for influenza A (Flu A) virus, 0.3% for influenza B (Flu B) virus, 1% for parainfluenza type 1 virus (PIV1), 1% for parainfluenza type 2 virus (PIV2), 5.5% for parainfluenza type 3 virus (PIV3), and 21% for respiratory syncytial virus (RSV). The enhanced sensitivity, specificity, rapid result turnaround time and reduced expense of the m-RT-PCR-ELAHA compared to DFA and CA-DFA suggest that this assay would be a significant improvement over traditional assays for the detection of respiratory viruses in a clinical laboratory.
Multisite, quality-improvement collaboration to optimise cardiac care in Queensland public hospitals
Abstract:
Objective: To evaluate changes in quality of in-hospital care of patients with either acute coronary syndromes (ACS) or congestive heart failure (CHF) admitted to hospitals participating in a multisite quality improvement collaboration. Design: Before-and-after study of changes in quality indicators measured on representative patient samples between June 2001 and January 2003. Setting: Nine public hospitals in Queensland. Study populations: Consecutive or randomly selected patients admitted to study hospitals during the baseline period (June 2001 to January 2002; n = 807 for ACS, n = 357 for CHF) and post-intervention period (July 2002 to January 2003; n = 717 for ACS, n = 220 for CHF). Intervention: Provision of comparative baseline feedback at a facilitative workshop combined with hospital-specific quality-improvement interventions supported by on-site quality officers and a central program management group. Main outcome measure: Changes in process-of-care indicators between baseline and post-intervention periods. Results: Compared with baseline, more patients with ACS in the post-intervention period received therapeutic heparin regimens (84% v 72%; P < 0.001), angiotensin-converting enzyme inhibitors (64% v 56%; P = 0.02), lipid-lowering agents (72% v 62%; P < 0.001), early use of coronary angiography (52% v 39%; P < 0.001), in-hospital cardiac counselling (65% v 43%; P < 0.001), and referral to cardiac rehabilitation (15% v 5%; P < 0.001). The numbers of patients with CHF receiving β-blockers also increased (52% v 34%; P < 0.001), with fewer patients receiving deleterious agents (13% v 23%; P = 0.04). Same-cause 30-day readmission rate decreased from 7.2% to 2.4% (P = 0.02) in patients with CHF. Conclusion: Quality-improvement interventions conducted as multisite collaborations may improve in-hospital care of acute cardiac conditions within relatively short time frames.
Abstract:
Rationale. The Brisbane Cardiac Consortium, a quality improvement collaboration of clinicians from three hospitals and five divisions of general practice, developed and reported clinical indicators as measures of the quality of care received by patients with acute coronary syndromes or congestive heart failure. Development of indicators. An expert panel derived indicators that measured gaps between evidence and practice. Data collected from hospital records and general practice heart-check forms were used to calculate process and outcome indicators for each condition. Our indicators were reliable (kappa scores 0.7-1.0) and widely accepted by clinicians as having face validity. Independent review of indicator-failed, in-hospital cases revealed that, for 27 of 28 process indicators, clinically legitimate reasons for withholding specific interventions were found in
Abstract:
The therapeutic letter has a long history, with roots in psychoanalytic work and continuing application in family therapy. The advent of e-mail has allowed another form of therapeutic written communication which, while incorporating the benefits of therapeutic letters, adds to these. It has also opened up some potential risks. This article incorporates a brief review of the literature covering therapeutic written communication and offers a case example where e-mail was used as an adjunct in face-to-face therapy with a client who experienced attachment difficulties. This therapy was informed by systemic and psychoanalytic traditions. The authors explore a variety of technical matters including the timing and crafting of e-mail responses, the integration of written communication with face-to-face therapy, impact on the therapeutic relationship and management of crisis. Ethical issues such as confidentiality and duty of care are also considered.
Abstract:
Hyaluronic acid is routinely produced through fermentation of both Group A and C streptococci. Despite significant production costs associated with short fermentations and removal of contaminating proteins released during entry into stationary phase, hyaluronic acid is typically produced in batch rather than continuous culture. The main reason is that hyaluronic acid synthesis has been found to be unstable in continuous culture except at very low dilution rates. Here, we investigated the mechanisms underlying this instability and developed a stable, high dilution rate (0.4 h(-1)) chemostat process for both chemically defined and complex media operating for more than 150 h of production. In chemically defined medium, the product yield was 25% higher in chemostat cultures than in conventional batch culture when arginine or glucose was the limiting substrate. In contrast, glutamine limitation resulted in higher ATP requirements and a yield similar to that observed in batch culture. In complex, glucose-limited medium, ATP requirements were greatly reduced but biomass synthesis was favored over hyaluronic acid and no improvement in hyaluronic acid yield was observed. The successful establishment of continuous culture at high dilution rate enables both commercial production at reduced cost and a more rational characterization and optimization of hyaluronic acid production in streptococci. (c) 2005 Wiley Periodicals, Inc.
Abstract:
Calibration of a groundwater model requires that hydraulic properties be estimated throughout a model domain. This generally constitutes an underdetermined inverse problem, for which a solution can only be found when some kind of regularization device is included in the inversion process. Inclusion of regularization in the calibration process can be implicit, for example through the use of zones of constant parameter value, or explicit, for example through solution of a constrained minimization problem in which parameters are made to respect preferred values, or preferred relationships, to the degree necessary for a unique solution to be obtained. The cost of uniqueness is this: no matter which regularization methodology is employed, the inevitable consequence of its use is a loss of detail in the calibrated field. This, in turn, can lead to erroneous predictions made by a model that is ostensibly well calibrated. Information made available as a by-product of the regularized inversion process allows the reasons for this loss of detail to be better understood. In particular, it is easily demonstrated that the estimated value for a hydraulic property at any point within a model domain is, in fact, a weighted average of the true hydraulic property over a much larger area. This averaging process causes loss of resolution in the estimated field. Where hydraulic conductivity is the hydraulic property being estimated, high averaging weights exist in areas that are strategically disposed with respect to measurement wells, while other areas may contribute very little to the estimated hydraulic conductivity at any point within the model domain, thus possibly making the detection of hydraulic conductivity anomalies in these latter areas almost impossible. A study of the post-calibration parameter field covariance matrix allows further insights into the loss of system detail incurred through the calibration process to be gained.
A comparison of pre- and post-calibration parameter covariance matrices shows that the latter often possess a much smaller spectral bandwidth than the former. It is also demonstrated that, as an inevitable consequence of the fact that a calibrated model cannot replicate every detail of the true system, model-to-measurement residuals can show a high degree of spatial correlation, a fact which must be taken into account when assessing these residuals either qualitatively, or quantitatively in the exploration of model predictive uncertainty. These principles are demonstrated using a synthetic case in which spatial parameter definition is based on pilot points, and calibration is implemented using both zones of piecewise constancy and constrained minimization regularization. (C) 2005 Elsevier Ltd. All rights reserved.
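The "weighted average" character of regularized estimates described above can be sketched numerically. The following is a minimal illustration with invented numbers (Tikhonov regularization toward a zero preferred value, not the paper's pilot-point setup): the resolution matrix R maps the true parameter field to the estimated one, so each estimated value is a weighted average of the true values, with averaging weights below one.

```python
# Hypothetical sketch (not from the abstract): Tikhonov-regularized
# inversion of an underdetermined problem, showing that each estimated
# parameter is a weighted average of the true parameters (rows of the
# resolution matrix R).
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_par = 8, 20                    # fewer data than parameters
J = rng.normal(size=(n_obs, n_par))     # sensitivity (Jacobian) matrix
k_true = rng.normal(size=n_par)         # "true" hydraulic-property field
d = J @ k_true                          # noise-free observations

lam = 1.0                               # regularization weight
# Solve min ||J k - d||^2 + lam^2 ||k||^2 (preferred value = 0)
G = np.linalg.solve(J.T @ J + lam**2 * np.eye(n_par), J.T)
k_est = G @ d

# Resolution matrix: k_est = R @ k_true; each row holds averaging weights
R = G @ J
all_diag_below_one = bool(np.all(np.diag(R) < 1.0))  # loss of resolution
```

Because the eigenvalues of R are s/(s + lam^2) < 1, the estimated field is always a smoothed version of the true field, which is the loss of detail the abstract describes.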
Abstract:
Aim of study: Different criteria for treatment response were explored to identify predictors of OA improvement. Analyses were based on data from a previously reported 1-year randomized controlled trial of appropriate care with or without hylan G-F 20 in patients with knee OA. Methods: Five definitions of "patient responder" from baseline to month 12 were examined: at least 20% reduction in WOMAC pain score; at least 20% reduction in WOMAC pain score and at least 20% reduction in either the WOMAC stiffness or function score; OARSI responder criteria (Propositions A and B) for intra-articular drugs; and OMERACT-OARSI responder criteria (Proposition D). As an a posteriori analysis, multivariable logistic regression models for each definition of patient responder were developed using a forward selection method. The following variables were defined prior to modeling and considered in the model along with two-way interactions: age (>65 years), BMI, gender, X-ray grade (0, I, II vs III, IV), co-morbidity (1 or 2 conditions vs 3 or more), duration of OA in study knee (years), previous surgery of study knee, hylan G-F 20 injection technique, WOMAC pain, stiffness and function, and treatment group. Results: Hylan G-F 20 was a predictor of improvement for all patient responder definitions (P < 0.001); odds of improvement were 2.7 or higher for patients in the hylan G-F 20 group compared to appropriate care without hylan G-F 20. For three of the five patient responder definitions, X-ray grade was a predictor of improvement (P < 0.10; lower X-ray grade increased the odds of improvement). For four of the five patient responder definitions, duration of OA was a predictor of improvement (P < 0.10; shorter duration of OA increased the odds of improvement). Conclusion: Analyses showed that appropriate care with hylan G-F 20 is the dominant predictor of patient improvement.
While high-grade structural damage does not preclude a response, patients who are targeted early in the disease process, when less structural damage has occurred, may have a greater chance of improvement.
Abstract:
A major impediment to developing real-time computer vision systems has been the computational power and level of skill required to process video streams in real-time. This has meant that many researchers have either analysed video streams off-line or used expensive dedicated hardware acceleration techniques. Recent software and hardware developments have greatly eased the development burden of real-time image analysis, leading to the development of portable systems using cheap PC hardware and software exploiting the Multimedia Extension (MMX) instruction set of the Intel Pentium chip. This paper describes the implementation of a computationally efficient computer vision system for recognizing hand gestures using efficient coding and MMX-acceleration to achieve real-time performance on low cost hardware.
Abstract:
The various theories of capital structure have attracted interest and motivated numerous studies on the subject without, however, reaching a consensus. Another apparently little-explored topic is the firm life cycle and how it may influence capital structure. This study aimed to identify which determinants are most relevant to firms' indebtedness and whether these determinants change depending on the firm's life-cycle stage, drawing on Trade Off theory, Pecking Order theory and Agency theory. To achieve this objective, fixed-effects panel analysis was applied to a sample of publicly traded Brazilian companies, with secondary data available from Economática® for the period 2005 to 2013, using the BM&FBOVESPA sectors. The main result is the same behavior across the full sample and the high- and low-growth groups for book leverage: the determinant Profitability showed a negative relationship, while the determinants Growth Opportunity and Size showed positive relationships. For the high- and low-growth groups, some determinants showed different results: Uniqueness was significant in both groups, positive in low growth and negative in high growth, while the collateral value of assets and the non-debt tax shield were significant only in the low-growth group. For market leverage, significance was observed for the non-debt tax shield and Uniqueness. This result reinforces the argument that the life cycle influences capital structure.
Abstract:
A sieve plate distillation column has been constructed and interfaced to a minicomputer with the necessary instrumentation for dynamic, estimation and control studies with special bearing on low-cost and noise-free instrumentation. A dynamic simulation of the column with a binary liquid system has been compiled using deterministic models that include fluid dynamics via Brambilla's equation for tray liquid holdup calculations. The simulation predictions have been tested experimentally under steady-state and transient conditions. The simulator's predictions of the tray temperatures have shown reasonably close agreement with the measured values under steady-state conditions and in the face of a step change in the feed rate. A method of extending linear filtering theory to highly nonlinear systems with very nonlinear measurement functional relationships has been proposed and tested by simulation on binary distillation. The simulation results have proved that the proposed methodology can overcome the typical instability problems associated with the Kalman filters. Three extended Kalman filters have been formulated and tested by simulation. The filters have been used to refine a much simplified model sequentially and to estimate parameters such as the unmeasured feed composition using information from the column simulation. It is first assumed that corrupted tray composition measurements are made available to the filter and then corrupted tray temperature measurements are accessed instead. The simulation results have demonstrated the powerful capability of the Kalman filters to overcome the typical hardware problems associated with the operation of on-line analyzers in relation to distillation dynamics and control by, in effect, replacing them. A method of implementing estimator-aided feedforward (EAFF) control schemes has been proposed and tested by simulation on binary distillation.
The results have shown that the EAFF scheme provides much better control and energy conservation than the conventional feedback temperature control in the face of a sustained step change in the feed rate or multiple changes in the feed rate, composition and temperature. Further extensions of this work are recommended as regards simulation, estimation and EAFF control.
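The core idea of using a Kalman filter to estimate an unmeasured quantity (such as feed composition) from noisy measurements can be sketched in a few lines. This is a deliberately simplified scalar, linear illustration with invented numbers, not the authors' extended (nonlinear) formulation: the filter blends each noisy measurement into the running estimate with a gain that shrinks as confidence grows.

```python
# Illustrative scalar Kalman filter (assumed example, not the thesis's
# EKF): estimate a constant unmeasured feed composition x_true from
# noisy measurements z = x_true + v.
import random

random.seed(1)
x_true = 0.45            # hypothetical feed mole fraction (invented)
R = 0.01 ** 2            # measurement noise variance
x_hat, P = 0.0, 1.0      # prior estimate and its variance

for _ in range(200):
    z = x_true + random.gauss(0.0, 0.01)  # simulated noisy measurement
    K = P / (P + R)                       # Kalman gain
    x_hat += K * (z - x_hat)              # measurement update
    P = (1 - K) * P                       # variance update
```

After 200 updates the estimate converges close to the true value, which is the sense in which the filter can "replace" an on-line analyzer; the extended filters in the abstract do the same thing through a linearized nonlinear column model.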
Abstract:
Purpose: The purpose of this paper is to describe how the application of systems thinking to designing, managing and improving business processes has resulted in a new and unique holonic-based process modeling methodology known as process orientated holonic modeling. Design/methodology/approach: The paper describes key systems thinking axioms that are built upon in an overview of the methodology; the techniques are described using an example taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. These were produced in an 18 month project, using an action research approach, to improve quality and process efficiency. Findings: The findings of this research show that this new methodology can support process depiction and improvement in industrial sectors which are characterized by environments of high variety and low volume (e.g. projects, such as the design and manufacture of a radar system or a hybrid production process) which do not provide repetitive learning opportunities. In such circumstances, the methodology has not only been able to deliver holonic-based process diagrams but also been able to transfer strategic vision from top management to middle and operational levels without being reductionistic. Originality/value: This paper will be of interest to organizational analysts looking at large complex projects who require a methodology that does not confine them to thinking reductionistically in "task-breakdown" based approaches. The novel ideas in this paper have great impact on the way analysts should perceive organizational processes. Future research is applying the methodology in similar environments in other industries. © Emerald Group Publishing Limited.
Abstract:
Improving healthcare quality is a growing need of any society. Although various quality improvement projects are routinely deployed by healthcare professionals, they are characterised by a fragmented approach, i.e. they are not linked with the strategic intent of the organisation. This study introduces a framework which integrates all quality improvement projects with the strategic intent of the organisation. It first derives the strengths, weaknesses, opportunities and threats (SWOT) matrix of the system with the involvement of the concerned stakeholders (clinical professionals), which helps identify a few projects, the implementation of which ensures achievement of desired quality. The projects are then prioritised using the analytic hierarchy process with the involvement of the concerned stakeholders (clinical professionals) and implemented in order to improve system performance. The effectiveness of the method has been demonstrated using a case study in the intensive care unit of Queen Elizabeth Hospital in Bridgetown, Barbados.
Abstract:
Conventionally, oil pipeline projects are evaluated thoroughly by the owner before an investment decision is made, using market, technical and financial analysis sequentially. The market analysis determines pipeline throughput and supply and demand points. Subsequently, technical analysis identifies technological options, and economic and financial analysis then derives the least-cost option among all technically feasible options. The subsequent impact assessment tries to justify the selected option by addressing environmental and social issues. The impact assessment often suggests alternative sites, technologies, and/or implementation methodology, necessitating revision of technical and financial analysis. This study addresses these issues via an integrated project evaluation and selection model. The model uses the analytic hierarchy process, a multiple-attribute decision-making technique. The effectiveness of the model has been demonstrated through a case application on a cross-country petroleum pipeline project in India.
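The analytic hierarchy process (AHP) step that several of these abstracts rely on can be sketched compactly. This is a minimal, self-contained illustration with an invented 3x3 pairwise-comparison matrix (not data from any of the studies): priority weights come from the principal eigenvector of the comparison matrix, and a consistency ratio below 0.1 indicates acceptably consistent judgments.

```python
# Minimal AHP sketch (assumed example matrix, not from the studies):
# derive priority weights from pairwise comparisons via the principal
# eigenvector and check judgment consistency.
import numpy as np

# A[i, j] = how strongly criterion i is preferred over criterion j
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = int(np.argmax(eigvals.real))
w = eigvecs[:, i].real
w = w / w.sum()                        # priority vector, sums to 1

n = A.shape[0]
CI = (eigvals.real[i] - n) / (n - 1)   # consistency index
RI = 0.58                              # Saaty's random index for n = 3
CR = CI / RI                           # consistency ratio (< 0.1 is OK)
```

With this matrix the first criterion dominates (weight roughly 0.65), and the near-zero consistency ratio confirms the pairwise judgments are coherent; in the studies above the same weights would rank projects, sites, or risk probabilities.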
Abstract:
Effective management of projects is becoming increasingly important for any type of organization to remain competitive in today’s dynamic business environment due to pressure of globalization. The use of benchmarking is widening as a technique for supporting project management. Benchmarking can be described as the search for the best practices, leading to the superior performance of an organization. However, effectiveness of benchmarking depends on the use of tools for collecting and analyzing information and deriving subsequent improvement projects. This study demonstrates how analytic hierarchy process (AHP), a multiple attribute decision-making technique, can be used for benchmarking project management practices. The entire methodology has been applied to benchmark project management practice of Caribbean public sector organizations with organizations in the Indian petroleum sector, organizations in the infrastructure sector of Thailand and the UK. This study demonstrates the effectiveness of a proposed benchmarking model using AHP, determines problems and issues of Caribbean project management in the public sector and suggests improvement measures for effective project management.
Abstract:
Time, cost and quality achievements on large-scale construction projects are uncertain because of technological constraints, involvement of many stakeholders, long durations, large capital requirements and improper scope definitions. Projects that are exposed to such an uncertain environment can effectively be managed with the application of risk management throughout the project life cycle. Risk is by nature subjective. However, managing risk subjectively poses the danger of non-achievement of project goals. Moreover, risk analysis of the overall project also poses the danger of developing inappropriate responses. This article demonstrates a quantitative approach to construction risk management through an analytic hierarchy process (AHP) and decision tree analysis. The entire project is classified to form a few work packages. With the involvement of project stakeholders, risky work packages are identified. As all the risk factors are identified, their effects are quantified by determining probability (using AHP) and severity (guess estimate). Various alternative responses are generated, listing the cost implications of mitigating the quantified risks. The expected monetary values are derived for each alternative in a decision tree framework and subsequent probability analysis helps to make the right decision in managing risks. In this article, the entire methodology is explained by using a case application of a cross-country petroleum pipeline project in India. The case study demonstrates the project management effectiveness of using AHP and DTA.
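The decision-tree step described above can be sketched with invented numbers (not the case study's figures): each candidate risk response forms a branch whose expected monetary value (EMV) is its mitigation cost plus the residual risk probability times the loss if the risk occurs, and the response with the lowest EMV is selected.

```python
# Hypothetical decision-tree sketch: probabilities would come from AHP
# and severities from expert estimates; all figures here are invented.
responses = {
    # response: (mitigation cost, residual probability, loss if risk occurs)
    "accept":   (0.0, 0.30, 10.0),   # do nothing, carry full risk
    "mitigate": (1.5, 0.10, 10.0),   # spend to reduce probability
    "transfer": (2.5, 0.02, 10.0),   # e.g. insure, near-zero residual risk
}

def emv(cost, p, loss):
    """Expected monetary value of one response branch (all costs in $M)."""
    return cost + p * loss

best = min(responses, key=lambda r: emv(*responses[r]))
# EMVs: accept 3.0, mitigate 2.5, transfer 2.7 -> "mitigate" wins
```

The probability analysis in the article refines this picture by attaching AHP-derived probabilities to each risky work package before the branches are compared.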