916 results for "failure time analysis"
Abstract:
The aim of this study was to investigate treatment failure (TF) in hospitalised community-acquired pneumonia (CAP) patients with regard to initial antibiotic treatment and economic impact. CAP patients were included in two open, prospective multicentre studies assessing the direct costs of in-patient treatment. Patients received either moxifloxacin (MFX) or a nonstandardised antibiotic therapy. Any change to a broader-spectrum antibiotic after >72 h of treatment was considered TF. Overall, 1,236 patients (mean ± SD age 69.6 ± 16.8 yrs; 691 (55.9%) male) were included. TF occurred in 197 (15.9%) subjects and led to a longer hospital stay (15.4 ± 7.3 days versus 9.8 ± 4.2 days; p < 0.001) and increased median treatment costs (€2,206 versus €1,284; p < 0.001). 596 (48.2%) patients received MFX and experienced less TF (10.9% versus 20.6%; p < 0.001). After controlling for confounders in multivariate analysis, the adjusted risk of TF was clearly reduced with MFX compared with β-lactam monotherapy (adjusted OR for MFX 0.43, 95% CI 0.27-0.68) and comparable with a β-lactam plus macrolide combination (BLM) (OR 0.68, 95% CI 0.38-1.21). In hospitalised CAP, TF is frequent and leads to prolonged hospital stay and increased treatment costs. Initial treatment with MFX or BLM is a possible strategy to prevent TF, and may thus reduce treatment costs.
Abstract:
This study evaluated critical thresholds for fresh frozen plasma (FFP) and platelet (PLT) to packed red blood cell (PRBC) ratios and determined the impact of high FFP:PRBC and PLT:PRBC ratios on outcomes in patients requiring massive transfusion (MT).
Abstract:
Energy transfer between the interacting waves in a distributed Brillouin sensor can result in a distorted measurement of the local Brillouin gain spectrum, leading to systematic errors. It is demonstrated that this depletion effect can be precisely modelled. This has been validated by experimental tests showing excellent quantitative agreement. Strict guidelines can be derived from the model to make the impact of depletion negligible, for any type and any length of fiber. (C) 2013 Optical Society of America
Abstract:
Objective: To compare clinical outcomes after laparoscopic cholecystectomy (LC) for acute cholecystitis performed at various time-points after hospital admission. Background: Symptomatic gallstones represent an important public health problem, with LC the treatment of choice. LC is increasingly offered for acute cholecystitis; however, the optimal time-point for LC in this setting remains a matter of debate. Methods: Analysis was based on the prospective database of the Swiss Association of Laparoscopic and Thoracoscopic Surgery and included patients undergoing emergency LC for acute cholecystitis between 1995 and 2006, grouped according to the time-point of LC since hospital admission (admission day (d0), d1, d2, d3, d4/5, d ≥6). Linear and generalized linear regression models assessed the effect of timing of LC on intra- and postoperative complications, conversion and reoperation rates, and length of postoperative hospital stay. Results: Of 4113 patients, 52.8% were female, and the median age was 59.8 years. Delaying LC resulted in significantly higher conversion rates (from 11.9% at d0 to 27.9% at d ≥6 after admission, P < 0.001), surgical postoperative complications (5.7% to 13%, P < 0.001) and reoperation rates (0.9% to 3%, P = 0.007), with a significantly longer postoperative hospital stay (P < 0.001). Conclusions: Delaying LC for acute cholecystitis has no advantages, resulting in significantly increased conversion/reoperation rates, postoperative complications and longer postoperative hospital stay. This investigation, one of the largest in the literature, provides compelling evidence that acute cholecystitis merits surgery within 48 hours of hospital admission if impact on the patient and health care system is to be minimized.
Abstract:
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. 
The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
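The ranking described above can be illustrated with the standard FMEA arithmetic: each failure mode is rated 1-10 for occurrence, severity, and detectability, and the product of the three ratings (the risk priority number, RPN) orders the failure modes by risk. A minimal sketch; the failure modes and ratings below are invented for illustration and are not from the guideline:

```python
# Hypothetical illustration of FMEA risk ranking: each failure mode is
# rated 1-10 for occurrence (O), severity (S), and detectability (D),
# and the risk priority number RPN = O * S * D is used to rank them.

def rpn(occurrence, severity, detectability):
    """Risk priority number for one failure mode (each rating 1-10)."""
    for rating in (occurrence, severity, detectability):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings must be between 1 and 10")
    return occurrence * severity * detectability

# Invented failure modes for a biopharmaceutical process step
failure_modes = {
    "wrong buffer pH": rpn(3, 8, 2),
    "filter integrity breach": rpn(2, 9, 4),
    "temperature excursion": rpn(5, 6, 3),
}

# Rank failure modes by descending RPN, highest risk first
for mode, score in sorted(failure_modes.items(), key=lambda kv: -kv[1]):
    print(f"{mode}: RPN = {score}")
```

Ranking by RPN is what lets process parameters be ordered by importance so that the high-risk variables can be selected for process development, characterization, or validation.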
Abstract:
The original cefepime product was withdrawn from the Swiss market in January 2007 and replaced by a generic 10 months later. The goals of the study were to assess the impact of this cefepime shortage on the use and costs of alternative broad-spectrum antibiotics, on antibiotic policy, and on resistance of Pseudomonas aeruginosa toward carbapenems, ceftazidime, and piperacillin-tazobactam. A generalized regression-based interrupted time series model assessed how much the shortage changed the monthly use and costs of cefepime and of selected alternative broad-spectrum antibiotics (ceftazidime, imipenem-cilastatin, meropenem, piperacillin-tazobactam) in 15 Swiss acute care hospitals from January 2005 to December 2008. Resistance of P. aeruginosa was compared before and after the cefepime shortage. There was a statistically significant increase in the consumption of piperacillin-tazobactam in hospitals with definitive interruption of cefepime supply and of meropenem in hospitals with transient interruption of cefepime supply. Consumption of each alternative antibiotic tended to increase during the cefepime shortage and to decrease when the cefepime generic was released. These shifts were associated with significantly higher overall costs. There was no significant change in hospitals with uninterrupted cefepime supply. The alternative antibiotics for which an increase in consumption showed the strongest association with a progression of resistance were the carbapenems. The use of alternative antibiotics after cefepime withdrawal was associated with a significant increase in piperacillin-tazobactam and meropenem use and in overall costs and with a decrease in susceptibility of P. aeruginosa in hospitals. This warrants caution with regard to shortages and withdrawals of antibiotics.
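The interrupted time-series model used above can be sketched as a segmented regression with a level-change and a slope-change term at the interruption date. The sketch below uses simulated monthly data and plain least squares; the variable names, simulated numbers, and simple model form are assumptions for illustration and the published model may include additional covariates:

```python
# Minimal segmented-regression sketch of an interrupted time series:
# use ~ intercept + trend + level_change*after + slope_change*months_after
# (simulated data; the actual study model and covariates may differ).
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(48)                   # e.g. Jan 2005 - Dec 2008
after = (months >= 24).astype(float)     # interruption begins at month 24
months_after = np.where(after == 1, months - 24, 0.0)

# Simulated monthly consumption with a jump and a steeper slope post-interruption
use = 100 + 0.5 * months + 20 * after + 1.5 * months_after + rng.normal(0, 3, 48)

# Ordinary least squares fit of the four segmented-regression terms
X = np.column_stack([np.ones_like(months, dtype=float), months, after, months_after])
beta, *_ = np.linalg.lstsq(X, use, rcond=None)
print(dict(zip(["intercept", "trend", "level_change", "slope_change"], beta)))
```

The `level_change` and `slope_change` coefficients capture, respectively, the immediate jump in consumption when the shortage starts and the change in the monthly trend afterwards, which is the kind of shift the study tested for each alternative antibiotic.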
Abstract:
The use of self-etch primers has increased steadily because of their time savings and greater simplicity; however, overall benefits, potential disadvantages and harms have not been assessed systematically. In this study, we reviewed randomized controlled trials to compare the risk of attachment failure, bonding time, and demineralization adjacent to attachments between 1-stage (self-etch) and 2-stage (acid-etch) bonding in orthodontic patients over a minimum follow-up period of 12 months.
Abstract:
American College of Cardiology/American Heart Association guidelines for the diagnosis and management of heart failure recommend investigating exacerbating conditions such as thyroid dysfunction, but without specifying the impact of different thyroid-stimulating hormone (TSH) levels. Limited prospective data exist on the association between subclinical thyroid dysfunction and heart failure events.
Abstract:
Quality of life is an important outcome in the treatment of patients with schizophrenia. It has been suggested that patients' quality of life ratings (referred to as subjective quality of life, SQOL) might be too heavily influenced by symptomatology to be a valid independent outcome criterion. There has been only limited evidence on the association of symptom change and changes in SQOL over time. This study aimed to examine the association between changes in symptoms and in SQOL among patients with schizophrenia. A pooled data set was obtained from eight longitudinal studies that had used the Brief Psychiatric Rating Scale (BPRS) for measuring psychiatric symptoms and either the Lancashire Quality of Life Profile or the Manchester Short Assessment of Quality of Life for assessing SQOL. The sample comprised 886 patients with schizophrenia. After controlling for heterogeneity of findings across studies using linear mixed models, a reduction in psychiatric symptoms was associated with improvements in SQOL scores. In univariate analyses, changes in all BPRS subscales were associated with changes in SQOL scores. In a multivariate model, only associations between changes in the BPRS depression/anxiety and hostility subscales and changes in SQOL remained significant, with 5% and 0.5% of the variance in SQOL changes being attributable to changes in depression/anxiety and hostility respectively. All BPRS subscales together explained 8.5% of variance. The findings indicate that SQOL changes are influenced by symptom change, in particular in depression/anxiety. The level of influence is limited and may not compromise using SQOL as an independent outcome measure.
Abstract:
Objective: We compare the prognostic strength of the lymph node ratio (LNR), positive lymph nodes (+LNs) and collected lymph nodes (LNcoll) using a time-dependent analysis in colorectal cancer patients stratified by mismatch repair (MMR) status. Method: 580 stage III-IV patients were included. Multivariable Cox regression analysis and time-dependent receiver operating characteristic (tROC) curve analysis were performed. The area under the curve (AUC) over time was compared for the three features. Results were validated on a second cohort of 105 stage III-IV patients. Results: The AUC for the LNR was 0.71 and outperformed +LNs and LNcoll by 10-15% in both MMR-proficient and MMR-deficient cancers. LNR and +LNs were both significant (p<0.0001) in multivariable analysis, but the effect was considerably stronger for the LNR [LNR: HR=5.18 (95% CI: 3.5-7.6); +LNs: HR=1.06 (95% CI: 1.04-1.08)]. Similar results were obtained for patients with >12 LNcoll. An optimal cut-off score of LNR=0.231 was validated on the second cohort (p<0.001). Conclusion: The LNR outperforms +LNs and LNcoll even in patients with >12 LNcoll. Its clinical value is not confounded by MMR status. A cut-off score of 0.231 may best stratify patients into prognostic subgroups and could be a basis for future prospective analysis of the LNR.
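The LNR itself is a simple quotient, and the reported 0.231 threshold splits patients into two prognostic subgroups. A minimal sketch; the 0.231 cut-off is the value reported in the abstract, while the function names, group labels, and example patients are illustrative assumptions:

```python
# Lymph node ratio (LNR) = positive lymph nodes (+LNs) / collected lymph
# nodes (LNcoll). The 0.231 cut-off is the threshold reported in the
# abstract; patient numbers and group labels below are illustrative.

def lymph_node_ratio(positive, collected):
    """LNR for one patient; requires at least one collected node."""
    if collected <= 0:
        raise ValueError("need at least one collected lymph node")
    if not 0 <= positive <= collected:
        raise ValueError("positive nodes must be between 0 and collected")
    return positive / collected

def risk_group(positive, collected, cutoff=0.231):
    """Stratify a patient into a prognostic subgroup at the LNR cut-off."""
    return "high risk" if lymph_node_ratio(positive, collected) > cutoff else "low risk"

print(risk_group(2, 15))   # LNR = 2/15, below the cut-off
print(risk_group(6, 14))   # LNR = 6/14, above the cut-off
```

Note how the ratio stays meaningful even when >12 nodes are collected, which is why the abstract reports that the LNR keeps outperforming the raw +LNs count in that subgroup.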
Abstract:
The rotational nature of shifting cultivation poses several challenges to its detection by remote sensing. Consequently, there is a lack of spatial data on the dynamics of shifting cultivation landscapes at a regional (i.e. sub-national) or national level. We present an approach based on a time series of Landsat and MODIS data and landscape metrics to delineate the dynamics of shifting cultivation landscapes. Our results reveal that shifting cultivation is a land use system still widely and dynamically utilized in northern Laos. While there is an overall reduction in the areas dominated by shifting cultivation, some regions also show an expansion. A review of relevant reports and articles indicates that policies tend to lead to a reduction, while market forces can result in both expansion and reduction. For a better understanding of the different factors affecting shifting cultivation landscapes in Laos, further research should focus on spatially explicit analyses.