158 results for Central venous catheter-associated bloodstream infections


Relevance: 100.00%

Abstract:

It has been established that mixed venous oxygen saturation (SvO2) reflects the balance between systemic oxygen delivery and consumption. Literature indicates that it is a valuable clinical indicator with good prognostic value early in the patient's course. This article aims to establish the usefulness of SvO2 as a clinical indicator. A secondary aim was to determine whether central venous oxygen saturation (ScvO2) and SvO2 are interchangeable. Of particular relevance to cardiac nurses is the link between decreased SvO2 and cardiac failure in patients with myocardial infarction, and with decline in myocardial function, clinical shock and arrhythmias. While the absolute values of ScvO2 and SvO2 are not interchangeable, the two are equivalent in terms of tracking clinical course. Additionally, ScvO2 monitoring is a safer and less costly alternative to SvO2 monitoring. It can be concluded that continuous ScvO2 monitoring should be considered in patients at risk of haemodynamic instability.
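The balance the abstract describes is made explicit by the Fick principle, from which SvO2 can be expressed in terms of arterial saturation, oxygen consumption, cardiac output and haemoglobin. A minimal sketch, with illustrative values that are not from the article:

```python
# Rearranged Fick principle: SvO2 = SaO2 - VO2 / (CO * Hb * 1.34 * 10)
# All input values below are illustrative, not from the article.

def svo2_from_fick(sao2, vo2, cardiac_output, hb):
    """Estimate mixed venous O2 saturation (as a fraction).

    sao2: arterial O2 saturation (fraction, e.g. 0.98)
    vo2: systemic O2 consumption (mL/min)
    cardiac_output: L/min
    hb: haemoglobin (g/dL); 1.34 mL O2 per g Hb; *10 converts dL to L
    """
    o2_capacity = cardiac_output * hb * 1.34 * 10  # mL O2/min at 100% saturation
    return sao2 - vo2 / o2_capacity

# A healthy adult sits around SvO2 0.70-0.75:
print(svo2_from_fick(sao2=0.98, vo2=250, cardiac_output=5.0, hb=14))  # ~0.71
# Falling cardiac output with unchanged consumption drives SvO2 down:
print(svo2_from_fick(sao2=0.98, vo2=250, cardiac_output=3.0, hb=14))  # ~0.54
```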

Relevance: 100.00%

Abstract:

Objectives Hospital-acquired bloodstream infections are known to increase the risk of death and prolong hospital stay, but precise estimates of these two important outcomes from well-designed studies are rare, particularly for non-intensive care unit (ICU) patients. We aimed to calculate accurate estimates, which are vital for estimating the economic costs of hospital-acquired bloodstream infections.

Relevance: 100.00%

Abstract:

To the Editor—In a recent review article in Infection Control and Hospital Epidemiology, Umscheid et al1 summarized published data on incidence rates of catheter-associated bloodstream infection (CABSI), catheter-associated urinary tract infection (CAUTI), surgical site infection (SSI), and ventilator-associated pneumonia (VAP); estimated how many cases are preventable; and calculated the savings in hospital costs and lives that would result from preventing all preventable cases. Providing these estimates to policy makers, political leaders, and health officials helps to galvanize their support for infection prevention programs. Our concern is that important limitations of the published studies on which Umscheid and colleagues built their findings are incompletely addressed in this review. More attention needs to be drawn to the techniques applied to generate these estimates...

Relevance: 100.00%

Abstract:

Introduction Vascular access devices (VADs), such as peripheral or central venous catheters, are vital across all medical and surgical specialties. To allow therapy or haemodynamic monitoring, VADs frequently require administration sets (AS) composed of infusion tubing, fluid containers, pressure-monitoring transducers and/or burettes. While VADs are replaced only when necessary, AS are routinely replaced every 3–4 days in the belief that this reduces infectious complications. Strong evidence supports AS use up to 4 days, but there is less evidence for AS use beyond 4 days. AS replacement twice weekly increases hospital costs and workload. Methods and analysis This is a pragmatic, multicentre, randomised controlled trial (RCT) of equivalence design comparing AS replacement at 4 (control) versus 7 (experimental) days. Randomisation is stratified by site and device, centrally allocated and concealed until enrolment. 6554 adult/paediatric patients with a central venous catheter, peripherally inserted central catheter or peripheral arterial catheter will be enrolled over 4 years. The primary outcome is VAD-related bloodstream infection (BSI) and secondary outcomes are VAD colonisation, AS colonisation, all-cause BSI, all-cause mortality, number of AS per patient, VAD time in situ and costs. Relative incidence rates of VAD-BSI per 100 devices and hazard rates per 1000 device days (95% CIs) will summarise the impact of 7-day relative to 4-day AS use and test equivalence. Kaplan-Meier survival curves (with log-rank Mantel-Cox test) will compare VAD-BSI over time. Appropriate parametric or non-parametric techniques will be used to compare secondary end points. p values of <0.05 will be considered significant.
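The protocol's two summary measures are simple rates; a minimal sketch of how they might be computed, using made-up counts since the trial results do not yet exist:

```python
import math

# Hypothetical counts for one trial arm -- illustrative only, not trial results.
n_bsi, n_devices, device_days = 12, 3277, 29000

rate_per_100_devices = 100 * n_bsi / n_devices
rate_per_1000_days = 1000 * n_bsi / device_days

# Approximate 95% CI for a Poisson rate: exp(log(rate) +/- 1.96/sqrt(events))
half_width = 1.96 / math.sqrt(n_bsi)
ci = (rate_per_1000_days * math.exp(-half_width),
      rate_per_1000_days * math.exp(half_width))

print(f"{rate_per_100_devices:.2f} VAD-BSI per 100 devices")
print(f"{rate_per_1000_days:.2f} per 1000 device-days, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```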

Relevance: 100.00%

Abstract:

Background International standard practice for the correct confirmation of the central venous access device is the chest X-ray. The intracavitary electrocardiogram-based insertion method is radiation-free, and allows real-time placement verification, providing immediate treatment and reduced requirement for post-procedural repositioning. Methods Relevant databases were searched for prospective randomised controlled trials (RCTs) or quasi RCTs that compared the effectiveness of electrocardiogram-guided catheter tip positioning with placement using surface-anatomy-guided insertion plus chest X-ray confirmation. The primary outcome was accurate catheter tip placement. Secondary outcomes included complications, patient satisfaction and costs. Results Five studies involving 729 participants were included. Electrocardiogram-guided insertion was more accurate than surface-anatomy-guided insertion (odds ratio: 8.3; 95% confidence interval (CI) 1.38 to 50.07; p=0.02). There was a lack of reporting on complications, patient satisfaction and costs. Conclusion The evidence suggests that intracavitary electrocardiogram-based positioning is superior to surface-anatomy-guided positioning of central venous access devices, leading to significantly more successful placements. This technique could potentially remove the requirement for post-procedural chest X-ray, especially during peripherally inserted central catheter (PICC) line insertion.
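For reference, an odds ratio and its 95% CI are conventionally computed on the log scale from a 2×2 table; a minimal sketch with hypothetical single-study counts, not the review's pooled data:

```python
import math

# Hypothetical 2x2 counts: (accurate, inaccurate) placements per arm.
ecg_ok, ecg_fail = 360, 8    # ECG-guided insertion
sa_ok, sa_fail = 310, 51     # surface-anatomy-guided insertion

odds_ratio = (ecg_ok * sa_fail) / (ecg_fail * sa_ok)
# Standard error of log(OR): sqrt of the sum of reciprocals of all four cells
se_log_or = math.sqrt(1/ecg_ok + 1/ecg_fail + 1/sa_ok + 1/sa_fail)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR {odds_ratio:.1f}, 95% CI {lo:.2f} to {hi:.2f}")
```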

Relevance: 100.00%

Abstract:

Introduction: Some types of antimicrobial-coated central venous catheters (A-CVC) have been shown to be cost-effective in preventing catheter-related bloodstream infection (CR-BSI). However, not all types have been evaluated, and there are concerns over the quality and usefulness of these earlier studies. There is uncertainty amongst clinicians over which, if any, antimicrobial-coated central venous catheters to use. We re-evaluated the cost-effectiveness of all commercially available antimicrobial-coated central venous catheters for prevention of catheter-related bloodstream infection in adult intensive care unit (ICU) patients. Methods: We used a Markov decision model to compare the cost-effectiveness of antimicrobial-coated central venous catheters relative to uncoated catheters. Four catheter types were evaluated: minocycline and rifampicin (MR)-coated catheters; silver, platinum and carbon (SPC)-impregnated catheters; and two chlorhexidine and silver sulfadiazine-coated catheters, one coated on the external surface (CH/SSD (ext)) and the other coated on both surfaces (CH/SSD (int/ext)). The incremental cost per quality-adjusted life-year gained and the expected net monetary benefits were estimated for each. Uncertainty arising from data estimates, data quality and heterogeneity was explored in sensitivity analyses. Results: The baseline analysis, with no consideration of uncertainty, indicated all four types of antimicrobial-coated central venous catheters were cost-saving relative to uncoated catheters. Minocycline and rifampicin-coated catheters prevented 15 infections per 1,000 catheters and generated the greatest health benefits, 1.6 quality-adjusted life-years, and cost-savings, AUD $130,289. After considering uncertainty in the current evidence, the minocycline and rifampicin-coated catheters returned the highest incremental monetary net benefits of $948 per catheter; but there was a 62% probability of error in this conclusion. Although the minocycline and rifampicin-coated catheters had the highest monetary net benefits across multiple scenarios, the decision was always associated with high uncertainty. Conclusions: Current evidence suggests that the cost-effectiveness of using antimicrobial-coated central venous catheters within the ICU is highly uncertain. Policies to prevent catheter-related bloodstream infection amongst ICU patients should consider the cost-effectiveness of competing interventions in the light of this uncertainty. Decision makers would do well to consider the current gaps in knowledge and the complexity of producing good quality evidence in this area.
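The decision rule behind the reported figures is the incremental net monetary benefit. A minimal sketch, using the paper's quoted baseline QALY and cost figures but an assumed willingness-to-pay threshold, so the output is illustrative rather than a reproduction of the $948 result, which comes from the full probabilistic Markov model:

```python
# Incremental net monetary benefit (INMB) per catheter:
#   INMB = lambda * (incremental QALYs) - (incremental cost)
# where lambda is the willingness-to-pay per QALY. The AUD 50,000 threshold
# below is an assumption for illustration, not the paper's input.

def inmb_per_catheter(delta_qalys, delta_cost_aud, wtp_per_qaly=50_000):
    """Positive INMB favours the coated catheter."""
    return wtp_per_qaly * delta_qalys - delta_cost_aud

# Paper's baseline for MR-coated catheters, scaled from per-1,000-catheter
# figures: 1.6 QALYs gained, AUD 130,289 saved (i.e. a negative cost).
print(inmb_per_catheter(1.6 / 1000, -130_289 / 1000))  # ~AUD 210 per catheter
```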

Relevance: 100.00%

Abstract:

BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between group difference in the CRBSI rate (clinically-indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically-indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically-indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically-indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
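The crude risk-ratio arithmetic behind these comparisons can be sketched directly from the quoted event counts; note that the published RRs are Mantel-Haenszel estimates pooled across trials, so the crude values on the summed counts differ slightly:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    """Crude risk ratio with a 95% CI computed on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for a single 2x2 table
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
    return rr, lo, hi

# CRBSI: clinically-indicated 1/2365 vs routine change 2/2441
print(risk_ratio(1, 2365, 2, 2441))      # crude RR ~0.52 (pooled: 0.61)
# Phlebitis: clinically-indicated 186/2365 vs 3-day change 166/2441
print(risk_ratio(186, 2365, 166, 2441))  # crude RR ~1.16 (pooled: 1.14)
```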

Relevance: 100.00%

Abstract:

Catheter-associated urinary tract infections (CAUTI) are a worldwide problem that may lead to increased patient morbidity, cost and mortality.1-3 The literature is divided on whether there are real effects from CAUTI on length of stay or mortality. Platt4 found the costs and mortality risks to be large, yet Graves et al5 found the opposite. A review of the published estimates of the extra length of stay showed results between zero and 30 days.6 The differences in estimates may have been caused by the different epidemiological methods applied. Accurately estimating the effects of CAUTI is difficult because infection is a time-dependent exposure. This means that standard statistical techniques, such as matched case-control studies, tend to overestimate the increased hospital stay and mortality risk due to infection. The aim of the study was to estimate excess length of stay and mortality in an intensive care unit (ICU) due to a CAUTI, using a statistical model that accounts for the timing of infection. Data collected from ICUs in lower- and middle-income countries were used for this analysis.7,8 There has been little research for these settings, hence the need for this paper.
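The overestimation the authors describe is easy to demonstrate by simulation: when infection can only be acquired after some days at risk, a naive comparison of infected versus uninfected patients attributes the waiting time to the infection itself. A minimal sketch with assumed distributions, not the study's data:

```python
import random

random.seed(1)

# Simulate ICU stays where infection has NO true effect on length of stay,
# but can only be acquired after at least 5 days at risk.
stays, infected = [], []
for _ in range(100_000):
    los = random.expovariate(1 / 8)              # mean stay: 8 days
    got_cauti = los > 5 and random.random() < 0.3
    stays.append(los)
    infected.append(got_cauti)

mean = lambda xs: sum(xs) / len(xs)
los_inf = mean([s for s, i in zip(stays, infected) if i])
los_not = mean([s for s, i in zip(stays, infected) if not i])

# The naive comparison reports several extra days despite a true excess of 0.
print(f"naive excess LOS: {los_inf - los_not:.1f} days (true effect: 0)")
```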

Relevance: 100.00%

Abstract:

Background Less invasive methods of determining cardiac output are now readily available. The indicator dilution technique, for example, has made it easier to measure cardiac output continuously because it uses the existing intra-arterial line. This removes the need for a pulmonary artery flotation catheter, but with it goes the ability to measure left atrial and left ventricular work indices, and to monitor and measure mixed venous saturation (SvO2). Purpose The aim of this paper is to put forward the notion that SvO2 provides valuable information about oxygen consumption and venous reserve; important measures in the critically ill to ensure oxygen supply meets cellular demand. A simplified example of the septic patient is offered to highlight the changing pathophysiological sequelae of the inflammatory process and the importance of monitoring SvO2. Relevance to clinical practice SvO2 monitoring, it could be argued, provides the gold standard for assessing arterial and venous oxygen indices in the critically ill. For the bedside ICU nurse, the wealth of information inherent in SvO2 monitoring provides data that can assist in averting potential problems with oxygen delivery and consumption. However, it has been suggested that central venous saturation (ScvO2) might be an attractive alternative to SvO2 because it is less invasive and a sample is easier to obtain for analysis. The problems with this approach concern where the catheter tip is sited and the nature of the venous admixture at that site. Studies have shown that ScvO2 is less accurate than SvO2 and should not be used as a sole guiding variable for decision-making. These studies have demonstrated an unacceptably wide variance between ScvO2 and SvO2 that depends on the presenting disease; in some cases SvO2 will be significantly lower than ScvO2. Conclusion Whilst newer technologies have been developed to continuously measure cardiac output, SvO2 monitoring is still an important adjunct to clinical decision-making in the ICU. Given the information that it provides, seeking alternatives such as ScvO2 or blood samples obtained from femorally placed central venous lines can lead to inappropriate treatment being given or withheld. Instead, when using ScvO2, trending of this variable should provide clinical determinants that are usable for the bedside ICU nurse, remembering that in most conditions SvO2 will be approximately 16% lower.

Relevance: 100.00%

Abstract:

Infective endocarditis (IE) is a life-threatening infection of the heart endothelium and valves. Staphylococcus aureus is a predominant cause of severe IE and is frequently associated with infections in health care settings and device-related infections. Multilocus sequence typing (MLST), spa typing, and virulence gene microarrays are frequently used to classify S. aureus clinical isolates. This study examined the utility of these typing tools to investigate S. aureus epidemiology associated with IE. Ninety-seven S. aureus isolates were collected from patients diagnosed with (i) IE, (ii) bloodstream infection related to medical devices, (iii) bloodstream infection not related to medical devices, and (iv) skin or soft-tissue infections. The MLST clonal complex (CC) for each isolate was determined and compared to the CCs of members of the S. aureus population by eBURST analysis. The spa type of all isolates was also determined. A null model was used to determine correlations of IE with CC and spa type. DNA microarray analysis was performed, and a permutational analysis of multivariate variance (PERMANOVA) and principal coordinates analysis were conducted to identify genotypic differences between IE and non-IE strains. CC12, CC20, and spa type t160 were significantly associated with IE S. aureus. A subset of virulence-associated genes and alleles, including genes encoding staphylococcal superantigen-like proteins, fibrinogen-binding protein, and a leukocidin subunit, also significantly correlated with IE isolates. MLST, spa typing, and microarray analysis are promising tools for monitoring S. aureus epidemiology associated with IE. Further research to determine a role for the S. aureus IE-associated virulence genes identified in this study is warranted.
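The null-model correlation test can be thought of as a permutation test: shuffle the IE/non-IE labels and ask how often a clonal complex is at least as over-represented among IE isolates as observed. A minimal sketch with hypothetical counts, not the study's data:

```python
import random

random.seed(0)

# Hypothetical data: 97 isolates, 1 = IE and 0 = non-IE, each flagged for
# membership of clonal complex 12. All counts below are assumptions.
labels = [1] * 26 + [0] * 71
is_cc12 = [True] * 8 + [False] * 18 + [True] * 6 + [False] * 65

observed = sum(c for l, c in zip(labels, is_cc12) if l == 1)  # CC12 among IE

n_perm, extreme = 10_000, 0
for _ in range(n_perm):
    shuffled = labels[:]
    random.shuffle(shuffled)  # break any real label/CC association
    if sum(c for l, c in zip(shuffled, is_cc12) if l == 1) >= observed:
        extreme += 1

print(f"observed CC12 among IE: {observed}; permutation p = {extreme / n_perm:.4f}")
```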

Relevance: 100.00%

Abstract:

Introduction Critical care patients frequently receive blood transfusions. Some reports show an association between aged or stored blood and increased morbidity and mortality, including the development of transfusion-related acute lung injury (TRALI). However, the existence of conflicting data endorses the need for research to either reject this association, or to confirm it and elucidate the underlying mechanisms. Methods Twenty-eight sheep were randomised into two groups, receiving saline or lipopolysaccharide (LPS). Sheep were further randomised to also receive transfusion of pooled and heat-inactivated supernatant from fresh (Day 1) or stored (Day 42) non-leucoreduced human packed red blood cells (PRBC) or an infusion of saline. TRALI was defined by hypoxaemia during or within two hours of transfusion and histological evidence of pulmonary oedema. Regression modelling compared physiology between groups, and to a previous study, using stored platelet concentrates (PLT). Samples of the transfused blood products also underwent cytokine array and biochemical analyses, and their neutrophil priming ability was measured in vitro. Results TRALI did not develop in sheep that first received saline-infusion. In contrast, 80% of sheep that first received LPS-infusion developed TRALI following transfusion with "stored PRBC." The decreased mean arterial pressure and cardiac output as well as increased central venous pressure and body temperature were more severe for TRALI induced by "stored PRBC" than by "stored PLT." Storage-related accumulation of several factors was demonstrated in both "stored PRBC" and "stored PLT", and was associated with increased in vitro neutrophil priming. Concentrations of several factors were higher in the "stored PRBC" than in the "stored PLT," however, there was no difference to neutrophil priming in vitro. Conclusions In this in vivo ovine model, both recipient and blood product factors contributed to the development of TRALI. Sick (LPS infused) sheep rather than healthy (saline infused) sheep predominantly developed TRALI when transfused with supernatant from stored but not fresh PRBC. "Stored PRBC" induced a more severe injury than "stored PLT" and had a different storage lesion profile, suggesting that these outcomes may be associated with storage lesion factors unique to each blood product type. Therefore, the transfusion of fresh rather than stored PRBC may minimise the risk of TRALI.

Relevance: 100.00%

Abstract:

The anticoagulant effect of apixaban is due to direct inhibition of FXa in the coagulation cascade. The main advantages apixaban has over current anticoagulant drugs are that it is active after oral administration and that its anticoagulant effect does not require monitoring. Apixaban has been compared to enoxaparin in the prevention of venous thromboembolism associated with knee and hip replacement, where it is as efficacious as enoxaparin but causes less bleeding. However, apixaban is not the only FXa inhibitor that could replace enoxaparin for this indication, as the FXa inhibitor rivaroxaban is as efficacious and safe as enoxaparin in preventing thromboembolism associated with these surgical procedures. Until the results of the AMPLIFY Phase III trial are known, it is too early to consider apixaban as an alternative to enoxaparin in symptomatic thromboembolism. Apixaban should not be used to prevent thromboembolism in immobilised medical patients or in acute coronary syndromes, as it causes excess bleeding in these conditions without benefit.

Relevance: 100.00%

Abstract:

Purpose The use of intravascular devices is associated with a number of potential complications. Despite a number of evidence-based clinical guidelines in this area, there continue to be discrepancies in nursing practice. This study aims to examine nursing practice in a cancer care setting, identifying current practice and areas for improvement relative to the best available evidence. Methods A point prevalence survey was undertaken in a tertiary cancer care centre in Queensland, Australia. On a randomly selected day, four nurses assessed intravascular device-related nursing practices and collected data using a standardized survey tool. Results 58 inpatients (100%) were assessed. Forty-eight (83%) had a device in situ, comprising 14 Peripheral Intravenous Catheters (29.2%), 14 Peripherally Inserted Central Catheters (29.2%), 14 Hickman catheters (29.2%) and six Port-a-Caths (12.5%). Suboptimal outcomes such as local site complications, incorrect/inadequate documentation, lack of flushing orders, and unclean/non-intact dressings were observed. Conclusions This study has highlighted a number of intravascular device-related nursing practice discrepancies compared with current hospital policy. Education and other implementation strategies can be applied to improve nursing practice, and repeating the survey on a regular basis thereafter will provide feedback to nursing staff and help refine those strategies. More research is required to inform clinical practice with regard to intravascular device-related consumables, flushing technique and protocols.

Relevance: 100.00%

Abstract:

Escherichia coli sequence type 131 (ST131) is a globally disseminated, multidrug resistant (MDR) clone responsible for a high proportion of urinary tract and bloodstream infections. The rapid emergence and successful spread of E. coli ST131 is strongly associated with several factors, including resistance to fluoroquinolones, high virulence gene content, the possession of the type 1 fimbriae FimH30 allele, and the production of the CTX-M-15 extended spectrum β-lactamase (ESBL). Here, we used genome sequencing to examine the molecular epidemiology of a collection of E. coli ST131 strains isolated from six distinct geographical locations across the world spanning 2000–2011. The global phylogeny of E. coli ST131, determined from whole-genome sequence data, revealed a single lineage of E. coli ST131 distinct from other extraintestinal E. coli strains within the B2 phylogroup. Three closely related E. coli ST131 sublineages were identified, with little association to geographic origin. The majority of single-nucleotide variants associated with each of the sublineages were due to recombination in regions adjacent to mobile genetic elements (MGEs). The most prevalent sublineage of ST131 strains was characterized by fluoroquinolone resistance, and a distinct virulence factor and MGE profile. Four different variants of the CTX-M ESBL–resistance gene were identified in our ST131 strains, with acquisition of CTX-M-15 representing a defining feature of a discrete but geographically dispersed ST131 sublineage. This study confirms the global dispersal of a single E. coli ST131 clone and demonstrates the role of MGEs and recombination in the evolution of this important MDR pathogen.