43 results for Intrabladder catheter


Relevance: 10.00%

Abstract:

BACKGROUND: Transcatheter closure of patent foramen ovale (PFO) has rapidly evolved as the preferred management strategy for the prevention of recurrent cerebrovascular events in patients with cryptogenic stroke and presumed paradoxical embolus. There are limited outcome data for patients treated with this therapy, particularly with the newer devices. METHODS: Data on 70 PFO closure procedures were collected prospectively from medical records and from catheter and echocardiography databases. RESULTS: The cohort consisted of 70 patients (mean age 43.6 years, range 19 to 77 years), of whom 51% were male. The indications for closure were cryptogenic cerebrovascular accident (CVA) or transient ischemic attack (TIA) in 64 patients (91%), peripheral emboli in two (2.8%), cryptogenic ST-elevation myocardial infarction in one (1.4%), refractory migraine in one (1.4%), decompression sickness in one (1.4%), and orthodeoxia in one (1.4%). All patients had demonstrated right-to-left shunting on bubble study. The procedures were guided by intracardiac echocardiography in 53%, transesophageal echocardiography in 39%, and transthoracic echocardiography alone in the remainder. Devices used were the Amplatzer PFO Occluder (AGA Medical; sizes 18-35 mm) in 49 patients (70%) and the Premere device (St. Jude Medical) in 21 (30%). In-hospital complications consisted of one significant groin hematoma with skin infection. Echocardiographic follow-up at 6 months revealed that most patients (98.6%) had no or trivial residual shunt, while one patient (1.4%) had a mild residual shunt. At a median follow-up of 11 months (range 1 month to 4.3 years), no patient experienced a further CVA/TIA or paradoxical embolic event. CONCLUSION: PFO causing presumed paradoxical embolism can be closed percutaneously with a low rate of significant residual shunting and very few complications. Recurrent index events are uncommon at medium-term (up to 4 years) follow-up.

Relevance: 10.00%

Abstract:

BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community-dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between-group difference in the CRBSI rate (clinically-indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically-indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39).
This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically-indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically-indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
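As a check on the crude numbers, the CRBSI risk ratio can be recomputed from the counts in the abstract. Note that the review's published RR of 0.61 (95% CI 0.08 to 4.68) comes from a pooled meta-analysis, so a single crude 2x2 calculation will not reproduce it exactly; the Python sketch below uses a standard log-scale Wald interval and is illustrative only.

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Crude risk ratio with a log-scale Wald 95% CI.

    events_a/n_a: events and total in the clinically-indicated arm.
    events_b/n_b: events and total in the routine-change arm.
    """
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for two independent binomial proportions.
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# CRBSI counts from the abstract: 1/2365 vs 2/2441.
rr, lo, hi = risk_ratio(1, 2365, 2, 2441)
print(f"crude RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

With one event in one arm and two in the other, the interval is very wide, which matches the abstract's point that the estimate carries substantial uncertainty.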

Relevance: 10.00%

Abstract:

Background Less invasive methods of determining cardiac output are now readily available. The indicator dilution technique, for example, has made continuous measurement of cardiac output easier because it uses the existing intra-arterial line. Gone, therefore, is the need for a pulmonary artery flotation catheter, and with it the ability to measure left atrial and left ventricular work indices, as well as the ability to monitor and measure mixed venous saturation (SvO2). Purpose The aim of this paper is to put forward the notion that SvO2 provides valuable information about oxygen consumption and venous reserve: important measures in the critically ill to ensure oxygen supply meets cellular demand. To illustrate this, a simplified example of the septic patient is offered to highlight the changing pathophysiological sequelae of the inflammatory process and the importance of monitoring SvO2. Relevance to clinical practice SvO2 monitoring, it could be argued, provides the gold standard for assessing arterial and venous oxygen indices in the critically ill. For the bedside ICU nurse, the wealth of information inherent in SvO2 monitoring could provide important data to assist in averting potential problems with oxygen delivery and consumption. However, it has been suggested that central venous saturation (ScvO2) might be an attractive alternative to SvO2 because it is less invasive and a sample is easier to obtain for analysis. There are problems with this approach, related to where the catheter tip is sited and the nature of the venous admixture at that site. Studies have shown that ScvO2 is less accurate than SvO2 and should not be used as a sole guiding variable for decision-making. These studies have demonstrated an unacceptably wide variance between ScvO2 and SvO2 that depends on the presenting disease; in some cases SvO2 will be significantly lower than ScvO2.
Conclusion Whilst newer technologies have been developed to continuously measure cardiac output, SvO2 monitoring is still an important adjunct to clinical decision-making in the ICU. Given the information that it provides, seeking alternatives such as ScvO2, or blood samples obtained from femorally placed central venous lines, can lead to inappropriate treatment being given or withheld. Instead, when using ScvO2, trending of this variable should provide clinical determinants that are usable for the bedside ICU nurse, remembering that in most conditions SvO2 will be approximately 16% lower.

Relevance: 10.00%

Abstract:

To the Editor—In a recent review article in Infection Control and Hospital Epidemiology, Umscheid et al1 summarized published data on incidence rates of catheter-associated bloodstream infection (CABSI), catheter-associated urinary tract infection (CAUTI), surgical site infection (SSI), and ventilator-associated pneumonia (VAP); estimated how many cases are preventable; and calculated the savings in hospital costs and lives that would result from preventing all preventable cases. Providing these estimates to policy makers, political leaders, and health officials helps to galvanize their support for infection prevention programs. Our concern is that important limitations of the published studies on which Umscheid and colleagues built their findings are incompletely addressed in this review. More attention needs to be drawn to the techniques applied to generate these estimates...

Relevance: 10.00%

Abstract:

Establishment of asymptomatic bacteriuria (ABU) with Escherichia coli 83972 is a viable prophylactic alternative to antibiotic therapy for the prevention of recurrent bacterial urinary tract infection in humans. Approximately 2 × 10⁸ viable E. coli 83972 cells were introduced into the bladder of six healthy female dogs via a sterile urinary catheter. The presence of pyuria, depression, stranguria, pollakiuria and haematuria was documented for 6 weeks and urinalysis and aerobic bacterial cultures were performed every 24–72 h. Pyuria was present in all dogs on day 1 post-inoculation and 4/6 dogs (67%) had a positive urine culture on this day. Duration of colonization ranged from 0 to 10 days (median 4 days). Four dogs were re-inoculated on day 20. Duration of colonization following the second inoculation ranged from 1 to 3 days. No dog suffered pyrexia or appeared systemically unwell but all dogs initially exhibited mild pollakiuria and a small number displayed gross haematuria and/or stranguria. By day 3 of each trial all clinical signs had resolved. Persistent bacteriuria was not achieved in any dog but two dogs were colonized for 10 days following a single inoculation. Further research is required to determine whether establishment of ABU in dogs with recurrent urinary tract infection is a viable alternative to repeated doses of antimicrobial agents.

Relevance: 10.00%

Abstract:

Background Catheter-associated urinary tract infection (CAUTI) is the most common nosocomial infection in the United States and is caused by a range of uropathogens. Biofilm formation by uropathogens that cause CAUTI is often mediated by cell surface structures such as fimbriae. In this study, we characterised the genes encoding type 3 fimbriae from CAUTI strains of Escherichia coli, Klebsiella pneumoniae, Klebsiella oxytoca, Citrobacter koseri and Citrobacter freundii. Results Phylogenetic analysis of the type 3 fimbrial genes (mrkABCD) from 39 strains revealed they clustered into five distinct clades (A-E) ranging from one to twenty-three members. The majority of sequences grouped in clade A, which was represented by the mrk gene cluster from the genome-sequenced K. pneumoniae MGH78578. The E. coli and K. pneumoniae mrkABCD gene sequences clustered together in two distinct clades, supporting previous evidence for the occurrence of inter-genera lateral gene transfer. All of the strains examined caused type 3 fimbriae-mediated agglutination of tannic acid-treated human erythrocytes despite sequence variation in the mrkD-encoding adhesin gene. Type 3 fimbriae deletion mutants were constructed in 13 representative strains and were used to demonstrate a direct role for type 3 fimbriae in biofilm formation. Conclusions The expression of functional type 3 fimbriae is common to many Gram-negative pathogens that cause CAUTI and is strongly associated with biofilm growth. Our data provide additional evidence for the spread of type 3 fimbrial genes by lateral gene transfer. Further work is now required to substantiate the clade structure reported here by examining more strains, as well as other bacterial genera that make type 3 fimbriae and cause CAUTI.

Relevance: 10.00%

Abstract:

Urinary tract infections (UTIs) are among the most common infectious diseases of humans, with Escherichia coli being responsible for >80% of all cases. Asymptomatic bacteriuria (ABU) occurs when bacteria colonize the urinary tract without causing clinical symptoms and can affect both catheterized patients (catheter-associated ABU [CA-ABU]) and noncatheterized patients. Here, we compared the virulence properties of a collection of ABU and CA-ABU nosocomial E. coli isolates in terms of antibiotic resistance, phylogenetic grouping, specific UTI-associated virulence genes, hemagglutination characteristics, and biofilm formation. CA-ABU isolates were similar to ABU isolates with regard to the majority of these characteristics; exceptions were that CA-ABU isolates had a higher prevalence of the polysaccharide capsule marker genes kpsMT II and kpsMT K1, while more ABU strains were capable of mannose-resistant hemagglutination. To examine biofilm growth in detail, we performed a global gene expression analysis with two CA-ABU strains that formed a strong biofilm and that possessed a limited adhesin repertoire. The gene expression profile of the CA-ABU strains during biofilm growth showed considerable overlap with that previously described for the prototype ABU E. coli strain, 83972. This is the first global gene expression analysis of E. coli CA-ABU strains. Overall, our data suggest that nosocomial ABU and CA-ABU E. coli isolates possess similar virulence profiles.

Relevance: 10.00%

Abstract:

Objective.  Leconotide (CVID, AM336, CNSB004) is an omega conopeptide similar to ziconotide, which blocks voltage-sensitive calcium channels. However, unlike ziconotide, which must be administered intrathecally, leconotide can be given intravenously because it is less toxic. This study investigated the antihyperalgesic potency of leconotide given intravenously, alone and in combination with morphine administered intraperitoneally, in a rat model of bone cancer pain. Design.  Syngeneic rat prostate cancer cells AT3B-1 were injected into one tibia of male Wistar rats. The tumor expanded within the bone, causing hyperalgesia to heat applied to the ipsilateral hind paw. Measurements were made of the maximum dose (MD) of morphine and leconotide, given alone and in combinations, that caused no effect in an open-field activity monitor, on the rotarod, or on blood pressure and heart rate. Paw withdrawal thresholds from noxious heat were measured. Dose-response curves for morphine (0.312–5.0 mg/kg intraperitoneal) and leconotide (0.002–200 µg/kg intravenous) given alone were plotted and responses compared with those caused by morphine and leconotide in combinations. Results.  Leconotide caused minimal antihyperalgesic effects when administered alone. Morphine given alone intraperitoneally caused dose-related antihyperalgesic effects (ED50 = 2.40 ± 1.24 mg/kg), which were increased by coadministration of leconotide 20 µg/kg (morphine ED50 = 0.16 ± 1.30 mg/kg); 0.2 µg/kg (morphine ED50 = 0.39 ± 1.27 mg/kg); and 0.02 µg/kg (morphine ED50 = 1.24 ± 1.30 mg/kg). Conclusions.  Leconotide caused a significant increase in reversal by morphine of the bone cancer-induced hyperalgesia without increasing the side effect profile of either drug. Clinical Implication.  Translation into clinical practice of the method of analgesia described here will improve the quantity and quality of analgesia in patients with bone metastases.
The use of an ordinary parenteral route for administration of the calcium channel blocker (leconotide) at low dose opens up the technique to large numbers of patients who could not have an intrathecal catheter for drug administration. Furthermore, the potentiating synergistic effect with morphine on hyperalgesia without increased side effects will lead to greater analgesia with improved quality of life.

Relevance: 10.00%

Abstract:

Introduction Vascular access devices (VADs), such as peripheral or central venous catheters, are vital across all medical and surgical specialties. To allow therapy or haemodynamic monitoring, VADs frequently require administration sets (AS) composed of infusion tubing, fluid containers, pressure-monitoring transducers and/or burettes. While VADs are replaced only when necessary, AS are routinely replaced every 3–4 days in the belief that this reduces infectious complications. Strong evidence supports AS use up to 4 days, but there is less evidence for AS use beyond 4 days. AS replacement twice weekly increases hospital costs and workload. Methods and analysis This is a pragmatic, multicentre, randomised controlled trial (RCT) of equivalence design comparing AS replacement at 4 (control) versus 7 (experimental) days. Randomisation is stratified by site and device, centrally allocated and concealed until enrolment. 6554 adult/paediatric patients with a central venous catheter, peripherally inserted central catheter or peripheral arterial catheter will be enrolled over 4 years. The primary outcome is VAD-related bloodstream infection (BSI) and secondary outcomes are VAD colonisation, AS colonisation, all-cause BSI, all-cause mortality, number of AS per patient, VAD time in situ and costs. Relative incidence rates of VAD-BSI per 100 devices and hazard rates per 1000 device days (95% CIs) will summarise the impact of 7-day relative to 4-day AS use and test equivalence. Kaplan-Meier survival curves (with log rank Mantel-Cox test) will compare VAD-BSI over time. Appropriate parametric or non-parametric techniques will be used to compare secondary end points. P values of <0.05 will be considered significant.

Relevance: 10.00%

Abstract:

Background International standard practice for confirming correct placement of a central venous access device is the chest X-ray. The intracavitary electrocardiogram-based insertion method is radiation-free and allows real-time placement verification, providing immediate treatment and a reduced requirement for post-procedural repositioning. Methods Relevant databases were searched for prospective randomised controlled trials (RCTs) or quasi-RCTs that compared the effectiveness of electrocardiogram-guided catheter tip positioning with placement using surface-anatomy-guided insertion plus chest X-ray confirmation. The primary outcome was accurate catheter tip placement. Secondary outcomes included complications, patient satisfaction and costs. Results Five studies involving 729 participants were included. Electrocardiogram-guided insertion was more accurate than surface-anatomy-guided insertion (odds ratio: 8.3; 95% confidence interval (CI) 1.38 to 50.07; p=0.02). There was a lack of reporting on complications, patient satisfaction and costs. Conclusion The evidence suggests that intracavitary electrocardiogram-based positioning is superior to surface-anatomy-guided positioning of central venous access devices, leading to significantly more successful placements. This technique could potentially remove the requirement for a post-procedural chest X-ray, especially during peripherally inserted central catheter (PICC) line insertion.
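The abstract reports the pooled odds ratio but not the per-arm placement counts, so the Python sketch below uses hypothetical counts purely to illustrate how an odds ratio and its log-scale Wald confidence interval are computed; the numbers are not the trial data.

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a log-scale Wald 95% CI.

    a: accurate placements with ECG guidance
    b: inaccurate placements with ECG guidance
    c: accurate placements with surface-anatomy guidance
    d: inaccurate placements with surface-anatomy guidance
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR): sqrt of the sum of reciprocal cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (the review reports the
# pooled OR of 8.3 but the abstract does not give per-arm tallies).
or_, lo, hi = odds_ratio(350, 10, 320, 49)
```

With small failure counts in either arm, the reciprocal-cell-count term dominates the standard error, which is why the review's interval (1.38 to 50.07) is so wide despite a large point estimate.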

Relevance: 10.00%

Abstract:

Background Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly around the world. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence to suggest 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased cost. Objectives To assess the clinical effects (benefits and harms) of intermittent flushing with heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Search Methods The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials. Selection Criteria Randomised controlled trials that compared the efficacy of normal saline with heparin to prevent occlusion of long-term CVCs in infants and children aged up to 18 years were included. We excluded temporary CVCs and peripherally inserted central catheters (PICC). Data Collection and Analysis Two review authors independently assessed trial inclusion criteria and trial quality, and extracted data. Rate ratios were calculated for two outcome measures - occlusion of the CVC and central line-associated blood stream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter, or both, and other CVC-related complications such as dislocation of CVCs, other CVC site infections and thrombosis. Main Results Three trials with a total of 245 participants were included in this review.
The three trials directly compared the use of normal saline and heparin; however, all used different protocols for the standard and experimental arms, with different concentrations of heparin and different flush frequencies reported. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated blood stream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51; two studies, 229 participants; very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37; two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants). Authors' Conclusions The review found that there was not enough evidence to determine the effects of intermittent flushing with heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
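Rate ratios per 1000 catheter days, as reported above, are straightforward to compute once events and exposure time are known. The abstract gives only the pooled ratio (0.75), not the underlying counts, so the Python sketch below uses hypothetical counts chosen to yield the same point estimate; they are not the trial data.

```python
def rate_ratio(events_a, days_a, events_b, days_b, per=1000):
    """Incidence rates per `per` catheter days for two arms, plus their ratio.

    events_a/days_a: events and total catheter days in the saline arm.
    events_b/days_b: events and total catheter days in the heparin arm.
    """
    rate_a = events_a / days_a * per  # saline rate per 1000 catheter days
    rate_b = events_b / days_b * per  # heparin rate per 1000 catheter days
    return rate_a, rate_b, rate_a / rate_b

# Hypothetical counts for illustration only: 3 occlusions over 12,000
# catheter days (saline) vs 4 over 12,000 catheter days (heparin).
saline_rate, heparin_rate, rr = rate_ratio(3, 12000, 4, 12000)
```

Because the exposure denominators are person-time rather than patient counts, this is an incidence rate ratio, not a risk ratio; the two coincide only when follow-up time is equal across arms.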

Relevance: 10.00%

Abstract:

Background Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin and evidence to suggest normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs. Objectives To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents. Design A Cochrane systematic review of randomised controlled trials was undertaken. Data Sources The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken. Review Methods Data were extracted and appraisal undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes: occlusion of the CVC and CVC-associated blood stream infection. Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin. However, all used different protocols, with various concentrations of heparin and flush frequencies. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence).
The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). Conclusions It remains unclear whether heparin is necessary for CVC maintenance. More well-designed studies are required to answer this relatively simple but clinically important question. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.

Relevance: 10.00%

Abstract:

This chapter is about essential nursing care. Because it is often referred to as basic nursing, nurses may not always perceive it as deserving of priority. Yet, how well patients are cared for has a direct effect on their sense of wellbeing and their recovery. ‘Interventional patient hygiene’ is a systematic, evidence-based approach to nursing actions designed to improve patient outcomes using a framework of hygiene, catheter care, skin care, mobility and oral care.1 This chapter focuses on the physical care, infection control, preventative therapies and transport of critically ill patients. The first two areas are closely linked: poor-quality physical care increases the risk of infection. The final areas are essential features of critical care nursing.