101 results for peripheral intravenous catheter

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

Peripheral venous catheters (PVCs) are the simplest and most frequently used method for drug, fluid, and blood product administration in the hospital setting. It is estimated that up to 90% of patients in acute care hospitals require a PVC; however, PVCs are associated with inherent complications, which can be mechanical or infectious. A range of strategies have been used to prevent or reduce PVC-related complications, including optimizing patency through flushing, yet little is known about the current status of flushing practice. This observational study quantified preparation and administration time and assessed adherence to the principles of Aseptic Non-Touch Technique and organizational protocol for PVC flushing, using both manually prepared and prefilled syringes.

Relevance: 100.00%

Abstract:

AIM The aim of this evidence-based practice (EBP) project was to promote adherence to current best practice in the monitoring and optimal replacement of peripheral intravenous devices (PIVDs). METHODS This EBP project took place in a 30-bed acute general surgical ward. Twenty in-patients with a PIVD in situ for 4 days or more were recruited. There were five stages in the project: identification of the EBP topic, criteria, sample and setting; baseline audit; dissemination of baseline audit results and identification of best practice barriers; identification of barriers to EBP and implementation of strategies promoting EBP; and postimplementation audit. RESULTS There were eight criteria in this project. The first audit showed moderate compliance in PIVD monitoring and optimal replacement. The project identified three barriers: lack of awareness of current evidence-based guidelines, hospital policy that was not aligned with current guidelines, and the absence of a standard form of documentation. To overcome these barriers, the following strategies were used: audit and feedback, interactive educational meetings, reminders and hospital policy change. The second audit showed minor improvements in each criterion. Compliance with documentation remained a challenge, possibly because of the lack of standardised documentation. DISCUSSION Although the project did not deliver the results we aimed for, it was successful in highlighting current EBP in PIVD management. The major challenges of the project were time and the lack of opinion leaders in our project team. We felt that more time was needed to adapt to the practice change, and standardised documentation could not be developed in such a short time period. Further, the role of the opinion leader proved to be vital in this project; we felt that had we recruited more than one opinion leader, the results would have been different.

Relevance: 100.00%

Abstract:

Background Centers for Disease Control Guidelines recommend replacement of peripheral intravenous (IV) catheters every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bacteraemia. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. Objectives To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely.

Relevance: 100.00%

Abstract:

BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community-dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between-group difference in the CRBSI rate (clinically-indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically-indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically-indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically-indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
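As a rough illustration of where figures such as the RR and CI above come from, here is a minimal Python sketch of a crude, single-table risk ratio calculation using the combined CRBSI counts quoted in the abstract. It is not the review's stratified Mantel-Haenszel analysis, so it will not reproduce the published estimate of 0.61 exactly.

```python
# Sketch: crude, unstratified risk ratio with a Wald 95% CI on the log scale.
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs group B with a Wald confidence interval."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log_rr = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

# Clinically indicated 1/2365 vs routine change 2/2441 (counts from the abstract)
rr, lo, hi = risk_ratio(1, 2365, 2, 2441)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```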

Relevance: 80.00%

Abstract:

Purpose The use of intravascular devices is associated with a number of potential complications. Despite a number of evidence-based clinical guidelines in this area, there continue to be discrepancies in nursing practice. This study aims to examine nursing practice in a cancer care setting and to identify areas for improvement relative to the best available evidence. Methods A point prevalence survey was undertaken in a tertiary cancer care centre in Queensland, Australia. On a randomly selected day, four nurses assessed intravascular device-related nursing practices and collected data using a standardized survey tool. Results All 58 inpatients (100%) were assessed. Forty-eight (83%) had a device in situ, comprising 14 Peripheral Intravenous Catheters (29.2%), 14 Peripherally Inserted Central Catheters (29.2%), 14 Hickman catheters (29.2%) and six Port-a-Caths (12.4%). Suboptimal outcomes were observed, including local site complications, incorrect or inadequate documentation, lack of flushing orders, and unclean or non-intact dressings. Conclusions This study has highlighted a number of intravascular device-related nursing practice discrepancies compared with current hospital policy. Education and other implementation strategies can be applied to improve nursing practice. Following these education strategies, it will be valuable to repeat the survey on a regular basis to provide feedback to nursing staff and implement strategies to improve practice. More research is required to provide evidence for clinical practice with regard to intravascular device-related consumables, flushing technique and protocols.
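The survey above reports raw percentages only; as a hedged illustration, the sketch below attaches Wilson 95% confidence intervals to those point prevalence figures. The interval calculation is an add-on for illustration, not part of the published analysis.

```python
# Sketch: Wilson score intervals for the point prevalence figures quoted above.
import math

def wilson_ci(x, n, z=1.96):
    """Wilson score interval for a binomial proportion x/n."""
    p = x / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return p, centre - margin, centre + margin

# 48 of 58 inpatients had a device; device types are proportions of those 48.
for label, x, n in [("any device", 48, 58), ("PIVC", 14, 48), ("PICC", 14, 48),
                    ("Hickman", 14, 48), ("Port-a-Cath", 6, 48)]:
    p, lo, hi = wilson_ci(x, n)
    print(f"{label}: {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```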

Relevance: 80.00%

Abstract:

Introduction Vascular access devices (VADs), such as peripheral or central venous catheters, are vital across all medical and surgical specialties. To allow therapy or haemodynamic monitoring, VADs frequently require administration sets (AS) composed of infusion tubing, fluid containers, pressure-monitoring transducers and/or burettes. While VADs are replaced only when necessary, AS are routinely replaced every 3–4 days in the belief that this reduces infectious complications. Strong evidence supports AS use up to 4 days, but there is less evidence for AS use beyond 4 days. AS replacement twice weekly increases hospital costs and workload. Methods and analysis This is a pragmatic, multicentre, randomised controlled trial (RCT) of equivalence design comparing AS replacement at 4 (control) versus 7 (experimental) days. Randomisation is stratified by site and device, centrally allocated and concealed until enrolment. 6554 adult/paediatric patients with a central venous catheter, peripherally inserted central catheter or peripheral arterial catheter will be enrolled over 4 years. The primary outcome is VAD-related bloodstream infection (BSI) and secondary outcomes are VAD colonisation, AS colonisation, all-cause BSI, all-cause mortality, number of AS per patient, VAD time in situ and costs. Relative incidence rates of VAD-BSI per 100 devices and hazard rates per 1000 device days (95% CIs) will summarise the impact of 7-day relative to 4-day AS use and test equivalence. Kaplan-Meier survival curves (with log rank Mantel-Cox test) will compare VAD-BSI over time. Appropriate parametric or non-parametric techniques will be used to compare secondary end points. p Values of <0.05 will be considered significant.
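The protocol above summarises VAD-BSI as rates per 1000 device-days with 95% CIs. The sketch below shows one way such an incidence rate ratio could be computed under a Poisson assumption; the event counts, device-day totals and equivalence margin are invented placeholders, not trial data or the registered analysis plan.

```python
# Sketch: incidence rate ratio (7-day vs 4-day administration set use) per 1000
# device-days, assuming Poisson event counts. All numbers below are hypothetical.
import math

def incidence_rate_ratio(events_7d, days_7d, events_4d, days_4d, z=1.96):
    rate_7d = 1000 * events_7d / days_7d
    rate_4d = 1000 * events_4d / days_4d
    irr = rate_7d / rate_4d
    se_log = math.sqrt(1 / events_7d + 1 / events_4d)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return rate_7d, rate_4d, irr, lo, hi

r7, r4, irr, lo, hi = incidence_rate_ratio(12, 18000, 10, 17500)  # made-up counts
print(f"7-day: {r7:.2f}/1000 device-days; 4-day: {r4:.2f}/1000 device-days")
print(f"IRR = {irr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# Equivalence would be concluded only if the whole CI sat inside a pre-specified
# margin, e.g. 0.80-1.25 here (an assumed margin, not the trial's).
```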

Relevance: 30.00%

Abstract:

In his letter, Cunha suggests that oral antibiotic therapy is safer and less expensive than intravenous therapy via central venous catheters (CVCs) (1). The implication is that costs will fall and increased health benefits will be enjoyed, resulting in a gain in efficiency within the healthcare system. CVCs are often used in critically ill patients to deliver antimicrobial therapy, but expose patients to a risk of catheter-related bloodstream infection (CRBSI). Our current knowledge about the efficiency (i.e. cost-effectiveness) of allocating resources toward interventions that prevent CRBSI in patients requiring a CVC has already been reviewed (2). If for some patient groups antimicrobial therapy can be delivered orally, instead of through a CVC, then the costs and benefits of this alternative strategy should be evaluated...

Relevance: 30.00%

Abstract:

Objective. Leconotide (CVID, AM336, CNSB004) is an omega conopeptide similar to ziconotide, which blocks voltage-sensitive calcium channels. However, unlike ziconotide, which must be administered intrathecally, leconotide can be given intravenously because it is less toxic. This study investigated the antihyperalgesic potency of leconotide given intravenously, alone and in combination with morphine administered intraperitoneally, in a rat model of bone cancer pain. Design. Syngeneic rat prostate cancer cells AT3B-1 were injected into one tibia of male Wistar rats. The tumor expanded within the bone, causing hyperalgesia to heat applied to the ipsilateral hind paw. Measurements were made of the maximum dose (MD) of morphine and leconotide, given alone and in combination, that caused no effect in an open-field activity monitor, rotarod, and blood pressure and heart rate measurements. Paw withdrawal thresholds from noxious heat were measured. Dose-response curves for morphine (0.312–5.0 mg/kg intraperitoneal) and leconotide (0.002–200 µg/kg intravenous) given alone were plotted and responses compared with those caused by morphine and leconotide in combination. Results. Leconotide caused minimal antihyperalgesic effects when administered alone. Morphine given alone intraperitoneally caused dose-related antihyperalgesic effects (ED50 = 2.40 ± 1.24 mg/kg), which were increased by coadministration of leconotide at 20 µg/kg (morphine ED50 = 0.16 ± 1.30 mg/kg), 0.2 µg/kg (morphine ED50 = 0.39 ± 1.27 mg/kg) and 0.02 µg/kg (morphine ED50 = 1.24 ± 1.30 mg/kg). Conclusions. Leconotide caused a significant increase in reversal by morphine of the bone cancer-induced hyperalgesia without increasing the side effect profile of either drug. Clinical Implication. Translation into clinical practice of the method of analgesia described here will improve the quantity and quality of analgesia in patients with bone metastases. The use of an ordinary parenteral route for administration of the calcium channel blocker (leconotide) at low dose opens up the technique to large numbers of patients who could not have an intrathecal catheter for drug administration. Furthermore, the potentiating synergistic effect with morphine on hyperalgesia, without increased side effects, will lead to greater analgesia with improved quality of life.
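As a sketch of how an ED50 such as those above is typically estimated, the code below fits a sigmoid Emax (Hill) model to dose-response data with SciPy. The doses follow the morphine range stated in the abstract, but the response values are invented for illustration and are not the study's data.

```python
# Sketch: ED50 estimation by fitting a Hill (sigmoid Emax) dose-response model.
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, emax, ed50, n):
    """Fraction of maximal antihyperalgesic effect at a given dose."""
    return emax * dose ** n / (ed50 ** n + dose ** n)

doses = np.array([0.312, 0.625, 1.25, 2.5, 5.0])   # mg/kg, range as in the abstract
effect = np.array([0.08, 0.18, 0.35, 0.55, 0.72])  # hypothetical effect fractions

(emax, ed50, n), _ = curve_fit(hill, doses, effect, p0=[1.0, 2.0, 1.0])
print(f"Emax = {emax:.2f}, ED50 = {ed50:.2f} mg/kg, Hill slope = {n:.2f}")
```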

Relevance: 30.00%

Abstract:

Background People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections. Objectives To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage. Search methods In June 2015 we searched: The Cochrane Wounds Specialised Register; The Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting. Selection criteria All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections on all patients in any healthcare setting. Data collection and analysis We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion, performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate or otherwise synthesised data descriptively when heterogeneous. Main results We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study participants were followed up until the CVAD was removed or until discharge from ICU or hospital.
- Confirmed catheter-related bloodstream infection (CRBSI): One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence).
- Suspected catheter-related bloodstream infection: Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence).
- All cause mortality: Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence).
- Catheter-site infection: Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection. It is unclear whether there is a difference in risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence).
- Skin damage: One small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled).
- Pain: Two studies involving 193 participants measured pain. It is unclear if there is a difference between long and short interval dressing changes on pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence).
Authors' conclusions The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
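For context on where pooled risk ratios like those above come from, here is a minimal fixed-effect inverse-variance pooling sketch. Cochrane reviews more commonly use Mantel-Haenszel weighting, and the per-trial counts below are hypothetical, so this does not reproduce any figure from the review.

```python
# Sketch: fixed-effect (inverse-variance) pooling of risk ratios across trials.
import math

def log_rr_and_var(a, n1, b, n2):
    """Log risk ratio and its variance for one trial: events a/n1 vs b/n2."""
    log_rr = math.log((a / n1) / (b / n2))
    var = 1 / a - 1 / n1 + 1 / b - 1 / n2
    return log_rr, var

trials = [(3, 120, 2, 118), (5, 300, 6, 295), (1, 80, 2, 85)]  # hypothetical 2x2 counts
weights = []
weighted_log_rrs = []
for a, n1, b, n2 in trials:
    log_rr, var = log_rr_and_var(a, n1, b, n2)
    weights.append(1 / var)
    weighted_log_rrs.append(log_rr / var)

pooled = sum(weighted_log_rrs) / sum(weights)
se = 1 / math.sqrt(sum(weights))
print(f"Pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f} to {math.exp(pooled + 1.96 * se):.2f})")
```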

Relevance: 20.00%

Abstract:

Catheter-related bloodstream infections are a serious problem. Many interventions reduce risk, and some have been evaluated in cost-effectiveness studies. We review the usefulness and quality of these economic studies. Evidence is incomplete, and data required to inform a coherent policy are missing. The cost-effectiveness studies are characterized by a lack of transparency, short time-horizons, and narrow economic perspectives. Data quality is low for some important model parameters. Authors of future economic evaluations should aim to model the complete policy and not just single interventions. They should be rigorous in developing the structure of the economic model, include all relevant economic outcomes, use a systematic approach for selecting data sources for model parameters, and propagate the effect of uncertainty in model parameters on conclusions. This will inform future data collection and improve our understanding of the economics of preventing these infections.
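One way to "propagate the effect of uncertainty in model parameters on conclusions", as recommended above, is probabilistic sensitivity analysis. The sketch below runs a Monte Carlo simulation over a toy decision model for a CRBSI-prevention intervention; every parameter value and distribution is a hypothetical placeholder, not an estimate from the literature.

```python
# Sketch: probabilistic sensitivity analysis for a toy CRBSI-prevention model.
# All parameters are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_sims = 10_000

baseline_risk = rng.beta(20, 980, n_sims)                      # ~2% baseline CRBSI risk
relative_risk = rng.lognormal(np.log(0.6), 0.2, n_sims)        # uncertain intervention effect
cost_per_crbsi = rng.gamma(shape=25, scale=1000, size=n_sims)  # ~25,000 per infection
intervention_cost = 50.0                                       # per patient, fixed

infections_averted = baseline_risk * (1 - relative_risk)
incremental_cost = intervention_cost - infections_averted * cost_per_crbsi

print(f"P(intervention is cost-saving) = {(incremental_cost < 0).mean():.2f}")
print(f"Mean incremental cost per patient = {incremental_cost.mean():.0f}")
```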

Relevance: 20.00%

Abstract:

Background The accurate measurement of cardiac output (CO) is vital in guiding the treatment of critically ill patients. Invasive or minimally invasive measurement of CO is not without inherent risks to the patient. Skilled Intensive Care Unit (ICU) nursing staff are in an ideal position to assess changes in CO following therapeutic measures. The USCOM (Ultrasonic Cardiac Output Monitor) device is a non-invasive CO monitor whose clinical utility and ease of use require testing. Objectives To compare cardiac output measurement using a non-invasive ultrasonic device (USCOM) operated by a non-echocardiographically trained ICU Registered Nurse (RN), with the conventional pulmonary artery catheter (PAC) using both thermodilution and Fick methods. Design Prospective observational study. Setting and participants Between April 2006 and March 2007, we evaluated 30 spontaneously breathing patients requiring PAC for assessment of heart failure and/or pulmonary hypertension at a tertiary level cardiothoracic hospital. Methods USCOM CO was compared with thermodilution measurements via PAC and CO estimated using a modified Fick equation. The catheter was inserted by a medical officer, and all USCOM measurements were performed by a senior ICU nurse. Mean values, bias and precision, and mean percentage difference between measures were determined to compare methods. The intra-class correlation statistic was also used to assess agreement. The USCOM time to measure was recorded to assess the learning curve for USCOM use performed by an ICU RN, and a line of best fit was used to describe the operator learning curve. Results In 24 of 30 (80%) patients studied, CO measures were obtained. In 6 of 30 (20%) patients, an adequate USCOM signal was not achieved. The mean differences (±standard deviation) between USCOM and PAC, USCOM and Fick, and Fick and PAC CO were small: −0.34 ± 0.52 L/min, −0.33 ± 0.90 L/min and −0.25 ± 0.63 L/min respectively, across a range of outputs from 2.6 L/min to 7.2 L/min. The percent limits of agreement (LOA) were −34.6% to 17.8% for USCOM and PAC, −49.8% to 34.1% for USCOM and Fick, and −36.4% to 23.7% for PAC and Fick. Signal acquisition time reduced on average by 0.6 min per measure, to less than 10 min at the end of the study. Conclusions In 80% of our cohort, USCOM, PAC and Fick measures of CO all showed clinically acceptable agreement, and the learning curve for operation of the non-invasive USCOM device by an ICU RN was found to be satisfactorily short. Further work is required in patients receiving positive pressure ventilation.
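The agreement statistics above (bias, precision and percentage limits of agreement) follow the Bland-Altman approach. The sketch below shows that calculation on a handful of invented paired readings; it is illustrative only, and the percentage LOA is expressed relative to the mean of the paired means, one common convention.

```python
# Sketch: Bland-Altman bias, precision and limits of agreement for paired CO readings.
import numpy as np

uscom = np.array([3.1, 4.2, 5.0, 6.1, 4.8, 3.9])  # L/min, hypothetical USCOM readings
pac = np.array([3.4, 4.6, 5.2, 6.5, 5.3, 4.1])    # L/min, hypothetical PAC readings

diff = uscom - pac
bias = diff.mean()                 # mean difference (bias)
sd = diff.std(ddof=1)              # precision
loa = (bias - 1.96 * sd, bias + 1.96 * sd)
mean_co = ((uscom + pac) / 2).mean()
pct_loa = (100 * loa[0] / mean_co, 100 * loa[1] / mean_co)

print(f"Bias = {bias:.2f} L/min, SD = {sd:.2f} L/min")
print(f"LOA = {loa[0]:.2f} to {loa[1]:.2f} L/min ({pct_loa[0]:.1f}% to {pct_loa[1]:.1f}%)")
```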

Relevance: 20.00%

Abstract:

Purpose: To determine the subbasal nerve density and tortuosity at 5 corneal locations and to investigate whether these microstructural observations correlate with corneal sensitivity. Method: Sixty eyes of 60 normal human subjects were recruited into 1 of 3 age groups, group 1: aged <35 years, group 2: aged 35–50 years, and group 3: aged >50 years. All eyes were examined using slit-lamp biomicroscopy, noncontact corneal esthesiometry, and slit-scanning in vivo confocal microscopy. Results: The mean subbasal nerve density and the mean corneal sensitivity were greatest centrally (14,731 ± 6056 mm/mm2 and 0.38 ± 0.21 millibars, respectively) and lowest in the nasal mid periphery (7850 ± 4947 mm/mm2 and 0.49 ± 0.25 millibars, respectively). The mean subbasal nerve tortuosity coefficient was greatest in the temporal mid periphery (27.3 ± 6.4) and lowest in the superior mid periphery (19.3 ± 14.1). There was no significant difference in mean total subbasal nerve density between age groups. However, corneal sensation (P = 0.001) and subbasal nerve tortuosity (P = 0.004) demonstrated significant differences between age groups. Subbasal nerve density only showed significant correlations with corneal sensitivity threshold in the temporal cornea and with subbasal nerve tortuosity in the inferior and nasal cornea. However, these correlations were weak. Conclusions: This study quantitatively analyzes living human corneal nerve structure and an aspect of nerve function. There is no strong correlation between subbasal nerve density and corneal sensation. This study provides useful baseline data for the normal living human cornea at central and mid-peripheral locations.
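The correlation analysis described above can be illustrated with a short sketch relating subbasal nerve density to corneal sensitivity threshold; the paired values below are invented and do not reproduce the study's data.

```python
# Sketch: Pearson and Spearman correlation of nerve density vs sensitivity threshold.
import numpy as np
from scipy.stats import pearsonr, spearmanr

nerve_density = np.array([14731, 12100, 9800, 8400, 7850, 11200])       # mm/mm2, hypothetical
sensitivity_threshold = np.array([0.38, 0.41, 0.47, 0.50, 0.49, 0.43])  # millibars, hypothetical

r, p = pearsonr(nerve_density, sensitivity_threshold)
rho, p_s = spearmanr(nerve_density, sensitivity_threshold)
print(f"Pearson r = {r:.2f} (P = {p:.3f}); Spearman rho = {rho:.2f} (P = {p_s:.3f})")
```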