9 results for Intrabladder catheter
in DigitalCommons@The Texas Medical Center
Abstract:
The purpose of this study was to determine the barriers and facilitators to nurses' acceptance of the Johnson and Johnson Protectiv® Plus IV catheter safety needle device, and the implications for needlestick injuries, at St. Luke's Episcopal Hospital, Houston, Texas. A one-time cross-sectional survey of 620 responding nurses was conducted by this researcher in December 2000. The study objectives were to: (1) describe the perceived (a) organizational and individual barriers and facilitators and (b) acceptance of implementation of the IV catheter device; (2) examine the relative importance of these predictors; (3) describe (a) perceived changes in needlestick injuries after implementation of the device, (b) the reported incidence of injuries, and (c) the extent of underreporting by nurses; and (4) examine the relative importance of (a) the preceding predictors and (b) acceptance of the device in predicting perceived changes in needlestick injuries. Safety climate and training were evaluated as organizational factors. Individual factors evaluated were experience with the device, including time using it and frequency of use, and background information, including nursing unit and length of time as a nurse at this hospital and over the total nursing career. The conceptual framework was based on the safety climate model. Descriptive statistics and multiple and logistic regression were used to address the study objectives.

The findings showed widespread acceptance of the device and a strong perception that it reduced the number of needlesticks. Acceptance was most strongly predicted by adequate training, appropriate time between training and device use, a solid safety climate, and short length of service, in that order. A barrier to acceptance was nurses' long history of using previous needle technologies. Over four-fifths of nurses were compliant in always using the device. Compliance had two facilitators: length of time using the device and, to a lesser extent, safety climate. Rates of compliance tended to be lower among nurses in units in which the device was frequently used.

High-quality training and an atmosphere of caring about nurse safety stand out as the primary facilitators that other institutions would need to adopt to achieve maximum success in implementing safety programs involving new safety devices.
Abstract:
A case series of approximately 811 cancer patients who developed candidemia between 1989 and 1998 and were seen at the M. D. Anderson Cancer Center was analyzed to assess the impact and timing of central venous catheter (CVC) removal on the outcome of fungal bloodstream infections in cancer patients with primary catheter-related candidemia as well as secondary infections.

This study explored the diagnosis and management of vascular catheter-associated fungemia in patients with cancer. Microbiologic and clinical factors were examined as predictors of catheter-related candidemia; these included, in addition to basic demographics, the underlying malignancy, chemotherapy, neutropenia, and other salient data. Statistical analyses included univariate and multivariate logistic regression to relate the outcome of candidemia to the timing of catheter removal and the Candida species involved, and to identify predictors of catheter-related infections.

The conclusions of the study aim to enhance our understanding of issues involving CVC removal and may ultimately influence the management of nosocomial bloodstream infections with respect to the timing of CVC removal and the optimal duration of treatment of catheter-related candidemia.
Abstract:
Background. Health care-associated catheter-related bloodstream infections (CRBSI) represent a significant public health concern in the United States. Several studies have suggested that precautions such as maximum sterile barriers and the use of antimicrobial catheters are efficacious at reducing CRBSI, but there is concern within the medical community that prolonged use of antimicrobial catheters may be associated with increased bacterial resistance. Clinical studies have shown no such association, and even a significant decrease in microbial resistance, with prolonged minocycline/rifampin (M/R) catheter use. One explanation is the emergence of community-acquired methicillin-resistant Staphylococcus aureus (MRSA), which is more susceptible to antibiotics, as a cause of CRBSI.

Methods. Data from 323 MRSA isolates cultured from cancer patients with MRSA infection at The University of Texas MD Anderson Cancer Center from 1997 to 2007 were analyzed to determine whether there is a relationship between resistance to minocycline and rifampin and prolonged, widespread use of M/R catheters. Analysis was also conducted to determine whether there was a significant change in the prevalence of community-acquired MRSA (CA-MRSA) during this period, and whether this emergence acted as a confounder masking the true relationship between microbial resistance and prolonged M/R catheter use.

Results. Our study showed that the significant (p=0.008) change in strain type over time is a confounding variable; the adjusted model showed a significant protective effect (OR 0.000281; 95% CI 1.4×10⁻⁴ to 5.5×10⁻⁴) in the relationship between MRSA resistance to minocycline and prolonged M/R catheter use. The relationship between resistance to rifampin and prolonged M/R catheter use was not significant.

Conclusion. The emergence of CA-MRSA is a confounder in the relationship between resistance to minocycline and rifampin and prolonged M/R catheter use. However, even after adjustment for the more susceptible CA-MRSA, the widespread use of M/R catheters does not promote microbial resistance.
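The confounder adjustment described in this abstract is a standard logistic-regression maneuver: fit the model with and without the suspected confounder and compare the odds ratios. A minimal sketch, assuming hypothetical isolate-level data and column names (the study's actual variables and data are not published in the abstract):

    # Illustrative sketch only (not the authors' code): adjusting an odds
    # ratio for a confounder with logistic regression, as in the strain-type
    # analysis above. The data file and column names are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per MRSA isolate:
    #   resistant    - 1 if resistant to minocycline, else 0
    #   years_mr_use - years of hospital-wide M/R catheter use at culture date
    #   ca_mrsa      - 1 if a community-acquired strain type, else 0
    df = pd.read_csv("mrsa_isolates.csv")

    crude = smf.logit("resistant ~ years_mr_use", data=df).fit()
    adjusted = smf.logit("resistant ~ years_mr_use + ca_mrsa", data=df).fit()

    # A large shift between the crude and adjusted odds ratios for
    # years_mr_use indicates confounding by strain type.
    for label, model in [("crude", crude), ("adjusted", adjusted)]:
        odds_ratio = np.exp(model.params["years_mr_use"])
        lo, hi = np.exp(model.conf_int().loc["years_mr_use"])
        print(f"{label}: OR={odds_ratio:.3g} (95% CI {lo:.3g}-{hi:.3g})")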
Abstract:
Catheter-related bloodstream infections are a significant barrier to success in many inpatient healthcare facilities. The goal of this study was to determine whether an evidence-based program to reduce the number of catheter-related bloodstream infections in a pediatric inpatient healthcare facility had a significant impact on the infection rate. Catheter-related bloodstream infection rates were compared before and after program implementation. The patient population was selected based on a recommendation in the 2010 National Healthcare Safety Network report on device-related infections, which indicated a need for more data on pediatric populations requiring admission to a long-term care facility. The study design was a retrospective cohort study. Catheter-related bloodstream infection data were gathered between 2008 and 2011. In October 2008, a program was implemented to reduce the number of catheter-related bloodstream infections. Its key components were a standardized catheter maintenance checklist, the introduction of a chlorhexidine gluconate-based product for catheter maintenance and skin antisepsis, and a multidisciplinary education plan focused on hand hygiene and aseptic technique. The catheter-related bloodstream infection rate in 2008 was 21.21 infections per 1,000 patient-line days. After program implementation, the 2009 rate dropped to 1.11 per 1,000 patient-line days; the rates in 2010 and 2011 were 2.19 and 1.47 per 1,000 patient-line days, respectively. Additionally, this study demonstrated a potential cost savings of $620,000 to $1,240,000 between 2008 and 2009. In conclusion, an evidence-based program built on CDC guidelines can have a significant impact on catheter-related bloodstream infection rates.
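The rate reported here is the standard device-associated rate: infections per 1,000 patient-line days. A worked sketch of that arithmetic and of the avoided-cost estimate, using assumed denominators and an assumed per-infection cost range (the abstract reports only the rates and the savings span):

    # Worked sketch of the standard rate calculation behind the figures
    # above. The abstract reports rates, not denominators, so the line-day
    # counts and per-infection costs here are illustrative assumptions.
    def clabsi_rate(infections: int, line_days: int) -> float:
        """CLABSI rate per 1,000 patient-line days."""
        return infections / line_days * 1000

    print(clabsi_rate(21, 990))  # 21.21 per 1,000 line days (2008-like)
    print(clabsi_rate(1, 900))   # 1.11 per 1,000 line days (2009-like)

    # Avoided-cost estimate: avoided infections times an assumed
    # per-infection cost range, consistent with the reported span.
    avoided_infections = 20                # hypothetical count
    cost_per_infection = (31_000, 62_000)  # hypothetical $ range
    print([avoided_infections * c for c in cost_per_infection])
    # -> [620000, 1240000], i.e. $620,000 to $1,240,000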
Abstract:
Background: The distinction between catheter-associated asymptomatic bacteriuria (CAABU) and catheter-associated urinary tract infection (CAUTI) has only recently been widely appreciated. Our aims were to describe the relationship between CAUTI/CAABU and subsequent bacteremia and to investigate whether CAUTI/CAABU and antimicrobial use were associated with either bacteremia or mortality within 30 days.

Methods: The study design was a retrospective cohort. Patients with a urinary catheter and a positive urine culture between October 2010 and June 2011 at a large tertiary care facility were included. A multivariable model was constructed that controlled for age, race, Charlson co-morbidity score, catheter type and duration, category of organism, antimicrobial use, and classification of the catheter-associated bacteriuria as CAUTI or CAABU.

Results: Data from 444 catheter-associated urine culture episodes in 308 unique patients were included in the analysis. Overall mortality was 21.1% (61 of 308 patients) within 30 days. Among the 444 urine culture episodes, 402 (90.5%) were associated with antibiotic use. Fifty-two episodes (11.7%) were associated with bacteremia, but only 3 episodes of bacteremia (0.7% of the 444 catheter-associated bacteriuria episodes) were caused by an organism from the urinary tract; one of these was CAABU and the other two were CAUTI. Bacteremia within 30 days was associated with having CAUTI rather than CAABU and with having an indwelling urinary catheter rather than a condom catheter. The variables found to be significant for mortality within 30 days were a higher Charlson co-morbidity score and the presence of Candida in the urine culture. Use of antimicrobial agents to treat the bacteriuria was not associated with an increase or decrease in either bacteremia or mortality.

Conclusions: Our findings call into question the practice of giving antimicrobial agents to treat bacteriuria in an inpatient population with nearly universal antimicrobial use. A better practice may be targeted treatment of bacteriuria in patients with risk factors predictive of bacteremia and mortality.
Abstract:
An observational study was conducted in a SICU to determine the frequency of subclavian vein catheter-related infection at 72 hours, to identify the hospital cost of exchange over a guidewire, and to estimate the hospital cost savings of a 72-hour versus 144-hour exchange policy.

Catheter-related infection (≥15 colonies by Maki's semiquantitative technique, 1977) occurred in 3% (3/100) of the catheter tips cultured. Infection rates by catheter type were: 9.7% (3/31) for triple-lumen catheters, 0% (0/30) for Swan-Ganz catheters, 0% (0/30) for Cordis catheters, and 0% (0/9) for single-lumen catheters.

An estimated annual hospital cost savings of $35,699.00 was identified if the 72-hour exchange policy were changed to exchange every 144 hours.

It was recommended that a randomized clinical trial be conducted to determine the effect of exchanging a subclavian vein catheter over a guidewire every 72 hours versus every 144 hours.
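The savings estimate follows from simple arithmetic: doubling the exchange interval from 72 to 144 hours roughly halves the number of guidewire exchanges. A sketch with illustrative inputs (the abstract reports only the $35,699 total, so the annual volume and unit cost below are assumptions):

    # Sketch of the savings arithmetic. Doubling the exchange interval
    # from 72 h to 144 h roughly halves the number of guidewire exchanges;
    # the annual volume and unit cost are illustrative assumptions only.
    def annual_savings(exchanges_per_year: int, cost_per_exchange: float) -> float:
        """Savings from halving the number of guidewire exchanges."""
        return exchanges_per_year / 2 * cost_per_exchange

    print(annual_savings(1000, 71.40))  # ~$35,700 under these assumptions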
Abstract:
Virtual colonoscopy (VC) is a minimally invasive means of identifying colorectal polyps and lesions by insufflating a patient's bowel, applying contrast agent via a rectal catheter, and performing multi-detector computed tomography (MDCT) scans. The technique is recommended for colonic health screening by the American Cancer Society but is not funded by the Centers for Medicare and Medicaid Services (CMS), partially because of potential risks from radiation exposure. To date, no in-vivo organ dose measurements have been performed for MDCT scans; thus, the accuracy of current dose estimates is unknown. In this study, two TLDs were affixed to the inner lumen of the standard rectal catheters used in VC, and in-vivo rectal dose measurements were obtained in 6 VC patients. To calculate rectal dose, TLD-100 powder response was characterized at diagnostic doses so that appropriate correction factors could be determined for VC; a third-order polynomial regression with a goodness of fit of R²=0.992 was constructed from these data. Rectal dose measurements were also acquired with TLDs during simulated VC within a modified anthropomorphic phantom configured to represent three sizes of patients undergoing VC. The measured rectal doses decreased exponentially with increasing phantom effective diameter, with R²=0.993 for the exponential regression model and a maximum percent coefficient of variation (%CoV) of 4.33%. In-vivo measurements yielded rectal doses that decreased exponentially with increasing patient effective diameter, in a manner that was also favorably predicted by the size-specific dose estimate (SSDE) model for all VC patients of similar age, body composition, and TLD placement. The measured rectal dose within a younger patient was favorably predicted by the anthropomorphic phantom dose regression model, owing to similarities in the percentages of highly attenuating material at the respective measurement locations and in the placement of the TLDs. The in-vivo TLD response did not increase in %CoV with decreasing dose, and the largest %CoV was 10.0%.
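The two fits described, a third-order polynomial TLD-100 calibration and an exponential dose-versus-diameter model, are straightforward to reproduce in outline. A sketch with placeholder data (the study's raw measurements are not given in the abstract):

    # Sketch of the two regressions described above: a third-order
    # polynomial TLD-100 calibration and an exponential dose-vs-diameter
    # fit. All data points are placeholders, not the study's measurements.
    import numpy as np
    from scipy.optimize import curve_fit

    # TLD-100 calibration: reader signal vs delivered dose (mGy), hypothetical.
    dose = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
    signal = np.array([4.8, 9.9, 20.5, 41.8, 86.0])
    coeffs = np.polyfit(signal, dose, deg=3)  # third-order polynomial
    signal_to_dose = np.poly1d(coeffs)        # maps TLD signal -> dose

    # Rectal dose vs effective diameter (cm): exponential decrease.
    def expo(d, a, b):
        return a * np.exp(-b * d)

    diameter = np.array([20.0, 28.0, 36.0])   # three phantom sizes
    measured = np.array([22.0, 14.5, 9.6])    # hypothetical doses (mGy)
    (a, b), _ = curve_fit(expo, diameter, measured, p0=(50.0, 0.05))
    print(signal_to_dose(30.0), expo(30.0, a, b))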
Abstract:
Opioids remain the drugs of choice for chronic pain treatment, but opioid tolerance, defined as a decrease in analgesic effect after prolonged or repeated use, dramatically limits their clinical utility. Opioid tolerance has classically been studied by implanting spinal catheters in animals for drug administration. This procedure has significant morbidity and mortality and causes an inflammatory response that decreases the potency of opioid analgesia and possibly affects tolerance development. We therefore developed and validated a new method, intermittent lumbar puncture (ILP), for the study of opioid analgesia and tolerance. Using this method, opioid tolerance was reliably induced without detectable morbidity. The dose of morphine needed to induce analgesia and tolerance with this method was about 100-fold lower than that required with an intrathecal catheter. Only slight inflammation was found at the injection site, dissipating within seven mm.

DAMGO, an opioid μ receptor agonist, has been reported to inhibit morphine tolerance, but results from different studies are inconclusive. We evaluated the effect of DAMGO on morphine tolerance using our newly developed ILP method as well as other intrathecal catheter paradigms. We found that co-administration of sub-analgesic DAMGO with morphine using ILP did not inhibit morphine tolerance but instead blocked the analgesic effects of morphine; tolerance to morphine still developed. Tolerance to morphine could be blocked by a sub-analgesic dose of DAMGO only when administered via a lumbar catheter, not via a cervical catheter.

Finally, we evaluated the effects of gabapentin (GBP) on analgesia and morphine tolerance. We demonstrated that GBP enhanced analgesia mediated by both sub-analgesic and analgesic doses of morphine, although GBP itself was not analgesic; GBP increased both the potency and the efficacy of morphine. GBP inhibited the expression, but not the development, of morphine tolerance. GBP blocked tolerance to analgesic morphine but not to sub-analgesic morphine, and it reversed the expression of morphine tolerance even after tolerance was established. These studies may provide new insights into the mechanisms of morphine tolerance development and improve the clinical management of chronic pain.
Abstract:
Central line-associated bloodstream infections (CLABSIs) are among the most costly and preventable causes of morbidity and mortality in intensive care units (ICUs) in health care today. In 2008, the Centers for Medicare and Medicaid Services, under the Deficit Reduction Act, announced that the Medicare program would no longer reimburse hospitals for adverse events related to CLABSIs. This shifts the financial burden onto the hospital rather than the health care payer, who can now withhold reimbursement. With this weighing more heavily on hospital management, decision makers will need to find a way to prevent cases of CLABSI entirely or pay the financial consequences.

To reduce the risk of CLABSIs, several clinical preventive interventions have been studied and instituted, including the Central Line (CL) Bundle and antimicrobial-coated central venous catheters (AM-CVCs). I carried out a formal systematic review to compare the cost-effectiveness of the CL Bundle with that of the commercially available AM-CVCs in preventing CLABSIs among critically and chronically ill patients in the U.S. Evidence was assessed for inclusion against predefined criteria, and I conducted the data extraction myself. Ten studies were included in the review. The efficacy of the CL Bundle and the AM-CVC interventions in reducing the mean incidence rate of CLABSI was compared, along with their costs.

The AM-CVC impregnated with the antibiotics rifampin and minocycline (AI-RM) is more clinically effective than the CL Bundle in reducing the mean rate of CLABSI per 1,000 catheter days; the lowest mean incidence rate of CLABSI per 1,000 catheter days among the AM-CVC studies was zero, in favor of the AI-RM. Moreover, the review revealed that the AI-RM appears to be more cost-effective than the CL Bundle: the adjusted incremental cost of the CL Bundle was approximately $196 per ICU patient requiring a CVC, while the AI-RM cost only an additional $48 per ICU patient requiring a CVC.

Limited data regarding the cost of the CL Bundle made it difficult to make a true comparison with the direct cost of the AM-CVCs. However, using the results I did obtain from this review, I concluded that the AM-CVCs appear to be more cost-effective than the CL Bundle in decreasing the mean rate of CLABSI while minimizing incremental costs per CVC. This review calls for further research addressing the cost of the CL Bundle and compliance with it, and for more rigorous study designs, such as randomized controlled trials, comparing the efficacy and cost of the CL Bundle with the AM-CVCs. Barriers facing health care managers when implementing the CL Bundle or AM-CVCs include the additional costs associated with the intervention, educational training and ongoing reinforcement, and creating a new culture of understanding.
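The incremental-cost comparison reduces to per-patient arithmetic. A minimal sketch scaling the review's reported figures ($196 for the CL Bundle, $48 for the AI-RM, per ICU patient requiring a CVC) to a hypothetical unit volume:

    # Sketch of the per-patient incremental-cost comparison above. The
    # $196 and $48 figures come from the review; the ICU patient volume
    # is a hypothetical example.
    COST_PER_PATIENT = {"CL Bundle": 196.0, "AI-RM catheter": 48.0}

    def incremental_cost(patients_with_cvc: int) -> dict:
        """Incremental program cost for ICU patients requiring a CVC."""
        return {k: v * patients_with_cvc for k, v in COST_PER_PATIENT.items()}

    print(incremental_cost(500))
    # -> {'CL Bundle': 98000.0, 'AI-RM catheter': 24000.0}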