850 results for Intermittent catheter
Abstract:
The intermittent catheter is a device widely used in the hospital setting. This experimental study compared intermittent catheters filled with heparin solution (100 U/mL) against those filled with 0.9% saline, assessing whether clots formed and, where present, quantifying them. Four groups of 7 Norfolk rabbits were used; venous access was established in the auricular vein and connected to a multiple-infusion device. In two of these groups saline was used: in one, any resistance encountered on flushing would be forced; in the other, it would not. In the remaining two groups heparin solution was used, again with one group in which resistance would be forced and one in which it would not. The intermittent catheter was maintained for 72 hours, and every 12 hours distilled water was infused to simulate drug administration. The following variables were observed during the experiment: clot formation, clot length, fresh clot weight, dry clot weight, platelet count and discomfort. Statistical analysis with the non-parametric Kruskal-Wallis test, at a significance level of 5%, found no significant difference between groups or time points. Clot formation was observed in all groups regardless of the filling solution. Our main conclusion is that new standards and routines are needed for the use of intermittent catheters.
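The group comparison above relies on the non-parametric Kruskal-Wallis test at a 5% significance level. A minimal pure-Python sketch of that test is below; the clot-length values are invented for illustration and are not the study's data.

```python
# Kruskal-Wallis H test, pure Python; group values are hypothetical.

def kruskal_wallis_h(groups):
    """Return the Kruskal-Wallis H statistic (ties share their mean rank)."""
    pooled = sorted(v for g in groups for v in g)
    n_total = len(pooled)
    ranks = {}
    i = 0
    while i < n_total:
        j = i
        while j < n_total and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # mean of 1-based ranks i+1 .. j
        i = j
    h = 0.0
    for g in groups:
        rank_sum = sum(ranks[v] for v in g)
        h += rank_sum ** 2 / len(g)
    return 12.0 / (n_total * (n_total + 1)) * h - 3 * (n_total + 1)

groups = [
    [2.1, 1.8, 2.5, 1.9, 2.2, 2.0, 1.7],  # saline, resistance forced
    [2.3, 1.6, 2.4, 2.1, 1.9, 2.2, 1.8],  # saline, resistance not forced
    [1.9, 2.0, 2.2, 1.8, 2.1, 1.7, 2.3],  # heparin, resistance forced
    [2.0, 1.9, 2.1, 2.2, 1.8, 2.4, 1.6],  # heparin, resistance not forced
]
h = kruskal_wallis_h(groups)
# chi-square critical value for df = 3 at the 5% level is 7.815
print(f"H = {h:.3f} -> {'significant' if h > 7.815 else 'not significant'}")
```

With four groups the statistic is compared against a chi-square distribution with 3 degrees of freedom, matching the "no significant difference" outcome when H stays below the critical value.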
Abstract:
Background Thoracoscopic anterior scoliosis instrumentation is a safe and viable surgical option for corrective fusion of progressive adolescent idiopathic scoliosis (AIS) and has been performed at our centre on 205 patients since 2000. However, there is a paucity of literature reporting on or examining optimum methods of analgesia following this type of surgery. A retrospective study was designed to present the authors’ technique for delivering intermittent local anaesthetic boluses via an intrapleural catheter following thoracoscopic scoliosis surgery, to report the pain levels that may be expected, and to document any adverse effects associated with the use of intrapleural analgesia as part of a combined postoperative analgesia regime. Methods Records for 32 patients who underwent thoracoscopic anterior correction for AIS were reviewed. All patients received an intrapleural catheter inserted during surgery, in addition to patient-controlled opiate analgesia and oral analgesia. After surgery, patients received a bolus of 0.25% bupivacaine every four hours via the intrapleural catheter. Patients’ perceptions of their pain control were measured using visual analogue pain scale scores recorded before and after local anaesthetic administration; the quantity and time of day of any other analgesia taken were also recorded. Results 28 female and four male patients (mean age 14.5 ± 1.5 years) had a total of 230 boluses of local anaesthetic administered in the 96-hour period following surgery. Pain scores significantly decreased following the administration of a bolus (p < 0.0001), with the mean pain score decreasing from 3.66 to 1.83. The quantity of opiates delivered via patient-controlled analgesia after surgery decreased steadily between successive 24-hour intervals after an initial increase in the second 24-hour period, when patients were mobilised.
One intrapleural catheter required early removal due to leakage; there were no other complications associated with the intermittent intrapleural analgesia method. Conclusions Local anaesthetic administration via an intrapleural catheter is a safe and effective method of analgesia following thoracoscopic anterior scoliosis correction. Post-operative pain following anterior thoracic scoliosis surgery can be reduced to ‘mild’ levels by combined analgesia regimes. Keywords: Adolescent idiopathic scoliosis; Thoracoscopic anterior spinal fusion; Anterior fusion; Intrapleural analgesia; Endoscopic anterior surgery; Pain relief; Scoliosis surgery
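The pre- versus post-bolus pain comparison above is a paired design. A minimal sketch of that kind of paired analysis follows; the VAS scores are invented for demonstration, not the study data.

```python
# Paired pre/post comparison of VAS pain scores (hypothetical values).
import math

pre  = [4.0, 3.5, 4.5, 3.0, 4.0, 3.5, 3.0, 4.0]   # VAS before bolus
post = [2.0, 1.5, 2.5, 1.5, 2.0, 2.0, 1.0, 2.0]   # VAS after bolus

diffs = [a - b for a, b in zip(pre, post)]
n = len(diffs)
mean_d = sum(diffs) / n                             # mean pain reduction
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
t = mean_d / (sd_d / math.sqrt(n))                  # paired t statistic, df = n - 1
print(f"mean decrease = {mean_d:.2f} VAS points, t = {t:.2f} (df = {n - 1})")
```

A large t with the corresponding degrees of freedom is what yields very small p values like the p < 0.0001 reported; the actual study statistics would of course use all 230 bolus events.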
Abstract:
Introduction: Thoracoscopic anterior instrumented fusion (TASF) is a safe and viable surgical option for corrective stabilisation of progressive adolescent idiopathic scoliosis (AIS) [1-2]. However, there is a paucity of literature examining optimum methods of analgesia following this type of surgery. The aim of this study was to identify: whether local anaesthetic boluses via an intrapleural catheter provide effective analgesia following thoracoscopic scoliosis correction; what pain levels may be expected; and any adverse effects associated with the use of intermittent intrapleural analgesia at our centre. Methods: Medical records were reviewed for a subset of the most recent 80 patients from a large single-centre consecutive series of 201 patients (April 2000 to present) who had undergone TASF. 32 patients met the inclusion criteria for the analysis (i.e. pain scores must have been recorded within the hour before and within two hours after an intrapleural bolus was given). All patients received an intrapleural catheter inserted during surgery, in addition to patient-controlled opiate analgesia and oral analgesia as required. After surgery, patients received a bolus of 0.25% bupivacaine every four hours via the intrapleural catheter. Visual analogue pain scale scores were recorded before and after each bolus of local anaesthetic; the quantity and time of day of any other analgesia taken were also recorded. Results and Discussion: 28 female and four male patients (mean age 14.5 ± 1.5 years) had a total of 230 boluses of local anaesthetic administered intrapleurally, directly onto the spine, in the 96-hour period following surgery. Pain scores significantly decreased following the administration of a bolus (p < 0.0001), with the mean pain score decreasing from 3.66 to 1.83.
The quantity of opiates delivered via patient-controlled analgesia after surgery decreased steadily between successive 24-hour intervals after an initial increase in the second 24-hour period, when patients were mobilised. One intrapleural catheter required early removal at 26 hours postoperatively due to leakage; there were no other complications associated with the intermittent intrapleural analgesia method. Post-operative pain following anterior scoliosis correction was decreased significantly by the administration of regular local anaesthetic boluses and can be reduced to ‘mild’ levels by combined analgesia regimes. The intermittent intrapleural analgesia method was not associated with any adverse events or complications in the full cohort of 201 patients.
Abstract:
Background Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly around the world. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence to suggest 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary, and is also associated with increased cost. Objectives To assess the clinical effects (benefits and harms) of intermittent flushing with heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Search Methods The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials. Selection Criteria Randomised controlled trials that compared the efficacy of normal saline with heparin to prevent occlusion of long-term CVCs in infants and children up to 18 years of age were included. We excluded temporary CVCs and peripherally inserted central catheters (PICC). Data Collection and Analysis Two review authors independently assessed trial inclusion criteria and trial quality, and extracted data. Rate ratios were calculated for two outcome measures: occlusion of the CVC and central line-associated bloodstream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter, or both, and other CVC-related complications such as dislocation of CVCs, other CVC site infections and thrombosis. Main Results Three trials with a total of 245 participants were included in this review.
The three trials directly compared the use of normal saline and heparin; however, the studies used different protocols for the standard and experimental arms, with different heparin concentrations and different flush frequencies. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated bloodstream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51; two studies, 229 participants; very low quality evidence). The estimated rate ratio for CVC-associated bloodstream infection was 1.48 (95% CI 0.24 to 9.37; two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants). Authors' Conclusions The review found that there was not enough evidence to determine the effects of intermittent flushing with heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
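The review's primary outcome is expressed as a rate ratio per 1000 catheter days. A short sketch of that calculation is below; the event counts and catheter days are hypothetical, chosen only so the ratio lands on the 0.75 point estimate reported above.

```python
# Rate per 1000 catheter days and the rate ratio between study arms.
# Event counts and catheter days are illustrative, not the review's data.

def rate_per_1000_days(events, catheter_days):
    """Occlusions (or infections) per 1000 catheter days of observation."""
    return events / catheter_days * 1000

saline_rate  = rate_per_1000_days(events=6, catheter_days=9000)
heparin_rate = rate_per_1000_days(events=8, catheter_days=9000)

rate_ratio = saline_rate / heparin_rate   # < 1 favours the saline arm
print(f"occlusions: saline {saline_rate:.2f} vs heparin {heparin_rate:.2f} "
      f"per 1000 catheter days; rate ratio = {rate_ratio:.2f}")
```

Normalising by catheter days rather than by patient count is what lets trials with different follow-up durations be pooled, which is why the wide confidence interval (0.10 to 5.51) around this point estimate matters more than the estimate itself.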
Abstract:
Background Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin, and evidence to suggest normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs. Objectives To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents. Design A Cochrane systematic review of randomised controlled trials was undertaken. Data Sources The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken. Review Methods Data were extracted and quality appraisal undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes: occlusion of the CVC and CVC-associated bloodstream infection. Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin. However, the studies used different protocols, with varying heparin concentrations and flush frequencies. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51; two studies, 229 participants; very low quality evidence).
The estimated rate ratio for CVC-associated bloodstream infection was 1.48 (95% CI 0.24 to 9.37; two studies, 231 participants; low quality evidence). Conclusions It remains unclear whether heparin is necessary for CVC maintenance. More well-designed studies are required to answer this relatively simple but clinically important question. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
Abstract:
PURPOSE: To report percutaneous fenestration of aortic dissection flaps to relieve distal ischemia using a novel intravascular ultrasound (IVUS)-guided fenestration device. CASE REPORTS: Two men (47 and 62 years of age) with aortic dissection and intermittent claudication had percutaneous ultrasound-guided fenestration performed under local anesthesia. Using an ipsilateral transfemoral approach, the intimal flap was punctured under real-time IVUS guidance using a needle-catheter combination through which a guidewire was placed across the dissection flap into the false lumen. The fenestration was achieved using balloon catheters of increasing diameter introduced over the guidewire. Stenting of the re-entry was performed in 1 patient to equalize pressure across the dissection membrane in both lumens. The procedures were performed successfully and without complications. In both patients, ankle-brachial indexes improved from 0.76 to 1.07 and from 0.8 to 1.1, respectively. Both patients were without claudication at the 3- and 6-month follow-up examination. CONCLUSION: Percutaneous intravascular ultrasound-guided fenestration and stenting at the level of the iliac artery in aortic dissection patients with claudication is a technically feasible and safe procedure and relieves symptoms.
Abstract:
To shed light on the potential efficacy of cycling as a testing modality in the treatment of intermittent claudication (IC), this study compared physiological and symptomatic responses to graded walking and cycling tests in claudicants. Sixteen subjects with peripheral arterial disease (resting ankle-brachial index (ABI) < 0.9) and IC completed a maximal graded treadmill walking (T) and cycle (C) test after three familiarization tests on each mode. During each test, symptoms, oxygen uptake (VO2), minute ventilation (VE), respiratory exchange ratio (RER) and heart rate (HR) were measured, and for 10 min after each test the brachial and ankle systolic pressures were recorded. All but one subject experienced calf pain as the primary limiting symptom during T, whereas the symptoms were more varied during C and included thigh pain, calf pain and dyspnoea. Although maximal exercise time was significantly longer on C than T (690 +/- 67 vs. 495 +/- 57 s), peak VO2, peak VE and peak heart rate during C and T were not different, whereas peak RER was higher during C. These responses during C and T were also positively correlated (P < 0.05) with each other, with the exception of RER. The postexercise systolic pressures were also not different between C and T. However, the peak decline in ankle pressures from resting values after C and T was not correlated between the two modes. These data demonstrate that cycling and walking induce a similar level of metabolic and cardiovascular strain, but that the primary limiting symptoms and the haemodynamic response in an individual's extremity, measured after exercise, can differ substantially between these two modes.
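The inclusion criterion above hinges on the ankle-brachial index, which is simply the ratio of ankle to brachial systolic pressure. A short sketch, with example pressures that are illustrative only:

```python
# Ankle-brachial index (ABI): ankle systolic / brachial systolic pressure.
# A resting ABI < 0.9 is the peripheral arterial disease criterion used above.

def abi(ankle_systolic_mmhg, brachial_systolic_mmhg):
    return ankle_systolic_mmhg / brachial_systolic_mmhg

resting = abi(100, 130)        # ~0.77: meets the PAD inclusion criterion
post_exercise = abi(70, 140)   # exercise typically lowers ankle pressure in PAD
print(f"resting ABI = {resting:.2f}, post-exercise ABI = {post_exercise:.2f}")
print("PAD criterion met" if resting < 0.9 else "PAD criterion not met")
```

Measuring both pressures for 10 minutes after each test, as the study did, captures the postexercise fall in ankle pressure that distinguishes the haemodynamic responses to the two exercise modes.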
Complex Impedance Measurement During RF Catheter Ablation: A More Accurate Measure of Power Delivery
Abstract:
Catheter-related bloodstream infections are a serious problem. Many interventions reduce risk, and some have been evaluated in cost-effectiveness studies. We review the usefulness and quality of these economic studies. Evidence is incomplete, and data required to inform a coherent policy are missing. The cost-effectiveness studies are characterized by a lack of transparency, short time-horizons, and narrow economic perspectives. Data quality is low for some important model parameters. Authors of future economic evaluations should aim to model the complete policy and not just single interventions. They should be rigorous in developing the structure of the economic model, include all relevant economic outcomes, use a systematic approach for selecting data sources for model parameters, and propagate the effect of uncertainty in model parameters on conclusions. This will inform future data collection and improve our understanding of the economics of preventing these infections.
Abstract:
Background The accurate measurement of cardiac output (CO) is vital in guiding the treatment of critically ill patients. Invasive or minimally invasive measurement of CO is not without inherent risks to the patient. Skilled Intensive Care Unit (ICU) nursing staff are in an ideal position to assess changes in CO following therapeutic measures. The USCOM (Ultrasonic Cardiac Output Monitor) device is a non-invasive CO monitor whose clinical utility and ease of use require testing. Objectives To compare cardiac output measurement using a non-invasive ultrasonic device (USCOM) operated by a non-echocardiographically trained ICU Registered Nurse (RN) with the conventional pulmonary artery catheter (PAC), using both thermodilution and Fick methods. Design Prospective observational study. Setting and Participants Between April 2006 and March 2007, we evaluated 30 spontaneously breathing patients requiring PAC for assessment of heart failure and/or pulmonary hypertension at a tertiary-level cardiothoracic hospital. Methods USCOM CO was compared with thermodilution measurements via PAC and with CO estimated using a modified Fick equation. The PAC was inserted by a medical officer, and all USCOM measurements were performed by a senior ICU nurse. Mean values, bias and precision, and mean percentage difference between measures were determined to compare methods. The intra-class correlation statistic was also used to assess agreement. The USCOM time to measure was recorded to assess the learning curve for USCOM use by an ICU RN, and a line of best fit was plotted to describe the operator learning curve. Results In 24 of 30 (80%) patients studied, CO measures were obtained. In 6 of 30 (20%) patients, an adequate USCOM signal was not achieved. The mean differences (± standard deviation) between USCOM and PAC, USCOM and Fick, and Fick and PAC CO were small: −0.34 ± 0.52 L/min, −0.33 ± 0.90 L/min and −0.25 ± 0.63 L/min respectively, across a range of outputs from 2.6 L/min to 7.2 L/min.
The percent limits of agreement (LOA) were −34.6% to 17.8% for USCOM and PAC, −49.8% to 34.1% for USCOM and Fick, and −36.4% to 23.7% for PAC and Fick. Signal acquisition time decreased on average by 0.6 min per measure, to less than 10 min at the end of the study. Conclusions In 80% of our cohort, USCOM, PAC and Fick measures of CO showed clinically acceptable agreement, and the learning curve for operation of the non-invasive USCOM device by an ICU RN was satisfactorily short. Further work is required in patients receiving positive pressure ventilation.
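The bias and limits-of-agreement figures above come from a Bland-Altman style analysis of paired measurements. A minimal sketch of that calculation follows; the paired CO values (L/min) are invented for illustration, and the percent LOA here is computed relative to the overall mean CO, one common convention.

```python
# Bias and 95% limits of agreement (LOA) for two CO methods (invented data).
import math

uscom = [4.1, 5.0, 3.6, 6.2, 4.8, 5.5, 3.9, 4.4]   # L/min
pac   = [4.5, 5.2, 4.0, 6.5, 5.0, 6.0, 4.2, 4.6]   # L/min

diffs = [u - p for u, p in zip(uscom, pac)]
n = len(diffs)
bias = sum(diffs) / n                                # mean difference
sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
loa = (bias - 1.96 * sd, bias + 1.96 * sd)           # 95% limits of agreement

# percent LOA relative to the overall mean of both methods
mean_co = sum(uscom + pac) / (2 * n)
pct_loa = tuple(100 * v / mean_co for v in loa)
print(f"bias = {bias:.2f} L/min, LOA = {loa[0]:.2f} to {loa[1]:.2f} L/min")
print(f"percent LOA = {pct_loa[0]:.1f}% to {pct_loa[1]:.1f}%")
```

A small bias with narrow LOA, as reported for USCOM versus PAC, is what "clinically acceptable agreement" refers to in the conclusions.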
Abstract:
Background: A bundled approach to central venous catheter care is currently being promoted as an effective way of preventing catheter-related bloodstream infection (CR-BSI). Consumables used in the bundled approach are relatively inexpensive, which may lead to the conclusion that the bundle is cost-effective. However, this fails to consider the nontrivial costs of the monitoring and education activities required to implement the bundle, or that alternative strategies are available to prevent CR-BSI. We evaluated the cost-effectiveness of a bundle to prevent CR-BSI in Australian intensive care patients. Methods and Findings: A Markov decision model was used to evaluate the cost-effectiveness of the bundle relative to remaining with current practice (a non-bundled approach to catheter care and uncoated catheters) or use of antimicrobial catheters. We assumed the bundle reduced the relative risk of CR-BSI to 0.34. Given uncertainty about the cost of the bundle, threshold analyses were used to determine the maximum cost at which the bundle remained cost-effective relative to the other approaches to infection control. Sensitivity analyses explored how this threshold alters under different assumptions about the economic value placed on bed-days and the health benefits gained by preventing infection. If clinicians are prepared to use antimicrobial catheters, the bundle is cost-effective if national 18-month implementation costs are below $1.1 million. If antimicrobial catheters are not an option, the bundle must cost less than $4.3 million. If decision makers are only interested in obtaining cash savings for the unit, and place no economic value on either the bed-days or the health benefits gained through preventing infection, these cost thresholds are reduced by two-thirds. Conclusions: A catheter care bundle has the potential to be cost-effective in the Australian intensive care setting.
Rather than anticipating cash savings from this intervention, decision makers must be prepared to invest resources in infection control to see efficiency improvements.
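The threshold logic above can be sketched very simply: the bundle stays cost-effective while its implementation cost is below the total economic value of the infections it prevents. The counts and dollar values below are hypothetical, chosen only so the full-value threshold lands near the reported $1.1 million figure; the study's actual Markov model is far richer.

```python
# Threshold analysis sketch: maximum acceptable implementation cost.
# All figures are hypothetical illustrations, not the study's inputs.

def max_acceptable_cost(infections_prevented, value_per_infection):
    """Implementation cost below which the bundle remains cost-effective."""
    return infections_prevented * value_per_infection

full_value = max_acceptable_cost(infections_prevented=220,
                                 value_per_infection=5000)
# the abstract notes thresholds fall by two-thirds when bed-days and health
# benefits are given no economic value (cash savings only)
cash_only = full_value / 3
print(f"threshold (full economic value): ${full_value:,.0f}")
print(f"threshold (cash savings only):   ${cash_only:,.0f}")
```

This is why the economic perspective matters: the same intervention can look affordable or unaffordable depending on whether freed bed-days and health gains are priced in.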
Abstract:
BACKGROUND: There has been some difficulty getting standard laboratory rats to voluntarily consume large amounts of ethanol without the use of initiation procedures. It has previously been shown that standard laboratory rats will voluntarily consume high levels of ethanol if given intermittent access to 20% ethanol in a 2-bottle-choice setting [Wise, Psychopharmacologia 29 (1973), 203]. In this study, we have further characterized this drinking model. METHODS: Ethanol-naïve Long-Evans rats were given intermittent access to 20% ethanol (three 24-hour sessions per week). No sucrose fading was needed, and water was always available ad libitum. Ethanol consumption, preference, and long-term drinking behaviors were investigated. Furthermore, to pharmacologically validate the intermittent-access 20% ethanol drinking paradigm, the efficacy of acamprosate and naltrexone in decreasing ethanol consumption was compared with that in groups given continuous access to 10 or 20% ethanol, respectively. Additionally, ethanol consumption was investigated in Wistar and outbred alcohol-preferring (P) rats following intermittent access to 20% ethanol. RESULTS: The intermittent-access 20% ethanol 2-bottle-choice drinking paradigm led standard laboratory rats to escalate their ethanol intake over the first 5 to 6 drinking sessions, reaching stable baseline consumption of high amounts of ethanol (Long-Evans: 5.1 +/- 0.6; Wistar: 5.8 +/- 0.8 g/kg/24 h). Furthermore, the cycles of excessive drinking and abstinence led to an increase in ethanol preference and increased efficacy of both acamprosate and naltrexone in Long-Evans rats. P rats initiated drinking at a higher level than both Long-Evans and Wistar rats using the intermittent-access 20% ethanol paradigm and showed a trend toward further escalation of ethanol intake over time (mean ethanol intake: 6.3 +/- 0.8 g/kg/24 h).
CONCLUSION: Standard laboratory rats will voluntarily consume ethanol using the intermittent-access 20% ethanol drinking paradigm without the use of any initiation procedures. This model promises to be a valuable tool in the alcohol research field.
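The intake figures above are expressed in grams of pure ethanol per kilogram of body weight per 24-hour session. A short sketch of that conversion; the bottle volume and body weight are invented example values, with the standard density of ethanol (0.789 g/mL) used to convert volume to mass.

```python
# Convert a 24 h session's bottle reading to g ethanol / kg body weight.
ETHANOL_DENSITY_G_PER_ML = 0.789  # density of pure ethanol

def intake_g_per_kg(ml_consumed, ethanol_fraction, body_weight_kg):
    """Grams of pure ethanol consumed per kg of body weight."""
    return ml_consumed * ethanol_fraction * ETHANOL_DENSITY_G_PER_ML / body_weight_kg

# e.g. a hypothetical 0.45 kg rat drinking 14.5 mL of 20% (v/v) ethanol in 24 h
session = intake_g_per_kg(14.5, 0.20, 0.45)
print(f"{session:.1f} g/kg/24 h")  # in the range of the baselines reported above
```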
Abstract:
Drying is a very energy-intensive process, consuming about 20-25% of the energy used by the food processing industry. The energy efficiency of the process and the quality of the dried product are the two key factors in food drying. The global energy crisis and the demand for high-quality dried food further challenge researchers to explore innovative techniques in food drying to address these issues. Intermittent drying is considered one of the promising solutions for improving energy efficiency and product quality without increasing the capital cost of the drier, and it has already received much attention. However, a comprehensive review of recent progress, together with an overall assessment of energy efficiency and product quality in intermittent drying, is lacking. The objective of this article is to discuss, analyze and evaluate recent advances in intermittent drying research from the standpoint of energy efficiency and product quality. Currently available modelling techniques for intermittent drying are reviewed, and their merits and demerits are analyzed. Moreover, intermittent application of ultrasound, infrared (IR) and microwave energy in combined drying technologies is reviewed and discussed. Gaps in the current literature are highlighted, important avenues for future theoretical and experimental studies are identified, and directions for further research are suggested.