842 results for Continuous Renal Replacement Therapy
Abstract:
Hyperhomocysteinemia is a potential risk factor for vascular disease and is associated with endothelial dysfunction, a predictor of adverse cardiovascular events. Renal patients (end-stage renal failure (ESRF) patients and renal transplant recipients (RTR)) exhibit both hyperhomocysteinemia and endothelial dysfunction, with increasing evidence of a causative link between the two conditions. The elevated homocysteine appears to be due to altered metabolism both in the kidney (intrarenal) and in the uremic circulation (extrarenal). This review discusses 18 supplementation studies conducted in ESRF and 6 in RTR investigating the effects of nutritional therapy to lower homocysteine. The clinical significance of lowering homocysteine in renal patients is discussed, with data presented on the effects of B vitamin supplementation on cardiovascular outcomes such as endothelial function. Folic acid is the most effective nutritional therapy to lower homocysteine. In ESRF patients, supplementation with folic acid over a wide dose range (2-20 mg/day), either individually or in combination with other B vitamins, will decrease but not normalize homocysteine. In contrast, in RTR similar doses of folic acid normalize homocysteine. Folic acid improves endothelial function in ESRF patients; however, this has yet to be investigated in RTR. Homocysteine-lowering therapy is more effective in ESRF patients than in RTR.
Abstract:
Objective: To evaluate the effectiveness of continuous positive airway pressure (CPAP) therapy in the treatment of hypernasality following traumatic brain injury (TBI). Design: An A-B-A experimental research design. Assessments were conducted prior to commencement of the program, midway, immediately post-treatment, and 1 month after completion of the CPAP therapy program. Participants: Three adults with dysarthria and moderate to severe hypernasality subsequent to TBI. Outcome Measures: Perceptual evaluation using the Frenchay Dysarthria Assessment, the Assessment of Intelligibility of Dysarthric Speech, and a speech sample analysis, and instrumental evaluation using the Nasometer. Results: Between assessment periods, varying degrees of improvement in hypernasality and sentence intelligibility were noted. At the 1-month post-CPAP assessment, all 3 participants demonstrated reduced nasalance values, and 2 exhibited increased sentence intelligibility. Conclusions: CPAP may be a valuable treatment for impaired velopharyngeal function in the TBI population.
Abstract:
Background and purpose: Patients' knowledge and beliefs about their illnesses are known to influence a range of health-related variables, including treatment compliance. It may therefore be important to quantify these variables to assess their impact on compliance, particularly in chronic illnesses such as obstructive sleep apnea (OSA) that rely on self-administered treatments. The aim of this study was to develop two new tools, the Apnea Knowledge Test (AKT) and the Apnea Beliefs Scale (ABS), to assess illness knowledge and beliefs in OSA patients. Patients and methods: The systematic test construction process followed to develop the AKT and the ABS included consultation with sleep experts and OSA patients. The psychometric properties of the AKT and ABS were then investigated in a clinical sample of 81 OSA patients and 33 healthy, non-sleep-disordered adults. Results: Results suggest both measures are easily understood by OSA patients, have adequate internal consistency, and are readily accepted by patients. A preliminary investigation of the validity of these tools, conducted by comparing patient data to that of the 33 healthy adults, revealed that apnea patients knew more about OSA, had more positive attitudes towards continuous positive airway pressure (CPAP) treatment, and attributed more importance to treating sleep disturbances than the non-clinical group. Conclusions: Overall, the results of the psychometric analyses suggest these measures will be useful clinical tools with numerous beneficial applications, particularly in CPAP compliance studies and apnea education program evaluations.
Abstract:
Aim: To develop an appropriate dosing strategy for continuous intravenous infusions (CII) of enoxaparin by minimizing the percentage of steady-state anti-Xa concentrations (Css) outside the therapeutic range of 0.5-1.2 IU/ml. Methods: A nonlinear mixed-effects model was developed with NONMEM for 48 adult patients who received CII of enoxaparin, with infusion durations that ranged from 8 to 894 h at rates between 100 and 1600 IU/h. Three hundred and sixty-three anti-Xa concentration measurements were available from patients who received CII. These were combined with 309 anti-Xa concentrations from 35 patients who received subcutaneous enoxaparin. The effects of age, body size, height, sex, creatinine clearance (CrCL) and patient location [intensive care unit (ICU) or general medical unit] on pharmacokinetic (PK) parameters were evaluated. Monte Carlo simulations were used to (i) evaluate covariate effects on Css and (ii) compare the impact of different infusion rates on predicted Css. The best dose was selected based on the highest probability that the achieved Css would lie within the therapeutic range. Results: A two-compartment linear model, with additive and proportional residual error for general medical unit patients and only a proportional error for ICU patients, provided the best description of the data. CrCL and weight were found to significantly affect clearance and the volume of distribution of the central compartment, respectively. Simulations suggested that the best doses for patients in the ICU setting were 50 IU/kg per 12 h (4.2 IU/kg/h) if CrCL < 30 ml/min; 60 IU/kg per 12 h (5.0 IU/kg/h) if CrCL was 30-50 ml/min; and 70 IU/kg per 12 h (5.8 IU/kg/h) if CrCL > 50 ml/min.
The best doses for patients in the general medical unit were 60 IU/kg per 12 h (5.0 IU/kg/h) if CrCL < 30 ml/min; 70 IU/kg per 12 h (5.8 IU/kg/h) if CrCL was 30-50 ml/min; and 100 IU/kg per 12 h (8.3 IU/kg/h) if CrCL > 50 ml/min. These doses provided the lowest and equal probabilities of being above or below the therapeutic range, and the highest probability that the achieved Css would lie within it. Conclusion: The dose of enoxaparin should be individualized to the patient's renal function and weight. There is some evidence to support slightly lower doses of CII enoxaparin in the ICU setting.
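The dose-banding reported above reduces to a simple lookup by renal function, weight and care setting. The sketch below merely restates the published bands for illustration; it is not a clinical dosing tool:

```python
def enoxaparin_cii_rate(crcl_ml_min: float, weight_kg: float, icu: bool) -> float:
    """Suggested continuous-infusion enoxaparin rate (IU/h), restating the
    best doses reported in the abstract. Illustrative only, not clinical advice."""
    # Best doses in IU/kg per 12 h for the CrCL bands (<30, 30-50, >50 ml/min)
    bands = (50, 60, 70) if icu else (60, 70, 100)
    if crcl_ml_min < 30:
        per_12h = bands[0]
    elif crcl_ml_min <= 50:
        per_12h = bands[1]
    else:
        per_12h = bands[2]
    return per_12h * weight_kg / 12.0  # convert IU/kg per 12 h to IU/h
```

For example, a 60 kg ICU patient with CrCL below 30 ml/min falls in the 50 IU/kg per 12 h band, i.e. 250 IU/h.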
Abstract:
Gram-positive microorganisms, specifically coagulase-negative staphylococci, are the most common species recovered from clinical culture specimens of patients with end-stage renal disease. The propensity of coagulase-negative staphylococci (CNS) to cause infection in this patient group has been widely debated. However, it is still unclear how this usually avirulent commensal microorganism produces infection that contributes to high rates of morbidity and mortality in patients with end-stage renal disease. The aim of this thesis was to investigate the rate, geographical distribution, and molecular and phenotypic mechanisms of Gram-positive microorganisms associated with infection in renal dialysis patients. In addition, it sought to assess the value of early serological diagnosis of dialysis catheter-associated infection and the effect of antimicrobial treatment regimens on the faecal carriage of enteric microorganisms. In this study, the incidence of haemodialysis catheter-associated infection was established with the Meditrend audit tool, which was used to assess the infection outcomes of catheter insertion and management procedures until the catheter was explanted. Introduction of a catheter management protocol decreased the incidence of catheter-related infection. Staphylococcal species recovered from episodes of haemodialysis catheter-associated infection and continuous ambulatory peritoneal dialysis (CAPD)-associated peritonitis were genotyped by determination of macrorestriction profiles with pulsed-field gel electrophoresis. This highlighted horizontal transfer of microorganisms between different patients and the environment. The phenotypic characteristics of these strains were also investigated to determine characteristics that could be used as markers for dialysis catheter-associated infection. The expression of elastase, lipase and esterase by CNS was significantly associated with infection.
A rapid enzyme-linked immunosorbent assay incorporating a novel staphylococcal antigen (lipid S) was used to evaluate the early detection of anti-staphylococcal immunoglobulin G in patient sera. The comparison of culture-positive and culture-negative patients demonstrated a steady state of immune activation in both groups. However, anti-lipid S serum antibody titres > 1000 were found to be a predictor of infection. The effect on faecal carriage of vancomycin-resistant enterococci (VRE) and Clostridium difficile toxins in patients treated with CAPD, when empiric cephalosporin therapy was replaced with piperacillin/tazobactam, was investigated. The introduction of piperacillin/tazobactam was associated with a decrease in the faecal carriage of VRE.
Abstract:
End-stage renal failure is a life-threatening condition, often treated with home-based peritoneal dialysis (PD). PD is a demanding regimen, and the patients who practise it must make numerous lifestyle changes and learn complicated biomedical techniques. In our experience, the renal nurses who provide most PD education frequently express concerns that patient compliance with their teaching is poor. These concerns are mirrored in the renal literature. It has been argued that the perceived failure of health professionals to improve compliance rates with PD regimens is because ‘compliance’ itself has never been adequately conceptualized or defined; thus, it is difficult to operationalize and quantify. This paper examines how a group of Australian renal nurses construct patient compliance with PD therapy. These empirical data illuminate how PD compliance operates in one practice setting; how it is characterized by multiple and often competing energies; and how ultimately it might be pointless to try to tame ‘compliance’ through rigid definitions and measurement, or to rigidly enforce it in PD patients. The energies involved are too fractious and might be better spent, as many of the more experienced nurses in this study argue, in augmenting the energies that do work well together to improve patient outcomes.
Abstract:
Continuous infusion (CI) ticarcillin–clavulanate is a potential therapeutic improvement over conventional intermittent dosing because the major pharmacodynamic (PD) predictor of efficacy of β-lactams is the time that free drug levels exceed the MIC. This study incorporated a 6-year retrospective arm evaluating efficacy and safety of CI ticarcillin–clavulanate in the home treatment of serious infections and a prospective arm additionally evaluating pharmacokinetics (PK) and PD. In the prospective arm, steady-state serum ticarcillin and clavulanate levels and MIC testing of significant pathogens were performed. One hundred and twelve patients (median age, 56 years) were treated with a CI dose of 9.3–12.4 g/day and mean CI duration of 18.0 days. Infections treated included osteomyelitis (50 patients), septic arthritis (6), cellulitis (17), pulmonary infections (12), febrile neutropenia (7), vascular infections (7), intra-abdominal infections (2), and Gram-negative endocarditis (2); 91/112 (81%) of patients were cured, 14 (13%) had partial response and 7 (6%) failed therapy. Nine patients had PICC line complications and five patients had drug adverse events. Eighteen patients had prospective PK/PD assessment although only four patients had sufficient data for a full PK/PD evaluation (both serum steady-state drug levels and ticarcillin and clavulanate MICs from a bacteriological isolate), as this was difficult to obtain in home-based patients, particularly as serum clavulanate levels were found to deteriorate rapidly on storage. Three of four patients with matched PK/PD assessment had free drug levels exceeding the MIC of the pathogen. Home CI of ticarcillin–clavulanate is a safe, effective, convenient and practical therapy and is a therapeutic advance over traditional intermittent dosing when used in the home setting.
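The pharmacodynamic target named above is that free drug levels exceed the MIC; with a continuous infusion the free steady-state level is constant, so the target is met for the whole interval whenever that level is above the MIC. A minimal sketch of this check follows; the protein-binding fraction used in the example is a hypothetical placeholder, not a value from this study:

```python
def meets_pd_target(total_css_mg_per_l: float, free_fraction: float,
                    mic_mg_per_l: float) -> bool:
    """For a continuous infusion at steady state the free level is constant,
    so the time-above-MIC target is met iff free Css exceeds the MIC."""
    free_css = total_css_mg_per_l * free_fraction  # unbound concentration
    return free_css > mic_mg_per_l
```

For instance, with a hypothetical 50% free fraction, a total steady-state level of 60 mg/l meets a 16 mg/l MIC (free level 30 mg/l), while 20 mg/l does not.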
Abstract:
Due to the limitations of current condition monitoring technologies, estimates of asset health states may contain some uncertainty. A maintenance strategy that ignores this uncertainty can incur additional cost or downtime. The partially observable Markov decision process (POMDP) is a commonly used approach to derive optimal maintenance strategies when asset health inspections are imperfect. However, existing applications of the POMDP to maintenance decision-making largely adopt discrete-time and discrete-state assumptions. The discrete-time assumption requires that health-state transitions and maintenance activities happen only at discrete epochs, which cannot model the failure time accurately and is not cost-effective. The discrete health-state assumption, on the other hand, may not be elaborate enough to improve the effectiveness of maintenance. To address these limitations, this paper proposes a continuous-state partially observable semi-Markov decision process (POSMDP). An algorithm that combines the Monte Carlo-based density projection method and policy iteration is developed to solve the POSMDP. Different types of maintenance activities (i.e., inspections, replacement, and imperfect maintenance) are considered. The next maintenance action and the corresponding waiting duration are optimized jointly to minimize the long-run expected cost per unit time while accounting for availability. Simulation studies show that the proposed maintenance optimization approach is more cost-effective than maintenance strategies derived by two other approximate methods when regular inspection intervals are adopted. They also show that the maintenance cost can be further reduced by developing maintenance strategies with state-dependent maintenance intervals using the POSMDP.
In addition, during the simulation studies the proposed POSMDP showed the ability to adopt a cost-effective strategy structure when multiple types of maintenance activities are involved.
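The central difficulty the POMDP/POSMDP formulation addresses is that inspections are imperfect, so decisions are made on a belief over health states rather than the true state. A minimal two-state Bayes update (a toy discrete sketch, not the paper's continuous-state density projection method) illustrates the idea:

```python
def update_belief(p_degraded: float,
                  p_alarm_if_degraded: float,
                  p_alarm_if_healthy: float,
                  alarm: bool) -> float:
    """Posterior probability that the asset is degraded after one imperfect
    inspection, via Bayes' rule over the two states {healthy, degraded}."""
    if alarm:
        num = p_degraded * p_alarm_if_degraded
        den = num + (1 - p_degraded) * p_alarm_if_healthy
    else:
        num = p_degraded * (1 - p_alarm_if_degraded)
        den = num + (1 - p_degraded) * (1 - p_alarm_if_healthy)
    return num / den
```

With a hypothetical sensor of 90% sensitivity and a 10% false-alarm rate, an alarm raises a 0.3 prior to about 0.79, while a clean reading lowers it to about 0.045; the maintenance action is then chosen against this belief rather than an assumed known state.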
Abstract:
Objective: Adherence to continuous positive airway pressure (CPAP) therapy for obstructive sleep apnoea (OSA) is poor. We assessed the effectiveness of a motivational interviewing intervention (MINT), in addition to best-practice standard care, in improving acceptance of and adherence to CPAP therapy in people with a new diagnosis of OSA. Method: 106 Australian adults (69% male) with a new diagnosis of OSA and a clinical recommendation for CPAP treatment were recruited from a tertiary sleep disorders centre. Participants were randomly assigned to receive either three sessions of motivational interviewing ('MINT'; n=53; mean age=55.4 years) or no intervention ('Control'; n=53; mean age=57.7 years). The primary outcome was the between-group difference in objective CPAP adherence at 1, 2, 3 and 12 months of follow-up. Results: Fifty (94%) participants in the MINT group and 50 (94%) in the control group met all inclusion and exclusion criteria and were included in the primary analysis. At 3 months, mean CPAP use was 4.63 hours per night in the MINT group versus 3.16 hours in the control group (p=0.005), representing almost 50% better adherence in the MINT group. Patients in the MINT group were also substantially more likely to accept CPAP treatment. Conclusions: MINT is a brief, manualized, effective intervention that improves CPAP acceptance and objective adherence rates compared with standard care alone.
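As a quick arithmetic check on the "almost 50%" claim, the relative improvement implied by the reported 3-month means is:

```python
# Reported 3-month adherence: 4.63 h/night (MINT) vs 3.16 h/night (control)
mint_hours = 4.63
control_hours = 3.16
relative_improvement = (mint_hours - control_hours) / control_hours
print(f"{relative_improvement:.1%}")  # prints 46.5%, consistent with "almost 50%"
```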
Abstract:
BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients, and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated, in hospitalised or community-dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between-group difference in the CRBSI rate (clinically indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39).
This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
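The crude (single-stratum) risk ratio and its Katz log-based confidence interval can be recomputed from the event counts quoted above. Note that the review's published figures are pooled Mantel-Haenszel estimates across trials, so this crude calculation is close to, but not identical with, the published RR 1.14 (95% CI 0.93 to 1.39) for phlebitis:

```python
import math

def risk_ratio_ci(a: int, n1: int, b: int, n2: int, z: float = 1.96):
    """Crude risk ratio for events a/n1 vs b/n2, with a Katz log-based 95% CI."""
    rr = (a / n1) / (b / n2)
    se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

# Phlebitis counts from the abstract: 186/2365 vs 166/2441
rr, lo, hi = risk_ratio_ci(186, 2365, 166, 2441)
```

This yields roughly RR 1.16 (0.95 to 1.41) for phlebitis; similarly, the crude CRBSI ratio from 1/2365 vs 2/2441 is about 0.52, versus the pooled 0.61 reported.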
Abstract:
Patients presenting for knee replacement who are on warfarin for medical reasons often require higher levels of peri-operative anticoagulation than are used for primary thromboprophylaxis, and may require bridging therapy with heparin. We performed a retrospective case-control study of 149 consecutive primary knee arthroplasty patients to investigate whether anticoagulation affected short-term outcomes. Specific outcome measures indicated significant increases in prolonged wound drainage (26.8% of cases vs 7.3% of controls, p<0.001); superficial infection (16.8% vs 3.3%, p<0.001); deep infection (6.0% vs 0%, p<0.001); return to theatre for washout (4.7% vs 0.7%, p=0.004); and revision (4.7% vs 0.3%, p=0.001). Management of patients on long-term warfarin therapy following total knee replacement (TKR) is particularly challenging, as the surgeon must balance the risk of thromboembolism against post-operative complications on an individual patient basis in order to optimise outcomes.
Abstract:
The spontaneous reaction between microrods of an organic semiconductor, copper 7,7,8,8-tetracyanoquinodimethane (CuTCNQ), and [AuBr4]− ions in an aqueous environment is reported. The reaction is redox in nature and proceeds via a complex galvanic replacement mechanism, wherein the surface of the CuTCNQ microrods is replaced with metallic gold nanoparticles. Unlike previous reactions reported in acetonitrile, the galvanic replacement reaction in aqueous solution proceeds via an entirely different mechanism, in which a cyclical process continuously regenerates the CuTCNQ consumed by galvanic replacement, in parallel with the replacement reaction itself. As a result, the driving force of the galvanic replacement reaction in aqueous medium depends largely on the availability of [AuBr4]− ions during the reaction. This study therefore highlights the importance of choosing an appropriate solvent for galvanic replacement reactions, as the solvent can significantly affect the reaction mechanism. The reaction progress at different gold salt concentrations was monitored using Fourier transform infrared (FT-IR), Raman, and X-ray photoelectron spectroscopy (XPS), as well as XRD and EDX analysis and SEM imaging. The CuTCNQ/Au nanocomposites were also investigated for their potential photocatalytic properties: the destruction of the organic dye Congo red under simulated solar light was found to depend largely on the degree of gold nanoparticle surface coverage. The approach reported here opens up new possibilities for decorating metal-organic charge transfer complexes with a host of metals, leading to potentially novel applications in catalysis and sensing.