885 results for Venous ulcers


Relevance: 10.00%

Abstract:

Background: The distal-to-proximal technique has been recommended for anti-cancer therapy administration. There is no evidence to suggest that a 24-hour delay of treatment is necessary for patients with a previous uncomplicated venous puncture proximal to the administration site. Objectives: This study aimed to identify whether a 24-hour delay between a venous puncture and subsequent cannulation at a distal site for anti-cancer therapy is necessary to prevent extravasation. Methods: A prospective cohort study was conducted with 72 outpatients receiving anti-cancer therapy via an administration site distal to at least one previous uncomplicated venous puncture on the same arm in a tertiary cancer centre in Australia. Participants were interviewed and assessed at baseline before treatment and on day 7 for incidence of extravasation/phlebitis. Results: Among 72 participants with 99 occasions of treatment, there was one incident of infiltration (possible extravasation) at the venous puncture site proximal to the administration site and two incidents of phlebitis at the administration site. Conclusions: A 24-hour delay is unnecessary if an alternative vein can be accessed for anti-cancer therapy after a proximal venous puncture. Implications for practice: Extravasation can occur at a venous puncture site proximal to an administration site in the same vein. However, the nurse can administer anti-cancer therapy at a distal site if, through visual inspection and palpation, the nurse can confidently determine that the chosen vein is not in any way connected to the previous puncture site.

Abstract:

NICE guidelines state that patients undergoing elective hip surgery are at increased risk of venous thromboembolic events (VTE) following surgery and recommend thromboprophylaxis for 28-35 days [1,2]. However, the studies of the new direct thrombin inhibitors have only examined major bleeding. We prospectively recorded wound discharge in patients who underwent hip arthroplasty and were given dabigatran postoperatively between March 2010 and April 2010 (n=56). We retrospectively compared these results with a matched group of patients who underwent similar operations six months earlier, when all patients were routinely given dalteparin postoperatively until discharge and then discharged home on 150 mg aspirin daily for 6 weeks (n=67). Wound discharge after 5 days was significantly higher in the patients taking dabigatran (32% dabigatran, n=18, vs. 10% dalteparin, n=17; p=0.003), and our rate of delayed discharges due to wound discharge significantly increased from 7% in the dalteparin group (n=5) to 27% for dabigatran (n=15; p=0.004). Patients who received dabigatran were more than five times as likely to return to theatre with a wound complication as those who received dalteparin (7% dabigatran, n=4, vs. 1% dalteparin, n=1); however, this was not statistically significant (p=0.18). The significantly higher wound discharge and return-to-theatre rates demonstrated in this study have led us to change our practice to administering dalteparin until the wound is dry and then starting dabigatran. Our study demonstrates the need for further clinical studies of wound discharge and dabigatran.

Abstract:

Bedsores (pressure ulcers) are caused by multiple factors, including, but not limited to, pressure, shear force, friction, temperature, age and medication. Specialised support surfaces, such as specialised mattresses and sheepskin coverings, are thought to decrease or relieve pressure, resulting in a lower incidence of pressure ulcers [3]. The primary aim of this study was to compare the upper/central body pressure distribution between normal lying in a hospital bed and lying with a pressure redistribution belt. The study involved 16 healthy volunteer subjects lying on a hospital bed with and without the belt. Results showed that the pressure redistribution belt reduced pressure peaks and prevented the pressure from increasing over time.

Abstract:

Introduction—Human herpesvirus 8 (HHV8) is necessary for Kaposi sarcoma (KS) to develop, but whether peripheral blood viral load is a marker of KS burden (total number of KS lesions), KS progression (the rate of eruption of new KS lesions), or both is unclear. We investigated these relationships in persons with AIDS. Methods—Newly diagnosed patients with AIDS-related KS attending Mulago Hospital, in Kampala, Uganda, were assessed for KS burden and progression by questionnaire and medical examination. Venous blood samples were taken for HHV8 load measurement by PCR. Associations were examined with odds ratios (OR) and 95% confidence intervals (CI) from logistic regression models and with t-tests. Results—Among 74 patients (59% men), median age was 34.5 years (interquartile range [IQR], 28.5-41). HHV8 DNA was detected in 93% of patients and quantified in 77%. Median virus load was 3.8 log10 per 10^6 peripheral blood cells (IQR 3.4-5.0) and was higher in men than women (4.4 vs. 3.8 logs; p=0.04) and in patients with a faster (>20 lesions per year) rather than a slower rate of KS lesion eruption (4.5 vs. 3.6 logs; p<0.001), and higher, but not significantly so, among patients with more (>median [20] KS lesions) rather than fewer KS lesions (4.4 vs. 4.0 logs; p=0.16). HHV8 load was unrelated to CD4 lymphocyte count (p=0.23). Conclusions—We show a significant association of HHV8 load in peripheral blood with the rate of eruption of KS lesions, but not with total lesion count. Our results suggest that viral load increases concurrently with the development of new KS lesions.

Abstract:

A physiological control system was developed for a rotary left ventricular assist device (LVAD) in which the target pump flow rate (LVADQ) was set as a function of left atrial pressure (LAP), mimicking the Frank-Starling mechanism. The control strategy was implemented using linear PID control and was evaluated in a pulsatile mock circulation loop using a prototyped centrifugal pump, with pulmonary vascular resistance varied to alter venous return. The control strategy automatically varied pump speed (2460 to 1740 to 2700 RPM) in response to a decrease and subsequent increase in venous return. In contrast, a fixed-speed pump caused a simulated ventricular suction event during low venous return and higher ventricular volumes during high venous return. The preload sensitivity was increased from 0.011 L/min/mmHg in fixed-speed mode to 0.47 L/min/mmHg, a value similar to that of the native healthy heart. The sensitivity varied automatically to maintain the LAP and LVADQ within a predefined zone. This control strategy requires the implantation of a pressure sensor in the left atrium and a flow sensor around the outflow cannula of the LVAD. However, appropriate pressure sensor technology is not yet commercially available, so an alternative measure of preload, such as the pulsatility of pump signals, should be investigated.
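As a rough illustration of the control strategy described above, the sketch below sets a preload-dependent target flow from LAP and uses a linear PID loop to drive pump speed toward it. All names, gains and numeric constants are illustrative assumptions, not values from the study; only the overall structure (a Frank-Starling-like target plus PID tracking) comes from the abstract.

```python
# Minimal sketch of a Frank-Starling-like LVAD speed controller: the target
# pump flow rises with left atrial pressure (LAP), and a PID loop nudges pump
# speed so measured flow tracks that target. Gains and reference points are
# invented for illustration.

class PID:
    """Textbook linear PID controller with a fixed time step."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def target_flow(lap_mmhg, slope=0.47, lap_ref=10.0, flow_ref=5.0):
    """Preload-sensitive target flow (L/min): higher LAP -> higher target.
    The 0.47 L/min/mmHg slope echoes the preload sensitivity quoted above;
    the reference points are hypothetical."""
    return flow_ref + slope * (lap_mmhg - lap_ref)


def update_speed(current_rpm, lap_mmhg, measured_flow, controller):
    """One control step: adjust pump speed toward the preload-dependent
    target flow."""
    error = target_flow(lap_mmhg) - measured_flow  # flow error in L/min
    return current_rpm + controller.step(error)
```

With low venous return (low LAP, hence a low target flow) the loop winds the commanded speed down, which is how such a controller avoids the suction events seen with the fixed-speed pump; with high venous return it winds the speed back up.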

Abstract:

Introduction: Critical care patients frequently receive blood transfusions. Some reports show an association between aged or stored blood and increased morbidity and mortality, including the development of transfusion-related acute lung injury (TRALI). However, the existence of conflicting data endorses the need for research to either reject this association or to confirm it and elucidate the underlying mechanisms. Methods: Twenty-eight sheep were randomised into two groups, receiving saline or lipopolysaccharide (LPS). Sheep were further randomised to also receive a transfusion of pooled and heat-inactivated supernatant from fresh (day 1) or stored (day 42) non-leucoreduced human packed red blood cells (PRBC) or an infusion of saline. TRALI was defined by hypoxaemia during or within two hours of transfusion and histological evidence of pulmonary oedema. Regression modelling compared physiology between groups and with a previous study that used stored platelet concentrates (PLT). Samples of the transfused blood products also underwent cytokine array and biochemical analyses, and their neutrophil priming ability was measured in vitro. Results: TRALI did not develop in sheep that first received the saline infusion. In contrast, 80% of sheep that first received the LPS infusion developed TRALI following transfusion with "stored PRBC." The decreases in mean arterial pressure and cardiac output, and the increases in central venous pressure and body temperature, were more severe for TRALI induced by "stored PRBC" than by "stored PLT." Storage-related accumulation of several factors was demonstrated in both "stored PRBC" and "stored PLT" and was associated with increased in vitro neutrophil priming. Concentrations of several factors were higher in the "stored PRBC" than in the "stored PLT"; however, there was no difference in neutrophil priming in vitro. Conclusions: In this in vivo ovine model, both recipient and blood product factors contributed to the development of TRALI. Sick (LPS-infused) sheep, rather than healthy (saline-infused) sheep, predominantly developed TRALI when transfused with supernatant from stored but not fresh PRBC. "Stored PRBC" induced a more severe injury than "stored PLT" and had a different storage lesion profile, suggesting that these outcomes may be associated with storage lesion factors unique to each blood product type. Therefore, the transfusion of fresh rather than stored PRBC may minimise the risk of TRALI.

Abstract:

We examined the effects of progressive resistance training (PRT) and supplementation with calcium-vitamin D3 fortified milk on markers of systemic inflammation, and the relationship between inflammation and changes in muscle mass, size and strength. Healthy men aged 50-79 years (n = 180) participated in this 18-month randomized controlled trial that comprised a factorial 2 × 2 design. Participants were randomized to (1) PRT + fortified milk supplement, (2) PRT, (3) fortified milk supplement, or (4) a control group. Participants assigned to PRT trained 3 days per week, while those in the supplement groups consumed 400 ml/day of milk containing 1,000 mg calcium plus 800 IU vitamin D3. We collected venous blood samples at baseline, 12 and 18 months to measure the serum concentrations of IL-6, TNF-alpha and hs-CRP. There were no exercise × supplement interactions, but serum IL-6 was 29% lower (95% CI, -62, 0) in the PRT group compared with the control group after 12 months. Conversely, IL-6 was 31% higher (95% CI, -2, 65) in the supplement group compared with the non-supplemented groups after 12 and 18 months. These between-group differences did not persist after adjusting for changes in fat mass. In the PRT group, mid-tibia muscle cross-sectional area increased less in men with higher pre-training inflammation compared with those men with lower inflammation (net difference ~2.5%, p < 0.05). In conclusion, serum IL-6 concentration decreased following PRT, whereas it increased after supplementation with fortified milk concomitant with changes in fat mass. Furthermore, low-grade inflammation at baseline restricted muscle hypertrophy following PRT.

Abstract:

The effects of an increased training (IT) load on plasma concentrations of lipopolysaccharides (LPS), proinflammatory cytokines, and anti-LPS antibodies during exercise in the heat were investigated in 18 male runners, who performed 14 days of normal training (NT) or 14 days of a 20% increased training load in 2 equal groups. Before (trial 1) and after (trial 2) the training intervention, all subjects ran at 70% maximum oxygen uptake on a treadmill under hot (35 °C) and humid (~40% relative humidity) conditions until core temperature reached 39.5 °C or volitional exhaustion. Venous blood samples were drawn before, after, and 1.5 h after exercise. Plasma LPS concentration after exercise increased by 71% (trial 1, p < 0.05) and 21% (trial 2) in the NT group and by 92% (trial 1, p < 0.01) and 199% (trial 2, p < 0.01) in the IT group. Post-intervention plasma LPS concentration was 35% lower before exercise (p < 0.05) and 47% lower during recovery (p < 0.01) in the IT than in the NT group. Anti-LPS IgM concentration during recovery was 35% lower in the IT than in the NT group (p < 0.05). Plasma interleukin (IL)-6 and tumor necrosis factor (TNF)-alpha concentrations after exercise (IL-6, 3-7 times, p < 0.01, and TNF-alpha, 33%, p < 0.01) and during recovery (IL-6, 2-4 times, p < 0.05, and TNF-alpha, 30%, p < 0.01) were higher than at rest within each group. These data suggest that a short-term tolerable increase in training load may protect against developing endotoxemia during exercise in the heat.

Abstract:

Precise protein quantification and recommendation is essential in clinical dietetics, particularly in the management of individuals with chronic kidney disease, malnutrition, burns, wounds, pressure ulcers, and those in active sports. The Expedited 10g Protein Counter (EP-10) was developed to simplify the quantification of dietary protein for the assessment and recommendation of protein intake [1]. Instead of using separate protein exchanges for different food groups to quantify an individual's dietary protein intake, each EP-10 exchange accounts for one 3g exchange of non-protein-rich food plus one 7g exchange of protein-rich food (Table 1). The EP-10 was recently validated and published in the Journal of Renal Nutrition [1]. That study demonstrated that using the EP-10 for dietary protein intake quantification had clinically acceptable validity and reliability compared with the conventional 7g protein exchange, while requiring less time [2]. In clinical practice, the use of efficient, accurate and practical methods to facilitate assessment and treatment plans is important. The EP-10 can be easily implemented in nutrition assessment and recommendation for a patient in the clinical setting. This patient education tool was adapted from materials printed in the Journal of Renal Nutrition [1]. The tool may be used as presented or adapted to assist patients to achieve their recommended daily protein intake.
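The arithmetic behind the EP-10 can be sketched as follows. The 10 g-per-exchange convention (one 7 g protein-rich plus one 3 g non-protein-rich exchange) comes from the description above; the function names and the rounding rule are hypothetical, not part of the published tool.

```python
# Illustrative EP-10 arithmetic: each exchange counts as 10 g of protein
# (7 g from a protein-rich food plus 3 g from non-protein-rich foods).
# Names and rounding behaviour are assumptions for this sketch.

PROTEIN_RICH_G = 7      # g protein per protein-rich exchange
NON_PROTEIN_RICH_G = 3  # g protein per non-protein-rich exchange
EXCHANGE_G = PROTEIN_RICH_G + NON_PROTEIN_RICH_G  # 10 g per EP-10 exchange


def estimate_protein_g(ep10_exchanges):
    """Estimate daily protein intake (g) from a count of EP-10 exchanges."""
    return ep10_exchanges * EXCHANGE_G


def exchanges_for_target(target_protein_g):
    """EP-10 exchanges to recommend for a daily protein target,
    rounded to the nearest whole exchange."""
    return round(target_protein_g / EXCHANGE_G)
```

For example, a patient recorded as consuming 6 EP-10 exchanges would be estimated at 60 g protein/day, and a 68 g/day target would translate to roughly 7 exchanges.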

Abstract:

Summary: Appropriate assessment and management of diabetes-related foot ulcers (DRFUs) is essential to reduce amputation risk. Management requires debridement, wound dressing, pressure off-loading, good glycaemic control and potentially antibiotic therapy and vascular intervention. As a minimum, all DRFUs should be managed by a doctor and a podiatrist and/or wound care nurse. Health professionals unable to provide appropriate care for people with DRFUs should promptly refer individuals to professionals with the requisite knowledge and skills. Indicators for immediate referral to an emergency department or multidisciplinary foot care team (MFCT) include gangrene, limb-threatening ischaemia, deep ulcers (bone, joint or tendon in the wound base), ascending cellulitis, systemic symptoms of infection and abscesses. Referral to an MFCT should occur if there is a lack of wound progress after 4 weeks of appropriate treatment.
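The referral pathway above can be summarised as a simple decision sketch. This is purely illustrative (the function, labels and data structure are my own) and is not a clinical decision tool; it merely encodes the indicators listed in the abstract.

```python
# Toy triage sketch encoding the referral indicators described above.
# Illustrative only; not for clinical use.

URGENT_INDICATORS = {
    "gangrene",
    "limb-threatening ischaemia",
    "deep ulcer",            # bone, joint or tendon in the wound base
    "ascending cellulitis",
    "systemic infection",
    "abscess",
}


def triage(findings, weeks_without_progress=0):
    """Map assessment findings to the referral pathway described above."""
    if URGENT_INDICATORS & set(findings):
        return "immediate referral: emergency department or MFCT"
    if weeks_without_progress >= 4:
        return "refer to multidisciplinary foot care team (MFCT)"
    return "continue management by doctor and podiatrist/wound care nurse"
```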

Abstract:

Background: Foot ulcers are a frequent reason for diabetes-related hospitalisation. Clinical training is known to have a beneficial impact on foot ulcer outcomes. Clinical training using simulation techniques has rarely been used in the management of diabetes-related foot complications or chronic wounds. Simulation can be defined as a device or environment that attempts to replicate the real world. The few non-web-based foot-related simulation courses have focused solely on training for a single skill or “part task” (for example, practicing ingrown toenail procedures on models). This pilot study aimed primarily to investigate the effect of a training program using multiple methods of simulation on participants’ clinical confidence in the management of foot ulcers. Methods: Sixteen podiatrists participated in a two-day Foot Ulcer Simulation Training (FUST) course. The course included pre-requisite web-based learning modules, practicing individual foot ulcer management part tasks (for example, debriding a model foot ulcer), and participating in replicated clinical consultation scenarios (for example, treating a standardised patient (actor) with a model foot ulcer). The primary outcome measure was participants’ pre- and post-course confidence surveys, using a five-point Likert scale (1 = unacceptable to 5 = proficient). Participants’ knowledge, satisfaction, and their perception of the relevance and fidelity (realism) of a range of course elements were also investigated. Parametric statistics were used to analyse the data: Pearson’s r for correlation, ANOVA for testing differences between groups, and a paired-sample t-test to determine the significance of differences between pre- and post-workshop scores. A minimum significance level of p < 0.05 was used. Results: An overall 42% improvement in clinical confidence was observed following completion of FUST (mean scores 3.10 compared with 4.40, p < 0.05). The lack of an overall significant change in knowledge scores reflected the participant population’s high baseline knowledge and pre-requisite completion of the web-based modules. Satisfaction, relevance and fidelity of all course elements were rated highly. Conclusions: This pilot study suggests simulation training programs can improve participants’ clinical confidence in the management of foot ulcers. The approach has the potential to enhance clinical training in diabetes-related foot complications and chronic wounds in general.

Abstract:

Background: Diabetic foot complications are recognised as the most common reason for diabetes-related hospitalisation and lower extremity amputations. Multi-faceted strategies to reduce diabetic foot hospitalisation and amputation rates have been successful. However, most diabetic foot ulcers are managed in ambulatory settings, where data availability is poor and studies are limited. The project aimed to develop and evaluate strategies to improve the management of diabetic foot complications in three diverse ambulatory settings and to measure the subsequent impact on hospitalisation and amputation. Methods: Multi-faceted strategies were implemented in 2008, including multi-disciplinary teams, clinical pathways and training, clinical indicators, telehealth support and surveys. A retrospective audit of consecutive patient records from July 2006 to June 2007 determined baseline clinical indicators (n = 101). A clinical pathway teleform was implemented as a clinical record and clinical indicator analyser at all sites in 2008 (n = 327) and followed up in 2009 (n = 406). Results: Prior to the intervention, clinical pathways were not used and multi-disciplinary teams were limited. There was an absolute improvement in treating according to risk of 15% in 2009, and in surveillance of the high-risk population of 34% and 19% in 2008 and 2009 respectively (p < 0.001). Improvements of 13-66% (p < 0.001) were recorded in 2008 for individual clinical activities, to a performance of >92% in perfusion, ulcer depth, infection assessment and management, offloading and education. Hospitalisation outcomes showed reductions of up to 64% in amputation rates per 100,000 population (p < 0.001) and 24% in average length of stay (p < 0.001). Conclusion: These findings support the use of multi-faceted strategies in diverse ambulatory services to standardise practice, improve the management of diabetic foot complications and positively impact hospitalisation outcomes. As of October 2010, these strategies had been rolled out to over 25 ambulatory sites, representing 66% of Queensland Health districts, managing 1,820 patients and 13,380 occasions of service, including 543 healed ulcer patients. This number is expected to rise dramatically as an incentive payment for the use of the teleform is expanded.

Abstract:

Purpose: To assess the effects of pre-cooling volume on neuromuscular function and performance in free-paced intermittent-sprint exercise in the heat. Methods: Ten male team-sport athletes completed four randomized trials involving an 85-min free-paced intermittent-sprint exercise protocol at 33 °C and ~33% relative humidity. Pre-cooling sessions included whole body (WB), head + hand (HH), head (H) and no cooling (CONT), applied for 20 min pre-exercise and 5 min mid-exercise. Maximal voluntary contractions (MVC) were assessed pre- and post-intervention and mid- and post-exercise. Exercise performance was assessed with sprint times, % decline and distances covered during free-paced bouts. Measures of core (Tc) and skin (Tsk) temperatures, heart rate, perceptual exertion and thermal stress were monitored throughout. Venous and capillary blood was analyzed for metabolite, muscle damage and inflammatory markers. Results: WB pre-cooling facilitated the maintenance of sprint times during the exercise protocol with a reduced % decline (P=0.04). Mean and total hard running distances increased with pre-cooling by 12% compared with CONT (P<0.05); specifically, WB was 6-7% greater than HH (P=0.02) and H (P=0.001) respectively. No change was evident in mean voluntary or evoked force pre- to post-exercise with WB and HH cooling (P>0.05). WB and HH cooling reduced Tc by 0.1-0.3 °C compared with the other conditions (P<0.05). WB Tsk was suppressed for the entire session (P=0.001). Heart rate responses following WB cooling were reduced (P=0.05; d=1.07) compared with CONT during exercise. Conclusion: A relationship between pre-cooling volume and exercise performance seems apparent, as larger surface-area coverage augmented subsequent free-paced exercise capacity in conjunction with greater suppression of physiological load. The maintenance of MVC with pre-cooling, despite increased work output, suggests a role for centrally-mediated mechanisms in exercise pacing regulation and subsequent performance.

Abstract:

This study examined the effects of pre-cooling duration on performance and neuromuscular function for self-paced intermittent-sprint shuttle running in the heat. Eight male team-sport athletes completed two 35-min bouts of intermittent-sprint shuttle running separated by a 15-min recovery on three separate occasions (33 °C, 34% relative humidity). Mixed-method pre-cooling was completed for 20 min (COOL20) or 10 min (COOL10), or no cooling was applied (CONT), and cooling was reapplied for 5 min mid-exercise. Performance was assessed via sprint times, percentage decline and shuttle-running distance covered. Maximal voluntary contractions (MVC), voluntary activation (VA) and evoked twitch properties were recorded pre- and post-intervention and mid- and post-exercise. Core temperature (Tc), skin temperature, heart rate, capillary blood metabolites, sweat losses, perceptual exertion and thermal stress were monitored throughout. Venous blood draws pre- and post-exercise were analyzed for muscle damage and inflammation markers. Shuttle-running distances covered were increased 5.2 ± 3.3% following COOL20 (P < 0.05), with no differences observed between COOL10 and CONT (P > 0.05). COOL20 aided in the maintenance of mid- and post-exercise MVC (P < 0.05; d > 0.80), despite no conditional differences in VA (P > 0.05). Pre-exercise Tc was reduced by 0.15 ± 0.13 °C with COOL20 (P < 0.05; d > 1.10), and remained lower throughout both COOL20 and COOL10 compared to CONT (P < 0.05; d > 0.80). Pre-cooling reduced sweat losses by 0.4 ± 0.3 kg (P < 0.02; d > 1.15), with COOL20 0.2 ± 0.4 kg less than COOL10 (P = 0.19; d = 1.01). Increased pre-cooling duration lowered physiological demands during exercise heat stress and facilitated the maintenance of self-paced intermittent-sprint performance in the heat. Importantly, the dose-response interaction of pre-cooling and sustained neuromuscular responses may explain the improved exercise performance in hot conditions.