30 results for ENTERAL STENTS
in Queensland University of Technology - ePrints Archive
Abstract:
Background Studies amongst older people with acute dysphagic stroke requiring thickened fluids have assessed fluid intakes from combinations of beverage, food, enteral and parenteral sources, but not from all sources simultaneously. This study aimed to comprehensively assess total water intake from food, beverages, enteral and parenteral sources amongst dysphagic adult in-patients receiving thickened fluids. Methods Patients requiring thickened fluids following a dysphagia diagnosis were recruited consecutively from a tertiary teaching hospital's medical and neurosurgical wards. Fluid intake from food and beverages was assessed by direct observation and wastage, and intake from enteral and parenteral sources was quantified from clinical medical records. Results No patients achieved their calculated fluid requirements unless enteral or parenteral fluids were received. The mean daily fluid intake from food was greater than that from beverages, whether patients were receiving diet alone (food 807 ± 363 mL, beverages 370 ± 179 mL, p < 0.001) or diet with enteral or parenteral fluid support (food 455 ± 408 mL, beverages 263 ± 232 mL, p < 0.001). Greater daily fluid intakes occurred when enteral or parenteral fluid was received in addition to oral dietary intake, irrespective of age group, diagnosis, whether assistance was required, and whether stage 2 or stage 3 thickened fluids were required (p < 0.05). After enteral and parenteral sources, food provided the most important contribution to daily fluid intakes. Conclusions The greatest contribution to oral fluid intake was from food, not beverages. Designing menus and food services that promote and encourage the enjoyment of fluid-dense foods, rather than thickened beverages, may present an important way to improve the fluid intakes of those with dysphagia. Supplemental enteral or parenteral fluid may be necessary to achieve minimum calculated fluid requirements.
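To make the arithmetic behind "calculated fluid requirements" concrete, here is a minimal sketch that sums intake across the four sources assessed in the study and compares it against a weight-based requirement. The 30 mL/kg/day rule and the 70 kg body weight are assumptions for illustration only; the paper does not state which requirement formula it used.

```python
# Illustrative sketch only: the abstract does not state which requirement formula was
# used, so the 30 mL per kg of body weight per day rule here is an assumption.

def estimated_fluid_requirement_ml(weight_kg: float, ml_per_kg: float = 30.0) -> float:
    """Rough weight-based daily fluid requirement (assumed formula)."""
    return weight_kg * ml_per_kg

def total_daily_fluid_ml(food_ml: float, beverage_ml: float,
                         enteral_ml: float = 0.0, parenteral_ml: float = 0.0) -> float:
    """Total daily water intake summed across the four sources assessed in the study."""
    return food_ml + beverage_ml + enteral_ml + parenteral_ml

if __name__ == "__main__":
    requirement = estimated_fluid_requirement_ml(weight_kg=70)    # 2100 mL for a 70 kg patient
    intake = total_daily_fluid_ml(food_ml=807, beverage_ml=370)   # mean oral intakes from the abstract
    print(f"requirement {requirement:.0f} mL, oral intake {intake:.0f} mL, "
          f"shortfall {requirement - intake:.0f} mL")
```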
Abstract:
There is debate as to whether percutaneous coronary intervention (PCI) with drug-eluting stents or coronary artery bypass surgery (CABG) is the better procedure for subjects with type 2 diabetes and coronary artery disease requiring revascularization. There is some evidence that, following these procedures, there is less further revascularization with CABG than with PCI in subjects with diabetes. Two recent studies, the FREEDOM (Future Revascularization Evaluation in patients with Diabetes mellitus: Optimal Management of Multivessel Disease) trial and a trial using a real-world diabetic population from a registry, have shown that the benefits of CABG over PCI in subjects with type 2 diabetes extend to lower rates of death and myocardial infarction, in addition to lower rates of revascularization. However, the rate of stroke may be higher with CABG than with PCI with drug-eluting stents in this population. Thus, if CABG is to be preferred to PCI in subjects with type 2 diabetes and multivessel coronary disease, consideration should be given to how to reduce the rate of stroke with CABG.
Abstract:
Diarrhoea is a common complication observed in critically ill patients. Relationships between diarrhoea, enteral nutrition and the aerobic intestinal microflora have previously been examined only in isolation in this patient cohort. This research used a two-study observational design to examine these associations. Higher diarrhoea incidence rates were observed when patients received enteral tube feeding, had abnormal serum blood results, received multiple medications and had aerobic microflora dysbiosis. Further, significant aerobic intestinal microflora changes were observed over time in patients who experienced diarrhoea. These results establish a platform for further work to improve the intestinal health of critically ill patients.
Abstract:
Background Malnutrition is common in patients with advanced epithelial ovarian cancer (EOC) and is associated with impaired quality of life (QoL), longer hospital stay and higher risk of treatment-related adverse events. This phase III multi-centre randomised clinical trial tested early enteral feeding versus standard care on postoperative QoL. Methods From 2009 to 2013, 109 moderately to severely malnourished patients requiring surgery for suspected advanced EOC were enrolled at five sites across Queensland and randomised to intervention (n = 53) or control (n = 56) groups. Intervention involved intraoperative nasojejunal tube placement and enteral feeding until adequate oral intake could be maintained. Despite being randomised to intervention, 20 patients did not receive feeds (13 did not receive the feeding tube; 7 had it removed early). Control involved postoperative diet as tolerated. QoL was measured at baseline, 6 weeks postoperatively and 30 days after the third cycle of chemotherapy. The primary outcome measure was the difference in QoL between the intervention and the control group. Secondary endpoints included treatment-related adverse event occurrence, length of stay, use of postoperative services, and nutritional status. Results Baseline characteristics were comparable between treatment groups. No significant difference in QoL was found between the groups at any time point. There was a trend towards better nutritional status in patients who received the intervention, but the differences did not reach statistical significance except for the intention-to-treat analysis at 7 days postoperatively (11.8 intervention vs. 13.8 control, p = 0.04). Conclusion Early enteral feeding did not significantly improve patients' QoL compared with standard care but may improve nutritional status.
Abstract:
Cardiovascular diseases refer to the class of diseases that involve the heart or blood vessels (arteries and veins). Examples of medical devices for treating cardiovascular diseases include ventricular assist devices (VADs), artificial heart valves and stents. Metallic biomaterials such as titanium and its alloys are commonly used for ventricular assist devices. However, titanium and its alloys show unacceptable thrombogenicity, which represents a major obstacle to be overcome. Polyurethane (PU) polymer has better blood compatibility and has been used widely in cardiovascular devices. Thus one aim of the project was to coat a PU polymer onto a titanium substrate by increasing the surface roughness and surface functionality. Since the endothelium of a blood vessel has ideal non-thrombogenic properties, a further target of this research project was to grow an endothelial cell layer as a biological coating, based on the tissue engineering strategy. However, seeding endothelial cells on smooth PU coating surfaces is problematic because seeded cells that do not adhere to the PU surface are quickly lost. Thus another aim of the project was to create a porous PU top layer on the dense PU pre-layer-coated titanium substrate. The method of preparing the porous PU layer was based on solvent casting/particulate leaching (SCPL) modified with centrifugation. Without the step of centrifugation, the distribution of the salt particles was not uniform within the polymer solution, and the degree of interconnection between the salt particles was not well controlled. Using the centrifugal treatment, the pore distribution became uniform and the pore interconnectivity was improved even at a high polymer solution concentration (20%) with the maximal salt weight added to the polymer solution. The titanium surfaces were modified by alkali and heat treatment, followed by functionalisation using hydrogen peroxide. A silane coupling agent was applied before the application of the dense PU pre-layer and the porous PU top layer. The ability of the porous top layer to grow and retain the endothelial cells was also assessed through cell culture techniques. The bonding strengths of the PU coatings to the modified titanium substrates were measured and related to the surface morphologies. The outcome of the project is that it has laid a foundation for achieving the strategy of endothelialisation for the blood compatibility of medical devices. This thesis is divided into eight chapters. Chapter 2 describes the current state of the art in the field of surface modification in cardiovascular devices such as ventricular assist devices (VADs). It also analyses the pros and cons of the existing coatings, particularly in the context of this research. The surface coatings for VADs have evolved from early organic/inorganic (passive) coatings, to bioactive coatings (e.g. biomolecules), and to cell-based coatings. Based on the commercial applications and the potential of the coatings, the review is focused on the following six types of coatings: (1) titanium nitride (TiN) coatings, (2) diamond-like carbon (DLC) coatings, (3) 2-methacryloyloxyethyl phosphorylcholine (MPC) polymer coatings, (4) heparin coatings, (5) textured surfaces, and (6) endothelial cell lining. Chapter 3 reviews the polymer scaffolds and one relevant fabrication method.
In tissue engineering, the function of a polymeric material is to provide a three-dimensional architecture (scaffold) which is typically used to accommodate transplanted cells and to guide their growth and the regeneration of tissue. The success of these systems is dependent on the design of the tissue engineering scaffolds. Chapter 4 describes chemical surface treatments for titanium and titanium alloys to increase the bond strength to polymer by altering the substrate surface, for example, by increasing surface roughness or changing surface chemistry. The nature of the surface treatment prior to bonding is found to be a major factor controlling the bonding strength. By increasing surface roughness, an increase in surface area occurs, which allows the adhesive to flow in and around the irregularities on the surface to form a mechanical bond. Changing surface chemistry also results in the formation of a chemical bond. Chapter 5 shows that bond strengths between titanium and polyurethane could be significantly improved by surface-treating the titanium prior to bonding. Alkaline heat treatment and H2O2 treatment were applied to change the surface roughness and the surface chemistry of titanium. Surface treatment increases the bond strength by altering the substrate surface in a number of ways, including increasing the surface roughness and changing the surface chemistry. Chapter 6 deals with the characterisation of the polyurethane scaffolds, which were fabricated using an enhanced solvent casting/particulate (salt) leaching (SCPL) method developed for preparing three-dimensional porous scaffolds for cardiac tissue engineering. The enhanced method combines the conventional SCPL method with a step of centrifugation, the centrifugation being employed to improve the pore uniformity and interconnectivity of the scaffolds. It is shown that the enhanced SCPL method and a collagen coating resulted in a spatially uniform distribution of cells throughout the collagen-coated PU scaffolds. In Chapter 7, the enhanced SCPL method is used to form porous features on the polyurethane-coated titanium substrate. The cavities anchored the endothelial cells, helping them remain on the blood-contacting surfaces. It is shown that the surface porosities created by the enhanced SCPL may be useful in forming a stable endothelial layer upon the blood-contacting surface. Chapter 8 finally summarises the entire work performed on the fabrication and analysis of the polymer-Ti bonding, the enhanced SCPL method and the PU microporous surface on the metallic substrate. It then outlines the possibilities for future work and research in this area.
Abstract:
Background The purpose of this study was to provide a detailed evaluation of adherence to nutrition supplements by patients with a lower limb fracture. Methods These descriptive data are from 49 nutritionally "at-risk" patients aged 70+ years admitted to hospital after a fall-related lower limb fracture and allocated to receive supplementation as part of a randomized, controlled trial. Supplementation commenced on day 7 and continued for 42 days. Prescribed volumes aimed to meet 45% of individually estimated theoretical energy requirements, the shortfall between literature estimates of energy intake and requirements. The supplement was administered by nursing staff on medication rounds in the acute or residential care settings and supervised through thrice-weekly home visits post-discharge. Results The median daily percentage of the prescribed volume of nutrition supplement consumed, averaged over the 42 days, was 67% (interquartile range [IQR], 31-89; n = 49). There was no difference in adherence by gender, accommodation, cognition, or whether the supplement was self-administered or supervised. Twenty-three participants took some supplement every day, and a further 12 missed <5 days. For these 35 "non-refusers," adherence was 82% (IQR, 65-93), and they lost on average 0.7% (SD, 4.0%) of baseline weight over the 6 weeks of supplementation, compared with a loss of 5.5% (SD, 5.4%) in the "refusers" (n = 14, 29%), p = .003. Conclusions We achieved better volume and energy consumption than previous studies of hip fracture patients but still failed to meet the target supplement volumes prescribed to meet 45% of theoretical energy requirements. Clinicians should consider alternative methods of feeding, such as a nasogastric tube, particularly in patients whose adherence to oral nutrition supplements is poor and whose dietary intake alone is insufficient to meet estimated energy requirements.
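A small sketch of the prescription arithmetic described above: the supplement volume needed to deliver 45% of an individually estimated energy requirement, and the adherence percentage against that prescription. The 1.5 kcal/mL supplement energy density and the 1800 kcal requirement are hypothetical values, not figures from the trial.

```python
# Illustrative arithmetic only; the 1.5 kcal/mL energy density and the 1800 kcal
# requirement below are assumptions, not values reported in the paper.

def prescribed_volume_ml(estimated_requirement_kcal: float,
                         target_fraction: float = 0.45,
                         supplement_kcal_per_ml: float = 1.5) -> float:
    """Daily supplement volume needed to supply a target fraction of energy needs."""
    return estimated_requirement_kcal * target_fraction / supplement_kcal_per_ml

def adherence_percent(consumed_ml: float, prescribed_ml: float) -> float:
    """Percentage of the prescribed volume actually consumed on a given day."""
    return 100.0 * consumed_ml / prescribed_ml

if __name__ == "__main__":
    prescribed = prescribed_volume_ml(estimated_requirement_kcal=1800)   # 540 mL/day
    print(f"prescribed {prescribed:.0f} mL/day, "
          f"adherence if 360 mL taken: {adherence_percent(360, prescribed):.0f}%")
```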
Abstract:
Objective: Diarrhoea in the enterally tube fed (ETF) intensive care unit (ICU) patient is a multifactorial problem. Diarrhoeal aetiologies in this patient cohort remain debatable; however, the consequences of diarrhoea have been well established and include electrolyte imbalance, dehydration, bacterial translocation, perianal wound contamination and sleep deprivation. This study examined the incidence of diarrhoea and explored factors contributing to the development of diarrhoea in the ETF, critically ill, adult patient. ---------- Method: After institutional ethical review and approval, a single-centre medical chart audit was undertaken to examine the incidence of diarrhoea in ETF, critically ill patients. Retrospective, non-probability sequential sampling was used to select all emergency-admission adult ICU patients who met the inclusion/exclusion criteria. ---------- Results: Fifty patients were audited. Faecal frequency, consistency and quantity were considered important criteria in defining ETF diarrhoea. The incidence of diarrhoea was 78%. Total patient diarrhoea days (r = 0.422; p = 0.02) and total diarrhoea frequency (r = 0.313; p = 0.027) increased the longer the patient was ETF. Increased severity of illness, peripheral oxygen saturation (SpO2), glucose control, albumin and white cell count were found to be statistically significant factors for the development of diarrhoea. ---------- Conclusion: Diarrhoea in ETF critically ill patients is multifactorial. The early identification of diarrhoea risk factors and the development of a diarrhoea risk-management algorithm are recommended.
Abstract:
Objective: The aim of this literature review is to identify the role of probiotics in the management of enteral tube feeding (ETF) diarrhoea in critically ill patients.---------- Background: Diarrhoea is a common gastrointestinal problem seen in ETF patients. The reported incidence of diarrhoea in tube-fed patients varies from 2% to 68%. Despite extensive investigation, the pathogenesis of ETF diarrhoea remains unclear. Evidence to support probiotics for managing ETF diarrhoea in critically ill patients remains sparse.---------- Method: Literature on ETF diarrhoea and probiotics in critically ill, adult patients was reviewed from 1980 to 2010. The Cochrane Library, PubMed, Science Direct, Medline and the Cumulative Index of Nursing and Allied Health Literature (CINAHL) electronic databases were searched using specific inclusion/exclusion criteria. Key search terms used were: enteral nutrition, diarrhoea, critical illness, probiotics, probiotic species and randomised controlled trial (RCT).---------- Results: Four RCT papers were identified: two reporting full studies, one reporting a pilot RCT and one conference abstract reporting a pilot RCT. A trend towards a reduction in diarrhoea incidence was observed in the probiotic groups. However, mortality associated with probiotic use in some severely and critically ill patients should caution clinicians against its use.---------- Conclusion: Evidence to support probiotic use in the management of ETF diarrhoea in critically ill patients remains unclear. This paper argues that probiotics should not be administered to critically ill patients until further research has examined the causal relationship between probiotics and mortality, irrespective of the patient's disease state or the projected prophylactic benefit of probiotic administration.
Abstract:
BACKGROUND: The efficacy of nutritional support in the management of malnutrition in chronic obstructive pulmonary disease (COPD) is controversial. Previous meta-analyses, based only on cross-sectional analysis at the end of intervention trials, found no evidence of improved outcomes. OBJECTIVE: The objective was to conduct a meta-analysis of randomized controlled trials (RCTs) to clarify the efficacy of nutritional support in improving intake, anthropometric measures, and grip strength in stable COPD. DESIGN: Literature databases were searched to identify RCTs comparing nutritional support with controls in stable COPD. RESULTS: Thirteen RCTs (n = 439) of nutritional support [dietary advice (1 RCT), oral nutritional supplements (ONS; 11 RCTs), and enteral tube feeding (1 RCT)] with a control comparison were identified. Analysis both of the changes induced by nutritional support and of the values obtained only at the end of the intervention showed significantly greater increases in mean total protein and energy intakes with nutritional support, of 14.8 g and 236 kcal daily, respectively. Meta-analyses also showed greater mean (±SE) improvements in favor of nutritional support for body weight (1.94 ± 0.26 kg, P < 0.001; 11 studies, n = 308) and grip strength (5.3%, P < 0.050; 4 studies, n = 156), which was not shown by ANOVA at the end of the intervention, largely because of bias associated with baseline imbalance between groups. CONCLUSION: This systematic review and meta-analysis showed that nutritional support, mainly in the form of ONS, improves total intake, anthropometric measures, and grip strength in COPD. These results contrast with those of previous analyses that were based only on cross-sectional measurements at the end of intervention trials.
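For readers unfamiliar with how such pooled estimates are produced, the sketch below shows a standard fixed-effect (inverse-variance) pooling of mean differences. The per-study effects and standard errors are invented placeholders; only the pooling formula is standard, and it is not claimed to reproduce this review's exact method (which may, for instance, have used random-effects models).

```python
# Minimal fixed-effect (inverse-variance) pooling sketch. The per-study mean
# differences and standard errors below are made-up placeholders, not data
# from the review; only the pooling formula itself is standard.
from math import sqrt

def pooled_mean_difference(effects, standard_errors):
    """Inverse-variance weighted mean difference and its standard error."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    pooled_se = sqrt(1.0 / sum(weights))
    return pooled, pooled_se

if __name__ == "__main__":
    # Hypothetical body-weight changes (kg) and standard errors from three trials.
    effects = [1.5, 2.2, 1.8]
    ses = [0.5, 0.6, 0.4]
    d, se = pooled_mean_difference(effects, ses)
    print(f"pooled difference {d:.2f} kg (SE {se:.2f})")
```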
Abstract:
Currently there is confusion about the value of using nutritional support to treat malnutrition and improve functional outcomes in chronic obstructive pulmonary disease (COPD). This systematic review and meta-analysis of randomised controlled trials (RCTs) aimed to clarify the effectiveness of nutritional support in improving functional outcomes in COPD. A systematic review identified 12 RCTs (n = 448) in stable COPD patients investigating the effects of nutritional support [dietary advice (1 RCT), oral nutritional supplements (ONS; 10 RCTs), enteral tube feeding (1 RCT)] versus control on functional outcomes. Meta-analysis of the changes induced by intervention found that whilst respiratory function (FEV1, lung capacity, blood gases) was unresponsive to nutritional support, both inspiratory and expiratory muscle strength (PImax +3.86 SE 1.89 cmH2O, P = 0.041; PEmax +11.85 SE 5.54 cmH2O, P = 0.032) and handgrip strength (+1.35 SE 0.69 kg, P = 0.05) were significantly improved, and associated with weight gains of ≥ 2 kg. Nutritional support produced significant improvements in quality of life in some trials, although meta-analysis was not possible. It also led to improved exercise performance and enhancement of exercise rehabilitation programmes. This systematic review and meta-analysis demonstrates that nutritional support in COPD results in significant improvements in a number of clinically relevant functional outcomes, complementing a previous review showing improvements in nutritional intake and weight.
Abstract:
Background: Nutrition screening is usually administered by nurses. However, most studies on nutrition screening tools have not used nurses to validate the tools. The 3-Minute Nutrition Screening (3-MinNS) tool assesses weight loss, dietary intake and muscle wastage, with the composite score used to determine the risk of malnutrition. The aim of this study was to determine the validity and reliability of 3-MinNS when administered by nurses, who are the intended assessors. Methods: In this cross-sectional study, three ward-based nurses screened 121 patients aged 21 years and over using 3-MinNS in three wards within 24 hours of admission. A dietitian then assessed the patients' nutritional status using Subjective Global Assessment (SGA) within 48 hours of admission, whilst blinded to the results of the screening. To assess the reliability of 3-MinNS, 37 patients screened by the first nurse were re-screened by a second nurse within 24 hours, who was blinded to the results of the first nurse. The sensitivity, specificity and best cutoff score for 3-MinNS were determined using the receiver operating characteristic (ROC) curve. Results: The best cutoff score for identifying all patients at risk of malnutrition using 3-MinNS was 3, with a sensitivity of 89% and a specificity of 88%. This cutoff point also identified all (100%) severely malnourished patients. There was a strong correlation between 3-MinNS and SGA (r = 0.78, p < 0.001). The agreement between the two nurses conducting the 3-MinNS tool was 78.3%. Conclusion: 3-Minute Nutrition Screening is a valid and reliable tool for nurses to identify patients at risk of malnutrition.
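The cutoff selection described above follows the usual sensitivity/specificity trade-off read off an ROC analysis. The sketch below illustrates that calculation with invented screening scores and reference labels; it is not the study data and assumes a "score ≥ cutoff means at risk" decision rule.

```python
# Sketch of how a screening cutoff can be chosen, in the spirit of the ROC analysis
# described above. The scores and malnutrition labels are invented for illustration;
# they are not the study data.

def sensitivity_specificity(scores, at_risk, cutoff):
    """Classify score >= cutoff as 'at risk' and compare against the reference labels."""
    tp = sum(s >= cutoff and r for s, r in zip(scores, at_risk))
    fn = sum(s < cutoff and r for s, r in zip(scores, at_risk))
    tn = sum(s < cutoff and not r for s, r in zip(scores, at_risk))
    fp = sum(s >= cutoff and not r for s, r in zip(scores, at_risk))
    return tp / (tp + fn), tn / (tn + fp)

if __name__ == "__main__":
    scores = [0, 1, 2, 3, 3, 4, 5, 6, 2, 1]                       # hypothetical screening scores
    at_risk = [False, False, False, True, True, True, True, True, False, False]  # reference labels
    for cutoff in range(1, 7):
        sens, spec = sensitivity_specificity(scores, at_risk, cutoff)
        print(f"cutoff {cutoff}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```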
Abstract:
Background and aims The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients consume ≤50% of the offered food in Australian and New Zealand hospitals. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate whether medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly. Methods An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75%, 100%) was evaluated for each main meal and snack over a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT). Results 184 patients were observed over four weeks; 32 patients were referred for MNT. Although baseline and post-MNT data for 20 participants (68 ± 17 years, 65% female) indicated a significant increase in median energy and protein intake post-MNT (3600 kJ/day, 40 g/day) versus baseline (2250 kJ/day, 25 g/day) (p < 0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition-impact symptoms affected intake. Conclusion In this pilot study, whilst dietary intake improved, it remained inadequate to meet participants' estimated requirements because of ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding could be a possible solution for such patients.