866 results for Nutrition-associated Complications
Abstract:
Genome-wide association studies (GWAS) have identified multiple common genetic variants associated with an increased risk of prostate cancer (PrCa), but these explain less than one-third of the heritability. To identify further susceptibility alleles, we conducted a meta-analysis of four GWAS including 5953 cases of aggressive PrCa and 11 463 controls (men without PrCa). We computed association tests for approximately 2.6 million SNPs and followed up the most significant SNPs by genotyping 49 121 samples in 29 studies through the international PRACTICAL and BPC3 consortia. We not only confirmed the association of a PrCa susceptibility locus, rs11672691 on chromosome 19, but also showed an association with aggressive PrCa [odds ratio = 1.12 (95% confidence interval 1.03-1.21), P = 1.4 × 10⁻⁸]. This report describes a genetic variant associated with aggressive PrCa, a form of the disease with a poorer prognosis.
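A brief sketch, illustrative only and not code from the study: the reported confidence interval can be checked for internal consistency by reconstructing the 95% Wald interval on the log-odds scale, and the reported P value compared against the conventional genome-wide significance threshold of 5 × 10⁻⁸.

```python
import math

# Reported summary statistics for rs11672691 (from the abstract above).
or_point = 1.12
ci_low, ci_high = 1.03, 1.21

# Approximate standard error of log(OR) implied by a 95% Wald interval.
se_log_or = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)

# Rebuild the interval from the point estimate as a consistency check.
low = math.exp(math.log(or_point) - 1.96 * se_log_or)
high = math.exp(math.log(or_point) + 1.96 * se_log_or)
print(round(low, 2), round(high, 2))  # → 1.03 1.21

# The reported P = 1.4e-8 clears the conventional genome-wide
# significance threshold of 5e-8.
print(1.4e-8 < 5e-8)  # → True
```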
Abstract:
The implementation guide for the surveillance of CLABSI in intensive care units (ICU) was produced by the Healthcare Associated Infection (HAI) Technical Working Group of the Australian Commission on Safety and Quality in Health Care (ACSQHC), and endorsed by the ACSQHC HAI Advisory Committee. State surveillance units, the ACSQHC and the Australian and New Zealand Intensive Care Society (ANZICS) have representatives on the Technical Working Group, and have provided input into this document.
Abstract:
Background Undernutrition, weight loss and dehydration are major clinical issues for people with dementia in residential care, with excessive weight loss contributing to increased risk of frailty, immobility, illness and premature mortality. This paper discusses a nutritional knowledge and attitudes survey conducted as part of a larger project focused on improving the nutritional intake of people with dementia within a residential care facility in Brisbane, Australia. Aims The specific aims of the survey were to identify (i) knowledge of the nutritional needs of aged care facility residents; (ii) mealtime practices; and (iii) attitudes towards mealtime practices and organisation. Methods A survey based on those used in other healthcare settings was completed by 76 staff members. The survey included questions about nutritional knowledge, opinions of the food service, frequency of feeding assistance provided and feeding assessment practices. Results Nutritional knowledge scores ranged from 1 to 9 out of a possible 10, with a mean score of 4.67. While 76% of respondents correctly identified risk factors associated with malnutrition in nursing home residents, only 38% correctly identified the need for increased protein and energy in residents with pressure ulcers, and just 15% exhibited correct knowledge of fluid requirements. Further, while nutritional assessment was considered an important part of practice by 83% of respondents, just 53% indicated that they actually carried out such assessments. Identified barriers to promoting optimal nutrition included insufficient time to observe residents (56%); being unaware of residents' feeding issues (46%); poor knowledge of nutritional assessments (44%); and the unappetising appearance of food served (57%). Conclusion An important step towards improving health and quality of life for residents of aged care facilities would be to enhance staff nutritional awareness and assessment skills.
This should be carried out through increased attention to both preservice curricula and on-the-job training. Implications for practice The residential facility staff surveyed demonstrated low levels of nutrition knowledge, which reflects findings from the international literature. This has implications for the provision of responsive care to residents of these facilities and should be explored further.
Abstract:
Background & aim: This paper describes nutrition care practices in acute care hospitals across Australia and New Zealand. Methods: A survey on nutrition care practices in Australian and New Zealand hospitals was completed by directors of the dietetics departments of 56 hospitals that participated in the Australasian Nutrition Care Day Survey 2010. Results: Overall, 370 wards representing various specialities participated in the study. Nutrition risk screening was conducted in 64% (n=234) of the wards. Seventy-nine percent (n=185) of these wards reported using the Malnutrition Screening Tool, 16% the Malnutrition Universal Screening Tool (n=37), and 5% local tools (n=12). Nutrition risk rescreening was conducted in 14% (n=53) of the wards. More than half the wards referred patients at nutrition risk to dietitians and commenced a nutrition intervention protocol. Feeding assistance was provided in 89% of the wards. “Protected” meal times were implemented in 5% of the wards. Conclusion: A large number of acute care hospital wards in Australia and New Zealand do not comply with evidence-based practice guidelines for the nutritional management of malnourished patients. This study also provides recommendations for practice.
Abstract:
Background & aims: One aim of the Australasian Nutrition Care Day Survey was to determine the nutritional status and dietary intake of acute care hospital patients. Methods: Dietitians from 56 hospitals in Australia and New Zealand completed a 24-h survey of nutritional status and dietary intake of adult hospitalised patients. Nutritional risk was evaluated using the Malnutrition Screening Tool. Participants ‘at risk’ underwent nutritional assessment using Subjective Global Assessment. Based on the International Classification of Diseases (Australian modification), participants were also deemed malnourished if their body mass index was <18.5 kg/m². Dietitians recorded participants’ dietary intake at each main meal and snacks as 0%, 25%, 50%, 75%, or 100% of that offered. Results: 3122 patients (mean age: 64.6 ± 18 years) participated in the study. Forty-one percent of the participants were “at risk” of malnutrition. Overall malnutrition prevalence was 32%. Fifty-five percent of malnourished participants and 35% of well-nourished participants consumed ≤50% of the food during the 24-h audit. “Not hungry” was the most common reason for not consuming everything offered during the audit. Conclusion: Malnutrition and sub-optimal food intake are prevalent in acute care patients across hospitals in Australia and New Zealand and warrant appropriate interventions.
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake recorded as 0, 25, 50, 75, and 100% intake. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65±18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than for well-nourished (WN) patients (15 vs. 10 days, p<0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≥50% (13 vs. 11 days, p<0.0001). MN patients had higher readmission rates (36% vs. 30%, p = 0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (CI: 1.03-3.22, p = 0.04) and 2.7 times greater for those consuming ≤25% of the offered food (CI: 1.54-4.68, p = 0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None Declared.
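As background for the odds ratios quoted above, a minimal sketch with hypothetical counts (not the ANCDS data; the survey's own estimates were additionally adjusted via logistic regression) shows how an unadjusted odds ratio and its Wald interval are computed from a 2×2 mortality-by-nutritional-status table:

```python
import math

# Hypothetical 2x2 table (NOT the ANCDS data): 90-day in-hospital
# mortality by nutritional status, for illustration only.
deaths_mn, survivors_mn = 30, 970    # malnourished (hypothetical counts)
deaths_wn, survivors_wn = 20, 1980   # well-nourished (hypothetical counts)

# Unadjusted odds ratio: odds of death if malnourished / odds if well-nourished.
odds_ratio = (deaths_mn / survivors_mn) / (deaths_wn / survivors_wn)

# Standard error of log(OR) and the corresponding 95% Wald interval.
se = math.sqrt(1/deaths_mn + 1/survivors_mn + 1/deaths_wn + 1/survivors_wn)
low = math.exp(math.log(odds_ratio) - 1.96 * se)
high = math.exp(math.log(odds_ratio) + 1.96 * se)
print(round(odds_ratio, 2))  # → 3.06
```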
Abstract:
Background: Extracorporeal circulation (ECC), the diversion of blood flow through a circuit located outside of the body, has been one of the major advances in modern medicine. Cardiopulmonary bypass (CPB), renal dialysis, apheresis and extracorporeal membrane oxygenation (ECMO) are all different forms of ECC. Despite its major benefits, when blood comes into contact with foreign material, both the coagulation and inflammation cascades are activated simultaneously. Short periods of exposure to ECC, e.g. CPB (approximately 2 h in duration), are known to be associated with haemolysis, coagulopathies, bleeding and inflammation which demand blood product support. Therefore, it is not unexpected that these complications would be exaggerated with prolonged periods of ECC such as in ECMO (days to weeks in duration). The variability and complexity of the underlying pathologies of patients requiring ECC make it difficult to study the cause and effect of these complications. To overcome this problem we developed an ovine (sheep) model of ECC. Method: Healthy female sheep (1–3 y.o.) weighing 40–50 kg were fasted overnight, anaesthetised, intubated and ventilated [1]. Half the group received smoke-induced acute lung injury (S-ALI group) (n = 8) and the other half did not (healthy group) (n = 8). Sheep were subsequently cannulated (Medtronic Inc, Minneapolis, MN, USA) and veno-venous ECMO commenced using a PLS ECMO circuit and Quadrox D oxygenator (Maquet Cardiopulmonary AG, Hechinger Straße, Germany). There was continuous physiological monitoring, and blood was collected at specified time intervals for full blood counts, platelet function analysis (by Multiplate®), routine coagulation and assessment of clot formation and lysis (by ROTEM®). Preliminary results: Full blood counts and routine coagulation results from normal healthy sheep were comparable to those of normal human adults.
Within 15 min of initiating ECMO, PT, PTT and EXTEM clot formation time increased, whilst EXTEM maximum clot firmness decreased in both cohorts. Discussion & Conclusions: Preliminary results from both 2-h ECMO cohorts showed that the anatomy, haematology and coagulation parameters of an adult sheep are comparable to those of a human adult. Experiments are currently underway with healthy (n = 8) and S-ALI (n = 8) sheep on ECMO for 24 h. In addition to characterising how ECMO alters haematology and coagulation parameters, we hope that this work will also define which blood components will be most effective to correct bleeding or clotting complications during ECMO support.
Abstract:
Background: Extracorporeal membrane oxygenation (ECMO) is a complex rescue therapy used to provide cardiac and/or respiratory support for critically ill patients who have failed maximal conventional medical management. ECMO is based on a modified cardiopulmonary bypass (CPB) circuit, and can provide cardiopulmonary support for up to several months. It can be used in a veno-venous configuration (VV-ECMO) for isolated respiratory failure, or in a veno-arterial configuration (VA-ECMO) where support is necessary for cardiac +/- respiratory failure. The ECMO circuit consists of five main components: large-bore access cannulae for drainage of the venous system, with return cannulae to either the venous (in VV-ECMO) or arterial (in VA-ECMO) system; an oxygenator, with a vast surface area of hollow filaments, which allows addition of oxygen and removal of carbon dioxide; a centrifugal blood pump, which propels blood through the circuit at up to 10 L/minute; a control module; and a thermoregulatory unit, which allows exact temperature control of the extracorporeal blood. Methods: The first successful use of ECMO for ARDS in adults occurred in 1972, and its use has become more commonplace over the last 30 years, supported by improvements in the design and biocompatibility of the equipment, which have reduced the morbidity associated with this modality. Whilst the use of ECMO in the neonatal population has been supported by numerous studies, the evidence upon which ECMO was integrated into adult practice was substantially less robust. Results: Recent data, including the CESAR study (Conventional Ventilatory Support versus Extracorporeal Membrane Oxygenation for Severe Respiratory Failure), have added a degree of evidence to the role of ECMO in such a patient population. The CESAR study analysed 180 patients, and confirmed that ECMO was associated with an improved rate of survival.
More recently, ECMO has been utilised in numerous situations within the critical care area, including support for high-risk percutaneous interventions in the cardiac catheter laboratory, the operating room and the emergency department, as well as in specialised inter-hospital retrieval services. The increased understanding of the risk:benefit profile of ECMO, along with a reduction in the morbidity associated with its use, will doubtless lead to a substantial rise in the utilisation of this modality. As with all extracorporeal circuits, ECMO confronts a basic premise of mammalian physiology: when blood comes into contact with a foreign circuit, both the inflammation and coagulation cascades are activated. Anticoagulation is readily dealt with through the use of agents such as heparin, but the inflammatory excess, whilst less macroscopically obvious, continues unabated. Platelet consumption and neutrophil activation occur rapidly, and the clinician is faced with balancing the need for anticoagulation for the circuit against haemostasis in an acutely bleeding patient. Alterations in pharmacokinetics may result in inadequate levels of disease-modifying therapeutics, such as antibiotics, hence paradoxically delaying recovery from conditions such as pneumonia. Key elements of nutrition and the innate immune system may similarly be affected. Summary: This presentation will discuss the basic features of ECMO for the non-specialist, and review the clinical conundrum faced by the team treating these most complex cases.
Abstract:
Currently there is confusion about the value of using nutritional support to treat malnutrition and improve functional outcomes in chronic obstructive pulmonary disease (COPD). This systematic review and meta-analysis of randomised controlled trials (RCTs) aimed to clarify the effectiveness of nutritional support in improving functional outcomes in COPD. A systematic review identified 12 RCTs (n = 448) in stable COPD patients investigating the effects of nutritional support [dietary advice (1 RCT), oral nutritional supplements (ONS; 10 RCTs), enteral tube feeding (1 RCT)] versus control on functional outcomes. Meta-analysis of the changes induced by intervention found that whilst respiratory function (FEV1, lung capacity, blood gases) was unresponsive to nutritional support, both inspiratory and expiratory muscle strength (PImax +3.86, SE 1.89 cm H2O, P = 0.041; PEmax +11.85, SE 5.54 cm H2O, P = 0.032) and handgrip strength (+1.35, SE 0.69 kg, P = 0.05) were significantly improved, and these improvements were associated with weight gains of ≥2 kg. Nutritional support produced significant improvements in quality of life in some trials, although meta-analysis was not possible. It also led to improved exercise performance and enhancement of exercise rehabilitation programmes. This systematic review and meta-analysis demonstrates that nutritional support in COPD results in significant improvements in a number of clinically relevant functional outcomes, complementing a previous review showing improvements in nutritional intake and weight.
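As a consistency sketch, assuming the pooled effects were tested with two-sided Wald-type tests (an assumption, not something stated in the abstract), each reported P value can be recovered from the pooled effect and its standard error:

```python
import math

def two_sided_p(effect, se):
    """Two-sided P value for a Wald z-test of effect / se."""
    z = abs(effect / se)
    # Survival function of |Z| on both tails: 2 * (1 - Phi(z)) = erfc(z / sqrt(2)).
    return math.erfc(z / math.sqrt(2))

# Pooled effects and standard errors reported in the meta-analysis above.
print(round(two_sided_p(3.86, 1.89), 3))   # PImax:    ~0.041
print(round(two_sided_p(11.85, 5.54), 3))  # PEmax:    ~0.032
print(round(two_sided_p(1.35, 0.69), 3))   # handgrip: ~0.050
```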
Abstract:
Background: Recommendations for the introduction of solids and fluids to an infant’s diet have changed over the past decade. Since these changes, there has been minimal research to determine patterns in the introduction of foods and fluids to infants. Methods: This retrospective cohort study surveyed, at around 4 months postpartum, mothers who gave birth in Queensland, Australia, from February 1 to May 31, 2010. Frequencies of foods and fluids given to infants at 4, 8, 13, and 17 weeks were described. Logistic regression determined associations between infant feeding practices, the introduction of other foods and fluids at 17 weeks, and sociodemographic characteristics. Results: The response rate was 35.8%. At 17 weeks, 68% of infants were breastfed and 33% were exclusively breastfed. Solids and water had been introduced to 8.6% and 35.0% of infants, respectively. The introduction of solids by 17 weeks was associated with younger maternal age and the infant being given water and infant formula at 4 weeks. The infant being given water at 17 weeks was associated with younger maternal age, the infant being given infant formula at 4 weeks, level of education, relative socioeconomic disadvantage, parity, and birth facility. Conclusion: Over the past decade, there has been a significant reduction in the proportion of infants in Australia who have been given solids by 17 weeks. Sociodemographic characteristics and formula feeding practices at 4 weeks were associated with the introduction of solids and water by 17 weeks. Further research should examine these barriers to improve compliance with current infant feeding recommendations.
Abstract:
Proteinuria was observed in 27% of 153 patients taking tenofovir for more than 1 year. Concomitant protease inhibitor therapy and cumulative tenofovir exposure were independently associated with proteinuria in this cohort. Proteinuria was reversible in 11 of 12 patients who ceased tenofovir because of proteinuria without altering other medications. Clinicians should be aware that tenofovir can cause reversible proteinuria in patients with HIV.
Abstract:
A genome-wide search for markers associated with BSE incidence was performed by using Transmission-Disequilibrium Tests (TDTs). Significant segregation distortion, i.e., unequal transmission probabilities of alleles within a locus, was found for three marker loci on Chromosomes (Chrs) 5, 10, and 20. Although the TDT is robust to false associations owing to hidden population substructures, it cannot distinguish segregation distortion caused by a true association between a marker and bovine spongiform encephalopathy (BSE) from a population-wide distortion. An interaction test and a segregation distortion analysis in half-sib controls were used to disentangle these two alternative hypotheses. None of the markers showed any significant interaction between allele transmission rates and disease status, and only the marker on Chr 10 showed a significant segregation distortion in control individuals. Nevertheless, the control group may have been a mixture of resistant and susceptible but unchallenged individuals. When new genotypes were generated in the vicinity of these three markers, evidence for an association with BSE was confirmed for the locus on Chr 5.
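For readers unfamiliar with the method, the classical TDT statistic for a biallelic marker compares transmissions of the two alleles from heterozygous parents to affected offspring; a minimal sketch with hypothetical counts (not the BSE study's data):

```python
import math

# Hypothetical transmission counts from heterozygous parents to affected
# offspring (illustrative only; not from the study above).
b = 60  # heterozygous parents transmitting allele A
c = 40  # heterozygous parents transmitting allele a

# McNemar-type TDT statistic: chi-square with 1 degree of freedom.
# Under the null of no linkage/association, E[b] = E[c].
tdt_chi2 = (b - c) ** 2 / (b + c)

# P value for a 1-df chi-square via its normal equivalent:
# P(chi2_1 > x) = erfc(sqrt(x / 2)).
p_value = math.erfc(math.sqrt(tdt_chi2 / 2))
print(tdt_chi2, round(p_value, 4))  # → 4.0 0.0455
```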
Abstract:
Aim This cross-sectional study explores associations between migrant Indian mothers’ use of controlling feeding practices (pressure to eat, restriction and monitoring) and their concerns and perceptions regarding their children’s weight and picky eating behaviour. Methods Two hundred and thirty mothers with children aged 1-5 years, residing in Australia for 1-8 years, participated by completing a self-reported questionnaire. Results Perceptions and concerns regarding children’s weight were not associated with any of the controlling feeding practices. A positive association was noted between pressure feeding and perceptions of pickiness after adjusting for covariates: children’s age, gender and weight-for-age Z-score. Girls, older children, and children with higher weight-for-age Z-scores were pressure fed to a greater extent. Conclusions This study supports the generalisation of findings from Caucasian literature that pressure feeding and perceptions of pickiness are positively related.
Abstract:
The ageing population is increasing worldwide, as are a range of chronic diseases, conditions, and physical and cognitive disabilities associated with later life. The older population is also neurologically diverse, with unique and specific challenges around mobility and engagement with the urban environment. Older people tend to interact less with cities and neighbourhoods, putting them at risk of further illnesses and co-morbidities associated with being less physically and socially active. Empirical evidence has shown that reduced access to healthcare services, health-related resources and social interaction opportunities is associated with increases in morbidity and premature mortality. While it is crucial to respond to the needs of this ageing population, there is insufficient evidence for interventions regarding their experiences of public space from the vantage point of neurodiversity. This paper provides a conceptual and methodological framework to investigate relationships between the sensory and cognitive abilities of older people, and their use and negotiation of the urban environment. The paper will refer to a case example of the city of Logan, an urban area in Queensland, Australia, where current urban development provides opportunities for the design of spaces that take experiences of neurodiversity into account. The framework will inform the development of principles for urban design for increasingly neurologically diverse populations.