515 results for Post-release outcome


Relevance:

20.00%

Publisher:

Abstract:

Accurately quantifying the total methane released from freshwater storages to the atmosphere requires spatial–temporal measurement of both diffusive and ebullitive emissions. Existing floating chamber techniques provide localised assessment of methane flux; however, significant errors can arise when weighting and extrapolating to the entire storage, particularly when ebullition is significant. An improved technique has been developed that complements traditional chamber-based experiments to quantify the storage-scale release of methane gas to the atmosphere through ebullition, using measurements from an Optical Methane Detector (OMD) and a robotic boat. This provides a conservative estimate of the methane emission rate from ebullition along with the bubble volume distribution. It also georeferences the area of ebullition activity across entire storages at short temporal scales. An assessment on Little Nerang Dam in Queensland, Australia, demonstrated that whole-storage methane release differed significantly spatially and throughout the day. Total methane emission estimates showed a potential 32-fold variation in whole-of-dam rates depending on the measurement and extrapolation method and the time of day used. The combined chamber and OMD technique showed that 1.8–7.0% of the surface area of Little Nerang Dam accounts for up to 97% of the total methane released to the atmosphere throughout the day. Additionally, over 95% of detectable ebullition occurred at depths of less than 12 m during the day and 6 m at night. This difference in the spatial and temporal distribution of methane release rates highlights the need to monitor significant regions of, if not the entire, water storage in order to provide an accurate estimate of ebullition rates and their contribution to annual methane emissions.
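The extrapolation error described above is easy to illustrate with a simple area-weighted calculation. The sketch below is hypothetical (zone areas and fluxes are invented, not measurements from Little Nerang Dam): it shows how a small ebullition hotspot can dominate the whole-of-storage estimate, and how applying a single chamber reading from a quiet zone to the full surface badly understates emissions.

```python
# Hypothetical area-weighted extrapolation of methane flux to a whole storage.
# Zone areas (m^2) and fluxes (mg CH4 m^-2 d^-1) are invented for illustration.
zones = [
    {"name": "hotspot (ebullition)", "area_m2": 20_000, "flux": 500.0},
    {"name": "littoral", "area_m2": 180_000, "flux": 10.0},
    {"name": "pelagic", "area_m2": 800_000, "flux": 2.0},
]

total_area = sum(z["area_m2"] for z in zones)

# Whole-storage emission: weight each zone's flux by its area.
total_emission_mg_per_day = sum(z["area_m2"] * z["flux"] for z in zones)

# The hotspot's share of emissions versus its share of surface area.
hotspot = zones[0]
hotspot_area_share = hotspot["area_m2"] / total_area
hotspot_emission_share = (hotspot["area_m2"] * hotspot["flux"]
                          / total_emission_mg_per_day)

# Naive extrapolation: one chamber placed in the pelagic zone,
# its flux applied to the entire surface, misses the hotspot entirely.
naive_emission_mg_per_day = zones[2]["flux"] * total_area
```

In this invented example the hotspot covers 2% of the surface yet contributes roughly three quarters of emissions, which is the same qualitative pattern the abstract reports (1.8–7.0% of area producing up to 97% of release).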

Post-transplantation lymphoproliferative disorders (PTLD) arise in the immunosuppressed and are frequently Epstein-Barr virus (EBV) associated. The most common PTLD histological sub-type is diffuse large B-cell lymphoma (EBV+DLBCL-PTLD). Restoration of EBV-specific T-cell immunity can induce EBV+DLBCL-PTLD regression. The most frequent B-cell lymphoma in the immunocompetent is also DLBCL. 'EBV-positive DLBCL of the elderly' (EBV+DLBCL) is a rare but well-recognized DLBCL entity that occurs in the overtly immunocompetent and has an adverse outcome relative to EBV-negative DLBCL. Unlike PTLD (which is classified as viral latency III), the literature suggests EBV+DLBCL is typically latency II, i.e. expression is limited to the immuno-subdominant EBNA1, LMP1 and LMP2 EBV proteins. If correct, this would be a major impediment for T-cell immunotherapeutic strategies. Unexpectedly, we observed that EBV+DLBCL-PTLD and EBV+DLBCL both shared features consistent with type III EBV latency, including expression of the immuno-dominant EBNA3A protein. Extensive analysis showed frequent polymorphisms in functionally defined CD8+ T-cell epitope-encoding regions of EBNA1 and LMP1, whereas EBNA3A polymorphisms were very rare, making EBNA3A an attractive immunotherapy target. As with EBV+DLBCL-PTLD, the antigen-presenting machinery within lymphomatous nodes was intact. EBV+DLBCL expresses EBNA3A, suggesting it is amenable to immunotherapeutic strategies.

Background and significance: Nurses' job dissatisfaction is associated with negative nursing and patient outcomes. One of the most powerful reasons for nurses to stay in an organisation is satisfaction with leadership. However, nurses are frequently promoted to leadership positions without appropriate preparation for the role. Although a number of leadership programs have been described, none has been tested for effectiveness using a randomised controlled trial methodology. Aims: The aims of this research were to develop an evidence-based leadership program and to test its effectiveness on nurse unit managers' (NUMs') and nursing staff's (NS's) job satisfaction, and on the leader behaviour scores of nurse unit managers. Methods: First, the study used a comprehensive literature review to examine the evidence on job satisfaction, leadership and front-line manager competencies. From this evidence a summary of leadership practices was developed to construct a two-component leadership model. The components of this model were then combined with the evidence distilled from previous leadership development programs to develop a Leadership Development Program (LDP). This evidence informed the program's design, contents, teaching strategies and learning environment. Central to the LDP were the evidence-based leadership practices associated with increasing nurses' job satisfaction. A randomised controlled trial (RCT) design was employed to test the effectiveness of the LDP. An RCT is one of the most powerful research designs, and its use makes this study unique, as an RCT had never previously been used to evaluate any leadership program for front-line nurse managers. Thirty-nine consenting nurse unit managers from a large tertiary hospital were randomly allocated to receive either the leadership program or only the program's written information about leadership.
Demographic baseline data were collected from participants in the NUM groups and the nursing staff who reported to them. Validated questionnaires measuring job satisfaction and leader behaviours were administered to the nurse unit managers and the NS at baseline, and at three and six months after the commencement of the intervention. Independent and paired t-tests were used to analyse continuous outcome variables, and Chi-square tests were used for categorical data. Results: The study found that the nurse unit managers' overall job satisfaction score was higher in the intervention group than in the control group at 3 months (p = 0.016) and at 6 months (p = 0.027) post commencement of the intervention. Similarly, at 3-month testing, mean scores in the intervention group were higher in five of the six "positive" sub-categories of the leader behaviour scale when compared to the control group, with a significant difference in one sub-category, effectiveness (p = 0.015). No differences were observed in leadership behaviour scores between groups by 6 months post commencement of the intervention. Over time, at 3- and 6-month testing, there were significant increases in four transformational leader behaviour scores and in one positive transactional leader behaviour score in the intervention group. Over time at 3-month testing there were significant increases in the three leader behaviour outcome scores; however, at 6-month testing only one of these remained significantly increased. Job satisfaction scores did not differ significantly between the NS groups at three months and at six months post commencement of the intervention. However, over time within the intervention group, at 6-month testing there was a significant increase in the job satisfaction scores of NS.
There were no significant increases in NUM leader behaviour scores in the intervention group, as rated by the nursing staff who reported to them. Over time, at 3-month testing, NS rated nurse unit managers significantly lower on two leader behaviours and two leader behaviour outcome scores. At 6-month testing, over time, one leader behaviour score was rated significantly lower and the non-transactional leader behaviour was rated significantly higher. Discussion: The study represents the first attempt to test the effectiveness of a leadership development program for nurse unit managers using an RCT. The program's design, contents, teaching strategies and learning environment were based on a summary of the literature. The overall improvement in role satisfaction was sustained for at least 6 months post intervention. The results may reflect the program's evidence-based approach, which increased the nurse unit managers' confidence in their role and thereby their job satisfaction. Two other factors possibly contributed to the nurse unit managers' increased job satisfaction scores: the program's teaching strategies, which included the involvement of the hospital's executive nursing team, and the recognition the LDP gave to the importance of the NUM role within the hospital. Consequently, participating in the program may have led to nurse unit managers feeling valued and rewarded for their service, and hence more satisfied. The lack of change in leadership behaviours between groups at the 6-month data collection point may indicate that the LDP needs to be conducted over a longer period; this is suggested because, within the intervention group, there were significant increases in self-reported leader behaviours over time at 3 and 6 months.
The lack of significant changes in leader behaviour scores between groups may equally signify that leader behaviours require different interventions to achieve change. The nursing staff results suggest that the LDP's design needs to involve NS in the program's aims and progress from the outset. It is also possible that including regular feedback from NS to the nurse unit managers during the LDP would alter NS's job satisfaction and their perception of nurse unit managers' leader behaviours. Conclusion/Implications: This study highlights the value of providing an evidence-based leadership program to nurse unit managers to increase their job satisfaction. The evidence-based leadership program increased job satisfaction, but its effect on leadership behaviour was only seen over time. Further research is required to test interventions that attempt to change leader behaviours. Further research on NS's job satisfaction is also required, to test the indirect effects of LDPs on the NS whose nurse unit managers participate in them.
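The trial above analyses within-group change with paired t-tests. As a minimal illustration of that calculation (the scores below are invented, not the study's data), the paired t statistic is the mean of the before/after differences divided by the standard error of those differences:

```python
import math
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t statistic for before/after scores on the same subjects
    (e.g. job satisfaction at baseline vs 3 months).
    Returns (t, degrees of freedom)."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    # t = mean difference / standard error of the differences
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Invented satisfaction scores for six managers (illustration only).
baseline = [3.1, 2.8, 3.5, 3.0, 2.9, 3.2]
three_months = [3.6, 3.1, 3.9, 3.4, 3.0, 3.8]
t_stat, df = paired_t(baseline, three_months)
```

The resulting t is compared against the t distribution with n − 1 degrees of freedom to obtain the p-values quoted in the abstract; independent-samples t-tests for the between-group comparisons differ only in how the standard error is pooled.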

Chlamydia trachomatis infection of the male and female reproductive tracts is the world's leading bacterial sexually transmitted disease, and can lead to damaging pathology, scarring and infertility. The resolution of chlamydial infection requires the development of adaptive immune responses, including cell-mediated and humoral immunity. Whilst cluster of differentiation (CD)4+ T cells are known to be essential for clearance of infection [1], they are also associated with immune cell infiltration, autoimmunity and infertility in the testes [2-3]. Conversely, antibodies are less associated with inflammation, are readily transported into the reproductive tracts, and can offer lumenal neutralization of chlamydiae prior to infection. Antibodies, or immunoglobulins (Ig), play a supportive role in the resolution of chlamydial infections, and this thesis sought to define the function of IgA and IgG against a variety of chlamydial antigens expressed during the intracellular and extracellular stages of the chlamydial developmental cycle. Transport of IgA and IgG into the mucosal lumen is facilitated by receptor-mediated transcytosis, yet the expression profile (under normal conditions and during urogenital chlamydial infection) of the polymeric immunoglobulin receptor (pIgR) and the neonatal Fc receptor (FcRn) remains unknown. The expression of pIgR and FcRn in the murine male reproductive tract was found to be polarized to the lower and upper reproductive tract tissues, respectively. This demonstrates that the two receptors have a tissue tropism, which must be considered when targeting pathogens that colonize different sites. In contrast, the expression of pIgR and FcRn in the female mouse was distributed across both the upper and lower reproductive tracts. When urogenitally infected with Chlamydia muridarum, both male and female reproductive tracts up-regulated expression of pIgR and down-regulated expression of FcRn.
Unsurprisingly, the up-regulation of pIgR increased the concentration of IgA in the lumen. However, down-regulation of FcRn prevented IgG uptake and led to an increase, or pooling, of IgG in lumenal secretions. As previous studies have identified the importance of pIgR-mediated delivery of IgA, as well as the potential of IgA to bind and neutralize intracellular pathogens, IgA against a variety of chlamydial antigens was investigated. The protection afforded by IgA against the extracellular antigen major outer membrane protein (MOMP) was found to be dependent on pIgR expression in vitro and in vivo; in the absence of pIgR, no protection was afforded to mice previously immunized with MOMP. Polyclonal IgA against the intracellular chlamydial antigens inclusion membrane protein A (IncA), inclusion membrane proteins (IncMem) and secreted chlamydial protease-like activity factor (CPAF) was produced and its protective capacity investigated in vitro. Antigen-specific intracellular IgA was found to bind its respective antigen within the infected cell, but did not significantly reduce inclusion formation (p > 0.05). This suggests that whilst IgA specific for the selected antigens was transported by pIgR to the chlamydial inclusion, it was unable to prevent growth. Similarly, immunization of male mice with intracellular chlamydial antigens (IncA or IncMem), followed by depletion of CD4+ T cells and subsequent urogenital C. muridarum challenge, provided minimal pIgR-mediated protection. In the absence of CD4+ T cells, wild-type male mice immunized with IncA showed a 57% reduction (p < 0.05), and mice deficient in pIgR a 35% reduction (p < 0.05), in reproductive tract chlamydial burden compared to the control antigen. This suggests that pIgR and secretory IgA (SIgA) were playing a protective role (21% pIgR-mediated) in unison with another antigen-specific immune mechanism (36%).
Interestingly, IgA generated during a primary respiratory C. muridarum infection did not provide a significant amount of protection against secondary urogenital C. muridarum challenge. Together, these data suggest that IgA specific for an extracellular antigen (MOMP) can play a strong protective role in chlamydial infections, and that IgA targeting intracellular antigens is also effective but dependent on pIgR expression in tissues. However, whilst not investigated here, IgA targeting and blocking other intracellular chlamydial antigens that are more essential for replication or type III secretion may be more efficacious in subunit vaccines. Recently, studies have demonstrated that IgG can neutralize influenza virus by trafficking IgG-bound virus to lysosomes [4]. We sought to determine whether this process could also traffic chlamydial antigens for degradation by lysosomes, despite Chlamydia spp. actively inhibiting fusion with the host endocytic pathway. As observed for pIgR-mediated delivery of anti-IncA IgA, FcRn similarly transported IgG specific for IncA, which bound the inclusion membrane. Interestingly, FcRn-mediated delivery of anti-IncA IgG significantly decreased inclusion formation by 36% (p < 0.01) and induced aberrant inclusion morphology. This suggests that, unlike IgA, IgG can facilitate additional host cellular responses which affect the intracellular niche of chlamydial growth. Fluorescence microscopy revealed that IgG also bound the inclusion but, unlike in the influenza studies, did not induce the recruitment of lysosomes. Notably, anti-IncA IgG recruited sequestosomes, markers of the ubiquitin/proteasome pathway and major histocompatibility complex (MHC) class I loading, to the inclusion membrane. To determine whether the protection against C. muridarum infection afforded by IncA IgG in vitro translated in vivo, wild-type mice and mice deficient in functional FcRn and MHC-I were immunized, depleted of CD4+ T cells, and urogenitally infected with C. muridarum.
Unlike in pIgR-deficient mice, the protection afforded by IncA immunization was completely abrogated in mice lacking functional FcRn and MHC-I/CD8+. Thus, both anti-IncA IgA and IgG can bind the inclusion, in a pIgR- and FcRn-mediated manner respectively. However, only IgG mediates the greater reduction in chlamydial infection in vitro and in vivo, suggesting that more than steric blocking of IncA had occurred. Unlike anti-MOMP IgA, which reduced chlamydial infection of epithelial cells and male mouse tissues, anti-MOMP IgG was found to enhance infectivity in vitro and in vivo. Opsonization of elementary bodies (EBs) with MOMP-IgG enhanced inclusion formation in epithelial cells in a MOMP-IgG dose-dependent and FcRn-dependent manner. When MOMP-IgG-opsonized EBs were inoculated into the vagina of female mice, a small but non-significant (p > 0.05) enhancement of cervicovaginal C. muridarum shedding was observed three days post infection in mice with functional FcRn. Interestingly, infection with opsonized EBs reduced the intensity of the peak of infection (day six) but protracted the duration of infection by 60% in wild-type mice only. Infection with IgG-opsonized EBs also significantly increased (p < 0.05) hydrosalpinx formation in the oviducts and induced lymphocyte infiltration of the uterine horns. As MOMP is an immunodominant antigen and is widely used in vaccines, the ability of IgG specific to extracellular chlamydial antigens to enhance infection and induce pathology needs to be considered. Together, these data suggest that immunoglobulins play a dichotomous role in chlamydial infections, dependent on antigen specificity and on FcRn and pIgR expression. FcRn was found to be highly expressed in the upper male reproductive tract, whilst pIgR was dominantly expressed in the lower reproductive tract. Conversely, female mice expressed FcRn and pIgR in both the lower and upper reproductive tracts.
In response to a normal chlamydial infection, pIgR is up-regulated, increasing secretory IgA release, but FcRn is down-regulated, preventing IgG uptake. Similarly to other studies [5-6], we demonstrate that IgA and IgG generated during primary chlamydial infections play a minor role in recall immunity, and that antigen-specific subunit vaccines can offer more protection. We also show that both IgA and IgG can be used to target intracellular chlamydial antigens, but that IgG is more effective. Finally, IgA against the extracellular antigen MOMP can afford protection, whilst IgG plays a deleterious role by increasing infectivity and inducing damaging immunopathology. Further investigation of additional antigens or combination subunit vaccines will enhance our understanding of the protection afforded by antibodies against intracellular and extracellular pathogenic antigens, and help improve the development of an efficacious chlamydial vaccine.
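The partitioning of protection quoted earlier in this abstract (roughly 21% pIgR-mediated plus 36% from another antigen-specific mechanism) follows from a simple subtraction: the reduction retained by pIgR-deficient mice is attributed to pIgR-independent mechanisms, and the remainder to pIgR/SIgA. A minimal sketch of that arithmetic, using the rounded figures from the text (this is an interpretation of the reported numbers, not an analysis from the thesis itself):

```python
# Partitioning the reduction in chlamydial burden after IncA immunization
# (fractions relative to the control-antigen burden; figures from the text).
wild_type_reduction = 0.57       # IncA-immunised wild-type mice
pigr_deficient_reduction = 0.36  # reduction retained without pIgR (~35-36%)

# Protection lost when pIgR is absent is attributed to pIgR/SIgA;
# protection that remains is pIgR-independent.
pigr_mediated = wild_type_reduction - pigr_deficient_reduction
pigr_independent = pigr_deficient_reduction
```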

Runt-related transcription factor 2 (RUNX2) is a key regulator of osteoblast differentiation. Several variations within RUNX2 have been found to be associated with significant changes in BMD, which is a major risk factor for fracture. In this study we report that an 18 bp deletion within the polyalanine tract (17A>11A) of RUNX2 is significantly associated with fracture. Carriers of the 11A allele were found to be nearly twice as likely to have sustained a fracture. Within the fracture category, 11A carriers showed a significant tendency to present with fractures of bones of intramembranous origin rather than bones of endochondral origin (p=0.005). In a population of random subjects, the 11A allele was associated with decreased levels of serum collagen cross-links (CTx, p=0.01), suggesting decreased bone turnover. The transactivation function of the 11A allele was quantitatively decreased. Interestingly, we found no effect of the 11A allele on BMD at multiple skeletal sites, although these were not the sites where a relationship with fracture was most evident. These findings suggest that the 11A allele is a biologically relevant polymorphism that influences serum CTx and confers enhanced fracture risk in a site-selective manner related to intramembranous bone ossification.
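"Nearly twice as likely" in a carrier/non-carrier comparison is typically an odds-ratio statement from a 2x2 table. The sketch below uses invented counts (not the study's genotype data) purely to show the calculation behind such a claim:

```python
# Odds ratio from a hypothetical 2x2 table (counts are invented):
#                  fracture   no fracture
# 11A carriers         60          70
# non-carriers         45         105

def odds_ratio(a, b, c, d):
    """OR for the table [[a, b], [c, d]] = (a/b) / (c/d) = (a*d) / (b*c)."""
    return (a * d) / (b * c)

or_11a = odds_ratio(60, 70, 45, 105)  # odds of fracture, carriers vs non-carriers
```

With these invented counts the carriers' odds of fracture are exactly twice the non-carriers', matching the flavour of the abstract's "nearly twice as likely".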

While scientists are still debating the extent to which climate change is driving new weather patterns, there have been some devastating natural disasters worldwide in the last decade. From cyclones to earthquakes and from tsunamis to landslides, these disasters strike with formidable force and crushing effect. As one of the most important mechanisms for mitigating the negative impacts of natural disasters and supporting the recovery and redevelopment of the affected area, reconstruction is of utmost importance in light of sustainability objectives. However, current reconstruction practice attracts considerable criticism for focusing on providing short-term necessities. How to conduct post-disaster reconstruction from a long-term perspective and achieve sustainable development is therefore a key concern for both industry practice and research. This paper introduces an ongoing research project aimed at establishing an operational framework for improving the sustainability performance of post-disaster reconstruction by identifying critical sustainability factors and exploring their internal relationships. The research reported in this paper is part of that project. Following a comprehensive literature review, 17 potential critical sustainability factors for post-disaster reconstruction were identified, and a preliminary examination and discussion of these factors was conducted.

Patients presenting for total knee replacement (TKR) while on warfarin for medical reasons often require higher levels of peri-operative anticoagulation than primary thromboprophylaxis provides, and may require bridging therapy with heparin. We performed a retrospective case-control study on 149 consecutive primary knee arthroplasty patients to investigate whether anticoagulation affected short-term outcomes. Specific outcome measures indicated significant increases in prolonged wound drainage (26.8% of cases vs 7.3% of controls, p<0.001); superficial infection (16.8% vs 3.3%, p<0.001); deep infection (6.0% vs 0%, p<0.001); return to theatre for washout (4.7% vs 0.7%, p=0.004); and revision (4.7% vs 0.3%, p=0.001). Management of patients on long-term warfarin therapy following TKR is particularly challenging, as the surgeon must balance the risk of thromboembolism against post-operative complications on an individual patient basis in order to optimise outcomes.

Background: Farm men and women in Australia have higher levels of problematic alcohol use than their urban counterparts and experience elevated health risks associated with excessive alcohol consumption. The Sustainable Farm Families (SFF) program has worked successfully with farm men and women to address health, well-being and safety, and has identified that further research and training is required to understand and address alcohol misuse behaviours. This project will add an innovative component to the program by training health professionals who work with farm men and women to discuss and respond to alcohol-related physical and mental health problems. Methods/Design: A mixed-method design with multi-level evaluation will be implemented following the development and delivery of a training program (the Alcohol Intervention Training Program, AITP) for Sustainable Farm Families health professionals. Pre-, post- and follow-up surveys will be used to assess both the impact of the training on the knowledge, confidence and skills of the health professionals in working with alcohol misuse and associated problems, and its impact on the attitudes, behaviour and mental health of the farm men and women who participate in the SFF project. Evaluations will take a range of forms, including self-rated outcome measures and interviews. Discussion: If successful, this project will enhance the health and well-being of a critical population, the farm men and women of Australia, by producing an evidence-based strategy to assist them to adopt more positive alcohol-related behaviours that will lead to better physical and mental health.

Background: Chlamydia trachomatis is the most commonly diagnosed bacterial sexually transmitted infection in the developed world, and diagnosis rates have increased dramatically over the last decade. Repeat chlamydia infections are very common and may represent re-infection from an untreated partner or treatment failure. The aim of this cohort study is to estimate the proportion of women infected with chlamydia who experience treatment failure after treatment with 1 g of azithromycin. Methods/design: This cohort study will follow women diagnosed with chlamydia for up to 56 days post treatment. Women will provide weekly genital specimens for further assay. The primary outcome is the proportion of women classified as having treatment failure 28, 42 or 56 days after recruitment. Comprehensive sexual behaviour data collection, the detection of Y-chromosome DNA and highly discriminatory chlamydial genotyping will be used to differentiate between chlamydia re-infection and treatment failure. Azithromycin levels in high-vaginal specimens will be measured using a validated liquid chromatography–tandem mass spectrometry method to assess whether poor azithromycin absorption could be a cause of treatment failure. Chlamydia culture and minimal inhibitory concentration testing will be performed to further characterize the chlamydial infections. Discussion: Distinguishing between treatment failure and re-infection is important in order to refine treatment recommendations and focus infection control mechanisms. If a large proportion of repeat chlamydia infections are due to antibiotic treatment failure, then international recommendations on chlamydia treatment may need to be re-evaluated. If most are re-infections, then strategies to expedite partner treatment are necessary.
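The study's classification combines repeat positivity with genotyping, Y-chromosome DNA detection and behavioural data. As a rough illustration only (the study's actual algorithm is more nuanced, and the rules below are hypothetical), a repeat-positive specimen might be triaged like this:

```python
def classify_repeat_positive(same_genotype: bool,
                             y_chromosome_dna: bool,
                             reported_unprotected_sex: bool) -> str:
    """Rough triage of a repeat chlamydia-positive specimen.
    Illustrative only: the study combines genotyping, Y-chromosome DNA
    and detailed behavioural data in a more nuanced way."""
    if y_chromosome_dna or reported_unprotected_sex:
        # Evidence of sexual exposure since treatment: re-infection is plausible.
        return "possible re-infection"
    if same_genotype:
        # Same strain, no evidence of re-exposure since treatment.
        return "probable treatment failure"
    return "indeterminate"
```

A different genotype with no evidence of exposure is left indeterminate here; in practice such discordant cases are exactly why the study collects azithromycin levels and culture data as supporting evidence.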

Background: The construct of total wellness includes a holistic approach to the body, mind and spirit components of life. While the health benefits of reducing sedentary behavior and increasing physical activity are well documented, little is known about the influence on total wellness of an internet-based physical activity monitor designed to help people achieve higher physical activity levels. Purpose: The purpose of this four-week, personal activity monitor-based intervention was to reduce sedentary behavior and increase physical activity levels in daily living for sedentary adults, and to determine whether these changes would also be associated with improvement in total wellness. Methods: Twenty-two men and 11 women (mean age 27 ± 4.0 years) were randomly assigned to either an intervention (n = 18) or control group (n = 15). The intervention group interacted with an online personal activity monitor (Gruve Solution™) designed to reduce sedentary time and increase physical activity during activities of daily living. The control group did not interact with the monitor and were asked to follow their normal daily physical activity and sedentary behavior routines. The Wellness Evaluation of Lifestyle (WEL) inventory was used to assess total wellness. Sedentary time and light, walking, moderate and vigorous intensity physical activity were assessed for both groups at baseline and at week 4 using the 7-day Sedentary and Light Intensity Physical Activity Log (7-day SLIPA Log) and the International Physical Activity Questionnaire (IPAQ). Results: Significant increases in pre-post total wellness scores (from 64% ± 5.7 to 75% ± 8.5; t(17) = -6.5, p < 0.001) were observed in the intervention group by the end of week four.
Intervention participants decreased their sedentary time (21%, 2.3 hours/day) and increased their light (36.7%, 2.5 hours/day), walking (65%, 1057 MET-min/week), moderate (67%, 455 MET-min/week) and vigorous intensity (60%, 442 MET-min/week) physical activity (all p < 0.001). No significant differences in total wellness were observed between the groups at baseline, and no pre-post significant differences were observed for any outcome variable in the control group. Conclusion: Total wellness improves when sedentary adults reduce sedentary time and increase physical activity levels (i.e. light, walking, moderate and vigorous).
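The MET-min/week figures above follow standard IPAQ-style scoring: minutes per day × days per week × an intensity-specific MET value (the IPAQ scoring protocol assigns 3.3 METs to walking, 4.0 to moderate and 8.0 to vigorous activity; treat those constants as assumptions here). A sketch with an invented weekly diary, not participant data:

```python
# IPAQ-style energy expenditure: MET value * minutes/day * days/week.
# MET constants follow the IPAQ scoring protocol (assumed here).
MET = {"walking": 3.3, "moderate": 4.0, "vigorous": 8.0}

def met_min_per_week(activity: str, minutes_per_day: float,
                     days_per_week: int) -> float:
    """MET-minutes per week for one activity category."""
    return MET[activity] * minutes_per_day * days_per_week

# Hypothetical diary: 30 min walking on 5 days, 20 min moderate on 4 days.
walking = met_min_per_week("walking", 30, 5)
moderate = met_min_per_week("moderate", 20, 4)
total = walking + moderate
```

Summing the categories gives the kind of weekly totals (hundreds to low thousands of MET-min/week) quoted in the results.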

Background: The pattern of protein intake following exercise may impact whole-body protein turnover and net protein retention. We determined the effects of different protein feeding strategies on protein metabolism in resistance-trained young men. Methods: Participants were randomly assigned to ingest 80 g of whey protein as either 8 x 10 g every 1.5 h (PULSE; n=8), 4 x 20 g every 3 h (intermediate, INT; n=7), or 2 x 40 g every 6 h (BOLUS; n=8) after an acute bout of bilateral knee-extension exercise (4 x 10 repetitions at 80% maximal strength). Whole-body protein turnover (Q), synthesis (S), breakdown (B), and net balance (NB) were measured throughout 12 h of recovery by bolus ingestion of [15N]glycine with urinary [15N]ammonia enrichment as the collected end-product. Results: PULSE Q rates were greater than BOLUS (~19%, P<0.05), with a trend towards being greater than INT (~9%, P=0.08). Rates of S were 32% and 19% greater, and rates of B were 51% and 57% greater, for PULSE as compared to INT and BOLUS, respectively (P<0.05), with no difference between INT and BOLUS. There were no statistical differences in NB between groups (P=0.23); however, magnitude-based inferential statistics revealed likely small (mean effect ± 90% CI: 0.59 ± 0.87) and moderate (0.80 ± 0.91) increases in NB for PULSE and INT compared to BOLUS, and a possible small increase (0.42 ± 1.00) for INT vs. PULSE. Conclusion: We conclude that the pattern of ingested protein, and not only the total daily amount, can impact whole-body protein metabolism. Individuals aiming to maximize NB would likely benefit from repeated ingestion of moderate amounts of protein (~20 g) at regular intervals (~3 h) throughout the day.
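The [15N]glycine end-product method used above rests on a proportionality assumption: the fraction of the tracer dose recovered in urinary ammonia is taken to equal the fraction of whole-body nitrogen flux excreted as ammonia, giving flux Q, from which synthesis (S = Q − excretion) and breakdown (B = Q − intake) follow. A worked sketch of that classical calculation with invented numbers (not the study's data):

```python
# End-product ([15N]glycine / urinary ammonia) estimate of whole-body
# nitrogen flux. All values below are invented for illustration.
dose_15n_mg = 100.0    # 15N given as a [15N]glycine bolus
ammonia_15n_mg = 2.0   # excess 15N recovered in urinary ammonia
ammonia_n_g = 0.5      # total N excreted as ammonia over the period
n_excretion_g = 10.0   # total urinary N excretion over the period
n_intake_g = 12.0      # N intake over the period

# Key assumption: fraction of dose in ammonia == fraction of flux to ammonia,
# so Q = (dose / ammonia 15N) * ammonia N.
flux_q_g = (dose_15n_mg / ammonia_15n_mg) * ammonia_n_g

synthesis_s_g = flux_q_g - n_excretion_g       # S = Q - excretion
breakdown_b_g = flux_q_g - n_intake_g          # B = Q - intake
net_balance_g = synthesis_s_g - breakdown_b_g  # reduces to intake - excretion
```

Note that NB reduces algebraically to intake minus excretion, which is why the abstract's between-group NB differences are small even when Q, S and B differ markedly.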

Introduction: Malnutrition is common among hospitalised patients, and follow-up of nutrition support post-discharge is often poor. Published studies on the efficacy of ambulatory nutrition support (ANS) for malnourished patients post-discharge are scarce. The aims of this study were to evaluate the rate of dietetics follow-up of malnourished patients post-discharge, before (2008) and after (2010) implementation of a new ANS service, and to evaluate nutritional outcomes post-implementation. Materials and Methods: Consecutive samples of 261 (2008) and 163 (2010) adult inpatients referred to dietetics and assessed as malnourished using Subjective Global Assessment (SGA) were enrolled. All subjects received inpatient nutrition intervention and dietetic outpatient clinic follow-up appointments. For the 2010 cohort, ANS was initiated to provide telephone follow-up and home visits for patients who failed to attend the outpatient clinic. Subjective Global Assessment, body weight, quality of life (EQ-5D VAS) and handgrip strength were measured at baseline and five months post-discharge. Paired t-tests were used to compare pre- and post-intervention results. Results: In 2008, only 15% of patients returned for follow-up with a dietitian within four months post-discharge. After implementation of ANS in 2010, the follow-up rate was 100%. Mean weight improved from 44.0 ± 8.5 kg to 46.3 ± 9.6 kg, EQ-5D VAS from 61.2 ± 19.8 to 71.6 ± 17.4, and handgrip strength from 15.1 ± 7.1 kg force to 17.5 ± 8.5 kg force (p<0.001 for all). Seventy-four percent of patients improved in SGA score. Conclusion: Ambulatory nutrition support resulted in significant improvements in the follow-up rate, nutritional status and quality of life of malnourished patients post-discharge.

Resumo:

BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite being widely implemented in practice in the United Kingdom and increasingly in Australia, there have been few studies examining the impact of strategies such as Protected Mealtimes and dedicated feeding assistant roles on nutritional outcomes of elderly inpatients. AIMS: The aim of this research was to implement and compare three system-level interventions designed to specifically address mealtime barriers and improve energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience. METHODS: Three mealtime assistance interventions were implemented in three medical wards at Royal Brisbane and Women's Hospital: AIN-only: Additional assistant-in-nursing (AIN) with dedicated nutrition role. PM-only: Multidisciplinary approach to meals, including Protected Mealtimes. PM+AIN: Combined intervention: AIN + multidisciplinary approach to meals. An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention to a pre-intervention group. 
Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, which was determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed for multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions. These focus group discussions were analysed using thematic analysis. RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8), and 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups.
However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had inadequate energy intakes to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted including improved mobility and functional independence, and better identification of swallowing difficulties. IMPLICATIONS: The PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting, and, as such, has initiated local and state-wide roll-out of mealtime assistance programs.
Improved nutritional intakes of elderly inpatients were observed; however, given the modest effect size and decreasing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.
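The individual-level adequacy analysis described above (each participant's intake compared against their own estimated requirement, with groups summarised as an odds ratio) can be sketched as follows. The helper names and any counts fed to them are hypothetical, not the study's data:

```python
def adequate(intake_kj, requirement_kj):
    """Did a participant's measured intake meet their individually
    estimated energy requirement? (Both values in kilojoules.)"""
    return intake_kj >= requirement_kj

def odds_ratio(int_adequate, int_inadequate, pre_adequate, pre_inadequate):
    """Odds of adequate intake in the intervention group relative to
    the pre-intervention group, from a 2x2 contingency table."""
    return (int_adequate / int_inadequate) / (pre_adequate / pre_inadequate)
```

Classifying adequacy per individual before aggregating is what lets an effect emerge here even though mean group intakes did not differ.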

Resumo:

Natural disasters can have adverse effects on human lives. To raise awareness of research and better combat future events, it is important to identify recent research trends in the area of post disaster reconstruction (PDR). The authors used a three-round literature review strategy to study journal papers published in the last decade that are related to PDR with specific conditions using the Scopus search engine. A wide range of PDR related papers from a general perspective was examined in the first two rounds, while the final round established 88 papers as target publications through visual examination of the abstracts, keywords and, as necessary, main texts. These papers were analysed in terms of research origins, active researchers, research organisations, most cited papers, regional concerns, major themes and deliverables, for clues to past trends and future directions. The need for appropriate PDR research is increasingly recognised. The number of publications increased fivefold from 2002 to 2012; for PDR research with a construction perspective, the increase is sixfold. Developing countries, such as those in Asia, attract almost 50% of researchers' attention for regional concerns, while the US is the single most-studied country (24%). Africa is hardly represented. Researchers in developed countries lead worldwide PDR research, which contrasts with the need for expertise in developing countries. Past works focused on waste management, stakeholder analysis, resourcing, infrastructure issues, resilience and vulnerability, reconstruction approach, sustainable reconstruction and governance issues. Future research should respond to resourcing, integrated development, sustainability and resilience building to cover the gaps. By means of a holistic summary and structured analysis of key patterns, the authors hope to provide streamlined access to existing research findings and make predictions of future trends.
They also hope to encourage a more holistic approach to PDR research and international collaborations.