909 results for long-term hospitalization
Abstract:
Background: Dementia is a chronic illness without cure or effective treatment, which results in declining mental and physical function and a need for assistance from others to manage activities of daily living. Many people with dementia live in long-term care facilities, yet research into their quality of life (QoL) was rare until the last decade. Previous studies failed to incorporate important variables related to the facility and care provision, or to look closely at the daily lives of residents. This paper presents a protocol for a comprehensive, multi-perspective assessment of the QoL of residents with dementia living in long-term care in Australia. A secondary aim is to investigate the effectiveness of self-report instruments for measuring QoL. Methods: The study uses a descriptive, mixed-methods design to examine how facility, care staff, and resident factors affect QoL. Over 500 residents with dementia from a stratified, random sample of 53 facilities are being recruited. A sub-sample of 12 residents is also taking part in qualitative interviews and observations. Conclusions: This national study will provide a broad understanding of the factors underlying QoL for residents with dementia in long-term care. It applies a methodology similar to that of the US-based Collaborative Studies of Long Term Care (CS-LTC) Dementia Care Study to the Australian setting.
Abstract:
Background: Over half of the residents in long-term care have a diagnosis of dementia. Maintaining quality of life is important, as there is no cure for dementia. Quality of life may be used as a benchmark for caregiving, and can help to enhance respect for the person with dementia and to improve care provision. The purpose of this study was to describe quality of life as reported by people living with dementia in long-term care, in terms of its influencers and the strategies needed to improve it. Methods: A descriptive exploratory approach was used. A subsample of twelve residents across two Australian states, drawn from a national quantitative study on quality of life, was interviewed. Data were analysed thematically from a realist perspective; the approach to the thematic analysis was inductive and data-driven. Results: Three themes emerged in relation to influencers of, and strategies for, quality of life: (a) maintaining independence; (b) having something to do; and (c) the importance of social interaction. Conclusions: The findings highlight the importance of understanding individual resident needs and of considering the complexity of large group living situations, particularly with regard to resident decision-making.
Abstract:
We investigated functional, morphological and molecular adaptations to strength training and cold water immersion (CWI) in two separate studies. In the first study, 21 physically active men strength trained for 12 weeks (2 d·wk⁻¹), with either 10 min of CWI or active recovery (ACT) after each training session. Strength and muscle mass increased more in the ACT group than in the CWI group (P<0.05). Isokinetic work (19%), type II muscle fibre cross-sectional area (17%) and the number of myonuclei per fibre (26%) increased in the ACT group (all P<0.05) but not in the CWI group. In the second study, nine active men performed a bout of single-leg strength exercises on separate days, followed by CWI or ACT. Muscle biopsies were collected before and 2, 24 and 48 h after exercise. The number of satellite cells expressing neural cell adhesion molecule (NCAM) (10–30%) and paired box protein (Pax7) (20–50%) increased 24–48 h after exercise with ACT. The number of NCAM+ satellite cells increased 48 h after exercise with CWI. NCAM+ and Pax7+ satellite cell numbers were greater after ACT than after CWI (P<0.05). Phosphorylation of p70S6 kinase (Thr421/Ser424) increased after exercise in both conditions but was greater after ACT (P<0.05). These data suggest that CWI attenuates the acute changes in satellite cell numbers and in the activity of kinases that regulate muscle hypertrophy, which may translate to smaller long-term training gains in muscle strength and hypertrophy. The use of CWI as a regular post-exercise recovery strategy should be reconsidered.
Abstract:
This cross-sectional study assessed the intellect, cognition, academic function, behaviour, and emotional health of long-term survivors of childhood liver transplantation. Eligible children were >5 yr post-transplant, still attending school, and resident in Queensland. Hearing and neurocognitive testing were performed on 13 transplanted children and six siblings, including two twin pairs in which one twin was transplanted and the other was not. Median age at testing was 13.08 (range 6.52-16.99) yr; time elapsed after transplant, 10.89 (range 5.16-16.37) yr; and age at transplant, 1.15 (range 0.38-10.00) yr. Mean full-scale IQ was 97 (81-117) for transplanted children and 105 (87-130) for siblings. No difficulties were identified in intellect, cognition, academic function, or memory and learning in transplanted children or their siblings, although both groups had reduced mathematical ability compared with normative data. Transplanted patients had difficulties in executive functioning, particularly in self-regulation, planning and organization, problem-solving, and visual scanning. Thirty-one percent (4/13) of transplanted patients, and no siblings, scored in the clinical range for ADHD. Emotional difficulties were noted in transplanted patients but did not differ from those of their siblings. Long-term liver transplant survivors exhibit difficulties in executive function and are more likely to have ADHD, despite relatively intact intellect and cognition.
Abstract:
Organ-specific immunity is a feature of many infectious diseases, including visceral leishmaniasis caused by Leishmania donovani. Experimental visceral leishmaniasis in genetically susceptible mice is characterized by an acute, resolving infection in the liver and chronic infection in the spleen. CD4+ T cell responses are critical for the establishment and maintenance of hepatic immunity in this disease model, but their role in chronically infected spleens remains unclear. In this study, we show that dendritic cells are critical for CD4+ T cell activation and expansion in all tissue sites examined. We found that FTY720-mediated blockade of T cell trafficking early in infection prevented Ag-specific CD4+ T cells from appearing in lymph nodes, but not the spleen and liver, suggesting that early CD4+ T cell priming does not occur in liver-draining lymph nodes. Extended treatment with FTY720 over the first month of infection increased parasite burdens, although this was associated with blockade of lymphocyte egress from secondary lymphoid tissue, as well as with more generalized splenic lymphopenia. Importantly, we demonstrate that CD4+ T cells are required for the establishment and maintenance of antiparasitic immunity in the liver, as well as for immune surveillance and suppression of parasite outgrowth in chronically infected spleens. Finally, although early CD4+ T cell priming appeared to occur most effectively in the spleen, we unexpectedly found that protective CD4+ T cell-mediated hepatic immunity could be generated in the complete absence of all secondary lymphoid tissues.
Abstract:
The recent trend for journals to require open access to the primary data included in publications has been embraced by many biologists, but has caused apprehension amongst researchers engaged in long-term ecological and evolutionary studies. A worldwide survey of 73 principal investigators (PIs) with long-term studies revealed positive attitudes towards sharing data with the agreement or involvement of the PI, and 93% of PIs have historically shared data. Only 8% were in favor of uncontrolled, open access to primary data, while 63% expressed serious concern. We present here their viewpoint on an issue that can have non-trivial scientific consequences. We discuss potential costs of public data archiving and provide possible solutions to meet the needs of journals and researchers.
Abstract:
Malnutrition is common in end-stage liver disease, but correction after transplantation is expected. Body cell mass (BCM) assessment using total body potassium (TBK) measurements is considered the gold standard for assessing nutritional status. The aim of this study was to examine the BCM, and therefore the nutritional status, of long-term survivors of childhood liver transplantation.
Abstract:
Objectives: To describe longitudinal height, weight, and body mass index changes up to 15 years after childhood liver transplantation. Study design: A retrospective chart review of patients who underwent liver transplantation from 1985 to 2004 was performed. Subjects were aged <18 years at transplant, survived ≥5 years, and had at least 2 recorded measurements, of which one was ≥5 years post-transplant. Measurements were recorded pre-transplant and 1, 5, 10, and 15 years later. Results: Height and weight data were available for 98 and 104 patients, respectively; 47% were aged <2 years at transplant; 58% were Australian, and the rest were from Japan. Height recovery continued for at least 10 years, reaching the 26th percentile (Z-score -0.67) 15 years after transplant. Australians had better growth recovery and attained the 47th percentile (Z-score -0.06) at 15 years. Weight recovery was most marked in the first year and continued for 15 years, even in well-nourished children. Children who were growth-impaired and malnourished at transplant exhibited the best growth, but remained significantly shorter and lighter even 15 years later. No effect of sex or age at transplant was noted on height or weight recovery. Post-transplant factors significantly affect growth recovery and likely caused the divergent growth recovery between Australian and Japanese children; 9% (9/98) of patients were overweight by body mass index at 10-15 years, but none were obese. Conclusions: After liver transplantation, children can expect ongoing height and weight recovery for at least 10-15 years. Growth impairment at transplant and post-transplant care significantly affect long-term growth recovery.
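The percentiles quoted above correspond to the standard normal cumulative distribution evaluated at the Z-score. A minimal Python sketch of that conversion follows; small rounding differences against the abstract's figures are expected, since the quoted Z-scores are themselves rounded, and the labels are illustrative only:

```python
from statistics import NormalDist

def z_to_percentile(z):
    """Convert a growth Z-score to its population percentile via the standard normal CDF."""
    return 100 * NormalDist().cdf(z)

# Z-scores echo the abstract's reporting style; labels are illustrative.
for label, z in [("height at 15 y post-transplant", -0.67), ("Australian subgroup", -0.06)]:
    print(f"{label}: Z = {z:+.2f} -> {z_to_percentile(z):.0f}th percentile")
```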
Abstract:
Protocols for secure archival storage are becoming increasingly important as the use of digital storage for sensitive documents gains wider practice. Wong et al. [8] combined verifiable secret sharing with proactive secret sharing without reconstruction and proposed a verifiable secret redistribution protocol for long-term storage. However, their protocol requires that each of the receivers be honest during redistribution. We proposed [3] an extension to their protocol that relaxes the requirement that all recipients be honest to the condition that only a simple majority of the recipients need be honest during the (re)distribution processes. Further, both of these protocols use Feldman's approach to achieve integrity during the (re)distribution processes. In this paper, we present a revised version of our earlier protocol, and its adaptation to incorporate Pedersen's approach instead of Feldman's, thereby achieving information-theoretic secrecy while retaining the integrity guarantees.
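To make the difference between the two commitment schemes concrete, the following is a minimal, toy-parameter Python sketch of Pedersen commitments layered over Shamir sharing; it illustrates the underlying building block, not the paper's actual protocol, and all parameters and names are assumed for illustration. Unlike Feldman's commitments g^f_j, which hide coefficients only computationally, the blinded values g^f_j * h^r_j reveal nothing about f_j, which is the source of the information-theoretic secrecy.

```python
import random

# Toy parameters for illustration only -- not production cryptography.
p, q = 23, 11   # p = 2q + 1; commitments live in the order-q subgroup of Z_p*
g, h = 4, 9     # subgroup generators; log_g(h) must be unknown to the dealer

def deal(secret, k, n):
    """Split `secret` into n shares (threshold k) and publish Pedersen
    commitments C_j = g^f_j * h^r_j to the polynomial coefficients."""
    f = [secret % q] + [random.randrange(q) for _ in range(k - 1)]  # secret polynomial
    r = [random.randrange(q) for _ in range(k)]                     # blinding polynomial
    commitments = [pow(g, fj, p) * pow(h, rj, p) % p for fj, rj in zip(f, r)]
    shares = [(i,
               sum(fj * i**j for j, fj in enumerate(f)) % q,
               sum(rj * i**j for j, rj in enumerate(r)) % q)
              for i in range(1, n + 1)]
    return commitments, shares

def verify(commitments, i, si, ti):
    """A recipient checks share (si, ti) against the public commitments."""
    lhs = pow(g, si, p) * pow(h, ti, p) % p
    rhs = 1
    for j, Cj in enumerate(commitments):
        rhs = rhs * pow(Cj, i**j % q, p) % p
    return lhs == rhs

commitments, shares = deal(secret=7, k=2, n=3)
assert all(verify(commitments, i, si, ti) for i, si, ti in shares)
```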
Abstract:
Background: Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVCs) vary greatly around the world. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence to suggest that 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary, and is associated with increased cost. Objectives: To assess the clinical effects (benefits and harms) of intermittent flushing with heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Search methods: The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials. Selection criteria: Randomised controlled trials comparing the efficacy of normal saline with heparin to prevent occlusion of long-term CVCs in infants and children aged up to 18 years were included. We excluded temporary CVCs and peripherally inserted central catheters (PICCs). Data collection and analysis: Two review authors independently assessed trial inclusion criteria and trial quality, and extracted data. Rate ratios were calculated for two outcome measures: occlusion of the CVC and central line-associated blood stream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter (or both), and other CVC-related complications such as dislocation of CVCs, other CVC site infections, and thrombosis. Main results: Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin; however, the studies all used different protocols for the standard and experimental arms, with different concentrations of heparin and different flushing frequencies. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated blood stream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51; two studies, 229 participants; very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37; two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants). Authors' conclusions: The review found that there was not enough evidence to determine the effects of intermittent flushing with heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Were such evidence available, it would facilitate the development of evidence-based clinical practice guidelines and consistency of practice.
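For readers unfamiliar with the effect measure reported above, a rate ratio per catheter-day is computed from event counts and total exposure time, with a Wald confidence interval on the log scale. The sketch below uses invented counts purely to illustrate the arithmetic; it does not reproduce the review's data.

```python
import math

def rate_ratio(events_a, days_a, events_b, days_b, z=1.96):
    """Rate ratio of group A vs group B with a Wald 95% CI on the log scale."""
    rr = (events_a / days_a) / (events_b / days_b)
    se = math.sqrt(1 / events_a + 1 / events_b)   # approx. SE of log(rate ratio)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Hypothetical counts: 6 occlusions over 8000 catheter-days (saline)
# vs 8 occlusions over 8000 catheter-days (heparin).
rr, lo, hi = rate_ratio(6, 8000, 8, 8000)
print(f"rate ratio {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```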
Abstract:
- Objective: We sought to assess the effect of long-term exposure to ambient air pollution on the prevalence of self-reported health outcomes in Australian women.
- Design: Cross-sectional study.
- Setting and participants: The geocoded residential addresses of 26 991 women across 3 age cohorts in the Australian Longitudinal Study on Women's Health between 2006 and 2011 were linked to nitrogen dioxide (NO2) exposure estimates from a land-use regression model. Annual average NO2 concentrations and residential proximity to roads were used as proxies for exposure to ambient air pollution.
- Outcome measures: Self-reported disease presence for diabetes mellitus, heart disease, hypertension, stroke, asthma and chronic obstructive pulmonary disease, and self-reported symptoms of allergies, breathing difficulties, chest pain and palpitations.
- Methods: Disease prevalence was modelled by population-averaged Poisson regression models estimated by generalised estimating equations. Associations between symptoms and ambient air pollution were modelled by multilevel mixed logistic regression. Spatial clustering was accounted for at the postcode level.
- Results: No associations were observed between any of the outcome and exposure variables considered at the 1% significance level after adjusting for known risk factors and confounders.
- Conclusions: Long-term exposure to ambient air pollution was not associated with self-reported disease prevalence in Australian women. The observed results may have been due to exposure and outcome misclassification, lack of power to detect weak associations, or an actual absence of associations with self-reported outcomes at the relatively low annual average air pollution exposure levels across Australia.
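As a rough sketch of the modelling approach described above, a population-averaged Poisson model with exchangeable within-postcode correlation can be fitted with Python's statsmodels; exponentiated coefficients are then interpretable as prevalence ratios. All file, variable, and covariate names below are hypothetical stand-ins, not the study's actual data dictionary.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical input: one row per woman, with a 0/1 disease indicator,
# the annual mean NO2 estimate, adjustment covariates and a postcode ID.
df = pd.read_csv("alswh_linked.csv")

model = smf.gee(
    "asthma ~ no2 + age + smoking + bmi",    # adjust for known risk factors
    groups="postcode",                       # spatial clustering at postcode level
    data=df,
    family=sm.families.Poisson(),            # log link -> prevalence ratios
    cov_struct=sm.cov_struct.Exchangeable(), # population-averaged (GEE) structure
)
result = model.fit()
print(result.summary())  # exp(coef on no2) = prevalence ratio per unit NO2
```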
Abstract:
A laboratory study was undertaken to determine the persistence and efficacy of spinosad against Rhyzopertha dominica (F.) in wheat stored for 9 months at 30 °C and 55% or 70% relative humidity. The aim was to investigate the potential of spinosad for protecting wheat from R. dominica during long-term storage in warm climates. Wheat was treated with spinosad at 0.1, 0.5 and 1 mg kg⁻¹ grain and sampled after 0, 1.5, 3, 4.5, 6, 7.5 and 9 months of storage for bioassays and residue analyses. Residues were estimated to have declined by 30% during 9 months of storage at 30 °C, and there was no effect of relative humidity. Spinosad applied at 0.5 or 1 mg kg⁻¹ was completely effective for 9 months, with 100% adult mortality after 14 days of exposure and no live F1 adults produced. Adult mortality was <100% in some samples of wheat treated with 0.1 mg kg⁻¹ of spinosad, and live progeny were produced in all samples treated at this level. The results show that spinosad is likely to be an effective grain protectant against R. dominica in wheat stored in warm climates.
Abstract:
Residue retention is an important issue in evaluating the sustainability of production forestry. However, its long-term impacts have not been studied extensively, especially in sub-tropical environments. This study investigated the long-term impact of harvest residue retention on tree nutrition, growth and productivity of an F1 hybrid (Pinus elliottii var. elliottii × Pinus caribaea var. hondurensis) exotic pine plantation in sub-tropical Australia, under three harvest residue management regimes: (1) residue removal, RR0; (2) single residue retention, RR1; and (3) double residue retention, RR2. The experiment, established in 1996, is a randomised complete block design with 4 replicates. Tree growth measurements were carried out at ages 2, 4, 6, 8 and 10 years, while foliar nutrient analyses were carried out at ages 2, 4, 6 and 10 years. Litter production and litter nitrogen (N) and phosphorus (P) measurements were carried out quarterly over a 15-month period between ages 9 and 10 years. Results showed that total tree growth was still greater in the residue-retained treatments than in the RR0 treatment. However, mean annual increments of diameter at breast height (MAID) and basal area (MAIB) declined significantly after age 4 years, to about 68-78% of earlier values by age 10 years. Declining foliar N and P concentrations accounted for 62% (p < 0.05) of the variation in growth rates after age 4 years, and foliar N and P concentrations were either marginal or below critical concentrations. In addition, litter production and litter N and P contents were not significantly different among the treatments. This study suggests that the benefit of residue retention for tree nutrition and growth rates might be limited over a longer period, and that the integration of alternative forest management practices is necessary to sustain the benefits of harvest residues until the end of the rotation.
Abstract:
Dwindling water supplies for irrigation are prompting alternative management choices by irrigators. Limited irrigation, where less water is applied than full crop demand, may be a viable approach. The application of limited irrigation to corn was examined in this research. Corn was grown in crop rotations under dryland, limited irrigation, or full irrigation management from 1985 to 1999. Crop rotations included corn following corn (continuous corn), corn following wheat and followed by soybean (wheat-corn-soybean), and corn following soybean (corn-soybean). Full irrigation was managed to meet crop evapotranspiration requirements (ETc). Limited irrigation was managed with a seasonal target of no more than 150 mm applied. Precipitation patterns influenced the outcomes of the measured parameters. Dryland yields varied the most, while fully irrigated yields varied the least. Limited irrigation yields were 80% to 90% of fully irrigated yields, but the limited irrigation plots received about half the applied water. Grain yields were significantly different among irrigation treatments. Yields were not significantly different among rotation treatments for all years and water treatments. For soil water parameters, more statistical differences were detected among the water management treatments than among the crop rotation treatments. Economic projections of these management practices showed that full irrigation produced the most income if water was available. Limited irrigation increased income significantly relative to dryland management.
Abstract:
Three anaerobic ponds used to store and treat piggery wastes were fully covered with permeable materials manufactured from polypropylene geofabric, polyethylene shade cloth and supported straw. The covers were assessed in terms of their efficacy in reducing odour emission rates over a 40-month period. Odour samples were collected from the surface of the covers, the surface of the exposed liquor and the surface of an uncovered (control) pond at one of the piggeries. Relative to the emission rate of the exposed liquor at each pond, the polypropylene, shade cloth and straw covers reduced average emission rates by 76%, 69% and 66%, respectively. At the piggery with an uncovered control pond, the polypropylene covers reduced average odour emission rates by 50% and 41%, respectively. A plausible hypothesis, consistent with the likely mechanisms of odour reduction and with the olfactometric method used to quantify the efficacy of the covers, is offered.