954 results for Long-term care facilities
Abstract:
This thesis provides the first inquiry into the use of creative activities in dementia care in residential aged care facilities in Australia. The study used a descriptive mixed-methods design, incorporating quantitative and qualitative approaches to explore the incidence and characteristics of these activities from the carers' perspective. Information about the use of creative activities, and the appreciation of these activities by residents and carers, is essential to the provision of dementia care and treatment to improve the quality of life of people with dementia.
Abstract:
Flexor digitorum longus transfer and medial displacement calcaneal osteotomy is a well-recognised form of treatment for stage II posterior tibial tendon dysfunction. Although excellent short- and medium-term results have been reported, the long-term outcome is unknown. We reviewed the clinical outcome of 31 patients with a symptomatic flexible flatfoot deformity who underwent this procedure between 1994 and 1996. There were 21 women and ten men with a mean age of 54.3 years (42 to 70). The mean follow-up was 15.2 years (11.4 to 16.5). All scores improved significantly (p < 0.001). The mean American Orthopaedic Foot and Ankle Society (AOFAS) score improved from 48.4 pre-operatively to 90.3 (54 to 100) at the final follow-up. The mean pain component improved from 12.3 to 35.2 (20 to 40). The mean function score improved from 35.2 to 45.6 (30 to 50). The mean visual analogue score for pain improved from 7.3 to 1.3 (0 to 6). The mean Short Form-36 physical component score was 40.6 (SD 8.9), and this showed a significant correlation with the mean AOFAS score (r = 0.68, p = 0.005). A total of 27 patients (87%) were pain free and functioning well at the final follow-up. We believe that flexor digitorum longus transfer and calcaneal osteotomy provides long-term pain relief and satisfactory function in the treatment of stage II posterior tibial tendon dysfunction.
Abstract:
Long-term systematic population monitoring data sets are rare but are essential in identifying changes in species abundance. In contrast, community groups and natural history organizations have collected many species lists. These represent a large, untapped source of information on changes in abundance but are generally considered of little value. The major problem with using species lists to detect population changes is that the amount of effort used to obtain the list is often uncontrolled and usually unknown. It has been suggested that the number of species on the list, the "list length," can be used as a measure of effort. This paper significantly extends the utility of Franklin's approach using Bayesian logistic regression. We demonstrate the value of List Length Analysis to model changes in species prevalence (i.e., the proportion of lists on which the species occurs) using bird lists collected by a local bird club over 40 years around Brisbane, southeast Queensland, Australia. We estimate the magnitude and certainty of change for 269 bird species and calculate the probabilities that there have been declines and increases of given magnitudes. List Length Analysis confirmed suspected species declines and increases. This method is an important complement to systematically designed intensive monitoring schemes and provides a means of utilizing data that may otherwise be deemed useless. The results of List Length Analysis can be used for targeting species of conservation concern for listing purposes or for more intensive monitoring. While Bayesian methods are not essential for List Length Analysis, they can offer more flexibility in interrogating the data and are able to provide a range of parameters that are easy to interpret and can facilitate conservation listing and prioritization. © 2010 by the Ecological Society of America.
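The core of List Length Analysis is a logistic regression of a species' presence on each list against log(list length) and year, so the year coefficient captures the trend in prevalence. The sketch below is a minimal illustration of that model on synthetic data, not the authors' code; the paper fits the model in a Bayesian framework, whereas this uses a frequentist fit via statsmodels simply to show the same linear predictor.

```python
# Minimal sketch of the List Length Analysis linear predictor (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic survey lists: the year of each list and the number of species on it.
n_lists = 500
year = rng.integers(1970, 2010, n_lists)
list_length = rng.integers(5, 80, n_lists)

# Simulate presence/absence of one species with a gentle decline over time.
logit_p = -2.0 + 0.8 * np.log(list_length) - 0.02 * (year - 1970)
present = (rng.random(n_lists) < 1 / (1 + np.exp(-logit_p))).astype(float)

# Linear predictor: intercept + log(list length) + centred year.
X = sm.add_constant(np.column_stack([np.log(list_length), year - year.mean()]))
fit = sm.Logit(present, X).fit(disp=False)
print(fit.params)  # the third coefficient estimates the per-year trend in prevalence
```

A Bayesian fit of the same model would additionally provide the full posterior for the year effect, which is what allows the paper to report probabilities of declines or increases of given magnitudes.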
Abstract:
Changing environments pose a serious problem to current robotic systems aiming at long-term operation under varying seasons or local weather conditions. This paper builds on our previous work, in which we proposed learning to predict the changes in an environment. Our key insight is that the occurring scene changes are in part systematic, repeatable and therefore predictable. The goal of our work is to support existing approaches to place recognition by learning how the visual appearance of an environment changes over time and by using this learned knowledge to predict its appearance under different environmental conditions. We describe the general idea of appearance change prediction (ACP) and investigate properties of our novel implementation based on vocabularies of superpixels (SP-ACP). Our previous work showed that the proposed approach significantly improves the performance of SeqSLAM and BRIEF-Gist for place recognition on a subset of the Nordland dataset under extremely different environmental conditions in summer and winter. This paper deepens the understanding of the proposed SP-ACP system and evaluates the influence of its parameters. We present the results of a large-scale experiment on the complete 10 h Nordland dataset and appearance change predictions between different combinations of seasons.
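As an illustration of the vocabulary-based prediction idea, the following sketch (an assumed reconstruction, not the SP-ACP implementation) clusters paired summer and winter superpixel descriptors into two visual vocabularies, learns which winter word each summer word most often co-occurs with, and uses that mapping to predict a winter appearance for a summer query.

```python
# Illustrative sketch of appearance change prediction via paired vocabularies.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Assumed training data: descriptors of superpixels covering the same scene
# region in summer and in winter (rows are paired across the two arrays).
summer_desc = rng.normal(size=(2000, 16))
winter_desc = summer_desc + rng.normal(scale=0.5, size=(2000, 16))

k = 64
summer_vocab = KMeans(n_clusters=k, n_init=4, random_state=0).fit(summer_desc)
winter_vocab = KMeans(n_clusters=k, n_init=4, random_state=0).fit(winter_desc)

# Co-occurrence of summer word i with winter word j over the paired training set.
cooc = np.zeros((k, k), dtype=int)
for s, w in zip(summer_vocab.labels_, winter_vocab.labels_):
    cooc[s, w] += 1
summer_to_winter = cooc.argmax(axis=1)  # most likely winter word per summer word

def predict_winter_appearance(query_desc):
    """Map a summer query's superpixel descriptors to predicted winter descriptors."""
    summer_words = summer_vocab.predict(query_desc)
    winter_words = summer_to_winter[summer_words]
    return winter_vocab.cluster_centers_[winter_words]

predicted = predict_winter_appearance(rng.normal(size=(50, 16)))
print(predicted.shape)  # (50, 16)
```

The predicted descriptors can then be fed to an unmodified place recognition method such as SeqSLAM or BRIEF-Gist in place of the original summer appearance.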
Abstract:
In the structural health monitoring (SHM) field, long-term continuous vibration-based monitoring is becoming increasingly popular as this could keep track of the health status of structures during their service lives. However, implementing such a system is not always feasible due to ongoing conflicts between budget constraints and the need for sophisticated systems to monitor real-world structures under their demanding in-service conditions. To address this problem, this paper presents a comprehensive development of a cost-effective and flexible vibration DAQ system for long-term continuous SHM of a newly constructed institutional complex with a special focus on the main building. First, selections of sensor type and sensor positions are scrutinized to overcome adversities such as low-frequency and low-level vibration measurements. In order to economically tackle the sparse measurement problem, a cost-optimized Ethernet-based peripheral DAQ model is adopted to form the system skeleton. A combination of a high-resolution timing coordination method based on the TCP/IP command communication medium and a periodic system resynchronization strategy is then proposed to synchronize data from multiple distributed DAQ units. The results of both experimental evaluations and experimental–numerical verifications show that the proposed DAQ system in general, and the data synchronization solution in particular, work well and can provide a promising cost-effective and flexible alternative for use in real-world SHM projects. Finally, the paper demonstrates simple but effective ways to make use of the developed monitoring system for long-term continuous structural health evaluation, as well as to use the instrumented building herein as a multi-purpose benchmark structure for studying not only practical SHM problems but also synchronization-related issues.
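By way of illustration, the sketch below shows one way timing coordination over a TCP/IP command channel can be realised: the coordinator timestamps a command and its reply, the DAQ unit returns its own clock reading, and the offset is estimated under a symmetric-delay assumption. The command name, port and protocol details are assumptions for the example, not the system described in the paper.

```python
# Illustrative clock-offset estimation between a coordinator and a DAQ unit over TCP.
import socket
import time

def estimate_offset(unit_host, unit_port=5005):
    """Return (offset_seconds, round_trip_seconds) between local and unit clocks."""
    with socket.create_connection((unit_host, unit_port), timeout=2.0) as sock:
        t0 = time.time()                        # request sent (local clock)
        sock.sendall(b"GET_TIME\n")             # hypothetical command name
        unit_time = float(sock.recv(64))        # unit clock at reply
        t1 = time.time()                        # reply received (local clock)
    round_trip = t1 - t0
    offset = unit_time - (t0 + round_trip / 2)  # assumes a symmetric network delay
    return offset, round_trip

# A periodic resynchronization strategy would re-run estimate_offset() at a fixed
# interval for each unit and apply the latest offset when timestamping its samples.
```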
Abstract:
AIM: This systematic review investigated the prescription, administration and effectiveness of oral liquid nutritional supplements (OLNS) for people with dementia in residential aged care facilities (RACF). METHODS: A comprehensive search of relevant databases, hand searching and cross-referencing found 15 relevant articles from a total of 2910 possible results. Articles which met the inclusion criteria were critically appraised by two independent reviewers using the relevant Joanna Briggs Institute (JBI) appraisal checklist. Data were extracted using the relevant JBI extraction instruments. No data synthesis was possible due to clinical and methodological heterogeneity. RESULTS: Included studies examined a range of strategies, issues and results related to OLNS for persons with dementia in RACFs; however, there appear to be significant gaps in the current body of research, particularly in relation to examinations of effectiveness. CONCLUSIONS: This review was unable to produce a definitive finding regarding effectiveness. OLNS may improve the nutritional state of residents with dementia and help prevent weight loss, and there is some suggestion that they may slow the rate of cognitive decline. However, in order for OLNS to be effective, nursing and care staff need to ensure that sufficient attention is paid to the issues of prescription and administration.
Abstract:
We investigated functional, morphological and molecular adaptations to strength training exercise and cold water immersion (CWI) through two separate studies. In one study, 21 physically active men strength trained for 12 weeks (2 d·wk⁻¹), with either 10 min of CWI or active recovery (ACT) after each training session. Strength and muscle mass increased more in the ACT group than in the CWI group (P<0.05). Isokinetic work (19%), type II muscle fibre cross-sectional area (17%) and the number of myonuclei per fibre (26%) increased in the ACT group (all P<0.05) but not the CWI group. In another study, nine active men performed a bout of single-leg strength exercises on separate days, followed by CWI or ACT. Muscle biopsies were collected before and 2, 24 and 48 h after exercise. The number of satellite cells expressing neural cell adhesion molecule (NCAM) (10−30%) and paired box protein (Pax7) (20−50%) increased 24−48 h after exercise with ACT. The number of NCAM+ satellite cells increased 48 h after exercise with CWI. NCAM+ and Pax7+ satellite cell numbers were greater after ACT than after CWI (P<0.05). Phosphorylation of p70S6 kinase Thr421/Ser424 increased after exercise in both conditions but was greater after ACT (P<0.05). These data suggest that CWI attenuates the acute changes in satellite cell numbers and activity of kinases that regulate muscle hypertrophy, which may translate to smaller long-term training gains in muscle strength and hypertrophy. The use of CWI as a regular post-exercise recovery strategy should be reconsidered.
Abstract:
This cross-sectional study assessed intellect, cognition, academic function, behaviour, and emotional health of long-term survivors after childhood liver transplantation. Eligible children were >5 yr post-transplant, still attending school, and resident in Queensland. Hearing and neurocognitive testing were performed on 13 transplanted children and six siblings including two twin pairs where one was transplanted and the other not. Median age at testing was 13.08 (range 6.52-16.99) yr; time elapsed after transplant 10.89 (range 5.16-16.37) yr; and age at transplant 1.15 (range 0.38-10.00) yr. Mean full-scale IQ was 97 (81-117) for transplanted children and 105 (87-130) for siblings. No difficulties were identified in intellect, cognition, academic function, and memory and learning in transplanted children or their siblings, although both groups had reduced mathematical ability compared with normal. Transplanted patients had difficulties in executive functioning, particularly in self-regulation, planning and organization, problem-solving, and visual scanning. Thirty-one percent (4/13) of transplanted patients, and no siblings, scored in the clinical range for ADHD. Emotional difficulties were noted in transplanted patients but were not different from their siblings. Long-term liver transplant survivors exhibit difficulties in executive function and are more likely to have ADHD despite relatively intact intellect and cognition.
Abstract:
Organ-specific immunity is a feature of many infectious diseases, including visceral leishmaniasis caused by Leishmania donovani. Experimental visceral leishmaniasis in genetically susceptible mice is characterized by an acute, resolving infection in the liver and chronic infection in the spleen. CD4+ T cell responses are critical for the establishment and maintenance of hepatic immunity in this disease model, but their role in chronically infected spleens remains unclear. In this study, we show that dendritic cells are critical for CD4+ T cell activation and expansion in all tissue sites examined. We found that FTY720-mediated blockade of T cell trafficking early in infection prevented Ag-specific CD4+ T cells from appearing in lymph nodes, but not the spleen and liver, suggesting that early CD4+ T cell priming does not occur in liver-draining lymph nodes. Extended treatment with FTY720 over the first month of infection increased parasite burdens, although this was associated with blockade of lymphocyte egress from secondary lymphoid tissue, as well as with more generalized splenic lymphopenia. Importantly, we demonstrate that CD4+ T cells are required for the establishment and maintenance of antiparasitic immunity in the liver, as well as for immune surveillance and suppression of parasite outgrowth in chronically infected spleens. Finally, although early CD4+ T cell priming appeared to occur most effectively in the spleen, we unexpectedly revealed that protective CD4+ T cell-mediated hepatic immunity could be generated in the complete absence of all secondary lymphoid tissues.
Abstract:
The recent trend for journals to require open access to primary data included in publications has been embraced by many biologists, but has caused apprehension amongst researchers engaged in long-term ecological and evolutionary studies. A worldwide survey of 73 principal investigators (PIs) with long-term studies revealed positive attitudes towards sharing data with the agreement or involvement of the PI, and 93% of PIs have historically shared data. Only 8% were in favor of uncontrolled, open access to primary data while 63% expressed serious concern. We present here their viewpoint on an issue that can have non-trivial scientific consequences. We discuss potential costs of public data archiving and provide possible solutions to meet the needs of journals and researchers.
Abstract:
Malnutrition is common in end-stage liver disease, but a correction after transplantation is expected. Body cell mass (BCM) assessment using total body potassium (TBK) measurements is considered the gold standard for assessing nutritional status. The aim of this study was to examine the BCM and, therefore, nutritional status of long-term survivors after childhood liver transplantation. © 2014 American Association for the Study of Liver Diseases.
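For context, body cell mass is usually derived from a total body potassium measurement with a fixed conversion factor; the snippet below is an illustrative calculation assuming the commonly used Moore factor of roughly 8.33 g of body cell mass per mmol of potassium, and is not taken from the paper.

```python
# Illustrative conversion from total body potassium (TBK) to body cell mass (BCM),
# assuming the widely cited Moore factor of ~8.33 g BCM per mmol of potassium.
def bcm_from_tbk(tbk_mmol, grams_per_mmol=8.33):
    """Estimate body cell mass (kg) from total body potassium (mmol)."""
    return tbk_mmol * grams_per_mmol / 1000.0

print(round(bcm_from_tbk(3200), 1))  # e.g. 3200 mmol TBK -> ~26.7 kg BCM
```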
Abstract:
Daytime sleep is a significant part of the daily routine for children attending early childhood education and care (ECEC) services in Australia and many other countries. The practice of sleep-time can account for a substantial portion of the day in ECEC and often involves a mandated sleep/rest period for all children, including older preschool-aged children. Yet, there is evidence that children have a reduced need for daytime sleep as they approach school entry age and that continuation of mandated sleep-time in ECEC for preschool-aged children may have a negative impact on their health, development, learning and well-being. Mandated sleep-time practices also go against current quality expectations for services to support children’s agency and autonomy in ECEC. This study documents children’s reports of their experiences of sleep-time in ECEC. Semi-structured interviews were conducted with 54 preschool-aged children (44–63 months) across four long day ECEC services that employed a range of sleep-time practices. Findings provide a snapshot of children’s views and experiences of sleep-time and perceptions of autonomy-supportive practices. These provide a unique platform to support critical reflection on sleep-time policies and practices, with a view to continuous quality improvement in ECEC. This study forms part of a programme of work from the Sleep in Early Childhood research group. Our work examines sleep practices in ECEC, the subsequent staff, parent and child experiences and impacts on family and child learning and development outcomes.
Abstract:
Protocols for secure archival storage are becoming increasingly important as the use of digital storage for sensitive documents is gaining wider practice. Wong et al. [8] combined verifiable secret sharing with proactive secret sharing without reconstruction and proposed a verifiable secret redistribution protocol for long-term storage. However, their protocol requires that each of the receivers is honest during redistribution. We proposed [3] an extension to their protocol wherein we relaxed the requirement that all the recipients be honest to the condition that only a simple majority amongst the recipients need be honest during the (re)distribution processes. Further, both of these protocols make use of Feldman's approach for achieving integrity during the (re)distribution processes. In this paper, we present a revised version of our earlier protocol, and its adaptation to incorporate Pedersen's approach instead of Feldman's, thereby achieving information-theoretic secrecy while retaining integrity guarantees.
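To illustrate the difference the commitment scheme makes, the sketch below shows Shamir sharing with Pedersen-style commitments over a toy group: each polynomial coefficient is committed together with a random blinding value, which is what gives information-theoretic hiding, while shares remain publicly verifiable. The parameters and code are a minimal illustration, not the redistribution protocol described in the paper.

```python
# Toy Pedersen-committed secret sharing: p = 2q + 1, with g and h generators of
# the order-q subgroup. In practice p and q are large and log_g(h) is unknown.
import random

p, q = 23, 11
g, h = 4, 9

def poly_eval(coeffs, x, mod):
    """Evaluate a polynomial with the given coefficients at x (mod `mod`)."""
    return sum(c * pow(x, j, mod) for j, c in enumerate(coeffs)) % mod

def share(secret, threshold, n):
    """Split `secret` into n Pedersen-committed shares with the given threshold."""
    f = [secret] + [random.randrange(q) for _ in range(threshold - 1)]  # secret polynomial
    r = [random.randrange(q) for _ in range(threshold)]                 # blinding polynomial
    commitments = [pow(g, a, p) * pow(h, b, p) % p for a, b in zip(f, r)]
    shares = [(i, poly_eval(f, i, q), poly_eval(r, i, q)) for i in range(1, n + 1)]
    return shares, commitments

def verify(i, s_i, t_i, commitments):
    """Check a share against the public commitments: g^s * h^t == prod_j C_j^(i^j)."""
    lhs = pow(g, s_i, p) * pow(h, t_i, p) % p
    rhs = 1
    for j, c in enumerate(commitments):
        rhs = rhs * pow(c, pow(i, j, q), p) % p
    return lhs == rhs

shares, commitments = share(secret=7, threshold=3, n=5)
print(all(verify(i, s, t, commitments) for i, s, t in shares))  # True
```

Unlike Feldman commitments (g raised to each coefficient alone), the blinding exponent on h means the commitments reveal nothing about the coefficients even to an unbounded adversary, which is the information-theoretic secrecy property referred to above.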
Abstract:
Background: Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) around the world vary greatly. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence to suggest 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased cost.

Objectives: To assess the clinical effects (benefits and harms) of intermittent flushing of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children.

Search methods: The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials.

Selection criteria: Randomised controlled trials that compared the efficacy of normal saline with heparin to prevent occlusion of long-term CVCs in infants and children aged up to 18 years of age were included. We excluded temporary CVCs and peripherally inserted central catheters (PICC).

Data collection and analysis: Two review authors independently assessed trial inclusion criteria and trial quality, and extracted data. Rate ratios were calculated for two outcome measures - occlusion of the CVC and central line-associated blood stream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter, or both, and other CVC-related complications such as dislocation of CVCs, other CVC site infections and thrombosis.

Main results: Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin; however, the studies used different protocols for the standard and experimental arms, with different concentrations of heparin and different frequencies of flushes reported. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated blood stream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin group was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants).

Authors' conclusions: The review found that there was not enough evidence to determine the effects of intermittent flushing of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
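For readers unfamiliar with the outcome measure, the snippet below illustrates how a rate ratio per 1000 catheter days and its 95% confidence interval are conventionally computed from event counts and catheter days in each arm; the numbers are placeholders, not data from the review.

```python
# Illustrative rate ratio with a 95% CI on the log scale (placeholder numbers).
import math

def rate_ratio(events_a, days_a, events_b, days_b):
    """Rate ratio of arm A vs arm B with an approximate 95% confidence interval."""
    rr = (events_a / days_a) / (events_b / days_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)  # SE of log(RR), Poisson approximation
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, (lo, hi)

# e.g. 6 occlusions over 4000 catheter days (saline) vs 8 over 4200 (heparin)
rr, ci = rate_ratio(6, 4000, 8, 4200)
print(f"rate ratio {rr:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```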