954 results for Long-term care facilities
Abstract:
In 2001, the red imported fire ant (Solenopsis invicta Buren) was identified in Brisbane, Australia. An eradication program involving broadcast bait treatment with two insect growth regulators and a metabolic inhibitor began in September of that year and is currently ongoing. To gauge the impacts of these treatments on local ant populations, we examined long-term monitoring data and quantified abundance patterns of S. invicta and common local ant genera using a linear mixed-effects model. For S. invicta, presence in pitfall traps declined over time to zero at every site. Significantly higher numbers of S. invicta workers were collected on high-density polygyne sites, which took longer to disinfest than monogyne and low-density polygyne sites. For local ants, nine of the 10 most common genus groups analyzed either increased in abundance or showed no significant trend. Five of these genus groups were significantly less abundant at the start of monitoring on high-density polygyne sites than on monogyne and low-density polygyne sites. The genus Pheidole declined significantly in abundance over time, suggesting that it was affected by treatment efforts. These results demonstrate that the treatment regime used at the time successfully removed S. invicta from these sites in Brisbane, and that most local ant genera were not seriously impacted by the treatment. These results have important implications for current and future prophylactic treatment efforts, and suggest that native ants remain in treated areas to provide some biological resistance to S. invicta.
Abstract:
Immediate and residual effects of two durations of a low plane of nutrition (PON) on the synthesis of milk protein and protein fractions were studied at the Mutdapilly Research Station, in south-east Queensland. Thirty-six multiparous Holstein-Friesian cows, between 46 and 102 days in milk (DIM) initially, were used in a completely randomised design experiment with three treatments. All cows were fed a basal diet of ryegrass pasture (7.0 kg DM/cow.day), a barley-sorghum concentrate mix (2.7 kg DM/cow.day) and a canola meal-mineral mix (1.3 kg DM/cow.day). To increase PON, 5.0 kg DM/cow.day of supplemental maize and forage sorghum silage was added to the basal diet. The three treatments were (C) high PON (basal diet + supplemental silage); (L9) low PON (basal diet only) for a period of 9 weeks; and (L3) low PON (basal diet only) for a period of 3 weeks. The experiment comprised three periods: (1) covariate, high PON for all groups (5 weeks); (2) low PON for either 3 weeks (L3) or 9 weeks (L9); and (3) high PON for all groups, to assess the ability of cows to recover any production lost as a result of the treatments (5 weeks). The low PON treatment periods for L3 and L9 were end-aligned so that all treatment groups began Period 3 together. Although there was a significant effect of L9 on yields of milk, protein, fat and lactose, and on concentrations of true protein, whey protein and urea, these responses were not significantly different from L3. There were no residual effects of L3 or L9 on protein concentration or nitrogen distribution after 5 weeks of realimentation. There was no significant effect of low PON for 3 or 9 weeks on casein concentration or composition.
Abstract:
Background Hyperferritinemia-cataract syndrome (HCS) is a rare Mendelian condition characterized by bilateral cataract and high levels of serum ferritin in the absence of iron overload. Methods HCS was diagnosed in three adult siblings. In two of them it was possible to assess lens changes initially in 1995 and again in 2013. Serum ferritin, iron and transferrin concentrations and transferrin saturation percentage were also measured, and the Iron Responsive Element (IRE) region of the L-ferritin gene (FTL) was studied. Results Serum ferritin concentrations were considerably elevated, while serum iron, transferrin and transferrin saturation levels were within the normal range in each sibling. Cataract changes in our patients were consistent with those previously reported in the literature. Progression of the cataract, an aspect little studied in this syndrome, appeared to be quite limited in extent. The heterozygous +32G>T (-168G>T) substitution in the IRE of the FTL gene was detected in this family. Conclusions Ophthalmic and biochemical studies together with genetic testing confirmed HCS in three family members. Although the disorder has been extensively described in recent years, little is known regarding cataract evolution over time. In our cases, lens evaluations encompassed many years, identified bilateral cataract of typical morphology and supported the hypothesis that this unique clinical feature of the disease tends to be slowly progressive in nature, at least in adults.
Abstract:
Background There has been considerable publicity regarding population ageing and hospital emergency department (ED) overcrowding. Our study aims to investigate the impact of one intervention piloted in Queensland, Australia, the Hospital in the Nursing Home (HiNH) program, on reducing ED and hospital attendances from residential aged care facilities (RACFs). Methods A quasi-experimental study was conducted at an intervention hospital undertaking the program and a control hospital with normal practice. Routine Queensland health information system data were extracted for analysis. Results Significant reductions in the number of ED presentations per 1000 RACF beds (rate ratio (95% CI): 0.78 (0.67–0.92); p = 0.002), the number of hospital admissions per 1000 RACF beds (0.62 (0.50–0.76); p < 0.0001), and the number of hospital admissions per 100 ED presentations (0.61 (0.43–0.85); p = 0.004) were observed in the intervention hospital after the intervention, while there were no significant differences between the intervention and control hospitals before the intervention. Pre-test and post-test comparison in the intervention hospital also showed significant decreases in the ED presentation rate (0.75 (0.65–0.86); p < 0.0001) and the hospital admission rate per RACF bed (0.66 (0.54–0.79); p < 0.0001), and a non-significant reduction in the hospital admission rate per ED presentation (0.82 (0.61–1.11); p = 0.196). Conclusions The Hospital in the Nursing Home program could be effective in reducing ED presentations and hospital admissions from RACF residents. Implementation of the program across a variety of settings is recommended to fully assess the ongoing benefits for patients and any possible cost savings.
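The rate ratios quoted in this abstract compare event rates (e.g. ED presentations per 1000 RACF beds) between groups. As an illustration only, a minimal sketch of how such a rate ratio and its Wald 95% CI on the log scale are typically computed; the counts below are invented for the example and are not the study's data:

```python
import math

def rate_ratio_ci(events_a, exposure_a, events_b, exposure_b, z=1.96):
    """Rate ratio of group A vs group B with a Wald 95% CI on the log scale."""
    rr = (events_a / exposure_a) / (events_b / exposure_b)
    # Standard error of log(RR) for two Poisson counts.
    se = math.sqrt(1 / events_a + 1 / events_b)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical counts: 390 ED presentations per 2000 bed-units post-intervention
# vs 500 per 2000 pre-intervention, giving a ratio on the scale reported above.
rr, lo, hi = rate_ratio_ci(390, 2000, 500, 2000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR = 0.78
```

A ratio below 1 with a CI excluding 1, as here, indicates a statistically significant reduction.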
Abstract:
Soil biogeochemical cycles are largely mediated by microorganisms, while fire significantly modifies these cycles, mainly by altering microbial communities and substrate availability. The majority of studies on fire effects have focused on the surface soil; therefore, our understanding of the vertical distribution of microbial communities and the impacts of fire on nitrogen (N) dynamics through the soil profile is limited. Here, we examined changes in soil denitrification capacity (DNC) and denitrifying communities with depth under different burning regimes, and their interaction with environmental gradients along the soil profile. Results showed that soil depth had a more pronounced impact than the burning treatment on bacterial community size. The abundance of 16S rRNA and denitrification genes (narG, nirK, and nirS) declined exponentially with soil depth. Surprisingly, nosZ-harboring denitrifiers were enriched in the deeper soil layers, likely indicating that they can better adapt to stress conditions (i.e., oxygen deficiency, nutrient limitation, etc.) than other denitrifiers. Soil nutrients, including dissolved organic carbon (DOC), total soluble N (TSN), ammonium (NH4+), and nitrate (NO3−), declined significantly with soil depth, which probably contributed to the vertical distribution of denitrifying communities. Soil DNC also decreased significantly with soil depth and was negligible below 20 cm. These findings provide new insights into niche separation of the N-cycling functional guilds along the soil profile under a varied fire disturbance regime.
Abstract:
Medication information is a critical part of the information required to ensure residents' safety in the highly collaborative care context of RACFs. Studies report poor medication information as a barrier to improving medication management in RACFs, yet research exploring medication work practices in aged care settings remains limited. This study aimed to identify contextual and work practice factors contributing to breakdowns in medication information exchange in RACFs in relation to the medication administration process. We employed non-participant observations and semi-structured interviews to explore information practices in three Australian RACFs. Findings identified inefficiencies due to a lack of information timeliness, manual stock management, multiple data transcriptions, inadequate design of essential documents such as administration sheets, and a reliance on manual auditing procedures. Technological solutions such as electronic medication administration records offer opportunities to overcome some of the identified problems. However, these interventions need to be designed to align with the collaborative, team-based processes they are intended to support.
Abstract:
The aim of this study was to examine the actions of geographically dispersed process stakeholders (doctors, community pharmacists and RACFs) in coping with the information silos that exist within and across different settings. The study setting involved three metropolitan RACFs in Sydney, Australia, and employed a qualitative approach using semi-structured interviews, non-participant observations and artefact analysis. Findings showed that medication information was stored in silos, which required specific actions by each setting to translate this information to fit local requirements. A salient example was the way community pharmacists used the RACF medication charts to prepare residents' pharmaceutical records. This translation of medication information across settings was often accompanied by telephone or face-to-face conversations to cross-check, validate or obtain new information. Findings highlighted that technological interventions that work in silos can negatively impact the quality of medication management processes in RACF settings. The implementation of commercial software applications such as electronic medication charts needs to be appropriately integrated to satisfy the collaborative information requirements of the RACF medication process.
Abstract:
Prescribed fire is one of the most widely used management tools for reducing fuel loads in managed forests. However, the long-term effects of repeated prescribed fires on soil carbon (C) and nitrogen (N) pools are poorly understood. This study aimed to investigate how different fire frequency regimes influence C and N pools in the surface soils (0–10 cm). A prescribed fire field experiment established in 1972 in a wet sclerophyll forest in southeast Queensland was used in this study. The fire frequency regimes included long unburnt (NB), burnt every 2 years (2yrB) and burnt every 4 years (4yrB), with four replications. Compared with the NB treatment, the 2yrB treatment lowered soil total C by 44%, total N by 54%, HCl-hydrolysable C and N by 48% and 59%, KMnO4-oxidizable C by 81%, microbial biomass C and N by 42% and 33%, cumulative CO2–C by 28%, NaOCl-non-oxidizable C and N by 41% and 51%, and charcoal-C by 17%, respectively. The 4yrB and NB treatments showed no significant differences for these soil C and N pools. All labile, biologically active, recalcitrant and total soil C and N pools were positively correlated with each other and with soil moisture content, but negatively correlated with soil pH. The C:N ratios of the different C and N pools were greater in the burned treatments than in the NB treatment. This study highlights that prescribed burning at a four-year interval is a more sustainable management practice for this subtropical forest ecosystem.
Abstract:
Bovine Viral Diarrhoea Virus (BVDV) is one of the most serious pathogens affecting the cattle industry, causing tremendous economic loss worldwide and meriting the development of improved subunit vaccines. The structural glycoprotein E2 is reported to be a major immunogenic determinant of the BVDV virion. We have developed a novel hollow silica vesicle (SV)-based platform to administer BVDV-1 Escherichia coli-expressed optimised E2 (oE2) antigen as a nanovaccine formulation. The SV-140 vesicles (diameter 50 nm, wall thickness 6 nm, perforated by pores with an entrance size of 16 nm and a total pore volume of 0.934 cm³/g) have proven to be ideal candidates to load the oE2 antigen and generate an immune response. The current study demonstrates for the first time the ability of freeze-dried (FD) as well as non-FD oE2/SV-140 nanovaccine formulations to induce long-term, balanced antibody and cell-mediated memory responses for at least 6 months with a shortened dosing regimen of two doses in a small animal model. The in vivo ability of oE2 (100 μg)/SV-140 (500 μg) and FD oE2 (100 μg)/SV-140 (500 μg) to induce long-term immunity was compared with immunisation with oE2 (100 μg) together with the conventional adjuvant Quil-A from Quillaja saponaria (10 μg) in mice. Both the oE2/SV-140 and the FD oE2/SV-140 nanovaccines generated oE2-specific antibody and cell-mediated responses for up to six months after the second and final immunisation. Significantly, cell-mediated responses remained consistently high in mice immunised with oE2/SV-140 (1,500 SFU/million cells) at the six-month time point. Histopathology studies showed no morphological changes at the site of injection or in the different organs harvested from mice immunised with 500 μg SV-140 nanovaccine compared with unimmunised controls. The platform has the potential for developing single-dose vaccines without the requirement for cold-chain storage for veterinary and human applications.
Abstract:
This study investigated long-term use of custom-made orthopedic shoes (OS) at 1.5-year follow-up. In addition, the association between short-term outcomes and long-term use was studied. Patients from a previously published study who used their first-ever pair of OS 3 months after delivery received another questionnaire after 1.5 years. Patients with different pathologies were included in the study (n = 269, response = 86%). Mean age was 63 ± 14 years, and 38% were male. After 1.5 years, 87% of the patients still used their OS (78% frequently [4-7 days/week] and 90% occasionally [1-3 days/week]) and 13% of the patients had ceased using their OS. Patients who were using their OS frequently after 1.5 years had significantly higher scores for 8 of 10 short-term usability outcomes (p-values ranged from <0.001 to 0.046). The largest differences between users and nonusers were found for scores on the short-term outcomes of OS fit and communication with the medical specialist and shoe technician (effect size range = 0.16 to 0.46). We conclude that patients with worse short-term usability outcomes for their OS are more likely to use them only occasionally, or not at all, at long-term follow-up.
Abstract:
Objective To determine mortality rates after a first lower limb amputation and to explore the rates for different subpopulations. Methods Retrospective cohort study of all people who underwent a first amputation at or proximal to the transtibial level, in an area of 1.7 million people. Analysis used Kaplan-Meier curves and log-rank tests for univariate associations of psycho-social and health variables, and logistic regression for odds of death at 30 days, 1 year and 5 years. Results 299 people were included. Median time to death was 20.3 months (95% CI: 13.1; 27.5). 30-day mortality was 22%; odds of death were 2.3 times higher in people with a history of cerebrovascular disease (95% CI: 1.2; 4.7, p = 0.016). 1-year mortality was 44%; odds of death were 3.5 times higher for people with renal disease (95% CI: 1.8; 7.0, p < 0.001). 5-year mortality was 77%; odds of death were 5.4 times higher for people with renal disease (95% CI: 1.8; 16.0, p = 0.003). Variation in mortality rates was most apparent across age groups, with people aged 75–84 years having better short-term outcomes than those both younger and older. Conclusions Mortality rates demonstrated the frailty of this population, with almost one quarter of people dying within 30 days and almost half within 1 year. People with cerebrovascular disease had higher odds of death at 30 days, and those with renal disease at 1 and 5 years, respectively.
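The "odds of death X times higher" figures in this abstract are odds ratios. As an illustration only, a minimal sketch of an unadjusted odds ratio from a 2x2 table with a Wald 95% CI on the log scale; the table below is invented for the example and is not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for the 2x2 table [[a, b], [c, d]] with a Wald 95% CI.

    a = deaths with the risk factor,    b = survivors with the risk factor,
    c = deaths without the risk factor, d = survivors without the risk factor.
    """
    odds_ratio = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return odds_ratio, odds_ratio * math.exp(-z * se), odds_ratio * math.exp(z * se)

# Hypothetical table: 30-day deaths split by history of cerebrovascular disease.
or_, lo, hi = odds_ratio_ci(25, 45, 40, 189)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

The study's published estimates came from logistic regression, which reduces to this calculation only in the single-predictor, unadjusted case.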
Abstract:
Information exchange (IE) is a critical component of the complex collaborative medication process in residential aged care facilities (RACFs). Designing information and communication technology (ICT) to support complex processes requires a profound understanding of the IE that underpins their execution. Little existing research investigates the complexity of IE in RACFs and its impact on ICT design. The aim of this study was thus to undertake an in-depth exploration of the IE process involved in medication management to identify its implications for the design of ICT. The study was undertaken at a large metropolitan facility in NSW, Australia. A total of three focus groups, eleven interviews and two observation sessions were conducted between July and August 2010. Process modelling was undertaken by translating the qualitative data via in-depth, iterative inductive analysis. The findings highlight the complexity and collaborative nature of IE in RACF medication management. The resulting models emphasize the need to: a) deal with temporal complexity; b) rely on an interdependent set of coordinative artefacts; and c) use synchronous communication channels for coordination. Taken together, these are crucial aspects of the IE process in RACF medication management that need to be catered for when designing ICT in this critical area. This study provides important new evidence of the advantages of viewing a process as part of a system, rather than as segregated tasks, as a means of identifying the latent requirements for designing ICT able to support complex collaborative processes such as medication management in RACFs. © 2012 IEEE.
Abstract:
Background Poor clinical handover has been associated with inaccurate clinical assessment and diagnosis, delays in diagnosis and test ordering, medication errors and decreased patient satisfaction in the acute care setting. Research on the handover process in the residential aged care sector is very limited. Purpose The aims of this study were to: (i) develop an in-depth understanding of the handover process in aged care by mapping all the key activities and their information dynamics, (ii) identify gaps in information exchange in the handover process and analyze their implications for resident safety, and (iii) develop practical recommendations on how information and communication technology (ICT) can improve the process and resident safety. Methods The study was undertaken at a large metropolitan facility in NSW with more than 300 residents and a staff including 55 registered nurses (RNs) and 146 assistants in nursing (AINs). A total of 3 focus groups, 12 interviews and 3 observation sessions were conducted over the period from July to October 2010. Process mapping was undertaken by translating the qualitative data via a five-category code book developed prior to the analysis. Results Three major sub-processes were identified and mapped: Handover Process (HOP) I "Information gathering by RN", HOP II "Preparation of preliminary handover sheet" and HOP III "Execution of handover meeting". Inefficient practices were identified in relation to the handover, including duplication of information, use of multiple communication modes and information sources, and lack of standardization. Conclusion By providing a robust process model of handover, this study has made two critical contributions to research in aged care: (i) a means to identify important, possibly suboptimal practices; and (ii) valuable evidence to plan and improve ICT implementation in residential aged care. 
The mapping of this process enabled analysis of gaps in information flow and their potential impact on resident safety. In addition, it offers the basis for further studies into a process that, despite its importance for securing resident safety and continuity of care, lacks research.
Abstract:
Background Medication safety is a pressing concern for residential aged care facilities (RACFs). Retrospective studies in RACF settings identify inadequate communication between RACFs, doctors, hospitals and community pharmacies as the major cause of medication errors. Existing literature offers limited insight into the gaps in the existing information exchange process that may lead to medication errors. The aim of this research was to explicate the cognitive distribution underlying RACF medication ordering and delivery, in order to identify gaps in medication-related information exchange that lead to medication errors in RACFs. Methods The study was undertaken in three RACFs in Sydney, Australia. Data were generated through ethnographic fieldwork over a period of five months (May–September 2011). Triangulated analysis of the data focused primarily on examining the transformation and exchange of information between different media across the process. Results The findings of this study highlight the extensive scope and intense nature of information exchange in RACF medication ordering and delivery. Rather than attributing error to individual care providers, the explication of distributed cognition processes enabled the identification of gaps in three information exchange dimensions which potentially contribute to the occurrence of medication errors, namely: (1) the design of medication charts, which complicates order processing and record keeping; (2) the lack of coordination mechanisms between participants, which results in misalignment of local practices; and (3) reliance on restricted-bandwidth communication channels, mainly telephone and fax, which complicates information processing requirements. The study demonstrates how the identification of these gaps enhances understanding of medication errors in RACFs. 
Conclusions Application of the theoretical lens of distributed cognition can assist in enhancing our understanding of medication errors in RACFs through identification of gaps in information exchange. Understanding the dynamics of the cognitive process can inform the design of interventions to manage errors and improve residents’ safety.
Abstract:
Life-history theory states that although natural selection would favour the maximisation of both reproductive output and life-span, such a combination cannot be achieved in any living organism. According to life-history theory, the reason that not all traits can be maximised simultaneously is that different traits compete with each other for resources. The relationships between traits that constrain the simultaneous evolution of two or more traits are called trade-offs. During different life-stages, an individual therefore needs to optimise its allocation of resources to life-history components such as growth, reproduction and survival. Resource limitation acts on these traits, so investment in one trait, e.g. reproduction, reduces the resources available for investment in another, e.g. residual reproduction or survival. In this thesis I study how food resources during different stages of the breeding event affect reproductive decisions in the Ural owl (Strix uralensis), and the consequences of these decisions for parents and offspring. The Ural owl is a suitable species for such studies in natural populations, since it is long-lived, site-tenacious, and feeds on voles. Vole populations in Fennoscandia fluctuate in three- to four-year cycles, which creates a variable food environment for the Ural owls to cope with. The thesis gives new insight into reproductive costs and their consequences in natural animal populations, with emphasis on the underlying physiological mechanisms. I found that supplementary-fed Ural owl parents invest supplemented food resources during breeding in their own self-maintenance instead of allocating those resources to offspring growth. This investment in own maintenance instead of improving current reproduction had carry-over effects to the following year in terms of increased reproductive output. I therefore found evidence that reduced reproductive costs improve future reproductive performance. 
Furthermore, I found evidence for the underlying mechanism behind this carry-over effect of supplementary food on fecundity. The supplementary-fed parents reduced their feeding investment in the offspring compared with controls, which enabled the fed female parents to invest the surplus resources in parasite resistance. Fed female parents had lower blood parasite loads than control females, and this effect lasted until the following year, when reproductive output was also increased. Hence, increased investment in parasite resistance when resources are plentiful has the potential to mediate positive carry-over effects on future reproduction. I further found that this carry-over effect was only present when prospects for future reproduction were good. The thesis also provides new knowledge on how resource limitation shapes maternal effects. I found that increased resources prior to egg laying improve the condition and health of Ural owl females and enable them to allocate more resources to reproduction than control females. These additional resources are not allocated to increasing the number of offspring, but instead to improving the quality of each offspring. Fed Ural owl females increased the size of their eggs and allocated more health-improving immunological components into the eggs. Furthermore, the increased egg size had long-lasting effects on offspring growth, as offspring from larger eggs were heavier at fledging. Limited resources can thus have different short- and long-term consequences for reproductive decisions that affect both offspring number and quality. In long-lived organisms, such as the Ural owl, it appears to be beneficial in terms of fitness to invest in a long breeding life-span instead of additional investment in current reproduction. In Ural owls, females can influence the phenotypic quality of the offspring by transferring additional resources to the eggs, which can have long-lasting effects on growth.