161 results for Long-term care facilities


Relevance: 100.00%

Abstract:

Background/Rationale: Guided by the need-driven dementia-compromised behavior (NDB) model, this study examined influences of the physical environment on wandering behavior. Methods: Using a descriptive, cross-sectional design, 122 wanderers from 28 long-term care (LTC) facilities were videotaped 10 to 12 times; data on wandering, light, sound, temperature and humidity levels, location, ambiance, and crowding were obtained. Associations between environmental variables and wandering were evaluated with chi-square and t tests; the model was evaluated using logistic regression. Results: In all, 80% of wandering occurred in the resident’s own room, dayrooms, hallways, or dining rooms. When observed in other residents’ rooms, hallways, shower/baths, or off-unit locations, wanderers were wandering in 60%-92% of observations. The data fit the model well, both overall (likelihood-ratio χ2(5) = 50.38, P < .0001) and by wandering type. Conclusions: Location, light, sound, proximity of others, and ambiance are associated with wandering and may serve to inform environmental designs and care practices.
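For readers unfamiliar with the likelihood-ratio statistic quoted above, the sketch below shows how such a test could be run for a binary wandering outcome against five environmental predictors. The file name, column names, and the statsmodels workflow are illustrative assumptions, not the study's actual analysis code.

```python
# Illustrative sketch: likelihood-ratio test for a logistic regression of
# wandering (0/1) on five environmental predictors, analogous to the abstract's
# LR chi-square(5) statistic. File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("observations.csv")  # hypothetical file of coded observations
predictors = ["light", "sound", "temperature", "crowding", "ambiance"]

X = sm.add_constant(df[predictors])
model = sm.Logit(df["wandering"], X).fit(disp=0)

# Likelihood-ratio chi-square comparing the fitted model to an intercept-only model
print(f"LR chi2({model.df_model:.0f}) = {model.llr:.2f}, p = {model.llr_pvalue:.4f}")
```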

Relevance: 100.00%

Abstract:

The efficiency of agricultural management practices to store soil organic carbon (SOC) depends on the C input level and how far a soil is from its saturation level (i.e., its saturation deficit). The C saturation hypothesis posits an ultimate soil C stabilization capacity defined by four soil organic matter (SOM) pools capable of C saturation: (1) non-protected, (2) physically protected, (3) chemically protected, and (4) biochemically protected. We tested whether C saturation deficit and the amount of added C influenced SOC storage in measurable soil fractions corresponding to the conceptual chemical, physical, biochemical, and non-protected C pools. We added two levels of 13C-labeled residue to soil samples from seven agricultural sites that were either closer to (i.e., A-horizon) or further from (i.e., C-horizon) their C saturation level and incubated them for 2.5 years. Residue-derived C stabilization was, at most sites, directly related to C saturation deficit, but mechanisms of C stabilization differed between the chemically and biochemically protected pools. The physically protected C pool showed a varied effect of C saturation deficit on 13C stabilization, owing to opposite behavior of the particulate organic matter (POM) and mineral fractions. We found distinct behavior between unaggregated and aggregated mineral-associated fractions, emphasizing the mechanistic difference between the chemically and physically protected C pools. To accurately predict SOC dynamics and stabilization, C saturation of soil C pools, particularly the chemically and biochemically protected pools, should be considered.
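The saturation deficit invoked here is simply the gap between a soil's current C content and its hypothesized maximum stabilization capacity; a minimal bookkeeping sketch follows, with all values hypothetical.

```python
# Minimal sketch of the C saturation-deficit bookkeeping described above.
# All numbers are hypothetical; units are g C per kg soil.
c_saturation_limit = 45.0   # hypothesized maximum stabilization capacity
c_current_a_horizon = 32.0  # A-horizon soil, closer to saturation
c_current_c_horizon = 8.0   # C-horizon soil, further from saturation

for label, c_now in [("A-horizon", c_current_a_horizon),
                     ("C-horizon", c_current_c_horizon)]:
    deficit = c_saturation_limit - c_now
    print(f"{label}: saturation deficit = {deficit:.1f} g C/kg "
          f"({deficit / c_saturation_limit:.0%} of capacity unfilled)")
```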

Relevance: 100.00%

Abstract:

Microorganisms play key roles in biogeochemical cycling by facilitating the release of nutrients from organic compounds. In doing so, microbial communities use different organic substrates that yield different amounts of energy for maintenance and growth of the community. Carbon utilization efficiency (CUE) is a measure of the efficiency with which substrate carbon is metabolized versus mineralized by the microbial biomass. In the face of global change, we wanted to know how temperature affected the efficiency with which the soil microbial community utilized an added labile substrate, and to determine the effect of labile soil carbon depletion (through increasing duration of incubation) on the community's ability to respond to an added substrate. Cellobiose was added to soil samples as a model compound at several times over the course of a long-term incubation experiment to measure the amount of carbon assimilated or lost as respired CO2. Results indicated that in all cases, the time required for the microbial community to take up the added substrate increased as the incubation time prior to substrate addition increased. However, CUE was not affected by incubation time. Increased temperature generally decreased CUE; thus the microbial community was more efficient at 15 °C than at 25 °C. These results indicate that at warmer temperatures microbial communities may release more CO2 per unit of assimilated carbon. Current climate-carbon models use a fixed CUE to predict how much CO2 will be released as soil organic matter is decomposed. Based on our findings, this assumption may be incorrect because CUE varies with temperature.
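CUE as defined above is the fraction of consumed substrate carbon retained in biomass rather than respired; a minimal sketch of that calculation follows, with hypothetical 13C partitioning values rather than data from the study.

```python
# Sketch of the carbon-use-efficiency calculation described above:
# CUE = substrate C retained in microbial biomass / total substrate C consumed.
# Values are hypothetical micrograms of labelled substrate C per g soil.
def carbon_use_efficiency(c_assimilated: float, c_respired: float) -> float:
    """Fraction of consumed substrate C incorporated into biomass
    rather than mineralized to CO2."""
    return c_assimilated / (c_assimilated + c_respired)

cue_15c = carbon_use_efficiency(c_assimilated=38.0, c_respired=22.0)
cue_25c = carbon_use_efficiency(c_assimilated=30.0, c_respired=30.0)
print(f"CUE at 15 C: {cue_15c:.2f}")  # more substrate C retained per unit respired
print(f"CUE at 25 C: {cue_25c:.2f}")  # warmer incubation, lower efficiency
```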

Relevance: 100.00%

Abstract:

Although current assessments of the effects of agricultural management practices on soil organic C (SOC) dynamics are usually conducted without any explicit consideration of limits to soil C storage, it has been hypothesized that the SOC pool has an upper, or saturation, limit with respect to C input levels at steady state. Agricultural management practices that increase C input levels over time can produce a new equilibrium soil C content; however, treatments with multiple C input levels that produce no increase in SOC stocks at equilibrium indicate that soils can become saturated with respect to C inputs. SOC storage of added C input is a function of how far a soil is from its saturation level (its saturation deficit) as well as of the C input level. We tested experimentally whether C saturation deficit and varying C input levels influenced soil C stabilization of added 13C in soils varying in SOC content and physicochemical characteristics. We incubated for 2.5 years soil samples from seven agricultural sites that were closer to (i.e., A-horizon) or further from (i.e., C-horizon) their C saturation limit. At the initiation of the incubations, samples received low or high C input levels of 13C-labeled wheat straw. We also tested the effect of Ca addition and residue quality on a subset of these soils. We hypothesized that the proportion of C stabilized would be greater in samples with larger C saturation deficits (i.e., the C- versus A-horizon samples) and that the relative stabilization efficiency (i.e., ΔSOC/ΔC input) would decrease as C input level increased. We found that C saturation deficit influenced the stabilization of added residue at six of the seven sites and that C addition level affected the stabilization of added residue at four sites, corroborating both hypotheses. Increasing Ca availability or decreasing residue quality had no effect on the stabilization of added residue. The amount of new C stabilized was significantly related to C saturation deficit, supporting the hypothesis that C saturation influenced C stabilization at all our sites. Our results suggest that soils with low C contents and degraded lands may have the greatest potential and efficiency to store added C because they are further from their saturation level.
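The relative stabilization efficiency mentioned above (ΔSOC/ΔC input) can be illustrated with a small sketch; the numbers are hypothetical and simply show how the ratio would behave if efficiency declines at higher input levels, as the hypothesis predicts.

```python
# Sketch of the relative stabilization efficiency (delta SOC / delta C input)
# contrasted between low and high residue-addition levels. All values are
# hypothetical, in g C per kg soil over the 2.5-year incubation.
def stabilization_efficiency(delta_soc: float, delta_c_input: float) -> float:
    """New soil C stabilized per unit of added residue C."""
    return delta_soc / delta_c_input

low_input = stabilization_efficiency(delta_soc=0.9, delta_c_input=5.0)
high_input = stabilization_efficiency(delta_soc=2.1, delta_c_input=15.0)
print(f"Low C input:  {low_input:.2f} g stabilized per g added")
print(f"High C input: {high_input:.2f} g stabilized per g added (lower efficiency)")
```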

Relevance: 100.00%

Abstract:

No-tillage (NT) management has been promoted as a practice capable of offsetting greenhouse gas (GHG) emissions because of its ability to sequester carbon in soils. However, true mitigation is only possible if the overall impact of NT adoption reduces the net global warming potential (GWP) determined by the fluxes of the three major biogenic GHGs (i.e., CO2, N2O, and CH4). We compiled all available data comparing soil-derived GHG emissions between conventionally tilled (CT) and NT systems in humid and dry temperate climates. Newly converted NT systems increase GWP relative to CT practices in both humid and dry climate regimes, and longer-term adoption (>10 years) only significantly reduces GWP in humid climates. Mean cumulative GWP over a 20-year period is also reduced under continuous NT in dry areas, but with a high degree of uncertainty. Emissions of N2O drive much of the trend in net GWP, suggesting that improved nitrogen management is essential to realize the full benefit of soil carbon storage for global warming mitigation. Our results indicate a strong time dependency in the GHG mitigation potential of NT agriculture, demonstrating that GHG mitigation by adoption of NT is much more variable and complex than previously considered, and that policy plans to reduce global warming through this land management practice need further scrutiny to ensure success.
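Net GWP as used above aggregates the three gas fluxes into CO2 equivalents; a hedged sketch of that conversion follows, using commonly cited 100-year GWP factors and made-up flux values, which need not match those used in the compiled studies.

```python
# Sketch of converting the three biogenic GHG fluxes into a net global warming
# potential in CO2 equivalents. Flux values are hypothetical (kg gas / ha / yr);
# the GWP factors are commonly cited 100-year values (CH4 ~ 25, N2O ~ 298),
# which may differ from those applied in the study.
GWP_FACTORS = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def net_gwp(fluxes_kg_per_ha: dict) -> float:
    """Net GWP in kg CO2-equivalents per hectare (negative = net sink)."""
    return sum(GWP_FACTORS[gas] * flux for gas, flux in fluxes_kg_per_ha.items())

# Hypothetical example: soil C gain under NT (negative CO2 flux) partly offset by N2O.
nt_fluxes = {"CO2": -1200.0, "CH4": -1.0, "N2O": 2.5}
print(f"Net GWP: {net_gwp(nt_fluxes):.0f} kg CO2-eq/ha/yr")
```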

Relevance: 100.00%

Abstract:

To undertake exploratory benchmarking of a set of clinical indicators of quality care in Australian residential aged care, data were collected from 107 residents within four medium-sized facilities (40–80 beds) in Brisbane. The proportion of residents in each sample facility with a particular clinical problem was compared with US Minimum Data Set quality indicator thresholds. Results demonstrated variability within and between clinical indicators, suggesting that breadth of assessment using various clinical indicators of quality is an important factor when monitoring quality of care. More comprehensive and objective measures of quality of care would be of great assistance in determining and monitoring the effectiveness of residential aged care provision in Australia, particularly as demands for accountability by consumers and their families increase.
What is known about the topic? The key to quality improvement is effective quality assessment, and one means of evaluating quality of care is through clinical outcomes. The Minimum Data Set quality indicators have been credited with improving quality in United States nursing homes.
What does this paper add? The Clinical Care Indicator (CCI) Tool was used to collect data on clinical outcomes, enabling comparison of data from a small Australian sample with American quality benchmarks to illustrate the utility of providing guidelines for interpretation.
What are the implications for practitioners? Collecting and comparing clinical outcome data would enable practitioners to better understand the quality of care being provided and whether practices require review. The CCI Tool could provide a comprehensive and systematic means of doing this, thus filling a gap in quality monitoring within Australian residential aged care.
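The benchmarking step described above amounts to comparing each facility's prevalence of a clinical problem against lower and upper quality-indicator thresholds; a sketch with hypothetical counts and thresholds (not the actual US MDS values) follows.

```python
# Sketch of benchmarking facility-level clinical indicators against thresholds.
# Indicator names, counts, and threshold values are all hypothetical.
thresholds = {"pressure_injury": (0.02, 0.12),   # (lower, upper) prevalence bounds
              "weight_loss": (0.05, 0.15)}
facility = {"pressure_injury": (3, 45),          # (cases, residents assessed)
            "weight_loss": (9, 45)}

for indicator, (cases, n) in facility.items():
    prevalence = cases / n
    lower, upper = thresholds[indicator]
    if prevalence < lower:
        flag = "below lower threshold (potentially excellent care)"
    elif prevalence > upper:
        flag = "above upper threshold (care warrants review)"
    else:
        flag = "within expected range"
    print(f"{indicator}: {prevalence:.1%} - {flag}")
```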

Relevance: 100.00%

Abstract:

Background: In response to the need for more comprehensive quality assessment within Australian residential aged care facilities, the Clinical Care Indicator (CCI) Tool was developed to collect outcome data as a means of making inferences about quality. A national trial of its effectiveness and a Brisbane-based trial of its use within the quality improvement context determined that the CCI Tool represented a potentially valuable addition to the Australian aged care system. This document describes the next phase in the CCI Tool's development, the aims of which were to establish the validity and reliability of the CCI Tool and to develop quality indicator thresholds (benchmarks) for use in Australia. The CCI Tool is now known as the ResCareQA (Residential Care Quality Assessment). Methods: The study aims were achieved through a combination of quantitative data analysis and expert panel consultations using a modified Delphi process. The expert panel consisted of experienced aged care clinicians, managers, and academics; they were initially consulted to determine the face and content validity of the ResCareQA, and later to develop thresholds of quality. To analyse its psychometric properties, ResCareQA forms were completed for all residents (N = 498) of nine aged care facilities throughout Queensland. Kappa statistics were used to assess inter-rater and test-retest reliability, and Cronbach's alpha coefficient was calculated to determine internal consistency. For concurrent validity, equivalent items on the ResCareQA and the Resident Classification Scale (RCS) were compared using Spearman's rank order correlations, while discriminative validity was assessed using the known-groups technique, comparing ResCareQA results between groups with differing care needs, as well as between male and female residents. Rank-ordered facility results for each clinical care indicator (CCI) were circulated to the panel; upper and lower thresholds for each CCI were nominated by panel members and refined through a Delphi process. These thresholds indicate excellent care at one extreme and questionable care at the other. Results: Minor modifications were made to the assessment, and it was renamed the ResCareQA. Agreement on its content was reached after two Delphi rounds; the final version contains 24 questions across four domains, enabling generation of 36 CCIs. Both test-retest and inter-rater reliability were sound, with median kappa values of 0.74 (test-retest) and 0.91 (inter-rater); internal consistency was not as strong, with a Cronbach's alpha of 0.46. Because the ResCareQA does not provide a single combined score, comparisons for concurrent validity were made with the RCS on an item-by-item basis, with most resultant correlations being quite low. Discriminative validity analyses, however, revealed highly significant differences in the total number of CCIs between high-care and low-care groups (t(199) = 10.77, p < 0.001), while the differences between male and female residents were not significant (t(414) = 0.56, p = 0.58). Clinical outcomes varied both within and between facilities; agreed upper and lower thresholds were finalised after three Delphi rounds. Conclusions: The ResCareQA provides a comprehensive, easily administered means of monitoring quality in residential aged care facilities that can be reliably used on multiple occasions. The relatively modest internal consistency score was likely due to the multi-factorial nature of quality and the absence of an aggregate result for the assessment. Measurement of concurrent validity proved difficult in the absence of a gold standard, but the sound discriminative validity results suggest that the ResCareQA has acceptable validity and could be confidently used as an indication of care quality within Australian residential aged care facilities. The thresholds, while preliminary due to the small sample size, enable users to make judgements about quality within and between facilities. It is therefore recommended that the ResCareQA be adopted for wider use.
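As a rough illustration of the two reliability statistics reported above, the sketch below computes Cohen's kappa for two hypothetical raters and Cronbach's alpha over a made-up residents-by-items score matrix; it is not the study's analysis code.

```python
# Sketch of the reliability statistics reported above, with made-up ratings:
# Cohen's kappa for inter-rater agreement on one binary indicator, and
# Cronbach's alpha across indicator items for internal consistency.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 0, 1, 1, 0, 0, 1, 0]   # hypothetical ratings by assessor A
rater_b = [1, 0, 1, 0, 0, 0, 1, 0]   # hypothetical ratings by assessor B
print(f"Cohen's kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")

def cronbach_alpha(items: np.ndarray) -> float:
    """items: residents x indicator-items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

scores = np.array([[1, 0, 1, 1], [0, 0, 1, 0], [1, 1, 1, 1], [0, 0, 0, 1], [1, 0, 1, 0]])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```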

Relevance: 100.00%

Abstract:

In a randomized, double-blind study, 202 healthy adults were randomized to receive a live, attenuated Japanese encephalitis chimeric virus vaccine (JE-CV) and placebo 28 days apart in a cross-over design. A subgroup of 98 volunteers received a JE-CV booster at month 6. Safety, immunogenicity, and persistence of antibodies to month 60 were evaluated. There were no unexpected adverse events (AEs), and the incidence of AEs was similar between JE-CV and placebo. There were three serious adverse events (SAEs) and no deaths. A moderately severe case of acute viral illness commencing 39 days after placebo administration was the only SAE considered possibly related to immunization. Overall, 99% of vaccine recipients achieved a seroprotective antibody titer ≥ 10 to JE-CV 28 days after the single dose of JE-CV, and 97% were seroprotected at month 6. Kaplan–Meier analysis showed that, after a single dose of JE-CV, 87% of the participants who were seroprotected at month 6 were still protected at month 60. This rate was 96% among those who received a booster immunization at month 6. On day 28 after immunization, 95% of subjects developed a neutralizing titer ≥ 10 against at least three of the four strains in a panel of wild-type Japanese encephalitis virus (JEV) strains. At month 60, that proportion was 65% for participants who received a single dose of JE-CV and 75% for the booster group. These results suggest that JE-CV is safe and well tolerated and that a single dose provides long-lasting immunity to wild-type strains.
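The Kaplan–Meier persistence analysis described above can be sketched as follows, treating loss of a seroprotective titer as the event and participants still protected at their last visit as censored; the durations and the use of the lifelines library are illustrative assumptions, not the trial's analysis.

```python
# Sketch of a Kaplan-Meier estimate of seroprotection persistence: time (months)
# until the titer falls below 10, with still-protected participants censored.
# All durations and event flags are hypothetical.
from lifelines import KaplanMeierFitter

months_followed = [12, 24, 36, 48, 60, 60, 60, 60, 60, 60]
lost_protection = [1,  1,  0,  1,  0,  0,  0,  0,  0,  0]  # 1 = titer fell below 10

kmf = KaplanMeierFitter()
kmf.fit(months_followed, event_observed=lost_protection, label="seroprotection")
print(f"Estimated proportion still seroprotected at month 60: {kmf.predict(60):.0%}")
```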

Relevance: 100.00%

Abstract:

Hydrogels provide a 3-dimensional network for embedded cells and offer promise for cartilage tissue engineering applications. Nature-derived hydrogels, including alginate, have been shown to enhance the chondrocyte phenotype but are variable and not entirely controllable. Synthetic hydrogels, including polyethylene glycol (PEG)-based matrices, have the advantage of repeatability and modularity; mechanical stiffness, cell adhesion, and degradability can be altered independently. In this study, we compared the long-term in vitro effects of different hydrogels (alginate and Factor XIIIa-cross-linked MMP-sensitive PEG at two stiffness levels) on the behavior of expanded human chondrocytes and the development of construct properties. Monolayer-expanded human chondrocytes remained viable throughout culture, but morphology varied greatly in different hydrogels. Chondrocytes were characteristically round in alginate but mostly spread in PEG gels at both concentrations. Chondrogenic gene (COL2A1, aggrecan) expression increased in all hydrogels, but alginate constructs had much higher expression levels of these genes (up to 90-fold for COL2A1), as well as proteoglycan 4, a functional marker of the superficial zone. Also, chondrocytes expressed COL1A1 and COL10A1, indicative of de-differentiation and hypertrophy. After 12 weeks, constructs with lower polymer content were stiffer than similar constructs with higher polymer content, with the highest compressive modulus measured in 2.5% PEG gels. Different materials and polymer concentrations have markedly different potency to affect chondrocyte behavior. While synthetic hydrogels offer many advantages over natural materials such as alginate, they must be further optimized to elicit desired chondrocyte responses for use as cartilage models and for development of functional tissue-engineered articular cartilage.

Relevance: 100.00%

Abstract:

Background: Techniques for detecting circulating tumour cells in the peripheral blood of patients with head and neck cancers may identify individuals likely to benefit from early systemic treatment. Methods: Reconstruction experiments were used to optimise immunomagnetic enrichment and RT-PCR detection of circulating tumour cells using four markers (ELF3, CK19, EGFR and EphB4). This method was then tested in a pilot study using samples from 16 patients with advanced head and neck carcinomas. Results: Seven patients were positive for circulating tumour cells both prior to and after surgery, 4 patients were positive prior to but not after surgery, 3 patients were positive after but not prior to surgery, and 2 patients were negative. Two patients tested positive for circulating cells but had no other evidence of tumour spread. Because this patient cohort had mostly advanced disease, the detection of circulating tumour cells was, as expected, not associated with significant differences in overall or disease-free survival. Conclusion: For the first time, we show that almost all patients with advanced head and neck cancers have circulating tumour cells at the time of surgery. The clinical application of techniques for detecting spreading disease, such as the immunomagnetic enrichment RT-PCR analysis used in this study, should be explored further.