988 results for Winter sports facilities
Abstract:
For frail older people, admission to hospital is an opportunity to review the indications for specific medications. This research investigates prescribing for 206 older people discharged into residential aged care facilities from 11 acute care hospitals in Australia. Patients had multiple comorbidities (mean 6), high levels of dependency, and were prescribed a mean of 7.2 regular medications at admission to hospital and 8.1 medications on discharge, with hyper-polypharmacy (≥10 drugs) increasing from 24.3% to 32.5%. Many drugs were preventive medications whose time until benefit was likely to exceed the expected lifespan. In summary, frail patients continue to be exposed to extensive polypharmacy and medications with uncertain risk–benefit ratio.
Abstract:
Field studies were conducted over 5 years on two dairy farms in southern Queensland to evaluate the impacts of zero-tillage, nitrogen (N) fertiliser and legumes on a winter-dominant forage system based on raingrown oats. Oats was successfully established using zero-tillage methods, with no yield penalties and potential benefits from stubble retention over the summer fallow. N fertiliser, applied at above industry-standard rates (140 vs. 55 kg/ha per crop) in the first 3 years, significantly increased forage N concentration and had residual effects on soil nitrate-N at both sites. At one site, crop yield increased by 10 kg DM/ha per kg of fertiliser N applied above industry-standard rates. The difference between sites in fertiliser response reflected contrasting soil and fertiliser histories. There was no evidence that modifications to oats cropping practices (zero-tillage and increased N fertiliser) increased surface soil organic carbon (0–10 cm) within the time frame of the present study. When oats was substituted with annual legumes, the forage N content of the immediately following oat crop improved, but legume yield was significantly lower than that of oats. In contrast, the perennial legume Medicago sativa was competitive with oats in biomass production and forage quality at both sites and increased soil nitrate-N levels following termination. However, its contribution to winter forage was low, at 10% of total production compared with 40% for oats, and soil water reserves were significantly reduced at one site, which affected the following oat crop. The study demonstrated that productive grazed oat crops can be grown using zero-tillage and that increased N fertiliser has a more consistent effect on N concentration than on forage yield. A lucerne ley provides a strategy for raising soil nitrate-N concentration and increasing overall forage productivity, although winter forage production is reduced.
Abstract:
The wheat grain industry is Australia's second largest agricultural export commodity, and there is increasing demand from industry for accurate, objective and near real-time crop production information. The advent of the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite platform has augmented the capability of satellite-based applications to capture reflectance over large areas at acceptable pixel scale, cost and accuracy. This article investigated the use of multi-temporal MODIS enhanced vegetation index (EVI) imagery to determine crop area. The rigour of the harmonic analysis of time series (HANTS) and early-season metric approaches was assessed when extrapolating over the entire Queensland (QLD) cropping region for the 2005 and 2006 seasons. Early-season crop area estimates, made at least 4 months before harvest, produced high accuracy at pixel and regional scales, with percent errors of -8.6% and -26% for the 2005 and 2006 seasons, respectively. The HANTS approach also showed high accuracy in discriminating among crops at pixel and regional scale: errors for specific area estimates for wheat, barley and chickpea were 9.9%, -5.2% and 10.9% for 2005 and -2.8%, -78% and 64% for 2006, respectively. Area estimates of total winter crop, wheat, barley and chickpea gave coefficient of determination (R²) values of 0.92, 0.89, 0.82 and 0.52 when contrasted against actual shire-scale data. A high coefficient of determination (0.87) was achieved for total winter crop area estimates in August across all shires for the 2006 season. Furthermore, the HANTS approach discriminated cropping area from non-cropping area with high accuracy and highlighted the need for accurate and up-to-date land use maps. The extrapolation of these approaches to estimate total and specific winter crop areas, well before flowering, showed good utility across larger areas and seasons. Hence, it is envisaged that this technology might be transferable to different regions across Australia.
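The abstract gives no implementation details for HANTS; as a rough, illustrative sketch of the underlying idea, the Python snippet below fits a mean plus sinusoidal harmonics to a simulated 16-day EVI series by ordinary least squares. (The full HANTS algorithm goes further, iteratively rejecting cloud-contaminated observations that fall below the fitted curve.) All data and parameter values here are invented for illustration.

```python
import numpy as np

def harmonic_design(t, n_harmonics, period):
    # Columns: constant term, then a cos/sin pair for each harmonic k
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * k * t / period
        cols.extend([np.cos(w), np.sin(w)])
    return np.column_stack(cols)

def fit_harmonics(t, y, n_harmonics=2, period=365.0):
    """Least-squares fit of a mean plus harmonics; returns the smoothing function."""
    A = harmonic_design(t, n_harmonics, period)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda t_new: harmonic_design(t_new, n_harmonics, period) @ coef

# Simulated 16-day EVI composites over one year, with noise
t = np.arange(0.0, 365.0, 16.0)
evi = 0.25 + 0.20 * np.sin(2 * np.pi * (t - 120.0) / 365.0) + 0.02 * np.random.randn(t.size)

smooth = fit_harmonics(t, evi)
print(smooth(np.array([150.0, 240.0])))  # smoothed EVI at two dates
```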
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY, (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia's main grain-production regions in different eras, starting from the 1960s in Queensland and extending to Victoria in the 2000s. Improved management practices adopted over this period were reflected in potential yields increasing with research era, from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used, and these provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of the wheat trials with available grain protein data (~45% of all experiments) established that grain yield, and not grain N content, was the major driver of crop N demand and CV90. Subsets of data were used to explore the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. These analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of the yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved in using pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance depends almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential and reduces the opportunity for effective in-season applications.
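The abstract does not state which response function the BFDC Interrogator fits; a Mitscherlich-type curve is one common choice for such soil test calibrations. Under that assumption, a minimal sketch of deriving CV90 from trial data (all values invented) might look like this:

```python
import numpy as np
from scipy.optimize import curve_fit

# Mitscherlich-type response: relative yield (%) as a function of soil nitrate-N (kg/ha)
def mitscherlich(x, a, b):
    return a * (1.0 - np.exp(-b * x))

# Invented trial data: pre-plant nitrate-N (0-0.6 m) vs relative yield
nitrate_n = np.array([10, 20, 35, 50, 70, 100, 140], dtype=float)
rel_yield = np.array([45, 62, 78, 86, 92, 97, 99], dtype=float)

(a, b), _ = curve_fit(mitscherlich, nitrate_n, rel_yield, p0=(100.0, 0.02))

# CV90: the soil test value at which the fitted curve reaches 90% relative yield
cv90 = -np.log(1.0 - 90.0 / a) / b
print(f"CV90 ≈ {cv90:.0f} kg nitrate-N/ha")
```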
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and their confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow the confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pH(CaCl2) <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials, when full tillage was common, to those conducted in 1995–2011, a period of rapid shift towards minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Critical Colwell-P concentrations at 90% of maximum relative yield ranged from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols and Vertosols, possibly due to differences in soil properties related to P sorption; however, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley differed from those for wheat on the same soils. Significant knowledge gaps that must be filled to improve the relevance and reliability of soil P testing for winter cereals include: the lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database for extracting locally relevant critical P concentrations to guide P fertiliser decision-making in wheat and barley.
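The abstract reports confidence intervals around the critical Colwell-P concentrations without describing how they were computed; one generic way to obtain such an interval is to bootstrap the treatment series. A sketch under that assumption, reusing a Mitscherlich-type fit with invented data:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def mitscherlich(x, a, b):
    return a * (1.0 - np.exp(-b * x))

def critical_p(x, y, target=90.0):
    # Fit the response curve and invert it at the target relative yield
    (a, b), _ = curve_fit(mitscherlich, x, y, p0=(100.0, 0.1), maxfev=10000)
    return -np.log(1.0 - target / a) / b

# Invented treatment series: Colwell-P (mg/kg) vs relative yield (%)
colwell_p = np.array([5, 10, 15, 20, 30, 40, 60], dtype=float)
rel_yield = np.array([50, 70, 82, 88, 94, 97, 99], dtype=float)

# Bootstrap over trials to put an interval around the critical concentration
boot = []
idx = np.arange(colwell_p.size)
for _ in range(1000):
    s = rng.choice(idx, size=idx.size, replace=True)
    try:
        cp = critical_p(colwell_p[s], rel_yield[s])
    except RuntimeError:
        continue  # skip resamples where the fit does not converge
    if np.isfinite(cp):
        boot.append(cp)

lo, hi = np.percentile(boot, [2.5, 97.5])  # 95% bootstrap interval
print(f"critical Colwell-P ≈ {critical_p(colwell_p, rel_yield):.1f} mg/kg (95% CI {lo:.1f}-{hi:.1f})")
```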
Abstract:
The Florida manatee, Trichechus manatus latirostris, is a hindgut-fermenting herbivore. In winter, manatees migrate to warm-water overwintering sites where they undergo dietary shifts and may suffer from cold-induced stress. Given these seasonally induced changes in diet, the present study examined variation in the hindgut bacterial communities of wild manatees overwintering at Crystal River, west Florida. Faeces were sampled from 36 manatees of known sex and body size in early winter, when manatees were newly arrived, and again in mid-winter and late winter, when diet had probably changed and environmental stress may have increased. Concentrations of faecal cortisol metabolite, an indicator of a stress response, were measured by enzyme immunoassay. Following 454 amplicon pyrosequencing of the 16S rRNA gene V3/V4 region, 2027 bacterial operational taxonomic units (OTUs) were identified in manatee faeces. Classified sequences were assigned to eight previously described bacterial phyla; only 0.36% of sequences could not be classified to phylum level. Five core phyla were identified in all samples. The majority (96.8%) of sequences were classified as Firmicutes (77.3 ± 11.1% of total sequences) or Bacteroidetes (19.5 ± 10.6%). Alpha-diversity measures trended towards higher diversity of the hindgut microbiota in mid-winter compared with early and late winter. Beta-diversity measures, analysed by PERMANOVA, also indicated significant seasonal differences in bacterial community composition.
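The alpha- and beta-diversity measures referred to here are standard community-ecology statistics computed from the OTU count table; the abstract does not say which indices were used. As a generic illustration (not the paper's actual pipeline), the Shannon index and Bray-Curtis dissimilarity can be computed directly:

```python
import numpy as np

def shannon_index(counts):
    """Alpha diversity H' = -sum(p_i * ln p_i) over the non-zero OTU counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def bray_curtis(a, b):
    """Beta diversity: Bray-Curtis dissimilarity between two OTU count vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return np.abs(a - b).sum() / (a + b).sum()

# Invented OTU counts for an early-winter and a mid-winter faecal sample
early = [120, 30, 5, 0, 2]
mid = [80, 45, 20, 10, 6]
print(shannon_index(early), shannon_index(mid))  # higher value = more diverse
print(bray_curtis(early, mid))                   # 0 = identical, 1 = disjoint
```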
Abstract:
The root-lesion nematode, Pratylenchus thornei, can reduce wheat yields by >50%. Although this nematode has a broad host range, crop rotation can be an effective management tool if the host status of crops and cultivars is known. The summer crops grown in the northern grain region of Australia are poorly characterised for their resistance to P. thornei and their role in crop sequencing to improve wheat yields. In a 4-year field experiment, we prepared plots with high or low populations of P. thornei by growing susceptible wheat or partially resistant canaryseed (Phalaris canariensis); after an 11-month, weed-free fallow, several cultivars of eight summer crops were grown. Following another 15-month, weed-free fallow, the P. thornei-intolerant wheat cv. Strzelecki was grown. Populations of P. thornei were determined to 150 cm soil depth throughout the experiment. When two partially resistant crops were grown in succession, e.g. canaryseed followed by panicum (Setaria italica), P. thornei populations were <739/kg soil and subsequent wheat yields were 3245 kg/ha. In contrast, after two susceptible crops, e.g. wheat followed by soybean, P. thornei populations were 10 850/kg soil and subsequent wheat yields were just 1383 kg/ha. Regression analysis showed a linear, negative response of wheat biomass and grain yield to increasing P. thornei populations, with predicted losses of 77% for biomass and 62% for grain yield. The best predictor of wheat yield loss was the P. thornei population at 0–90 cm soil depth. Crop rotation can be used to reduce P. thornei populations and increase wheat yield, with the greatest gains following two partially resistant crops grown in succession.
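The abstract describes a linear, negative regression of yield on P. thornei population; a minimal sketch of that kind of fit, using invented plot data rather than the paper's, could be:

```python
import numpy as np

# Invented plot data: pre-plant P. thornei density (nematodes/kg soil)
# and subsequent wheat grain yield (kg/ha)
density = np.array([700, 2000, 4000, 6500, 9000, 11000], dtype=float)
grain_yield = np.array([3200, 2900, 2400, 2000, 1600, 1400], dtype=float)

# Linear regression: yield declines as the nematode population increases
slope, intercept = np.polyfit(density, grain_yield, 1)

# Predicted percentage yield loss across the observed population range
y_low = intercept + slope * density.min()
y_high = intercept + slope * density.max()
print(f"predicted grain-yield loss ≈ {100.0 * (y_low - y_high) / y_low:.0f}%")
```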
Abstract:
Background There has been considerable publicity regarding population ageing and hospital emergency department (ED) overcrowding. Our study aims to investigate the impact of one intervention piloted in Queensland, Australia, the Hospital in the Nursing Home (HiNH) program, on reducing ED and hospital attendances from residential aged care facilities (RACFs). Methods A quasi-experimental study was conducted at an intervention hospital undertaking the program and a control hospital with normal practice. Routine Queensland health information system data were extracted for analysis. Results Significant reductions in the number of ED presentations per 1000 RACF beds (rate ratio (95% CI): 0.78 (0.67–0.92); p = 0.002), the number of hospital admissions per 1000 RACF beds (0.62 (0.50–0.76); p < 0.0001), and the number of hospital admissions per 100 ED presentations (0.61 (0.43–0.85); p = 0.004) were observed in the intervention hospital after the intervention, while there were no significant differences between the intervention and control hospitals before the intervention. Pre-test and post-test comparison in the intervention hospital also showed significant decreases in the ED presentation rate (0.75 (0.65–0.86); p < 0.0001) and the hospital admission rate per RACF bed (0.66 (0.54–0.79); p < 0.0001), and a non-significant reduction in the hospital admission rate per ED presentation (0.82 (0.61–1.11); p = 0.196). Conclusions The Hospital in the Nursing Home program could be effective in reducing ED presentations and hospital admissions from RACF residents. Implementation of the program across a variety of settings is recommended to fully assess the ongoing benefits for patients and any possible cost savings.
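The rate ratios and 95% confidence intervals reported above follow the standard form for comparing two Poisson rates; a minimal sketch of that calculation (with invented event counts, not the study's data) is:

```python
import math

def rate_ratio_ci(events_a, exposure_a, events_b, exposure_b, z=1.96):
    """Rate ratio of two Poisson rates with a Wald 95% confidence interval.

    Uses the standard approximation SE(log RR) = sqrt(1/events_a + 1/events_b).
    """
    rr = (events_a / exposure_a) / (events_b / exposure_b)
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Invented counts: ED presentations per 1000 RACF beds, post- vs pre-intervention
rr, lo, hi = rate_ratio_ci(events_a=390, exposure_a=1000, events_b=500, exposure_b=1000)
print(f"rate ratio {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```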
Abstract:
Background Medication incident reporting (MIR) is a key safety-critical process in residential aged care facilities (RACFs). Retrospective studies of medication incident reports in aged care have identified the inability of existing MIR processes to generate information that can be used to enhance residents' safety. However, there is little existing research investigating the limitations of the information exchange process that underpins MIR, despite the considerable resources that RACFs devote to the MIR process. The aim of this study was to undertake an in-depth exploration of the information exchange process involved in MIR and to identify factors that inhibit the collection of meaningful information in RACFs. Methods The study was undertaken in three RACFs (part of a large non-profit organisation) in NSW, Australia. A total of 23 semi-structured interviews and 62 hours of observation sessions were conducted between May and July 2011. The qualitative data were iteratively analysed using a grounded theory approach. Results The findings highlight significant gaps in the design of the MIR artefacts as well as information exchange issues in MIR process execution. The results emphasised the need to: a) design MIR artefacts that facilitate identification of the root causes of medication incidents; b) integrate the MIR process within existing information systems to overcome key gaps in information exchange execution; and c) support the exchange of information that can facilitate a multidisciplinary approach to medication incident management in RACFs. Conclusions This study highlights the advantages of viewing the MIR process holistically rather than as segregated tasks, as a means to identify gaps in information exchange that need to be addressed in practice to improve safety-critical processes.
Abstract:
Medication information is a critical part of the information required to ensure residents' safety in the highly collaborative care context of RACFs. Studies report poor medication information as a barrier to improving medication management in RACFs, yet research exploring medication work practices in aged care settings remains limited. This study aimed to identify contextual and work practice factors contributing to breakdowns in medication information exchange in RACFs in relation to the medication administration process. We employed non-participant observations and semi-structured interviews to explore information practices in three Australian RACFs. Findings identified inefficiencies due to a lack of information timeliness, manual stock management, multiple data transcriptions, inadequate design of essential documents such as administration sheets, and a reliance on manual auditing procedures. Technological solutions such as electronic medication administration records offer opportunities to overcome some of the identified problems. However, these interventions need to be designed to align with the collaborative, team-based processes they are intended to support.
Abstract:
Introduction Electronic medication administration record (eMAR) systems are promoted as a potential intervention to enhance medication safety in residential aged care facilities (RACFs). The purpose of this study was to conduct an in-practice evaluation of an eMAR system being piloted in one Australian RACF before its roll-out, and to provide recommendations for system improvements. Methods A multidisciplinary team conducted direct observations of workflow (n = 34 hours) at the RACF site and the community pharmacy. Semi-structured interviews (n = 5) with RACF staff and the community pharmacist were conducted to investigate their views of the eMAR system. Data were analysed using a grounded theory approach to identify challenges associated with the design of the eMAR system. Results The current eMAR system does not offer an end-to-end solution for medication management. Many steps, including prescribing by doctors and communication with the community pharmacist, are still performed manually using paper charts and fax machines. Five major challenges associated with the design of the eMAR system were identified: limited interactivity; inadequate flexibility; problems related to information layout and semantics; a lack of relevant decision support; and system maintenance issues. We suggest recommendations to improve the design of the eMAR system and to optimise existing workflows. Discussion Immediate value can be achieved by improving system interactivity, reducing inconsistencies in data entry design, and offering dedicated organisational support to minimise connectivity issues. Longer-term benefits can be achieved by adding decision support features and establishing system interoperability requirements with stakeholder groups (e.g. community pharmacies) prior to roll-out. In-practice evaluations of technologies like eMAR systems have great value in identifying design weaknesses that inhibit optimal system use.
Abstract:
The aim of this study was to examine the actions of geographically dispersed process stakeholders (doctors, community pharmacists and RACFs) in coping with the information silos that exist within and across different settings. The study involved three metropolitan RACFs in Sydney, Australia, and employed a qualitative approach using semi-structured interviews, non-participant observations and artefact analysis. Findings showed that medication information was stored in silos, requiring specific actions in each setting to translate this information to fit local requirements. A salient example was the way community pharmacists used the RACF medication charts to prepare residents' pharmaceutical records. This translation of medication information across settings was often accompanied by telephone or face-to-face conversations to cross-check, validate or obtain new information. The findings highlight that technological interventions that work in silos can negatively impact the quality of medication management processes in RACF settings. Commercial software applications such as electronic medication charts need to be appropriately integrated to satisfy the collaborative information requirements of the RACF medication process.