958 results for Baseline
Abstract:
The financial health of beef cattle enterprises in northern Australia has declined markedly over the last decade due to an escalation in production and marketing costs and a real decline in beef prices. Historically, gains in animal productivity have offset the effect of declining terms of trade on farm incomes. This raises the question of whether future productivity improvements can remain a key path for lifting enterprise profitability sufficiently to ensure that the industry remains economically viable over the longer term. The key objective of this study was to assess the production and financial implications for north Australian beef enterprises of a range of technology interventions (development scenarios), including genetic gain in cattle, nutrient supplementation, and alteration of the feed base through introduced pastures and forage crops, across a variety of natural environments. To achieve this objective a beef systems model was developed that is capable of simulating livestock production at the enterprise level, including reproduction, growth and mortality, based on energy and protein supply from natural C4 pastures that are subject to high inter-annual climate variability. Comparisons between simulation outputs and enterprise performance data in three case study regions suggested that the simulation model (the Northern Australia Beef Systems Analyser) can adequately represent the performance of beef cattle enterprises in northern Australia. Testing of a range of development scenarios suggested that the application of individual technologies can substantially lift productivity and profitability, especially where the entire feed base was altered through legume augmentation.
The simultaneous implementation of multiple technologies that provide benefits to different aspects of animal productivity resulted in the greatest increases in cattle productivity and enterprise profitability, with projected weaning rates increasing by 25%, liveweight gain by 40% and net profit by 150% above current baseline levels, although gains of this magnitude might not necessarily be realised in practice. While there were slight increases in total methane output from these development scenarios, the methane emissions per kg of beef produced were reduced by 20% in scenarios with higher productivity gain. Combinations of technologies or innovative practices applied in a systematic and integrated fashion thus offer scope for providing the productivity and profitability gains necessary to maintain viable beef enterprises in northern Australia into the future.
Abstract:
Assessing the impacts of climate variability on agricultural productivity at regional, national or global scale is essential for defining adaptation and mitigation strategies. In this study we explore the potential changes in spring wheat yields at Swift Current and Melfort, Canada, for different sowing windows under projected climate scenarios (i.e., the representative concentration pathways, RCP4.5 and RCP8.5). First, the APSIM model was calibrated and evaluated at the study sites using data from long-term experimental field plots. Then, the impacts of changes in sowing dates on final yield were assessed over the 2030-2099 period against a 1990-2009 baseline period of observed yield data, assuming that other crop management practices remained unchanged. Results showed that the performance of APSIM was quite satisfactory, with an index of agreement of 0.80, R2 of 0.54, and mean absolute error (MAE) and root mean square error (RMSE) of 529 kg/ha and 1023 kg/ha, respectively (MAE = 476 kg/ha and RMSE = 684 kg/ha in the calibration phase). Under the projected climate conditions, a general trend of yield loss was observed regardless of the sowing window, ranging from -24% to -94% depending on the site and the RCP, with noticeable losses during the 2060s and beyond (increasing CO2 effects being excluded). The smallest yield losses, obtained with the earliest possible sowing date (i.e., mid-April) under the projected future climate, suggest that this option might be explored for mitigating possible adverse impacts of climate variability. Our findings could therefore serve as a basis for using APSIM as a decision support tool for adaptation/mitigation options under potential climate variability within Western Canada.
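The goodness-of-fit statistics quoted above (index of agreement, MAE, RMSE) can be computed from paired observed and simulated yields. A minimal sketch, using illustrative data rather than the study's, with the index of agreement following Willmott's standard formula:

```python
import math

def mae(obs, sim):
    # mean absolute error, in the same units as the yields (kg/ha)
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def rmse(obs, sim):
    # root mean square error; penalises large deviations more than MAE
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def willmott_d(obs, sim):
    # Willmott's index of agreement (1.0 = perfect agreement)
    o_bar = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((abs(s - o_bar) + abs(o - o_bar)) ** 2 for o, s in zip(obs, sim))
    return 1.0 - num / den
```

RMSE exceeding MAE in the reported figures (1023 vs 529 kg/ha) is expected, since RMSE weights the occasional large simulation error more heavily.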
Abstract:
In the study, we used the Agilent 8453 spectrophotometer (which is equipped with a limiting aperture that restricts the light beam to the central 5 mm of the contact lens) to measure the transmittance of various coloured contact lenses, including the One Day Acuvue Define manufactured by Johnson and Johnson, which the authors represent. We measured the instrument baseline before the transmittance spectra of the lenses were tested. The lens transmittance values were thus the difference between the baseline and the lens measurement at each wavelength. The transmittance measurements were obtained at 0.5 nm intervals, from 200 to 700 nm, after a soak in saline to remove the influence of any surface-active agents within the packaging products. The technique used in our study was not very different from how other research studies [2], [3], [4], [5] and [6] have measured the spectral transmittances of contact lenses...
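The sampling grid and baseline correction described above can be sketched as follows. The helper names and the sign convention of the difference are assumptions for illustration; the instrument's own baseline routine is not described in the text:

```python
def wavelength_grid(start=200.0, stop=700.0, step=0.5):
    # 0.5 nm sampling from 200 to 700 nm, as stated in the measurements
    n = int(round((stop - start) / step)) + 1
    return [start + i * step for i in range(n)]

def baseline_corrected(baseline_scan, lens_scan):
    # lens transmittance taken as the point-by-point difference between
    # the scan with the lens in place and the instrument baseline scan
    # (sign convention assumed)
    return [lens - blank for blank, lens in zip(baseline_scan, lens_scan)]
```

At 0.5 nm resolution the 200-700 nm range yields 1001 sample points per spectrum.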
Abstract:
This paper presents an effective classification method based on Support Vector Machines (SVMs) in the context of activity recognition. Local features that capture both spatial and temporal information in activity videos have made significant progress recently. Efficient and effective features, feature representation and classification play a crucial role in activity recognition. For classification, SVMs are popular because of their simplicity and efficiency; however, the common multi-class SVM approaches suffer from limitations, including easily confused classes and computational inefficiency. We propose using a binary tree SVM to address the shortcomings of multi-class SVMs in activity recognition. We construct a binary tree using Gaussian Mixture Models (GMMs), where activities are repeatedly allocated to subnodes until every newly created node contains only one activity. Then, for each internal node a separate SVM is learned to classify activities, which significantly reduces the training time and increases the speed of testing compared to the popular `one-against-the-rest' multi-class SVM classifier. Experiments carried out on the challenging and complex Hollywood dataset demonstrate comparable performance to the baseline bag-of-features method.
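The tree construction can be illustrated with a toy sketch. The paper splits activity classes with a GMM fitted to their features; here a simple sort-and-halve split on one-dimensional class centroids stands in for the GMM (an assumption for illustration only), and each internal node marks where a binary SVM would be trained:

```python
class Node:
    def __init__(self, names):
        self.classes = list(names)     # activity classes reaching this node
        self.left = self.right = None  # children; both None for a leaf

def build_tree(class_centroids):
    """Recursively split activities into two groups until each leaf
    holds one activity. class_centroids: list of (name, centroid)."""
    node = Node(name for name, _ in class_centroids)
    if len(class_centroids) > 1:
        ordered = sorted(class_centroids, key=lambda kv: kv[1])
        mid = len(ordered) // 2
        node.left = build_tree(ordered[:mid])   # a binary SVM would
        node.right = build_tree(ordered[mid:])  # separate left vs right
    return node

def count_internal(node):
    # one binary SVM per internal node: n classes need only n - 1 SVMs,
    # versus n classifiers for one-against-the-rest
    if node.left is None:
        return 0
    return 1 + count_internal(node.left) + count_internal(node.right)
```

The efficiency claim falls out of the structure: classifying a sample visits at most one SVM per tree level, roughly log2(n) evaluations instead of n.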
Abstract:
It is recognised that patients with chronic disease are unable to remember correctly the information provided by health care professionals. The teach-back method is acknowledged as a technique to improve patients' understanding, yet it is not used in nursing practice in Vietnam. Objectives This study sought to examine background knowledge of heart failure among cardiac nurses and to introduce education about heart failure self-management and the teach-back method to assist in teaching patients about self-care. The study also aimed to explore whether a short educational session could improve nurses' knowledge so that they would be qualified to deliver education to patients. Methods A pre/post-test design was employed. Cardiac nurses from 3 hospitals (Vietnam National Heart Institute, E Hospital, Huu Nghi Hospital) were invited to attend a six-hour educational session which covered both the teach-back method and heart failure self-management. Role-play with scenarios was used to reinforce the educational content. The Dutch Heart Failure Knowledge Scale was used to assess nurses' knowledge of heart failure at baseline and after the educational session. Results 20 nurses from the 3 selected hospitals participated. Average age was 34.5±7.9 years and years of nursing experience were 11.6±8.3. The heart failure knowledge score was 12.7±1.2 at baseline and 13.8±1.0 post education. Nurses' knowledge was deficient regarding fluid restriction for people with heart failure and the causes of worsening heart failure. Heart failure knowledge improved significantly following the workshop (p < 0.001). All nurses achieved an overall adequate knowledge score (≥11 of the maximum 15) at the end. 100% of nurses agreed that the teach-back method was effective and could be used to educate patients about heart failure self-management. Conclusions The results of this study have shown the effectiveness of the pilot education in increasing nurses' knowledge of heart failure.
The teach-back method was accepted by Vietnamese nurses for use in routine cardiac practice.
Abstract:
Background The objective is to estimate the incremental cost-effectiveness of the Australian National Hand Hygiene Initiative implemented between 2009 and 2012, using healthcare-associated Staphylococcus aureus bacteraemia as the outcome. Baseline comparators are the eight existing state and territory hand hygiene programmes. The setting is the Australian public healthcare system, and 1,294,656 admissions from the 50 largest Australian hospitals are included. Methods The design is a cost-effectiveness modelling study using a before-and-after quasi-experimental design. The primary outcome is cost per life year saved from reduced cases of healthcare-associated Staphylococcus aureus bacteraemia, with cost estimated as the annual on-going maintenance costs less the costs saved from fewer infections. Data were harvested from existing sources or were collected prospectively, and the time horizon for the model was 12 months, 2011–2012. Findings No useable pre-implementation Staphylococcus aureus bacteraemia data were made available from the 11 study hospitals in Victoria or the single hospital in the Northern Territory, leaving 38 hospitals among six states and territories available for cost-effectiveness analyses. Total annual costs increased by $2,851,475 for a return of 96 years of life, giving an incremental cost-effectiveness ratio (ICER) of $29,700 per life year gained. Probabilistic sensitivity analysis revealed a 100% chance the initiative was cost-effective in the Australian Capital Territory and Queensland, with ICERs of $1,030 and $8,988 respectively. There was an 81% chance it was cost-effective in New South Wales with an ICER of $33,353, a 26% chance for South Australia with an ICER of $64,729 and a 1% chance for Tasmania and Western Australia. The 12 hospitals in Victoria and the Northern Territory incur annual on-going maintenance costs of $1.51M; no information was available to describe cost savings or health benefits.
Conclusions The Australian National Hand Hygiene Initiative was cost-effective against an Australian threshold of $42,000 per life year gained. The return on investment varied among the states and territories of Australia.
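The headline figure can be reproduced directly from the numbers in the abstract: an ICER is simply the incremental cost divided by the incremental health gain.

```python
def icer(incremental_cost, life_years_gained):
    # incremental cost-effectiveness ratio: dollars per life year gained
    return incremental_cost / life_years_gained

# figures reported in the abstract: $2,851,475 of added cost for 96 life years
national_icer = icer(2_851_475, 96)  # ~ $29,700 per life year
```

Comparing `national_icer` against the stated $42,000 threshold reproduces the cost-effectiveness conclusion.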
Abstract:
With the aim of increasing peanut production in Australia, the Australian peanut industry has recently considered growing peanuts in rotation with maize at Katherine in the Northern Territory, a location with a semi-arid tropical climate and surplus irrigation capacity. We used the well-validated APSIM model to examine potential agronomic benefits and long-term risks of this strategy under the current and warmer climates of the new region. Yield of the two crops, irrigation requirement, total soil organic carbon (SOC), nitrogen (N) losses and greenhouse gas (GHG) emissions were simulated. Sixteen climate stressors were used; these were generated by using global climate models ECHAM5, GFDL2.1, GFDL2.0 and MRIGCM232 with a median sensitivity under two Special Report on Emissions Scenarios over the 2030 and 2050 timeframes, plus the current climate (baseline) for Katherine. Effects were compared at three levels of irrigation and three levels of N fertiliser applied to maize grown in rotations of wet-season peanut and dry-season maize (WPDM), and wet-season maize and dry-season peanut (WMDP). The climate stressors projected average temperature increases of 1°C to 2.8°C in the dry (baseline 24.4°C) and wet (baseline 29.5°C) seasons for the 2030 and 2050 timeframes, respectively. Increased temperature caused a reduction in yield of both crops in both rotations. However, the overall yield advantage of WPDM increased from 41% to up to 53% compared with the industry-preferred sequence of WMDP under the worst climate projection. Increased temperature increased the irrigation requirement by up to 11% in WPDM, but caused a smaller reduction in total SOC accumulation and smaller increases in N losses and GHG emissions compared with WMDP.
We conclude that although increased temperature will reduce productivity and total SOC accumulation, and increase N losses and GHG emissions in Katherine or similar northern Australian environments, the WPDM sequence should be preferable over the industry-preferred sequence because of its overall yield and sustainability advantages in warmer climates. Any limitations of irrigation resulting from climate change could, however, limit these advantages.
Abstract:
Objectives: 1. Estimate population parameters required for a management model. These include survival, density, age structure, growth, age and size at maturity and at recruitment to the adult eel fishery. Estimate their variability among individuals in a range of habitats. 2. Develop a management population dynamics model and use it to investigate management options. 3. Establish baseline data and sustainability indicators for long-term monitoring. 4. Assess the applicability of the above techniques to other eel fisheries in Australia, in collaboration with NSW. Distribute developed tools via the Australia and New Zealand Eel Reference Group.
Abstract:
Background Malnutrition and unintentional weight loss are major clinical issues in people with dementia living in residential aged care facilities (RACFs) and are associated with serious adverse outcomes. However, evidence regarding effective interventions is limited and strategies to improve the nutritional status of this population are required. This presentation describes the implementation and results of a pilot randomised controlled trial of a multi-component intervention for improving the nutritional status of RACF residents with dementia. Method Fifteen residents with moderate-severe dementia living in a secure long-term RACF participated in a five-week pilot study. Participants were randomly allocated to either an Intervention (n=8) or Control group (n=7). The intervention comprised four elements delivered in a separate dining room at lunch and dinner: the systematic reinforcement of residents' eating behaviors using a specific communication protocol; family-style dining; high-ambiance table presentation; and routine Dietary-Nutrition Champion supervision. Control group participants ate their meals according to the facility's standard practice. Baseline and follow-up assessments of nutritional status, food consumption, and body mass index were obtained by qualified nutritionists. Additional assessments included measures of cognitive functioning, mealtime agitation, depression, wandering status and multiple measures of intervention fidelity. Results No participant was malnourished at study commencement and participants in both groups gained weight from baseline to follow-up, which was not significantly different between groups (t=0.43; p=0.67). A high degree of treatment fidelity was evident throughout the intervention. Qualitative data from staff indicate the intervention was perceived to be beneficial for residents. Conclusions This multi-component nutritional intervention was well received and was feasible in the RACF setting.
Participants’ sound nutritional status at baseline likely accounts for the lack of an intervention effect. Further research using this protocol in malnourished residents is recommended. For success, a collaborative approach between researchers and facility staff, particularly dietary staff, is essential.
Abstract:
Symposium co-ordinated by The International Network for Food and Obesity/NCDs Research, Monitoring and Action Support (INFORMAS) Purpose Global monitoring of the price and affordability of foods, meals and diets is urgently needed. There are major methodological challenges in developing robust, cost-effective, standardized, and policy-relevant tools pertinent to nutrition, obesity, and diet-related non-communicable diseases and their inequalities. There is increasing pressure to take environmental sustainability into account. Changes in price differentials and affordability need to be comparable between and within countries and over time. Robust tools could provide baseline data for monitoring and evaluating structural, economic and social policies at the country/regional and household levels. INFORMAS offers one framework for consideration.
Abstract:
PURPOSE The purpose of this study was to examine the relationship between objectively measured ambient light exposure and longitudinal changes in axial eye growth in childhood. METHODS A total of 101 children (41 myopes and 60 nonmyopes), 10 to 15 years of age, participated in this prospective longitudinal observational study. Axial eye growth was determined from measurements of ocular optical biometry collected at four study visits over an 18-month period. Each child's mean daily light exposure was derived from two periods (each 14 days long) of objective light exposure measurements from a wrist-worn light sensor. RESULTS Over the 18-month study period, a modest but statistically significant association between greater average daily light exposure and slower axial eye growth was observed (P = 0.047). Other significant predictors of axial eye growth in this population included children's refractive error group (P < 0.001), sex (P < 0.01), and age (P < 0.001). Categorized according to their objectively measured average daily light exposure and adjusting for potential confounders (age, sex, baseline axial length, parental myopia, near work, and physical activity), children experiencing low average daily light exposure (mean daily light exposure: 459 ± 117 lux; annual eye growth: 0.13 mm/y) exhibited significantly greater eye growth than children experiencing moderate (842 ± 109 lux, 0.060 mm/y) and high (1455 ± 317 lux, 0.065 mm/y) average daily light exposure levels (P = 0.01). CONCLUSIONS In this population of children, greater daily light exposure was associated with less axial eye growth over an 18-month period. These findings support the role of light exposure in the documented association between time spent outdoors and childhood myopia.
Abstract:
While mobile phones have become ubiquitous in modern society, the use of mobile phones while driving is increasing at an alarming rate despite the associated crash risks. A significant safety concern is that driving while distracted by a mobile phone is more prevalent among young drivers, a less experienced driving cohort with elevated crash risk. The objective of this study was to examine the gap acceptance behavior of distracted young drivers at roundabouts. The CARRS-Q Advanced Driving Simulator was used to test participants on a simulated gap acceptance scenario at roundabouts. Conflicting traffic from the right approach of a four-legged roundabout was programmed as a series of vehicles with the gaps between them proportionately increased from two to six seconds. Thirty-two licensed young drivers drove the simulator under three phone conditions: baseline (no phone conversation), hands-free and handheld phone conversations. Results show that distracted drivers started responding to the gap acceptance scenario at a distance closer to the roundabout and approached the roundabout at slower speeds. They also decelerated at faster rates to reduce their speeds prior to gap acceptance compared to non-distracted drivers. Although accepted gap sizes were not significantly different across phone conditions, differences in the safety margins at various gap sizes, measured by Post Encroachment Time (PET) between the driven vehicle and the conflicting vehicle, were statistically significant across phone conditions. PETs for distracted drivers were smaller across different gap sizes, suggesting a lower safety margin taken by distracted drivers compared to non-distracted drivers. The results aid in understanding how cognitive distraction resulting from mobile phone conversations while driving influences driving behavior during gap acceptance at roundabouts.
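The safety-margin measure used above can be sketched directly: PET is the elapsed time between the first road user leaving the conflict area and the second entering it, so a smaller PET means a slimmer margin. The gap series is reconstructed from the two-to-six-second description; the one-second increment is an assumption, as only the range is stated:

```python
def post_encroachment_time(t_first_exits, t_second_enters):
    # PET (s): time between the driven vehicle leaving the conflict area
    # and the conflicting vehicle entering it; smaller = lower safety margin
    return t_second_enters - t_first_exits

def programmed_gaps(start=2.0, stop=6.0, step=1.0):
    # gap sizes between successive conflicting vehicles; the 1 s step is
    # an assumption, only the 2-6 s range appears in the text
    gaps, g = [], start
    while g <= stop:
        gaps.append(g)
        g += step
    return gaps
```

In the study's terms, comparing PET distributions across the baseline, hands-free and handheld conditions at each programmed gap size is what revealed the reduced margins for distracted drivers.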
Abstract:
- Introduction Heat-based training (HT) is becoming increasingly popular as a means of inducing acclimation before athletic competition in hot conditions and/or to augment the training impulse beyond that achieved in thermo-neutral conditions. Importantly, the effects of HT on regenerative processes such as sleep, and its interactions with common recovery interventions, remain unknown. This study aimed to examine sleep characteristics during five consecutive days of training in the heat, with the inclusion of cold-water immersion (CWI), compared to baseline sleep patterns. - Methods Thirty recreationally trained males completed HT in 32 ± 1 °C and 60% rh for five consecutive days. Conditions included: 1) 90 min cycling at 40% of power at VO2max (Pmax) (90CONT; n = 10); 2) 90 min cycling at 40% Pmax with 20 min CWI (14 ± 1 °C; 90CWI; n = 10); and 3) 30 min cycling alternating between 40 and 70% Pmax every 3 min, with no recovery intervention (30HIT; n = 10). Sleep quality and quantity were assessed during HT and four nights of 'baseline' sleep (BASE). Actigraphy provided measures of time in and out of bed, sleep latency, efficiency, total time in bed and total time asleep, wake after sleep onset, number of awakenings, and awakening duration. Subjective ratings of sleep were also recorded using a 1-5 Likert scale. Repeated measures analysis of variance (ANOVA) was completed to determine the effects of time and condition on sleep quality and quantity. Cohen's d effect sizes were also applied to determine the magnitude of and trends in the data. - Results Sleep latency, efficiency, total time in bed and number of awakenings were not significantly different between BASE and HT (P > 0.05). However, total time asleep was significantly reduced (P = 0.01; d = 1.46) and the duration of wakefulness periods after sleep onset was significantly greater during HT compared with BASE (P = 0.001; d = 1.14).
Comparison between training groups showed that latency was significantly higher for the 30HIT group compared to 90CONT (P = 0.02; d = 1.33). Nevertheless, there were no differences between training groups for sleep efficiency, total time in bed or asleep, wake after sleep onset, number of awakenings or awake duration (P > 0.05). Further, cold-water immersion recovery had no significant effect on sleep characteristics (P > 0.05). - Discussion Sleep plays an important role in athletic recovery and has previously been shown to be influenced by both exercise training and thermal strain. The present data highlight the effect of HT on reduced sleep quality, specifically reduced total time asleep due to longer periods awake after sleep onset. Importantly, although cold-water recovery accelerates the removal of thermal load, this intervention did not blunt the negative effects of HT on sleep characteristics. - Conclusion Training in hot conditions may reduce both sleep quantity and quality, and this should be taken into consideration when administering this training intervention in the field.
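The effect sizes quoted above are Cohen's d values; in the common pooled-standard-deviation form they can be computed as below. The inputs here are illustrative, since the study's raw sleep data are not given:

```python
import math

def cohens_d(a, b):
    # Cohen's d with pooled standard deviation (sample variances, n - 1)
    na, nb = len(a), len(b)
    mean_a, mean_b = sum(a) / na, sum(b) / nb
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b)
                          / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd
```

By the usual convention, the reported values of d = 1.14-1.46 would all be considered large effects (|d| > 0.8).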