50 results for RISK OPTIMIZATION
Abstract:
The off-site transport of agricultural chemicals, such as herbicides, into freshwater and marine ecosystems is a worldwide concern. The adoption of farm management practices that minimise herbicide transport in rainfall-runoff is a priority for the Australian sugarcane industry, particularly in the coastal catchments draining into the World Heritage listed Great Barrier Reef (GBR) lagoon. In this study, residual herbicide runoff and infiltration were measured using a rainfall simulator in a replicated trial on a brown Chromosol with 90–100% cane trash blanket cover in the Mackay Whitsunday region, Queensland. Management treatments included conventional 1.5 m spaced sugarcane beds with a single row of sugarcane (CONV) and 2 m spaced, controlled traffic sugarcane beds with dual sugarcane rows (0.8 m apart) (2mCT). The aim was to simulate the first rainfall event after the application of the photosystem II inhibiting (PSII) herbicides ametryn, atrazine, diuron and hexazinone, by broadcast (100% coverage, on bed and furrow) and banded (50–60% coverage, on bed only) methods. These events comprised heavy rainfall 1 day after herbicide application, considered a worst-case scenario, or rainfall 21 days after application. The 2mCT rows had significantly (P < 0.05) less runoff (38%) and lower peak runoff rates (43%) than CONV rows for an average rainfall of 93 mm at 100 mm h−1 (1:20 yr Average Return Interval). Additionally, final infiltration rates were higher in 2mCT rows than CONV rows, at 72 and 52 mm h−1 respectively. This resulted in load reductions of 60, 55, 47 and 48% for ametryn, atrazine, diuron and hexazinone from 2mCT rows, respectively. Herbicide losses in runoff were also reduced by 32–42% when applications were banded rather than broadcast. When rainfall occurred 1 day after application, a large percentage of the herbicides was washed off the cane trash. However, by day 21, concentrations of herbicide residues on cane trash were lower and more resistant to washoff, resulting in lower losses in runoff. Consequently, ametryn and atrazine event mean concentrations in runoff were approximately 8-fold lower at day 21 than at day 1, whilst diuron and hexazinone were only 1.6–1.9-fold lower, suggesting longer persistence of these chemicals. Runoff collected at the end of the paddock in natural rainfall events showed treatment differences consistent with, though smaller than, those in the rainfall simulation study. Overall, it was the combination of early application, banding and controlled traffic that was most effective in reducing herbicide losses in runoff.
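For readers unfamiliar with the terms, the sketch below shows how an event herbicide load and an event mean concentration (EMC) are typically derived from paired runoff-rate and concentration measurements. The sampling interval, units and values are illustrative assumptions, not the study's data.

```python
# Minimal sketch: event load and event mean concentration (EMC) from paired
# flow and concentration samples. All values below are hypothetical.
import numpy as np

dt_h = 5 / 60                                                      # 5-min sampling interval (h)
flow_l_per_h = np.array([0, 40, 120, 180, 150, 90, 30, 5], float)  # runoff rate (L/h)
conc_ug_per_l = np.array([0, 85, 60, 48, 40, 35, 30, 28], float)   # e.g. atrazine (ug/L)

volume_l = np.sum(flow_l_per_h * dt_h)                  # total event runoff volume (L)
load_ug = np.sum(flow_l_per_h * conc_ug_per_l * dt_h)   # total event load (ug)
emc = load_ug / volume_l                                # EMC = load / volume (ug/L)

print(f"runoff = {volume_l:.1f} L, load = {load_ug:.0f} ug, EMC = {emc:.1f} ug/L")
```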
Abstract:
This study examines the application of digital ecosystem concepts to a biological ecosystem simulation problem. The problem involves using one digital ecosystem agent to optimize the accuracy of a second digital ecosystem agent, the biological ecosystem simulation. The study also incorporates social ecosystems, with a technological solution design subsystem communicating with a science subsystem and a simulation software developer subsystem to determine key characteristics of the biological ecosystem simulation. The findings show similarities between the issues involved in digital ecosystem collaboration and those occurring when digital ecosystems interact with biological ecosystems. The results also suggest that even precise semantic descriptions and comprehensive ontologies may be insufficient to describe agents in enough detail for use within digital ecosystems, and a number of solutions to this problem are proposed.
Abstract:
Hendra virus is a highly pathogenic novel paramyxovirus causing sporadic fatal infection in horses and humans in Australia. Species of fruit-bats (genus Pteropus), commonly known as flying-foxes, are the natural host of the virus. We undertook a survey of horse owners in the states of Queensland and New South Wales, Australia, to assess the level of adoption of recommended risk management strategies and to identify impediments to adoption. Survey questionnaires were completed by 1431 respondents from the target states, and from a spectrum of industry sectors. Hendra virus knowledge varied with sector, but was generally limited, with only 13% of respondents rating their level of knowledge as high or very high. The majority of respondents (63%) had seen their state’s Hendra virus information for horse owners, and a similar proportion found the information useful. Fifty-six percent of respondents thought it moderately, very or extremely likely that a Hendra virus case could occur in their area, yet only 37% said they would consider Hendra virus if their horse was sick. Only 13% of respondents stabled their horses overnight, although another 24% said it would be easy or very easy to do so but had not done so. Only 13% and 15% of respondents, respectively, had horse feed bins and water points under solid cover. Responses varied significantly with state, likely reflecting the states’ different Hendra virus histories. The survey identified inconsistent awareness and/or adoption of available knowledge, confusion in relation to Hendra virus risk perception, with both over- and under-estimation of true risk, and a lag in the uptake of recommended risk minimisation strategies, even when these were readily implementable. However, we also identified frustration and potential alienation among horse owners who found the recommended strategies impractical, onerous and prohibitively expensive. The insights gained from this survey have broader application to other complex risk-management scenarios.
Abstract:
In irrigated cropping, as in any other industry, profit and risk are inter-dependent. An increase in profit would normally coincide with an increase in risk, which means that risk can be traded for profit. It is desirable to manage a farm so that it achieves the maximum possible profit for the desired level of risk. This paper identifies risk-efficient cropping strategies that allocate land and water between crop enterprises for a case study of an irrigated farm in southern Queensland, Australia. This is achieved by applying stochastic frontier analysis to the output of a simulation experiment. The simulation experiment involved changes to the levels of business risk by systematically varying the crop sowing rules in a bioeconomic model of the case study farm. This model utilises the multi-field capability of the process-based Agricultural Production Systems Simulator (APSIM) and is parameterised using data collected from interviews with a collaborating farmer. We found that sowing rules that increased the farm area sown to cotton caused the greatest increase in risk-efficiency. Increasing maize area also improved risk-efficiency, but to a lesser extent than cotton. Sowing rules that increased the area sown to wheat reduced the risk-efficiency of the farm business. Sowing rules were identified that had the potential to improve expected farm profit by ca. $50,000 annually without significantly increasing risk. The concept of the shadow price of risk is discussed, and an expression is derived from the estimated frontier equation that quantifies the trade-off between profit and risk.
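As a rough illustration of the frontier idea, the sketch below fits a quadratic profit-risk frontier to hypothetical simulation output and differentiates it to obtain a shadow price of risk. The quadratic form, the OLS approximation (in place of a full stochastic frontier estimator) and all values are assumptions for illustration, not the paper's estimates.

```python
# Minimal sketch of the profit-risk frontier idea (not the paper's actual
# estimator): approximate the frontier with a quadratic fit, then take its
# derivative as a "shadow price of risk". All names/values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulation output: one (risk, profit) point per sowing strategy.
risk = rng.uniform(10_000, 80_000, size=200)             # profit std dev ($)
profit = (150_000 + 2.0 * risk - 1.2e-5 * risk**2
          - rng.exponential(20_000, size=200))           # mean profit ($/yr)

# Fit E[profit] = b0 + b1*risk + b2*risk^2 (quadratic approximation).
b2, b1, b0 = np.polyfit(risk, profit, deg=2)

def shadow_price_of_risk(sigma):
    """Marginal profit gained per extra dollar of risk: dE[profit]/d(risk)."""
    return b1 + 2.0 * b2 * sigma

print(f"frontier: {b0:.0f} + {b1:.3f}*risk + {b2:.3e}*risk^2")
print(f"shadow price at risk=$40,000: {shadow_price_of_risk(40_000):.3f}")
```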
Abstract:
For many years, Australian forest pathologists and other scientists have dreaded the arrival in Australia of the rust fungus Puccinia psidii, commonly known as Myrtle Rust. The pathogen eventually did arrive, and was first detected in New South Wales in 2010 on Willow Myrtle (Agonis flexuosa). It is generally accepted that it entered the country on an ornamental Myrtales host brought in by a private nursery. Despite efforts to eradicate the invasive rust, it has already spread widely, now occurring along the east coast of Australia, from temperate areas in Victoria and southern New South Wales to tropical areas in north Queensland.
Abstract:
Hendra virus causes sporadic but typically fatal infection in horses and humans in eastern Australia. Fruit-bats of the genus Pteropus (commonly known as flying-foxes) are the natural host of the virus, and the putative source of infection in horses; infected horses are the source of human infection. Effective treatment is lacking in both horses and humans, and notwithstanding the recent availability of a vaccine for horses, exposure risk mitigation remains an important infection control strategy. This study sought to inform risk mitigation by identifying spatial and environmental risk factors for equine infection, using multiple analytical approaches to investigate the relationship between plausible variables and reported Hendra virus infection in horses. Spatial autocorrelation (Global Moran’s I) showed significant clustering of equine cases at a distance of 40 km, a distance consistent with the foraging ‘footprint’ of a flying-fox roost, suggesting the latter as a biologically plausible basis for the clustering. Getis-Ord Gi* analysis identified multiple equine infection hot spots along the eastern Australian coast from far north Queensland to central New South Wales, with the largest extending for nearly 300 km from southern Queensland to northern New South Wales. Geographically weighted regression (GWR) showed the density of P. alecto and P. conspicillatus to have the strongest positive correlation with equine case locations, suggesting that these species are more likely sources of Hendra virus infection for horses than P. poliocephalus or P. scapulatus. The density of horses, climate variables and vegetation variables were not found to be significant risk factors, but the residuals from the GWR suggest that additional unidentified risk factors exist at the property level. Further investigations and comparisons between case and control properties are needed to identify these local risk factors.
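For readers unfamiliar with the statistic, the self-contained sketch below computes Global Moran's I over a fixed distance band (40 km, as in the study) with binary weights. The coordinates, case counts and weighting scheme are illustrative assumptions, not the paper's data or exact implementation.

```python
# Global Moran's I with a binary distance-band weight matrix:
# I = (n / W) * sum_ij w_ij (x_i - xbar)(x_j - xbar) / sum_i (x_i - xbar)^2
import numpy as np

def morans_i(values, coords, band_km):
    """Moran's I with binary weights: w_ij = 1 if 0 < d_ij <= band_km."""
    x = np.asarray(values, dtype=float)
    xy = np.asarray(coords, dtype=float)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)  # pairwise km
    w = ((d > 0) & (d <= band_km)).astype(float)                 # binary weights
    z = x - x.mean()
    n, W = len(x), w.sum()
    return (n / W) * (z @ w @ z) / (z @ z)

# Hypothetical case counts at property locations (km grid coordinates),
# with cases deliberately clustered in the west of the region.
rng = np.random.default_rng(1)
coords = rng.uniform(0, 300, size=(100, 2))
cases = rng.poisson(1.0 + (coords[:, 0] < 100), size=100)

print(f"Moran's I at 40 km band: {morans_i(cases, coords, band_km=40):.3f}")
```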
Abstract:
Background and Aim: The etiology of Crohn's disease (CD) implicates both genetic and environmental factors. Smoking behavior is one environmental risk factor known to play a role in the development of CD. The study aimed to assess the contribution of the interleukin 23 receptor gene (IL23R) in determining disease susceptibility in two independent cohorts of CD, and to investigate the interactions between IL23R variants, smoking behavior, and the CD-associated genes NOD2 and ATG16L1. Methods: Ten IL23R single-nucleotide polymorphisms (SNPs) were genotyped in 675 CD cases and 1255 controls from Brisbane, Australia (dataset 1). Six of these SNPs were genotyped in 318 CD cases and 533 controls from Canterbury, New Zealand (dataset 2). Case–control analyses of genotype and allele frequencies, and haplotype analysis for all SNPs, were conducted. Results: We demonstrate a strongly increased CD risk for smokers in both datasets (odds ratio 3.77, 95% confidence interval 2.88–4.94), and an additive interaction between IL23R SNPs and cigarette smoking. Ileal involvement was a consistent marker of strong SNP–CD association (P ≤ 0.001), while the lowest minor allele frequencies by disease location were found in those with colonic CD (L2). Three haplotype blocks were identified across the 10 IL23R SNPs, conferring different risks of CD. Haplotypes conferred no further risk of CD when compared with single-SNP analyses. Conclusion: IL23R gene variants determine CD susceptibility in the Australian and New Zealand populations, particularly for ileal CD. A strong additive interaction exists between IL23R SNPs and smoking behavior, resulting in a dramatic increase in disease risk depending upon specific genetic background.
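As context for the headline result (OR 3.77, 95% CI 2.88–4.94 for smoking and CD), here is a minimal sketch of how an odds ratio and Wald-type confidence interval are computed from a 2×2 exposure table. The counts below are hypothetical, not the study's data.

```python
# Odds ratio and Wald 95% CI from a 2x2 case-control table (hypothetical counts).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a=exposed cases, b=exposed controls, c=unexposed cases, d=unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: smokers vs non-smokers among CD cases and controls.
print("OR=%.2f, 95%% CI %.2f-%.2f" % odds_ratio_ci(300, 200, 375, 1055))
```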
Abstract:
Puccinia psidii, the causal agent of myrtle rust, was first recorded in Latin America more than 100 years ago. It occurs on many native species of Myrtaceae in Latin America and also infects non-native, plantation-grown Eucalyptus species in the region. The pathogen has gradually spread to new areas, including Australia and, most recently, South Africa. The aim of this study was to assess the susceptibility of selected Eucalyptus genotypes, particularly those of interest to South African forestry, to infection by P. psidii. In addition, risk maps were compiled based on suitable climatic conditions and the occurrence of potentially susceptible tree species. This made it possible to identify the season when P. psidii would be most likely to infect hosts and to define the geographic areas where the rust disease would be most likely to establish in South Africa. As expected, variation in susceptibility was observed between the eucalypt genotypes tested. Importantly, species commonly planted in South Africa show good potential for yielding disease-tolerant material for future planting. Myrtle rust is predicted to be more common in spring and summer. Coastal areas, as well as areas in South Africa with subtropical climates, are more conducive to outbreaks of the pathogen.
Abstract:
The objectives of this study can be broadly categorised as follows:
- to evaluate current practices adopted (e.g. litter pile-up) prior to re-use of litter for subsequent chicken cycles;
- to establish the pathogen die-off that occurs during currently adopted methods of in-shed treatment of litter;
- to establish simple physical parameters for monitoring this pathogen reduction, and to build an understanding of such reduction strategies to aid in-shed management of re-used litter;
- to assess the potential of the re-used litter (once spread) to support pathogens during a typical chicken production cycle; and
- to provide background data for the development of a simple code of practice for an in-shed litter pile-up process.
Abstract:
Bats of the genus Pteropus (flying-foxes) are the natural host of Hendra virus (HeV), which periodically causes fatal disease in horses and humans in Australia. The increased urban presence of flying-foxes often provokes negative community sentiment because of reduced social amenity and concerns about HeV exposure risk, and has resulted in calls for the dispersal of urban flying-fox roosts. However, it has been hypothesised that disturbance of urban roosts may result in a stress-mediated increase in HeV infection in flying-foxes, and an increased spillover risk. We sought to examine the impact of roost modification and dispersal on HeV infection dynamics and cortisol concentration dynamics in flying-foxes. The data were analysed in generalised linear mixed models using restricted maximum likelihood (REML). The difference in mean HeV prevalence in samples collected before (4.9%), during (4.7%) and after (3.4%) roost disturbance was small and non-significant (P = 0.440). Similarly, the difference in mean urine specific gravity-corrected urinary cortisol concentrations was small and non-significant (before = 22.71 ng/mL, during = 27.17 ng/mL, after = 18.39 ng/mL) (P = 0.550). We did find an underlying association between cortisol concentration and season, and between cortisol concentration and region, suggesting that other (plausibly biological or environmental) variables play a role in cortisol concentration dynamics. The effect of roost disturbance on cortisol concentration approached statistical significance for region, suggesting that the relationship is not fixed, and plausibly reflecting the nature and timing of disturbance. We also found a small positive statistical association between HeV excretion status and urinary cortisol concentration. Finally, we found that the level of flying-fox distress associated with roost disturbance reflected the nature and timing of the activity, highlighting the need for a ‘best practice’ approach to dispersal or roost modification activities. The findings usefully inform public discussion and policy development in relation to Hendra virus and flying-fox management.
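The sketch below shows one way the kind of mixed-model analysis described might look: urinary cortisol modelled against disturbance phase, season and region, with roost as a random effect, fitted by REML. Column names, factor levels and the synthetic data are illustrative assumptions; the abstract does not give the paper's actual model specification.

```python
# A hedged sketch of a linear mixed model of cortisol vs disturbance phase,
# with roost as a random intercept, fitted by REML (as in the study).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "phase": rng.choice(["before", "during", "after"], size=n),
    "season": rng.choice(["wet", "dry"], size=n),
    "region": rng.choice(["SEQ", "NNSW"], size=n),
    "roost": rng.choice([f"roost_{i}" for i in range(8)], size=n),
})
# Synthetic cortisol (ng/mL) with a roost-level random intercept.
roost_effect = {r: rng.normal(0, 4) for r in df["roost"].unique()}
df["cortisol"] = 22 + df["roost"].map(roost_effect) + rng.lognormal(0, 0.5, size=n)

model = smf.mixedlm("cortisol ~ phase + season + region", df, groups=df["roost"])
result = model.fit(reml=True)   # restricted maximum likelihood
print(result.summary())
```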
Abstract:
Pteropid bats or flying-foxes (Chiroptera: Pteropodidae) are the natural host of Hendra virus (HeV), which sporadically causes fatal disease in horses and humans in eastern Australia. While there is strong evidence that urine is an important infectious medium that likely drives bat-to-bat and bat-to-horse transmission, there is uncertainty about the relative importance of alternative routes of excretion such as nasal and oral secretions, and faeces. Identifying the potential routes of HeV excretion in flying-foxes is important for effectively mitigating equine exposure risk at the bat-horse interface, and for determining transmission rates in host-pathogen models. The aim of this study was to identify the major routes of HeV excretion in naturally infected flying-foxes, and secondarily, to identify between-species variation in excretion prevalence. A total of 2840 flying-foxes from three of the four Australian mainland species (Pteropus alecto, P. poliocephalus and P. scapulatus) were captured and sampled at multiple roost locations in the eastern states of Queensland and New South Wales between 2012 and 2014. A range of biological samples (urine and serum, and urogenital, nasal, oral and rectal swabs) was collected from anaesthetised bats and tested for HeV RNA using a qRT-PCR assay targeting the M gene. Forty-two of the 1410 P. alecto sampled had HeV RNA detected in at least one sample, yielding a total of 78 positive samples, an overall detection rate of 1.76% across all samples tested in this species (78/4436). The rate of detection, and the amount of viral RNA, was highest in urine samples (urine > serum and packed haemocytes > faecal > nasal > oral), identifying urine as the most plausible source of infection for flying-foxes and for horses. Detection in a urine sample was more efficient than detection in urogenital swabs, identifying the former as the preferred diagnostic sample. The detection of HeV RNA in serum is consistent with haematogenous spread, and with hypothesised latency and recrudescence in flying-foxes. There were no detections in P. poliocephalus (n = 1168 animals; n = 2958 samples) or P. scapulatus (n = 262 animals; n = 985 samples), suggesting (consistent with other recent studies) that these species are epidemiologically less important than P. alecto in HeV infection dynamics. The study is unprecedented in terms of its individual-animal approach, its large sample size, and its use of a molecular assay to directly determine infection status. These features provide a high level of confidence in the veracity of our findings, and a sound basis from which to more precisely target equine risk mitigation strategies.
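As a quick arithmetic check on the reported rate, the sketch below recomputes 78/4436 ≈ 1.76% and attaches a Wilson 95% interval. The interval is our illustrative addition; only the rate itself is reported in the abstract.

```python
# Verify the reported detection rate (78 positives / 4436 P. alecto samples)
# and attach a Wilson score 95% CI for the binomial proportion.
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

k, n = 78, 4436
lo, hi = wilson_ci(k, n)
print(f"detection rate = {k/n:.2%} (95% CI {lo:.2%}-{hi:.2%})")
```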
Abstract:
Aflatoxin is a potent carcinogen produced by Aspergillus flavus, which frequently contaminates maize (Zea mays L.) in the field between 40° north and 40° south latitudes. A mechanistic model to predict the risk of pre-harvest contamination could assist in the management of this very harmful mycotoxin. In this study we describe an aflatoxin risk prediction model that is integrated with the Agricultural Production Systems Simulator (APSIM) modelling framework. The model computes a temperature function for A. flavus growth and aflatoxin production using a set of three cardinal temperatures determined in the laboratory using culture medium and intact grains. These cardinal temperatures were 11.5 °C (base), 32.5 °C (optimum) and 42.5 °C (maximum). The model used a low (≤0.2) crop water supply to demand ratio, an index of drought during the grain-filling stage, to simulate the maize crop's susceptibility to A. flavus growth and aflatoxin production. When this low threshold of the index was reached, the model converted the temperature function into an aflatoxin risk index (ARI) to represent the risk of aflatoxin contamination. The model was applied to simulate ARI for two commercial maize hybrids, H513 and H614D, grown in five multi-location field trials in Kenya, using site-specific agronomy, weather and soil parameters. The observed mean aflatoxin contamination in these trials varied from <1 to 7143 ppb. The ARI simulated by the model explained 99% of the variation (p ≤ 0.001) in a linear relationship with the mean observed aflatoxin contamination. The strong relationship between ARI and aflatoxin contamination suggests that the model could be applied to map risk-prone areas and to monitor in-season risk for genotypes and soils parameterised for APSIM.
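The abstract specifies the three cardinal temperatures and the drought threshold but not the functional form of the temperature response. The sketch below uses a common beta-style cardinal-temperature function and accumulates it into a simple risk index on drought-stressed days; the functional form, daily time step and all data are assumptions for illustration, not the paper's implementation.

```python
# A hedged sketch of a cardinal-temperature response and a drought-gated
# aflatoxin risk index (ARI). Only the cardinal temperatures and the <=0.2
# supply:demand threshold come from the abstract; the rest is assumed.
import numpy as np

T_BASE, T_OPT, T_MAX = 11.5, 32.5, 42.5   # cardinal temperatures (deg C)

def temp_function(t):
    """Relative suitability (0-1) for A. flavus growth and aflatoxin
    production; beta-style response peaking at T_OPT (assumed form)."""
    t = np.asarray(t, dtype=float)
    alpha = (T_MAX - T_OPT) / (T_OPT - T_BASE)
    f = np.zeros_like(t)
    ok = (t > T_BASE) & (t < T_MAX)
    f[ok] = ((t[ok] - T_BASE) / (T_OPT - T_BASE)) * \
            ((T_MAX - t[ok]) / (T_MAX - T_OPT)) ** alpha
    return np.clip(f, 0.0, 1.0)

def aflatoxin_risk_index(temps, supply_demand, threshold=0.2):
    """Accumulate the temperature function over grain-filling days on which
    the crop water supply:demand ratio indicates drought (<= threshold)."""
    temps = np.asarray(temps, dtype=float)
    sd = np.asarray(supply_demand, dtype=float)
    return float(np.sum(np.where(sd <= threshold, temp_function(temps), 0.0)))

# Hypothetical 30-day grain-filling window drying down into drought.
day_temp = np.linspace(22.0, 36.0, 30)    # mean daily temperature (deg C)
day_sd = np.linspace(0.6, 0.05, 30)       # water supply:demand ratio
print(f"ARI = {aflatoxin_risk_index(day_temp, day_sd):.2f}")
```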