64 results for reference modelling
Abstract:
Increasing resistance to phosphine (PH3) in insect pests, including the lesser grain borer (Rhyzopertha dominica), has become a critical issue, and the development of effective and sustainable strategies to manage resistance is crucial. In practice, the same grain store may be fumigated multiple times, but usually for the same exposure period and concentration. Simulating a single fumigation allows us to look more closely at the effects of this standard treatment. We used an individual-based, two-locus model to investigate three key questions about the use of phosphine fumigant in relation to the development of PH3 resistance. First, which is more effective for insect control: a long exposure time with a low concentration, or a short exposure period with a high concentration? Our results showed that extending exposure duration is a much more efficient control tactic than increasing the phosphine concentration. Second, how long should the fumigation period be extended to deal with higher frequencies of resistant insects in the grain? Our results indicated that if the original frequency of resistant insects is increased n times, then the fumigation needs to be extended by, at most, n days to achieve the same level of insect control. The third question is how the presence of varying numbers of insects inside grain storages impacts the effectiveness of phosphine fumigation. We found that, for a given fumigation, as the initial population number was increased, the final survival of resistant insects increased proportionally. To control initial populations of insects that were n times larger, it was necessary to increase the fumigation time by about n days. Our results indicate that, in a 2-gene mediated resistance where dilution of resistance gene frequencies through immigration of susceptibles has a greater effect, extending fumigation times to reduce survival of homozygous resistant insects will have a significant impact on delaying the development of resistance.
© 2012 Elsevier Ltd.
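The "n-fold resistance frequency → at most n extra days" relationship reported above can be illustrated with a deliberately simple survival calculation. This is a back-of-envelope sketch, not the individual-based two-locus model the study used; the daily survival probability, threshold and population figures are invented for illustration:

```python
import math

def days_to_control(n_insects, resistant_freq, daily_survival=0.65, threshold=1.0):
    """Days of fumigation needed to push the expected number of surviving
    resistant insects below `threshold`, assuming each resistant insect
    independently survives one day of exposure with probability
    `daily_survival` (susceptibles are assumed to die quickly and are ignored).
    """
    resistant = n_insects * resistant_freq
    # Solve resistant * daily_survival**d < threshold for d.
    return math.ceil(math.log(resistant / threshold) / math.log(1 / daily_survival))

base = days_to_control(1_000_000, 0.001)   # baseline resistant frequency
tenfold = days_to_control(1_000_000, 0.01) # ten times the resistant frequency
extension = tenfold - base
```

Because survivors decline geometrically, an n-fold increase in resistant frequency (or in initial population size) costs only about ln(n)/ln(1/s) extra days of exposure, comfortably within the "at most n days" bound stated in the abstract.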
Abstract:
Remote detection of management-related trend in the presence of inter-annual climatic variability in the rangelands is difficult. Minimally disturbed reference areas provide a useful guide, but suitable benchmarks are usually difficult to identify. We describe a method that uses a unique conceptual framework to identify reference areas from multitemporal sequences of ground cover derived from Landsat TM and ETM+ imagery. The method requires neither ground-based reference sites nor GIS layers describing management. We calculate a minimum ground cover image across all years to identify locations of most persistent ground cover in years of lowest rainfall. We then use a moving window approach to calculate the difference between the window's central pixel and its surrounding reference pixels. This difference estimates ground-cover change between successive below-average rainfall years, which provides a seasonally interpreted measure of management effects. We examine the approach's sensitivity to window size and to the cover-index percentiles used to define persistence. The method successfully detected management-related change in ground cover in Queensland tropical savanna woodlands in two case studies: (1) a grazing trial where heavy stocking resulted in substantial decline in ground cover in small paddocks, and (2) commercial paddocks where wet-season spelling (destocking) resulted in increased ground cover. At a larger scale, there was broad agreement between our analysis of ground-cover change and ground-based land condition change for commercial beef properties with different a priori ratings of initial condition, but there was also some disagreement where changing condition reflected pasture composition rather than ground cover. We conclude that the method is suitably robust to analyse grazing effects on ground cover across the 1.3 × 10⁶ km² of Queensland's rangelands. Crown Copyright © 2012 Published by Elsevier Inc. All rights reserved.
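The two image operations described above (a pixel-wise minimum across years, then a comparison of each pixel against persistent reference pixels in a surrounding window) can be sketched with NumPy. The window size, percentile and the tiny test patch are illustrative assumptions, not the parameters used in the study:

```python
import numpy as np

def min_cover(stack):
    """Pixel-wise minimum ground cover across a (years, rows, cols) stack,
    i.e. the most persistent cover observed in the driest years."""
    return stack.min(axis=0)

def reference_difference(cover, win=3, pct=90):
    """Difference between each pixel and a high-percentile 'reference' value
    of its surrounding window; negative values suggest the central pixel has
    less persistent cover than its local reference."""
    half = win // 2
    out = np.full(cover.shape, np.nan)
    for i in range(half, cover.shape[0] - half):
        for j in range(half, cover.shape[1] - half):
            window = cover[i - half:i + half + 1, j - half:j + half + 1].astype(float)
            window[half, half] = np.nan  # exclude the central pixel itself
            out[i, j] = cover[i, j] - np.nanpercentile(window, pct)
    return out

# A pixel with lower cover than all of its neighbours scores negative:
patch = np.array([[5, 5, 5],
                  [5, 2, 5],
                  [5, 5, 5]])
diff = reference_difference(patch)
```

Edge pixels, where the window does not fit, are left as NaN here; the study's treatment of image edges is not described in the abstract.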
Abstract:
Targets for improvements in water quality entering the Great Barrier Reef (GBR) have been set through the Reef Water Quality Protection Plan (Reef Plan). To measure and report on progress towards the targets set, a program has been established that combines monitoring and modelling at paddock through to catchment and reef scales: the Paddock to Reef Integrated Monitoring, Modelling and Reporting Program (Paddock to Reef Program). This program aims to provide evidence of links between land management activities, water quality and reef health. Five lines of evidence are used: the effectiveness of management practices to improve water quality; the prevalence of management practice adoption and change in catchment indicators; long-term monitoring of catchment water quality; paddock and catchment modelling to provide a relative assessment of progress towards meeting targets; and finally marine monitoring of GBR water quality and reef ecosystem health. This paper outlines the first four lines of evidence. © 2011 Elsevier Ltd. All rights reserved.
Abstract:
Glyphosate resistance is a rapidly developing threat to profitability in Australian cotton farming. Resistance causes an immediate reduction in the effectiveness of in-crop weed control in glyphosate-resistant transgenic cotton and summer fallows. Although strategies for delaying glyphosate resistance and those for managing resistant populations are qualitatively similar, the longer resistance can be delayed, the longer cotton growers will have choice over which tactics to apply and when to apply them. Effective strategies to avoid, delay, and manage resistance are thus of substantial value. We used a model of glyphosate resistance dynamics to perform simulations of resistance evolution in Sonchus oleraceus (common sowthistle) and Echinochloa colona (awnless barnyard grass) under a range of resistance prevention, delaying, and management strategies. From these simulations, we identified several elements that could contribute to effective glyphosate resistance prevention and management strategies. (i) Controlling glyphosate survivors is the most robust approach to delaying or preventing resistance. High-efficacy, high-frequency survivor control almost doubled the useful lifespan of glyphosate from 13 to 25 years even with glyphosate alone used in summer fallows. (ii) Two non-glyphosate tactics in-crop plus two in summer fallows is the minimum intervention required for long-term delays in resistance evolution. (iii) Pre-emergence herbicides are important, but should be backed up with non-glyphosate knockdowns and strategic tillage; replacing a late-season, pre-emergence herbicide with inter-row tillage was predicted to delay glyphosate resistance by 4 years in awnless barnyard grass. (iv) Weed species' ecological characteristics, particularly seed bank dynamics, have an impact on the effectiveness of resistance strategies; S. oleraceus, because of its propensity to emerge year-round, was less exposed to selection with glyphosate than E. colona, resulting in an extra 5 years of glyphosate usefulness (18 v. 13 years) even in the most rapid cases of resistance evolution. Delaying tactics are thus available that can provide some or many years of continued glyphosate efficacy. If glyphosate-resistant cotton cropping is to remain profitable in Australian farming systems in the long-term, however, growers must adapt to the probability that they will have to deal with summer weeds that are no longer susceptible to glyphosate. Robust resistance management systems will need to include a diversity of weed control options, used appropriately.
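Point (i) above, that controlling glyphosate survivors is the most robust tactic, can be illustrated with a toy annual seed-bank model. This is a sketch under invented parameters, not the resistance-dynamics model used in the study; kill rates, fecundity and immigration numbers are assumptions chosen only to show the qualitative behaviour:

```python
def years_until_resistant(survivor_control,
                          sb_resistant=10.0, sb_susceptible=1e6,
                          fecundity=5.0, kill_susceptible=0.999, kill_resistant=0.5,
                          seed_immigration=1e5, max_years=30):
    """Years until resistant plants make up >50% of glyphosate survivors.
    `survivor_control` is the fraction of survivors killed before seed set;
    returns None if resistance never establishes within `max_years`."""
    for year in range(1, max_years + 1):
        surv_r = sb_resistant * (1 - kill_resistant)
        surv_s = sb_susceptible * (1 - kill_susceptible)
        if surv_r / (surv_r + surv_s) > 0.5:
            return year
        # Survivors that escape control replenish next year's seed bank;
        # susceptible seed also arrives by immigration from outside the field.
        sb_resistant = surv_r * fecundity * (1 - survivor_control)
        sb_susceptible = surv_s * fecundity * (1 - survivor_control) + seed_immigration
        if sb_resistant < 1:  # resistant line driven below one viable seed
            return None
    return None
```

With no survivor control the resistant seed bank grows 2.5-fold a year and dominates within a few seasons; killing 70% of survivors before seed set pushes its annual growth factor below 1, so resistance never establishes, mirroring the prevention behaviour described above.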
Abstract:
Concepts of agricultural sustainability and possible roles of simulation modelling for characterising sustainability were explored by conducting, and reflecting on, a sustainability assessment of rain-fed wheat-based systems in the Middle East and North Africa region. We designed a goal-oriented, model-based framework using the cropping systems model Agricultural Production Systems sIMulator (APSIM). For the assessment, valid (rather than true or false) sustainability goals and indicators were identified for the target system. System-specific vagueness was depicted in sustainability polygons (a system property derived from highly quantitative data) and denoted using descriptive quantifiers. Diagnostic evaluations of alternative tillage practices demonstrated the utility of the framework to quantify key bio-physical and chemical constraints to sustainability. Here, we argue that sustainability is a vague, emergent system property of often wicked complexity that arises out of more fundamental elements and processes. A 'wicked concept of sustainability' acknowledges the breadth of the human experience of sustainability, which cannot be internalised in a model. To achieve socially desirable sustainability goals, our model-based approach can inform reflective evaluation processes that connect with the needs and values of agricultural decision-makers. Hence, it can help to frame meaningful discussions, from which actions might emerge.
Abstract:
Climate change and on-going water policy reforms will likely contribute to on-farm and regional structural adjustment in Australia. This paper gathers empirical evidence of farm-level structural adjustments and integrates these with a regional equilibrium model to investigate sectoral and regional impacts of climate change and recent water use policy on the rice industry. We find strong evidence of adjustments to the farming system, enabled by existing diversity in on-farm production. A further loss of water, with additional pressures to adopt less intensive and larger-scale farming, will however reduce the net number of farm businesses, which may affect regional rice production. The results from a regional CGE model show impacts on the regional economy over and above the direct cost of the environmental water, although a net reduction in real economic output and real income is partially offset by gains in the rest of Australia through the reallocation of resources. There is some interest within the industry and from potential new corporate entrants in the relocation of some rice production to the north. However, strong government support would be crucial to implement such relocation.
Abstract:
Computer simulation modelling is an essential aid in building an integrated understanding of how different factors interact to affect the evolutionary and population dynamics of herbicide resistance, and thus in helping to predict and manage how agricultural systems will be affected. In this review, we first discuss why computer simulation modelling is such an important tool and framework for dealing with herbicide resistance. We then explain what questions related to herbicide resistance have been addressed to date using simulation modelling, and discuss the modelling approaches that have been used, focusing first on the earlier, more general approaches, and then on some newer, more innovative approaches. We then consider how these approaches could be further developed in the future, by drawing on modelling techniques that are already employed in other areas, such as individual-based and spatially explicit modelling approaches, as well as the possibility of better representing genetics, competition and economics. Finally, we discuss the questions and issues of importance to herbicide resistance research and management that could be addressed using these new approaches. We conclude that it is necessary to proceed with caution when increasing the complexity of models by adding new details, but, with appropriate care, more detailed models will make it possible to integrate more current knowledge in order to better understand, predict and ultimately manage the evolution of herbicide resistance. © 2014 Society of Chemical Industry.
Abstract:
* Plant response to drought is complex, so that traits adapted to a specific drought type can confer disadvantage in another drought type. Understanding which type(s) of drought to target is of prime importance for crop improvement.
* Modelling was used to quantify seasonal drought patterns for a check variety across the Australian wheatbelt, using 123 yr of weather data for representative locations and managements. Two other genotypes were used to simulate the impact of maturity on drought pattern.
* Four major environment types summarized the variability in drought pattern over time and space. Severe stress beginning before flowering was common (44% of occurrences), with (24%) or without (20%) relief during grain filling. High variability occurred from year to year, differing with geographical region. With few exceptions, all four environment types occurred in most seasons, for each location, management system and genotype.
* Applications of such environment characterization are proposed to assist breeding and research to focus on germplasm, traits and genes of interest for target environments. The method was applied at a continental scale to highly variable environments and could be extended to other crops, to other drought-prone regions around the world, and to quantify potential changes in drought patterns under future climates.
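The idea of sorting seasons into a small number of environment types from simulated stress patterns can be sketched as a rule-based classifier. The thresholds, the stress-index definition and the type labels below are illustrative assumptions; the study derived its four types from simulated water-stress time series, not from this rule:

```python
def environment_type(pre_flowering_stress, grain_fill_stress, severe=0.6):
    """Classify a season from mean water-stress indices (0 = no stress,
    1 = severe stress) before flowering and during grain filling."""
    if pre_flowering_stress >= severe and grain_fill_stress >= severe:
        return "severe stress before flowering, no relief"
    if pre_flowering_stress >= severe:
        return "severe stress before flowering, relieved during grain fill"
    if grain_fill_stress >= severe:
        return "stress developing late in the season"
    return "low stress"
```

Applied over many simulated site-years, counting the labels returned would give frequency estimates analogous to the 44%/24%/20% occurrence figures quoted above.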
Abstract:
Extensive resources are allocated to managing vertebrate pests, yet spatial understanding of pest threats, and how they respond to management, is limited at the regional scale where much decision-making is undertaken. We provide regional-scale spatial models and management guidance for European rabbits (Oryctolagus cuniculus) in a 260,791 km² region in Australia by determining habitat suitability, habitat susceptibility and the effects of the primary rabbit management options (barrier fence, shooting and baiting, and warren ripping) or changing predation or disease control levels. A participatory modelling approach was used to develop a Bayesian network which captured the main drivers of suitability and spread, which in turn was linked spatially to develop high-resolution risk maps. Policy-makers, rabbit managers and technical experts were responsible for defining the questions the model needed to address, and for subsequently developing and parameterising the model. Habitat suitability was determined by conditions required for warren-building and by above-ground requirements, such as food and harbour, and habitat susceptibility by the distance from current distributions, habitat suitability, and the costs of traversing habitats of different quality. At least one-third of the region had a high probability of being highly suitable (supporting high rabbit densities), with the model supported by validation. Habitat susceptibility was largely restricted by the current known rabbit distribution. Warren ripping was the most effective control option as warrens were considered essential for rabbit persistence. The anticipated increase in disease resistance was predicted to increase the probability of moderately suitable habitat becoming highly suitable, but not increase the at-risk area.
We demonstrate that it is possible to build spatial models to guide regional-level management of vertebrate pests which use the best available knowledge and capture fine spatial-scale processes.
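The way a Bayesian network combines drivers such as warren-building conditions and above-ground food into a probability of high suitability can be sketched with a single hand-rolled conditional probability table. The node names and every probability below are invented for illustration; the study's network was expert-parameterised and far richer:

```python
# P(highly suitable | warren conditions OK, food/harbour OK) — illustrative CPT
CPT = {
    (True, True): 0.90,
    (True, False): 0.40,
    (False, True): 0.20,
    (False, False): 0.05,
}

def p_highly_suitable(p_warren_ok, p_food_ok):
    """Marginalise the CPT over two parent nodes, assumed independent,
    given the probability that each parent condition holds at a location."""
    total = 0.0
    for warren in (True, False):
        for food in (True, False):
            p_parents = (p_warren_ok if warren else 1 - p_warren_ok) * \
                        (p_food_ok if food else 1 - p_food_ok)
            total += p_parents * CPT[(warren, food)]
    return total
```

Evaluating such a network once per pixel, with the parent probabilities drawn from spatial layers, is what turns the expert model into the high-resolution risk maps described above.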
Abstract:
Pasture rest is a possible strategy for improving land condition in the extensive grazing lands of northern Australia. If pastures currently in poor condition could be improved, then overall animal productivity and the sustainability of grazing could be increased. The scientific literature is examined to assess the strength of the experimental information to support and guide the use of pasture rest, and simulation modelling is undertaken to extend this information to a broader range of resting practices, growing conditions and initial pasture condition. From this, guidelines are developed that can be applied in the management of northern Australia’s grazing lands and also serve as hypotheses for further field experiments. The literature on pasture rest is diverse but there is a paucity of data from much of northern Australia as most experiments have been conducted in southern and central parts of Queensland. Despite this, the limited experimental information and the results from modelling were used to formulate the following guidelines. Rest during the growing season gives the most rapid improvement in the proportion of perennial grasses in pastures; rest during the dormant winter period is ineffective in increasing perennial grasses in a pasture but may have other benefits. Appropriate stocking rates are essential to gain the greatest benefit from rest: if stocking rates are too high, then pasture rest will not lead to improvement; if stocking rates are low, pastures will tend to improve without rest. The lower the initial percentage of perennial grasses, the more frequent the rests should be to give a major improvement within a reasonable management timeframe. Conditions during the growing season also have an impact on responses with the greatest improvement likely to be in years of good growing conditions. 
The duration and frequency of rest periods can be combined into a single value expressed as the proportion of time during which resting occurs; when this is done the modelling suggests the greater the proportion of time that a pasture is rested, the greater is the improvement but this needs to be tested experimentally. These guidelines should assist land managers to use pasture resting but the challenge remains to integrate pasture rest with other pasture and animal management practices at the whole-property scale.
Abstract:
Post-rainy sorghum (Sorghum bicolor (L.) Moench) production underpins the livelihood of millions in the semiarid tropics, where the crop is affected by drought. Drought scenarios have been classified and quantified using crop simulation. In this report, variation in traits that hypothetically contribute to drought adaptation (plant growth dynamics, canopy and root water conducting capacity, drought stress responses) was virtually introgressed into the most common post-rainy sorghum genotype, and the influence of these traits on plant growth, development, and grain and stover yield was simulated across different scenarios. Limited transpiration rates under high vapour pressure deficit had the highest positive effect on production, especially combined with enhanced water extraction capacity at the root level. Variability in leaf development (smaller canopy size, later plant vigour or increased leaf appearance rate) also increased grain yield under severe drought, although it caused a stover yield trade-off under milder stress. Although the leaf development response to soil drying varied, this trait had only a modest benefit on crop production across all stress scenarios. Closer dissection of the model outputs showed that under water limitation, grain yield was largely determined by the amount of water available after anthesis, and this relationship became closer with stress severity. All traits investigated increased water availability after anthesis, delayed leaf senescence and led to a ‘stay-green’ phenotype. In conclusion, we showed that breeding success remained highly probabilistic; maximum resilience and economic benefits depended on drought frequency. Maximum potential could be explored by specific combinations of traits.
Abstract:
We trace the evolution of the representation of management in cropping and grazing systems models, from fixed annual schedules of identical actions in single paddocks toward flexible scripts of rules. Attempts to define higher-level organizing concepts in management policies, and to analyse them to identify optimal plans, have focussed on questions relating to grazing management owing to its inherent complexity. “Rule templates” assist the re-use of complex management scripts by bundling commonly-used collections of rules with an interface through which key parameters can be input by a simulation builder. Standard issues relating to parameter estimation and uncertainty apply to management sub-models and need to be addressed. Techniques for embodying farmers' expectations and plans for the future within modelling analyses need to be further developed, especially better linking planning- and rule-based approaches to farm management and analysing the ways that managers can learn.
Abstract:
Hendra virus (HeV), a highly pathogenic zoonotic paramyxovirus that recently emerged from bats, is a major concern to the horse industry in Australia. Previous research has shown that higher temperatures led to lower virus survival rates in the laboratory. We developed a model of HeV survival in the environment as influenced by temperature. We used 20 years of daily temperature at six locations spanning the geographic range of reported HeV incidents to simulate the temporal and spatial impacts of temperature on HeV survival. At any location, simulated virus survival was greater in winter than in summer, and in any month of the year, survival was higher at higher latitudes. At any location, year-to-year variation in virus survival 24 h post-excretion was substantial and was as large as the difference between locations. Survival was higher in microhabitats with lower than ambient temperature, and when environmental exposure was shorter. The within-year pattern of virus survival mirrored the cumulative within-year occurrence of reported HeV cases, although there were no overall differences in survival between HeV case years and non-case years. The model examines the effect of temperature in isolation; actual virus survivability will reflect the effect of additional environmental factors.
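The qualitative pattern reported above (higher survival in winter, at higher latitudes and in cooler microhabitats) follows from any decay model whose rate increases with temperature. A minimal sketch, assuming first-order decay with a linear temperature–rate relationship; the coefficients are invented, not the fitted values from the study:

```python
import math

def survival_24h(temp_c, a=0.05, b=0.01):
    """Fraction of virus surviving 24 h post-excretion at a constant
    temperature `temp_c` (°C), assuming first-order exponential decay
    with rate a + b * temp_c per day (illustrative coefficients)."""
    return math.exp(-(a + b * temp_c))

# Cooler winter conditions and shaded microhabitats both raise survival:
winter = survival_24h(15.0)
summer = survival_24h(30.0)
shaded = survival_24h(30.0 - 5.0)  # microhabitat 5 °C below ambient
```

Driving `survival_24h` with a 20-year daily temperature series at each location would reproduce the kind of seasonal and between-site comparisons the abstract describes.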
Abstract:
The financial health of beef cattle enterprises in northern Australia has declined markedly over the last decade due to an escalation in production and marketing costs and a real decline in beef prices. Historically, gains in animal productivity have offset the effect of declining terms of trade on farm incomes. This raises the question of whether future productivity improvements can remain a key path for lifting enterprise profitability sufficient to ensure that the industry remains economically viable over the longer term. The key objective of this study was to assess the production and financial implications for north Australian beef enterprises of a range of technology interventions (development scenarios), including genetic gain in cattle, nutrient supplementation, and alteration of the feed base through introduced pastures and forage crops, across a variety of natural environments. To achieve this objective a beef systems model was developed that is capable of simulating livestock production at the enterprise level, including reproduction, growth and mortality, based on energy and protein supply from natural C4 pastures that are subject to high inter-annual climate variability. Comparisons between simulation outputs and enterprise performance data in three case study regions suggested that the simulation model (the Northern Australia Beef Systems Analyser) can adequately represent the performance of beef cattle enterprises in northern Australia. Testing of a range of development scenarios suggested that the application of individual technologies can substantially lift productivity and profitability, especially where the entire feedbase was altered through legume augmentation.
The simultaneous implementation of multiple technologies that provide benefits to different aspects of animal productivity resulted in the greatest increases in cattle productivity and enterprise profitability, with projected weaning rates increasing by 25%, liveweight gain by 40% and net profit by 150% above current baseline levels, although gains of this magnitude might not necessarily be realised in practice. While there were slight increases in total methane output from these development scenarios, the methane emissions per kg of beef produced were reduced by 20% in scenarios with higher productivity gain. Combinations of technologies or innovative practices applied in a systematic and integrated fashion thus offer scope for providing the productivity and profitability gains necessary to maintain viable beef enterprises in northern Australia into the future.