25 results for Sequential indicator simulation


Relevance: 20.00%

Abstract:

Concepts of agricultural sustainability and possible roles of simulation modelling for characterising sustainability were explored by conducting, and reflecting on, a sustainability assessment of rain-fed wheat-based systems in the Middle East and North Africa region. We designed a goal-oriented, model-based framework using the cropping systems model Agricultural Production Systems sIMulator (APSIM). For the assessment, valid (rather than true or false) sustainability goals and indicators were identified for the target system. System-specific vagueness was depicted in sustainability polygons, a system property derived from highly quantitative data, and denoted using descriptive quantifiers. Diagnostic evaluations of alternative tillage practices demonstrated the utility of the framework to quantify key bio-physical and chemical constraints to sustainability. Here, we argue that sustainability is a vague, emergent system property of often wicked complexity that arises out of more fundamental elements and processes. A 'wicked concept of sustainability' acknowledges the breadth of the human experience of sustainability, which cannot be internalised in a model. To achieve socially desirable sustainability goals, our model-based approach can inform reflective evaluation processes that connect with the needs and values of agricultural decision-makers. Hence, it can help to frame meaningful discussions, from which actions might emerge.
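The polygon construction lends itself to a simple computation. Below is a minimal sketch, assuming normalised 0-1 indicator scores plotted as a radar chart; the indicator names and values are hypothetical, not taken from the study.

```python
import math

def polygon_area(scores):
    """Area of a radar-chart ('sustainability') polygon whose vertices are
    normalised indicator scores (0-1) placed at equal angular spacing.
    A larger area means the system sits closer to its goals on all indicators."""
    n = len(scores)
    theta = 2 * math.pi / n
    # Sum of triangle areas between adjacent radii: 1/2 * r_i * r_{i+1} * sin(theta)
    return 0.5 * math.sin(theta) * sum(
        scores[i] * scores[(i + 1) % n] for i in range(n)
    )

# Hypothetical indicator scores for one tillage treatment (not from the paper)
indicators = {"yield stability": 0.8, "soil organic C": 0.55,
              "soil water": 0.7, "N balance": 0.6, "erosion risk": 0.4}
print(f"polygon area: {polygon_area(list(indicators.values())):.3f}")
```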

Relevance: 20.00%

Abstract:

The root-lesion nematode, Pratylenchus thornei, can reduce wheat yields by >50%. Although this nematode has a broad host range, crop rotation can be an effective tool for its management if the host status of crops and cultivars is known. The summer crops grown in the northern grain region of Australia are poorly characterised for their resistance to P. thornei and their role in crop sequencing to improve wheat yields. In a 4-year field experiment, we prepared plots with high or low populations of P. thornei by growing susceptible wheat or partially resistant canaryseed (Phalaris canariensis); after an 11-month, weed-free fallow, several cultivars of eight summer crops were grown. Following another 15-month, weed-free fallow, P. thornei-intolerant wheat cv. Strzelecki was grown. Populations of P. thornei were determined to 150 cm soil depth throughout the experiment. When two partially resistant crops were grown in succession, e.g. canaryseed followed by panicum (Setaria italica), P. thornei populations were <739/kg soil and subsequent wheat yields were 3245 kg/ha. In contrast, after two susceptible crops, e.g. wheat followed by soybean, P. thornei populations were 10 850/kg soil and subsequent wheat yields were just 1383 kg/ha. Regression analysis showed a linear, negative response of wheat biomass and grain yield with increasing P. thornei populations and a predicted loss of 77% for biomass and 62% for grain yield. The best predictor of wheat yield loss was P. thornei populations at 0-90 cm soil depth. Crop rotation can be used to reduce P. thornei populations and increase wheat yield, with greatest gains being made following two partially resistant crops grown sequentially.
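The reported regression is straightforward to reproduce in form. The sketch below fits a linear yield response to nematode population; the populations and yields are illustrative placeholders, not the trial data, and the fitted coefficients are not those of the paper.

```python
import numpy as np

# Illustrative only: P. thornei populations (nematodes/kg soil, 0-90 cm)
# and wheat grain yields (kg/ha); values are made up, not the trial data.
pop = np.array([500, 2000, 4000, 7000, 11000])
yield_kg = np.array([3200, 2800, 2300, 1800, 1400])

# The paper reports a linear, negative yield response to population;
# fit yield = a + b * population and express the loss at the highest population.
b, a = np.polyfit(pop, yield_kg, 1)
loss_pct = 100 * (1 - (a + b * pop.max()) / a)
print(f"slope: {b:.3f} kg/ha per nematode/kg soil; "
      f"predicted loss at {pop.max()} nematodes/kg: {loss_pct:.0f}%")
```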

Relevance: 20.00%

Abstract:

In irrigated cropping, as with any other industry, profit and risk are inter-dependent. An increase in profit would normally coincide with an increase in risk, and this means that risk can be traded for profit. It is desirable to manage a farm so that it achieves the maximum possible profit for the desired level of risk. This paper identifies risk-efficient cropping strategies that allocate land and water between crop enterprises for a case study of an irrigated farm in southern Queensland, Australia. This is achieved by applying stochastic frontier analysis to the output of a simulation experiment. The simulation experiment involved changes to the levels of business risk by systematically varying the crop sowing rules in a bioeconomic model of the case study farm. This model utilises the multi-field capability of the process-based Agricultural Production Systems Simulator (APSIM) and is parameterised using data collected from interviews with a collaborating farmer. We found that sowing rules that increased the farm area sown to cotton caused the greatest increase in risk-efficiency. Increasing maize area also improved risk-efficiency, but to a lesser extent than cotton. Sowing rules that increased the areas sown to wheat reduced the risk-efficiency of the farm business. Sowing rules were identified that had the potential to improve the expected farm profit by ca. $50,000 annually, without significantly increasing risk. The concept of the shadow price of risk is discussed, and an expression that quantifies the trade-off between profit and risk is derived from the estimated frontier equation.
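The shadow price of risk falls out of the frontier's derivative. A minimal sketch, assuming a quadratic frontier form (the paper's estimated equation is not given here) and hypothetical profit-risk pairs:

```python
import numpy as np

# Hypothetical (profit std. dev., expected profit) pairs, one per sowing-rule
# strategy from the simulation experiment; values are illustrative only.
risk = np.array([20, 35, 50, 70, 90])        # $'000 std. dev. of profit
profit = np.array([60, 95, 120, 138, 148])   # $'000 expected profit

# Assume a concave quadratic frontier: E[profit] = a + b*risk + c*risk^2.
c, b, a = np.polyfit(risk, profit, 2)

def shadow_price_of_risk(s):
    """Marginal expected profit per extra unit of risk: dE[profit]/d(risk)."""
    return b + 2 * c * s

print(f"shadow price at risk = 50: {shadow_price_of_risk(50):.2f}")
```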

Relevance: 20.00%

Abstract:

Agricultural systems models worldwide are increasingly being used to explore options and solutions for the food security, climate change adaptation and mitigation, and carbon trading problem domains. APSIM (Agricultural Production Systems sIMulator) is one such model that continues to be applied and adapted to this challenging research agenda. From its inception twenty years ago, APSIM has evolved into a framework containing many of the key models required to explore changes in agricultural landscapes, with capability ranging from simulation of gene expression through to multi-field farms and beyond. Keating et al. (2003) described many of the fundamental attributes of APSIM in detail. Much has changed in the last decade, and the APSIM community has been exploring novel scientific domains and utilising software developments in social media, web and mobile applications to provide simulation tools adapted to new demands. This paper updates the earlier work by Keating et al. (2003) and chronicles the changing external challenges and opportunities placed on APSIM during the last decade. It also explores and discusses how APSIM has been evolving into a “next generation” framework with improved features and capabilities that allow its use in many diverse topics.

Relevance: 20.00%

Abstract:

Pasture rest is a possible strategy for improving land condition in the extensive grazing lands of northern Australia. If pastures currently in poor condition could be improved, then overall animal productivity and the sustainability of grazing could be increased. The scientific literature is examined to assess the strength of the experimental information to support and guide the use of pasture rest, and simulation modelling is undertaken to extend this information to a broader range of resting practices, growing conditions and initial pasture condition. From this, guidelines are developed that can be applied in the management of northern Australia’s grazing lands and also serve as hypotheses for further field experiments. The literature on pasture rest is diverse, but there is a paucity of data from much of northern Australia as most experiments have been conducted in southern and central parts of Queensland. Despite this, the limited experimental information and the results from modelling were used to formulate the following guidelines. Rest during the growing season gives the most rapid improvement in the proportion of perennial grasses in pastures; rest during the dormant winter period is ineffective in increasing perennial grasses in a pasture but may have other benefits. Appropriate stocking rates are essential to gain the greatest benefit from rest: if stocking rates are too high, then pasture rest will not lead to improvement; if stocking rates are low, pastures will tend to improve without rest. The lower the initial percentage of perennial grasses, the more frequent the rests should be to give a major improvement within a reasonable management timeframe. Conditions during the growing season also affect responses, with the greatest improvement likely in years of good growing conditions. The duration and frequency of rest periods can be combined into a single value expressed as the proportion of time during which resting occurs; when this is done, the modelling suggests that the greater the proportion of time a pasture is rested, the greater the improvement, although this needs to be tested experimentally. These guidelines should assist land managers to use pasture resting, but the challenge remains to integrate pasture rest with other pasture and animal management practices at the whole-property scale.
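Combining rest duration and frequency into a single proportion-of-time-rested value is a one-line calculation, sketched below with hypothetical resting regimes (not regimes from the paper).

```python
def rest_proportion(rest_months, cycle_months):
    """Combine duration and frequency of rest into a single value: the
    proportion of time the pasture is rested, e.g. a 4-month rest every
    12 months -> 0.33. The modelling suggests improvement in perennial
    grasses rises with this proportion (still to be field-tested)."""
    return rest_months / cycle_months

# Hypothetical resting regimes
for rest, cycle in [(4, 12), (6, 24), (12, 24)]:
    print(f"{rest}-month rest every {cycle} months -> "
          f"rested {rest_proportion(rest, cycle):.0%} of the time")
```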

Relevance: 20.00%

Abstract:

We trace the evolution of the representation of management in cropping and grazing systems models, from fixed annual schedules of identical actions in single paddocks toward flexible scripts of rules. Attempts to define higher-level organising concepts in management policies, and to analyse them to identify optimal plans, have focussed on questions relating to grazing management owing to its inherent complexity. “Rule templates” assist the re-use of complex management scripts by bundling commonly-used collections of rules with an interface through which key parameters can be input by a simulation builder. Standard issues relating to parameter estimation and uncertainty apply to management sub-models and need to be addressed. Techniques for embodying farmers' expectations and plans for the future within modelling analyses need to be further developed, especially by better linking planning- and rule-based approaches to farm management and analysing the ways that managers can learn.
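A rule template can be pictured as a small data structure. The sketch below is illustrative only; the class names, parameters and rules are hypothetical and do not represent any particular framework's API.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]  # tests the current simulation state
    action: Callable[[dict], None]     # applies a management action

@dataclass
class RuleTemplate:
    """Bundles a commonly-used collection of rules behind a few key
    parameters supplied by the simulation builder."""
    params: Dict[str, float]

    def build(self) -> List[Rule]:
        p = self.params
        return [
            Rule("destock",
                 lambda s: s["pasture_kg_ha"] < p["min_pasture"],
                 lambda s: s.update(stock_rate=s["stock_rate"] * 0.5)),
            Rule("restock",
                 lambda s: s["pasture_kg_ha"] > p["target_pasture"],
                 lambda s: s.update(stock_rate=p["max_stock_rate"])),
        ]

rules = RuleTemplate({"min_pasture": 500.0, "target_pasture": 2000.0,
                      "max_stock_rate": 1.2}).build()
state = {"pasture_kg_ha": 400.0, "stock_rate": 1.0}
for rule in rules:
    if rule.condition(state):
        rule.action(state)
print(state)  # the destock rule has halved the stocking rate
```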

Relevance: 20.00%

Abstract:

The financial health of beef cattle enterprises in northern Australia has declined markedly over the last decade due to an escalation in production and marketing costs and a real decline in beef prices. Historically, gains in animal productivity have offset the effect of declining terms of trade on farm incomes. This raises the question of whether future productivity improvements can remain a key path for lifting enterprise profitability sufficient to ensure that the industry remains economically viable over the longer term. The key objective of this study was to assess the production and financial implications for north Australian beef enterprises of a range of technology interventions (development scenarios), including genetic gain in cattle, nutrient supplementation, and alteration of the feed base through introduced pastures and forage crops, across a variety of natural environments. To achieve this objective a beef systems model was developed that is capable of simulating livestock production at the enterprise level, including reproduction, growth and mortality, based on energy and protein supply from natural C4 pastures that are subject to high inter-annual climate variability. Comparisons between simulation outputs and enterprise performance data in three case study regions suggested that the simulation model (the Northern Australia Beef Systems Analyser) can adequately represent the performance of beef cattle enterprises in northern Australia. Testing of a range of development scenarios suggested that the application of individual technologies can substantially lift productivity and profitability, especially where the entire feed base was altered through legume augmentation. The simultaneous implementation of multiple technologies that provide benefits to different aspects of animal productivity resulted in the greatest increases in cattle productivity and enterprise profitability, with projected weaning rates increasing by 25%, liveweight gain by 40% and net profit by 150% above current baseline levels, although gains of this magnitude might not necessarily be realised in practice. While there were slight increases in total methane output from these development scenarios, the methane emissions per kg of beef produced were reduced by 20% in scenarios with higher productivity gain. Combinations of technologies or innovative practices applied in a systematic and integrated fashion thus offer scope for providing the productivity and profitability gains necessary to maintain viable beef enterprises in northern Australia into the future.
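The energy-balance core of such a herd model can be illustrated with a toy calculation. The coefficients below are placeholders chosen for plausibility, not those of the Northern Australia Beef Systems Analyser.

```python
def daily_gain_kg(me_intake_mj, liveweight_kg,
                  k_maint=0.5, k_growth=0.4, mj_per_kg_gain=30.0):
    """Toy liveweight-gain calculation: metabolisable energy (ME) intake
    left over after maintenance is converted to liveweight change.
    All coefficients are illustrative assumptions."""
    maintenance_mj = k_maint * liveweight_kg ** 0.75  # metabolic weight
    surplus_mj = me_intake_mj - maintenance_mj
    return k_growth * surplus_mj / mj_per_kg_gain

# 400 kg steer on good C4 pasture vs drought-limited intake (hypothetical)
for me in (70.0, 45.0):
    print(f"ME intake {me} MJ/d -> gain {daily_gain_kg(me, 400):.2f} kg/d")
```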

Relevance: 20.00%

Abstract:

Aflatoxin is a potent carcinogen produced by Aspergillus flavus, which frequently contaminates maize (Zea mays L.) in the field between 40° north and 40° south latitudes. A mechanistic model to predict risk of pre-harvest contamination could assist in management of this very harmful mycotoxin. In this study, we describe an aflatoxin risk prediction model which is integrated with the Agricultural Production Systems Simulator (APSIM) modelling framework. The model computes a temperature function for A. flavus growth and aflatoxin production using a set of three cardinal temperatures determined in the laboratory using culture medium and intact grains. These cardinal temperatures were 11.5 °C as base, 32.5 °C as optimum and 42.5 °C as maximum. The model used a low (≤0.2) crop water supply-to-demand ratio (an index of drought during the grain-filling stage) to simulate the maize crop's susceptibility to A. flavus growth and aflatoxin production. When this low threshold of the index was reached, the model converted the temperature function into an aflatoxin risk index (ARI) to represent the risk of aflatoxin contamination. The model was applied to simulate ARI for two commercial maize hybrids, H513 and H614D, grown in five multi-location field trials in Kenya using site-specific agronomy, weather and soil parameters. The observed mean aflatoxin contamination in these trials varied from <1 to 7143 ppb. ARI simulated by the model explained 99% of the variation (p ≤ 0.001) in a linear relationship with the mean observed aflatoxin contamination. The strong relationship between ARI and aflatoxin contamination suggests that the model could be applied to map risk-prone areas and to monitor in-season risk for genotypes and soils parameterised for APSIM.
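The cardinal-temperature logic can be sketched directly from the values reported. The beta-type temperature response below is a common choice for such functions but is an assumption here, as are the daily inputs; the paper's exact functional form may differ.

```python
# Cardinal temperatures reported for A. flavus growth/toxin production
T_BASE, T_OPT, T_MAX = 11.5, 32.5, 42.5

def temp_function(t):
    """0-1 temperature response, assumed beta-type: 0 at base and maximum,
    1 at the optimum."""
    if t <= T_BASE or t >= T_MAX:
        return 0.0
    alpha = (T_OPT - T_BASE) / (T_MAX - T_OPT)
    return (((t - T_BASE) / (T_OPT - T_BASE)) ** alpha
            * (T_MAX - t) / (T_MAX - T_OPT))

def daily_ari(mean_temp, supply_demand_ratio, threshold=0.2):
    """Accrue risk only on drought days (ratio <= 0.2) during grain filling."""
    if supply_demand_ratio <= threshold:
        return temp_function(mean_temp)
    return 0.0

# Hypothetical grain-filling days: (mean temp degC, water supply/demand ratio)
days = [(30.0, 0.15), (33.0, 0.10), (28.0, 0.50), (35.0, 0.18)]
ari = sum(daily_ari(t, sd) for t, sd in days)
print(f"seasonal ARI: {ari:.2f}")
```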

Relevance: 20.00%

Abstract:

Limitations in quality bedding material have resulted in the growing need to re-use litter during broiler farming in some countries, which can be of concern from a food-safety perspective. The aim of this study was to compare the Campylobacter levels in ceca and litter across three litter treatments under commercial farming conditions. The litter treatments were (a) the use of new litter after each farming cycle; (b) an Australian partial litter re-use practice; and (c) a full litter re-use practice. The study was carried out on two farms over two years (Farm 1, from 2009–2010 and Farm 2, from 2010–2011), across three sheds (35,000 to 40,000 chickens/shed) on each farm, adopting three different litter treatments across six commercial cycles. A random sampling design was adopted to test litter and ceca for Campylobacter and Escherichia coli, prior to commercial first thin-out and final pick-up. Campylobacter levels varied little across litter practices and farming cycles on each farm and were in the range of log 8.0–9.0 CFU/g in ceca and log 4.0–6.0 MPN/g for litter. Similarly, E. coli levels in ceca were ~log 7.0 CFU/g. At first thin-out and final pick-up, the statistical analysis for both litter and ceca showed that the three-way interaction (treatments by farms by times) was highly significant (P < 0.01), indicating that the patterns of Campylobacter emergence/presence across time vary between the farms, cycles and pickups. The emergence and levels of both organisms were not influenced by litter treatments across the six farming cycles on both farms. Either C. jejuni or C. coli could be the dominant species across litter and ceca, and this phenomenon could not be attributed to specific litter treatments. Irrespective of the litter treatments in place, cycle 2 on Farm 2 remained Campylobacter-free. These outcomes suggest that litter treatments did not directly influence the time of emergence and levels of Campylobacter and E. coli during commercial farming.
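The three-way factorial test described (treatment by farm by time on log counts) can be set up as an ordinary least-squares ANOVA. A minimal sketch with simulated placeholder data, not the study's measurements:

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated placeholder data: a balanced 3 x 2 x 2 factorial with two
# replicates per cell, standing in for log Campylobacter counts.
rng = np.random.default_rng(1)
rows = []
for trt, farm, time in itertools.product(
        ["new", "partial", "full"], ["F1", "F2"], ["thin_out", "final"]):
    for _ in range(2):  # two replicate sheds/cycles per cell
        rows.append({"treatment": trt, "farm": farm, "time": time,
                     "log_count": 8.5 + rng.normal(scale=0.3)})
df = pd.DataFrame(rows)

# Full factorial model; the table includes the three-way interaction term
model = smf.ols("log_count ~ C(treatment) * C(farm) * C(time)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```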