63 results for incident duration modelling
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Two field experiments using maize (Pioneer 31H50) and three watering regimes [(i) irrigated for the whole crop cycle, (ii) irrigated until anthesis and (iii) not irrigated at all (experiment 1); fully irrigated and rain-grown for the whole crop cycle (experiment 2)] were conducted at Gatton, Australia, during the 2003-04 season. Data on crop ontogeny, leaf, sheath and internode lengths and leaf width, and senescence were collected at 1- to 3-day intervals. A glasshouse experiment during 2003 quantified the responses of leaf shape and leaf presentation to various levels of water stress. Data from experiment 1 were used to modify and parameterise an architectural model of maize (ADEL-Maize) to incorporate the impact of water stress on maize canopy characteristics. The modified model produced accurate fitted values for final leaf area and plant height in experiment 1, but simulated leaf area during development was lower than the observed data. Crop duration was fitted reasonably well and differences between the fully irrigated and rain-grown crops were accurately predicted. Final representations of maize crop canopies were realistic. Possible explanations for the low leaf area values are provided. The model requires further development using data from the glasshouse study before being validated against data from experiment 2 and other independent data. It will then be used to extend functionality in architectural models of maize. With further research and development, the model should be particularly useful in examining the response of maize production to water stress, including improved prediction of total biomass and grain yield. This will facilitate improved simulation of plant growth and development processes, allowing investigation of genotype-by-environment interactions under conditions of suboptimal water supply.
Abstract:
Paropsis atomaria is a recently emerged pest of eucalypt plantations in subtropical Australia. Its broad host range of at least 20 eucalypt species and wide geographical distribution give it the potential to become a serious forestry pest both within Australia and, if accidentally introduced, overseas. Although populations of P. atomaria are genetically similar throughout its range, population dynamics differ between regions. Here, we determine temperature-dependent developmental requirements using beetles sourced from temperate and subtropical zones by calculating lower temperature thresholds, temperature-induced mortality, and day-degree requirements. We combine these data with field mortality estimates of immature life stages to produce a cohort-based model, ParopSys, built using DYMEX™, that accurately predicts the timing, duration, and relative abundance of life stages in the field and the number of generations in a spring–autumn (September–May) field season. Voltinism was identified as a seasonally plastic trait dependent upon environmental conditions, with two generations observed and predicted in the Australian Capital Territory, and up to four in Queensland. Lower temperature thresholds for development ranged between 4 and 9 °C, and overall development rates did not differ according to beetle origin. Total immature development time (egg–adult) was approximately 769.2 ± S.E. 127.8 DD above a lower temperature threshold of 6.4 ± S.E. 2.6 °C. ParopSys provides a basic tool enabling forest managers to use the number of generations and seasonal fluctuations in abundance of damaging life stages to estimate the pest risk of P. atomaria prior to plantation establishment, and to predict the occurrence and duration of damaging life stages in the field. Additionally, by using local climatic data, the pest potential of P. atomaria can be estimated to predict the risk of it establishing if accidentally introduced overseas. Improvements to ParopSys’ capability and complexity can be made as more biological data become available.
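The day-degree figures reported above translate directly into a simple development calculator. The sketch below is not ParopSys itself; it is a minimal illustration, taking only the lower threshold (6.4 °C) and egg-to-adult requirement (769.2 DD) from the abstract, and assuming a simple-average degree-day method and an invented daily temperature series.

```python
# Minimal day-degree sketch (not ParopSys): accumulate degree-days above a
# lower threshold and report when the egg-to-adult requirement is met.
# Threshold and requirement are from the abstract; the temperature series
# and the simple-average DD method are illustrative assumptions.

LOWER_THRESHOLD_C = 6.4     # lower temperature threshold for development (°C)
EGG_TO_ADULT_DD = 769.2     # day-degree requirement, egg to adult

def daily_degree_days(t_min, t_max, base=LOWER_THRESHOLD_C):
    """Simple-average degree-days for one day, truncated at the base temperature."""
    return max(0.0, (t_min + t_max) / 2.0 - base)

def days_to_adult(daily_min_max, requirement=EGG_TO_ADULT_DD):
    """Return the day on which accumulated degree-days first reach the requirement."""
    total = 0.0
    for day, (t_min, t_max) in enumerate(daily_min_max, start=1):
        total += daily_degree_days(t_min, t_max)
        if total >= requirement:
            return day
    return None  # requirement not met within the series

# Hypothetical spring-summer record of daily (min, max) temperatures (°C).
example_season = [(12 + 0.05 * d, 24 + 0.05 * d) for d in range(300)]
print(days_to_adult(example_season))
```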
Abstract:
The ability to predict phenology and canopy development is critical in crop models used for simulating likely consequences of alternative crop management and cultivar choice strategies. Here we quantify and contrast the temperature and photoperiod responses for phenology and canopy development of a diverse range of elite Indian and Australian sorghum genotypes (hybrid and landrace). Detailed field experiments were undertaken in Australia and India using a range of genotypes, sowing dates, and photoperiod extension treatments. Measurements of the timing of developmental stages and leaf appearance were taken. The generality of photo-thermal approaches to modelling phenological and canopy development was tested. Environmental and genotypic effects on the rate of progression from emergence to floral initiation (E-FI) were explained well using a multiplicative model, which combined the intrinsic development rate (Ropt) with responses to temperature and photoperiod. Differences in Ropt and the extent of the photoperiod response explained most genotypic effects. Average leaf initiation rate (LIR), leaf appearance rate and duration of the phase from anthesis to physiological maturity differed among genotypes. The association of total leaf number (TLN) with photoperiod found for all genotypes could not be fully explained by effects on development and LIRs. While a putative effect of photoperiod on LIR would explain the observations, other possible confounding factors, such as the air-soil temperature differential and the nature of the model structure, were considered and discussed. This study found a generally robust predictive capacity of photo-thermal development models across diverse ranges of both genotypes and environments. Hence, they remain the most appropriate models for simulation analysis of genotype-by-management scenarios in environments varying broadly in temperature and photoperiod.
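The multiplicative photo-thermal structure described above can be written compactly: development rate is an intrinsic optimum rate (Ropt) reduced by temperature and photoperiod factors. The sketch below is an illustrative implementation only, with assumed functional forms (broken-linear temperature response, linear photoperiod delay above a critical photoperiod) and placeholder parameter values, not the fitted values from this study.

```python
# Illustrative multiplicative photo-thermal development model (not the fitted
# model from this study): rate = Ropt * f(T) * g(photoperiod). Development from
# emergence to floral initiation is complete when cumulative rate reaches 1.
# All parameter values are placeholders.

def temperature_factor(t_mean, t_base=11.0, t_opt=30.0, t_max=42.0):
    """Broken-linear temperature response, scaled 0..1 (assumed cardinal temperatures)."""
    if t_mean <= t_base or t_mean >= t_max:
        return 0.0
    if t_mean <= t_opt:
        return (t_mean - t_base) / (t_opt - t_base)
    return (t_max - t_mean) / (t_max - t_opt)

def photoperiod_factor(daylength_h, critical_pp=13.5, sensitivity=0.1):
    """Short-day response: development slows as daylength exceeds the critical photoperiod."""
    excess = max(0.0, daylength_h - critical_pp)
    return max(0.0, 1.0 - sensitivity * excess)

def days_to_floral_initiation(daily_weather, r_opt=0.04):
    """daily_weather: iterable of (mean temperature in °C, daylength in hours).
    r_opt is the intrinsic development rate per day at optimum conditions (assumed)."""
    progress = 0.0
    for day, (t_mean, daylength) in enumerate(daily_weather, start=1):
        progress += r_opt * temperature_factor(t_mean) * photoperiod_factor(daylength)
        if progress >= 1.0:
            return day
    return None

season = [(26.0, 13.8)] * 120  # hypothetical warm, long-day season
print(days_to_floral_initiation(season))
```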
Abstract:
The nitrogen-driven trade-off between nitrogen utilisation efficiency (yield per unit nitrogen uptake) and water use efficiency (yield per unit evapotranspiration) is widespread and results from well-established, multiple effects of nitrogen availability on the water, carbon and nitrogen economy of crops. Here we used a crop model (APSIM) to simulate the yield, evapotranspiration, soil evaporation and nitrogen uptake of wheat, and analysed yield responses to water, nitrogen and climate using a framework analogous to the rate-duration model of determinate growth. The relationship between modelled grain yield (Y) and evapotranspiration (ET) was fitted to a linear-plateau function to derive three parameters: maximum yield (Ymax), the ET break-point at which yield reaches its maximum (ET#), and the rate of yield response in the linear phase (ΔY/ΔET). Against this framework, we tested the hypothesis that nitrogen deficit reduces maximum yield by reducing both the rate (ΔY/ΔET) and the range of yield response to evapotranspiration, i.e. ET# - Es, where Es is modelled median soil evaporation. Modelled data reproduced the nitrogen-driven trade-off between nitrogen utilisation efficiency and water use efficiency in a transect from Horsham (36°S) to Emerald (23°S) in eastern Australia. Increasing nitrogen supply from 50 to 250 kg N ha⁻¹ reduced yield per unit nitrogen uptake from 29 to 12 kg grain kg⁻¹ N and increased yield per unit evapotranspiration from 6 to 15 kg grain ha⁻¹ mm⁻¹ at Emerald. The same increment in nitrogen supply reduced yield per unit nitrogen uptake from 30 to 25 kg grain kg⁻¹ N and increased yield per unit evapotranspiration from 6 to 25 kg grain ha⁻¹ mm⁻¹ at Horsham. Maximum yield ranged from 0.9 to 6.4 t ha⁻¹. Consistent with our working hypothesis, reductions in maximum yield with nitrogen deficit were associated with both a reduction in the rate of yield response to ET and compression of the range of yield response to ET. Against the notion of managing crops to maximise water use efficiency in low rainfall environments, we emphasise the trade-off between water use efficiency and nitrogen utilisation efficiency, particularly under conditions of a high nitrogen-to-grain price ratio. The rate-range framework for characterising the relationship between yield and evapotranspiration is useful for capturing this trade-off, as the parameters were responsive to both nitrogen supply and climatic factors.
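The rate-range framework rests on a linear-plateau fit of yield against evapotranspiration. Below is a minimal sketch of that function and a least-squares fit of its three parameters, using synthetic data rather than APSIM output; the fitted x-intercept here is analogous to the soil-evaporation term (Es) that defines the lower end of the range in the abstract.

```python
# Linear-plateau yield response to evapotranspiration (illustrative sketch).
# Y(ET) rises linearly at slope dY/dET from an x-intercept ET0 until the
# break-point ET#, then plateaus at Ymax = slope * (ET# - ET0). Synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def linear_plateau(et, et0, slope, et_break):
    """Yield (kg/ha) as a function of evapotranspiration (mm)."""
    return np.where(et < et_break,
                    np.maximum(0.0, slope * (et - et0)),
                    slope * (et_break - et0))

# Synthetic "modelled" data: yield rises to a plateau, with noise.
rng = np.random.default_rng(1)
et = np.linspace(100, 600, 60)
y_obs = linear_plateau(et, 120.0, 15.0, 450.0) + rng.normal(0.0, 300.0, et.size)

# Fit the three parameters (x-intercept, slope, break-point) by least squares.
(et0_fit, slope_fit, et_break_fit), _ = curve_fit(
    linear_plateau, et, y_obs, p0=[100.0, 10.0, 400.0])
y_max = slope_fit * (et_break_fit - et0_fit)
print(f"slope (dY/dET): {slope_fit:.1f} kg/ha/mm, "
      f"ET break-point: {et_break_fit:.0f} mm, Ymax: {y_max:.0f} kg/ha")
```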
Abstract:
Two prerequisites for realistically embarking upon an eradication programme are that cost-benefit analysis favours this strategy over other management options and that sufficient resources are available to carry the programme through to completion. These are not independent criteria, but it is our view that too little attention has been paid to estimating the investment required to complete weed eradication programmes. We deal with this problem by using a two-pronged approach: 1) developing a stochastic dynamic model that provides an estimation of programme duration; and 2) estimating the inputs required to delimit a weed incursion and to prevent weed reproduction over a sufficiently long period to allow extirpation of all infestations. The model is built upon relationships that capture the time-related detection of new infested areas, rates of progression of infestations from the active to the monitoring stage, rates of reversion of infestations from the monitoring to the active stage, and the frequency distribution of time since last detection for all infestations. This approach is applied to the branched broomrape (Orobanche ramosa) eradication programme currently underway in South Australia. This programme commenced in 1999 and currently 7450 ha are known to be infested with the weed. To date, none of the infestations have been eradicated. Given recent (2008) levels of investment and current eradication methods, model predictions are that it would take, on average, an additional 73 years to eradicate this weed at an average additional cost (NPV) of $AU67.9m. When the model was run for circumstances in 2003 and 2006, the average programme duration and total cost (NPV) were predicted to be 159 and 94 years, and $AU91.3m and $AU72.3m, respectively. The reduction in estimated programme length and cost may represent progress towards the eradication objective, although eradication of this species remains a long-term prospect.
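The programme-duration model is built from a few simple stochastic transitions (detection of new infested areas, progression from active to monitored, reversion back to active). The sketch below is not the authors' model; it is a toy Monte Carlo with invented rates that only illustrates how such transitions yield a distribution of programme durations, assuming an infestation is declared extirpated after a fixed number of detection-free years.

```python
# Toy Monte Carlo of weed-eradication programme duration (illustrative only;
# not the branched broomrape model). All rates and thresholds are invented.
import random

P_ACTIVE_TO_MONITOR = 0.25   # yearly chance an active infestation stops reproducing
P_REVERT_TO_ACTIVE = 0.10    # yearly chance a monitored infestation is re-detected
CLEARANCE_YEARS = 10         # detection-free years required to declare extirpation

def simulate_programme(initial_active=50, p_new=0.5, max_years=500, rng=random):
    """Years until every infestation is extirpated, under the toy assumptions."""
    active = initial_active
    monitored = []   # years since last detection for each monitored infestation
    for year in range(1, max_years + 1):
        # New infested areas, found mainly while the incursion is being delimited
        # (detection probability assumed to decay over time).
        if rng.random() < p_new * 0.85 ** year:
            active += 1
        # Some active infestations progress to the monitoring stage this year.
        progressing = sum(rng.random() < P_ACTIVE_TO_MONITOR for _ in range(active))
        active -= progressing
        monitored = [y + 1 for y in monitored] + [0] * progressing
        # Monitored infestations either revert to active, keep accumulating clean
        # years, or are declared extirpated after CLEARANCE_YEARS without detection.
        still_monitored = []
        for years_clean in monitored:
            if rng.random() < P_REVERT_TO_ACTIVE:
                active += 1
            elif years_clean < CLEARANCE_YEARS:
                still_monitored.append(years_clean)
        monitored = still_monitored
        if active == 0 and not monitored:
            return year
    return max_years

durations = [simulate_programme() for _ in range(200)]
print(f"mean programme duration: {sum(durations) / len(durations):.0f} years (toy rates)")
```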
Abstract:
Increasing resistance to phosphine (PH3) in insect pests, including the lesser grain borer (Rhyzopertha dominica), has become a critical issue, and development of effective and sustainable strategies to manage resistance is crucial. In practice, the same grain store may be fumigated multiple times, but usually for the same exposure period and concentration. Simulating a single fumigation allows us to look more closely at the effects of this standard treatment. We used an individual-based, two-locus model to investigate three key questions about the use of phosphine fumigant in relation to the development of PH3 resistance. First, which is more effective for insect control: a long exposure time with a low concentration, or a short exposure period with a high concentration? Our results showed that extending exposure duration is a much more efficient control tactic than increasing the phosphine concentration. Second, how long should the fumigation period be extended to deal with higher frequencies of resistant insects in the grain? Our results indicated that if the original frequency of resistant insects is increased n times, then the fumigation needs to be extended by, at most, n days to achieve the same level of insect control. The third question is how the presence of varying numbers of insects inside grain storages impacts the effectiveness of phosphine fumigation. We found that, for a given fumigation, as the initial population number was increased, the final survival of resistant insects increased proportionally. To control initial populations of insects that were n times larger, it was necessary to increase the fumigation time by about n days. Our results indicate that, in a 2-gene mediated resistance, where dilution of resistance gene frequencies through immigration of susceptibles has a greater effect, extending fumigation times to reduce survival of homozygous resistant insects will have a significant impact on delaying the development of resistance.
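The proportionality results above come from the authors' individual-based two-locus model. As a much rougher illustration of the kind of calculation involved, the sketch below assumes a fixed daily kill probability for each genotype under fumigation and asks how many days of exposure are needed to push expected resistant survivors below a control target for different starting resistance frequencies; the kill probabilities, population size and target are all invented.

```python
# Rough single-locus illustration (not the authors' two-locus individual-based
# model): with assumed constant daily kill probabilities per genotype, find the
# exposure duration that brings expected survivors below a control target.

DAILY_KILL_SUSCEPTIBLE = 0.95   # assumed daily kill probability, susceptible insects
DAILY_KILL_RESISTANT = 0.30     # assumed daily kill probability, resistant insects
POPULATION = 100_000            # assumed initial insect number in the grain store
TARGET_SURVIVORS = 1.0          # control target: fewer than one expected survivor

def days_needed(resistant_freq):
    """Days of exposure for expected survivors (both genotypes) to fall below target."""
    for days in range(1, 366):
        survivors = (POPULATION * resistant_freq * (1 - DAILY_KILL_RESISTANT) ** days
                     + POPULATION * (1 - resistant_freq) * (1 - DAILY_KILL_SUSCEPTIBLE) ** days)
        if survivors < TARGET_SURVIVORS:
            return days
    return None

for freq in (1e-5, 1e-4, 1e-3, 1e-2):
    print(f"resistant frequency {freq:g}: {days_needed(freq)} days of exposure")
```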
Abstract:
Pasture rest is a possible strategy for improving land condition in the extensive grazing lands of northern Australia. If pastures currently in poor condition could be improved, then overall animal productivity and the sustainability of grazing could be increased. The scientific literature is examined to assess the strength of the experimental information to support and guide the use of pasture rest, and simulation modelling is undertaken to extend this information to a broader range of resting practices, growing conditions and initial pasture condition. From this, guidelines are developed that can be applied in the management of northern Australia’s grazing lands and also serve as hypotheses for further field experiments. The literature on pasture rest is diverse, but there is a paucity of data from much of northern Australia as most experiments have been conducted in southern and central parts of Queensland. Despite this, the limited experimental information and the results from modelling were used to formulate the following guidelines. Rest during the growing season gives the most rapid improvement in the proportion of perennial grasses in pastures; rest during the dormant winter period is ineffective in increasing perennial grasses in a pasture but may have other benefits. Appropriate stocking rates are essential to gain the greatest benefit from rest: if stocking rates are too high, then pasture rest will not lead to improvement; if stocking rates are low, pastures will tend to improve without rest. The lower the initial percentage of perennial grasses, the more frequent the rests should be to give a major improvement within a reasonable management timeframe. Conditions during the growing season also have an impact on responses, with the greatest improvement likely in years of good growing conditions. The duration and frequency of rest periods can be combined into a single value expressed as the proportion of time during which resting occurs; when this is done, the modelling suggests that the greater the proportion of time a pasture is rested, the greater the improvement, but this needs to be tested experimentally. These guidelines should assist land managers to use pasture resting, but the challenge remains to integrate pasture rest with other pasture and animal management practices at the whole-property scale.
Abstract:
In recent years many sorghum producers in the more marginal (<600 mm annual rainfall) cropping areas of Qld and northern NSW have utilised skip row configurations in an attempt to improve yield reliability and reduce sorghum production risk. But will this work in the long run? What are the trade-offs between productivity and risk of crop failure? This paper describes a modelling and simulation approach to study the long-term effects of skip row configurations. Detailed measurements of light interception and water extraction from sorghum crops grown in solid, single and double skip row configurations were collected from three on-farm participatory research trials established in southern Qld and northern NSW. These measurements resulted in changes to the model that accounted for the elliptical water uptake pattern below the crop row and reduced total light interception associated with the leaf area reduction of the skip configuration. Following validation of the model, long-term simulation runs using historical weather data were used to determine the value of skip row sorghum production as a means of maintaining yield reliability in the dryland cropping regions of southern Qld and northern NSW.
Abstract:
By quantifying the effects of climatic variability in the sheep grazing lands of north-western and western Queensland, the key biological rates of mortality and reproduction can be predicted for sheep. These rates are essential components of a decision support package which can prove a useful management tool for producers, especially if they can easily obtain the necessary predictors. When the sub-models of the GRAZPLAN ruminant biology process model were re-parameterised from Queensland data, and an empirical equation predicting the probability of ewes mating was added, the process model predicted the probability of pregnancy well (86% of variation explained). Predicting mortality from GRAZPLAN was less successful, but an empirical equation based on the relative condition of the animal (a measure based on liveweight), pregnancy status and age explained 78% of the variation in mortalities. A crucial predictor in these models was liveweight, which is not often recorded on producer properties. Empirical models based on climatic and pasture conditions estimated from the pasture production model GRASP predicted marking and mortality rates for Mitchell grass (Astrebla sp.) pastures (81% and 63% of the variation explained, respectively). These prediction equations were tested against independent data from producer properties and the model was successfully validated for Mitchell grass communities.
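The general form of such an empirical mortality equation can be illustrated as a logistic model in the three predictors named above. The sketch below is only that: the functional form with placeholder coefficients, not the fitted Queensland parameters.

```python
# Illustrative form of an empirical annual mortality equation (placeholder
# coefficients, not the fitted Queensland values): logistic in relative
# condition, pregnancy status and age.
import math

def annual_mortality_probability(relative_condition, pregnant, age_years,
                                 b0=-4.0, b_cond=-3.0, b_preg=0.8, b_age=0.15):
    """relative_condition is liveweight relative to a standard reference (1.0 = standard);
    pregnant is 0 or 1; age is in years. All coefficients are illustrative only."""
    z = b0 + b_cond * (relative_condition - 1.0) + b_preg * pregnant + b_age * age_years
    return 1.0 / (1.0 + math.exp(-z))

# A thin, pregnant, 6-year-old ewe versus a young ewe in good condition.
print(annual_mortality_probability(0.75, 1, 6))
print(annual_mortality_probability(1.05, 0, 3))
```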
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia, has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2), of 1, 2, 3 or 4 years' duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of the lucerne leys. Soil N increased over time following 2 years of lucerne, but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model. Although significant N accretion occurred in the soil following lucerne leys, in drier seasons recharge of the drier soil profile following long-duration lucerne occurred after 3 years. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.
Abstract:
Annual discard ogives were estimated using generalized additive models (GAMs) for four demersal fish species: whiting, haddock, megrim, and plaice. The analysis was based on data collected on board commercial vessels and at Irish fishing ports from 1995 to 2003. For all species the most important factors influencing annual discard ogives were fleet (combination of gear, fishing ground, and targeted species), mean length of the catch and year, and, for megrim, also minimum landing size. The length at which fish are discarded has increased since 2000 for haddock, whiting, and plaice. In contrast, discarded length has decreased for megrim, accompanying a reduction in minimum landing size in 2000.
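A discard ogive gives the proportion of fish discarded at each length. The study fitted GAMs with fleet, year and mean catch length as covariates; the simpler sketch below just fits a descending logistic ogive to synthetic length-at-discard data, to show what such a curve (and its length at 50% discarding) looks like.

```python
# Illustrative discard ogive (not the GAM analysis from the study): fit a
# descending logistic curve of discard probability versus length to synthetic
# binary discard outcomes.
import numpy as np
from scipy.optimize import curve_fit

def discard_ogive(length_cm, l50, slope):
    """Proportion discarded at length: 0.5 at l50, decreasing as length increases."""
    return 1.0 / (1.0 + np.exp(slope * (length_cm - l50)))

# Synthetic observations: most fish below ~25 cm discarded, larger fish retained.
rng = np.random.default_rng(7)
lengths = np.repeat(np.arange(15, 41), 40).astype(float)
discarded = rng.random(lengths.size) < discard_ogive(lengths, l50=25.0, slope=0.6)

# Fit the ogive parameters to the binary discard outcomes (simple least squares).
popt, _ = curve_fit(discard_ogive, lengths, discarded.astype(float), p0=[24.0, 0.5])
print(f"estimated length at 50% discarding: {popt[0]:.1f} cm, slope: {popt[1]:.2f}")
```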
Abstract:
Prediction of the initiation, appearance and emergence of leaves is critically important to the success of simulation models of crop canopy development and some aspects of crop ontogeny. Data on leaf number and crop ontogeny were collected on five cultivars of maize differing widely in maturity and genetic background, grown under natural and extended photoperiods and planted on seven sowing dates from October 1993 to March 1994 at Gatton, South-east Queensland. The same temperature coefficients were established for crop ontogeny before silking and for the rates of leaf initiation, leaf tip appearance and full leaf expansion, the base, optimum and maximum temperatures in each case being 8, 34 and 40 °C. After silking, the base temperature for ontogeny was 0 °C, but the optimum and maximum temperatures remained unchanged. The rates of leaf initiation, appearance of leaf tips and full leaf expansion varied in a relatively narrow range across sowing times and photoperiod treatments, with average values of 0.040, 0.021 and 0.019 leaves (°Cd)⁻¹, respectively. The relationships developed in this study provided satisfactory predictions of leaf number and crop ontogeny (tassel initiation to silking, emergence to silking and silking to physiological maturity) when assessed using independent data from Gatton (South-east Queensland), Katherine and Douglas Daly (Northern Territory), Walkamin (North Queensland) and Kununurra (Western Australia).
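The cardinal temperatures and rates above translate directly into a thermal-time calculation. The sketch below takes those values from the abstract but assumes a broken-linear temperature response (rising from the base to the optimum, then falling to zero at the maximum) driven by daily mean temperature; the exact response shape and temperature inputs used in the study are not restated here.

```python
# Thermal-time sketch using the pre-silking cardinal temperatures and leaf
# development rates reported above. The broken-linear response form and the
# daily mean temperatures are illustrative assumptions.

T_BASE, T_OPT, T_MAX = 8.0, 34.0, 40.0      # °C, pre-silking (from the abstract)
RATE_INITIATION = 0.040                     # leaves per °Cd
RATE_TIP_APPEARANCE = 0.021                 # leaves per °Cd
RATE_FULL_EXPANSION = 0.019                 # leaves per °Cd

def daily_thermal_time(t_mean):
    """Effective °Cd for one day from mean temperature (assumed broken-linear response)."""
    if t_mean <= T_BASE or t_mean >= T_MAX:
        return 0.0
    if t_mean <= T_OPT:
        return t_mean - T_BASE
    # Decline linearly from the optimum to zero at the maximum temperature.
    return (T_OPT - T_BASE) * (T_MAX - t_mean) / (T_MAX - T_OPT)

def leaves_after(daily_mean_temps, rate):
    """Predicted leaf number reached after a run of daily mean temperatures."""
    tt = sum(daily_thermal_time(t) for t in daily_mean_temps)
    return rate * tt

warm_month = [26.0] * 30  # hypothetical daily mean temperatures for one month
print("leaves initiated:", leaves_after(warm_month, RATE_INITIATION))
print("leaf tips appeared:", leaves_after(warm_month, RATE_TIP_APPEARANCE))
```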
Abstract:
Physiological and genetic studies of leaf growth often focus on short-term responses, leaving a gap to whole-plant models that predict biomass accumulation, transpiration and yield at crop scale. To bridge this gap, we developed a model that combines an existing model of leaf 6 expansion in response to short-term environmental variations with a model coordinating the development of all leaves of a plant. The latter was based on: (1) rates of leaf initiation, appearance and end of elongation measured in field experiments; and (2) the hypothesis of an independence of the growth between leaves. The resulting whole-plant leaf model was integrated into the generic crop model APSIM which provided dynamic feedback of environmental conditions to the leaf model and allowed simulation of crop growth at canopy level. The model was tested in 12 field situations with contrasting temperature, evaporative demand and soil water status. In observed and simulated data, high evaporative demand reduced leaf area at the whole-plant level, and short water deficits affected only leaves developing during the stress, either visible or still hidden in the whorl. The model adequately simulated whole-plant profiles of leaf area with a single set of parameters that applied to the same hybrid in all experiments. It was also suitable to predict biomass accumulation and yield of a similar hybrid grown in different conditions. This model extends to field conditions existing knowledge of the environmental controls of leaf elongation, and can be used to simulate how their genetic controls flow through to yield.
Abstract:
Quantifying the potential spread and density of an invading organism enables decision-makers to determine the most appropriate response to incursions. We present two linked models that estimate the spread of Solenopsis invicta Buren (red imported fire ant) in Australia based on limited data gathered after its discovery in Brisbane in 2001. A stochastic cellular automaton determines spread within a location (100 km by 100 km) and this is coupled with a model that simulates human-mediated movement of S. invicta to new locations. In the absence of any control measures, the models predict that S. invicta could cover 763 000–4 066 000 km² by the year 2035 and be found at 200 separate locations around Australia by 2017–2027, depending on the rate of spread. These estimated rates of expansion (assuming no control efforts were in place) are higher than those experienced in the USA in the 1940s during the early invasion phases in that country. Active control efforts and quarantine controls in the USA (including a concerted eradication attempt in the 1960s) may have slowed spread. Further, milder winters, the presence of the polygynous social form, and increased trade and human mobility in Australia in the 2000s compared with the USA in the 1940s could contribute to faster range expansion.
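A stochastic cellular automaton for within-location spread can be sketched very simply: each occupied cell colonises nearby cells with some probability each year. The toy grid below uses invented probabilities and a simple eight-cell neighbourhood, and omits the human-mediated jump component entirely; it only illustrates the mechanism and is not the published S. invicta model.

```python
# Toy stochastic cellular automaton for local spread (illustrative only; not the
# published S. invicta model, which also simulates human-mediated movement to
# new locations). Each year every occupied cell colonises each neighbouring
# cell with an assumed probability.
import random

GRID = 100                 # 100 x 100 cells representing one location
P_COLONISE = 0.2           # assumed yearly colonisation probability per neighbour
YEARS = 20

occupied = {(GRID // 2, GRID // 2)}  # incursion starts at the centre cell

def neighbours(cell):
    """Eight surrounding cells, clipped to the grid boundary."""
    x, y = cell
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if (dx, dy) != (0, 0) and 0 <= x + dx < GRID and 0 <= y + dy < GRID:
                yield (x + dx, y + dy)

random.seed(42)
for year in range(1, YEARS + 1):
    newly_occupied = set()
    for cell in occupied:
        for nb in neighbours(cell):
            if nb not in occupied and random.random() < P_COLONISE:
                newly_occupied.add(nb)
    occupied |= newly_occupied

print(f"occupied cells after {YEARS} years: {len(occupied)}")
```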
Abstract:
Aflatoxins are highly carcinogenic mycotoxins produced by two fungi, Aspergillus flavus and A. parasiticus, under specific moisture and temperature conditions before harvest and/or during storage of a wide range of crops, including maize. Modelling of interactions between the host plant and the environment during the season can enable quantification of preharvest aflatoxin risk and its potential management. A model was developed to quantify climatic risks of aflatoxin contamination in maize using principles previously used for peanuts. The model outputs an aflatoxin risk index in response to seasonal temperature and soil moisture during the maize grain filling period, using APSIM's maize module. The model performed well in simulating climatic risk of aflatoxin contamination in maize, as indicated by a significant R² (P ≤ 0.01) between the aflatoxin risk index and the measured aflatoxin B1 in crop samples, which was 0.69 for a range of rainfed Australian locations and 0.62 when irrigated locations were also included in the analysis. The model was further applied to determine probabilities of exceeding a given aflatoxin risk in four non-irrigated maize growing locations of Queensland using 106 years of historical climatic data. Locations with both dry and hot climates had a much higher probability of high aflatoxin risk than locations having either dry or hot conditions alone. Scenario analysis suggested that under non-irrigated conditions the risk of aflatoxin contamination could be minimised by adjusting sowing time or selecting an appropriate hybrid so that the grain filling period coincides with lower temperature and water stress conditions.
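The probability-of-exceedance analysis described above amounts to asking, for each threshold of interest, what fraction of the long run of simulated seasons exceeds it. A minimal sketch, using synthetic yearly risk-index values in place of the 106 years of APSIM-driven output and illustrative thresholds, is:

```python
# Probability of exceeding a given aflatoxin risk index, estimated over many
# simulated seasons. Synthetic risk-index values stand in for the historical
# APSIM-driven output; thresholds are illustrative.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic yearly risk-index values for a hypothetical hot, dry location.
yearly_risk_index = rng.gamma(shape=2.0, scale=1.5, size=106)

for threshold in (2.0, 4.0, 6.0):
    p_exceed = float(np.mean(yearly_risk_index > threshold))
    print(f"P(risk index > {threshold}): {p_exceed:.2f}")
```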