12 results for Principle of alternative possibilities

in eResearch Archive - Queensland Department of Agriculture


Relevance:

100.00%

Publisher:

Abstract:

The emerging carbon economy will have a major impact on grazing businesses because of significant livestock methane and land-use change emissions. Livestock methane emissions alone account for ~11% of Australia's reported greenhouse gas emissions. Grazing businesses need to develop an understanding of their greenhouse gas impact and be able to assess the impact of alternative management options. This paper attempts to generate a greenhouse gas budget for two scenarios using a spreadsheet model. The first scenario was based on one land type, '20-year-old brigalow regrowth', in the brigalow bioregion of southern-central Queensland. The 50-year analysis demonstrated the substantially different greenhouse gas outcomes and livestock carrying capacities of three alternative regrowth management options: retain regrowth (sequester 71.5 t carbon dioxide equivalents per hectare, CO2-e/ha), clear all regrowth (emit 42.8 t CO2-e/ha) and clear regrowth strips (emit 5.8 t CO2-e/ha). The second scenario was based on a 'remnant eucalypt savanna-woodland' land type in the Einasleigh Uplands bioregion of north Queensland. The four alternative vegetation management options were: retain the current woodland structure (emit 7.4 t CO2-e/ha), allow the woodland to thicken, increasing tree basal area (sequester 20.7 t CO2-e/ha), thin trees less than 10 cm diameter (emit 8.9 t CO2-e/ha), and thin trees less than 20 cm diameter (emit 12.4 t CO2-e/ha). Significant assumptions were required to complete the budgets due to gaps in current knowledge of the response of woody vegetation, soil carbon and non-CO2 soil emissions to management options and land type at the property scale. The analyses indicate that there is scope for grazing businesses to choose alternative management options to influence their greenhouse gas budget. However, a key assumption is that accumulation of carbon or avoidance of emissions somewhere on a grazing business (e.g. 
in woody vegetation or soil) will be recognised as an offset for emissions elsewhere in the business (e.g. livestock methane). This issue will be a challenge for livestock industries and policy makers to work through in the coming years.
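The regrowth comparison above can be sketched as a toy whole-property budget. The per-hectare vegetation outcomes are the paper's 50-year figures; the property area and the annual livestock methane rate are hypothetical illustration values, not from the study:

```python
# Toy 50-year greenhouse gas budget for the brigalow regrowth scenario.
# Vegetation numbers are from the paper; AREA_HA and the methane rate
# are hypothetical, chosen only to illustrate the accounting.

AREA_HA = 1000   # hypothetical property area
YEARS = 50       # analysis horizon used in the paper

# Net vegetation change over 50 years, t CO2-e/ha
# (negative = sequestration, positive = emission)
vegetation_options = {
    "retain regrowth": -71.5,
    "clear all regrowth": 42.8,
    "clear regrowth strips": 5.8,
}

methane_per_ha_per_year = 0.5  # hypothetical livestock methane, t CO2-e/ha/yr

results = {}
for option, veg in vegetation_options.items():
    livestock = methane_per_ha_per_year * YEARS      # 50-year methane, t/ha
    net_t = (veg + livestock) * AREA_HA              # whole-property budget
    results[option] = net_t
    print(f"{option:>22}: net {net_t / 1000:+.1f} kt CO2-e over {YEARS} years")
```

Under these assumptions only the retain-regrowth option offsets the livestock methane, which is the offsetting trade-off the abstract flags as a key policy question.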

Relevance:

100.00%

Publisher:

Abstract:

Cucurbit crops host a range of serious sap-sucking insect pests, including silverleaf whitefly (SLW) and aphids, which potentially represent considerable risk to the Australian horticulture industry. These pests are extremely polyphagous with a wide host range. Chemical control is made difficult by insecticide resistance, and insecticide use is associated with pollution and other side-effects. Consequently, there is much interest in maximising the role of biological control in the management of these sap-sucking insect pests. This study aimed to evaluate companion cropping alongside cucurbit crops in a tropical setting as a means to increase the populations of beneficial insects and spiders so as to control the major sap-sucking insect pests. The populations of beneficial and harmful insects, with a focus on SLW and aphids, and other invertebrates were sampled weekly on four different crops which could be used for habitat manipulation: Goodbug Mix (GBM; a proprietary seed mixture including self-sowing annual and perennial herbaceous flower species); lablab (Lablab purpureus L. Sweet); lucerne (Medicago sativa L.); and niger (Guizotia abyssinica (L.f.) Cass.). Lablab hosted the highest numbers of beneficial insects (larvae and adults of lacewings (Mallada signata (Schneider)), ladybird beetles (Coccinella transversalis Fabricius) and spiders), while GBM hosted the highest numbers of European bees (Apis mellifera Linnaeus) and spiders. Lucerne and niger showed little promise in hosting beneficial insects, but lucerne hosted significantly more spiders (double the numbers) than niger. Lucerne also hosted significantly more of the harmful aphid species (Aphis gossypii (Glover) and Myzus persicae (Sulzer)) and heliothis (Heliothis armigera (Hübner)). Niger hosted significantly more vegetable weevils (Listroderes difficilis (Germar)) than the other three species. 
Therefore, lablab and GBM appear to be viable options to grow within cucurbits or as field boundary crops to attract and increase beneficial insects and spiders for the control of sap-sucking insect pests. Use of these bio-control strategies affords the opportunity to minimise pesticide usage and the risks associated with pollution.

Relevance:

100.00%

Publisher:

Abstract:

Non-parametric difference tests such as the triangle and duo-trio tests are traditionally used to establish differences or similarities between products. However, they only supply the researcher with partial answers, and further testing is often required to establish the nature, size and direction of differences. This paper looks at the advantages of the difference from control (DFC) test (also known as the degree of difference test) and discusses appropriate applications of the test. The scope and principle of the test, panel composition and analysis of results are presented with the aid of suitable examples. Two of the major uses of the DFC test are in quality control and shelf-life testing. The role the DFC test takes in these areas and the use of other tests to complement it are discussed. Controls or standards are important in both these areas, and the use of standard products, mental and written standards and blind controls is highlighted. The DFC test has applications for products where the duo-trio and triangle tests cannot be used because of the normal heterogeneity of the product. While the DFC test is a simple difference test, it can be structured to give the researcher more valuable data and scope to make informed decisions about their product.

Relevance:

100.00%

Publisher:

Abstract:

The widespread and increasing resistance of internal parasites to anthelmintic control is a serious problem for the Australian sheep and wool industry. As part of control programmes, laboratories use the Faecal Egg Count Reduction Test (FECRT) to determine resistance to anthelmintics. It is important to have confidence in the measure of resistance, not only for the producer planning a drenching programme but also for companies investigating the efficacy of their products. The determination of resistance and corresponding confidence limits as given in anthelmintic efficacy guidelines of the Standing Committee on Agriculture (SCA) is based on a number of assumptions. This study evaluated the appropriateness of these assumptions for typical data and compared the effectiveness of the standard FECRT procedure with the effectiveness of alternative procedures. Several sets of historical experimental data from sheep and goats were analysed to determine that a negative binomial distribution was a more appropriate distribution to describe pre-treatment helminth egg counts in faeces than a normal distribution. Simulated egg counts for control animals were generated stochastically from negative binomial distributions and those for treated animals from negative binomial and binomial distributions. Three methods for determining resistance when percent reduction is based on arithmetic means were applied. The first was that advocated in the SCA guidelines, the second similar to the first but basing the variance estimates on negative binomial distributions, and the third using Wadley’s method with the distribution of the response variate assumed negative binomial and a logit link transformation. These were also compared with a fourth method recommended by the International Co-operation on Harmonisation of Technical Requirements for Registration of Veterinary Medicinal Products (VICH) programme, in which percent reduction is based on the geometric means. 
A wide selection of parameters was investigated, and for each set 1000 simulations were run. Percent reduction and confidence limits were then calculated for each method, together with the number of times in each set of 1000 simulations that the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. These simulations provide the basis for setting conditions under which the methods could be recommended. The authors show that, given the distribution of helminth egg counts found in Queensland flocks, the method based on arithmetic, not geometric, means should be used, and suggest that resistance be redefined as occurring when the upper limit of percent reduction is less than 95%. At least ten animals per group are required in most circumstances, though even 20 may be insufficient where effectiveness of the product is close to the cut-off point for defining resistance.
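The simulation scheme described above can be illustrated in a few lines: egg counts drawn from a negative binomial distribution (parameterised by mean and dispersion) for both groups, with percent reduction computed from arithmetic means. The mean, dispersion, true efficacy and group size below are illustrative, not the parameter sets of the study:

```python
# Sketch of a single FECRT-style simulation run under negative binomial
# egg counts. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

def nb_sample(mean, k, size):
    """Negative binomial draws parameterised by mean and dispersion k.
    numpy's parameterisation uses (n, p) with mean = n * (1 - p) / p,
    so p = k / (k + mean) recovers the requested mean."""
    p = k / (k + mean)
    return rng.negative_binomial(k, p, size)

n = 10                    # animals per group (minimum suggested by the authors)
true_efficacy = 0.93      # hypothetical product efficacy

control = nb_sample(mean=500, k=1.5, size=n)
treated = nb_sample(mean=500 * (1 - true_efficacy), k=1.5, size=n)

# Percent reduction from arithmetic means, as the authors recommend
reduction = 100 * (1 - treated.mean() / control.mean())
print(f"estimated percent reduction: {reduction:.1f}%")
```

In the study's framework this calculation would be repeated 1000 times per parameter set, with resistance declared whenever the upper confidence limit of percent reduction fell below 95%.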

Relevance:

100.00%

Publisher:

Abstract:

Residue retention is an important issue in evaluating the sustainability of production forestry. However, its long-term impacts have not been studied extensively, especially in sub-tropical environments. This study investigated the long-term impact of harvest residue retention on tree nutrition, growth and productivity of a F1 hybrid (Pinus elliottii var. elliottii × Pinus caribaea var. hondurensis) exotic pine plantation in sub-tropical Australia, under three harvest residue management regimes: (1) residue removal, RR0; (2) single residue retention, RR1; and (3) double residue retention, RR2. The experiment, established in 1996, is a randomised complete block design with 4 replicates. Tree growth measurements in this study were carried out at ages 2, 4, 6, 8 and 10 years, while foliar nutrient analyses were carried out at ages 2, 4, 6 and 10 years. Litter production and litter nitrogen (N) and phosphorus (P) measurements were carried out quarterly over a 15-month period between ages 9 and 10 years. Results showed that total tree growth was still greater in residue-retained treatments compared to the RR0 treatment. However, mean annual increments of diameter at breast height (MAID) and basal area (MAIB) declined significantly after age 4 years to about 68-78% at age 10 years. Declining foliar N and P concentrations accounted for 62% (p < 0.05) of the variation of growth rates after age 4 years, and foliar N and P concentrations were either marginal or below critical concentrations. In addition, litter production, and litter N and P contents were not significantly different among the treatments. This study suggests that the impact of residue retention on tree nutrition and growth rates might be limited over a longer period, and that the integration of alternative forest management practices is necessary to sustain the benefits of harvest residues until the end of the rotation.

Relevance:

100.00%

Publisher:

Abstract:

Weedy Sporobolus grasses have low palatability for livestock, with infestations reducing land condition and pastoral productivity. Control and containment options are available, but the cost of weed control is high relative to the extra return from livestock, thus limiting private investment. This paper outlines a process for analysing the economic consequences of alternative management options for weedy Sporobolus grasses. This process is applicable to other weeds and to other pastoral degradation or development issues. Using a case study property, three scenarios were developed. Each scenario compared two alternative management options and was analysed using discounted cash flow analysis. Two of the scenarios were based on infested properties, and one was based on a property that is currently uninfested but highly likely to become infested without active containment measures preventing weed seed transport and seedling establishment. The analysis highlighted why particular weedy Sporobolus grass management options may not be financially feasible for the landholder with the infestation. However, at the regional scale, the management options may be highly worthwhile due to a reduction in weed seed movement and new weed invasions. Therefore, to encourage investment by landholders in weedy Sporobolus grass management, the investment of public money on behalf of landholders with non-infested properties should be considered.
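The discounted cash flow comparison described above reduces to computing net present values for alternative cash flow streams. The cash flows and discount rate below are hypothetical; the case-study figures from the paper are not reproduced here:

```python
# Minimal discounted-cash-flow sketch: two hypothetical management
# options compared by net present value (NPV). All figures are
# illustrative assumptions, not the paper's case-study data.

def npv(rate, cashflows):
    """NPV of a stream of cash flows, cashflows[0] occurring at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

rate = 0.07  # illustrative real discount rate

# Option A: do nothing -- ongoing productivity losses as weeds spread
do_nothing = [0] + [-8_000] * 10
# Option B: up-front control cost, then modest recovered productivity
control = [-40_000] + [2_000] * 10

npv_a = npv(rate, do_nothing)
npv_b = npv(rate, control)
print(f"NPV, do nothing: {npv_a:,.0f}")
print(f"NPV, control:    {npv_b:,.0f}")
```

Note that option B can have the higher NPV for the landholder while still being a net cost, which mirrors the paper's point that a privately unattractive option may nonetheless be worthwhile at the regional scale.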

Relevance:

100.00%

Publisher:

Abstract:

The ability to predict phenology and canopy development is critical in crop models used for simulating likely consequences of alternative crop management and cultivar choice strategies. Here we quantify and contrast the temperature and photoperiod responses for phenology and canopy development of a diverse range of elite Indian and Australian sorghum genotypes (hybrid and landrace). Detailed field experiments were undertaken in Australia and India using a range of genotypes, sowing dates, and photoperiod extension treatments. Measurements of timing of developmental stages and leaf appearance were taken. The generality of photo-thermal approaches to modelling phenological and canopy development was tested. Environmental and genotypic effects on rate of progression from emergence to floral initiation (E-FI) were explained well using a multiplicative model, which combined the intrinsic development rate (Ropt), with responses to temperature and photoperiod. Differences in Ropt and extent of the photoperiod response explained most genotypic effects. Average leaf initiation rate (LIR), leaf appearance rate and duration of the phase from anthesis to physiological maturity differed among genotypes. The association of total leaf number (TLN) with photoperiod found for all genotypes could not be fully explained by effects on development and LIRs. While a putative effect of photoperiod on LIR would explain the observations, other possible confounding factors, such as air-soil temperature differential and the nature of model structure were considered and discussed. This study found a generally robust predictive capacity of photo-thermal development models across diverse ranges of both genotypes and environments. Hence, they remain the most appropriate models for simulation analysis of genotype-by-management scenarios in environments varying broadly in temperature and photoperiod.
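The multiplicative model structure described above (an intrinsic rate Ropt scaled by temperature and photoperiod responses) can be sketched as follows. The cardinal temperatures, photoperiod threshold and sensitivity are illustrative placeholders, not the fitted genotype parameters from the study:

```python
# Sketch of a multiplicative photo-thermal development-rate model.
# Parameter values are illustrative assumptions only.

def temp_response(t, t_base=11.0, t_opt=30.0, t_max=42.0):
    """Broken-linear temperature response, scaled 0..1."""
    if t <= t_base or t >= t_max:
        return 0.0
    if t <= t_opt:
        return (t - t_base) / (t_opt - t_base)
    return (t_max - t) / (t_max - t_opt)

def photoperiod_response(pp_hours, pp_crit=13.5, sensitivity=0.1):
    """Short-day response: development slows once photoperiod exceeds pp_crit."""
    delay = max(0.0, pp_hours - pp_crit) * sensitivity
    return 1.0 / (1.0 + delay)

def dev_rate(r_opt, temp, pp_hours):
    """Daily progress toward floral initiation (fraction of phase per day)."""
    return r_opt * temp_response(temp) * photoperiod_response(pp_hours)

# Hypothetical: 20 days emergence-to-floral-initiation under optimal conditions
r_opt = 1 / 20
print(f"rate at 30 C, 12 h: {dev_rate(r_opt, 30, 12):.4f}")
print(f"rate at 30 C, 15 h: {dev_rate(r_opt, 30, 15):.4f}")
```

In this formulation, genotypic differences reduce to differences in Ropt and in the photoperiod parameters, which is how the abstract explains most of the observed genotype effects.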

Relevance:

100.00%

Publisher:

Abstract:

Since their release over 100 years ago, camels have spread across central Australia and increased in number. Increasingly, they are being seen as a pest, with observed impacts from overgrazing and damage to infrastructure such as fences. Irregular aerial surveys since 1983 and an interview-based survey in 1966 suggest that camels have been increasing at close to their maximum rate. A comparison of three models of population growth fitted to these, albeit limited, data suggests that the Northern Territory population has indeed been growing at an annual exponential rate of r = 0.074, or 8% per year, with little evidence of a density-dependent brake. A stage-structured model using life history data from a central Australian camel population suggests that this rate approximates the theoretical maximum. Elasticity analysis indicates that adult survival is by far the biggest influence on the rate of increase and that a 9% reduction in survival from 96% is needed to stop the population growing. In contrast, at least 70% of mature females would need to be sterilised to have a similar effect. In a benign environment, a population of large mammals such as camels is expected to grow exponentially until close to carrying capacity. This will frustrate control programs, because the longer that culling or harvesting effort is delayed, the more animals will need to be removed each year to achieve zero growth. A population projection for 2008 suggests ~10 500 animals need to be harvested across the Northern Territory. Current harvests are well short of this. The ability of commercial harvesting to control camel populations in central Australia will depend on the value of animals, access to animals and the presence of alternative species to harvest when camels are at low density.
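The arithmetic behind these figures is straightforward: a population of size N growing at exponential rate r adds roughly N*(e^r - 1) animals per year, which is the annual removal needed for zero growth. The 2008 population size below is back-derived from the abstract's ~10 500 harvest figure, so it is approximate rather than a figure taken from the paper:

```python
# Exponential-growth arithmetic for the camel-control figures.
# r is from the paper; N is an approximate back-calculation.
import math

r = 0.074                        # annual exponential rate of increase
annual_multiplier = math.exp(r)  # per-year growth factor (~8% per year)
doubling_time = math.log(2) / r  # years for the population to double

N = 140_000  # approximate NT population implied by the harvest figure
removals_for_zero_growth = N * (annual_multiplier - 1)

print(f"annual growth factor:    {annual_multiplier:.3f}")
print(f"doubling time:           {doubling_time:.1f} years")
print(f"removals needed per year: {removals_for_zero_growth:,.0f}")
```

The doubling time of under a decade is what makes delay so costly: each year of inaction raises the absolute number of removals required for zero growth.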

Relevance:

100.00%

Publisher:

Abstract:

1. Weed eradication efforts often must be sustained for long periods owing to the existence of persistent seed banks, among other factors. Decision makers need to consider both the amount of investment required and the period over which investment must be maintained when determining whether to commit to (or continue) an eradication programme. However, a basis for estimating eradication programme duration from simple data has been lacking. Here, we present a stochastic dynamic model that can provide such estimates. 2. The model is based upon the rates of progression of infestations from the active to the monitoring state (i.e. no plants detected for at least 12 months), the rates of reversion of infestations from the monitoring to the active state, and the frequency distribution of time since last detection for all infestations. Isoquants that illustrate the combinations of progression and reversion parameters corresponding to eradication within different time frames are generated. 3. The model is applied to ongoing eradication programmes targeting branched broomrape Orobanche ramosa and chromolaena Chromolaena odorata. The minimum periods in which eradication could potentially be achieved were 22 and 23 years, respectively. On the basis of programme performance until 2008, however, eradication is predicted to take considerably longer for both species (on average, 62 and 248 years, respectively). Performance of the branched broomrape programme could be best improved by reducing rates of reversion to the active state; for chromolaena, boosting rates of progression to the monitoring state is more important. 4. Synthesis and applications. Our model for estimating weed eradication programme duration, which captures critical transitions between a limited number of states, is readily applicable to any weed. A particular strength of the method lies in its minimal data requirements. 
These comprise estimates of maximum seed persistence and infested area, plus consistent annual records of the detection (or otherwise) of the weed in each infestation. This work provides a framework for identifying where improvements in management are needed and a basis for testing the effectiveness of alternative tactics. If adopted, our approach should help improve decision making with regard to eradication as a management strategy.
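A toy version of the two-state idea described above can be simulated directly: each infestation is either active or in monitoring, progresses to monitoring with annual probability p, reverts with probability q, and eradication is declared once every infestation has been clear for longer than the seed bank persists. The rates, infestation count and seed persistence below are illustrative, not the fitted programme values:

```python
# Toy state-transition simulation of eradication programme duration.
# All parameter values are illustrative assumptions.
import random

def years_to_eradication(n_infestations, p, q, seed_persistence, rng):
    """Simulate years until all infestations exceed seed persistence
    in consecutive monitoring years."""
    years_clear = [0] * n_infestations  # consecutive years in monitoring
    year = 0
    while min(years_clear) <= seed_persistence:
        year += 1
        for i in range(n_infestations):
            if years_clear[i] == 0:            # active infestation
                if rng.random() < p:
                    years_clear[i] = 1         # progresses to monitoring
            elif rng.random() < q:
                years_clear[i] = 0             # reverts to active
            else:
                years_clear[i] += 1            # another clear year
    return year

rng = random.Random(1)
runs = [years_to_eradication(20, p=0.3, q=0.1, seed_persistence=10, rng=rng)
        for _ in range(200)]
print(f"mean programme duration: {sum(runs) / len(runs):.0f} years")
```

Even this toy model reproduces the qualitative behaviour in the abstract: lowering the reversion rate q or raising the progression rate p shortens the expected programme, and the duration can greatly exceed seed persistence when either rate is unfavourable.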

Relevance:

100.00%

Publisher:

Abstract:

Echinochloa colona is the most common grass weed of summer fallows in the grain-cropping systems of the subtropical region of Australia. Glyphosate is the most commonly used herbicide for summer grass control in fallows in this region. The world's first population of glyphosate-resistant E. colona was confirmed in Australia in 2007 and, since then, >70 populations have been confirmed to be resistant in the subtropical region. The efficacy of alternative herbicides on glyphosate-susceptible populations was evaluated in three field experiments and on both glyphosate-susceptible and glyphosate-resistant populations in two pot experiments. The treatments were knockdown and pre-emergence herbicides that were applied as a single application (alone or in a mixture) or as part of a sequential application to weeds at different growth stages. Glyphosate at 720 g ai ha−1 provided good control of small glyphosate-susceptible plants (pre- to early tillering), but was not always effective on larger susceptible plants. Paraquat was effective and the most reliable when applied at 500 g ai ha−1 on small plants, irrespective of the glyphosate resistance status. The sequential application of glyphosate followed by paraquat provided 96–100% control across all experiments, irrespective of the growth stage, and the addition of metolachlor and metolachlor + atrazine to glyphosate or paraquat significantly reduced subsequent emergence. Herbicide treatments have been identified that provide excellent control of small E. colona plants, irrespective of their glyphosate resistance status. These tactics of knockdown herbicides, sequential applications and pre-emergence herbicides should be incorporated into an integrated weed management strategy in order to greatly improve E. colona control, reduce seed production by the sprayed survivors and to minimize the risk of the further development of glyphosate resistance.

Relevance:

100.00%

Publisher:

Abstract:

Concepts of agricultural sustainability and possible roles of simulation modelling in characterising sustainability were explored by conducting, and reflecting on, a sustainability assessment of rain-fed wheat-based systems in the Middle East and North Africa region. We designed a goal-oriented, model-based framework using the cropping systems model Agricultural Production Systems sIMulator (APSIM). For the assessment, valid (rather than true or false) sustainability goals and indicators were identified for the target system. System-specific vagueness was depicted in sustainability polygons (a system property derived from highly quantitative data) and denoted using descriptive quantifiers. Diagnostic evaluations of alternative tillage practices demonstrated the utility of the framework to quantify key bio-physical and chemical constraints to sustainability. Here, we argue that sustainability is a vague, emergent system property of often wicked complexity that arises out of more fundamental elements and processes. A 'wicked concept of sustainability' acknowledges the breadth of the human experience of sustainability, which cannot be internalised in a model. To achieve socially desirable sustainability goals, our model-based approach can inform reflective evaluation processes that connect with the needs and values of agricultural decision-makers. Hence, it can help to frame meaningful discussions, from which actions might emerge.

Relevance:

100.00%

Publisher:

Abstract:

Pteropid bats or flying-foxes (Chiroptera: Pteropodidae) are the natural host of Hendra virus (HeV), which sporadically causes fatal disease in horses and humans in eastern Australia. While there is strong evidence that urine is an important infectious medium that likely drives bat-to-bat and bat-to-horse transmission, there is uncertainty about the relative importance of alternative routes of excretion such as nasal and oral secretions, and faeces. Identifying the potential routes of HeV excretion in flying-foxes is important for effectively mitigating equine exposure risk at the bat-horse interface, and for determining transmission rates in host-pathogen models. The aim of this study was to identify the major routes of HeV excretion in naturally infected flying-foxes, and secondarily, to identify between-species variation in excretion prevalence. A total of 2840 flying-foxes from three of the four Australian mainland species (Pteropus alecto, P. poliocephalus and P. scapulatus) were captured and sampled at multiple roost locations in the eastern states of Queensland and New South Wales between 2012 and 2014. A range of biological samples (urine and serum, and urogenital, nasal, oral and rectal swabs) were collected from anaesthetized bats and tested for HeV RNA using a qRT-PCR assay targeting the M gene. Forty-two P. alecto (n = 1410) had HeV RNA detected in at least one sample, yielding a total of 78 positive samples, at an overall detection rate of 1.76% across all samples tested in this species (78/4436). The rate of detection, and the amount of viral RNA, was highest in urine samples (urine > serum and packed haemocytes > faecal > nasal > oral), identifying urine as the most plausible source of infection for flying-foxes and for horses. Detection in a urine sample was more efficient than detection in urogenital swabs, identifying the former as the preferred diagnostic sample. 
The detection of HeV RNA in serum is consistent with haematogenous spread, and with hypothesised latency and recrudescence in flying-foxes. There were no detections in P. poliocephalus (n = 1168 animals; n = 2958 samples) or P. scapulatus (n = 262 animals; n = 985 samples), suggesting (consistent with other recent studies) that these species are epidemiologically less important than P. alecto in HeV infection dynamics. The study is unprecedented in terms of its individual-animal approach, large sample size, and use of a molecular assay to directly determine infection status. These features provide a high level of confidence in the veracity of our findings, and a sound basis from which to more precisely target equine risk mitigation strategies.