75 results for manage
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Approaches to manage for the sustainable use of natural and cultural resources in a landscape can have many different designs. One design is adaptive collaborative landscape management (ACLM), where research providers and users work closely together on projects to develop resources while adaptively managing to sustain or maintain landscapes in the long term. We propose that collaborative projects are more useful for achieving outcomes than integrative projects, where participants merely join their separate contributions. To foster collaborative research projects to adaptively manage landscapes in northern Australia, a Tropical Savannas Cooperative Research Centre (TSCRC) was established in 1995. The TSCRC is a joint venture of major organizations involved in research and land management. This paper is our perspective on the four most important 'lessons learned' after using an ACLM-type approach for over 10 years. We learnt that collaboration (working in combination), not necessarily integration (combining parts into a whole), achieved sustainable outcomes. We found that integration across culturally diverse perspectives seldom achieved sustainable solutions because it devalued the position of the less empowered participants. In addition, positive outcomes were achieved when participants developed trust and respect for each other by embracing and respecting their differences and by sharing unifying concepts such as savanna health. Another lesson learned was that a collaborative organization must act as an honest broker by resisting advocacy of one viewpoint over another. Finally, we recognized the importance of strongly investing in communication and networking so that people could adaptively learn from one another's experiences, understand each other's challenges and respect each other's choices. Our experience confirms the usefulness of the ACLM approach and highlights its role in the process of sustaining healthy landscapes.
Abstract:
Because of the variable and changing environment, advisors and farmers are seeking systems that provide risk management support at a number of time scales. The Agricultural Production Systems Research Unit, Toowoomba, Australia has developed a suite of tools to assist advisors and farmers to better manage risk in cropping. These tools range from simple rainfall analysis tools (Rainman, HowWet, HowOften) through crop simulation tools (WhopperCropper and YieldProphet) to the most complex, APSFarm, a whole-farm analysis tool. Most are derivatives of the APSIM crop model. These tools encompass a range of complexity and potential benefit to both the farming community and government policy. This paper describes the development and usage of two specific products: WhopperCropper and APSFarm. WhopperCropper facilitates simulation-aided discussion of growers' exposure to risk when comparing alternative crop input options. The user can readily generate 'what-if' scenarios that separate the major influences whilst holding other factors constant. Interactions of the major inputs can also be tested. A manager can examine the effects of input levels (and Southern Oscillation Index phase) to broadly determine input levels that match their attitude to risk. APSFarm has been used to demonstrate that management changes can have different effects in short and long time periods. It can be used to test local advisors' and farmers' knowledge and experience of their desired rotation system. This study has shown that crop type has a larger influence than more conservative minimum soil water triggers in the long term. However, in short-term dry periods, minimum soil water triggers and maximum area of the various crops can give significant financial gains.
Abstract:
The Wambiana grazing trial started in 1997 to test and develop sustainable and profitable grazing strategies to manage for rainfall variability. It is important that this trial continue, as the results are still short-term relative to rainfall cycles and significant treatment changes are still occurring. This new proposal will maintain baseline treatments but will modify others based on lessons learned from the trial to date. It builds on treatment differences and evidence collected over the last 12 years to deliver evidence-based guidelines and principles for sustainable and productive management. The trial also links to other projects modelling water quality, climate change, methane emissions and soil C sequestration on grazing lands.
Abstract:
This is part of a GRDC funded project led by Dr Jeremy Whish of CSIRO Ecosystem Sciences. The project aims to build a root-lesion nematode module into the crop growth simulation program APSIM (Agricultural Production Systems Simulator). This will utilise existing nematode and crop data from field, glasshouse and laboratory research led by Dr John Thompson. New data will be collected to validate and extend the model.
Abstract:
BACKGROUND The emergence of high levels of resistance in Cryptolestes ferrugineus (Stephens) in recent years threatens the sustainability of phosphine, a key fumigant used worldwide to disinfest stored grain. We aimed to develop robust fumigation protocols that could be used in a range of practical situations to control this resistant pest. RESULTS Values of the lethal time to kill 99.9% (LT99.9, in days) of mixed-age populations, containing all life stages, of a susceptible and a strongly resistant C. ferrugineus population were established at three phosphine concentrations (1.0, 1.5 and 2.0 mg L−1) and three temperatures (25, 30 and 35 °C). Multiple linear regression analysis revealed that phosphine concentration and temperature both contributed significantly to the LT99.9 of a population (P < 0.003, R2 = 0.92), with concentration being the dominant variable, accounting for 75.9% of the variation. Across all concentrations, LT99.9 of the strongly resistant C. ferrugineus population was longest at the lowest temperature and shortest at the highest temperature. For example, 1.0 mg L−1 of phosphine is required for 20, 15 and 15 days, 1.5 mg L−1 for 12, 11 and 9 days and 2.0 mg L−1 for 10, 7 and 6 days at 25, 30 and 35 °C, respectively, to achieve 99.9% mortality of the strongly resistant C. ferrugineus population. We also observed that phosphine concentration is inversely proportional to the fumigation period required for population extinction of this pest. CONCLUSION The fumigation protocols developed in this study will be used in recommending changes to the currently registered rates of phosphine in Australia towards management of strongly resistant C. ferrugineus populations, and can be repeated in any country where this type of resistance appears.
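The concentration-temperature relationship can be illustrated by refitting a plane to the nine LT99.9 values quoted above for the strongly resistant population. This is a sketch only: it uses just those nine data points, so its coefficients and fit statistic are not the published model (which was fitted across populations and reported R2 = 0.92).

```python
import numpy as np

# LT99.9 (days) for the strongly resistant C. ferrugineus population,
# as quoted in the abstract: (phosphine mg/L, temperature degC, days).
data = np.array([
    (1.0, 25, 20), (1.0, 30, 15), (1.0, 35, 15),
    (1.5, 25, 12), (1.5, 30, 11), (1.5, 35,  9),
    (2.0, 25, 10), (2.0, 30,  7), (2.0, 35,  6),
])
X = np.column_stack([np.ones(len(data)), data[:, 0], data[:, 1]])
y = data[:, 2]

# Ordinary least squares: days = b0 + b1*concentration + b2*temperature
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r2 = 1 - resid @ resid / np.sum((y - y.mean()) ** 2)
```

On these nine points the fit gives roughly -9 days per mg L−1 of phosphine and -0.4 days per °C, consistent with the abstract's finding that concentration is the dominant variable.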
Abstract:
Most Australian banana production occurs on the north-eastern tropical coast between latitudes 15-18°S, and can experience summer cyclone activity. Damage from severe tropical cyclones has a serious impact on banana-based livelihoods. The most significant impacts include immediate loss of production and income for several months, the region-wide synchronization of cropping and the expense of rehabilitating affected plantations. Severe tropical cyclones have directly affected the main production region twice in recent years: Tropical Cyclone (TC) Larry (Category 4) in March 2006 and TC Yasi (Category 5) in February 2011. Based on TC Larry experiences, pre- and post-cyclone farm practices were developed to reduce these impacts in future cyclonic events. The main pre-cyclone farm practice focused on maintaining production units and an earlier return to fruit production by partially or completely removing the plant canopy to reduce wind resistance. Post-cyclone farm practices focused on managing the industry-wide crop synchronization using crop timing techniques to achieve a staggered return to cropping by scheduling production to provide continuous fruit supply. With TC Yasi in 2011, some banana producers implemented these practices, allowing them to examine their effectiveness in reducing cyclonic impacts. Additional research and development activities were conducted to refine our understanding of their effectiveness and improve their application for future cyclonic events. Based on these activities and farm-based observations, practice-based management strategies can be developed to help reduce the impact of severe tropical cyclones in the future. Canopy removal maintained banana plants as productive units, and provided earlier but smaller bunches, generating earlier-than-expected income. Queensland producers expressed willingness to adopt canopy removal for future cyclone threats where appropriate, despite its labor-intensiveness.
Mechanization would allow larger scale adoption. Implementing a staggered cropping program successfully achieved a consistent, continuous fruit supply after a cyclone impact. Both techniques should be applicable to other cyclone-prone regions.
Abstract:
This paper reports an experiment undertaken to examine the impact of burning in spring, together with reduced grazing pressure, on the dynamics of H. contortus and Aristida spp. in H. contortus pasture in south-eastern Queensland. The overall results indicate that spring burning in combination with reduced grazing pressure had no marked effect on the density of either grass species. This was attributed to 2 factors. Firstly, extreme drought conditions restricted any increase in H. contortus seedling establishment despite the presence of an adequate soil seed bank prior to summer; and secondly, some differences occurred in the response to fire among the Aristida species present at the study site. This study concluded that it is necessary to identify appropriate taxonomic units within the Aristida genus and that, where appropriate, burning in spring to manage pasture composition should be conducted under favorable rainfall conditions using seasonal forecasting indicators such as the Southern Oscillation Index.
Abstract:
A study was undertaken from 2004 to 2007 to investigate factors associated with decreased efficacy of metalaxyl to manage damping-off of cucumber in Oman. A survey over six growing seasons showed that growers lost up to 14.6% of seedlings following application of metalaxyl. No resistance to metalaxyl was found among Pythium isolates. Damping-off disease in the surveyed greenhouses followed two patterns. In most (69%) greenhouses, seedling mortality was found to occur shortly after transplanting and decrease thereafter (Phase-I). However, a second phase of seedling mortality (Phase-II) appeared 9-14 d after transplanting in about 31% of the surveyed greenhouses. Analysis of the rate of biodegradation of metalaxyl in six greenhouses indicated a significant increase in the rate of metalaxyl biodegradation in greenhouses that encountered Phase-II damping-off. The half-life of metalaxyl dropped from 93 d in soil that had received no previous metalaxyl treatment to 14 d in soil that had received metalaxyl for eight consecutive seasons, indicating an enhanced rate of metalaxyl biodegradation after repeated use. Multiple applications of metalaxyl helped reduce the appearance of Phase-II damping-off. This appears to be the first report of rapid biodegradation of metalaxyl in greenhouse soils and the first report of its association with appearance of a second phase of mortality in cucumber seedlings.
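The practical effect of the half-life change can be sketched with a simple decay calculation. Assuming first-order degradation kinetics (our assumption; the abstract reports half-lives but not the kinetic model), the half-lives quoted above imply very different residue levels two weeks after application:

```python
import math

def remaining_fraction(t_days, half_life_days):
    """Fraction of fungicide remaining after t_days, assuming
    first-order decay: C(t) = C0 * exp(-k*t), with k = ln(2)/half-life."""
    k = math.log(2) / half_life_days  # decay rate constant (1/day)
    return math.exp(-k * t_days)

# Half-lives quoted in the abstract: 93 d (no prior metalaxyl use)
# vs 14 d (eight consecutive seasons of use).
frac_untreated = remaining_fraction(14, 93)  # roughly 0.90
frac_adapted = remaining_fraction(14, 14)    # exactly 0.50
```

Under this assumption, previously untreated soil still holds about 90% of the applied metalaxyl when Phase-II mortality begins (9-14 d), while soil with enhanced biodegradation holds only about half, consistent with the shortened protection window the study describes.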
Abstract:
In dryland agricultural systems of the subtropical, semi-arid region of north-eastern Australia, water is the most limiting resource. Crop productivity depends on the efficient use of rainfall and available water stored in the soil during fallow. Agronomic management practices including a period of fallow, stubble retention, and reduced tillage enhance reserves of soil water. However, access to stored water in these soils may be restricted by the presence of growth-limiting conditions in the rooting zone of the crop. These have been termed subsoil constraints. Subsoil constraints may include compacted or gravel layers (physical), sodicity, salinity, acidity, nutrient deficiencies, presence of toxic elements (chemical) and low microbial activity (biological). Several of these constraints may occur together in some soils. Farmers have often not been able to obtain the potential yield determined by their prevailing climatic conditions in the marginal rainfall areas of the northern grains region. In the past, the adoption of soil management practices had been largely restricted to the top 100 mm soil layer. Exploitation of the subsoil as a source of water and nutrients has largely been overlooked. The key towards realising potential yields would be to gain a better understanding of subsoils and their limitations, then develop options to manage them practically and economically. Due to the complex nature of their causal factors, a combination of management approaches, rather than individual options, is required to combat these constraints for sustainable crop production, management of natural resources and avoidance of environmental damage.
Abstract:
Brassicaceae plants have the potential, as part of an integrated approach, to replace fumigant nematicides, provided that the biofumigation response following their incorporation is not offset by reproduction of plant-parasitic nematodes on their roots. Forty-three Brassicaceae cultivars were screened in a pot trial for their ability to reduce reproduction of three root-knot nematode isolates from north Queensland, Australia: Meloidogyne arenaria (NQ1), M. javanica (NQ2) and M. arenaria race 2 (NQ5/7). No cultivar was found to consistently reduce nematode reproduction relative to forage sorghum, the current industry standard, although a commercial fodder radish (Raphanus sativus) and a white mustard (Sinapis alba) line were consistently as resistant to the formation of galls as forage sorghum. A second pot trial screened five commercially available Brassicaceae cultivars, selected for their biofumigation potential, for resistance to two nematode species, M. javanica (NQ2) and M. arenaria (NQ5/7). The fodder radish cv. Weedcheck was found to be as resistant as forage sorghum to nematode reproduction. A multivariate cluster analysis using the resistance measurements, gall index, nematode number per g of root and multiplication for two nematode species (NQ2 and NQ5/7) confirmed the similarity in resistance between the radish cultivar and forage sorghum. A field trial confirmed the resistance of the fodder radish cv. Weedcheck, with a similar reduction in the number of Meloidogyne spp. juveniles recovered from the roots 8 weeks after planting. The use of fodder radish cultivars as biofumigation crops to manage root-knot nematodes in tropical vegetable production systems deserves further investigation.
Abstract:
The distribution and nutritional profiles of sub-tidal seagrasses from the Torres Strait were surveyed and mapped across an area of 31,000 km2. Benthic sediment composition, water depth, seagrass species type and nutrients were sampled at 168 points selected in a stratified representative pattern. Eleven species of seagrass were present at 56 (33.3%) of the sample points. Halophila spinulosa, Halophila ovalis, Cymodocea serrulata and Syringodium isoetifolium were the most common species and these were nutrient profiled. Sub-tidal seagrass distribution (and associated seagrass nutrient concentrations) was generally confined to northern-central and south-western regions of the survey area.
Abstract:
Prior to the 1980s, arthropod pest control in Queensland strawberries was based entirely on calendar sprays of insecticides (mainly endosulfan, trichlorfon, dimethoate and carbaryl) and a miticide (dicofol). These chemicals were applied frequently and spider mite outbreaks occurred every season. The concept of integrated pest management (IPM) had not been introduced to growers, and the suggestion that an alternative to the standard chemical pest control recipe might be available was ignored. Circumstances changed when the predatory mite, Phytoseiulus persimilis Athias-Henriot, became available commercially in Australia, providing the opportunity to manage spider mites, the major pests of strawberries, with an effective biological agent. Trials conducted on commercial farms in the early 1980s indicated that a revolution in strawberry pest management was at hand, but the industry generally remained sceptical and afraid to adopt the new strategy. Lessons are learnt from disasters and the consequent monetary loss that ensues, and in 1993 such an event, relating to ineffective spider mite control, spawned the revolution we had to have. Farm-oriented research and evolving grower perspectives have resulted in the acceptance of biological control of spider mites using Phytoseiulus persimilis and the 'pest in first' technique, and it now forms the basis of an IPM system that is used on more than 80% of the Queensland strawberry crop.
Abstract:
Recent incidents of mycotoxin contamination (particularly aflatoxins and fumonisins) have demonstrated a need for an industry-wide management system to ensure Australian maize meets the requirements of all domestic users and export markets. Results of recent surveys are presented, demonstrating overall good conformity with nationally accepted industry marketing standards but with occasional samples exceeding these levels. This paper describes mycotoxin-related hazards inherent in the Australian maize production system and a methodology combining good agricultural practices and the hazard analysis critical control point framework to manage risk.
Abstract:
Single or multiple factors implicated in subsoil constraints, including salinity, sodicity, and phytotoxic concentrations of chloride (Cl), are present in many Vertosols, including those occurring in Queensland, Australia. The variable distribution and the complex interactions that exist between these constraints limit the agronomic or management options available for soils with these subsoil constraints. The identification of crops and cultivars adapted to these adverse subsoil conditions and/or able to exploit subsoil water may be an option to maintain productivity of these soils. We evaluated the relative performance of 5 winter crop species, in terms of grain yields, nutrient concentration, and ability to extract soil water, grown on soils with various levels and combinations of subsoil constraints in 19 field experiments over 2 years. Subsoil constraints were measured by levels of soil Cl, electrical conductivity of the saturation extract (ECse), and exchangeable sodium percentage (ESP). Increasing levels of subsoil constraints significantly decreased maximum depth of water extraction, grain yield, and plant-available water capacity for all 5 crops, and more so for chickpea and durum wheat than for bread wheat, barley, or canola. Increasing soil Cl levels had a greater restricting effect on water availability than did ECse and ESP. We developed empirical relationships between soil Cl, ECse, and ESP and crop lower limit (CLL) for estimating subsoil water extraction by the 5 winter crops. However, the presence of gypsum influenced the ability to predict CLL based on the levels of ECse. Stronger relationships between apparent unused plant-available water (CLL - LL15; LL15 is the lower limit at -1.5 MPa) and soil Cl concentrations than ESP or ECse suggested that the presence of high Cl in these soils most likely inhibited subsoil water extraction by the crops.
This was supported by increased sodium (Na) and Cl concentration with a corresponding decrease in calcium (Ca) and potassium (K) in young mature leaf of bread wheat, durum wheat, and chickpea with increasing levels of subsoil constraints. Of the 2 ions, Na and Cl, the latter appears to be more damaging than the former, resulting in plant dieback and reduced grain yields.