29 results for Rotation capacity
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Reduced nitrogen (N) supply in many soils of southern Queensland, cropped exhaustively with cereals over many decades, has been the focus of much research aimed at avoiding declines in the profitability and sustainability of farming systems. A 45-month period of mixed grass (purple pigeon grass, Setaria incrassata Stapf; Rhodes grass, Chloris gayana Kunth.) and legume (lucerne, Medicago sativa L.; annual medics, M. scutellata (L.) Mill. and M. truncatula Gaertn.) pasture was one of several options compared on a fertility-depleted Vertosol at Warra, southern Queensland, for improving grain yields or increasing grain protein concentration of subsequent wheat crops. The objectives of the study were to measure the productivity of a mixed grass and legume pasture grown over 45 months (cut and removed over 36 months) and its effects on yield and protein concentrations of the following wheat crops. Pasture production (DM t/ha) and aboveground plant N yield (kg/ha) for the grass, legume (including a small amount of weeds) and total components of pasture responded linearly to total rainfall over the duration of each of 3 pastures sown in 1986, 1987 and 1988. Averaged over the 3 pastures, each 100 mm of rainfall resulted in 0.52 t/ha of grass, 0.44 t/ha of legume and 0.97 t/ha of total pasture DM, with little variation between the 3 pastures. Aboveground plant N yield of the 3 pastures ranged from 17.2 to 20.5 kg/ha per 100 mm rainfall. Aboveground legume N in response to total rainfall was similar (10.6-13.2 kg/ha per 100 mm rainfall) across the 3 pastures in spite of very different populations of legumes and grasses at establishment. Aboveground grass N yield was 5.2-7.0 kg/ha per 100 mm rainfall. In most wheat crops following pasture, yields were similar to those of unfertilised wheat, except in 1990 and 1994, when grain yields were significantly higher and similar to those of continuous wheat fertilised with 75 kg N/ha.
In contrast, grain protein concentrations of most wheat crops following pasture responded positively, being substantially higher than those of unfertilised wheat but similar to those of wheat fertilised with 75 kg N/ha. Grain protein averaged over all years of assay was increased by 25-40% compared with that of unfertilised wheat. Stored water supplies after pasture were < 134 mm (< 55% of plant-available water capacity); for most assay crops, water storages were 67-110 mm, an equivalent wet soil depth of only 0.3-0.45 m. Thus, the crop assays of pasture benefits were limited by low water supply to the wheat crops. Moreover, the severity of common root rot in wheat crops was not reduced by the pasture-wheat rotation.
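The linear rainfall responses reported above can be expressed as a simple per-100 mm calculation. The sketch below is illustrative only: the coefficient table and function names are not from the study, and only the averaged DM coefficients (0.52, 0.44 and 0.97 t/ha per 100 mm for grass, legume and total) are taken from the abstract.

```python
# Hypothetical sketch of the reported linear pasture response to rainfall.
# Coefficients (t/ha DM per 100 mm rainfall) are the 3-pasture averages
# quoted in the abstract; the dict and function are illustrative names.

DM_PER_100MM = {"grass": 0.52, "legume": 0.44, "total": 0.97}

def pasture_dm(rainfall_mm: float, component: str = "total") -> float:
    """Estimate pasture dry matter (t/ha) from total rainfall (mm)."""
    return DM_PER_100MM[component] * rainfall_mm / 100.0

# e.g. 600 mm of rainfall over a pasture's duration:
print(round(pasture_dm(600, "grass"), 2))   # 3.12 t/ha
print(round(pasture_dm(600, "legume"), 2))  # 2.64 t/ha
print(round(pasture_dm(600, "total"), 2))   # 5.82 t/ha
```

The same per-100 mm form applies to the N yields (e.g. 17.2-20.5 kg N/ha per 100 mm for total aboveground N), with a range rather than a single coefficient.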
Abstract:
The impact of cropping histories (sugarcane, maize and soybean), tillage practices (conventional tillage and direct drill) and fertiliser N in the plant and 1st ratoon (1R) crops of sugarcane was examined in field trials at Bundaberg and Ingham. Average yields at Ingham (Q200) and Bundaberg (Q151) were quite similar in both the plant crop (83 t/ha and 80 t/ha, respectively) and the 1R crop (89 t/ha vs 94 t/ha), with only minor treatment effects on CCS at each site. Cane yield responses to tillage, break history and N fertiliser varied significantly between sites. The soybean fallow at Ingham produced a 27% yield increase in the plant crop over continuous cane, but there were no clear break effects at Bundaberg, possibly due to a complex of pathogenic nematodes that responded differently to the soybean and maize breaks. There was no carryover benefit of the soybean break into the 1R crop at Ingham, while at Bundaberg the maize break produced a 15% yield advantage over soybeans and continuous cane. The Ingham site recorded positive responses to N fertiliser addition in both the plant (20% yield increase) and 1R (34% yield increase) crops, but there was negligible carryover benefit from plant-crop N in the 1R crop, and no evidence of a reduced N response after a soybean rotation. By contrast, the Bundaberg site showed no N response in any history in the plant crop, and only a small (5%) yield increase with N applied in the 1R crop. There was again no evidence of a reduced N response in the 1R crop after a soybean fallow. There were no significant effects of tillage on cane yields at either site, although there were some minor interactions between tillage, breaks and N management in the 1R crop at both sites. Crop N contents at Bundaberg were more than 3 times those recorded at Ingham in both the plant and 1R crops, with N concentrations in millable stalk at Ingham suggesting N deficiencies in all treatments.
There was negligible additional N recovered in crop biomass from N fertiliser application or soybean residues at the Ingham site. Additional N was recovered in crop biomass in response to N fertiliser and soybean breaks at Bundaberg, but the effects were small and fertiliser use efficiency was poor. Loss pathways could not be quantified, but denitrification or losses in runoff were the likely causes at Ingham, while leaching predominated at Bundaberg. The results highlight the complexity involved in developing sustainable farming systems for contrasting soil types and climatic conditions. A better understanding of key sugarcane pathogens and their host ranges, as well as improved capacity to predict in-crop N mineralisation, will be key factors in future improvements to sugarcane farming systems.
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils that previously supported native vegetation have reduced soil nitrogen supply, and consequently decreased cereal grain yields and grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat, as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with total October–September rainfall as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield; these were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994, but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by an average of 74 kg/ha (range 9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was significant depression in grain yield in 2 of the 9 seasons (1993 and 1995), attributed to soil moisture depletion and/or low growing-season rainfall.
Consequently, the overall yield responses were lower than those from 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, or from 2-year medic–wheat or chickpea–wheat rotations, although grain protein concentrations were higher following lucerne. The incidence and severity of common root rot of wheat, a soilborne disease caused by Bipolaris sorokiniana, were generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applied, as severity was significantly correlated with plant-available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity can be maintained where short-duration lucerne leys are grown in rotation with wheat, the benefit in this experiment being mainly due to nitrogen accretion.
Abstract:
Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations.
Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants, including the effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3°C) was likely to result in a decrease in forage production for most rangeland locations (e.g. -21% calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease or increase in rainfall, respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3°C temperature increase. The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature.
As a consequence, we suggest that public policy should have regard for the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed, and this important task remains a challenge for rangeland and climate systems science.
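The opposing effects quoted in the review can be tabulated for a quick sanity check. This is illustrative arithmetic only: the percentage changes are the unweighted averages reported in the abstract, the dictionary keys are invented names, and combining effects additively is a simplifying assumption that the review itself cautions against.

```python
# Reported average changes (%) in simulated forage production across
# ~90 rangeland locations; key names are illustrative, not from the review.
scenario_effect = {
    "temp_plus_3C": -21,                   # +3°C alone
    "temp_plus_3C_rain_minus_10pct": -33,  # +3°C with 10% less rainfall
    "temp_plus_3C_rain_plus_10pct": -9,    # +3°C with 10% more rainfall
    "co2_350_to_650ppm": +26,              # CO2 fertilisation effect
}

# Under a naive additive assumption, the CO2 benefit roughly offsets the
# 3°C warming penalty, as the review notes ("approximately equivalent"):
net = scenario_effect["temp_plus_3C"] + scenario_effect["co2_350_to_650ppm"]
print(net)  # 5 (% change), i.e. close to no net effect
```

The near-cancellation of two large, uncertain terms is exactly why the review stresses the uncertainties rather than the net figure.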
Abstract:
Understanding the effects of different types and quality of data on bioclimatic modelling predictions is vital to ascertaining the value of existing models, and to improving future models. Bioclimatic models were constructed using the CLIMEX program, using different data types – seasonal dynamics, geographic (overseas) distribution, and a combination of the two – for two biological control agents for the major weed Lantana camara L. in Australia. The models for one agent, Teleonemia scrupulosa Stål (Hemiptera: Tingidae), were based on a higher quality and quantity of data than the models for the other agent, Octotoma scabripennis Guérin-Méneville (Coleoptera: Chrysomelidae). Predictions of the geographic distribution for Australia showed that T. scrupulosa models exhibited greater accuracy, with a progressive improvement from seasonal dynamics data, to the model based on overseas distribution, and finally the model combining the two data types. In contrast, O. scabripennis models were of low accuracy, and showed no clear trends across the various model types. These case studies demonstrate the importance of high quality data for developing models, and of supplementing distributional data with species seasonal dynamics data wherever possible. Seasonal dynamics data allow the modeller to focus on the species response to climatic trends, while distributional data enable easier fitting of stress parameters by restricting the species envelope to the described distribution. It is apparent that CLIMEX models based on low quality seasonal dynamics data, together with a small quantity of distributional data, are of minimal value in predicting the spatial extent of species distribution.
Abstract:
Seeds in the field experience wet-dry cycling that is akin to the well-studied commercial process of seed priming in which seeds are hydrated and then re-dried to standardise their germination characteristics. To investigate whether the persistence (defined as in situ longevity) and antioxidant capacity of seeds are influenced by wet-dry cycling, seeds of the global agronomic weed Avena sterilis ssp. ludoviciana were subjected to (1) controlled ageing at 60% relative humidity and 53.5°C for 31 days, (2) controlled ageing then priming, or (3) ageing in the field in three soils for 21 months. Changes in seed viability (total germination), mean germination time, seedling vigour (mean seedling length), and the concentrations of the glutathione (GSH) / glutathione disulphide (GSSG) redox couple were recorded over time. As controlled-aged seeds lost viability, GSH levels declined and the relative proportion of GSSG contributing to total glutathione increased, indicative of a failing antioxidant capacity. Subjecting seeds that were aged under controlled conditions to a wet-dry cycle (to −1 MPa) prevented viability loss and increased GSH levels. Field-aged seeds that underwent numerous wet-dry cycles due to natural rainfall maintained high viability and high GSH levels. Thus wet-dry cycles in the field may enhance seed longevity and persistence coincident with re-synthesis of protective compounds such as GSH.
Abstract:
A review of factors that may impact on the capacity of beef cattle females, grazing semi-extensive to extensive pastures in northern Australia, to conceive, maintain a pregnancy and wean a calf was conducted. Pregnancy and weaning rates have generally been used to measure the reproductive performance of herds. However, this review recognises that reproductive efficiency and the general measures associated with it more effectively describe the economic performance of beef cattle enterprises. More specifically, reproductive efficiency is influenced by (1) pregnancy rate which is influenced by (i) age at puberty; (ii) duration of post-partum anoestrus; (iii) fertilisation failure and (iv) embryo survival; while (2) weight by number of calves per breeding female retained for mating is influenced by (i) cow survival; (ii) foetal survival; and (iii) calf survival; and (3) overall lifetime calf weight weaned per mating. These measures of reproductive efficiency are discussed in depth. Further, a range of infectious and non-infectious factors, namely, environmental, physiological, breed and genetic factors and their impact on these stages of the reproductive cycle are investigated and implications for the northern Australian beef industry are discussed. Finally, conclusions and recommendations to minimise reproductive inefficiencies based on current knowledge are presented.
Abstract:
The impact of three cropping histories (sugarcane, maize and soybean) and two tillage practices (conventional tillage and direct drill) on plant-parasitic and free-living nematodes in the following sugarcane crop was examined in a field trial at Bundaberg. Soybean reduced populations of lesion nematode (Pratylenchus zeae) and root-knot nematode (Meloidogyne javanica) in comparison to previous crops of sugarcane or maize, but increased populations of spiral nematode (Helicotylenchus dihystera) and maintained populations of dagger nematode (Xiphinema elongatum). However, the effect of soybean on P. zeae and M. javanica was no longer apparent 15 weeks after planting sugarcane, while later in the season, populations of these nematodes following soybean were as high as or higher than those following maize or sugarcane. Populations of P. zeae were initially reduced by cultivation but, due to strong resurgence, tended to be higher in conventionally tilled than direct drill plots at the end of the plant crop. Even greater tillage effects were observed with M. javanica and X. elongatum, as nematode populations were significantly higher in conventionally tilled than direct drill plots late in the season. Populations of free-living nematodes in the upper 10 cm of soil were initially highest following soybean, but after 15, 35 and 59 weeks were lower than after sugarcane and contained fewer omnivorous and predatory nematodes. Conventional tillage increased populations of free-living nematodes in soil in comparison to direct drill and was also detrimental to omnivorous and predatory nematodes. These results suggest that crop rotation and tillage not only affect plant-parasitic nematodes directly, but also have indirect effects by impacting on natural enemies that regulate nematode populations. More than 2 million nematodes/m(2) were often present in crop residues on the surface of direct drill plots.
Bacterial-feeding nematodes were predominant in residues early in the decomposition process, but fungal-feeding nematodes predominated after 15 weeks. This indicates that fungi become an increasingly important component of the detritus food web as decomposition proceeds, and that the rate of nutrient cycling decreases with time. Correlations between total numbers of free-living nematodes and mineral N concentrations in crop residues and surface soil suggested that the free-living nematode community may provide an indication of the rate of mineralisation of N from organic matter.
Abstract:
Improving added value and small and medium enterprise (SME) capacity in the utilisation of plantation timber for furniture production in the Jepara region of Indonesia: improving recovery, design, manufacturing, R&D and training capacities.
Abstract:
Project aims to develop diagnostic capacity for laurel wilt and associated ambrosia beetle in Australia.
Abstract:
Aims to build adaptive capacity within Qld's mixed farming (cropping/beef) sector.
Abstract:
Accurate identification of pests is essential for practically all aspects of agricultural development and is critical to the operations of biosecurity that safeguard agricultural integrity and facilitate trade. Diagnostic capability is at the forefront of, and complementary to, activities such as border protection, incursion management, surveillance, and pest and disease certification. The efficiency of a biosecurity system therefore depends largely on the feedback between these activities and diagnostics. Australian scientists will train Thai scientists in diagnostics and surveillance to provide the Thai DOA with skills that will aid in the development of a Thai Diagnostic Network. The skills will be taught using a range of pests, including some of particular biosecurity importance for both Australia and Thailand, such as citrus canker, potato viruses and fruit flies.
Abstract:
Viral diseases of cotton are of economic significance in many parts of the world and several of these remain biosecurity threats to the Australian cotton industry, including Cotton Leaf Roll Virus (CLRV) from South East Asia. The proposed project will result in a greater understanding of the field symptoms of CLRV in Thailand and diagnostic assays used for its detection. I will also determine if the diagnostic assay being developed for Brazilian CLRDV as part of the CRDC project (11-12FRP00062) may also detect Thailand CLRV. It will provide educational opportunities to increase the knowledge base of staff currently working on cotton virus research and in doing so help to protect the Australian cotton industry from incursions of exotic viruses.
Abstract:
A field experiment was established in which an amendment of poultry manure and sawdust (200 t/ha) was incorporated into some plots but not others and then a permanent pasture or a sequence of biomass-producing crops was grown with and without tillage, with all biomass being returned to the soil. After 4 years, soil C levels were highest in amended plots, particularly those that had been cropped using minimum tillage, and lowest in non-amended and fallowed plots, regardless of how they had been tilled. When ginger was planted, symphylans caused severe damage to all treatments, indicating that cropping, tillage and organic matter management practices commonly used to improve soil health are not necessarily effective for all crops or soils. During the rotational phase of the experiment, the development of suppressiveness to three key pathogens of ginger was monitored using bioassays. Results for root-knot nematode (Meloidogyne javanica) indicated that for the first 2 years, amended soil was more suppressive than non-amended soil from the same cropping and tillage treatment, whereas under pasture, the amendment only enhanced suppressiveness in the first year. Suppressiveness was generally associated with higher C levels and enhanced biological activity (as measured by the rate of fluorescein diacetate (FDA) hydrolysis and numbers of free-living nematodes). Reduced tillage also enhanced suppressiveness, as gall ratings and egg counts in the second and third years were usually significantly lower in cropped soils under minimum rather than conventional tillage. Additionally, soil that was not disturbed during the process of setting up bioassays was more suppressive than soil which had been gently mixed by hand. Results of bioassays with Fusarium oxysporum f. sp. 
zingiberi were too inconsistent to draw firm conclusions, but the severity of fusarium yellows was generally higher in fumigated fallow soil than in other treatments, with soil management practices having little impact on disease severity. With regard to Pythium myriotylum, biological factors capable of reducing rhizome rot were present, but were not effective enough to suppress the disease under environmental conditions that were ideal for disease development.
Abstract:
This project evaluated the timber quality, processing and performance characteristics of 19-year-old Eucalyptus cloeziana (Gympie messmate) and 15-year-old Eucalyptus pellita (red mahogany). Studies were undertaken to evaluate wood and mechanical properties, accelerated seasoning, and veneer and plywood production. Above-ground and in-ground durability field tests were established at three locations in Queensland. Ground proximity tests and L-joint tests were installed to gather data applicable to above-ground, weather-exposed end-use applications, and stake tests were installed to gather data applicable to in-ground, weather-exposed end-use applications.