35 results for soil factors

in eResearch Archive - Queensland Department of Agriculture


Relevance:

70.00%

Publisher:

Abstract:

Salinity, sodicity, acidity, and phytotoxic levels of chloride (Cl) in subsoils are major constraints to crop production in many soils of north-eastern Australia because they reduce the ability of crop roots to extract water and nutrients from the soil. The complex interactions and correlations among soil properties result in multicollinearity between soil properties and crop yield that makes it difficult to determine which constraint is the major limitation. We used ridge regression analysis to overcome collinearity and evaluate the contribution of soil factors and water supply to the variation in the yields of 5 winter crops on soils with various levels and combinations of subsoil constraints in the region. Subsoil constraints measured were soil Cl, electrical conductivity of the saturation extract (ECse), and exchangeable sodium percentage (ESP). The ridge regression procedure selected several of the variables for use in a descriptive model, which included in-crop rainfall, plant-available soil water at sowing in the 0.90-1.10 m soil layer, and soil Cl in the 0.90-1.10 m soil layer, and accounted for 77-85% of the variation in the grain yields of the 5 winter crops. Inclusion of ESP of the topsoil (0-0.10 m soil layer) marginally increased the descriptive capability of the models for bread wheat, barley and durum wheat. Subsoil Cl concentration was found to be an effective substitute for subsoil water extraction. The estimated critical levels of subsoil Cl for a 10% reduction in grain yield were 492 mg Cl/kg for chickpea, 662 mg Cl/kg for durum wheat, 854 mg Cl/kg for bread wheat, 980 mg Cl/kg for canola, and 1012 mg Cl/kg for barley, suggesting that chickpea and durum wheat were more sensitive to subsoil Cl than bread wheat, barley, and canola.
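The abstract above describes using ridge regression to cope with collinear soil and water-supply predictors of grain yield. As a minimal sketch of that general technique (not the authors' actual model or data), the fragment below fits a ridge model to placeholder observations; the predictor names and all numbers are assumptions for illustration only.

```python
# Illustrative sketch only: ridge regression over collinear soil predictors of
# grain yield. Predictor names follow the abstract; all numbers are placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 60
X = np.column_stack([
    rng.normal(250, 60, n),   # in-crop rainfall (mm)
    rng.normal(40, 10, n),    # plant-available soil water at sowing, 0.90-1.10 m (mm)
    rng.normal(600, 250, n),  # soil Cl, 0.90-1.10 m (mg/kg)
    rng.normal(8, 4, n),      # ESP, 0-0.10 m (%)
])
y = 2.5 + 0.004 * X[:, 0] - 0.0012 * X[:, 2] + rng.normal(0, 0.3, n)  # grain yield (t/ha)

# The ridge penalty shrinks coefficients of correlated predictors, stabilising
# their estimates; in practice alpha would be chosen by cross-validation.
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0)).fit(X, y)
print("standardised coefficients:", model.named_steps["ridge"].coef_)
print("R^2:", round(model.score(X, y), 3))
```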

Relevance:

30.00%

Publisher:

Abstract:

The first larval instar has been identified as a critical stage for population mortality in Lepidoptera, yet because of the small body size of these larvae, the factors that contribute to mortality under field conditions are still not clear. Dispersal behaviour has been suggested as a significant but largely ignored factor contributing to mortality in first-instar lepidopteran larvae. The impact that leaving the host plant has on the mortality rate of Helicoverpa armigera neonates was examined in field crops and laboratory trials. This study examined: (1) the effects of soil surface temperature, and the level of shade within the crop, on the mortality of neonates on the soil after dropping off from the host plant; (2) the percentage of neonates that dropped off from a host plant and landed on the soil; and (3) the effects of exposure to different soil surface temperatures on the development and mortality of neonates. The findings showed that: (1) on the soil, surface temperatures above 43°C were lethal for neonates, and exposure to these temperatures contributed greatly to the overall mortality rate observed; however, the fate of neonates on the soil varied significantly depending on canopy closure within the crop; (2) at least 15% of neonates dropped off from the host plant and landed on the soil, meaning that the proportion of neonates exposed to these conditions is not trivial; and (3) a 30 min exposure to soil surface temperatures approaching the lethal level (>43°C) had no significant negative effects on the development and mortality of larvae through to the second instar. Overall, leaving the plant through drop-off contributes to first-instar mortality in crops with open canopies; however, survival of neonates that have lost contact with a host plant is possible, and becomes more likely later in the crop growing season.

Relevance:

30.00%

Publisher:

Abstract:

Wear resistance and recovery of 8 bermudagrass (Cynodon dactylon (L.) Pers.) and hybrid bermudagrass (C. dactylon x C. transvaalensis Burtt-Davey) cultivars grown on a sand-based soil profile near Brisbane, Australia, were assessed in 4 wear trials conducted over a two-year period. Wear was applied on a 7-day or a 14-day schedule by a modified Brinkman Traffic Simulator for 6-14 weeks at a time, either during winter-early spring or during summer-early autumn. The more frequent wear under the 7-day treatment was more damaging to the turf than the 14-day treatment, particularly during winter when the capacity of the turf for recovery from wear was severely restricted. There were substantial differences in wear tolerance among the 8 cultivars investigated, and the wear tolerance rankings of some cultivars changed between years. Wear tolerance was associated with high shoot density, a dense stolon mat strongly rooted to the ground surface, high cell wall strength as indicated by high total cell wall content, and high levels of lignin and neutral detergent fibre. Wear tolerance was also affected by turf age, planting sod quality, and wet weather. Resistance to wear and recovery from wear are both important components of wear tolerance, but the relative importance of their contributions to overall wear tolerance varies seasonally with turf growth rate.

Relevance:

30.00%

Publisher:

Abstract:

A field experiment was established in which an amendment of poultry manure and sawdust (200 t/ha) was incorporated into some plots but not others and then a permanent pasture or a sequence of biomass-producing crops was grown with and without tillage, with all biomass being returned to the soil. After 4 years, soil C levels were highest in amended plots, particularly those that had been cropped using minimum tillage, and lowest in non-amended and fallowed plots, regardless of how they had been tilled. When ginger was planted, symphylans caused severe damage in all treatments, indicating that cropping, tillage and organic matter management practices commonly used to improve soil health are not necessarily effective for all crops or soils. During the rotational phase of the experiment, the development of suppressiveness to three key pathogens of ginger was monitored using bioassays. Results for root-knot nematode (Meloidogyne javanica) indicated that for the first 2 years, amended soil was more suppressive than non-amended soil from the same cropping and tillage treatment, whereas under pasture, the amendment only enhanced suppressiveness in the first year. Suppressiveness was generally associated with higher C levels and enhanced biological activity (as measured by the rate of fluorescein diacetate (FDA) hydrolysis and numbers of free-living nematodes). Reduced tillage also enhanced suppressiveness, as gall ratings and egg counts in the second and third years were usually significantly lower in cropped soils under minimum rather than conventional tillage. Additionally, soil that was not disturbed during the process of setting up bioassays was more suppressive than soil that had been gently mixed by hand. Results of bioassays with Fusarium oxysporum f. sp. zingiberi were too inconsistent to draw firm conclusions, but the severity of fusarium yellows was generally higher in fumigated fallow soil than in other treatments, with soil management practices having little impact on disease severity. With regard to Pythium myriotylum, biological factors capable of reducing rhizome rot were present, but were not effective enough to suppress the disease under environmental conditions that were ideal for disease development.

Relevance:

30.00%

Publisher:

Abstract:

Bellyache bush (Jatropha gossypiifolia L.) is an invasive shrub that adversely impacts agricultural and natural systems of northern Australia. While several techniques are available to control bellyache bush, depletion of soil seed banks is central to its management. A 10-year study determined the persistence of intact and ant-discarded bellyache bush seeds buried in shade cloth packets at six depths (ranging from 0 to 40 cm) under both natural rainfall and rainfall-excluded conditions. A second study monitored changes in seedling emergence over time, to provide an indication of the natural rate of seed bank depletion at two sites (rocky and heavy clay) following the physical removal of all bellyache bush plants. Persistence of seed in the burial trial varied depending on seed type, rainfall conditions and burial depth. No viable seeds of bellyache bush remained after 72 months under natural rainfall conditions, irrespective of seed type. When rainfall was excluded, seeds persisted for much longer, with a small proportion (0.4%) of ant-discarded seeds still viable after 120 months. Seed persistence was prolonged (>96 months to decline to <1% viability) at all burial depths under rainfall-excluded conditions. In contrast, under natural rainfall, surface-located seeds took twice as long (70 months) to decline to 1% viability compared with buried seeds (35 months). No seedling emergence was observed after 58 months and 36 months at the rocky and heavy clay soil sites, respectively. These results suggest that the required duration of control programs for bellyache bush may vary due to the effect of biotic and abiotic factors on the persistence of soil seed banks.

Relevance:

30.00%

Publisher:

Abstract:

Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pH (CaCl2) < 4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials, when full tillage was common, to those conducted in 1995–2011, a period of rapid shift towards the adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield ranging from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would be different from those for wheat on the same soils. Significant knowledge gaps that need to be filled to improve the relevance and reliability of soil P testing for winter cereals were: a lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing-season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
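The abstract above reports critical Colwell-P concentrations corresponding to 90% (or 95%) of maximum relative yield. One common way such critical values are obtained (assumed here for illustration; not necessarily the BFDC National Database procedure) is to fit a yield-response curve to soil test values and solve for the concentration at 90% relative yield. The sketch below does this with a Mitscherlich-type curve and placeholder data.

```python
# Illustrative sketch only: deriving a critical soil test P concentration from a
# fitted response curve. A Mitscherlich-type model and the data are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(x, a, b, c):
    """Relative yield (%) as a function of Colwell-P (mg/kg)."""
    return a * (1.0 - b * np.exp(-c * x))

# Placeholder (Colwell-P, relative yield) observations.
colwell_p = np.array([5, 8, 12, 18, 25, 35, 50, 70], dtype=float)
rel_yield = np.array([55, 68, 80, 88, 93, 97, 99, 100], dtype=float)

(a, b, c), _ = curve_fit(mitscherlich, colwell_p, rel_yield, p0=[100, 0.7, 0.05])

# Critical P = soil test value at which the fitted curve reaches 90% of maximum:
# 0.9*a = a*(1 - b*exp(-c*x))  =>  x = -ln(0.1 / b) / c
critical_p = -np.log(0.1 / b) / c
print(f"critical Colwell-P at 90% relative yield: {critical_p:.1f} mg/kg")
```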

Relevance:

30.00%

Publisher:

Abstract:

The effects of fire on biogeochemical cycling in terrestrial ecosystems are widely acknowledged, but few studies have focused on the bacterial community under the disturbance of long-term frequent prescribed fire. In this study, three treatments (burning every two years (B2), burning every four years (B4) and no burning (B0)) were applied for 38 years in an Australian wet sclerophyll forest. Results showed that bacterial alpha diversity (i.e. the number of bacterial OTUs) in the topsoil (0-10 cm) was significantly higher in the B2 treatment than in the B0 and B4 treatments. Non-metric multidimensional scaling (NMDS) of the bacterial community showed clear separation of soil bacterial community structure among the fire frequency regimes and between depths. Fire frequency did not have a substantial effect on bacterial composition at the phylum level or on bacterial 16S rRNA gene abundance. Soil pH and C:N ratio were the major drivers of bacterial community structure in the most frequent fire treatment (B2), while other factors (EC, DOC, DON, MBC, NH4+, TC and TN) were significant in the less frequent burning and no burning treatments (B4 and B0). This study suggested that more frequent burning had a dramatic impact on bacterial diversity but not on abundance.
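The abstract above mentions non-metric multidimensional scaling (NMDS) to ordinate bacterial communities by fire regime and depth. A minimal sketch of that kind of ordination is given below; the toy OTU table and the choice of Bray-Curtis dissimilarity are assumptions for illustration, not details taken from the study.

```python
# Illustrative sketch only: NMDS ordination of a community table.
# The OTU counts are random placeholders, not data from the study.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
# Rows = samples (e.g. B0/B2/B4 at two depths), columns = OTU counts.
otu_table = rng.poisson(lam=20, size=(12, 40)).astype(float)

# Bray-Curtis dissimilarity between samples (an assumed, commonly used choice).
dissim = squareform(pdist(otu_table, metric="braycurtis"))

# Non-metric MDS on the precomputed dissimilarity matrix.
nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
coords = nmds.fit_transform(dissim)
print("2-D NMDS coordinates:\n", coords)
print("stress:", round(nmds.stress_, 3))
```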

Relevance:

20.00%

Publisher:

Abstract:

Seed production and soil seed banks of H. contortus were studied in a subset of treatments within an extensive grazing study conducted in H. contortus pasture in southern Queensland between 1990 and 1996. Seed production of H. contortus in autumn ranged from 260 to 1800 seeds/m2, with much of this variation due to differences in rainfall between years. Seed production was generally higher in the silver-leaved ironbark than in the narrow-leaved ironbark land class and was also influenced by a consistent stocking rate x pasture type interaction. Inflorescence density was the main factor contributing to the variable seed production and was related to the rainfall received during February. The number of seeds per inflorescence was unaffected by seasonal rainfall, landscape position, stocking rate or legume oversowing. Seed viability was related to the rainfall received during March. Soil seed banks in spring varied from 130 to 520 seeds/m2 between 1990 and 1995, with generally more seed present in the silver-leaved ironbark than in the narrow-leaved ironbark land class. There were poor relationships between viable seed production and the size of the soil seed bank, and between the size of the soil seed bank and seedling recruitment. This study indicates that H. contortus has the potential to produce relatively large amounts of seed and shows that the seasonal pattern of rainfall plays a major role in achieving this potential.

Relevance:

20.00%

Publisher:

Abstract:

This paper reports an experiment undertaken to examine the impact of burning in spring, together with reduced grazing pressure, on the dynamics of H. contortus and Aristida spp. in H. contortus pasture in south-eastern Queensland. The overall results indicate that spring burning in combination with reduced grazing pressure had no marked effect on the density of either grass species. This was attributed to 2 factors. Firstly, extreme drought conditions restricted any increase in H. contortus seedling establishment despite the presence of an adequate soil seed bank prior to summer; and secondly, some differences occurred in the response to fire among the diverse taxonomic groupings of Aristida spp. present at the study site. This study concluded that it is necessary to identify appropriate taxonomic units within the Aristida genus and that, where appropriate, burning in spring to manage pasture composition should be conducted under favourable rainfall conditions using seasonal forecasting indicators such as the Southern Oscillation Index.

Relevance:

20.00%

Publisher:

Abstract:

Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310000 kg P/ha in total). Total (PT), bicarbonate-extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for the 7 sites with complete data sets: 82.6% of variance accounted for, p < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate-reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed between pH and effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture. Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
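The abstract above notes that increases in bicarbonate-extractable P (PB) were strongly related to net effluent-P applications in a regression of log values (about 83% of variance accounted for across 7 sites). A minimal sketch of that style of analysis, with placeholder numbers rather than the study's data, might look like this:

```python
# Illustrative sketch only: regression of log-transformed bicarbonate-extractable P
# increase against log-transformed net effluent-P application. Data are placeholders.
import numpy as np
from scipy import stats

net_p_applied = np.array([100, 450, 1200, 5000, 20000, 90000, 310000], dtype=float)  # kg P/ha
pb_increase = np.array([8, 20, 35, 110, 260, 700, 1500], dtype=float)                # mg P/kg soil

log_x = np.log10(net_p_applied)
log_y = np.log10(pb_increase)

fit = stats.linregress(log_x, log_y)
# fit.rvalue**2 is the proportion of variance in log(PB increase) accounted for by
# log(net P applied), analogous to the ~83% reported in the abstract.
print(f"slope = {fit.slope:.2f}, intercept = {fit.intercept:.2f}, "
      f"R^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.4f}")
```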

Relevance:

20.00%

Publisher:

Abstract:

Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia, has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years' duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne, but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model. Although significant N accretion occurred in the soil following lucerne leys, in drier seasons recharge of the drier soil profile following long-duration lucerne occurred after 3 years. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.

Relevance:

20.00%

Publisher:

Abstract:

Twelve strains of Pseudomonas pseudomallei were isolated from the soil and water of a sheep paddock over a two-year period. The organism was recovered from the clay layer of the soil profile as well as from water that seeps into this layer during the "wet" season. Five isolates were obtained before the commencement of the "wet" season; environmental factors appear to play an important role in the survival of Ps. pseudomallei during the "dry" season. Lower isolation rates were recorded than those reported by workers in Southeast Asia and Iran.

Relevance:

20.00%

Publisher:

Abstract:

A study was undertaken from 2004 to 2007 to investigate factors associated with decreased efficacy of metalaxyl in managing damping-off of cucumber in Oman. A survey over six growing seasons showed that growers lost up to 14.6% of seedlings following application of metalaxyl. No resistance to metalaxyl was found among Pythium isolates. Damping-off disease in the surveyed greenhouses followed two patterns. In most (69%) greenhouses, seedling mortality occurred shortly after transplanting and decreased thereafter (Phase-I). However, a second phase of seedling mortality (Phase-II) appeared 9-14 d after transplanting in about 31% of the surveyed greenhouses. Analysis of the rate of biodegradation of metalaxyl in six greenhouses indicated a significant increase in the rate of metalaxyl biodegradation in greenhouses that encountered Phase-II damping-off. The half-life of metalaxyl dropped from 93 d in soil that had received no previous metalaxyl treatment to 14 d in soil that had received metalaxyl for eight consecutive seasons, indicating an enhanced rate of metalaxyl biodegradation after repeated use. Multiple applications of metalaxyl helped reduce the appearance of Phase-II damping-off. This appears to be the first report of rapid biodegradation of metalaxyl in greenhouse soils and the first report of its association with the appearance of a second phase of mortality in cucumber seedlings.
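The abstract above reports the metalaxyl half-life dropping from 93 d to 14 d after repeated use. Assuming first-order dissipation kinetics (a common assumption for pesticide degradation; the abstract does not state the fitted model), the implied rate constants and residual fractions can be compared as follows:

```python
# Illustrative sketch only: comparing first-order degradation rates implied by the
# reported half-lives (93 d with no previous metalaxyl use vs 14 d after 8 seasons).
# First-order kinetics are assumed; the study's fitted model is not stated.
import math

def rate_constant(half_life_days: float) -> float:
    """First-order rate constant k (per day) from a half-life: k = ln(2) / t_half."""
    return math.log(2) / half_life_days

def fraction_remaining(k: float, days: float) -> float:
    """Fraction of the initial dose remaining after `days` under first-order decay."""
    return math.exp(-k * days)

for label, t_half in [("no previous metalaxyl", 93.0), ("8 consecutive seasons", 14.0)]:
    k = rate_constant(t_half)
    print(f"{label}: k = {k:.4f} /d, fraction left after 28 d = {fraction_remaining(k, 28):.2f}")
```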

Relevance:

20.00%

Publisher:

Abstract:

A trial was undertaken to evaluate the effect of microwaves on seed mortality of three weed species. Seeds of rubber vine (Cryptostegia grandiflora R.Br.), parthenium (Parthenium hysterophorus L.) and bellyache bush (Jatropha gossypiifolia L.) were buried at six depths (0, 2.5, 5, 10, 20 and 40 cm) in coarse sand maintained at one of two moisture levels, oven-dry or wet (field capacity), and then subjected to one of five microwave radiation durations (0, 2, 4, 8 and 16 min). Significant interactions between soil moisture level, microwave radiation duration, seed burial depth and species were detected for mortality of seeds of all three species. Maximum seed mortality of rubber vine (88%), parthenium (67%) and bellyache bush (94%) occurred in wet soil irradiated for 16 min. Maximum seed mortality of rubber vine and bellyache bush occurred in seeds buried at 2.5 cm depth, whereas that of parthenium occurred in seeds buried at 10 cm depth. Maximum soil temperatures of 114.1 and 87.5°C in dry and wet soil, respectively, occurred at 2.5 cm depth following 16 min of irradiation. Despite the greater soil temperatures recorded in dry soil, irradiating seeds in wet soil generally increased seed mortality 2.9-fold compared with dry soil. Moisture content of wet soil averaged 5.7% compared with 0.1% for dry soil. Results suggest that microwave radiation has the potential to kill seeds located in the soil seed bank. However, many factors, including weed species susceptibility, determine the effectiveness of microwave radiation on buried seeds. Microwave radiation may be an alternative to conventional methods for rapidly depleting soil seed banks in the field, particularly in relatively wet soils that contain long-lived weed seeds.