Abstract:
Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310000 kg P/ha in total). Total (PT), bicarbonate-extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for 7 sites with complete data sets: 82.6% of variance accounted for, p < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate-reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed for pH with effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture.
Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
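The abstract reports that increases in bicarbonate-extractable P (PB) were strongly related to net P applications via a regression of log-transformed values (82.6% of variance accounted for across 7 sites). As a minimal sketch of that kind of log-log analysis — using made-up illustrative numbers, not the study's data — an ordinary least-squares fit can be done with the standard library alone:

```python
import math

# Illustrative sketch only: hypothetical site values, not the paper's data.
net_p_applied = [100, 500, 2000, 10000, 50000, 150000, 310000]  # kg P/ha (hypothetical)
soil_pb = [12, 25, 60, 140, 300, 650, 900]                      # mg P/kg (hypothetical)

# Log-transform both variables, as in the reported regression of log values.
x = [math.log10(v) for v in net_p_applied]
y = [math.log10(v) for v in soil_pb]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
intercept = my - slope * mx

# Coefficient of determination for the fitted line.
ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
ss_tot = sum((yi - my) ** 2 for yi in y)
r_squared = 1 - ss_res / ss_tot
print(f"log10(PB) = {intercept:.2f} + {slope:.2f} * log10(net P); R^2 = {r_squared:.3f}")
```

With real paired effluent/no-effluent measurements in place of the hypothetical lists, the same few lines reproduce the variance-accounted-for statistic quoted in the abstract.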
Abstract:
To improve the sustainability and environmental accountability of the banana industry there is a need to develop a set of soil health indicators that integrate physical, chemical and biological soil properties. These indicators would allow banana growers, extension and research workers to improve soil health management practices. To determine changes in soil properties due to the cultivation of bananas, a paired site survey was conducted comparing soil properties under conventional banana systems to less intensively managed vegetation systems, such as pastures and forest. Measurements were made on physical, chemical and biological soil properties at seven locations in tropical and sub-tropical banana producing areas. Soil nematode community composition was used as a bioindicator of the biological properties of the soil. Soils under conventional banana production tended to have a greater soil bulk density, with less soil organic carbon (C) (both total C and labile C), greater exchangeable cations, higher extractable P, greater numbers of plant-parasitic nematodes and less nematode diversity, relative to less intensively managed plant systems. The organic banana production systems at two locations had greater labile C, relative to conventional banana systems, but there was no significant change in nematode community composition. There were significant interactions between physical, chemical and nematode community measurements in the soil, particularly with soil C measurements, confirming the need for a holistic set of indicators to aid soil management. There was no single indicator of soil health for the Australian banana industry, but a set of soil health indicators that would allow the measurement of soil improvements should include: bulk density, soil C, pH, EC, total N, extractable P, ECEC and soil nematode community structure.
Abstract:
This paper is the first of a series that investigates whether new cropping systems with permanent raised beds (PRBs) or Flat land could be successfully used to increase farmers' incomes from rainfed crops in Lombok in Eastern Indonesia. This paper discusses the rice phase of the cropping system. Low grain yields of dry-seeded rice (Oryza sativa) grown on Flat land on Vertisols in the rainfed region of southern Lombok, Eastern Indonesia, are probably mainly due to (a) erratic rainfall (870-1220 mm/yr), with water often limiting at sensitive growth stages, (b) consistently high temperatures (average maximum ~31°C), and (c) low solar radiation. Farmers are therefore poor, and labour is hard and costly, as all operations are manual. Two replicated field experiments were run at Wakan (annual rainfall = 868 mm) and Kawo (1215 mm) for 3 years (2001/2002 to 2003/2004) on Vertisols in southern Lombok. Dry-seeded rice was grown in 4 treatments with or without manual tillage on (a) PRBs, 1.2 m wide, 200 mm high, separated by furrows 300 mm wide, 200 mm deep, with no rice sown in the well-graded furrows, and (b) well-graded Flat land. Excess surface water was harvested from each treatment and used for irrigation after the vegetative stage of the rice. All operations were manual. There were no differences between treatments in grain yield of rice (mean grain yield = 681 g/m²), which could be partly explained by total number of tillers/hill and mean panicle length, but not number of productive tillers/hill, plant height or weight of 1000 grains. When the data from both treatments on PRBs and from both treatments on Flat land, each year at each site, were analysed, there were also no differences in grain yield of rice (g/m²).
When rainfall in the wet season up to harvest was over 1000 mm (Year 2; Wakan, Kawo), or plants were water-stressed during crop establishment (Year 1; Wakan) or during grain-fill (Year 3; Kawo), there were significant differences in grain yield (g/1.5 m²) between treatments; generally the grain yield (g/1.5 m²) on PRBs with or without tillage was less than that on Flat land with or without tillage. However, when the data from both treatments on PRBs and from both treatments on Flat land, each year at each site, were analysed, the greater grain yield of dry-seeded rice on Flat land (mean yield 1092 g/1.5 m²) than that on PRBs (mean 815 g/1.5 m²) was mainly because there were 25% more plants on Flat land. Overall, when the data in the 2 outer rows and the 2 inner rows on PRBs were each combined, there was a higher number of productive tillers in the combined outer rows (mean 20.7 tillers/hill) compared with that in the combined inner rows on each PRB (mean 18.2 tillers/hill). However, there were no differences in grain yield between combined rows (mean 142 g/m row). Hence with a gap of 500 mm (the distance between the outer rows of plants on adjacent raised beds), plants did not compensate in grain yield for missing plants in furrows. This suggests that rice (a) also sown in furrows, or (b) sown in 7 rows with narrower row-spacing, or (c) sown in 6 rows with slightly wider row-spacing, and narrower gap between outer rows on adjacent beds, may further increase grain yield (g/1.5 m²) in this system of PRBs. The growth and the grain yield (y in g/m²) of rainfed rice (with rainfall on-site the only source of water for irrigation) depended mainly on the rainfall (x in mm) in the wet season up to harvest (due either to site or year), with y = 1.1x - 308; r² = 0.54; p < 0.005. However, 280 mm (i.e. 32%) of the rainfall was not directly used to produce grain (i.e. when y = 0 g/m²).
Manual tillage did not affect growth and grain yield of rice (g/m²; g/1.5 m²), either on PRBs or on Flat land.
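The fitted rainfall-yield line reported above (y = 1.1x - 308, r² = 0.54) can be expressed as a small calculator; the zero-yield rainfall threshold of roughly 280 mm follows directly from the coefficients. This is only a restatement of the published regression, and predictions carry the uncertainty implied by r² = 0.54:

```python
# Fitted relationship from the rainfed rice trials: y = 1.1x - 308,
# where y is grain yield (g/m^2) and x is wet-season rainfall up to harvest (mm).
def grain_yield(rainfall_mm: float) -> float:
    """Predicted grain yield (g/m^2) from wet-season rainfall (mm)."""
    return 1.1 * rainfall_mm - 308

# Rainfall below which the model predicts zero yield (solve 1.1x - 308 = 0):
threshold_mm = 308 / 1.1
print(f"zero-yield rainfall threshold: {threshold_mm:.0f} mm")  # 280 mm, as stated

# Predictions at the two sites' quoted annual rainfalls:
print(f"Wakan (868 mm): {grain_yield(868):.0f} g/m^2")
print(f"Kawo (1215 mm): {grain_yield(1215):.0f} g/m^2")
```

The 280 mm threshold is the "rainfall not directly used to produce grain" figure quoted in the abstract.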
Abstract:
This project focussed on the phosphorus (P) and potassium (K) status of northern cropping soils. Stores of P and K have been depleted by crop removal and limited fertiliser application, with depletion most significant in the subsoil. Soil testing strategies are confounded by slowly available mineral reserves with uncertain availability. The utility of new soil tests for measuring these reserves was assessed, the availability of the reserves to plants was quantified, and a regional sampling strategy was undertaken to identify areas of greatest P and K deficit. Fertiliser application strategies for P and K have been tested and the interactions between these and other nutrients have been determined in a large field program.
Abstract:
Macfadyena unguis-cati (L.) Gentry (Bignoniaceae) is a major environmental weed in coastal Queensland, Australia. There is a lack of quantitative data on its leaf chemistry and its impact on soil properties. Soils from infested vs uninfested areas, and leaves of M. unguis-cati and three co-occurring vine species (one exotic, two native), were collected at six sites (riparian and non-riparian) in south-eastern Queensland. Effects of invasion status, species, site and habitat type were examined using univariate and multivariate analyses. Habitat type had a greater effect on soil nutrients than on leaf chemistry. The invasion effect of M. unguis-cati on soil chemistry was more pronounced in non-riparian than in riparian habitat. Significantly higher values were obtained in M. unguis-cati infested (vs. uninfested) soils for ~50% of traits. Leaf ion concentrations differed significantly between exotic and native vines. The observed higher leaf-nutrient load (especially nitrogen, phosphorus and potassium) in exotic plants aligns with the preference of invasive plant species for disturbed habitats with higher nutrient input. The higher load of trace elements (aluminium, boron, cadmium and iron) in its leaves suggests that cycling of heavy-metal ions, many of which are potentially toxic at excess levels, could be accelerated in soils of M. unguis-cati-invaded landscapes. Although inferences from the present study are based on correlative data, the consistency of the patterns across many sites suggests that M. unguis-cati may improve soil fertility and influence nutrient cycling, perhaps through legacy effects of its own litter input.
Abstract:
The major banana production areas in Australia are particularly environmentally sensitive due to their close proximity to areas of World Heritage rainforest and the Great Barrier Reef catchment. Management of soil quality, nutrients and pesticides is vital to maintaining the integrity of these sensitive areas. Studies on cropping systems have suggested that integrating organic matter into ground cover management would improve the quality of soil under banana cultivation. In this study, an alternative management practice for bananas, which addresses the management of organic matter and fertiliser application, was assessed and compared to the conventional practice currently employed in the banana industry. Several chemical, physical and biological soil parameters were measured including: pH, electrical conductivity, water stable aggregates, bulk density, water filled pore space, porosity, water content, fluorescein diacetate hydrolysis (FDA) and beta-glucosidase activity. The alternative management practice did not have a significant impact on the production and growth of bananas but overall improved the quality of the soil. Although some differences were observed, the chemical and physical soil characteristics did not differ dramatically between the two management systems. The addition of organic matter resulted in the soil under alternative practice having higher FDA and beta-glucosidase levels, indicating higher microbial activity. The integration of organic matter into the management of bananas resulted in positive benefits to soil properties under bananas; however, methods of maintaining organic matter in the soil need to be further researched.
Abstract:
In semi-arid sub-tropical areas, a number of studies concerning no-till (NT) farming systems have demonstrated advantages in economic, environmental and soil quality aspects over conventional tillage (CT). However, adoption of continuous NT has contributed to the build-up of herbicide-resistant weed populations, increased incidence of soil- and stubble-borne diseases, and stratification of nutrients and organic carbon near the soil surface. Farmers often resort to an occasional strategic tillage (ST) to manage these problems of NT systems. However, farmers who practice strict NT systems are concerned that even one-time tillage may undo the positive soil condition benefits of NT farming systems. We reviewed the pros and cons of the use of occasional ST in NT farming systems. Impacts of occasional ST on agronomy, soil and environment are site-specific and depend on many interacting soil, climatic and management conditions. Most studies conducted in North America and Europe suggest that introducing occasional ST in continuous NT farming systems could improve productivity and profitability in the short term; however, in the long term the impact is negligible or may be negative. The short-term impacts immediately following occasional ST on soil and environment include reduced protective cover, soil loss by erosion, increased runoff, loss of C and water, and reduced microbial activity, with little or no detrimental impact in the long term. A potential negative effect immediately following ST would be reduced plant-available water, which may make crop sowing unreliable in variable seasons. Rainfall between ST and sowing, or immediately after sowing, is necessary to replenish soil water lost from the seed zone. Timing of ST is likely to be critical and must be balanced with optimising soil water prior to seeding.
The impact of occasional ST varies with the tillage implement used; for example, inversion tillage using a mouldboard results in greater impacts than chisel or disc tillage. Opportunities for future research on occasional ST with the most commonly used implements, such as tine and/or disc in Australia’s northern grains-growing region, are presented in the context of agronomy, soil and the environment.
Abstract:
Many rainfed wheat production systems are reliant on stored soil water for some or all of their water inputs. Selection and breeding for root traits could result in a yield benefit; however, breeding for root traits has traditionally been avoided due to the difficulty of phenotyping mature root systems, limited understanding of root system development and function, and the strong influence of environmental conditions on the phenotype of the mature root system. This paper outlines an international field selection program for beneficial root traits at maturity using soil coring in India and Australia. In the rainfed areas of India, wheat is sown at the end of the monsoon into hot soils with a quickly receding soil water profile; in-season water inputs are minimal. We hypothesised that wheat selected and bred for high yield under these conditions would have deep, vigorous root systems, allowing it to access and utilise the stored soil water at depth around anthesis and grain-filling when surface layers were dry. The Indian trials resulted in 49 lines being sent to Australia for phenotyping. These lines were ranked against 41 high-yielding Australian lines. Variation was observed for deep root traits; for example, in eastern Australia in 2012, maximum depth ranged from 118.8 to 146.3 cm. There was significant variation in root traits between sites and years; however, several Indian genotypes were identified that consistently ranked highly across sites and years for deep rooting traits.
Abstract:
Seed production and soil seed banks of H. contortus were studied in a subset of treatments within an extensive grazing study conducted in H. contortus pasture in southern Queensland between 1990 and 1996. Seed production of H. contortus in autumn ranged from 260 to 1800 seeds/m² with much of this variation due to differences in rainfall between years. Seed production was generally higher in the silver-leaved ironbark than in the narrow-leaved ironbark land class and was also influenced by a consistent stocking rate × pasture type interaction. Inflorescence density was the main factor contributing to the variable seed production and was related to the rainfall received during February. The number of seeds per inflorescence was unaffected by seasonal rainfall, landscape position, stocking rate or legume oversowing. Seed viability was related to the rainfall received during March. Soil seed banks in spring varied from 130 to 520 seeds/m² between 1990 and 1995, with generally more seed present in the silver-leaved ironbark than in the narrow-leaved ironbark land class. There were poor relationships between viable seed production and the size of the soil seed bank, and between the size of the soil seed bank and seedling recruitment. This study indicates that H. contortus has the potential to produce relatively large amounts of seed and showed that the seasonal pattern of rainfall plays a major role in achieving this potential.
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne, but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model.
Although significant N accretion occurred in the soil following lucerne leys, in drier seasons recharge of the depleted soil profile following long-duration lucerne took up to 3 years. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.
Abstract:
The size of the soil microbial biomass carbon (SMBC) has been proposed as a sensitive indicator for measuring the adverse effects of contaminants on the soil microbial community. In this study of Australian agricultural systems, we demonstrated that field variability of SMBC measured using the fumigation-extraction procedure limited its use as a robust ecotoxicological endpoint. The SMBC varied up to 4-fold across control samples collected from a single field site, due to small-scale spatial heterogeneity in the soil physicochemical environment. Power analysis revealed that large numbers of replicates (3-93) were required to identify 20% or 50% decreases in the size of the SMBC of contaminated soil samples relative to their uncontaminated control samples at the 0.05 level of statistical significance. We question the value of the routine measurement of SMBC as an ecotoxicological endpoint at the field scale, and suggest more robust and predictive microbiological indicators.
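The power-analysis reasoning can be sketched with the textbook normal-approximation formula for a two-sample comparison of means, n ≈ 2·((z_α/2 + z_β)·σ/δ)², where σ is expressed through the coefficient of variation of the control samples. The CV value below is a hypothetical assumption for illustration, not a figure reported in the study:

```python
import math

# Standard normal quantiles for a two-sided test at alpha = 0.05 and 80% power.
Z_ALPHA = 1.960  # z for alpha/2 = 0.025
Z_BETA = 0.842   # z for power = 0.80

def replicates_needed(cv: float, relative_decrease: float) -> int:
    """Replicates per group to detect a given relative decrease in the mean,
    with the standard deviation expressed as cv * control mean.
    Normal-approximation sketch; cv is a hypothetical input."""
    n = 2 * ((Z_ALPHA + Z_BETA) * cv / relative_decrease) ** 2
    return math.ceil(n)

for decrease in (0.20, 0.50):
    print(f"CV = 0.6, detect {decrease:.0%} decrease: "
          f"{replicates_needed(0.6, decrease)} replicates per group")
```

The point the abstract makes falls out directly: the replicate requirement scales with (CV/effect)², so the small effect sizes of ecotoxicological interest become prohibitively expensive to detect in spatially heterogeneous field soil.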
Abstract:
In south-eastern Queensland, Australia, sorghum planted in early spring usually escapes attack by sorghum midge, Stenodiplosis sorghicola. Experiments were conducted to better understand the role of winter diapause in the population dynamics of this pest. Emergence patterns of adult midge from diapausing larvae on the soil surface and at various depths were investigated during spring to autumn of 1987/88–1989/90. From 1987/88 to 1989/90, 89%, 65% and 98% of adult emergence, respectively, occurred during November and December. Adult emergence from larvae diapausing on the soil surface was severely reduced due to high mortality attributed to surface soil temperatures in excess of 40°C, with much of this mortality occurring between mid-September and mid-October. Emergence of adults from the soil surface was considerably delayed in the 1988/89 season compared with larvae buried at 5 or 10 cm, which had similar emergence patterns for all three seasons. In 1989/90, when a 1-cm-deep treatment was included, there was a 392% increase in adult emergence from this treatment compared with deeper treatments. Only in 1 year (1989/90) did some diapausing larvae on the surface fail to emerge by the end of summer, when 28.0% of the larvae on the surface remained in diapause, whereas only 0.8% of the buried larvae remained in diapause. We conclude that the pattern of emergence explains why spring plantings of sorghum in south-eastern Queensland usually escape sorghum midge attack.
Abstract:
An urgent need exists for indicators of soil health and patch functionality in extensive rangelands that can be measured efficiently and at low cost. Soil mites are candidate indicators, but their identification and handling are so specialised and time-consuming that their inclusion in routine monitoring is unlikely. The aim of this study was to measure the relationship between patch type and mite assemblages using a conventional approach. An additional aim was to determine if a molecular approach traditionally used for soil microbes could be adapted for soil mites to overcome some of the bottlenecks associated with soil fauna diversity assessment. Soil mite species abundance and diversity were measured using conventional ecological methods in soil from patches with perennial grass and litter cover (PGL), and compared to soil from bare patches with annual grasses and/or litter cover (BAL). Soil mite assemblages were also assessed using a molecular method called terminal-restriction fragment length polymorphism (T-RFLP) analysis. The conventional data showed a relationship between patch type and mite assemblage. The Prostigmata and Oribatida were well represented in the PGL sites, particularly the Aphelacaridae (Oribatida). For T-RFLP analysis, the mite community was represented by a series of DNA fragment lengths that reflected mite sequence diversity. The T-RFLP data showed a distinct difference in the mite assemblage between the patch types. Where possible, T-RFLP peaks were matched to mite families using a reference 18S rDNA database, and the Aphelacaridae prevalent in the conventional samples at PGL sites were identified, as were prostigmatids and oribatids. We identified limits to the T-RFLP approach, including an inability to distinguish some species whose DNA sequences were similar.
Despite these limitations, the data still showed a clear difference between sites, and the molecular taxonomic inferences also compared well with the conventional ecological data. The results from this study indicated that the T-RFLP approach was effective in measuring mite assemblages in this system. The power of this technique lies in the fact that species diversity and abundance data can be obtained quickly: the time taken to process hundreds of samples, from soil DNA extraction to data output on the gene analyser, can be as little as 4 days.
Abstract:
In dryland agricultural systems of the subtropical, semi-arid region of north-eastern Australia, water is the most limiting resource. Crop productivity depends on the efficient use of rainfall and available water stored in the soil during fallow. Agronomic management practices including a period of fallow, stubble retention, and reduced tillage enhance reserves of soil water. However, access to stored water in these soils may be restricted by the presence of growth-limiting conditions in the rooting zone of the crop. These have been termed subsoil constraints. Subsoil constraints may include compacted or gravel layers (physical), sodicity, salinity, acidity, nutrient deficiencies, presence of toxic elements (chemical) and low microbial activity (biological). Several of these constraints may occur together in some soils. Farmers have often not been able to obtain the potential yield determined by their prevailing climatic conditions in the marginal rainfall areas of the northern grains region. In the past, the adoption of soil management practices has been largely restricted to the top 100 mm soil layer. Exploitation of the subsoil as a source of water and nutrients has largely been overlooked. The key to realising potential yields is to gain a better understanding of subsoils and their limitations, then develop options to manage them practically and economically. Due to the complex nature of the causal factors of these constraints, a combination of management approaches, rather than individual options, is required to combat them for sustainable crop production, managing natural resources and avoiding environmental damage.
Abstract:
Two examples of GIS-based multiple-criteria evaluations of plantation forests are presented. These desktop assessments use available topographical, geological and pedological information to establish the risk of occurrence of certain environmentally detrimental processes. The first case study is concerned with the risk that chemical additives (i.e. simazine) applied within the forestry landscape may reach the drainage system. The second case study assesses the vulnerability of forested areas to landslides. The subject of the first multiple-criteria evaluation (MCE) was a 4 km² logging area, which had been recently site-prepared for a Pinus plantation. The criteria considered relevant to the assessment were proximity to creeks, slope, soil depth to the restrictive layer (i.e. potential depth to a perched water table) and soil erodibility (based on clay content). The output of the MCE was in accordance with field observations, showing that this approach has the potential to provide management support by highlighting areas vulnerable to waterlogging, which in turn can trigger overland flow and export of pollutants to the local stream network. The subject of the second evaluation was an Araucaria plantation which is prone to landslips during heavy rain. The parameters included in the assessment were the drainage system, the slope of the terrain and geological features such as rocks and structures. A good correlation between the MCE results and field observations was found, suggesting that this GIS approach is useful for the assessment of natural hazards. Multiple-criteria evaluations are highly flexible as they can be designed in either vector or raster format, depending on the type of available data.
Although tested on specific areas, the MCEs presented here can be easily used elsewhere and assist both management intervention and the protection of the adjacent environment by assessing the vulnerability of the forest landscape to either introduced chemicals or natural hazards.
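A raster-style MCE of the kind described reduces to a weighted overlay of criterion layers. The sketch below uses hypothetical criterion grids and weights, purely to show the mechanics (the study's actual criteria included proximity to creeks, slope, soil depth and erodibility, but its scores and weights are not given in the abstract):

```python
# Each criterion is a small raster scored 0-1 (1 = most vulnerable), paired
# with a hypothetical weight; weights sum to 1. Values are illustrative only.
criteria = {
    "proximity_to_creeks": ([[0.9, 0.4], [0.2, 0.1]], 0.4),
    "slope":               ([[0.7, 0.6], [0.3, 0.2]], 0.3),
    "soil_erodibility":    ([[0.5, 0.5], [0.4, 0.1]], 0.3),
}

# Weighted overlay: risk(cell) = sum over criteria of weight * score(cell).
rows, cols = 2, 2
risk = [[0.0] * cols for _ in range(rows)]
for grid, weight in criteria.values():
    for r in range(rows):
        for c in range(cols):
            risk[r][c] += weight * grid[r][c]

for row in risk:
    print([round(v, 2) for v in row])
```

In a real GIS the same operation runs over full raster layers (and a vector design would intersect polygon layers instead), but the per-cell arithmetic is exactly this weighted sum.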