16 results for runoff processes

in eResearch Archive - Queensland Department of Agriculture


Relevance:

30.00%

Publisher:

Abstract:

Poor land condition resulting from unsustainable grazing practices can reduce enterprise profitability and increase water, sediment and associated nutrient runoff from properties and catchments. This paper presents the results of a 6-year field study that used a series of hillslope flume experiments to evaluate the impact of improved grazing land management (GLM) on hillslope runoff and sediment yields. The study was carried out on a commercial grazing property in a catchment draining to the Burdekin River in northern Australia. During this study, average ground cover on hillslopes increased from ~35% to ~75%, although average biomass and litter levels are still relatively low for this landscape type (~60 increasing to 1100 kg of dry matter per hectare). Pasture recovery was greatest on the upper and middle parts of hillslopes. Areas that did not respond to the improved grazing management had <10% cover and were on the lower slopes, associated with the location of sodic soils and the initiation of gullies. Comparison of ground cover changes and soil conditions with adjacent properties suggests that grazing management, and not just improved rainfall conditions, was responsible for the improvements in ground cover in this study. The ground cover improvements resulted in progressively lower runoff coefficients for the first event in each wet season; however, runoff coefficients were not reduced at the annual time scale. Annual hillslope sediment yields declined by ~70% on two of the three hillslopes, although where bare patches (with <10% cover) were connected to gullies and streams, annual sediment yields increased in response to higher rainfall in later years of the study. It appears that bare patches are the primary source areas for both runoff and erosion on these hillslopes. Achieving further reductions in runoff and erosion in these landscapes may require management practices that improve ground cover and biomass in bare areas, particularly when they are located adjacent to concentrated drainage lines.

Relevance:

20.00%

Publisher:

Abstract:

Runoff and sediment loss from forest roads were monitored for a two-year period in a Pinus plantation in southeast Queensland. Two classes of road were investigated: a gravelled road, which is used as a primary daily haulage route for the logging area, and an ungravelled road, which provides the main access route for individual logging compartments and is intensively used as a haulage route only during the harvest of these areas (approximately every 30 years). Both roads were subjected to routine traffic loads and maintenance during the study. Surface runoff in response to natural rainfall was measured, and samples were taken to determine sediment and nutrient (total nitrogen, total phosphorus, dissolved organic carbon and total iron) loads from each road. The mean runoff coefficient (runoff depth/rainfall depth) was consistently higher from the gravelled road plot (0.57) than from the ungravelled road (0.38). Total sediment loss over the two-year period was also greater from the gravelled road plot (5.7 t km−1) than from the ungravelled road plot (3.9 t km−1). Suspended solids contributed 86% of the total sediment loss from the gravelled road and 72% from the ungravelled road over the two years. Nitrogen loads from the two roads were relatively constant throughout the study, averaging 5.2 and 2.9 kg km−1 from the gravelled and ungravelled road, respectively. Mean annual phosphorus loads were 0.6 kg km−1 from the gravelled road and 0.2 kg km−1 from the ungravelled road. Organic carbon and total iron loads increased in the second year of the study, which was a much wetter year, and are thought to reflect the breakdown of organic matter in roadside drains and increased sediment generation, respectively. When road and drain maintenance (grading) was performed, runoff and sediment loss increased from both road types. Additionally, the breakdown of the gravel road base due to high traffic intensity during wet conditions resulted in the formation of deep (10 cm) ruts, which increased erosion. The Water Erosion Prediction Project (WEPP):Road model was used to compare predicted with observed runoff and sediment loss from the two road classes investigated. For individual rainfall events, WEPP:Road predictions showed strong agreement with observed runoff and sediment loss. WEPP:Road predictions of annual sediment loss from the entire forestry road network in the study area also showed reasonable agreement with the extrapolated observed values.
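The runoff coefficient used in this abstract is defined as runoff depth divided by rainfall depth. As a minimal illustration, the sketch below computes it for hypothetical event depths chosen only to reproduce the reported mean coefficients (0.57 and 0.38); the depths themselves are not from the study.

```python
def runoff_coefficient(runoff_depth_mm: float, rainfall_depth_mm: float) -> float:
    """Runoff coefficient = runoff depth / rainfall depth (both in mm)."""
    if rainfall_depth_mm <= 0:
        raise ValueError("rainfall depth must be positive")
    return runoff_depth_mm / rainfall_depth_mm

# Hypothetical 50 mm event; depths chosen to match the reported means.
gravelled = runoff_coefficient(28.5, 50.0)    # 0.57
ungravelled = runoff_coefficient(19.0, 50.0)  # 0.38
```

Because both plots see the same rainfall, comparing coefficients rather than raw runoff depths isolates the effect of the road surface.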

Relevance:

20.00%

Publisher:

Abstract:

Tillage is defined here in a broad sense, including disturbance of the soil and crop residues, wheel traffic and sowing opportunities. In sub-tropical, semi-arid cropping areas in Australia, tillage systems have evolved from intensively tilled bare fallow systems, with high soil losses, to reduced and no-tillage systems. In recent years, the use of controlled traffic has also increased. These conservation tillage systems are successful in reducing water erosion of soil and sediment-bound chemicals. Control of runoff of dissolved nutrients and weakly sorbed chemicals is less certain. Adoption of new practices appears to have been related to practical and economic considerations, and proved to be more profitable after a considerable period of research and development. However, there are still challenges. One challenge is to ensure that systems that reduce soil erosion, which may involve greater use of chemicals, do not degrade water quality in streams. Another challenge is to ensure that systems that improve water entry do not increase drainage below the crop root zone, which would increase the risk of salinity. Better understanding of how tillage practices influence soil hydrology, runoff and erosion processes should lead to better tillage systems and enable better management of risks to water quality and soil health. Finally, the need to determine the effectiveness of in-field management practices in achieving stream water quality targets in large, multi-land use catchments will challenge our current knowledge base and the tools available.

Relevance:

20.00%

Publisher:

Abstract:

This special issue of Continental Shelf Research contains 20 papers giving research results produced as part of Australia's Torres Strait Co-operative Research Centre (CRC) Program, which was funded over a three-year period during 2003-2006. Marine biophysical, fisheries, socioeconomic-cultural and extension research in the Torres Strait region of northeastern Australia was carried out to meet three aims: 1) support the sustainable development of marine resources and minimize impacts of resource use in Torres Strait; 2) enhance the conservation of the marine environment and the social, cultural and economic well being of all stakeholders, particularly the Torres Strait peoples; and 3) contribute to effective policy formulation and management decision making. Subjects covered, including commercial and traditional fisheries management, impacts of anthropogenic sediment inputs on seagrass meadows and communication of science results to local communities, have broad applications to other similar environments.

Relevance:

20.00%

Publisher:

Abstract:

In Australia, communities are concerned about atrazine being detected in drinking water supplies. It is important to understand the mechanisms by which atrazine is transported from paddocks to waterways if we are to reduce movement of agricultural chemicals from the site of application. Two paddocks cropped with grain sorghum on a Black Vertosol were monitored for atrazine, potassium chloride (KCl)-extractable atrazine, desethylatrazine (DEA), and desisopropylatrazine (DIA) at 4 soil depths (0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m) and in runoff water and runoff sediment. Atrazine + DEA + DIA (total atrazine) had a half-life in soil of 16-20 days, more rapid dissipation than in many earlier reports. Atrazine extracted in dilute potassium chloride, considered available for weed control, was initially 34% of the total and had a half-life of 15-20 days until day 30, after which it dissipated rapidly with a half-life of 6 days. We conclude that, in this region, atrazine may not pose a risk for groundwater contamination, as only 0.5% of applied atrazine moved deeper than 0.20 m into the soil, where it dissipated rapidly. In runoff (including suspended sediment), atrazine concentrations were greatest during the first runoff event, 57 days after application (85 μg/L), and declined with time. After 160 days, the total atrazine lost in runoff was 0.4% of the initial application. The total atrazine concentration in runoff was strongly related to the total concentration in soil, as expected. Even after 98% of the KCl-extractable atrazine had dissipated (and it no longer provided weed control), runoff concentrations still exceeded the human health guideline value of 40 μg/L. For total atrazine in soil (0-0.05 m), the soil sorption coefficient (Kd) ranged from 1.9 to 28.4 mL/g and the soil organic carbon sorption coefficient (KOC) from 100 to 2184 mL/g, increasing with time of contact with the soil and with rapid dissipation of the more soluble, available phase. Partition coefficients in runoff for total atrazine were initially 3, increasing to 32 and 51 with time; values for DEA were half these. To minimise atrazine losses, cultural practices should be adopted that maximise rain infiltration (and thereby minimise runoff) and minimise concentrations in the soil surface.
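The half-lives and sorption coefficients reported here follow standard definitions: first-order dissipation (concentration halves every half-life), Kd as the ratio of sorbed to dissolved concentration, and KOC as Kd normalised by the soil organic carbon fraction. A minimal sketch with illustrative numbers, not values from the study:

```python
import math

def remaining(c0: float, half_life_days: float, t_days: float) -> float:
    """Concentration left after t days of first-order dissipation."""
    k = math.log(2) / half_life_days  # first-order rate constant (1/day)
    return c0 * math.exp(-k * t_days)

def kd(sorbed_mg_per_kg: float, dissolved_mg_per_L: float) -> float:
    """Soil sorption coefficient Kd (mL/g): sorbed / dissolved concentration."""
    return sorbed_mg_per_kg / dissolved_mg_per_L

def koc(kd_value: float, organic_c_fraction: float) -> float:
    """Organic-carbon-normalised sorption coefficient: KOC = Kd / fOC."""
    return kd_value / organic_c_fraction

# With a 16-day half-life, half the atrazine is gone after 16 days
# and three-quarters after 32 days.
print(remaining(100.0, 16.0, 16.0))  # ~50.0
print(remaining(100.0, 16.0, 32.0))  # ~25.0
```

For example, a Kd of 2 mL/g in a soil with 2% organic carbon gives KOC = 100 mL/g, the low end of the range reported above.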

Relevance:

20.00%

Publisher:

Abstract:

The authors identify and track the processes that have resulted in the detection of six tropical weeds targeted for eradication. The habitats and distributions of these species make detection by field officers and members of the public more likely than detection through targeted searches. The eradication program is increasing the scope of detection processes by conducting and documenting activities to improve weed recognition among public, government and industry stakeholders.

Relevance:

20.00%

Publisher:

Abstract:

Despite biocontrol research spanning over 100 years, the hybrid weed commonly referred to as Lantana camara is not under adequate control. Host specificity and varietal preference of released agents, climatic suitability of a region for released agents, the number of agents introduced, and the range or area of infestation appear to play a role in limiting biocontrol success. At least one of 41 species of mainly leaf- or flower-feeding insects has been introduced, or has spread, to 41 of the 70 countries or regions where lantana occurs. Over half (26) of these species have established, achieving varying levels of herbivory and presumably some degree of control. Accurate taxonomy of the plant and adaptation of potential agents to the host plant are some of the better predictors of at least establishment success. Retrospective analysis of the hosts of introduced biocontrol agents for L. camara shows that a greater proportion of agents collected from L. camara or Lantana urticifolia established than of agents collected from other species of Lantana. Of the introduced agents that were oligophagous, 18 out of 22 established. The proportion of species establishing declined with the number of species introduced; however, there was no trend when oceanic islands were treated separately from mainland areas, and the result is likely an artefact of how introductions have changed over time. A calculated index of the degree of herbivory per country, due to agents known to have caused some damage, was not related to the land area infested with lantana for either mainlands or oceanic islands. However, the degree of herbivory is much higher on islands than on mainlands. This difference between island and mainland situations may reflect population dynamics in patchy or metapopulation landscapes. Basic systematic studies of the host remain crucial to successful biocontrol, especially of hybrid weeds like L. camara. Potential biocontrol agents should be monophages collected from the species most closely related to the target weed, or phytophages that attack several species of lantana. Suitable agents should be released in the most ecoclimatically suitable area. Since collection of biocontrol agents has been limited to a fraction of the known number of phytophagous species available, biocontrol may be improved by targeting insects that feed on stems and roots, as well as agents that feed on leaves and flowers.

Relevance:

20.00%

Publisher:

Abstract:

Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, have the potential to cause eutrophication of sensitive coastal marine habitats nearby. A case study of the potential extent of such losses was conducted in a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms with combined intensities and durations of 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) rainfall frequencies for the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms; the largest part of each rainfall event was therefore attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal nitrogen loading in surface runoff compared with other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) generated in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway, where TSN and TDN concentrations were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the currently specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway. Therefore, while the estimated annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen, and the proximity of such waterways (8 km) to the coastline, may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.

Relevance:

20.00%

Publisher:

Abstract:

Models are abstractions of reality that have predetermined limits (often not consciously thought through) on what problem domains the models can be used to explore. These limits are determined by the range of observed data used to construct and validate the model. However, it is important to remember that operating the model beyond these limits, one of the reasons for building the model in the first place, potentially brings unwanted behaviour and thus reduces the usefulness of the model. Our experience with the Agricultural Production Systems Simulator (APSIM), a farming systems model, has led us to adapt techniques from the disciplines of modelling and software development to create a model development process. This process is simple, easy to follow, and brings a much higher level of stability to the development effort, which then delivers a much more useful model. A major part of the process relies on having a range of detailed model tests (unit, simulation, sensibility, validation) that exercise a model at various levels (sub-model, model and simulation). To underline the usefulness of testing, we examine several case studies where simulated output can be compared with simple relationships. For example, output is compared with crop water use efficiency relationships gleaned from the literature to check that the model reproduces the expected function. Similarly, another case study attempts to reproduce generalised hydrological relationships found in the literature. This paper then describes a simple model development process (using version control, automated testing and differencing tools), that will enhance the reliability and usefulness of a model.
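One of the sensibility tests described, comparing simulated output against a water use efficiency relationship from the literature, can be sketched as an automated assertion. The WUE value and tolerance below are illustrative placeholders, not figures from the paper:

```python
LITERATURE_WUE = 20.0  # kg grain/ha per mm transpired -- assumed placeholder

def wue_sensibility_check(sim_yield_kg_ha: float,
                          transpiration_mm: float,
                          tolerance: float = 0.25) -> bool:
    """Pass if the simulation's implied WUE is within tolerance of the
    literature relationship; fail (flag for review) otherwise."""
    implied = sim_yield_kg_ha / transpiration_mm
    return abs(implied - LITERATURE_WUE) / LITERATURE_WUE <= tolerance

# A run implying WUE = 20 passes; one implying WUE = 5 is flagged.
print(wue_sensibility_check(4000.0, 200.0))  # True
print(wue_sensibility_check(1000.0, 200.0))  # False
```

Wired into a test suite alongside version control and differencing tools, checks of this kind catch regressions automatically each time a sub-model changes.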

Relevance:

20.00%

Publisher:

Abstract:

Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m² field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also included as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall under conventional cultivation to 29% under bare conditions during the highest rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha.year under bare conditions to <6 t/ha.year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha.year under bare conditions and 17 kg N and 11 kg P/ha.year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield showed reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.

Relevance:

20.00%

Publisher:

Abstract:

The off-site transport of agricultural chemicals, such as herbicides, into freshwater and marine ecosystems is a world-wide concern. The adoption of farm management practices that minimise herbicide transport in rainfall-runoff is a priority for the Australian sugarcane industry, particularly in the coastal catchments draining into the World Heritage listed Great Barrier Reef (GBR) lagoon. In this study, residual herbicide runoff and infiltration were measured using a rainfall simulator in a replicated trial on a brown Chromosol with 90–100% cane trash blanket cover in the Mackay Whitsunday region, Queensland. Management treatments included conventional 1.5 m spaced sugarcane beds with a single row of sugarcane (CONV) and 2 m spaced, controlled traffic sugarcane beds with dual sugarcane rows (0.8 m apart) (2mCT). The aim was to simulate the first rainfall event after the application of the photosynthesis inhibiting (PSII) herbicides ametryn, atrazine, diuron and hexazinone, by broadcast (100% coverage, on bed and furrow) and banding (50–60% coverage, on bed only) methods. These events included heavy rainfall 1 day after herbicide application, considered a worst case scenario, or rainfall 21 days after application. The 2mCT rows had significantly (P < 0.05) less runoff (38%) and lower peak runoff rates (43%) than CONV rows for a rainfall average of 93 mm at 100 mm h−1 (1:20 yr Average Return Interval). Additionally, final infiltration rates were higher in 2mCT rows than CONV rows, with 72 and 52 mm h−1 respectively. This resulted in load reductions of 60, 55, 47, and 48% for ametryn, atrazine, diuron and hexazinone from 2mCT rows, respectively. Herbicide losses in runoff were also reduced by 32–42% when applications were banded rather than broadcast. When rainfall was experienced 1 day after application, a large percentage of herbicides were washed off the cane trash. 
However, by day 21, concentrations of herbicide residues on cane trash were lower and more resistant to washoff, resulting in lower losses in runoff. Consequently, ametryn and atrazine event mean concentrations in runoff were approximately 8-fold lower at day 21 than at day 1, whilst diuron and hexazinone were only 1.6–1.9-fold lower, suggesting longer persistence of these chemicals. Runoff collected at the end of the paddock during natural rainfall events showed treatment differences consistent with, though smaller than, those in the rainfall simulation study. Overall, the combination of early application, banding and controlled traffic was most effective in reducing herbicide losses in runoff. Crown copyright © 2012
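The load reductions quoted above are relative percentage differences between treatments, and an event mean concentration is simply the herbicide load divided by the runoff volume for the event. A minimal sketch with hypothetical loads, not the study's data:

```python
def percent_reduction(conventional_load: float, improved_load: float) -> float:
    """Percentage load reduction of an improved treatment vs conventional."""
    return 100.0 * (conventional_load - improved_load) / conventional_load

def event_mean_concentration(load_ug: float, runoff_volume_L: float) -> float:
    """Event mean concentration (ug/L) = total load / total runoff volume."""
    return load_ug / runoff_volume_L

# A hypothetical drop from 10 g to 4 g per plot corresponds to the
# reported 60% load reduction for ametryn.
print(percent_reduction(10.0, 4.0))  # 60.0
```

Note that a load reduction can arise from lower concentrations, lower runoff volume, or both, which is why the study reports runoff, infiltration and loads separately.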

Relevance:

20.00%

Publisher:

Abstract:

One major benefit of land application of biosolids is to supply nitrogen (N) for agricultural crops, and understanding mineralisation processes is the key for better N-management strategies. Field studies were conducted to investigate the process of mineralisation of three biosolids products (aerobic, anaerobic, and thermally dried biosolids) incorporated into four different soils at rates of 7-90 wet t/ha in subtropical Queensland. Two of these studies also examined mineralisation rates of commonly used organic amendments (composts, manures, and sugarcane mill muds). Organic N in all biosolids products mineralised very rapidly under ambient conditions in subtropical Queensland, with rates much faster than from other common amendments. Biosolids mineralisation rates ranged from 30 to 80% of applied N during periods ranging from 3.5 to 18 months after biosolids application; these rates were much higher than those suggested in the biosolids land application guidelines established by the NSW EPA (15% for anaerobic and 25% for aerobic biosolids). There was no consistently significant difference in mineralisation rate between aerobic and anaerobic biosolids in our studies. When applied at similar rates of N addition, other organic amendments supplied much less N to the soil mineral N and plant N pools during the crop season. A significant proportion of the applied biosolids total N (up to 60%) was unaccounted for at the end of the observation period. High rates of N addition in calculated Nitrogen Limited Biosolids Application Rates (850-1250 kg N/ha) resulted in excessive accumulation of mineral N in the soil profile, which increases the environmental risks due to leaching, runoff, or gaseous N losses. 
Moreover, the rapid mineralisation of the biosolids organic N in these subtropical environments suggests that biosolids should be applied at lower rates than in temperate areas, and that care must be taken with the timing to maximise plant uptake and minimise possible leaching, runoff, or denitrification losses of mineralised N.
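The Nitrogen Limited Biosolids Application Rate logic discussed above sizes the application so that the N expected to mineralise matches crop demand; assuming a higher mineralisation fraction therefore lowers the allowable rate. The sketch below uses hypothetical demand and N-content figures, not values from the studies:

```python
def nlbar_t_per_ha(crop_n_demand_kg_ha: float,
                   biosolids_n_kg_per_t: float,
                   mineralisation_fraction: float) -> float:
    """Biosolids rate (t/ha) whose mineralised N just meets crop N demand."""
    available_n_per_t = biosolids_n_kg_per_t * mineralisation_fraction
    return crop_n_demand_kg_ha / available_n_per_t

# Hypothetical crop needing 150 kg N/ha, biosolids containing 30 kg N/t:
# guideline-style fraction (25%) vs the mid-range observed here (50%).
print(nlbar_t_per_ha(150.0, 30.0, 0.25))  # 20.0 t/ha
print(nlbar_t_per_ha(150.0, 30.0, 0.50))  # 10.0 t/ha
```

Halving the assumed mineralisation fraction doubles the computed rate, which illustrates how guideline fractions well below the observed 30-80% can lead to the excess soil mineral N described above.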

Relevance:

20.00%

Publisher:

Abstract:

Purpose This study investigated how nitrogen (N) nutrition and key physiological processes varied under changed water and nitrogen competition resulting from different weed control and fertilisation treatments in a 2-year-old F1 hybrid (Pinus elliottii Engelm var. elliottii × P. caribaea var. hondurensis Barr. ex Golf.) plantation on a grey podzolic soil type, in Southeast Queensland. Materials and methods The study integrated a range of measures including growth variables (diameter at ground level (DGL), diameter at breast height (DBH) and height (H)), foliar variables (including foliar N concentration, foliar δ13C and δ15N) and physiological variables (including photosynthesis (An), stomatal conductance (gs), transpiration (E), intrinsic water use efficiency (WUEi) (A/gs) and xylem pressure potential (ΨXPP)) to better understand the mechanisms influencing growth under different weed control and fertilisation treatments. Five levels of weed control were applied: standard (routine), luxury, intermediate, mechanical and nil weed control, all with routine fertilisation plus an additional treatment, routine weed control and luxury fertilisation. Relative weed cover was assessed at 0.8, 1.1 and 1.6 years after plantation establishment to monitor the effectiveness of weed control treatments. Soil investigation included soil ammonium (NH4 +-N), nitrate (NO3 −-N), potentially mineralizable N (PMN), gravimetric soil moisture content (MC), hot water extractable organic carbon (HWETC), hot water extractable total N (HWETN), total C, total N, stable C isotope composition (δ13C), stable N isotope composition (δ15N), total P and extractable K. 
Results and discussion There were significant relationships between foliar N concentrations and relative weed cover, and between tree growth and foliar N concentration or foliar δ15N, but initial site preparation practices also increased soil N transformations in the planting rows, reducing the observable effects of weed control on foliar δ15N. A positive relationship between foliar N concentration and foliar δ13C or photosynthesis indicated that increased N availability to trees positively influenced non-stomatal limitations to photosynthesis. However, increased foliar N concentrations and photosynthesis were negatively related to afternoon xylem pressure potential, which enhanced stomatal limitations to photosynthesis and WUEi. Conclusions Luxury and intermediate weed control and luxury fertilisation positively influenced growth at early establishment by reducing competition for water and N resources. This influenced fundamental physiological processes such as the relationships between foliar N concentration, An, E, gs and ΨXPP. Results also confirmed that time since cultivation is an important factor influencing the effectiveness of using foliar δ15N as an indicator of soil N transformations.
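Intrinsic water use efficiency is defined in the abstract as WUEi = A/gs. A minimal sketch with made-up gas-exchange values, purely to illustrate the ratio:

```python
def intrinsic_wue(a_umol_m2_s: float, gs_mol_m2_s: float) -> float:
    """Intrinsic water use efficiency: photosynthesis (A) divided by
    stomatal conductance (gs)."""
    return a_umol_m2_s / gs_mol_m2_s

# At the same stomatal conductance, higher photosynthesis raises WUEi;
# partial stomatal closure (lower gs) at unchanged A also raises WUEi,
# which is the stomatal limitation effect described above.
print(intrinsic_wue(15.0, 0.15))  # ~100 umol CO2 per mol H2O
```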

Relevance:

20.00%

Publisher:

Abstract:

Background Increased disease resistance is a key target of cereal breeding programs, with disease outbreaks continuing to threaten global food production, particularly in Africa. Of the disease resistance gene families, the nucleotide-binding site plus leucine-rich repeat (NBS-LRR) family is the most prevalent and ancient, and is also one of the largest gene families known in plants. The sequence diversity of NBS-encoding genes was explored in sorghum, a critical food staple in Africa, with comparisons to rice and maize and to fungal pathogen resistance QTL. Results In sorghum, NBS-encoding genes had significantly higher diversity than non-NBS-encoding genes and were significantly enriched in regions of the genome under purifying and balancing selection, through both domestication and improvement. Ancestral genes, pre-dating species divergence, were more abundant in regions with signatures of selection than in regions not under selection. Sorghum NBS-encoding genes were also significantly enriched in regions of the genome containing fungal pathogen disease resistance QTL, with the diversity of the NBS-encoding genes influenced by the type of co-locating biotic stress resistance QTL. Conclusions NBS-encoding genes are under strong selection pressure in sorghum, through the contrasting evolutionary processes of purifying and balancing selection. These contrasting evolutionary processes have impacted ancestral genes more than species-specific genes. Fungal disease resistance hot-spots in the genome, with resistance against multiple pathogens, provide further insight into the mechanisms that cereals use in the “arms race” with rapidly evolving pathogens, in addition to providing plant breeders with selection targets for fast-tracking the development of high-performing varieties with more durable pathogen resistance.