913 results for Irrigation and drainage
Abstract:
Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310000 kg P/ha in total). Total (PT), bicarbonate extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for 7 sites with complete data sets: 82.6% of variance accounted for, P < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed for pH with effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture.
Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
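The log-log regression reported in the abstract above (82.6% of the variance in PB explained by net P application across the 7 complete-data sites) can be sketched as follows. The site values below are hypothetical placeholders, not the study's measurements:

```python
import numpy as np

# Hypothetical net P applications (kg P/ha) and bicarbonate-extractable
# P (PB) values for 7 sites -- placeholders, NOT the study's data.
net_p = np.array([150.0, 600.0, 2500.0, 9000.0, 30000.0, 90000.0, 300000.0])
pb = np.array([40.0, 75.0, 160.0, 300.0, 620.0, 1100.0, 2400.0])

# Linear regression on log-transformed values, as in the paper's analysis
x, y = np.log10(net_p), np.log10(pb)
slope, intercept = np.polyfit(x, y, 1)

# Variance accounted for (r^2), comparable to the reported 82.6%
y_hat = slope * x + intercept
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

A positive slope on the log-log scale corresponds to the reported pattern of PB rising with cumulative effluent-P loading.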
Abstract:
Variable-rate technologies and site-specific crop nutrient management require real-time spatial information about the potential for response to in-season crop management interventions. Thermal and spectral properties of canopies can provide relevant information for non-destructive measurement of crop water and nitrogen stresses. In previous studies, foliage temperature was successfully estimated from canopy-scale (mixed foliage and soil) temperatures and the multispectral Canopy Chlorophyll Content Index (CCCI) was effective in measuring canopy-scale N status in rainfed wheat (Triticum aestivum L.) systems in Horsham, Victoria, Australia. In the present study, results showed that under irrigated wheat systems in Maricopa, Arizona, USA, the theoretical derivation of foliage temperature unmixing produced relationships similar to those in Horsham. Derivation of the CCCI led to an r2 relationship with chlorophyll a of 0.53 after Zadoks stage 43. This was later than the relationship (r2 = 0.68) developed for Horsham after Zadoks stage 33 but early enough to be used for potential mid-season N fertilizer recommendations. Additionally, ground-based hyperspectral data estimated plant N (g kg⁻¹) in Horsham with an r2 = 0.86 but was confounded by water supply and N interactions. By combining canopy thermal and spectral properties, varying water and N status can potentially be identified, eventually permitting targeted N applications to those parts of a field where N can be used most efficiently by the crop.
Abstract:
The establishment of experimental populations of scarab larvae using eggs and early instar larvae has proven to be difficult for many researchers. Despite this, little work has been published examining ways to optimise establishment under artificial conditions. In this experiment, we examined the effect of shade and irrigation on the establishment of Heteronyx piceus Blanchard larvae introduced into pots as eggs and first-, second- and third-instar larvae to optimise artificial infestation techniques. The most important factor affecting larval establishment was the life stage introduced. Establishment of eggs and first instars was very low, with only 21% of eggs and 11% of first-instar larvae establishing. In contrast, 82% of second-instar larvae and 84% of third-instar larvae established successfully. The addition of shade marginally improved overall survival from 45% in the unshaded pots to 53% in the shaded pots. However, most of this increase was in the eggs and first instars. Irrigation did not improve survival. These results suggest that when introducing scarab larvae to field or pot experiments, second- or third-instar larvae should be used to maximise establishment. The provision of shade and supplementary irrigation is optional.
Abstract:
Aims: To investigate the occurrence and levels of Arcobacter spp. in pig effluent ponds and effluent-treated soil. Methods and Results: A Most Probable Number (MPN) method was developed to assess the levels of Arcobacter spp. in seven pig effluent ponds and six effluent-treated soils, immediately after effluent irrigation. Arcobacter spp. levels in the effluent ponds varied from 6.5 × 10⁵ to 1.1 × 10⁸ MPN 100 ml⁻¹ and in freshly irrigated soils from 9.5 × 10² to 2.8 × 10⁴ MPN g⁻¹ in all piggery environments tested. Eighty-three Arcobacter isolates were subjected to an abbreviated phenotypic test scheme and examined using a multiplex polymerase chain reaction (PCR). The PCR identified 35% of these isolates as Arcobacter butzleri, 49% as Arcobacter cryaerophilus while 16% gave no band. All 13 nonreactive isolates were subjected to partial 16S rDNA sequencing and showed a high similarity (>99%) to Arcobacter cibarius. Conclusions: A. butzleri, A. cryaerophilus and A. cibarius were isolated from both piggery effluent and effluent-irrigated soil, at levels suggestive of good survival in the effluent pond. Significance and Impact of the Study: This is the first study to provide quantitative information on Arcobacter spp. levels in piggery effluent and to associate A. cibarius with pigs and piggery effluent environments.
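The abstract does not spell out how MPN estimates are derived from the dilution series. One common shortcut, shown here as an illustrative sketch, is Thomas's approximation; the tube counts and volumes below are hypothetical, not the study's data:

```python
import math

def mpn_thomas(positive_tubes, vol_negative_ml, vol_total_ml):
    """Thomas's approximation for the Most Probable Number per 100 ml.

    positive_tubes  -- count of tubes showing growth
    vol_negative_ml -- total sample volume in the negative tubes (ml)
    vol_total_ml    -- total sample volume across all tubes (ml)
    """
    return positive_tubes * 100.0 / math.sqrt(vol_negative_ml * vol_total_ml)

# Hypothetical dilution series: 7 positive tubes, 2.2 ml of sample in the
# negative tubes, 11.1 ml of sample across all tubes
mpn = mpn_thomas(7, 2.2, 11.1)  # MPN per 100 ml, ~142
```

In practice, published MPN tables or a maximum-likelihood fit are preferred for reporting; the approximation is only a quick estimate.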
Abstract:
This study reports on the use of naturally occurring F-specific coliphages, as well as spiked MS-2 phage, to evaluate a land-based effluent treatment/reuse system and an effluent irrigation scheme. Both the natural phages and the spiked MS-2 phage indicated that the effluent treatment/reuse system (FILTER - Filtration and Irrigated cropping for Land Treatment and Effluent Reuse) achieved a reduction in phage levels over the treatment system by one to two log10. FILTER reduced natural F-specific phage numbers from around 10³ to below 10² 100 ml⁻¹ and the spiked phage from 10⁵ to around 10⁴ 100 ml⁻¹ (incoming compared with outgoing water). In the effluent irrigation scheme, phage spiked into the holding ponds dropped from 10⁶ to 10² 100 ml⁻¹ after 168 h (with no detectable levels of natural F-specific phage being found prior to spiking). Only low levels of the spiked phage (10² g⁻¹) could be recovered from soil irrigated with phage-spiked effluent (at 10⁶ phage 100 ml⁻¹) or from fruits (around 10² phage per fruit) that had direct contact with soil which had been freshly irrigated with the same phage-spiked effluent.
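The log10 reduction figures quoted above follow from a simple ratio of incoming to outgoing concentrations, which can be sketched as:

```python
import math

def log10_reduction(incoming, outgoing):
    """Log10 reduction achieved by a treatment step."""
    return math.log10(incoming / outgoing)

# Values from the abstract: natural F-specific phage fell from ~1e3 to ~1e2
# per 100 ml, and spiked MS-2 from ~1e5 to ~1e4 per 100 ml
natural = log10_reduction(1e3, 1e2)  # 1.0 log10
spiked = log10_reduction(1e5, 1e4)   # 1.0 log10
```

Both order-of-magnitude drops correspond to the "one to two log10" reduction reported for the FILTER system.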
Abstract:
Weighing lysimeters are the standard method for directly measuring evapotranspiration (ET). This paper discusses the construction, installation, and performance of two (1.52 m × 1.52 m × 2.13 m deep) repacked weighing lysimeters for measuring ET of corn and soybean in West Central Nebraska. The cost of constructing and installing each lysimeter was approximately US $12,500, which could vary depending on the availability and cost of equipment and labor. The resolution of the lysimeters was 0.0001 mV V⁻¹, which was limited by the data processing and storage resolution of the datalogger. This resolution was equivalent to 0.064 and 0.078 mm of ET for the north and south lysimeters, respectively. Since the percent measurement error decreases with the magnitude of the ET measured, this resolution is adequate for measuring ET for daily and longer periods, but not for shorter time steps. This resolution would result in measurement errors of less than 5% for ET values of ≥3 mm, but the percent error rapidly increases for lower ET values. The resolution of the lysimeters could potentially be improved by choosing a datalogger that could process and store data with a higher resolution than the one used in this study.
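The resolution-to-error relationship described above is a simple ratio of resolution to measured ET. The sketch below uses the reported resolutions; note the abstract's 5% threshold may fold in other error sources, so treat this as an illustrative lower bound:

```python
# Resolution of each lysimeter expressed in mm of ET (reported values)
res_north, res_south = 0.064, 0.078

def percent_error(resolution_mm, et_mm):
    """Resolution expressed as a percentage of the ET being measured."""
    return resolution_mm / et_mm * 100.0

# For a 3 mm daily ET, the error stays well under 5% for either lysimeter
err_3mm = percent_error(res_south, 3.0)    # ~2.6%
# For a small 0.5 mm ET over a short time step, the error grows rapidly
err_small = percent_error(res_south, 0.5)  # ~15.6%
```

This is why the authors consider the resolution adequate for daily and longer periods but not for shorter time steps.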
Abstract:
This paper aims to compare the shift in frequency distribution and skill of seasonal climate forecasting of both streamflow and rainfall in eastern Australia based on the Southern Oscillation Index (SOI) Phase system. Recent advances in seasonal forecasting of climate variables have highlighted opportunities for improving decision making in natural resources management. Forecasting of rainfall probabilities for different regions in Australia is available, but the use of similar forecasts for water resource supply has not been developed. The use of streamflow forecasts may provide better information for decision-making in irrigation supply and flow management for improved ecological outcomes. To examine the relative efficacy of seasonal forecasting of streamflow and rainfall, the shift in probability distributions and the forecast skill were evaluated using the Wilcoxon rank-sum test and the linear error in probability space (LEPS) skill score, respectively, at three river gauging stations in the Border Rivers Catchment of the Murray-Darling Basin in eastern Australia. A comparison of rainfall and streamflow distributions confirms higher statistical significance in the shift of streamflow distribution than that in rainfall distribution. Moreover, streamflow distribution showed greater skill of forecasting with 0-3 month lead time, compared to rainfall distribution.
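The shift in distributions between SOI phases was evaluated with the Wilcoxon rank-sum test. A minimal sketch using the normal approximation (ignoring tie corrections, and with hypothetical streamflow values rather than the Border Rivers gauging data) is:

```python
import math

def rank_sum_z(a, b):
    """Wilcoxon rank-sum z score via the normal approximation.

    Assumes no tied values; ties would need averaged ranks and a
    variance correction.
    """
    pooled = sorted(a + b)
    w = sum(pooled.index(v) + 1 for v in a)  # rank sum of sample a
    n1, n2 = len(a), len(b)
    mean_w = n1 * (n1 + n2 + 1) / 2.0
    sd_w = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (w - mean_w) / sd_w

# Hypothetical seasonal streamflow totals grouped by two SOI phases
wet_phase = [120.0, 95.0, 140.0, 160.0, 110.0, 130.0, 150.0, 105.0]
dry_phase = [60.0, 45.0, 80.0, 70.0, 55.0, 90.0, 65.0, 50.0]
z = rank_sum_z(wet_phase, dry_phase)  # large |z| => significant shift
```

A large |z| indicates a statistically significant shift between the phase-conditioned distributions, which is the pattern the paper reports as stronger for streamflow than for rainfall.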
Abstract:
Tillage is defined here in a broad sense, including disturbance of the soil and crop residues, wheel traffic and sowing opportunities. In sub-tropical, semi-arid cropping areas in Australia, tillage systems have evolved from intensively tilled bare fallow systems, with high soil losses, to reduced and no tillage systems. In recent years, the use of controlled traffic has also increased. These conservation tillage systems are successful in reducing water erosion of soil and sediment-bound chemicals. Control of runoff of dissolved nutrients and weakly sorbed chemicals is less certain. Adoption of new practices appears to have been related to practical and economic considerations, and proved to be more profitable after a considerable period of research and development. However, there are still challenges. One challenge is to ensure that systems that reduce soil erosion, which may involve greater use of chemicals, do not degrade water quality in streams. Another challenge is to ensure that systems that improve water entry do not increase drainage below the crop root zone, which would increase the risk of salinity. Better understanding of how tillage practices influence soil hydrology, runoff and erosion processes should lead to better tillage systems and enable better management of risks to water quality and soil health. Finally, the need to determine the effectiveness of in-field management practices in achieving stream water quality targets in large, multi-land use catchments will challenge our current knowledge base and the tools available.
Abstract:
In semi-arid areas such as western Nebraska, interest in subsurface drip irrigation (SDI) for corn is increasing due to restricted irrigation allocations. However, quantification of crop response to nitrogen (N) applications with SDI, and of the environmental benefits of multiple in-season (IS) SDI N applications instead of a single early-season (ES) surface application, is lacking. The study was conducted in 2004, 2005, and 2006 at the University of Nebraska-Lincoln West Central Research and Extension Center in North Platte, Nebraska, comparing two N application methods (IS and ES) and three N rates (128, 186, and 278 kg N ha⁻¹) using a randomized complete block design with four replications. No grain yield or biomass response was observed in 2004. In 2005 and 2006, corn grain yield and biomass production increased with increasing N rates, and the IS treatment increased grain yield, total N uptake, and gross return after N application costs (GRN) compared to the ES treatment. Chlorophyll meter readings taken at the R3 corn growth stage in 2006 showed that less N was supplied to the plant with ES compared to the IS treatment. At the end of the study, soil NO3-N masses in the 0.9 to 1.8 m depth were greater under the IS treatment compared to the ES treatment. Results suggested that greater losses of NO3-N below the root zone under the ES treatment may have had a negative effect on corn production. Under SDI systems, fertigating a recommended N rate at various corn growth stages can increase yields and GRN and reduce NO3-N leaching in soils compared to concentrated early-season applications.
Abstract:
Adoption of conservation tillage practices on Red Ferrosol soils in the inland Burnett area of south-east Queensland has been shown to reduce runoff and subsequent soil erosion. However, improved infiltration resulting from these measures has not improved crop performance and there are suggestions of increased loss of soil water via deep drainage. This paper reports data monitoring soil water under real and artificial rainfall events in commercial fields and long-term tillage experiments, and uses the data to explore the rate and mechanisms of deep drainage in this soil type. Soils were characterised by large drainable porosities (≥0.10 m³/m³) in all parts of the profile to depths of 1.50 m, with drainable porosity similar to available water content (AWC) at 0.25 and 0.75 m, but >60% higher than AWC at 1.50 m. Hydraulic conductivity immediately below the tilled layer in both continuously cropped soils and those after a ley pasture phase was shown to decline with increasing soil moisture content, although the rate of decline was much greater in continuously cropped soil. At moisture contents approaching the drained upper limit (pore water pressure = -100 cm H2O), estimates of saturated hydraulic conductivity after a ley pasture were 3-5 times greater than in continuously cropped soil, suggesting much greater rates of deep drainage in the former when soils are moist. Hydraulic tensiometers and fringe capacitance sensors monitored during real and artificial rainfall events showed evidence of soils approaching saturation in the surface layers (top 0.30-0.40 m), but there was no evidence of soil moistures exceeding the drained upper limit (i.e. pore water pressures ≤ -100 cm H2O) in deeper layers. Recovery of applied soil water within the top 1.00-1.20 m of the profile during or immediately after rainfall events declined as the starting profile moisture content increased. These effects were consistent with very rapid rates of internal drainage.
Sensors deeper in the profile were unable to detect this drainage due to either non-uniformity of conducting macropores (i.e. bypass flow) or unsaturated conductivities in deeper layers that far exceed the saturated hydraulic conductivity of the infiltration throttle at the bottom of the cultivated layer. Large increases in unsaturated hydraulic conductivities are likely with only small increases in water content above the drained upper limit. Further studies with drainage lysimeters and large banks of hydraulic tensiometers are planned to quantify drainage risk in these soil types.
Abstract:
An experiment using herds of ~20 cows (farmlets) assessed the effects of high stocking rates on production and profitability of feeding systems based on dryland and irrigated perennial ryegrass-based pastures in a Mediterranean environment in South Australia over 4 years. A target level of milk production of 7000 L/cow.year was set, based on predicted intakes of 2.7 t DM/cow.year as concentrates, pasture intakes from 1.5 to 2.7 t/cow.year and purchased fodder. In years 1 and 2, up to 1.5 t DM/cow.year of purchased fodder was used and in years 3 and 4 the amounts were increased if necessary to enable levels of milk production per cow to be maintained at target levels. Cows in dryland farmlets calved in March to May inclusive and were stocked at 2.5, 2.9, 3.3, 3.6 and 4.1 cows/ha, while those in irrigated farmlets calved in August to October inclusive and were stocked at 4.1, 5.2, 6.3 and 7.4 cows/ha. In the first 2 years, when inputs of purchased fodder were limited, milk production per cow was reduced with higher stocking rates (P < 0.01), but in years 3 and 4 there were no differences. Mean production was 7149 kg/cow.year in years 1 and 2, and 8162 kg/cow.year in years 3 and 4. Production per hectare was very closely related to stocking rate in all years (P < 0.01), increasing from 18 to 34 t milk/ha.year for dryland farmlets (1300 to 2200 kg milk solids/ha) and from 30 to 60 t milk/ha.year for irrigated farmlets (2200 to 4100 kg milk solids/ha). Almost all of these increases were attributed to the increases in grain and purchased fodder inputs associated with the increases in stocking rate. Net pasture accumulation rates and pasture harvest were generally not altered with stocking rate, though as stocking rate increased there was a change to more of the pasture being grazed and less conserved in both dryland and irrigated farmlets. Total pasture harvest averaged ~8 and 14 t DM/ha.year for dryland and irrigated pastures, respectively.
An exception was at the highest stocking rate under irrigation, where pugging during winter was associated with a 14% reduction in annual pasture growth. There were several indications that these high stocking rates may not be sustainable without substantial changes in management practice. There were large and positive nutrient balances and associated increases in soil mineral content (P < 0.01), especially for phosphorus and nitrate nitrogen, with both stocking rate and succeeding years. Levels under irrigation were considerably higher (up to 90 and 240 mg/kg of soil for nitrate nitrogen and phosphorus, respectively) than under dryland pastures (60 and 140 mg/kg, respectively). Soil organic carbon levels did not change with stocking rate, indicating a high level of utilisation of forage grown. Weed ingress was also high (to 22% DM) in all treatments and especially in heavily stocked irrigated pastures during winter. It was concluded that the higher stocking rates used exceeded those that are feasible for Mediterranean pastures in this environment and upper levels of stocking are suggested to be 2.5 cows/ha for dryland pastures and 5.2 cows/ha for irrigated pastures. To sustain these suggested stocking rates will require further development of management practices to avoid large increases in soil minerals and weed invasion of pastures.
Abstract:
Stephen Setter, Melissa Setter, Michael Graham and Joe Vitelli recently published their paper 'Buoyancy and germination of pond apple (Annona glabra L.) propagules in fresh and salt water' in Proceedings of the 16th Australian Weeds Conference. Stephen also presented this paper at the conference. Pond apple is an aggressive woody weed which has invaded many wetlands, drainage lines and riparian systems across the Wet Tropics bioregion of Far North Queensland. Most fruit and seed produced by pond apple during the summer wet season fall directly into creeks, river banks, flood plains and swamps from where they are dispersed. They reported that pond apple seeds can float for up to 12 months in either fresh or salt water, with approximately 38% of these seeds germinating in a soil medium once removed from the experimental water tanks at South Johnstone. Their study suggested that the removal of reproductive trees from areas adjacent to creeks and rivers will have an immediate impact on potential spread of pond apple by limiting seed input into flowing water bodies.
Abstract:
Bemisia tabaci, biotype B, commonly known as the silverleaf whitefly (SLW), is an alien species that invaded Australia in the mid-1990s. This paper reports on the invasion ecology of SLW and the factors that are likely to have contributed to the first outbreak of this major pest in an Australian cotton cropping system. Population dynamics of SLW within whitefly-susceptible crop (cotton and cucurbit) and non-crop vegetation (sowthistle, Sonchus spp.) components of the cropping system were investigated over four consecutive growing seasons (September-June) 2001/02-2004/05 in the Emerald Irrigation Area (EIA) of Queensland, Australia. Based on fixed geo-referenced sampling sites, variation in spatial and temporal abundance of SLW within each system component was quantified to provide baseline data for the development of ecologically sustainable pest management strategies. Parasitism of large (3rd and 4th instars) SLW nymphs by native aphelinid wasps was quantified to determine the potential for natural control of SLW populations. Following the initial outbreak in 2001/02, SLW abundance declined and stabilised over the next three seasons. The population dynamics of SLW was characterised by inter-seasonal population cycling between the non-crop (weed) and cotton components of the EIA cropping system. Cotton was the largest sink for and source of SLW during the study period. Over-wintering populations dispersed from weed host plant sources to cotton in spring followed by a reverse dispersal in late summer and autumn to broad-leaved crops and weeds. A basic spatial source-sink analysis showed that SLW adult and nymph densities were higher in cotton fields that were closer to over-wintering weed sources throughout spring than in fields that were further away. Cucurbit fields were not significant sources of SLW and did not appear to contribute significantly to the regional population dynamics of the pest.
Substantial parasitism of nymphal stages throughout the study period indicates that native parasitoid species and other natural enemies are important sources of SLW mortality in Australian cotton production systems. Weather conditions and use of broad-spectrum insecticides for pest control are implicated in the initial outbreak and on-going pest status of SLW in the region.
Abstract:
Seed persistence is poorly quantified for invasive plants of subtropical and tropical environments and Lantana camara, one of the world's worst weeds, is no exception. We investigated germination, seedling emergence, and seed survival of two lantana biotypes (Pink and pink-edged red [PER]) in southeastern Queensland, Australia. Controlled experiments were undertaken in 2002 and repeated in 2004, with treatments comprising two differing environmental regimes (irrigated and natural rainfall) and sowing depths (0 and 2 cm). Seed survival and seedling emergence were significantly affected by all factors (time, biotype, environment, sowing depth, and cohort) (P < 0.001). Seed dormancy varied with treatment (environment, sowing depth, biotype, and cohort) (P < 0.001), but declined rapidly after 6 mo. Significant differential responses by the two biotypes to sowing depth and environment were detected for both seed survival and seedling emergence (P < 0.001). Seed mass was consistently lower in the PER biotype at the population level (P < 0.001), but this variation did not adequately explain the differential responses. Moreover, under natural rainfall the magnitude of the biotype effect was unlikely to result in ecologically significant differences. Seed survival after 36 mo under natural rainfall ranged from 6.8 to 21.3%. Best fit regression analysis of the decline in seed survival over time yielded a five-parameter exponential decay model with a lower asymptote approaching −0.38 (% seed survival = [(55 − (−0.38)) · e^(k·t)] + (−0.38); R2 = 88.5%; 9 df). Environmental conditions and burial affected the slope parameter or k value significantly (P < 0.01). Seed survival projections from the model were greatest for buried seeds under natural rainfall (11 yr) and least under irrigation (3 yr). Experimental data and model projections suggest that lantana has a persistent seed bank and this should be considered in management programs, particularly those aimed at eradication.
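The fitted decay model above can be evaluated directly. The decay rate k differs by treatment and is not reported per treatment in the abstract, so the value used below is a hypothetical placeholder chosen for illustration only:

```python
import math

# Parameters from the reported model:
# % seed survival = [(55 - (-0.38)) * e^(k*t)] + (-0.38)
UPPER, ASYMPTOTE = 55.0, -0.38

def seed_survival(t_months, k):
    """Percent seed survival at time t (months) under the fitted decay model.

    k is the (negative) decay-rate parameter; treatment-specific values
    are not given in the abstract.
    """
    return (UPPER - ASYMPTOTE) * math.exp(k * t_months) + ASYMPTOTE

# With a hypothetical k = -0.05 per month, survival at 36 months is ~8.8%,
# which falls inside the 6.8-21.3% range reported under natural rainfall
s36 = seed_survival(36.0, -0.05)
```

Larger |k| values (faster decay, as under irrigation) push the projected seed bank life down toward the reported 3-year minimum; smaller |k| values (buried seed under natural rainfall) extend it toward 11 years.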
Abstract:
The present study set out to test the hypothesis through field and simulation studies that the incorporation of short-term summer legumes, particularly annual legume lablab (Lablab purpureus cv. Highworth), in a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated, large plot experiments were established at five commercial properties using their machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study on various other biennial and perennial summer forage legumes in rotation with wheat and influenced by phosphorus (P) supply (10 and 40 kg P/ha) was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P in subsurface soil (20 kg P/ha). The lablab grown at the commercial sites produced between 3 and 6 t/ha of forage over 2-3 month periods, whereas the following wheat crop with no applied fertiliser yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, but the profitability of wheat following lablab was slightly higher than that of the wheat following fallow because of greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and additional costs associated with the management of fallow and in-crop herbicide applications for a fallow-wheat system.
The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern by using economic (PreCAPS) and biophysical (Agricultural Production Systems Simulation, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and simulation studies indicated that ~50% of the time a wheat crop would not be planted or would fail to produce a profitable crop (grain yield less than 1 t/ha) because of low and unreliable rainfall in winter, whereas forage lablab in summer would produce a profitable crop, with a forage yield of more than 3 t/ha, ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab (of 25 seasons, i.e. 90%). An opportunistic double-cropping of lablab in summer and wheat in winter is also viable and profitable in 50% of the years. Simulation studies also indicated that an opportunistic lablab-wheat cropping can reduce the potential runoff+drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.