972 results for Rural irrigated soil
Abstract:
Background: The onsite treatment of sewage and effluent disposal is widely prevalent in rural and urban fringe areas due to the general unavailability of reticulated wastewater collection systems. Despite the low technology of these systems, failure is common, in many cases leading to adverse public health and environmental consequences. It is therefore important that careful consideration is given to the design and location of onsite sewage treatment systems. This requires an understanding of the factors that influence treatment performance. The use of subsurface absorption systems is the most common form of effluent disposal for onsite sewage treatment, particularly for septic tanks. In the case of septic tanks, a subsurface disposal system is generally an integral component of the sewage treatment process. Site-specific factors play a key role in the onsite treatment of sewage.

The project: The primary aims of the research project were:
• to relate the treatment performance of onsite sewage treatment systems to soil conditions at the site;
• to evaluate current research relating to onsite sewage treatment; and
• to identify key issues where there is currently a lack of relevant research.
These tasks were undertaken with the objective of facilitating the development of performance-based planning and management strategies for onsite sewage treatment. The primary focus of this research project has been on septic tanks; by implication, the investigation has been confined to subsurface soil absorption systems. The design and treatment processes taking place within the septic tank chamber itself did not form part of the investigation. Five broad categories of soil types prevalent in the Brisbane region were considered in this project. The number of systems investigated was based on the proportionate area of urban development within the Brisbane region located on each of the different soil types. In the initial phase of the investigation, the majority of the systems evaluated were septic tanks, although a small number of aerobic wastewater treatment systems (AWTS) were also included. The primary aim was to compare the effluent quality of systems employing different generic treatment processes. It is important to note that the number of each type of system investigated was relatively small, which does not permit a statistical analysis of the results for comparing different systems. This is an important issue considering the large number of soil physico-chemical parameters and landscape factors that can influence treatment performance, and their wide variability.

The report: This report is the last in a series of three reports focussing on the performance evaluation of onsite treatment of sewage. The research project was initiated at the request of the Brisbane City Council. The project component discussed in the current report outlines the detailed soil investigations undertaken at a selected number of sites. In the initial field sampling, a number of soil chemical properties were assessed as indicators to investigate the extent of effluent flow and to help understand which soil factors renovate the applied effluent. The soil profile attributes, especially texture, structure and moisture regime, were examined in an engineering sense to determine their effect on the movement of water into and through the soil.

It is important to note that not only the physical characteristics, but also the chemical characteristics of the soil and landscape factors, play a key role in the effluent renovation process. In order to understand the complex processes taking place in a subsurface effluent disposal area, influential parameters were identified using soil chemical concepts. Accordingly, the primary focus of this final phase of the research project was to identify linkages between various soil chemical parameters and landscape patterns and their contribution to the effluent renovation process. The research outcomes will contribute to the development of robust criteria for evaluating the performance of subsurface effluent disposal systems.

The outcomes: The key findings from the soil investigations undertaken are:
• Effluent renovation is undertaken primarily by a combination of various soil physico-chemical parameters and landscape factors, making the effluent renovation processes strongly site dependent.
• Decisions regarding site suitability for effluent disposal should not be based purely on soil type. A number of other factors, such as the site location in the catena, the drainage characteristics and other physical and chemical characteristics, also exert a strong influence on site suitability.
• Sites which are difficult to characterise in terms of suitability for effluent disposal will require a detailed soil physical and chemical analysis to a minimum depth of 1.2 m.
• The Ca:Mg ratio and Exchangeable Sodium Percentage (ESP) are important parameters in soil suitability assessment. A Ca:Mg ratio of less than 0.5 would generally indicate a high ESP, which in turn would mean that Na and possibly Mg are the dominant exchangeable cations, leading to probable clay dispersion.
• A Ca:Mg ratio greater than 0.5 would generally indicate a low ESP in the profile, which in turn indicates increased soil stability.
• In soils with a higher clay percentage, a low ESP can have a significant effect.
• The presence of high exchangeable Na can be counteracted by the presence of swelling clays and an exchange complex co-dominated by exchangeable Ca and exchangeable Mg. This aids absorption of cations at depth, thereby reducing the likelihood of dispersion.
• Salt is continually added to the soil by the effluent, and problems may arise if the added salts accumulate to a concentration that is harmful to the soil structure. Under such conditions, good drainage is essential to allow continuous movement of water and salt through the profile. Therefore, for a site to be sustainable, it would have a maximum application rate of effluent, dependent on subsurface characteristics and the surface area available for effluent disposal.
• The dosing regime for effluent disposal can play a significant role in preventing salt accumulation at poorly draining sites. Though intermittent dosing was not considered satisfactory for the removal of the clogging mat layer, it has positive attributes in the context of removing accumulated salts from the soil.
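The Ca:Mg and ESP findings above can be expressed as a simple screening rule. The sketch below is purely illustrative and assumes only the 0.5 Ca:Mg threshold quoted in the findings; actual site assessment would still require the full physical and chemical analysis described.

```python
def dispersion_risk(ca_mg_ratio):
    """Qualitative screening of clay dispersion risk from the Ca:Mg ratio.

    Illustrative only: uses the 0.5 threshold quoted in the report's findings,
    where a low Ca:Mg ratio generally indicates a high Exchangeable Sodium
    Percentage (ESP) and therefore probable clay dispersion.
    """
    if ca_mg_ratio < 0.5:
        return "High ESP likely: Na (and possibly Mg) dominant; probable clay dispersion"
    return "Low ESP likely: profile expected to be more stable"

print(dispersion_risk(0.3))   # high-risk example
print(dispersion_risk(1.2))   # low-risk example
```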
Abstract:
Through a forest inventory in parts of the Amudarya river delta, Central Asia, we assessed the impact of ongoing forest degradation on the emissions of greenhouse gases (GHG) from soils. Interpretation of aerial photographs from 2001, combined with forest inventory data from 1990 and a field survey in 2003, provided comprehensive information about the extent of, and changes in, the natural tugai riparian forests and tree plantations in the delta. The findings show an average annual deforestation rate of almost 1.3% and an even higher rate of land use change from tugai forests to land with only sparse tree cover. These annual rates of deforestation and forest degradation are higher than the global annual forest loss. By 2003, the tugai forest area had decreased drastically, to about 60% of that recorded in the 1990 inventory. Significant differences in soil GHG emissions between forest and agricultural land use underscore the impact of the ongoing land use change on the emission of soil-borne GHGs. The conversion of tugai forests into irrigated croplands will release 2.5 t CO2 equivalents per hectare per year due to elevated emissions of N2O and CH4. This demonstrates that the ongoing transformation of tugai forests into agricultural land-use systems not only leads to a loss of biodiversity and of a unique ecosystem, but also substantially impacts the biosphere-atmosphere exchange of GHG and soil C and N turnover processes.
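The CO2-equivalent figure combines the extra N2O and CH4 emitted after conversion, weighted by their global warming potentials. A minimal sketch of that conversion follows; the flux values and the 100-year GWP factors used (298 for N2O and 25 for CH4, as in IPCC AR4) are assumptions for illustration, not the study's exact inputs.

```python
# Convert annual N2O and CH4 emissions to CO2 equivalents.
# GWP100 factors of 298 (N2O) and 25 (CH4) follow IPCC AR4; the flux
# values below are placeholders, not the measured fluxes from the study.
GWP_N2O = 298.0
GWP_CH4 = 25.0

def co2_equivalents(n2o_kg_ha_yr, ch4_kg_ha_yr):
    """Return t CO2-eq per hectare per year for given N2O and CH4 fluxes (kg/ha/yr)."""
    kg_co2eq = n2o_kg_ha_yr * GWP_N2O + ch4_kg_ha_yr * GWP_CH4
    return kg_co2eq / 1000.0  # kg -> t

# Placeholder fluxes chosen only to show the arithmetic (~2.5 t CO2-eq/ha/yr).
print(co2_equivalents(n2o_kg_ha_yr=7.0, ch4_kg_ha_yr=16.0))
```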
Abstract:
Irrigation is known to stimulate soil microbial carbon and nitrogen turnover and potentially the emissions of nitrous oxide (N2O) and carbon dioxide (CO2). We conducted a study to evaluate the effect of three different irrigation intensities on soil N2O and CO2 fluxes and to determine if irrigation management can be used to mitigate N2O emissions from irrigated cotton on black vertisols in South-Eastern Queensland, Australia. Fluxes were measured over the entire 2009/2010 cotton growing season with a fully automated chamber system that measured emissions on a sub-daily basis. Irrigation intensity had a significant effect on CO2 emissions. More frequent irrigation stimulated soil respiration, and seasonal CO2 fluxes ranged from 2.7 to 4.1 Mg-C ha−1 for the treatments with the lowest and highest irrigation frequency, respectively. N2O emissions were episodic, with the highest emissions occurring when heavy rainfall or irrigation coincided with elevated soil mineral N levels; seasonal emissions ranged from 0.80 to 1.07 kg N2O-N ha−1 for the different treatments. Emission factors (EF = proportion of N fertilizer emitted as N2O) over the cotton cropping season, uncorrected for background emissions, ranged from 0.40 to 0.53% of total N applied for the different treatments. There was no significant effect of the different irrigation treatments on soil N2O fluxes because the highest emissions occurred in all treatments following heavy rainfall from a series of summer thunderstorms, which overrode the effect of the irrigation treatments. However, higher irrigation intensity increased the cotton yield and therefore reduced the N2O intensity (N2O emission per lint yield) of this cropping system. Our data suggest that there is only limited scope to reduce absolute N2O emissions through different irrigation intensities in irrigated cotton systems with summer-dominated rainfall. However, the significant impact of the irrigation treatments on the N2O intensity clearly shows that irrigation can readily be used to optimize the N2O intensity of such a system.
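The emission factors quoted are simply the seasonal N2O-N loss expressed as a share of the fertiliser N applied, uncorrected for background emissions. A minimal sketch of that calculation, using a hypothetical fertiliser rate of 200 kg N ha−1 since the abstract does not state the actual rate:

```python
def emission_factor(seasonal_n2o_n_kg_ha, n_applied_kg_ha):
    """Emission factor (%) = seasonal N2O-N emitted / fertiliser N applied * 100,
    uncorrected for background emissions."""
    return 100.0 * seasonal_n2o_n_kg_ha / n_applied_kg_ha

# Seasonal losses of 0.80-1.07 kg N2O-N/ha are from the abstract; the
# 200 kg N/ha fertiliser rate is a hypothetical illustration. With this
# assumed rate, the computed EFs fall in roughly the 0.4-0.5% range.
for loss in (0.80, 1.07):
    print(f"{emission_factor(loss, 200.0):.2f} %")
```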
Abstract:
Background and Aims: Irrigation management affects soil water dynamics as well as the soil microbial carbon and nitrogen turnover, and potentially the biosphere-atmosphere exchange of greenhouse gases (GHG). We present a study on the effect of three irrigation treatments on the emissions of nitrous oxide (N2O) from irrigated wheat on black vertisols in South-Eastern Queensland, Australia. Methods: Soil N2O fluxes from wheat were monitored over one season with a fully automated system that measured emissions on a sub-daily basis. Measurements were taken from three subplots for each treatment within a randomized split-plot design. Results: The highest N2O emissions occurred after rainfall or irrigation, and the amount of irrigation water applied was found to influence the magnitude of these “emission pulses”. Daily N2O emissions varied from -0.74 to 20.46 g N2O-N ha-1 day-1, resulting in seasonal losses ranging from 0.43 to 0.75 kg N2O-N ha-1 season-1 for the different irrigation treatments. Emission factors (EF = proportion of N fertilizer emitted as N2O) over the wheat cropping season, uncorrected for background emissions, ranged from 0.2 to 0.4% of total N applied for the different treatments. The highest seasonal N2O emissions were observed in the treatment with the highest irrigation intensity; however, the N2O intensity (N2O emission per crop yield) was highest in the treatment with the lowest irrigation intensity. Conclusions: Our data suggest that the timing and amount of irrigation can effectively be used to reduce N2O losses from irrigated agricultural systems; however, in order to develop sustainable mitigation strategies, the N2O intensity of a cropping system is an important concept that needs to be taken into account.
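The N2O intensity referred to in the conclusions normalises the seasonal emission by crop yield, so a treatment can emit more in absolute terms yet have a lower intensity if the extra water lifts yield enough. A brief sketch of the idea; the seasonal losses are from the abstract, but the grain yields are hypothetical, chosen only to illustrate the concept.

```python
def n2o_intensity(seasonal_n2o_n_kg_ha, yield_t_ha):
    """Yield-scaled emission: kg N2O-N emitted per tonne of grain produced."""
    return seasonal_n2o_n_kg_ha / yield_t_ha

# Seasonal losses of 0.43-0.75 kg N2O-N/ha are reported above; the yields
# (t/ha) below are placeholders, not the study's data.
low_irrigation = n2o_intensity(0.43, 2.0)    # less water, lower yield
high_irrigation = n2o_intensity(0.75, 4.5)   # more water, higher yield
# The wetter treatment emits more in total but less per tonne of grain.
print(low_irrigation, high_irrigation)
```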
Abstract:
A unique high temporal frequency dataset from an irrigated cotton-wheat rotation was used to test the agroecosystem model DayCent for simulating daily N2O emissions from sub-tropical vertisols under different irrigation intensities. DayCent was able to simulate the effect of different irrigation intensities on N2O fluxes and yield, although it tended to overestimate seasonal fluxes during the cotton season. DayCent accurately predicted soil moisture dynamics and the timing and magnitude of high fluxes associated with fertilizer additions and irrigation events. At the daily scale we found a good correlation between predicted and measured N2O fluxes (r2 = 0.52), confirming that DayCent can be used to test agricultural practices for mitigating N2O emissions from irrigated cropping systems. A 25-year scenario analysis indicated that N2O losses from irrigated cotton-wheat rotations on black vertisols in Australia can be substantially reduced by an optimized fertilizer and irrigation management system (i.e. frequent irrigation and avoidance of excessive fertilizer application), while sustaining maximum yield potentials.
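The daily-scale fit quoted (r2 = 0.52) corresponds to a goodness-of-fit statistic between predicted and measured fluxes. A minimal sketch of how such a statistic can be computed is shown below; the flux arrays are placeholders, not the study's data, and the squared Pearson correlation used here is only one common way of reporting r2.

```python
import numpy as np

def r_squared(measured, predicted):
    """Squared Pearson correlation between measured and predicted daily fluxes."""
    r = np.corrcoef(measured, predicted)[0, 1]
    return r ** 2

# Placeholder daily N2O fluxes (g N2O-N/ha/day), not the study's measurements.
measured = np.array([0.5, 1.2, 8.4, 3.1, 0.9, 15.2, 2.4])
predicted = np.array([0.7, 1.0, 6.9, 4.0, 1.1, 12.8, 3.0])
print(round(r_squared(measured, predicted), 2))
```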
Abstract:
Microbial respiratory reduction of nitrous oxide (N2O) to dinitrogen (N2) via denitrification plays a key role within the global N-cycle since it is the most important process for converting reactive nitrogen back into inert molecular N2. However, due to methodological constraints, we still lack a comprehensive, quantitative understanding of denitrification rates and controlling factors across various ecosystems. We investigated N2, N2O and NO emissions from irrigated cotton fields within the Aral Sea Basin using the He/O2 atmosphere gas flow soil core technique and an incubation assay. NH4NO3 fertilizer equivalent to 75 kg ha−1, and irrigation water adjusting the water holding capacity to 70, 100 and 130%, were applied to the incubation vessels to assess their influence on gaseous N emissions. Under soil conditions as they are naturally found after concomitant irrigation and fertilization, denitrification was the dominant process and N2 the main end product of denitrification. The mean ratios of N2/N2O emissions increased with increasing soil moisture content. N2 emissions exceeded N2O emissions by a factor of 5 ± 2 at 70% soil water holding capacity (WHC) and a factor of 55 ± 27 at 130% WHC. The mean ratios of N2O/NO emissions varied between 1.5 ± 0.4 (70% WHC) and 644 ± 108 (130% WHC). The magnitude of N2 emissions for irrigated cotton was estimated to be in the range of 24 ± 9 to 175 ± 65 kg-N ha−1 season−1, while emissions of NO were only of minor importance (between 0.1 and 0.7 kg-N ha−1 season−1). The findings demonstrate that for irrigated dryland soils in the Aral Sea Basin, denitrification is a major pathway of N loss and substantial amounts of N fertilizer are lost to the atmosphere as N2.
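Because N2 is difficult to measure outside the He/O2 core system, the reported N2/N2O ratios are what allow total denitrification losses to be approximated from N2O data. A minimal sketch of that scaling, using the mean ratios quoted above; the N2O input value is a placeholder, not a measured seasonal loss.

```python
def estimate_n2_loss(n2o_n_kg_ha, n2_to_n2o_ratio):
    """Scale a measured N2O-N loss by the mean N2/N2O ratio to approximate N2-N loss."""
    return n2o_n_kg_ha * n2_to_n2o_ratio

# Mean ratios from the incubation: ~5 at 70% WHC and ~55 at 130% WHC.
# The 2.0 kg N2O-N/ha loss used here is a placeholder value.
for whc, ratio in ((70, 5), (130, 55)):
    print(whc, "% WHC ->", estimate_n2_loss(2.0, ratio), "kg N2-N/ha")
```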
Abstract:
Nitrous oxide emissions were monitored at three sites over a 2-year period in irrigated cotton fields in Khorezm, Uzbekistan, a region located in the arid deserts of the Aral Sea Basin. The fields were managed using different fertilizer management strategies and irrigation water regimes. N2O emissions varied widely between years, within one year throughout the vegetation season, and between the sites. The amount of irrigation water applied, the amount and type of N fertilizer used, and topsoil temperature had the greatest effect on these emissions. Very high N2O emissions of up to 3000 μg N2O-N m−2 h−1 were measured in periods following N-fertilizer application in combination with irrigation events. These “emission pulses” accounted for 80–95% of the total N2O emissions between April and September and varied from 0.9 to 6.5 kg N2O-N ha−1. Emission factors (EF), uncorrected for background emission, ranged from 0.4% to 2.6% of total N applied, corresponding to an average EF of 1.48% of applied N fertilizer lost as N2O-N. This is in line with the default global average value of 1.25% of applied N used in calculations of N2O emissions by the Intergovernmental Panel on Climate Change. During the emission pulses, which were triggered by high soil moisture and high availability of mineral N, a clear diurnal pattern of N2O emissions was observed, driven by daily changes in topsoil temperature. For these periods, air sampling from 8:00 to 10:00 and from 18:00 to 20:00 was found to best represent the mean daily N2O flux rates. The wet topsoil conditions caused by irrigation favored the production of N2O from NO3− fertilizers, but not from NH4+ fertilizers, indicating that denitrification was the main process causing N2O emissions. It is therefore argued that there is scope for reducing N2O emissions from irrigated cotton production, for example through the exclusive use of NH4+ fertilizers. Advanced application and irrigation techniques such as subsurface fertilizer application, drip irrigation and fertigation may also minimize N2O emissions from this regionally dominant agro-ecosystem.
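To relate the pulse fluxes to the seasonal totals, the hourly chamber values (μg N2O-N m−2 h−1) can be converted to per-hectare daily rates. A minimal sketch of that unit conversion:

```python
def flux_ug_m2_h_to_kg_ha_day(flux_ug_m2_h):
    """Convert a chamber flux in ug N2O-N m-2 h-1 to kg N2O-N ha-1 day-1."""
    # 1 ug = 1e-9 kg, 1 ha = 1e4 m2, 1 day = 24 h
    return flux_ug_m2_h * 1e-9 * 1e4 * 24

# The peak flux of 3000 ug N2O-N m-2 h-1 reported above would correspond to
# roughly 0.72 kg N2O-N/ha/day if it were sustained over a full day.
print(flux_ug_m2_h_to_kg_ha_day(3000))
```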
Abstract:
Amelioration of sodic soils is commonly achieved by applying gypsum, which increases soil hydraulic conductivity by altering soil chemistry. The magnitude of hydraulic conductivity increases expected in response to gypsum applications depends on soil properties including clay content, clay mineralogy, and bulk density. The soil analyzed in this study was a kaolinite rich sodic clay soil from an irrigated area of the Lower Burdekin coastal floodplain in tropical North Queensland, Australia. The impact of gypsum amelioration was investigated by continuously leaching soil columns with a saturated gypsum solution, until the hydraulic conductivity and leachate chemistry stabilized. Extended leaching enabled the full impacts of electrolyte effects and cation exchange to be determined. For the columns packed to 1.4 g/cm3, exchangeable sodium concentrations were reduced from 5.0 ± 0.5 mEq/100 g to 0.41 ± 0.06 mEq/100 g, exchangeable magnesium concentrations were reduced from 13.9 ± 0.3 mEq/100 g to 4.3 ± 2.12 mEq/100 g, and hydraulic conductivity increased to 0.15 ± 0.04 cm/d. For the columns packed to 1.3 g/cm3, exchangeable sodium concentrations were reduced from 5.0 ± 0.5 mEq/100 g to 0.51 ± 0.03 mEq/100 g, exchangeable magnesium concentrations were reduced from 13.9 ± 0.3 mEq/100 g to 0.55 ± 0.36 mEq/100 g, and hydraulic conductivity increased to 0.96 ± 0.53 cm/d. The results of this study highlight that both sodium and magnesium need to be taken into account when determining the suitability of water quality for irrigation of sodic soils and that soil bulk density plays a major role in controlling the extent of reclamation that can be achieved using gypsum applications.
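The reclamation effect can be summarised as relative changes in the exchange complex. A small arithmetic sketch for the columns packed to 1.3 g/cm3, using the mean values quoted above:

```python
def percent_reduction(initial, final):
    """Percentage reduction in exchangeable cation concentration after leaching."""
    return 100.0 * (initial - final) / initial

# Mean values (mEq/100 g) reported above for the columns packed to 1.3 g/cm3.
print(round(percent_reduction(5.0, 0.51), 1), "% reduction in exchangeable Na")
print(round(percent_reduction(13.9, 0.55), 1), "% reduction in exchangeable Mg")
```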
Abstract:
Variable-rate technologies and site-specific crop nutrient management require real-time spatial information about the potential for response to in-season crop management interventions. Thermal and spectral properties of canopies can provide relevant information for non-destructive measurement of crop water and nitrogen stresses. In previous studies, foliage temperature was successfully estimated from canopy-scale (mixed foliage and soil) temperatures, and the multispectral Canopy Chlorophyll Content Index (CCCI) was effective in measuring canopy-scale N status in rainfed wheat (Triticum aestivum L.) systems in Horsham, Victoria, Australia. In the present study, results showed that under irrigated wheat systems in Maricopa, Arizona, USA, the theoretical derivation of foliage temperature unmixing produced relationships similar to those in Horsham. Derivation of the CCCI led to an r2 relationship with chlorophyll a of 0.53 after Zadoks stage 43. This was later than the relationship (r2 = 0.68) developed for Horsham after Zadoks stage 33, but early enough to be used for potential mid-season N fertilizer recommendations. Additionally, ground-based hyperspectral data estimated plant N (g kg-1) in Horsham with an r2 = 0.86 but was confounded by water supply and N interactions. By combining canopy thermal and spectral properties, varying water and N status can potentially be identified, eventually permitting targeted N applications to those parts of a field where N can be used most efficiently by the crop.
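The foliage temperature unmixing mentioned above rests on treating the canopy-scale radiometric temperature as a mixture of foliage and soil contributions. The sketch below uses a simplified linear mixing assumption weighted by fractional vegetation cover; the paper's theoretical derivation may differ in detail, and the input values are placeholders.

```python
def foliage_temperature(t_mixed, t_soil, f_veg):
    """Recover foliage temperature from a mixed foliage+soil radiometric temperature,
    assuming a simple linear mixture weighted by fractional vegetation cover f_veg."""
    if not 0 < f_veg <= 1:
        raise ValueError("f_veg must be in (0, 1]")
    return (t_mixed - (1.0 - f_veg) * t_soil) / f_veg

# Placeholder values: 32 C mixed canopy-scale temperature, 45 C bare soil,
# 60% vegetation cover -> foliage around 23.3 C.
print(round(foliage_temperature(32.0, 45.0, 0.6), 1))
```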
Abstract:
Groundwater tables are rising beneath irrigated fields in some areas of the Lower Burdekin in North Queensland, Australia. The soils where this occurs are predominantly sodic clay soils with low hydraulic conductivities. Many of these soils have been treated by applying gypsum or by increasing the salinity of irrigation water by mixing saline groundwater with fresh river water. While the purpose of these treatments is to increase infiltration into the surface soils and improve productivity of the root zone, it is thought that the treatments may have altered the soil hydraulic properties well below the root zone, leading to increased groundwater recharge and rising water tables. In this paper we discuss the use of column experiments and HYDRUS modelling, with major ion reaction and transport and soil water chemistry-dependent hydraulic conductivity, to assess the likely depth, magnitude and timing of the impacts of surface soil amelioration on soil hydraulic properties below the root zone, and hence on groundwater recharge. In the experiments, columns of sodic clays from the Lower Burdekin were leached for extended periods of time with either gypsum solutions or mixed cation salt solutions, and changes in hydraulic conductivity were measured. Leaching with a gypsum solution for an extended time period, until the flow rate stabilised, resulted in an approximately twenty-fold increase in hydraulic conductivity when compared with a low salinity, mixed cation solution. HYDRUS modelling was used to highlight the role of those factors which might influence the impacts of soil treatment, particularly at depth, including the large amounts of rain during the relatively short wet season and the presence of thick, low permeability clay layers.
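In column experiments of this kind, hydraulic conductivity is commonly back-calculated from the measured outflow using Darcy's law. The constant-head sketch below is an illustration of that arithmetic only; the column dimensions and outflow are placeholders, and the authors' exact measurement procedure may differ.

```python
def hydraulic_conductivity(outflow_cm3_per_day, column_length_cm,
                           column_area_cm2, head_difference_cm):
    """Constant-head estimate from Darcy's law: K = Q * L / (A * dH), in cm/day."""
    return (outflow_cm3_per_day * column_length_cm) / (column_area_cm2 * head_difference_cm)

# Placeholder column geometry and outflow, purely to show the calculation.
print(hydraulic_conductivity(outflow_cm3_per_day=12.0, column_length_cm=10.0,
                             column_area_cm2=20.0, head_difference_cm=15.0))
```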
Abstract:
In semi-arid areas such as western Nebraska, interest in subsurface drip irrigation (SDI) for corn is increasing due to restricted irrigation allocations. However, quantification of crop response to nitrogen (N) applications with SDI, and of the environmental benefits of multiple in-season (IS) SDI N applications instead of a single early-season (ES) surface application, is lacking. The study was conducted in 2004, 2005, and 2006 at the University of Nebraska-Lincoln West Central Research and Extension Center in North Platte, Nebraska, comparing two N application methods (IS and ES) and three N rates (128, 186, and 278 kg N ha-1) using a randomized complete block design with four replications. No grain yield or biomass response was observed in 2004. In 2005 and 2006, corn grain yield and biomass production increased with increasing N rates, and the IS treatment increased grain yield, total N uptake, and gross return after N application costs (GRN) compared to the ES treatment. Chlorophyll meter readings taken at the R3 corn growth stage in 2006 showed that less N was supplied to the plant with the ES treatment than with the IS treatment. At the end of the study, soil NO3-N masses at the 0.9 to 1.8 m depth were greater under the IS treatment than under the ES treatment. Results suggested that greater losses of NO3-N below the root zone under the ES treatment may have had a negative effect on corn production. Under SDI systems, fertigating a recommended N rate at various corn growth stages can increase yields and GRN and reduce NO3-N leaching in soils compared to concentrated early-season applications.
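Gross return after N application costs (GRN) weighs grain revenue against the cost of the nitrogen applied. The sketch below is a simplified version that counts only the fertiliser cost; the study's GRN may also include application costs, and the yield and prices used here are hypothetical, as the abstract does not report them.

```python
def gross_return_after_n(yield_mg_ha, grain_price_per_mg,
                         n_rate_kg_ha, n_price_per_kg):
    """Simplified GRN ($/ha) = grain revenue minus the cost of the nitrogen applied."""
    return yield_mg_ha * grain_price_per_mg - n_rate_kg_ha * n_price_per_kg

# N rates are those tested in the study; the yield and prices are hypothetical.
for n_rate in (128, 186, 278):
    print(n_rate, "kg N/ha ->", gross_return_after_n(12.0, 150.0, n_rate, 1.1), "$/ha")
```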
Abstract:
Dairy farms in subtropical Australia use irrigated, annually sown short-term ryegrass (Lolium multiflorum) or mixtures of short-term ryegrass and white (Trifolium repens) and Persian (shaftal) (T. resupinatum) clover during the winter-spring period in all-year-round milk production systems. A series of small plot cutting experiments was conducted in 3 dairying regions (tropical upland, north Queensland, and subtropical southeast Queensland and northern New South Wales) to determine the most effective rate and frequency of application of nitrogen (N) fertiliser. The experiments were not grazed, nor was harvested material returned to the plots after sampling. Rates up to 100 kg N/ha.month (as urea or calcium ammonium nitrate) and up to 200 kg N/ha every 2 months (as urea) were applied to pure stands of ryegrass in 1991. In 1993 and 1994, urea, at rates up to 150 kg N/ha.month and up to 200 kg N/ha every 2 months, was applied to pure stands of ryegrass; urea, at rates up to 50 kg N/ha.month, was also applied to ryegrass-clover mixtures. The results indicate that applications of 50-85 kg N/ha.month can be recommended for short-term ryegrass pastures throughout the subtropics and tropical uplands of eastern Australia, irrespective of soil type. At this rate, dry matter yields will reach about 90% of their potential, forage nitrogen concentration will be increased, there is minimal risk to stock from nitrate poisoning, and there will be no substantial increase in soil N. The rate of N for ryegrass-clover pastures is slightly higher than for pure ryegrass but, at these rates, the clover component will be suppressed. However, increased ryegrass yields and higher forage nitrogen concentrations will compensate for the reduced clover component. At application rates up to 100 kg N/ha.month, build-up of NO3−-N and NH4+-N in soil was generally restricted to the surface layers (0-20 cm) of the soil, but there was a substantial increase throughout the soil profile at 150 kg N/ha.month. The build-up of NO3−-N and NH4+-N was greater, and was found at lower N rates, on the lighter soil compared with heavy clays. Generally, most of the soil N was in the NO3−-N form and most was in the top 20 cm.
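The recommendation that 50-85 kg N/ha.month reaches about 90% of potential dry matter yield reflects a diminishing-returns response to N. The sketch below uses a Mitscherlich-type curve purely to illustrate that logic; it is not the response function fitted in the study, and the curvature parameter is hypothetical.

```python
import math

def mitscherlich_yield(n_rate, y_max, c):
    """Diminishing-returns yield response: Y = Ymax * (1 - exp(-c * N))."""
    return y_max * (1.0 - math.exp(-c * n_rate))

def n_rate_for_fraction(fraction, c):
    """N rate at which the curve reaches a given fraction of Ymax."""
    return -math.log(1.0 - fraction) / c

# Hypothetical curvature: with c = 0.03 per kg N, ~90% of potential yield is
# reached near 77 kg N/ha.month, inside the 50-85 kg recommendation band.
c = 0.03
print(round(n_rate_for_fraction(0.90, c), 1), "kg N/ha.month")
print(round(mitscherlich_yield(77, y_max=100.0, c=c), 1), "% of potential yield")
```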
Abstract:
An experiment using herds of approximately 20 cows (farmlets) assessed the effects of high stocking rates on production and profitability of feeding systems based on dryland and irrigated perennial ryegrass-based pastures in a Mediterranean environment in South Australia over 4 years. A target level of milk production of 7000 L/cow.year was set, based on predicted intakes of 2.7 t DM/cow.year as concentrates, pasture intakes from 1.5 to 2.7 t/cow.year and purchased fodder. In years 1 and 2, up to 1.5 t DM/cow.year of purchased fodder was used, and in years 3 and 4 the amounts were increased if necessary to enable milk production per cow to be maintained at target levels. Cows in dryland farmlets calved from March to May inclusive and were stocked at 2.5, 2.9, 3.3, 3.6 and 4.1 cows/ha, while those in irrigated farmlets calved from August to October inclusive and were stocked at 4.1, 5.2, 6.3 and 7.4 cows/ha. In the first 2 years, when inputs of purchased fodder were limited, milk production per cow was reduced at higher stocking rates (P < 0.01), but in years 3 and 4 there were no differences. Mean production was 7149 kg/cow.year in years 1 and 2, and 8162 kg/cow.year in years 3 and 4. Production per hectare was very closely related to stocking rate in all years (P < 0.01), increasing from 18 to 34 t milk/ha.year for dryland farmlets (1300 to 2200 kg milk solids/ha) and from 30 to 60 t milk/ha.year for irrigated farmlets (2200 to 4100 kg milk solids/ha). Almost all of these increases were attributed to the increases in grain and purchased fodder inputs associated with the increases in stocking rate. Net pasture accumulation rates and pasture harvest were generally not altered by stocking rate, though as stocking rate increased there was a shift towards more of the pasture being grazed and less conserved in both dryland and irrigated farmlets. Total pasture harvest averaged approximately 8 and 14 t DM/ha.year for dryland and irrigated pastures, respectively. An exception was at the highest stocking rate under irrigation, where pugging during winter was associated with a 14% reduction in annual pasture growth. There were several indications that these high stocking rates may not be sustainable without substantial changes in management practice. There were large and positive nutrient balances and associated increases in soil mineral content (P < 0.01), especially for phosphorus and nitrate nitrogen, with both stocking rate and succeeding years. Levels under irrigation were considerably higher (up to 90 and 240 mg/kg of soil for nitrate nitrogen and phosphorus, respectively) than under dryland pastures (60 and 140 mg/kg, respectively). Soil organic carbon levels did not change with stocking rate, indicating a high level of utilisation of forage grown. Weed ingress was also high (up to 22% DM) in all treatments, especially in heavily stocked irrigated pastures during winter. It was concluded that the higher stocking rates used exceeded those feasible for Mediterranean pastures in this environment, and suggested upper levels of stocking are 2.5 cows/ha for dryland pastures and 5.2 cows/ha for irrigated pastures. Sustaining these stocking rates will require further development of management practices to avoid large increases in soil minerals and weed invasion of pastures.
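The close relationship between stocking rate and per-hectare production follows directly from multiplying per-cow production by cows per hectare. A quick check of that arithmetic against the figures reported above:

```python
def milk_per_ha(milk_per_cow_kg, stocking_rate_cows_ha):
    """Annual milk production per hectare (t/ha.year)."""
    return milk_per_cow_kg * stocking_rate_cows_ha / 1000.0

# Using the mean per-cow production for years 3 and 4 (8162 kg/cow.year): the
# highest irrigated stocking rate (7.4 cows/ha) gives ~60 t milk/ha.year and
# the highest dryland rate (4.1 cows/ha) gives ~33 t/ha.year, close to the
# per-hectare ranges reported above.
print(round(milk_per_ha(8162, 7.4), 1))
print(round(milk_per_ha(8162, 4.1), 1))
```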
Abstract:
It has been reported that high-density planting of sugarcane can improve cane and sugar yield by promoting rapid canopy closure and increasing radiation interception earlier in crop growth. It is widely known that the control of adverse soil biota through fumigation (which removes soil biological constraints and improves soil health) can improve cane and sugar yield. Whether the responses to high-density planting and improved soil health are additive or interactive has important implications for the sugarcane production system. Field experiments established at Bundaberg and Mackay, Queensland, Australia, involved all combinations of two row spacings (0.5 and 1.5 m), two planting densities (27 000 and 81 000 two-eyed setts/ha), and two soil fumigation treatments (fumigated and non-fumigated). The Bundaberg experiment had two cultivars (Q124, Q155), was fully irrigated, and was harvested 15 months after planting. The Mackay experiment had one cultivar (Q117), was grown under rainfed conditions, and was harvested 10 months after planting. High-density planting (81 000 setts/ha in 0.5-m rows) did not produce any more cane or sugar yield at harvest than low-density planting (27 000 setts/ha in 1.5-m rows), regardless of location, crop duration (15 v. 10 months), water supply (irrigated v. rainfed), or soil health (fumigated v. non-fumigated). Conversely, soil fumigation generally increased cane and sugar yields regardless of site, row spacing, and planting density. In the Bundaberg experiment there was a large fumigation × cultivar × density interaction (P < 0.01). Cultivar Q155 responded positively to higher planting density in non-fumigated soil but not in fumigated soil, while Q124 showed a negative response to higher planting density in non-fumigated soil but no response in fumigated soil. In the Mackay experiment, Q117 showed a non-significant trend of increasing yield in response to increasing planting density in non-fumigated soil, similar to the Q155 response in non-fumigated soil at Bundaberg. The similarity in yield across the range of row spacings and planting densities within experiments was largely due to compensation between stalk number and stalk weight, particularly when fumigation was used to address soil health. Further, the different cultivars (Q124 and Q155 at Bundaberg, and Q117 at Mackay) exhibited differing physiological responses to the fumigation, row spacing, and planting density treatments. These included the rate of tiller initiation and subsequent loss, changes in stalk weight, and propensity to lodging. These responses suggest that there may be potential for selecting cultivars suited to different planting configurations.
Abstract:
A 2000-03 study to improve irrigation efficiency of grassed urban public areas in northern Australia found that it would be difficult to grow most species in dry areas without supplementary watering. Sporobolus virginicus and sand couch, Zoysia macrantha, were relatively drought-tolerant. Managers of sporting fields, parks and gardens could more than halve their current water use by irrigating over a long cycle, irrigating according to seasonal conditions, using grasses with low water use, and adopting sound soil management practices that encourage deep rooting. The use of effluent water provides irrigation and fertiliser cost savings and reduces nitrogen and phosphorus discharge to local waterways. Projected savings are $8000/ha/year in water costs for a typical sporting field.