976 results for available water
Abstract:
Vitis vinifera L. cv. Crimson Seedless is a late-season red table grape developed in 1989, with a high market value and increasingly cultivated under protected environments to extend the availability of seedless table grapes into the late fall. The purpose of this work was to evaluate leaf water potential and sap flow as indicators of water stress in Crimson Seedless vines under a standard and a reduced irrigation strategy, the latter consisting of 70% of the standard irrigation depth. Additionally, two sub-treatments were applied, consisting of normal irrigation throughout the growing season and a short irrigation-induced stress period between veraison and harvest. Leaf water potential measurements coherently signaled the variations in crop-available water caused by the different irrigation treatments, suggesting that this plant-based method can be reliably used to identify water-stress conditions. The use of sap flow density data to establish a ratio between a reference 'well-irrigated vine' and less irrigated vines can potentially signal differences in transpiration rates, which may be suitable for improving irrigation management strategies while preventing undesirable levels of water stress. Although all four irrigation strategies resulted in the production of quality table grapes, significant differences (p ≤ 0.05) were found in both berry weight and sugar content between the standard and reduced irrigation treatments. Reduced irrigation slightly increased the average berry size as well as the sugar content and technical maturity index, whereas the 2-week irrigation stress period had a negative effect on these parameters.
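The sap-flow ratio idea described in this abstract can be illustrated with a minimal sketch; the daily aggregation, the night-time threshold and all variable names below are assumptions for illustration, not the procedure used in the study.

```python
import numpy as np

def sap_flow_ratio(deficit_flux, reference_flux):
    """Ratio of sap flow density in a deficit-irrigated vine to a
    well-irrigated reference vine; values well below 1 suggest the
    deficit vine is transpiring less, i.e. developing water stress.
    Inputs are sap flow densities (e.g. g cm-2 h-1) sampled at the
    same times on both vines."""
    deficit_flux = np.asarray(deficit_flux, dtype=float)
    reference_flux = np.asarray(reference_flux, dtype=float)
    # ignore near-zero night-time fluxes to avoid dividing by ~0
    mask = reference_flux > 1e-6
    return deficit_flux[mask].sum() / reference_flux[mask].sum()

# hypothetical hourly sap flow densities for one day
reference = [0.0, 0.1, 0.8, 1.5, 1.9, 1.6, 0.9, 0.2]
deficit   = [0.0, 0.1, 0.6, 1.0, 1.2, 1.1, 0.6, 0.1]
print(f"daily sap flow ratio: {sap_flow_ratio(deficit, reference):.2f}")
```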
Abstract:
Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. Therefore, the assessment of model uncertainty is an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, such as ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool, which attempts to make the best use of the available knowledge in the prediction and thereby presents a practical solution to counteract the limitations which are otherwise imposed on water quality modelling.
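A minimal sketch of the contrast described above — a single deterministic ordinary least squares fit versus a Monte Carlo treatment of uncertainty — is given below. The build-up data and noise treatment are illustrative assumptions, and a residual bootstrap stands in here for the Bayesian weighted least squares scheme used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical observations: antecedent dry days vs pollutant build-up (kg/ha)
dry_days = np.array([1, 2, 3, 5, 7, 9, 12, 14], dtype=float)
build_up = np.array([0.4, 0.7, 0.9, 1.3, 1.6, 1.8, 2.1, 2.2])

# ordinary least squares: one deterministic coefficient estimate
X = np.column_stack([np.ones_like(dry_days), dry_days])
beta_ols, *_ = np.linalg.lstsq(X, build_up, rcond=None)
print("OLS coefficients:", beta_ols)

# Monte Carlo treatment: resample residuals to propagate uncertainty
residuals = build_up - X @ beta_ols
betas = []
for _ in range(2000):
    y_sim = X @ beta_ols + rng.choice(residuals, size=len(build_up), replace=True)
    b, *_ = np.linalg.lstsq(X, y_sim, rcond=None)
    betas.append(b)
betas = np.array(betas)

# predictive spread for a 10-day dry period instead of a single point estimate
pred = betas @ np.array([1.0, 10.0])
print(f"build-up after 10 dry days: {pred.mean():.2f} kg/ha "
      f"(95% interval {np.percentile(pred, 2.5):.2f}-{np.percentile(pred, 97.5):.2f})")
```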
Abstract:
Stormwater pollution is linked to stream ecosystem degradation. In predicting stormwater pollution, various types of modelling techniques are adopted. The accuracy of predictions provided by these models depends on the data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that available water quality datasets in urban areas span only relatively short time scales, unlike water quantity data, which limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that the application of MCCV is likely to result in a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective in model validation when dealing with a small sample size, which hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies.
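The two validation procedures named above can be sketched as follows; the wash-off data, the linear model form and the split sizes are illustrative assumptions rather than the models evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_predict(x_tr, y_tr, x_te):
    """Fit a simple linear wash-off model and predict for the test points."""
    A = np.column_stack([np.ones_like(x_tr), x_tr])
    coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
    return coef[0] + coef[1] * x_te

# hypothetical rainfall intensity (mm/h) vs pollutant wash-off fraction
x = np.array([10, 20, 35, 45, 60, 75, 90, 110], dtype=float)
y = np.array([0.15, 0.25, 0.38, 0.45, 0.55, 0.63, 0.70, 0.78])

# leave-one-out (LOO): each observation is held out exactly once
loo_err = [abs(y[i] - fit_predict(np.delete(x, i), np.delete(y, i), x[i]))
           for i in range(len(x))]

# Monte Carlo cross validation (MCCV): repeated random train/test splits
mccv_err = []
for _ in range(500):
    test = rng.choice(len(x), size=2, replace=False)
    train = np.setdiff1d(np.arange(len(x)), test)
    preds = fit_predict(x[train], y[train], x[test])
    mccv_err.extend(np.abs(y[test] - preds))

print(f"LOO mean absolute error:  {np.mean(loo_err):.3f}")
print(f"MCCV mean absolute error: {np.mean(mccv_err):.3f}")
```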
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat, as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with the total rainfall for October–September as well as with March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield; for the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994, but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was a significant depression in grain yield in 2 (1993 and 1995) out of 9 seasons, attributed to soil moisture depletion and/or low growing-season rainfall. As a result, the overall yield responses were lower than those to 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, or to 2-year medic–wheat or chickpea–wheat rotations, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease common root rot of wheat, caused by Bipolaris sorokiniana, were generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser application, since its severity was significantly correlated with plant-available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which was mainly due to nitrogen accretion in this experiment, can be maintained where short-duration lucerne leys are grown in rotation with wheat.
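The per-100-mm rainfall response reported above is a simple linear scaling; a trivial sketch (coefficients taken from the abstract, the function name and the 450 mm season are assumed for illustration) is:

```python
def lucerne_response(rainfall_mm, dm_per_100mm=0.97, n_per_100mm=26.0):
    """Scale the reported per-100-mm lucerne response to a rainfall total.
    Defaults are the October-September values from the abstract; for
    March-September rainfall use 1.26 t/ha and 36 kg/ha instead."""
    factor = rainfall_mm / 100.0
    return dm_per_100mm * factor, n_per_100mm * factor

dm, n = lucerne_response(450)  # a hypothetical 450 mm season
print(f"~{dm:.1f} t/ha dry matter, ~{n:.0f} kg/ha nitrogen yield")
```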
Abstract:
In dryland agricultural systems of the subtropical, semi-arid region of north-eastern Australia, water is the most limiting resource. Crop productivity depends on the efficient use of rainfall and of available water stored in the soil during fallow. Agronomic management practices, including a period of fallow, stubble retention, and reduced tillage, enhance reserves of soil water. However, access to stored water in these soils may be restricted by growth-limiting conditions in the rooting zone of the crop, termed subsoil constraints. Subsoil constraints may include compacted or gravel layers (physical); sodicity, salinity, acidity, nutrient deficiencies and the presence of toxic elements (chemical); and low microbial activity (biological). Several of these constraints may occur together in some soils. Farmers have often not been able to obtain the potential yield determined by the prevailing climatic conditions in the marginal rainfall areas of the northern grains region. In the past, the adoption of soil management practices has been largely restricted to the top 100 mm soil layer, and exploitation of the subsoil as a source of water and nutrients has largely been overlooked. The key to realising potential yields is to gain a better understanding of subsoils and their limitations, and then to develop options to manage them practically and economically. Given the complex nature of the causal factors of these constraints, a combination of management approaches, rather than individual options, is required to combat these constraints for sustainable crop production, management of natural resources and avoidance of environmental damage.
Abstract:
Reduced supplies of nitrogen (N) in many soils of southern Queensland that were cropped exhaustively with cereals over many decades have been the focus of much research to avoid declines in profitability and sustainability of farming systems. A 45-month period of mixed grass (purple pigeon grass, Setaria incrassata Stapf; Rhodes grass, Chloris gayana Kunth.) and legume (lucerne, Medicago sativa L.; annual medics, M. scutellata (L.) Mill. and M. truncatula Gaertn.) pasture was one of several options compared on a fertility-depleted Vertosol at Warra, southern Queensland, to improve grain yields or increase grain protein concentration of subsequent wheat crops. Objectives of the study were to measure the productivity of a mixed grass and legume pasture grown over 45 months (cut and removed over 36 months) and its effects on yield and protein concentrations of the following wheat crops. Pasture production (DM t/ha) and aboveground plant N yield (kg/ha) for the grass, legume (including a small amount of weeds) and total components of pasture responded linearly to total rainfall over the duration of each of 3 pastures sown in 1986, 1987 and 1988. Averaged over the 3 pastures, each 100 mm of rainfall resulted in 0.52 t/ha of grass, 0.44 t/ha of legume and 0.97 t/ha of total pasture DM, with little variation between the 3 pastures. Aboveground plant N yield of the 3 pastures ranged from 17.2 to 20.5 kg/ha per 100 mm rainfall. Aboveground legume N in response to total rainfall was similar (10.6-13.2 kg/ha per 100 mm rainfall) across the 3 pastures in spite of very different populations of legumes and grasses at establishment. Aboveground grass N yield was 5.2-7.0 kg/ha per 100 mm rainfall. In most wheat crops following pasture, wheat yields were similar to those of unfertilised wheat, except in 1990 and 1994, when grain yields were significantly higher but similar to those of continuous wheat fertilised with 75 kg N/ha. In contrast, grain protein concentrations of most wheat crops following pasture responded positively, being substantially higher than those of unfertilised wheat but similar to those of wheat fertilised with 75 kg N/ha. Grain protein averaged over all years of assay was increased by 25-40% compared with that of unfertilised wheat. Stored water supplies after pasture were < 134 mm (< 55% of plant available water capacity); for most assay crops, water storages were 67-110 mm, an equivalent wet soil depth of only 0.30-0.45 m. Thus, the crop assays of pasture benefits were limited by low water supply to the wheat crops. Moreover, the severity of common root rot in the wheat crops was not reduced by the pasture–wheat rotation.
Abstract:
Adoption of conservation tillage practices on Red Ferrosol soils in the inland Burnett area of south-east Queensland has been shown to reduce runoff and subsequent soil erosion. However, the improved infiltration resulting from these measures has not improved crop performance, and there are suggestions of increased loss of soil water via deep drainage. This paper reports data from monitoring of soil water under real and artificial rainfall events in commercial fields and long-term tillage experiments, and uses the data to explore the rate and mechanisms of deep drainage in this soil type. Soils were characterised by large drainable porosities (≥0.10 m³/m³) in all parts of the profile to depths of 1.50 m, with drainable porosity similar to available water content (AWC) at 0.25 and 0.75 m, but >60% higher than AWC at 1.50 m. Hydraulic conductivity immediately below the tilled layer, in both continuously cropped soils and those after a ley pasture phase, was shown to decline with increasing soil moisture content, although the rate of decline was much greater in continuously cropped soil. At moisture contents approaching the drained upper limit (pore water pressure = -100 cm H2O), estimates of saturated hydraulic conductivity after a ley pasture were 3-5 times greater than in continuously cropped soil, suggesting much greater rates of deep drainage in the former when soils are moist. Hydraulic tensiometers and fringe capacitance sensors monitored during real and artificial rainfall events showed evidence of soils approaching saturation in the surface layers (top 0.30-0.40 m), but there was no evidence of soil moisture exceeding the drained upper limit (i.e. pore water pressures ≤ -100 cm H2O) in deeper layers. Recovery of applied soil water within the top 1.00-1.20 m of the profile during or immediately after rainfall events declined as the starting profile moisture content increased. These effects were consistent with very rapid rates of internal drainage. Sensors deeper in the profile were unable to detect this drainage, owing either to non-uniformity of conducting macropores (i.e. bypass flow) or to unsaturated conductivities in deeper layers that far exceed the saturated hydraulic conductivity of the infiltration throttle at the bottom of the cultivated layer. Large increases in unsaturated hydraulic conductivity are likely with only small increases in water content above the drained upper limit. Further studies with drainage lysimeters and large banks of hydraulic tensiometers are planned to quantify drainage risk in these soil types.
Abstract:
Soils with high levels of chloride and/or sodium in their subsurface layers are often referred to as having subsoil constraints (SSCs). There is growing evidence that SSCs affect wheat yields by raising the crop lower limit of available soil water (CLL) and thus reducing the soil's plant-available water capacity (PAWC). This proposal was tested by simulation of 33 farmers' paddocks in south-western Queensland and north-western New South Wales. The simulated results accounted for 79% of the observed variation in grain yield, with a root mean squared deviation (RMSD) of 0.50 t/ha. This result was as close as any achieved from sites without SSCs, providing strong support for the proposed mechanism that SSCs affect wheat yields by increasing the CLL and thus reducing the soil's PAWC. To reduce the need to measure the CLL of every paddock or management zone, two additional approaches to simulating the effects of SSCs were tested. In the first approach, the CLL of the 0.3-0.5 m soil layer was taken as the reference CLL of a soil regardless of its level of SSCs, and the CLL values of soil layers below 0.5 m depth were calculated as a function of this reference CLL, of soil depth, and of one of the SSC indices EC, Cl, ESP or Na. The best estimates of subsoil CLL values were obtained when the effects of SSCs were described by an ESP-dependent function. In the second approach, depth-dependent CLL values were also derived from the CLL of the 0.3-0.5 m soil layer; however, instead of using SSC indices to further modify CLL, the default values of the water-extraction coefficient (kl) of each depth layer were modified as a function of the SSC indices. The strength of this approach was evaluated on the basis of the correlation of observed and simulated grain yields, and the best estimates were obtained when the default kl values were multiplied by a Cl-determined function. The kl approach was also evaluated with respect to simulated soil moisture at anthesis and at grain maturity; results using this approach were highly correlated with soil moisture results obtained from simulations based on the measured CLL values. This research provides strong evidence that the effects of SSCs on wheat yields are accounted for by the effects of these constraints on wheat CLL values. The study also produced two satisfactory methods for simulating the effects of SSCs on CLL and on grain yield. While Cl and ESP proved to be effective indices of SSCs, EC was not, owing to the confounding effect of the presence of gypsum in some of these soils. This study provides the tools necessary for investigating, through simulation studies, the effects of SSCs on wheat crop yields and on natural resource management (NRM) issues such as runoff, recharge, and nutrient loss. It also facilitates investigation of suggested agronomic adaptations to SSCs.
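A minimal sketch of the two approaches described above — adjusting subsoil CLL from a reference layer, or scaling the water-extraction coefficient kl — is given below. The functional forms and every coefficient are placeholders for illustration, not the ESP- and Cl-dependent functions fitted in the study.

```python
# Sketch of the two ways the abstract describes for handling subsoil
# constraints (SSCs) in a simulation: (1) raise the crop lower limit (CLL)
# of deeper layers from the reference 0.3-0.5 m CLL using an SSC index such
# as ESP, or (2) keep the default CLL but scale the water-extraction
# coefficient kl by a chloride-dependent factor. Coefficients are assumed.

def cll_from_reference(cll_ref, depth_m, esp_percent,
                       depth_coef=0.02, esp_coef=0.003):
    """Approach 1: depth-dependent CLL derived from the 0.3-0.5 m layer,
    increased with depth and with exchangeable sodium percentage (ESP)."""
    return cll_ref + depth_coef * (depth_m - 0.5) + esp_coef * esp_percent

def kl_with_chloride(kl_default, cl_mg_per_kg, half_effect_cl=600.0):
    """Approach 2: down-weight the water-extraction coefficient kl as
    subsoil chloride rises."""
    return kl_default / (1.0 + cl_mg_per_kg / half_effect_cl)

# hypothetical profile layers: (depth in m, ESP in %, Cl in mg/kg)
layers = [(0.7, 8, 300), (1.0, 15, 700), (1.3, 22, 1200)]
for depth, esp, cl in layers:
    cll = cll_from_reference(cll_ref=0.18, depth_m=depth, esp_percent=esp)
    kl = kl_with_chloride(kl_default=0.05, cl_mg_per_kg=cl)
    print(f"{depth:.1f} m: CLL ≈ {cll:.3f} m3/m3, kl ≈ {kl:.3f}")
```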
Abstract:
Winter cereal cropping is marginal in south-west Queensland because of low and variable rainfall and declining soil fertility. Increasing soil water storage and the efficiency of water and nitrogen (N) use is essential for sustainable cereal production. The effect of zero tillage and N fertiliser application on these factors was evaluated in wheat and barley from 1996 to 2001 on a grey Vertosol. Annual rainfall was above average in 1996, 1997, 1998 and 1999 and below average in 2000 and 2001. Due to drought, no crop was grown in the 2000 winter cropping season. Zero tillage improved fallow soil water storage by a mean of 20 mm over 4 years compared with conventional tillage. However, mean grain yield and gross margin of wheat were similar under conventional and zero tillage. Wheat grain yield and/or grain protein increased with N fertiliser application in all years, resulting in an increase in mean gross margin over 5 years from $86/ha, with no N fertiliser applied, to $250/ha, with N applied to target ≥13% grain protein. A similar increase in gross margin occurred in barley where N fertiliser was applied to target malting grade. The highest N fertiliser application rate in wheat resulted in a residual benefit to soil N supply for the following crop. This study has shown that profitable responses to N fertiliser addition in wheat and barley can be obtained on long-term cultivated Vertosols in south-west Queensland when soil water reserves at sowing are at least 60% of plant available water capacity, or when rainfall during the growing season is above average. An integrative benchmark for improved N fertiliser management appears to be a gross margin per unit of water use of about $1/ha.mm. Greater fallow soil water storage or crop water use efficiency under zero tillage has the potential to improve winter cereal production in growing seasons drier than those experienced during the period of this study.
Abstract:
Provision of artificial waterpoints in Australian rangelands has resulted in an increase in the range and density of kangaroos. At high densities, kangaroos can inhibit vegetation regeneration, particularly in some protected areas where harvesting is prohibited. Fencing off waterpoints has been proposed to limit these impacts. Our aim was to determine whether fencing off waterpoints during a drought (when kangaroos would be especially water-limited) would influence the density and distribution of red kangaroos (Macropus rufus). Two waterpoints were fenced within the first 6 months of the 27-month study and a further two waterpoints were kept unfenced as controls in Idalia National Park, western Queensland. We estimated kangaroo densities around waterpoints from walked line-transect counts, and their grazing distribution from dung-pellet counts. Fencing off waterpoints failed to influence either the density or distribution up to 4 km from the waterpoints. Our results indicate that food availability, rather than the location of waterpoints, determines kangaroo distribution. Few areas in the rangelands are beyond kangaroos' convenient reach from permanent waterpoints. Therefore, fencing off waterpoints without explicitly considering the spatial context in relation to other available water sources will fail to achieve vegetation regeneration.
Abstract:
In semi-arid subtropical areas, a number of studies of no-till (NT) farming systems have demonstrated economic, environmental and soil quality advantages over conventional tillage (CT). However, adoption of continuous NT has contributed to the build-up of herbicide-resistant weed populations, increased incidence of soil- and stubble-borne diseases, and stratification of nutrients and organic carbon near the soil surface. Some farmers resort to occasional strategic tillage (ST) to manage these problems of NT systems. However, farmers who practise strict NT systems are concerned that even a one-time tillage may undo the positive soil condition benefits of NT farming. We reviewed the pros and cons of the use of occasional ST in NT farming systems. The impacts of occasional ST on agronomy, soil and the environment are site-specific and depend on many interacting soil, climatic and management conditions. Most studies conducted in North America and Europe suggest that introducing occasional ST into continuous NT farming systems could improve productivity and profitability in the short term; in the long term, however, the impact is negligible or may be negative. The short-term impacts on soil and the environment immediately following occasional ST include reduced protective cover, soil loss by erosion, increased runoff, loss of carbon and water, and reduced microbial activity, with little or no detrimental impact in the long term. A potential negative effect immediately following ST is reduced plant-available water, which may make crop sowing unreliable in variable seasons. Rainfall between the ST and sowing, or immediately after sowing, is necessary to replenish soil water lost from the seed zone. The timing of ST is therefore likely to be critical and must be balanced against optimising soil water prior to seeding. The impact of occasional ST also varies with the tillage implement used; for example, inversion tillage with a mouldboard plough has greater impacts than chisel or disc tillage. Opportunities for future research on occasional ST with the most commonly used implements in Australia's northern grains-growing region, such as tine and/or disc, are presented in the context of agronomy, soil and the environment.
Abstract:
An estimate of the groundwater budget at the catchment scale is extremely important for the sustainable management of available water resources. Water resources are generally subjected to over-exploitation for agricultural and domestic purposes in agrarian economies like India. The double water-table fluctuation method is a reliable method for calculating the water budget in semi-arid crystalline rock areas. Extensive measurements of water levels from a dense network before and after the monsoon rainfall were made in a 53 km² watershed in southern India, and the various components of the water balance were then calculated. The water level data then underwent geostatistical analyses to determine the priority and/or redundancy of each measurement point using a cross-validation method, and an optimal network evolved from these analyses. The network was then used to re-calculate the water-balance components. It was established that such an optimized network provides far fewer measurement points without considerably changing the conclusions regarding the groundwater budget. This exercise is helpful in reducing the time and expenditure involved in exhaustive piezometric surveys, and also in determining the water budget for large watersheds (greater than 50 km²).
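The water-table fluctuation idea behind the budget can be sketched very simply; the specific-yield value and water-level rises below are hypothetical, and the full double water-table fluctuation method balances additional terms (pumping, return flow, two seasons) that are omitted here.

```python
# Minimal sketch of a water-table fluctuation recharge estimate for a small
# watershed: recharge over the monsoon is approximated as specific yield
# times the mean rise in the water table across the measurement network.

specific_yield = 0.014                 # assumed order of magnitude for weathered crystalline rock
area_km2 = 53.0                        # watershed area from the abstract
rises_m = [1.8, 2.4, 3.1, 2.0, 2.7]    # hypothetical pre/post-monsoon water-level rises (m)

mean_rise = sum(rises_m) / len(rises_m)
recharge_m = specific_yield * mean_rise                 # depth of recharge (m of water)
recharge_mcm = recharge_m * area_km2 * 1e6 / 1e6        # volume in million cubic metres

print(f"mean water-table rise: {mean_rise:.2f} m")
print(f"estimated monsoon recharge: {recharge_m * 1000:.0f} mm "
      f"(~{recharge_mcm:.2f} million m3 over the watershed)")
```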
Abstract:
This paper presents a genetic algorithm (GA) model for obtaining an optimal operating policy and optimal crop water allocations from an irrigation reservoir. The objective is to maximize the sum of the relative yields from all crops in the irrigated area. The model takes into account reservoir inflow, rainfall on the irrigated area, intraseasonal competition for water among multiple crops, the soil moisture dynamics in each cropped area, the heterogeneous nature of soils, and crop response to the level of irrigation applied. The model is applied to the Malaprabha single-purpose irrigation reservoir in Karnataka State, India. The optimal operating policy obtained using the GA is similar to that obtained by linear programming. This model can be used for optimal utilization of the available water resources of any reservoir system to obtain maximum benefits.
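A compact sketch of a genetic algorithm of the kind described — maximising the sum of relative yields subject to a seasonal water limit — is shown below. The crop water-response curves, penalty term, water figures and GA settings are assumptions for illustration, not the Malaprabha model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical seasonal water requirements (mm) and a simple diminishing-returns
# yield response for three crops sharing a limited reservoir allocation.
full_requirement = np.array([450.0, 600.0, 300.0])   # mm per crop (assumed)
available_water = 900.0                              # mm in total (assumed)

def relative_yields(alloc):
    """Relative yield (0-1) per crop for a given allocation, using a concave
    response; real applications use crop-stage water production functions."""
    frac = np.clip(alloc / full_requirement, 0.0, 1.0)
    return 1.0 - (1.0 - frac) ** 2

def fitness(alloc):
    """Sum of relative yields, penalised if total allocation exceeds supply."""
    penalty = max(0.0, alloc.sum() - available_water)
    return relative_yields(alloc).sum() - 0.01 * penalty

# basic GA: tournament selection, blend crossover, random mutation
pop = rng.uniform(0, full_requirement, size=(60, 3))
for _ in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    new_pop = []
    for _ in range(len(pop)):
        i, j = rng.integers(len(pop), size=2)
        a = pop[i] if scores[i] > scores[j] else pop[j]        # tournament parent 1
        i, j = rng.integers(len(pop), size=2)
        b = pop[i] if scores[i] > scores[j] else pop[j]        # tournament parent 2
        w = rng.random(3)
        child = w * a + (1 - w) * b                            # blend crossover
        child += rng.normal(0, 10, size=3) * (rng.random(3) < 0.2)  # mutation
        new_pop.append(np.clip(child, 0, full_requirement))
    pop = np.array(new_pop)

best = max(pop, key=fitness)
print("allocation (mm):", np.round(best, 1), "total:", round(best.sum(), 1))
print("sum of relative yields:", round(fitness(best), 3))
```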
Abstract:
A summary of the inventory survey of Nigeria's inland waters is presented. The survey reveals that Kano State tops the list in reservoir development, with an existing water surface area of about 42,773 ha, while Anambra State has the least with about 38 ha. No reservoir was recorded for Lagos and Rivers States. In terms of existing fish ponds, however, a total of about 471 ha was recorded for Plateau State and about 5 ha for Niger State. Preliminary estimates of Nigeria's fish yield potential, based on established production records of comparable water bodies in the tropics at different levels of management, show that the available water mass in the country, estimated at about 12.5 million hectares, could yield a minimum of about 334,214 metric tonnes (m.t.) of fish per annum with little or no management and a maximum of about 511,703 metric tonnes per annum with adequate management. Comparison of the potential yields from inland sources with the projected fish production in Nigeria (1981-1985), based on supply and demand statistics, shows that the potential yield from inland sources, even at a low level of management, is relatively higher than the projected inland production and more than double the observed production. The variation between the potential and the observed fish yields in the country has been attributed to the absolute lack of management strategies for the country's various inland waters. The paper elaborates on possible management strategies for various categories of inland waters as a prelude to increased fish production in the country.
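The yield-potential figures above imply a simple per-hectare rate; a one-line check using only the values given in the abstract is:

```python
area_ha = 12_500_000                           # estimated available water mass (ha)
min_total_t, max_total_t = 334_214, 511_703    # tonnes per annum from the abstract

print(f"implied yield: {min_total_t / area_ha * 1000:.1f}-"
      f"{max_total_t / area_ha * 1000:.1f} kg/ha per annum")
```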