54 results for nutrient use efficiency
Abstract:
The aim of this review is to report changes in irrigated cotton water use from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops utilised 6-7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved through both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, which is high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. There has also been increased use of irrigation scheduling tools and furrow-irrigation system optimisation evaluations, which has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing numbers of these alternatives are expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. The optimisation of interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluations.
Standardisation of water-use efficiency measures and improved water measurement techniques for surface irrigation are important research outcomes to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farmers and farming fields and across regions. Therefore, site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
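The two headline metrics in this abstract are simple ratios. As a minimal sketch of how they are computed (the function names and example figures are illustrative assumptions, not values taken from the review's datasets):

```python
def crop_water_use_index(lint_yield_kg_ha, crop_water_use_mm):
    """Crop water use index: yield produced per mm of seasonal crop water use (kg/mm.ha)."""
    return lint_yield_kg_ha / crop_water_use_mm

def whole_farm_irrigation_efficiency(water_used_by_crop_ml, water_diverted_ml):
    """Whole-farm irrigation efficiency: share of water diverted on-farm that is
    actually used by the crop, expressed as a percentage."""
    return 100 * water_used_by_crop_ml / water_diverted_ml

# Hypothetical figures: 2200 kg/ha of cotton over the reported 729 mm average ET
cwui = crop_water_use_index(2200, 729)          # ~3 kg/mm.ha
wfie = whole_farm_irrigation_efficiency(7, 10)  # 70%
```

As the abstract notes, standardising definitions like these is what makes benchmarks comparable between farms and regions.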
Abstract:
To break the yield ceiling of rice production, a super rice project was launched in 1996 to breed rice varieties with super high yield. A two-year experiment was conducted to evaluate the yield and nitrogen (N)-use response of super rice to different planting methods in the single cropping season. A total of 17 rice varieties, including 13 super rice and four non-super checks (CK), were grown under three N levels [0 (N0), 150 (N150), and 225 (N225) kg ha−1] and two planting methods [transplanting (TP) and direct-seeding in wet conditions (WDS)]. Grain yield under WDS (7.69 t ha−1) was generally lower than under TP (8.58 t ha−1). However, grain yield under the different planting methods was affected by N rate as well as variety group. In both years, there was no difference in grain yield between super and CK varieties at N150, irrespective of planting method. However, the yield difference was dramatic in the japonica group at N225: super rice averaged 11.3% and 14.1% higher yield than CK varieties in WDS and TP, respectively. This suggests that high N input contributes to narrowing the yield gap in super rice varieties, and indicates that super rice was bred for high-fertility conditions. In the japonica group, more N was accumulated in super rice than in CK at N225, but no difference was found between super and CK varieties at N0 and N150. Similar results were also found for N agronomic efficiency. The results suggest that super rice varieties have an advantage in N-use efficiency when high N is applied. The response of super rice was greater under TP than under WDS. They also suggest the need to further improve agronomic and other management practices to achieve high yield and N-use efficiency for super rice varieties under WDS.
Abstract:
Nitrogen fertilizer inputs dominate the fertilizer budget of grain sorghum growers in northern Australia, so optimizing use efficiency and minimizing losses are a primary agronomic objective. We report results from three experiments in southern Queensland sown on contrasting soil types and with contrasting rotation histories in the 2012-2013 summer season. Experiments were designed to quantify the response of grain sorghum to rates of N fertilizer applied as urea. Labelled 15N fertilizer was applied in microplots to determine the fate of applied N, while nitrous oxide (N2O) emissions were continuously monitored at Kingaroy (grass or legume ley histories) and Kingsthorpe (continuous grain cropping). Nitrous oxide is a useful indicator of gaseous N losses. Crops at all sites responded strongly to fertilizer N applications, with yields of unfertilized treatments ranging from 17% to 52% of N-unlimited potential. Maximum yields ranged from 4500 (Kupunn) to 5450 (Kingaroy) and 8010 (Kingsthorpe) kg/ha. Agronomic efficiency (kg additional grain produced/kg fertilizer N applied) at the optimum N rate on the Vertosol sites was 23 (80 N, Kupunn) to 25 (160 N, Kingsthorpe), but 40-42 on the Ferrosols at Kingaroy (70-100 N). Cumulative N2O emissions ranged from 0.44% (Kingaroy legume) to 0.93% (Kingsthorpe) and 1.15% (Kingaroy grass) of the optimum fertilizer N rate at each site, with greatest emissions from the Vertosol at Kingsthorpe. The similarity in N2O emissions factors between Kingaroy and Kingsthorpe contrasted markedly with the recovery of applied fertilizer N in plant and soil. Apparent losses of fertilizer N ranged from 0-5% (Ferrosols at Kingaroy) to 40-48% (Vertosols at Kupunn and Kingsthorpe). The greater losses on the Vertosols were attributed to denitrification and illustrate the greater risks of N losses in these soils in wet seasonal conditions.
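Agronomic efficiency as defined in this abstract (kg of additional grain per kg of fertilizer N applied) is a one-line calculation. A minimal sketch, where the example figures are hypothetical and chosen only to match the order of magnitude reported:

```python
def agronomic_efficiency(fertilised_yield_kg_ha, unfertilised_yield_kg_ha, n_rate_kg_ha):
    """Agronomic efficiency: extra grain produced per kg of fertilizer N applied."""
    return (fertilised_yield_kg_ha - unfertilised_yield_kg_ha) / n_rate_kg_ha

# Hypothetical trial figures: 80 kg N/ha lifting yield from 2600 to 4440 kg/ha
ae = agronomic_efficiency(4440, 2600, 80)  # 23.0 kg grain per kg N
```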
Abstract:
Assessing storage impacts on manure properties is relevant to research associated with nutrient-use efficiency and greenhouse gas (GHG) emissions. We examined the impact of cold storage on the physicochemical properties, biochemical methane potential (BMP) and microbial community composition of beef feedlot manure and poultry broiler litter. Manures were analysed within 2 days of collection and after 2 and 8 weeks in refrigerated (4 °C) or frozen (–20 °C) storage. Compared with fresh manure, stored manures had statistically significant (p < 0.05) but comparatively minor (<10%) changes in electrical conductivity, chloride and ammonium concentrations. Refrigeration and freezing did not significantly affect (p > 0.05) BMP in either manure type. We did not detect ammonium- or nitrite-oxidising bacterial taxa (AOB, NOB) using fluorescence in situ hybridisation (FISH). Importantly, the viability of microbes was unchanged by storage. We conclude that storage at –20 °C or 4 °C adequately preserves the investigated traits of the studied manures for research aimed at improving nutrient cycling and reducing GHG emissions.
Abstract:
The potential for fertiliser use in the Lockyer Valley's intensive vegetable production to impact the Moreton Bay Waterways (MBW) is not well defined. Nevertheless, nutrient runoff through soil erosion of agricultural lands has been identified as a process that contributes significant amounts of artificial fertiliser to the MBW (SEQ Healthy Waterways Draft Strategy 2006). To better understand this issue, the present study undertakes a nutrient mass balance to evaluate nitrogen use efficiency in the intensive horticultural industry of the Lockyer Valley.
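A partial nutrient mass balance of the kind described compares N inputs against N removed in harvested produce; the surplus indicates N potentially available for off-site loss. A minimal sketch (function names and figures are illustrative assumptions, not study data):

```python
def n_surplus(n_inputs_kg_ha, n_removed_kg_ha):
    """Partial N balance: surplus N remaining in the system (kg/ha)."""
    return n_inputs_kg_ha - n_removed_kg_ha

def n_use_efficiency(n_removed_kg_ha, n_inputs_kg_ha):
    """N use efficiency: fraction of applied N removed in harvested produce."""
    return n_removed_kg_ha / n_inputs_kg_ha

# Hypothetical block: 200 kg N/ha applied, 120 kg N/ha removed in produce
surplus = n_surplus(200, 120)      # 80 kg N/ha potentially at risk of loss
nue = n_use_efficiency(120, 200)   # 0.6
```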
Abstract:
Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most of the western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations.
Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants, including effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3 degrees C) was likely to result in a decrease in forage production for most rangeland locations (e.g. -21% calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease/increase in rainfall respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3 degrees C temperature increase. The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature.
As a consequence, we suggest that public policy should have regard for the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed, and this important task remains a challenge for rangeland and climate systems science.
Abstract:
Soil water repellency occurs widely in horticultural and agricultural soils when they are very dry. The gradual accumulation and breakdown of surface organic matter over time produces wax-like organic acids, which coat soil particles and prevent the uniform entry of water into the soil. Water repellency is usually managed by regular surfactant applications. The term 'surfactant' is a contraction of 'SURFace ACTive AgeNT'. Surfactants act by reducing the surface tension of water, allowing it to penetrate and wet the soil more easily and completely. This practice improves water use efficiency (by requiring less water to wet the soil and by capturing rainfall and irrigation more effectively and rapidly). It also reduces nutrient losses through run-off, erosion or leaching; these nutrients have the potential to pollute the surrounding environment and water courses. This project investigated potential improvements to standard practices (product combination and scheduling) for surfactant use to overcome localised dry spots on water-repellent soils and thus improve turf quality and water use efficiency. Weather conditions for the duration of the trial prevented the identification of improved practices in terms of combination and scheduling. However, the findings support previous research showing that soil surfactants decreased the time for water to infiltrate dry soil samples taken from a previously severely hydrophobic site. Data will continue to be collected from this trial site on a private contractual basis, with the hope that improvements to standard practices will be observed during the drier winter months when moisture availability is a limiting factor for turfgrass growth and quality.
Abstract:
This project investigates the impact of vegetable production systems on sensitive waterways, focusing on the risk of off-site nutrient movement at farm block scale under current management practices. The project establishes a series of case studies in two environmentally important Queensland catchments and conducts a broader survey of partial nutrient budgets across tropical vegetable production. It will deliver tools to growers that can improve fertiliser use efficiency, delivering both profitability and environmental improvements.
Abstract:
The availability and quality of irrigation water has become an issue limiting productivity in many Australian vegetable regions. Production is also under competitive pressure from supply chain forces. Producers look to new technologies, including changing irrigation infrastructure, exploring new water sources, and more complex irrigation management, to survive these stresses. Often there is little objective information on which technologies could improve outcomes for vegetable producers and external communities (e.g. meeting NRM targets). This has led to investment in inappropriate technologies, and costly repetition of errors, as businesses independently discover the worth of technologies by personal experience. In our project, we investigated technology improvements for vegetable irrigation. Through engagement with industry and other researchers, we identified the technologies most applicable to growers, particularly those that addressed priority issues. We developed analytical tools for ‘what if’ scenario testing of technologies. We conducted nine detailed experiments in the Lockyer Valley and Riverina vegetable growing districts, as well as case studies on grower properties in southern Queensland. We investigated root zone monitoring tools (FullStop™ wetting front detectors and Soil Solution Extraction Tubes - SSET), drip system layout, fertigation equipment, and altered planting arrangements. Our project team developed and validated models for broccoli, sweet corn, green beans and lettuce, and spreadsheets for evaluating economic risks associated with new technologies. We presented project outcomes at over 100 extension events, including irrigation showcases, conferences, field days, farm walks and workshops. The FullStops™ were excellent for monitoring root zone conditions (EC, nitrate levels), and for managing irrigation with poor quality water. They were easier to interpret than the SSET.
The SSET were simpler to install, but required wet soil to be reliable. SSET were an option for monitoring deeper soil zones that are unsuitable for FullStop™ installations. Because these root zone tools require expertise, and are labour intensive, we recommend they be used to address specific problems, or as a periodic auditing strategy, not for routine monitoring. In our research, we routinely found high residual N in horticultural soils, and consequently little crop yield response to additional nitrogen fertiliser. With improved irrigation efficiency (and less leaching), it may be timely to re-examine nitrogen budgets and recommendations for vegetable crops. Where the drip irrigation tube was located close to the crop row (i.e. within 5-8 cm), management of irrigation was easier. It improved nitrogen uptake and water use efficiency, and reduced the risk of poor crop performance through moisture stress, particularly in the early crop establishment phases. Close proximity of the drip tube to the crop row gives the producer more options for managing salty water, and more flexibility in taking risks with forecast rain. In many vegetable crops, proximate drip systems may not be cost-effective. The next best alternative is to push crop rows closer to the drip tube (leading to an asymmetric row structure). The vegetable crop models are good at predicting crop phenology (development stages, time to harvest), input use (water, fertiliser), environmental impacts (nutrient, salt movement) and total yields. The two immediate applications for the models are understanding/predicting/manipulating harvest dates and nitrogen movements in vegetable cropping systems. From the economic tools, the major influences on accumulated profit are price and yield. In doing ‘what if’ analyses, it is very important to be as accurate as possible in ascertaining the assumed yield and price ranges. In most vegetable production systems, lowering the required inputs (e.g.
irrigation requirement, fertiliser requirement) is unlikely to have a major influence on accumulated profit. However, if a resource is constraining (e.g. available irrigation water), it is usually most profitable to maximise return per unit of that resource.
Abstract:
Purpose We investigated the effects of weed control and fertilization at early establishment on foliar stable carbon (δ13C) and nitrogen (δ15N) isotope compositions, foliar N concentration, tree growth and biomass, relative weed cover and other physiological traits in a 2-year-old F1 hybrid (Pinus elliottii var. elliottii (Engelm) × Pinus caribaea var. hondurensis (Barr. et Golf.)) plantation grown on a yellow earth in southeast Queensland, subtropical Australia. Materials and methods Treatments included routine weed control, luxury weed control, intermediate weed control, mechanical weed control, nil weed control, and routine and luxury fertilization in a randomised complete block design. Initial soil nutrition and soil fertility parameters included hot-water-extractable organic carbon (C) and total nitrogen (N), total C and N, C/N ratio, labile N pools (nitrate (NO3−) and ammonium (NH4+)), extractable potassium (K+), soil δ15N and soil δ13C. Relative weed cover, foliar N concentrations, tree growth rate and physiological parameters, including photosynthesis, stomatal conductance, photosynthetic nitrogen use efficiency, foliar δ15N and foliar δ13C, were also measured at early establishment. Results and discussion Foliar N concentration at 1.25 years was significantly different amongst the weed control treatments and was negatively correlated to the relative weed cover at 1.1 years. Foliar N concentration was also positively correlated to foliar δ15N and foliar δ13C, tree height, height growth rates and tree biomass. Foliar δ15N was negatively correlated to the relative weed cover at 0.8 and 1.1 years. The physiological measurements indicated that luxury fertilization and increasing weed competition on these soils decreased leaf xylem pressure potential (Ψxpp) when compared with the other treatments.
Conclusions These results indicate how increasing N resources and weed competition affect tree N and water use at establishment in F1 hybrid plantations of southeast Queensland, Australia. They suggest the desirability of weed control in the inter-planting row in the first year, to maximise the site N and water resources available for seedling growth. They also show the need to avoid over-fertilisation, which interfered with the balance between available N and water on these soils.
Abstract:
A high proportion of the Australian and New Zealand dairy industry is based on a relatively simple, low-input and low-cost pasture feedbase. These factors enable this type of production system to remain internationally competitive. However, a key limitation of pasture-based dairy systems is periodic imbalances between herd intake requirements and pasture DM production, caused by strong seasonality and high inter-annual variation in feed supply. This disparity can be moderated to a certain degree through strategic management of the herd, by altering calving dates and stocking rates, and of the feedbase, by conserving excess forage and irrigating to flatten seasonal forage availability. Australasian dairy systems are experiencing emerging market and environmental challenges, which include increased competition for land and water resources, decreasing terms of trade, a changing and variable climate, and an increasing environmental focus that requires improved nutrient and water-use efficiency and lower greenhouse gas emissions. The integration of complementary forages has long been viewed as a means to manipulate the home-grown feed supply, to improve the nutritive value and DM intake of the diet, and to increase the efficiency of inputs utilised. Only recently has integrating complementary forages at the whole-farm system level received the significant attention and investment required to examine their potential benefit. Recent whole-of-farm research undertaken in both Australia and New Zealand has highlighted the importance of understanding the challenges of the current feedbase and the level of complementarity between forage types required to improve profit, manage risk and/or mitigate adverse outcomes.
This paper reviews the most recent systems-level research into complementary forages, discusses approaches to modelling their integration at the whole-farm level and highlights the potential of complementary forages to address the major challenges currently facing pasture-based dairy systems.
Abstract:
Macadamias, adapted to the fringes of subtropical rainforests of coastal, eastern Australia, are resilient to mild water stress. Even after prolonged drought, it is difficult to detect stress in commercial trees. Despite this, macadamia orchards in newer irrigated regions produce more consistent crops than those from traditional, rain-fed regions. Crop fluctuations in the latter tend to follow rainfall patterns. The benefit of irrigation in lower rainfall areas is undisputed, but there are many unanswered questions about the most efficient use of irrigation water. Water is used more efficiently when it is less readily available, causing partial stomatal closure that restricts transpiration more than it restricts photosynthesis. Limited research suggests that macadamias can withstand mild stress. In fact, water use efficiency can be increased by strategic deficit irrigation. However, macadamias are susceptible to stress during oil accumulation. There may be benefits in applying more water at critical times and less at others, and this may vary with cultivar. Currently, it is common for macadamia growers to apply about 20-40 L per tree per day of water to their orchards in winter and 70-90 L per tree per day in summer. Research using the Granier sap flow technique reported water use of 20-30 L per tree per day during winter and 40-50 L per tree per day in summer. The discrepancy between actual water use and farmer practice may be due to water loss via evaporation from the ground, deep drainage and/or greater transpiration due to luxury water consumption. More irrigation research is needed to develop efficient water use and to set practical limits for deficit irrigation management.
Abstract:
Winter cereal cropping is marginal in south-west Queensland because of low and variable rainfall and declining soil fertility. Increasing the soil water storage and the efficiency of water and nitrogen (N) use is essential for sustainable cereal production. The effect of zero tillage and N fertiliser application on these factors was evaluated in wheat and barley from 1996 to 2001 on a grey Vertosol. Annual rainfall was above average in 1996, 1997, 1998 and 1999 and below average in 2000 and 2001. Due to drought, no crop was grown in the 2000 winter cropping season. Zero tillage improved fallow soil water storage by a mean value of 20 mm over 4 years, compared with conventional tillage. However, mean grain yield and gross margin of wheat were similar under conventional and zero tillage. Wheat grain yield and/or grain protein increased with N fertiliser application in all years, resulting in an increase in mean gross margin over 5 years from $86/ha, with no N fertiliser applied, to $250/ha, with N applied to target ≥13% grain protein. A similar increase in gross margin occurred in barley where N fertiliser was applied to target malting grade. The highest N fertiliser application rate in wheat resulted in a residual benefit to soil N supply for the following crop. This study has shown that profitable responses to N fertiliser addition in wheat and barley can be obtained on long-term cultivated Vertosols in south-west Queensland when soil water reserves at sowing are at least 60% of plant available water capacity, or rainfall during the growing season is above average. An integrative benchmark for improved N fertiliser management appears to be the gross margin/water use of ~$1/ha.mm. Greater fallow soil water storage or crop water use efficiency under zero tillage has the potential to improve winter cereal production in drier growing seasons than experienced during the period of this study.
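The integrative benchmark proposed above (gross margin per unit of crop water use) is a simple ratio; a minimal sketch, where the example values are hypothetical rather than trial data:

```python
def gross_margin_per_water_use(gross_margin_dollars_ha, water_use_mm):
    """Integrative benchmark: gross margin per mm of crop water use ($/ha.mm)."""
    return gross_margin_dollars_ha / water_use_mm

# Hypothetical season: $250/ha gross margin from 250 mm of crop water use
benchmark = gross_margin_per_water_use(250, 250)  # 1.0 $/ha.mm, on the ~$1 target
```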
Abstract:
Varying the spatial distribution of applied nitrogen (N) fertilizer to match crop demand has been shown to increase profits in Australia. Better matching the timing of N inputs to plant requirements has been shown to improve nitrogen use efficiency and crop yields, and could reduce nitrous oxide emissions from broadacre grains. Farmers in the wheat production area of south-eastern Australia are increasingly splitting N application, with the second timing applied at stem elongation (Zadoks 30). Spectral indices have shown the ability to detect crop canopy N status, but a robust method using a consistent calibration that functions across seasons has been lacking. One spectral index, the canopy chlorophyll content index (CCCI), designed to detect canopy N using three wavebands along the "red edge" of the spectrum, was combined with the canopy nitrogen index (CNI), which was developed to normalize for crop biomass and correct for the N dilution effect of crop canopies. The CCCI-CNI approach was applied in a 3-year study to develop a single calibration derived from a wheat crop sown in research plots near Horsham, Victoria, Australia. The index was able to predict canopy N (g m-2) from Zadoks 14-37 with an r2 of 0.97 and RMSE of 0.65 g N m-2 when dry-weight biomass by area was also considered. We suggest that measures of N estimated from remote methods use N per unit area as the metric, and that direct reference to canopy %N is not an appropriate method for estimating plant N concentration without first accounting for the N dilution effect. This approach provides a link to crop development rather than creating a purely numerical relationship. The sole biophysical input, biomass, is challenging to quantify robustly via spectral methods. Combining remote sensing with crop modelling could provide a robust method for estimating biomass and therefore a means to estimate canopy N remotely.
Future research will explore this and the use of active and passive sensor technologies for use in precision farming for targeted N management.
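The CCCI is commonly computed by scaling the normalised difference red edge (NDRE) index between bare-soil and full-canopy bounds. The sketch below follows that general published formulation; the reflectance values and bounds are hypothetical, and this is not necessarily the exact calibration used in the study described above:

```python
def ndre(nir, red_edge):
    """Normalised Difference Red Edge index from NIR and red-edge reflectance."""
    return (nir - red_edge) / (nir + red_edge)

def ccci(nir, red_edge, ndre_min, ndre_max):
    """Canopy Chlorophyll Content Index: NDRE scaled between the bare-soil
    (ndre_min) and full-canopy (ndre_max) bounds for the scene."""
    return (ndre(nir, red_edge) - ndre_min) / (ndre_max - ndre_min)

# Hypothetical reflectance values and scene bounds, for illustration only
value = ccci(nir=0.45, red_edge=0.30, ndre_min=0.05, ndre_max=0.40)
```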