51 results for Radiation-use efficiency
Abstract:
The aim of this review is to report changes in irrigated cotton water use from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops utilise 6-7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved by both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. Also, there has been increased use of irrigation scheduling tools and furrow-irrigation system optimisation evaluations. This has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing numbers of these alternatives are expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. The optimisation of interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluations. Standardisation of water-use efficiency measures and improved water measurement techniques for surface irrigation are important research outcomes to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farmers and farming fields and across regions. Therefore, site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
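As a rough illustration of the water-use indices quoted in this abstract, the sketch below computes a crop water use index as lint yield divided by seasonal evapotranspiration, and a whole-farm irrigation efficiency as the share of diverted water used by the crop. The definitions and the 2200 kg/ha lint yield are assumptions for illustration, not figures from the review; the 729 mm ET is the abstract's.

# Illustrative only: assumed index definitions; the 2200 kg/ha lint yield is invented.
def crop_water_use_index(lint_yield_kg_ha, seasonal_et_mm):
    """Crop water use index in kg/mm.ha: yield per mm of evapotranspiration."""
    return lint_yield_kg_ha / seasonal_et_mm

def whole_farm_irrigation_efficiency(water_used_by_crop_ml, water_diverted_ml):
    """Whole-farm irrigation efficiency as a percentage of diverted water."""
    return 100.0 * water_used_by_crop_ml / water_diverted_ml

print(round(crop_water_use_index(2200, 729), 2))              # ~3.02 kg/mm.ha, i.e. >3
print(round(whole_farm_irrigation_efficiency(7.0, 10.0), 1))  # 70.0%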
Abstract:
To break the yield ceiling of rice production, a super rice project was developed in 1996 to breed rice varieties with super high yield. A two-year experiment was conducted to evaluate yield and nitrogen (N)-use response of super rice to different planting methods in the single cropping season. A total of 17 rice varieties, including 13 super rice and four non-super checks (CK), were grown under three N levels [0 (N0), 150 (N150), and 225 (N225) kg ha−1] and two planting methods [transplanting (TP) and direct-seeding in wet conditions (WDS)]. Grain yield under WDS (7.69 t ha−1) was generally lower than under TP (8.58 t ha−1). However, grain yield under different planting methods was affected by N rates as well as variety groups. In both years, there was no difference in grain yield between super and CK varieties at N150, irrespective of planting methods. However, the grain yield difference was dramatic in the japonica group at N225; that is, there was an average increase of 11.3% and 14.1% in super rice over CK varieties in WDS and TP, respectively. This suggests that high N input contributes to narrowing the yield gap in super rice varieties, which also indicates that super rice was bred for high-fertility conditions. In the japonica group, more N was accumulated in super rice than in CK at N225, but no difference was found between super and CK varieties at N0 and N150. Similar results were also found for N agronomic efficiency. The results suggest that super rice varieties have an advantage in N-use efficiency when high N is applied. The response of super rice was greater under TP than under WDS. The results also suggest the need to further improve agronomic and other management practices to achieve high yield and N-use efficiency for super rice varieties in WDS.
Abstract:
Nitrogen fertilizer inputs dominate the fertilizer budget of grain sorghum growers in northern Australia, so optimizing use efficiency and minimizing losses are a primary agronomic objective. We report results from three experiments in southern Queensland sown on contrasting soil types and with contrasting rotation histories in the 2012-2013 summer season. Experiments were designed to quantify the response of grain sorghum to rates of N fertilizer applied as urea. Labelled 15N fertilizer was applied in microplots to determine the fate of applied N, while nitrous oxide (N2O) emissions were continuously monitored at Kingaroy (grass or legume ley histories) and Kingsthorpe (continuous grain cropping). Nitrous oxide is a useful indicator of gaseous N losses. Crops at all sites responded strongly to fertilizer N applications, with yields of unfertilized treatments ranging from 17% to 52% of N-unlimited potential. Maximum yields ranged from 4500 (Kupunn) to 5450 (Kingaroy) and 8010 (Kingsthorpe) kg/ha. Agronomic efficiency (kg additional grain produced/kg fertilizer N applied) at the optimum N rate on the Vertosol sites was 23 (80 N, Kupunn) to 25 (160 N, Kingsthorpe), but 40-42 on the Ferrosols at Kingaroy (70-100 N). Cumulative N2O emissions ranged from 0.44% (Kingaroy legume) to 0.93% (Kingsthorpe) and 1.15% (Kingaroy grass) of the optimum fertilizer N rate at each site, with the greatest emissions from the Vertosol at Kingsthorpe. The similarity in N2O emission factors between Kingaroy and Kingsthorpe contrasted markedly with the recovery of applied fertilizer N in plant and soil. Apparent losses of fertilizer N ranged from 0-5% (Ferrosols at Kingaroy) to 40-48% (Vertosols at Kupunn and Kingsthorpe). The greater losses on the Vertosols were attributed to denitrification and illustrate the greater risks of N losses in these soils in wet seasonal conditions.
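The agronomic efficiency reported here is defined in the abstract itself (kg of additional grain per kg of fertiliser N applied). The sketch below shows the arithmetic; the unfertilised yield and the cumulative N2O figure used in the example are hypothetical placeholders, not experimental values.

# Agronomic efficiency and N2O emission factor as defined above; the 2660 kg/ha
# unfertilised yield and 0.7 kg N2O-N/ha are invented placeholders.
def agronomic_efficiency(yield_fert_kg_ha, yield_unfert_kg_ha, n_rate_kg_ha):
    """kg of additional grain produced per kg of fertiliser N applied."""
    return (yield_fert_kg_ha - yield_unfert_kg_ha) / n_rate_kg_ha

def n2o_emission_factor(cumulative_n2o_n_kg_ha, n_rate_kg_ha):
    """Cumulative N2O-N emitted as a percentage of fertiliser N applied."""
    return 100.0 * cumulative_n2o_n_kg_ha / n_rate_kg_ha

print(round(agronomic_efficiency(4500, 2660, 80), 1))   # ~23 kg grain/kg N (Kupunn-like)
print(round(n2o_emission_factor(0.7, 160), 2))          # ~0.44% of applied N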
Abstract:
There is a large gap between the refined approaches used to characterise genotypes and the common use of location and season as a coarse surrogate for environmental characterisation of breeding trials. As a framework for breeding, the aim of this paper is to quantify the spatial and temporal patterns of thermal and water stress for field pea in Australia. We compiled a dataset for yield of the cv. Kaspa measured in 185 environments, and investigated the associations between yield and seasonal patterns of actual temperature and modelled water stress. Correlations between yield and temperature indicated two distinct stages. In the first stage, during crop establishment and canopy expansion before flowering, yield was positively associated with minimum temperature. Mean minimum temperatures below ~7 degrees C suggest that crops were under suboptimal temperature for both canopy expansion and radiation-use efficiency during a significant part of this early growth period. In the second stage, during critical reproductive phases, grain yield was negatively associated with maximum temperatures over 25 degrees C. Correlations between yield and the modelled water supply/demand ratio showed a consistent pattern with three phases: no correlation at early stages of the growth cycle, a progressive increase in the association that peaked as the crop approached the flowering window, and a progressive decline at later reproductive stages. Using long-term weather records (1957-2010) and modelled water stress for 104 locations, we identified three major patterns of water deficit nationwide. Environment type 1 (ET1) represents the most favourable condition, with no stress during most of the pre-flowering phase and gradual development of mild stress after flowering. Type 2 (ET2) is characterised by increasing water deficit between 400 degree-days before flowering and 200 degree-days after flowering, and rainfall that relieves stress late in the season. Type 3 (ET3) represents the most stressful condition, with increasing water deficit between 400 degree-days before flowering and maturity. Across Australia, the frequency of occurrence was 24% for ET1, 32% for ET2 and 43% for ET3, highlighting the dominance of the most stressful condition. Actual yield averaged 2.2 t/ha for ET1, 1.9 t/ha for ET2 and 1.4 t/ha for ET3, and the frequency of each pattern varied substantially among locations. Shifting from a nominal (i.e. location and season) to a quantitative (i.e. stress type) characterisation of environments could help improve the breeding efficiency of field pea in Australia.
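A highly simplified sketch of how a season's modelled water supply/demand trace might be assigned to one of the three environment types described above is given below. The thresholds are invented for illustration; the study derived its types by clustering seasonal stress patterns, not by fixed rules.

# Simplified, assumed classification rules; the paper's environment types were
# obtained by clustering, not by thresholds like these.
def classify_environment_type(stress_by_tt):
    """stress_by_tt maps degree-days relative to flowering (negative = before
    flowering) to a water supply/demand ratio (1 = no stress, 0 = severe)."""
    pre = [v for tt, v in stress_by_tt.items() if -400 <= tt < 0]
    post = [v for tt, v in stress_by_tt.items() if 0 <= tt <= 200]
    late = [v for tt, v in stress_by_tt.items() if tt > 200]
    if min(pre, default=1.0) > 0.8:
        return "ET1: favourable, mild stress only after flowering"
    if late and max(late) > min(post, default=1.0):
        return "ET2: stress building around flowering, relieved late in the season"
    return "ET3: stress increasing from pre-flowering through to maturity"

season = {-400: 0.9, -200: 0.7, 0: 0.5, 200: 0.4, 400: 0.6}
print(classify_environment_type(season))   # ET2-like pattern for this toy trace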
Abstract:
Characterization of drought environment types (ETs) has proven useful for breeding crops for drought-prone regions. Here we consider how changes in climate and atmospheric carbon dioxide (CO2) concentrations will affect drought ET frequencies in sorghum and wheat systems of Northeast Australia. We also modify APSIM (the Agricultural Production Systems Simulator) to incorporate extreme heat effects on grain number and weight, and then evaluate changes in the occurrence of heat-induced yield losses of more than 10%, as well as the co-occurrence of drought and heat. More than six million simulations spanning representative locations, soil types, management systems, and 33 climate projections led to three key findings. First, the projected frequency of drought decreased slightly for most climate projections for both sorghum and wheat, but for different reasons. In sorghum, warming exacerbated drought stresses by raising the atmospheric vapor pressure deficit and reducing transpiration efficiency (TE), but an increase in TE due to elevated CO2 more than offset these effects. In wheat, warming reduced drought stress during spring by hastening development through winter and reducing exposure to terminal drought. Elevated CO2 increased TE but also raised radiation use efficiency and overall growth rates and water use, thereby offsetting much of the drought reduction from warming. Second, adding explicit effects of heat on grain number and grain size often switched projected yield impacts from positive to negative. Finally, although average yield losses associated with drought will remain generally higher than for heat stress for the next half century, the relative importance of heat is steadily growing. This trend, as well as the likely high degree of genetic variability in heat tolerance, suggests that more emphasis on heat tolerance is warranted in breeding programs. At the same time, work on drought tolerance should continue with an emphasis on drought that co-occurs with extreme heat.
Abstract:
Purpose We investigated the effects of weed control and fertilization at early establishment on foliar stable carbon isotope (δ13C) and nitrogen (N) isotope (δ15N) compositions, foliar N concentration, tree growth and biomass, relative weed cover and other physiological traits in a 2-year-old F1 hybrid (Pinus elliottii var. elliottii (Engelm) × Pinus caribaea var. hondurensis (Barr. ex Golf.)) plantation grown on a yellow earth in southeast Queensland, subtropical Australia. Materials and methods Treatments included routine weed control, luxury weed control, intermediate weed control, mechanical weed control, nil weed control, and routine and luxury fertilization in a randomised complete block design. Initial soil nutrition and fertility parameters included hot-water-extractable organic carbon (C) and total nitrogen (N), total C and N, C/N ratio, labile N pools (nitrate (NO3−) and ammonium (NH4+)), extractable potassium (K+), soil δ15N and soil δ13C. Relative weed cover, foliar N concentrations, tree growth rate and physiological parameters including photosynthesis, stomatal conductance, photosynthetic nitrogen use efficiency, foliar δ15N and foliar δ13C were also measured at early establishment. Results and discussion Foliar N concentration at 1.25 years differed significantly amongst the weed control treatments and was negatively correlated with relative weed cover at 1.1 years. Foliar N concentration was also positively correlated with foliar δ15N and foliar δ13C, tree height, height growth rate and tree biomass. Foliar δ15N was negatively correlated with relative weed cover at 0.8 and 1.1 years. The physiological measurements indicated that luxury fertilization and increasing weed competition on these soils decreased leaf xylem pressure potential (Ψxpp) compared with the other treatments. Conclusions These results indicate how increasing N resources and weed competition affect tree N and water use at establishment in F1 hybrid plantations of southeast Queensland, Australia. They suggest the desirability of weed control in the inter-planting row in the first year, to maximise the site N and water resources available for seedling growth, and the need to avoid over-fertilisation, which interfered with the balance between available N and water on these soils.
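For readers unfamiliar with the δ13C and δ15N values discussed above, the standard delta notation is sketched below. The sample ratio is invented, and the VPDB ratio quoted is a commonly cited value rather than one taken from this study.

# Standard isotope delta notation (per mil): delta = (R_sample/R_standard - 1) * 1000,
# where R is the heavy-to-light isotope ratio (13C/12C against VPDB, 15N/14N
# against atmospheric N2). The sample ratio below is invented for illustration.
def delta_per_mil(r_sample, r_standard):
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB_13C = 0.0111802   # commonly cited 13C/12C ratio of the VPDB standard
print(round(delta_per_mil(0.0108628, R_VPDB_13C), 1))   # ~ -28.4 per mil, typical of C3 foliage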
Abstract:
Macadamias, adapted to the fringes of subtropical rainforests of coastal, eastern Australia, are resilient to mild water stress. Even after prolonged drought, it is difficult to detect stress in commercial trees. Despite this, macadamia orchards in newer irrigated regions produce more consistent crops than those in traditional, rain-fed regions. Crop fluctuations in the latter tend to follow rainfall patterns. The benefit of irrigation in lower rainfall areas is undisputed, but there are many unanswered questions about the most efficient use of irrigation water. Water is used more efficiently when it is less readily available, causing partial stomatal closure that restricts transpiration more than it restricts photosynthesis. Limited research suggests that macadamias can withstand mild stress; in fact, water use efficiency can be increased by strategic deficit irrigation. However, macadamias are susceptible to stress during oil accumulation. There may be benefits in applying more water at critical times and less at others, and this may vary with cultivar. Currently, it is common for macadamia growers to apply about 20-40 L tree−1 day−1 of water to their orchards in winter and 70-90 L tree−1 day−1 in summer. Research using the Granier sap flow technique reported water use of 20-30 L tree−1 day−1 during winter and 40-50 L tree−1 day−1 in summer. The discrepancy between actual water use and farmer practice may be due to water loss via evaporation from the ground, deep drainage and/or greater transpiration due to luxury water consumption. More irrigation research is needed to develop efficient water use and to set practical limits for deficit irrigation management.
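The Granier sap flow technique mentioned above estimates tree water use from heated-probe temperature differences. The sketch below assumes the original Granier calibration (sap flux density = 118.99e-6 * K^1.231, with K = (dTmax - dT)/dT) and an invented sapwood area; species-specific calibrations for macadamia may differ.

# A sketch of the method under the original Granier calibration; not the
# calibration or data used in the cited research.
def sap_flux_density(dT, dT_max):
    """Sap flux density (m3 m-2 s-1) from probe temperature differences (deg C)."""
    k = (dT_max - dT) / dT
    return 118.99e-6 * k ** 1.231

def daily_water_use_litres(dT_series, dT_max, sapwood_area_m2, step_s=1800.0):
    """Integrate half-hourly flux over a day and convert m3 to litres."""
    return sum(sap_flux_density(dT, dT_max) * sapwood_area_m2 * step_s
               for dT in dT_series) * 1000.0

# Toy day: 28 night-time half-hours at the maximum dT (zero flow) and 20
# daytime half-hours at a lower dT, for a tree with 0.02 m2 of sapwood.
readings = [10.0] * 28 + [8.0] * 20
print(round(daily_water_use_litres(readings, 10.0, 0.02), 1))   # ~15.5 L for this toy day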
Abstract:
Winter cereal cropping is marginal in south-west Queensland because of low and variable rainfall and declining soil fertility. Increasing the soil water storage and the efficiency of water and nitrogen (N) use is essential for sustainable cereal production. The effect of zero tillage and N fertiliser application on these factors was evaluated in wheat and barley from 1996 to 2001 on a grey Vertosol. Annual rainfall was above average in 1996, 1997, 1998 and 1999 and below average in 2000 and 2001. Due to drought, no crop was grown in the 2000 winter cropping season. Zero tillage improved fallow soil water storage by a mean value of 20 mm over 4 years, compared with conventional tillage. However, mean grain yield and gross margin of wheat were similar under conventional and zero tillage. Wheat grain yield and/or grain protein increased with N fertiliser application in all years, resulting in an increase in mean gross margin over 5 years from $86/ha, with no N fertiliser applied, to $250/ha, with N applied to target ≥13% grain protein. A similar increase in gross margin occurred in barley where N fertiliser was applied to target malting grade. The highest N fertiliser application rate in wheat resulted in a residual benefit to soil N supply for the following crop. This study has shown that profitable responses to N fertiliser addition in wheat and barley can be obtained on long-term cultivated Vertosols in south-west Queensland when soil water reserves at sowing are at least 60% of plant available water capacity, or rainfall during the growing season is above average. An integrative benchmark for improved N fertiliser management appears to be the gross margin/water use of ~$1/ha.mm. Greater fallow soil water storage or crop water use efficiency under zero tillage has the potential to improve winter cereal production in drier growing seasons than experienced during the period of this study.
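The ~$1/ha.mm benchmark quoted above is simply gross margin divided by crop water use. The sketch below shows the arithmetic; the 250 mm of crop water use is an assumption for illustration, while the $250/ha gross margin is the abstract's figure.

# Gross margin per unit of crop water use; the 250 mm water-use figure is assumed.
def gross_margin_per_mm(gross_margin_dollars_ha, crop_water_use_mm):
    return gross_margin_dollars_ha / crop_water_use_mm

print(gross_margin_per_mm(250.0, 250.0))   # 1.0 $/ha.mm, on the suggested benchmark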
Abstract:
Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations. Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants, including effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3 degrees C) was likely to result in a decrease in forage production at most rangeland locations (e.g. -21%, calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease or increase in rainfall, respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3 degrees C temperature increase. The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature. As a consequence, we suggest that public policy should have regard for the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed, and this important task remains a challenge for rangeland and climate systems science.
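As a back-of-envelope check on the sensitivity figures above, the sketch below combines the quoted effects multiplicatively. This combination rule is an assumption made purely for illustration and is not implied by the simulations themselves.

# Assumes, for illustration only, that temperature, rainfall and CO2 effects
# on forage production combine multiplicatively.
warming_3c = 1.0 - 0.21      # -21% from a 3 degC temperature increase
co2_650ppm = 1.0 + 0.26      # +26% from raising CO2 from 350 to 650 ppm
rain_minus10 = (1.0 - 0.33) / warming_3c   # rainfall effect isolated from warming

print(round((warming_3c * co2_650ppm - 1.0) * 100, 1))   # ~ -0.5%: CO2 roughly offsets +3 degC
print(round((rain_minus10 - 1.0) * 100, 1))              # ~ -15%: implied -10% rainfall effect alone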
Abstract:
Varying the spatial distribution of applied nitrogen (N) fertilizer to match demand in crops has been shown to increase profits in Australia. Better matching the timing of N inputs to plant requirements has been shown to improve nitrogen use efficiency and crop yields, and could reduce nitrous oxide emissions from broadacre grains. Farmers in the wheat production area of south-eastern Australia are increasingly splitting N application, with the second timing applied at stem elongation (Zadoks 30). Spectral indices have shown the ability to detect crop canopy N status, but a robust method using a consistent calibration that functions across seasons has been lacking. One spectral index, the canopy chlorophyll content index (CCCI), designed to detect canopy N using three wavebands along the "red edge" of the spectrum, was combined with the canopy nitrogen index (CNI), which was developed to normalize for crop biomass and correct for the N dilution effect of crop canopies. The CCCI-CNI approach was applied in a 3-year study to develop a single calibration derived from a wheat crop sown in research plots near Horsham, Victoria, Australia. The index was able to predict canopy N (g m-2) from Zadoks 14-37 with an r2 of 0.97 and RMSE of 0.65 g N m-2 when dry-weight biomass per unit area was also considered. We suggest that remotely estimated measures of canopy N use N per unit area as the metric, and that referring directly to canopy %N is not appropriate for estimating plant N concentration without first accounting for the N dilution effect. This approach provides a link to crop development rather than creating a purely numerical relationship. The sole biophysical input, biomass, is challenging to quantify robustly via spectral methods. Combining remote sensing with crop modelling could provide a robust method for estimating biomass and therefore a method to estimate canopy N remotely. Future research will explore this and the use of active and passive sensor technologies in precision farming for targeted N management.
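A hedged sketch of the CCCI formulation, as it is commonly described in the red-edge literature, is given below: a red-edge index (NDRE) is rescaled between lower and upper envelopes expressed as linear functions of NDVI, which normalises for canopy cover. The envelope coefficients and band reflectances are placeholders, not the calibration developed in this study, and the CNI biomass correction is not reproduced.

# Placeholder envelope coefficients and reflectances; not the study's calibration.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    return (nir - red_edge) / (nir + red_edge)

def ccci(nir, red_edge, red):
    v = ndvi(nir, red)
    ndre_min = 0.05 + 0.10 * v   # assumed lower envelope (low-cover bound)
    ndre_max = 0.10 + 0.45 * v   # assumed upper envelope (full-cover bound)
    return (ndre(nir, red_edge) - ndre_min) / (ndre_max - ndre_min)

# Reflectances (0-1) at roughly 670, 720 and 790 nm for a hypothetical canopy:
print(round(ccci(nir=0.45, red_edge=0.25, red=0.05), 2))   # ~0.47 for this made-up spectrum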
Abstract:
Soil water repellency occurs widely in horticultural and agricultural soils when very dry. The gradual accumulation and breakdown of surface organic matter over time produces wax-like organic acids, which coat soil particles preventing uniform entry of water into the soil. Water repellency is usually managed by regular surfactant applications. Surfactants, literally, are surface active agents (SURFace ACTive AgeNTS). Their mode of action is to reduce the surface tension of water, allowing it to penetrate and wet the soil more easily and completely. This practice improves water use efficiency (by requiring less water to wet the soil and by capturing rainfall and irrigation more effectively and rapidly). It also reduces nutrient losses through run-off erosion or leaching. These nutrients have the potential to pollute the surrounding environment and water courses. This project investigated potential improvements to standard practices (product combination and scheduling) for surfactant use to overcome localised dry spots on water repellent soils and thus improve turf quality and water use efficiency. Weather conditions for the duration of the trial prevented the identification of improved practices in terms of combination and scheduling. However, the findings support previous research that the use of soil surfactants decreased the time for water to infiltrate dry soil samples taken from a previously severely hydrophobic site. Data will be continually collected from this trial site on a private contractual basis, with the hope that improvements to standard practices will be observed during the drier winter months when moisture availability is a limiting factor for turfgrass growth and quality.
Abstract:
Exotic and invasive woody vines are major environmental weeds of riparian areas, rainforest communities and remnant natural vegetation in coastal eastern Australia, where they smother standing vegetation, including large trees, and cause canopy collapse. We investigated, through glasshouse resource manipulative experiments, the ecophysiological traits that might facilitate faster growth, better resource acquisition and/or utilization and thus dominance of four exotic and invasive vines of South East Queensland, Australia, compared with their native counterparts. Relative growth rate was not significantly different between the two groups but water use efficiency (WUE) was higher in the native species while the converse was observed for light use efficiency (quantum efficiency, AQE) and maximum photosynthesis on a mass basis (Amax mass). The invasive species, as a group, also exhibited higher respiration load, higher light compensation point and higher specific leaf area. There were stronger correlations of leaf traits and greater structural (but not physiological) plasticity in invasive species than in their native counterparts. The scaling coefficients of resource use efficiencies (WUE, AQE and respiration efficiency) as well as those of fitness (biomass accumulated) versus many of the performance traits examined did not differ between the two species-origin groups, but there were indications of significant shifts in elevation (intercept values) and shifts along common slopes in many of these relationships – signalling differences in carbon economy (revenue returned per unit energy invested) and/or resource usage. Using ordination and based on 14 ecophysiological attributes, a fair level of separation between the two groups was achieved (51.5% explanatory power), with AQE, light compensation point, respiration load, WUE, specific leaf area and leaf area ratio, in decreasing order, being the main drivers. This study suggests similarity in trait plasticity, especially for physiological traits, but there appear to be fundamental differences in carbon economy and resource conservation between native and invasive vine species.
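Two of the quantities compared above can be made concrete with a short sketch: classical relative growth rate between two harvests, and the scaling coefficient of a log-log trait relationship estimated as a standardised major axis slope (sd of y over sd of x, signed by the correlation). The data are invented, and this estimator is one common choice rather than necessarily the one used in the study.

import math
import statistics as st   # statistics.correlation requires Python 3.10+

def relative_growth_rate(w1_g, w2_g, days):
    """RGR (g g-1 day-1) between harvests of mass w1_g and w2_g, 'days' apart."""
    return (math.log(w2_g) - math.log(w1_g)) / days

def sma_slope(x, y):
    """Standardised major axis slope: sd(y)/sd(x), signed by the correlation."""
    return math.copysign(st.stdev(y) / st.stdev(x), st.correlation(x, y))

print(round(relative_growth_rate(1.2, 4.8, 42), 3))   # ~0.033 g g-1 day-1

log_biomass = [math.log(v) for v in (1.1, 2.0, 3.5, 6.2, 10.8)]
log_leaf_area = [math.log(v) for v in (0.9, 1.5, 2.4, 4.0, 6.5)]
print(round(sma_slope(log_leaf_area, log_biomass), 2))   # ~1.15 for this toy dataset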
Abstract:
Models are abstractions of reality that have predetermined limits (often not consciously thought through) on what problem domains the models can be used to explore. These limits are determined by the range of observed data used to construct and validate the model. However, it is important to remember that operating the model beyond these limits, one of the reasons for building the model in the first place, potentially brings unwanted behaviour and thus reduces the usefulness of the model. Our experience with the Agricultural Production Systems Simulator (APSIM), a farming systems model, has led us to adapt techniques from the disciplines of modelling and software development to create a model development process. This process is simple, easy to follow, and brings a much higher level of stability to the development effort, which then delivers a much more useful model. A major part of the process relies on having a range of detailed model tests (unit, simulation, sensibility, validation) that exercise a model at various levels (sub-model, model and simulation). To underline the usefulness of testing, we examine several case studies where simulated output can be compared with simple relationships. For example, output is compared with crop water use efficiency relationships gleaned from the literature to check that the model reproduces the expected function. Similarly, another case study attempts to reproduce generalised hydrological relationships found in the literature. This paper then describes a simple model development process (using version control, automated testing and differencing tools), that will enhance the reliability and usefulness of a model.
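The water-use-efficiency check described above can be illustrated with a schematic sensibility test: simulated yields are compared against a water-use-efficiency envelope from the literature. The envelope used below is the widely quoted French and Schultz form for wheat (about 20 kg/ha of grain per mm of water use above roughly 110 mm of soil evaporation); the harness is a stand-in sketch, not APSIM's actual test suite, and the simulated runs are invented.

# Schematic sensibility test; envelope parameters follow the widely quoted
# French & Schultz relationship for wheat, and the runs below are invented.
def water_limited_potential_kg_ha(water_use_mm, slope=20.0, evap_mm=110.0):
    """Literature envelope: water-limited potential yield from seasonal water use (mm)."""
    return max(0.0, slope * (water_use_mm - evap_mm))

def check_simulated_yields(runs, tolerance=1.05):
    """runs: (seasonal water use in mm, simulated yield in kg/ha) pairs.
    Fail if any simulated yield exceeds the envelope by more than 5%."""
    return all(y <= tolerance * water_limited_potential_kg_ha(wu) for wu, y in runs)

simulated = [(250.0, 2600.0), (400.0, 5500.0), (180.0, 1300.0)]
assert check_simulated_yields(simulated), "simulated WUE exceeds the literature envelope"
print("sensibility test passed")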