9 results for Double-deficit hypothesis
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Physiological and genetic studies of leaf growth often focus on short-term responses, leaving a gap between these studies and whole-plant models that predict biomass accumulation, transpiration and yield at crop scale. To bridge this gap, we developed a model that combines an existing model of leaf 6 expansion in response to short-term environmental variations with a model coordinating the development of all leaves of a plant. The latter was based on: (1) rates of leaf initiation, appearance and end of elongation measured in field experiments; and (2) the hypothesis that leaves grow independently of one another. The resulting whole-plant leaf model was integrated into the generic crop model APSIM, which provided dynamic feedback of environmental conditions to the leaf model and allowed simulation of crop growth at canopy level. The model was tested in 12 field situations with contrasting temperature, evaporative demand and soil water status. In both observed and simulated data, high evaporative demand reduced leaf area at the whole-plant level, and short water deficits affected only leaves developing during the stress, whether visible or still hidden in the whorl. The model adequately simulated whole-plant profiles of leaf area with a single set of parameters that applied to the same hybrid in all experiments. It was also suitable for predicting biomass accumulation and yield of a similar hybrid grown in different conditions. This model extends existing knowledge of the environmental controls of leaf elongation to field conditions, and can be used to simulate how their genetic controls flow through to yield.
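The coordination idea described above can be illustrated with a minimal sketch (under stated assumptions, not the authors' model): whole-plant leaf area computed as the sum of individual leaf expansion curves whose onsets are staggered by a constant appearance interval, with each leaf growing independently. The phyllochron, leaf number, expansion curve and area values below are all illustrative placeholders.

import numpy as np

def leaf_area(tt, tt_start, duration, final_area):
    # Smooth 0->1 expansion of a single leaf against thermal time (degC d).
    x = np.clip((tt - tt_start) / duration, 0.0, 1.0)
    return final_area * (3 * x**2 - 2 * x**3)

tt = np.linspace(0, 900, 200)   # thermal-time axis (degC d)
phyllochron = 40.0              # assumed interval between leaf onsets (degC d)
n_leaves = 16                   # assumed final leaf number

# Independence hypothesis: whole-plant area is simply the sum over leaves.
plant_area = sum(
    leaf_area(tt, tt_start=i * phyllochron, duration=250.0,
              final_area=300.0 + 20.0 * i)  # cm2 per leaf, illustrative
    for i in range(n_leaves))
print(f"final whole-plant leaf area: {plant_area[-1]:.0f} cm2")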
Abstract:
The enemy release hypothesis predicts that native herbivores will either prefer or cause more damage to native than to introduced plant species. We tested this using preference and performance experiments in the laboratory and surveys of leaf damage caused by the magpie moth Nyctemera amica on co-occurring native and introduced species of fireweed (Senecio) in eastern Australia. In the laboratory, ovipositing females and feeding larvae preferred the native S. pinnatifolius over the introduced S. madagascariensis. Larvae performed equally well on foliage of S. pinnatifolius and S. madagascariensis: pupal weights did not differ between insects reared on the two species, but growth rates were significantly faster on S. pinnatifolius. In the field, foliage damage was significantly greater on the native S. pinnatifolius than on the introduced S. madagascariensis. These results support the enemy release hypothesis, and suggest that the failure of native consumers to switch to introduced species contributes to the invasive success of those species. Both plant species experienced reduced, rather than increased, levels of herbivory when growing in mixed populations as opposed to pure stands in the field; thus, there was no evidence that apparent competition occurred.
Abstract:
Water regulations have decreased irrigation water supplies in Nebraska and other areas of the US Great Plains. When available water is insufficient to meet crop water requirements over the entire growing cycle, it becomes critical to know the irrigation timing that maximizes yields and profits. This study evaluated the effect of the timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. In both years, all treatments resulted in no crop stress during the vegetative period and stress during the reproductive stages, which affected ETc, DM, yield, WUE and IWUE. Among treatments, ETc varied by 7.2 and 18.8%, yield by 17 and 33%, WUE by 12 and 22%, and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain, and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R2) with yield. This correlation decreased considerably for irrigation applied in August, and became negative for irrigation applied in September. The strongest positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 from crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August and September was a good strategy that produced the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy in both years, whereas applying a large proportion of the allocation in September was not. The different results between years indicate that flexible irrigation scheduling techniques should be adopted rather than fixed timing strategies.
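The efficiency indices defined in this abstract, and the yield response factor ky, can be made explicit with a short sketch. This is a minimal illustration assuming ky follows the standard Doorenbos-Kassam form, 1 - Ya/Ymax = ky(1 - ETa/ETp); all numeric inputs are hypothetical placeholders, not data from the study.

def wue(yield_kg_ha, etc_mm):
    # Water use efficiency: yield per unit crop evapotranspiration.
    return yield_kg_ha / etc_mm

def iwue(yield_kg_ha, irrigation_mm):
    # Irrigation water use efficiency: yield per unit irrigation applied.
    return yield_kg_ha / irrigation_mm

def ky(ya, ymax, eta, etp):
    # Yield response factor: relative yield loss per unit relative ET deficit,
    # assuming the Doorenbos-Kassam form 1 - Ya/Ymax = ky * (1 - ETa/ETp).
    return (1 - ya / ymax) / (1 - eta / etp)

# Hypothetical deficit treatment: 150 mm irrigation, ETc at 85% of ETp.
print(wue(9500, 510))             # kg grain/ha per mm of ETc
print(iwue(9500, 150))            # kg grain/ha per mm of irrigation
print(ky(9500, 12000, 510, 600))  # ~1.39, same order as the reported 1.50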
Abstract:
The present study set out to test, through field and simulation studies, the hypothesis that incorporating short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), in a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated, large-plot experiments were established at five commercial properties using the properties' own machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study of various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After the legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P in the subsurface soil (20 kg P/ha). The lablab grown at the commercial sites produced between 3 and 6 t/ha of forage over 2-3 month periods, whereas the following wheat crop, with no applied fertiliser, yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, but the profitability of wheat following lablab was slightly higher than that of the wheat following fallow because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and the additional costs associated with fallow management and in-crop herbicide applications in a fallow-wheat system. The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern, using economic (PreCAPS) and biophysical (Agricultural Production Systems Simulator, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and the simulation studies indicated that ~50% of the time a wheat crop would not be planted, or would fail to produce a profitable crop (grain yield less than 1 t/ha), because of low and unreliable rainfall in winter. In contrast, forage lablab in summer would produce a profitable crop (forage yield of more than 3 t/ha) ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. ~90%). An opportunistic double-crop of lablab in summer and wheat in winter is also viable and profitable in 50% of years. The simulation studies also indicated that opportunistic lablab-wheat cropping can reduce the potential runoff+drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.
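The profitability frequencies quoted above follow directly from the season counts; the short calculation below just makes that arithmetic explicit.

profitable_wheat = 14 / 26    # profitable wheat crops out of 26 growing seasons
profitable_lablab = 22 / 25   # profitable forage lablab crops out of 25 seasons
print(f"wheat: {profitable_wheat:.0%}, lablab: {profitable_lablab:.0%}")
# -> wheat: 54%, lablab: 88% (reported as ~90% in the abstract)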
Abstract:
The nitrogen-driven trade-off between nitrogen utilisation efficiency (yield per unit nitrogen uptake) and water use efficiency (yield per unit evapotranspiration) is widespread and results from well established, multiple effects of nitrogen availability on the water, carbon and nitrogen economy of crops. Here we used a crop model (APSIM) to simulate the yield, evapotranspiration, soil evaporation and nitrogen uptake of wheat, and analysed yield responses to water, nitrogen and climate using a framework analogous to the rate-duration model of determinate growth. The relationship between modelled grain yield (Y) and evapotranspiration (ET) was fitted to a linear-plateau function to derive three parameters: maximum yield (Ymax), the ET break-point when yield reaches its maximum (ET#), and the rate of yield response in the linear phase (ΔY/ΔET). Against this framework, we tested the hypothesis that nitrogen deficit reduces maximum yield by reducing both the rate (ΔY/ΔET) and the range of yield response to evapotranspiration, i.e. ET# - Es, where Es is modelled median soil evaporation. Modelled data reproduced the nitrogen-driven trade-off between nitrogen utilisation efficiency and water use efficiency in a transect from Horsham (36°S) to Emerald (23°S) in eastern Australia. Increasing nitrogen supply from 50 to 250 kg N ha⁻¹ reduced yield per unit nitrogen uptake from 29 to 12 kg grain kg⁻¹ N and increased yield per unit evapotranspiration from 6 to 15 kg grain ha⁻¹ mm⁻¹ at Emerald. The same increment in nitrogen supply reduced yield per unit nitrogen uptake from 30 to 25 kg grain kg⁻¹ N and increased yield per unit evapotranspiration from 6 to 25 kg grain ha⁻¹ mm⁻¹ at Horsham. Maximum yield ranged from 0.9 to 6.4 t ha⁻¹. Consistent with our working hypothesis, reductions in maximum yield with nitrogen deficit were associated with both reduction in the rate of yield response to ET and compression of the range of yield response to ET. Against the notion of managing crops to maximise water use efficiency in low rainfall environments, we emphasise the trade-off between water use efficiency and nitrogen utilisation efficiency, particularly under conditions of high nitrogen-to-grain price ratio. The rate-range framework to characterise the relationship between yield and evapotranspiration is useful to capture this trade-off as the parameters were responsive to both nitrogen supply and climatic factors.
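A minimal sketch of the rate-range framework, assuming nothing beyond what the abstract states: fit a linear-plateau function to Y-vs-ET points to recover Ymax, the break-point ET#, and the slope ΔY/ΔET, then compute the range ET# - Es. The synthetic data and the Es value below are placeholders, not APSIM output from the study.

import numpy as np
from scipy.optimize import curve_fit

def linear_plateau(et, slope, et_break, y_max):
    # Linear rise up to the break-point ET#, constant at Ymax beyond it.
    return np.where(et < et_break, y_max - slope * (et_break - et), y_max)

# Synthetic Y-vs-ET points (kg/ha vs mm), standing in for model output.
et = np.linspace(100, 500, 40)
rng = np.random.default_rng(0)
y = linear_plateau(et, 15.0, 350.0, 5000.0) + rng.normal(0, 150, et.size)

(slope, et_break, y_max), _ = curve_fit(linear_plateau, et, y,
                                        p0=[10.0, 300.0, 4000.0])
es = 80.0  # assumed median soil evaporation (mm)
print(f"Ymax ~ {y_max:.0f} kg/ha, ET# ~ {et_break:.0f} mm, "
      f"slope ~ {slope:.1f} kg/ha/mm, range ~ {et_break - es:.0f} mm")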
Abstract:
Genotype-environment interactions (GEI) limit genetic gain for complex traits such as tolerance to drought. Characterization of the crop environment is an important step in understanding GEI. A modelling approach is proposed here to characterize drought-related environmental stresses both broadly (large geographic area, long-term period) and locally (individual field experiments), enabling breeders to analyse their experimental trials with regard to the broader population of environments that they target. Water-deficit patterns experienced by wheat crops were determined for drought-prone north-eastern Australia, using the APSIM crop model to account for the interactions of crops with their environment (e.g. feedback of plant growth on water depletion). Simulations based on more than 100 years of historical climate data were conducted for representative locations, soils, and management systems, for a check cultivar, Hartog. The three main environment types identified differed in their patterns of simulated water stress around flowering and during grain-filling. Over the entire region, the terminal drought-stress pattern was most common (50% of production environments), followed by a flowering stress (24%), although the frequencies of occurrence of the three types varied greatly across regions, years, and management. This environment classification was applied to 16 trials relevant to late-stage testing in a breeding programme. Incorporating the independently determined environment types in a statistical analysis assisted interpretation of the GEI for yield among the 18 representative genotypes by reducing the relative effect of GEI compared with genotypic variance, and helped to identify opportunities to improve breeding and germplasm-testing strategies for this region.
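The environment-typing step described above can be illustrated with a hedged sketch: cluster simulated seasonal water-stress trajectories (e.g. a 0-1 supply/demand index sampled at successive crop stages) into a few recurrent patterns and report their frequencies. The random stress matrix and the choice of k-means with three clusters are assumptions for illustration, not the study's actual procedure.

import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(1)
# Rows: simulated seasons; columns: stress index at successive crop stages.
stress = rng.random((300, 20))

# Group seasons into three recurrent stress patterns ("environment types").
centroids, labels = kmeans2(stress, k=3, minit='++', seed=1)

for k in range(3):
    print(f"environment type {k}: {np.mean(labels == k):.0%} of seasons")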
Abstract:
Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds, and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and a Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program for forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites, respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time the ground was under plant cover. When grown as a sole crop, well-fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha per crop achieved a significantly higher yield and forage N concentration than did the industry-standard forage sorghum (conventional tillage and 55 kg N/ha per crop) on the Vertosol but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM/kg N fertiliser applied above the base rate); the difference in response between the two sites was attributed to soil type and fertiliser history. Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum grown as a sole crop was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-crop oats, the alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon, and systems that included sorghum showed the most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will benefit both farm productivity and the soil-resource base.
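The fertiliser response quoted for the Vertosol can be reproduced from the figures given in the abstract; the short calculation below makes the arithmetic explicit.

yield_high = 9.64 * 1000   # kg DM/ha at 175 kg N/ha per crop
yield_base = 5.65 * 1000   # kg DM/ha at the 55 kg N/ha industry rate
n_extra = 175 - 55         # kg N/ha applied above the base rate
print((yield_high - yield_base) / n_extra)  # ~33.3 kg DM per kg extra N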
Abstract:
Modulation of the immune response is an important step in the induction of protective humoral and cellular immunity against pathogens. In this study, we investigated the possibility of using a nanomaterial conjugated with the Toll-like receptor (TLR) ligand CpG to modulate the immune response towards the preferred polarity. MgAl-layered double hydroxide (LDH) nanomaterial has a chemical composition very similar to that of Alum, an FDA-approved adjuvant for human vaccination. We used a model antigen, ovalbumin (OVA), to demonstrate that MgAl-LDH had adjuvant activity comparable to Alum, but induced much weaker inflammation. Conjugation of the TLR9 ligand CpG to LDH nanoparticles significantly enhanced the antibody response and promoted a switch from a Th2 toward a Th1 response, demonstrated by a change in the IgG2a:IgG1 ratio. Moreover, immunization of mice with CpG-OVA-conjugated LDH before challenge with OVA-expressing B16/F10 tumor cells retarded tumor growth. Together, these data indicate that LDH nanomaterial can be used as an immune adjuvant to promote Th1- or Th2-dominant immune responses suitable for vaccination purposes.
Abstract:
It has recently been argued that observed positive relationships between dingoes and small mammals are a result of top-down processes, whereby lethal dingo control reduces dingoes and increases mesopredators and herbivores, which then suppress small mammals. Here, I show that the prerequisite negative effects of dingo control on dingoes were not demonstrated, and that the same positive relationships may simply represent well-known bottom-up processes whereby more generalist predators are found in places with more of their preferred prey. Identification of top-predator control-induced trophic cascades first requires demonstration of some actual effect of control on predators, which is typically possible only through manipulative experiments with the ability to identify cause and effect.