43 results for Soils - Tillage
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with both total October–September rainfall and March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994, but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was on average 74 kg/ha higher (range 9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. Grain yield was significantly depressed in 2 of the 9 seasons (1993 and 1995), attributed to soil moisture depletion and/or low growing-season rainfall. Consequently, the overall yield responses were lower than those of 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, or of 2-year medic–wheat or chickpea–wheat rotations, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease common root rot of wheat, caused by Bipolaris sorokiniana, were generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser application, as its severity was significantly correlated with plant-available water at sowing. No significant incidence of crown rot or root-lesion nematode was observed. Thus, productivity, which in this experiment was mainly due to nitrogen accretion, can be maintained where short-duration lucerne leys are grown in rotation with wheat.
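As a rough illustration of the linear rainfall-production relationships reported above, here is a minimal sketch (the slopes are taken from the abstract; the zero intercept and the function itself are illustrative assumptions):

```python
# Sketch of the rainfall-response relationships reported for lucerne at
# Warra. Slopes are from the abstract; the zero intercept is assumed.

def lucerne_production(rainfall_mm, dm_per_100mm=0.97, n_per_100mm=26):
    """Predicted lucerne dry matter (t/ha) and N yield (kg/ha)."""
    dm = rainfall_mm / 100 * dm_per_100mm
    n = rainfall_mm / 100 * n_per_100mm
    return dm, n

# Total (October-September) rainfall of 500 mm
dm, n = lucerne_production(500)
print(f"{dm:.2f} t/ha DM, {n:.0f} kg/ha N")  # 4.85 t/ha DM, 130 kg/ha N

# March-September rainfall of 300 mm, using the steeper slopes quoted
dm, n = lucerne_production(300, dm_per_100mm=1.26, n_per_100mm=36)
print(f"{dm:.2f} t/ha DM, {n:.0f} kg/ha N")  # 3.78 t/ha DM, 108 kg/ha N
```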
Abstract:
In dryland agricultural systems of the subtropical, semi-arid region of north-eastern Australia, water is the most limiting resource. Crop productivity depends on the efficient use of rainfall and of available water stored in the soil during fallow. Agronomic management practices including a period of fallow, stubble retention, and reduced tillage enhance reserves of soil water. However, access to stored water in these soils may be restricted by growth-limiting conditions in the rooting zone of the crop, termed subsoil constraints. Subsoil constraints may include compacted or gravel layers (physical); sodicity, salinity, acidity, nutrient deficiencies and the presence of toxic elements (chemical); and low microbial activity (biological). Several of these constraints may occur together in some soils. Farmers in the marginal-rainfall areas of the northern grains region have often been unable to attain the potential yield determined by their prevailing climatic conditions. In the past, the adoption of soil management practices was largely restricted to the top 100 mm soil layer, and exploitation of the subsoil as a source of water and nutrients has largely been overlooked. The key to realising potential yields is to gain a better understanding of subsoils and their limitations, and then to develop options to manage them practically and economically. Because the causal factors of these constraints are complex, a combination of management approaches, rather than individual options, is required to combat them for sustainable crop production, sound management of natural resources and avoidance of environmental damage.
Abstract:
Tillage is defined here in a broad sense, including disturbance of the soil and crop residues, wheel traffic and sowing opportunities. In sub-tropical, semi-arid cropping areas in Australia, tillage systems have evolved from intensively tilled bare-fallow systems, with high soil losses, to reduced- and no-tillage systems. In recent years, the use of controlled traffic has also increased. These conservation tillage systems are successful in reducing water erosion of soil and sediment-bound chemicals; control of runoff of dissolved nutrients and weakly sorbed chemicals is less certain. Adoption of new practices appears to have been related to practical and economic considerations, and the practices proved to be more profitable after a considerable period of research and development. However, there are still challenges. One challenge is to ensure that systems that reduce soil erosion, which may involve greater use of chemicals, do not degrade water quality in streams. Another is to ensure that systems that improve water entry do not increase drainage below the crop root zone, which would increase the risk of salinity. Better understanding of how tillage practices influence soil hydrology, runoff and erosion processes should lead to better tillage systems and enable better management of risks to water quality and soil health. Finally, the need to determine the effectiveness of in-field management practices in achieving stream-water-quality targets in large, multi-land-use catchments will challenge our current knowledge base and the tools available.
Abstract:
Reduced supplies of nitrogen (N) in many soils of southern Queensland that were cropped exhaustively with cereals over many decades have been the focus of much research to avoid declines in the profitability and sustainability of farming systems. A 45-month period of mixed grass (purple pigeon grass, Setaria incrassata Stapf; Rhodes grass, Chloris gayana Kunth.) and legume (lucerne, Medicago sativa L.; annual medics, M. scutellata (L.) Mill. and M. truncatula Gaertn.) pasture was one of several options compared on a fertility-depleted Vertosol at Warra, southern Queensland, to improve grain yields or increase grain protein concentration of subsequent wheat crops. The objectives of the study were to measure the productivity of a mixed grass and legume pasture grown over 45 months (cut and removed over 36 months) and its effects on yield and protein concentrations of the following wheat crops. Pasture production (DM t/ha) and aboveground plant N yield (kg/ha) for the grass, legume (including a small amount of weeds) and total components of pasture responded linearly to total rainfall over the duration of each of 3 pastures sown in 1986, 1987 and 1988. Averaged over the 3 pastures, each 100 mm of rainfall resulted in 0.52 t/ha of grass, 0.44 t/ha of legume and 0.97 t/ha of total pasture DM, with little variation between the 3 pastures. Aboveground plant N yield of the 3 pastures ranged from 17.2 to 20.5 kg/ha per 100 mm rainfall. Aboveground legume N in response to total rainfall was similar (10.6-13.2 kg/ha per 100 mm rainfall) across the 3 pastures in spite of very different populations of legumes and grasses at establishment. Aboveground grass N yield was 5.2-7.0 kg/ha per 100 mm rainfall. In most wheat crops following pasture, wheat yields were similar to those of unfertilised wheat, except in 1990 and 1994, when grain yields were significantly higher and similar to those of continuous wheat fertilised with 75 kg N/ha. In contrast, grain protein concentrations of most wheat crops following pasture responded positively, being substantially higher than those of unfertilised wheat but similar to those of wheat fertilised with 75 kg N/ha. Grain protein averaged over all years of assay was increased by 25-40% compared with that of unfertilised wheat. Stored water supplies after pasture were < 134 mm (< 55% of plant-available water capacity); for most assay crops, water storages were 67-110 mm, an equivalent wet soil depth of only 0.3-0.45 m. Thus, the crop assays of pasture benefits were limited by low water supply to the wheat crops. Moreover, the severity of common root rot in the wheat crops was not reduced by the pasture-wheat rotation.
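The same kind of sketch applies to the mixed-pasture slopes averaged over the three pastures (again, the zero intercept is an illustrative assumption):

```python
# Component dry-matter responses (t/ha per 100 mm rainfall) averaged
# over the three pastures; slopes from the abstract, intercept assumed.
SLOPES_T_PER_100MM = {"grass": 0.52, "legume": 0.44, "total": 0.97}

def pasture_dm(rainfall_mm):
    """Dry matter (t/ha) of each pasture component for a given rainfall."""
    return {k: round(rainfall_mm / 100 * v, 2)
            for k, v in SLOPES_T_PER_100MM.items()}

print(pasture_dm(600))  # {'grass': 3.12, 'legume': 2.64, 'total': 5.82}
```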
Abstract:
No-tillage (NT) practice, where straw is retained on the soil surface, is increasingly being used in cereal cropping systems in Australia and elsewhere. Compared with conventional tillage (CT), where straw is mixed with the ploughed soil, NT practice may reduce straw decomposition, increase nitrogen immobilisation and increase organic carbon in the soil. This study examined 15N-labelled wheat straw (stubble) decomposition in four treatments (NT v. CT, with N rates of 0 and 75 kg/ha per year) and assessed the tillage and fertiliser N effects on mineral N and organic C and N levels over a 10-year period in a field experiment. NT practice decreased the rate of straw decomposition while fertiliser N application increased it; however, there was no tillage practice × N interaction. The mean residence time of the straw N in soil was more than twice as long under NT (1.2 years) as under CT (0.5 years). In comparison, differences in mean residence time due to N fertiliser treatment were small. However, tillage generally had very little effect on either the amounts of mineral N at sowing or soil organic C (and N) over the study period. While application of N fertiliser increased mineral N, it had very little effect on organic C over the 10-year period. The relatively rapid decomposition of straw and short mean residence time of straw N in a Vertisol are likely to have very little long-term effect on N immobilisation and organic C levels in an annual cereal cropping system in a subtropical, semi-arid environment. Thus, changing the tillage practice from CT to NT may not necessitate additional N inputs unless use is made of additional stored soil water, or mineral N losses due to increased leaching must be compensated for in the N supply to crops.
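If decomposition is taken to be first order (a common assumption that the abstract does not state explicitly), the mean residence time (MRT) is the inverse of the decay constant, and the two tillage practices imply quite different persistence of straw N; a minimal sketch:

```python
import math

# Fraction of straw N remaining after t years, assuming first-order
# decay with rate k = 1/MRT; MRT values (years) are from the abstract.
def fraction_remaining(t_years, mrt_years):
    return math.exp(-t_years / mrt_years)

for practice, mrt in (("NT", 1.2), ("CT", 0.5)):
    print(practice, [round(fraction_remaining(t, mrt), 2) for t in (0.5, 1, 2)])
# NT [0.66, 0.43, 0.19]  <- straw N lingers under no-tillage
# CT [0.37, 0.14, 0.02]
```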
Abstract:
Negative potassium (K) balances in all broadacre grain cropping systems in northern Australia are resulting in a decline in the plant-available reserves of K and necessitating a closer examination of strategies to detect and respond to developing K deficiency in clay soils. Grain growers on the Red Ferrosol soils have increasingly encountered K deficiency over the last 10 years due to lower available K reserves in these soils in their native condition. However, the problem is now increasingly evident on the medium-heavy clay soils (Black and Grey Vertosols) and is made more complicated by the widespread adoption of direct-drill cropping systems and the resulting strong stratification of available K reserves in the top 0.05-0.1 m of the soil profile. This paper reports glasshouse studies examining the fate of applied K fertiliser in key cropping soils of the inland Burnett region of south-east Queensland, and uses the resultant understanding of K dynamics to interpret results of field trials assessing the effectiveness of K application strategies in terms of K availability to crop plants. At similar concentrations of exchangeable K (K-exch), soil solution K concentrations and the activity of K in the soil solution (AR(K)) varied 6-7-fold between soil types. When K-exch arising from different rates of fertiliser application was expressed as a percentage of the effective cation exchange capacity (i.e. K saturation), there was evidence of greater selective adsorption of K on the exchange complex of Red Ferrosols than of Black and Grey Vertosols or Brown Dermosols. Both soil solution K and AR(K) were much less responsive to increasing K-exch in the Black Vertosols; this is indicative of these soils having a high K buffer capacity (KBC). These contrasting properties have implications for the rate of diffusive supply of K to plant roots and the likely impact of K application strategies (banding v. broadcast and incorporation) on plant K uptake. Field studies investigating K application strategies (banding v. broadcasting) and the interaction with the degree of soil disturbance/mixing on different soil types are discussed in relation to the K dynamics derived from the glasshouse studies. A greater propensity to accumulate luxury K in crop biomass was observed in a Brown Ferrosol with a KBC lower than that of a Black Vertosol, consistent with more efficient diffusive supply to plant roots in the Ferrosol. This luxury K uptake, when combined with crops exhibiting low proportional removal of K in the harvested product (i.e. coarse grains and winter cereals with a low K harvest index) and residue retention, can lead to rapid re-development of stratified K profiles. There was clear evidence that some incorporation of K fertiliser into soil was required to facilitate root access and crop uptake, although there was no evidence of a need to incorporate K fertiliser any deeper than achieved by conventional disc tillage (i.e. 0.1-0.15 m). Recovery of fertiliser K applied in deep (0.25-0.3 m) bands in combination with N and P to facilitate root proliferation was quite poor in Red Ferrosols and in Grey or Black Vertosols with moderate effective cation exchange capacity (ECEC, 25-35 cmol(+)/kg), was reasonable but not enough to overcome K deficiency in a Brown Dermosol (ECEC 11 cmol(+)/kg), but was quite good on a Black Vertosol (ECEC 50-60 cmol(+)/kg).
Collectively, the results suggest that frequent small applications of K fertiliser, preferably with some soil mixing, are an effective fertiliser application strategy on lighter clay soils with low KBC and an effective diffusive supply mechanism. Alternatively, concentrated K bands and enhanced root proliferation around them may be a more effective strategy in Vertosol soils with high KBC and limited diffusive supply. Further studies to assess this hypothesis are needed.
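For readers unfamiliar with the K-saturation bookkeeping used above, a minimal sketch (the K-exch value is hypothetical; the ECEC values span the ranges quoted in the abstract):

```python
# K saturation = exchangeable K as a percentage of the effective cation
# exchange capacity (ECEC). The K_exch value is hypothetical; ECEC
# values (cmol(+)/kg) span the ranges quoted in the abstract.

def k_saturation(k_exch, ecec):
    return 100 * k_exch / ecec

k_exch = 0.6  # cmol(+)/kg, hypothetical
for soil, ecec in (("Brown Dermosol", 11), ("Grey Vertosol", 30),
                   ("Black Vertosol", 55)):
    print(f"{soil}: {k_saturation(k_exch, ecec):.1f}% K saturation")
# Brown Dermosol: 5.5%, Grey Vertosol: 2.0%, Black Vertosol: 1.1%
# The same K-exch is a far smaller ECEC fraction on high-ECEC Vertosols.
```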
Abstract:
Root-lesion nematodes (RLNs) are found on 75% of grain farms in southern Queensland (QLD) and northern New South Wales (NSW) and are significant pests. This project confirmed that biological suppression of RLNs occurs in soils, examined which organisms are involved and how growers might enhance the suppressiveness of their soils. Field trials, and glasshouse and laboratory bioassays of soils from fields with contrasting management practices, showed that suppressiveness is favoured by less tillage, more stubble and continuous intensive cropping, particularly in the top 15 cm of soil. Through extensive surveys, the key organisms (Pasteuria bacteria, nematode-trapping fungi and predatory nematodes) were isolated and identified as present.
Abstract:
A field experiment was established in which an amendment of poultry manure and sawdust (200 t/ha) was incorporated into some plots but not others, and then a permanent pasture or a sequence of biomass-producing crops was grown with and without tillage, with all biomass being returned to the soil. After 4 years, soil C levels were highest in amended plots, particularly those that had been cropped using minimum tillage, and lowest in non-amended and fallowed plots, regardless of how they had been tilled. When ginger was planted, symphylans caused severe damage in all treatments, indicating that the cropping, tillage and organic matter management practices commonly used to improve soil health are not necessarily effective for all crops or soils. During the rotational phase of the experiment, the development of suppressiveness to three key pathogens of ginger was monitored using bioassays. Results for root-knot nematode (Meloidogyne javanica) indicated that for the first 2 years, amended soil was more suppressive than non-amended soil from the same cropping and tillage treatment, whereas under pasture, the amendment only enhanced suppressiveness in the first year. Suppressiveness was generally associated with higher C levels and enhanced biological activity (as measured by the rate of fluorescein diacetate (FDA) hydrolysis and numbers of free-living nematodes). Reduced tillage also enhanced suppressiveness, as gall ratings and egg counts in the second and third years were usually significantly lower in cropped soils under minimum rather than conventional tillage. Additionally, soil that was not disturbed during the process of setting up bioassays was more suppressive than soil that had been gently mixed by hand. Results of bioassays with Fusarium oxysporum f. sp. zingiberi were too inconsistent to draw firm conclusions, but the severity of fusarium yellows was generally higher in fumigated fallow soil than in other treatments, with soil management practices having little impact on disease severity. With regard to Pythium myriotylum, biological factors capable of reducing rhizome rot were present, but they were not effective enough to suppress the disease under environmental conditions that were ideal for disease development.
Abstract:
Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and a Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program on forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites, respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time the ground was under plant cover. When grown as a sole crop, well fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha per crop achieved a significantly higher yield and forage N concentration than did the industry-standard forage sorghum (conventional tillage and 55 kg N/ha per crop) on the Vertosol but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM per kg fertiliser N applied above the base rate); the difference in the response between the two sites was attributed to soil type and fertiliser history. Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum, grown as a sole crop, was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-crop oats, the alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon, and systems that included sorghum showed most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will advantage both farm productivity and the soil-resource base.
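The quoted response efficiency on the Vertosol can be recovered directly from the figures in the abstract; a quick check:

```python
# Fertiliser-N response efficiency on the Vertosol: yield rose from
# 5.65 to 9.64 t DM/ha as the N rate rose from 55 to 175 kg N/ha per crop.
extra_dm_kg = (9.64 - 5.65) * 1000  # 3990 kg DM/ha
extra_n_kg = 175 - 55               # 120 kg N/ha
print(round(extra_dm_kg / extra_n_kg))  # 33 kg DM per kg N, as quoted
```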
Abstract:
Field studies were conducted over 5 years on two dairy farms in southern Queensland to evaluate the impacts of zero-tillage, nitrogen (N) fertiliser and legumes on a winter-dominant forage system based on raingrown oats. Oats was successfully established using zero-tillage methods, with no yield penalties and potential benefits in stubble retention over the summer fallow. N fertiliser, applied at above industry-standard rates (140 vs. 55 kg/ha per crop) in the first 3 years, increased forage N concentration significantly and had residual effects on soil nitrate-N at both sites. At one site, crop yield was increased by 10 kg DM/ha per kg fertiliser N applied above industry-standard rates. The difference between sites in fertiliser response reflected contrasting soil and fertiliser histories. There was no evidence that the modifications to oats cropping practices (zero-tillage and increased N fertiliser) increased surface soil organic carbon (0-10 cm) in the time frame of the present study. When oats was substituted with annual legumes, there were benefits in improved forage N content of the oat crop immediately following, but legume yield was significantly inferior to oats. In contrast, the perennial legume Medicago sativa was competitive with oats in biomass production and forage quality at both sites and increased soil nitrate-N levels following termination. However, its contribution to winter forage was low, at 10% of total production compared with 40% for oats, and soil water reserves were significantly reduced at one site, which had an impact on the following oat production. The study demonstrated that productive grazed oat crops can be grown using zero tillage and that increased N fertiliser is more consistent in its effect on N concentration than on forage yield. A lucerne ley provides a strategy for raising soil nitrate-N concentration and increasing overall forage productivity, although winter forage production is reduced.
Abstract:
Nitrous oxide (N2O) is a potent greenhouse gas and the predominant ozone-depleting substance in the atmosphere. Agricultural nitrogenous fertiliser use is the major source of human-induced N2O emissions. A field experiment was conducted at Bundaberg from October 2012 to September 2014 to examine the impacts of legume crop (soybean) rotation as an alternative nitrogen (N) source on N2O emissions during the fallow period and to investigate low-emission soybean residue management practices. An automatic monitoring system and manual gas sampling chambers were used to measure greenhouse gas emissions from the soil. Soybean cropping during the fallow period reduced N2O emissions compared with the bare fallow. Based on the N content of the soybean crop residues, the fertiliser N application rate was reduced by about 120 kg N/ha for the subsequent sugarcane crop. Consequently, emissions of N2O during the sugarcane cropping season were significantly lower from the soybean-cropped soil than from the conventionally fertilised (145 kg N/ha) soil following bare fallow. However, tillage that incorporated the soybean crop residues into the soil promoted N2O emissions in the first two months. Spraying a nitrification inhibitor (DMPP) onto the soybean crop residues before tillage effectively prevented the N2O emission spikes. Compared with conventional tillage, practising no-till, with or without growing a nitrogen catch crop between soybean harvest and cane planting, also reduced N2O emissions substantially. These results demonstrate that soybean rotation during the fallow period, followed by N-conservation management practices, could offer a promising N2O mitigation strategy in sugarcane farming. Further investigation is required to provide guidance on N and water management following a soybean fallow to maintain sugar productivity.
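A minimal sketch of the residue-N credit logic described above (the rates are the abstract's; the clamping function is an illustrative assumption, not the authors' recommendation method):

```python
# Fertiliser-N budgeting with a legume residue credit. The conventional
# rate (145 kg N/ha) and the ~120 kg N/ha soybean residue credit are
# from the abstract; the max() clamp is an assumption.

def fertiliser_rate(conventional_rate_kg, residue_n_credit_kg):
    return max(0, conventional_rate_kg - residue_n_credit_kg)

print(fertiliser_rate(145, 120))  # 25 kg N/ha for the following cane crop
```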
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia, has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) of 1, 2, 3 or 4 years duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of the lucerne leys. Soil N increased over time following 2 years of lucerne, but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and the soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model. Although significant N accretion occurred in the soil following lucerne leys, recharge of the dried soil profile following long-duration lucerne occurred only after 3 years in the drier seasons. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits against plant-available water deficits, which are most likely to occur under the highly variable rainfall conditions of this region.
Abstract:
A study was undertaken from 2004 to 2007 to investigate factors associated with the decreased efficacy of metalaxyl in managing damping-off of cucumber in Oman. A survey over six growing seasons showed that growers lost up to 14.6% of seedlings following application of metalaxyl. No resistance to metalaxyl was found among Pythium isolates. Damping-off disease in the surveyed greenhouses followed two patterns. In most (69%) greenhouses, seedling mortality occurred shortly after transplanting and decreased thereafter (Phase-I). However, a second phase of seedling mortality (Phase-II) appeared 9-14 d after transplanting in about 31% of the surveyed greenhouses. Analysis of the rate of biodegradation of metalaxyl in six greenhouses indicated a significant increase in the rate of metalaxyl biodegradation in greenhouses that encountered Phase-II damping-off. The half-life of metalaxyl dropped from 93 d in soil that had received no previous metalaxyl treatment to 14 d in soil that had received metalaxyl for eight consecutive seasons, indicating an enhanced rate of metalaxyl biodegradation after repeated use. Multiple applications of metalaxyl helped reduce the appearance of Phase-II damping-off. This appears to be the first report of rapid biodegradation of metalaxyl in greenhouse soils and the first report of its association with the appearance of a second phase of mortality in cucumber seedlings.
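Assuming first-order degradation (the standard model behind half-life figures), the two half-lives imply very different persistence of the fungicide; a minimal sketch:

```python
# Fraction of metalaxyl remaining after t days under first-order
# degradation; the half-lives (93 d and 14 d) are from the abstract.
def remaining(t_days, half_life_days):
    return 0.5 ** (t_days / half_life_days)

for label, t_half in (("no prior use", 93), ("8 seasons of use", 14)):
    print(label, round(remaining(28, t_half), 2))
# no prior use: 0.81 of the dose left after 28 d
# 8 seasons of use: 0.25 left after 28 d
```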
Abstract:
Aims: To investigate the occurrence and levels of Arcobacter spp. in pig effluent ponds and effluent-treated soil. Methods and Results: A Most Probable Number (MPN) method was developed to assess the levels of Arcobacter spp. in seven pig effluent ponds and six effluent-treated soils, immediately after effluent irrigation. Arcobacter spp. levels varied from 6.5 × 10^5 to 1.1 × 10^8 MPN per 100 ml in the effluent ponds and from 9.5 × 10^2 to 2.8 × 10^4 MPN per g in freshly irrigated soils, across all piggery environments tested. Eighty-three Arcobacter isolates were subjected to an abbreviated phenotypic test scheme and examined using a multiplex polymerase chain reaction (PCR). The PCR identified 35% of these isolates as Arcobacter butzleri and 49% as Arcobacter cryaerophilus, while 16% gave no band. All 13 nonreactive isolates were subjected to partial 16S rDNA sequencing and showed a high similarity (>99%) to Arcobacter cibarius. Conclusions: A. butzleri, A. cryaerophilus and A. cibarius were isolated from both piggery effluent and effluent-irrigated soil, at levels suggestive of good survival in the effluent pond. Significance and Impact of the Study: This is the first study to provide quantitative information on Arcobacter spp. levels in piggery effluent and to associate A. cibarius with pigs and piggery effluent environments.
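For context, an MPN estimate is the organism density that maximises the likelihood of the observed pattern of positive tubes across dilutions; a generic sketch of that calculation (the dilution series below is hypothetical, not the study's protocol):

```python
import math

# Maximum-likelihood MPN from a dilution series: solve
#   sum_i g_i*v_i*exp(-m*v_i)/(1 - exp(-m*v_i)) = sum_i (t_i - g_i)*v_i
# for density m, where v_i = inoculum volume, t_i = tubes, g_i = positives.
def mpn(volumes_ml, tubes, positives, lo=1e-6, hi=1e6):
    def score(m):
        left = sum(g * v * math.exp(-m * v) / (1 - math.exp(-m * v))
                   for v, g in zip(volumes_ml, positives) if g > 0)
        right = sum((t - g) * v for v, t, g in zip(volumes_ml, tubes, positives))
        return left - right
    for _ in range(200):          # bisection on a log scale
        mid = math.sqrt(lo * hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Hypothetical 5-tube series with 10, 1 and 0.1 ml inocula, pattern 5-3-1
print(round(mpn([10, 1, 0.1], [5, 5, 5], [5, 3, 1]), 2))
# ~1.1 organisms per ml, consistent with standard MPN tables
```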
Abstract:
The APSIM-Wheat module was used to investigate our present capacity to simulate wheat yields in a semi-arid region of eastern Australia (the Victorian Mallee), where hostile subsoils associated with salinity, sodicity and boron toxicity are known to limit grain yield. In this study we tested whether the effects of subsoil constraints on wheat growth and production could be modelled with APSIM-Wheat by assuming that either: (a) root exploration within a particular soil layer was reduced by the presence of toxic concentrations of salts, or (b) soil water uptake from a particular soil layer was reduced by high concentrations of salts through osmotic effects. After evaluating the improved predictive capacity of the model, we applied it to study the interactions between subsoil constraints and seasonal conditions, and to estimate the economic effect that subsoil constraints have on wheat farming in the Victorian Mallee under different climatic scenarios. Although the soils had high levels of salinity, sodicity and boron, the observed variability in root abundance at different soil layers was mainly related to soil salinity. We concluded that: (i) whether the effect of subsoil limitations on growth and yield of wheat in the Victorian Mallee is driven by toxic effects, osmotic effects, or both acting simultaneously still requires further research; and (ii) at present, the performance of APSIM-Wheat in the region can be improved either by assuming increased values of the lower limit for soil water extraction, or by modifying the pattern of root exploration in the soil profile, both as a function of soil salinity. The effect of subsoil constraints on wheat yield and gross margin can be expected to be higher during drier than wetter seasons. In this region, the interaction between climate and soil properties makes rainfall information alone of little use for risk management and farm planning when not integrated with cropping systems models.
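A minimal sketch of approach (b) as described, raising the crop lower limit of each soil layer with salinity so that plant-available water capacity shrinks (the profile data and the linear adjustment are hypothetical illustrations, not APSIM's internal implementation):

```python
# Plant-available water capacity (PAWC) with a salinity-adjusted crop
# lower limit (LL). Layer data (thickness mm, DUL, LL, ECe dS/m) and the
# linear LL shift per unit ECe are hypothetical, for illustration only.

def adjusted_pawc(layers, ll_shift_per_dsm=0.005):
    total = 0.0
    for thickness_mm, dul, ll, ece_dsm in layers:
        ll_adj = min(dul, ll + ll_shift_per_dsm * ece_dsm)  # raise LL with salinity
        total += (dul - ll_adj) * thickness_mm
    return total

profile = [(300, 0.40, 0.20, 1.0),   # benign topsoil
           (300, 0.42, 0.22, 6.0),   # moderately saline
           (600, 0.45, 0.25, 12.0)]  # hostile subsoil
print(round(adjusted_pawc(profile), 1))                      # ~193.5 mm constrained
print(round(adjusted_pawc(profile, ll_shift_per_dsm=0), 1))  # 240.0 mm unconstrained
```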