9 results for Subsurface Geology
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Quantifying the local crop response to irrigation is important for establishing adequate irrigation management strategies. This study evaluated the effect of irrigation applied with subsurface drip irrigation on field corn (Zea mays L.) evapotranspiration (ETc), yield, water use efficiencies (WUE = yield/ETc and IWUE = yield/irrigation), and dry matter production in the semiarid climate of west central Nebraska. Eight treatments were imposed with irrigation amounts ranging from 53 to 356 mm in 2005 and from 22 to 226 mm in 2006. A soil water balance approach (based on FAO-56) was used to estimate daily soil water and ETc. Treatments resulted in seasonal ETc of 580-663 mm and 466-656 mm in 2005 and 2006, respectively. Yields among treatments differed by as much as 22% in 2005 and 52% in 2006. In both seasons, irrigation significantly affected yields, which increased with irrigation up to a point where irrigation became excessive. Distinct relationships were obtained each season. Yields increased linearly with seasonal ETc (R² = 0.89) and with ETc/ETp (R² = 0.87) (ETp = ETc with no water stress). The yield response factor (ky), which indicates the relative reduction in yield per relative reduction in ETc, averaged 1.58 over the two seasons. WUE increased non-linearly with seasonal ETc and with yield. WUE was more sensitive to irrigation during the drier 2006 season than during 2005. In both seasons, IWUE decreased sharply with irrigation. Irrigation significantly affected dry matter production and its partitioning into the different plant components (grain, cob, and stover). On average, the grain accounted for the majority of the above-ground plant dry mass (≈59%), followed by the stover (≈33%) and the cob (≈8%). The dry mass of the whole plant and of each plant component tended to increase with seasonal ETc.
The strong relationships obtained in the study between crop performance indicators and seasonal ETc demonstrate that accurate estimates of ETc on a daily and seasonal basis can be valuable for making tactical in-season irrigation management decisions and for strategic irrigation planning and management.
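The efficiency and yield-response indicators used in this abstract follow directly from their definitions (WUE = yield/ETc, IWUE = yield/irrigation, and the FAO-33-style yield response factor ky). A minimal sketch, with purely illustrative input values rather than the study's data:

```python
# Water-productivity indicators as defined in the abstract (FAO-33-style ky).
# All numeric inputs below are illustrative, not the study's data.

def wue(yield_kg_ha, etc_mm):
    """Water use efficiency: yield per unit of seasonal crop ET (ETc)."""
    return yield_kg_ha / etc_mm

def iwue(yield_kg_ha, irrigation_mm):
    """Irrigation water use efficiency: yield per unit of irrigation applied."""
    return yield_kg_ha / irrigation_mm

def ky(y, y_p, etc, etp):
    """Yield response factor: (1 - Y/Yp) / (1 - ETc/ETp),
    i.e. relative yield reduction per relative ETc reduction."""
    return (1 - y / y_p) / (1 - etc / etp)

print(wue(12000, 600))               # kg ha-1 per mm of ETc
print(iwue(12000, 300))              # kg ha-1 per mm of irrigation
print(round(ky(10000, 12000, 550, 650), 2))
```

A ky above 1, as reported here (1.58), means yield falls proportionally faster than ETc, which is why stress during sensitive stages is so costly.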
Abstract:
In semi-arid areas such as western Nebraska, interest in subsurface drip irrigation (SDI) for corn is increasing due to restricted irrigation allocations. However, quantification of the crop response to nitrogen (N) applications with SDI, and of the environmental benefits of multiple in-season (IS) SDI N applications instead of a single early-season (ES) surface application, is lacking. The study was conducted in 2004, 2005, and 2006 at the University of Nebraska-Lincoln West Central Research and Extension Center in North Platte, Nebraska, comparing two N application methods (IS and ES) and three N rates (128, 186, and 278 kg N ha⁻¹) using a randomized complete block design with four replications. No grain yield or biomass response was observed in 2004. In 2005 and 2006, corn grain yield and biomass production increased with increasing N rates, and the IS treatment increased grain yield, total N uptake, and gross return after N application costs (GRN) compared to the ES treatment. Chlorophyll meter readings taken at the R3 corn growth stage in 2006 showed that less N was supplied to the plant with the ES treatment than with the IS treatment. At the end of the study, soil NO3-N masses in the 0.9 to 1.8 m depth were greater under the IS treatment than under the ES treatment. Results suggested that greater losses of NO3-N below the root zone under the ES treatment may have had a negative effect on corn production. Under SDI systems, fertigating a recommended N rate at various corn growth stages can increase yields and GRN and reduce NO3-N leaching in soils compared to concentrated early-season applications.
Abstract:
Australian cotton (Gossypium hirsutum L.) is predominantly grown on heavy clay soils (Vertosols). Cotton grown on Vertosols often experiences episodes of low oxygen concentration in the root-zone, particularly after irrigation events. In subsurface drip irrigation (SDI), cotton receives frequent irrigation and sustained wetting fronts develop in the rhizosphere. This can lead to poor diffusion of oxygen in the soil, causing temporal and spatial hypoxia. As cotton is sensitive to waterlogging, exposure to this condition can result in a significant yield penalty. Use of aerated water for drip irrigation (‘oxygation’) can ameliorate hypoxia in the wetting front and, therefore, overcome the negative effects of poor soil aeration. The efficacy of oxygation, delivered via SDI to broadacre cotton, was evaluated over seven seasons (2005–06 to 2012–13). Oxygation of the irrigation water with a Mazzei air-injector produced significantly (P < 0.001) higher yields (200.3 v. 182.7 g m–2) and water-use efficiencies. Averaged over the seven years, the yield and gross production water-use index of oxygated cotton exceeded those of the control by 10% and 7%, respectively. The improvements in yield and water-use efficiency in response to oxygation could be ascribed to greater root development and increased light interception by the crop canopies, which enhanced crop physiological performance by ameliorating exposure to hypoxia. Oxygation of SDI thus improved both yield and water-use efficiency, which may improve the economic feasibility of SDI for broadacre cotton production in Vertosols.
Abstract:
Soils with high levels of chloride and/or sodium in their subsurface layers are often referred to as having subsoil constraints (SSCs). There is growing evidence that SSCs affect wheat yields by increasing the lower limit of a crop's available soil water (CLL) and thus reducing the soil's plant-available water capacity (PAWC). This hypothesis was tested by simulation of 33 farmers' paddocks in south-western Queensland and north-western New South Wales. The simulated results accounted for 79% of the observed variation in grain yield, with a root mean squared deviation (RMSD) of 0.50 t/ha. This result was as close as any achieved from sites without SSCs, providing strong support for the proposed mechanism that SSCs affect wheat yields by increasing the CLL and thus reducing the soil's PAWC. To reduce the need to measure the CLL of every paddock or management zone, two additional approaches to simulating the effects of SSCs were tested. In the first approach, the CLL of the 0.3-0.5 m soil layer was taken as the reference CLL of a soil regardless of its level of SSCs, and the CLL values of soil layers below 0.5 m depth were calculated as a function of this reference CLL, of soil depth, and of one of the SSC indices EC, Cl, ESP, or Na. The best estimates of subsoil CLL values were obtained when the effects of SSCs were described by an ESP-dependent function. In the second approach, depth-dependent CLL values were also derived from the CLL values of the 0.3-0.5 m soil layer; however, instead of using SSC indices to further modify CLL, the default values of the water-extraction coefficient (kl) of each depth layer were modified as a function of the SSC indices. The strength of this approach was evaluated on the basis of the correlation between observed and simulated grain yields. In this approach the best estimates were obtained when the default kl values were multiplied by a Cl-determined function.
The kl approach was also evaluated with respect to simulated soil moisture at anthesis and at grain maturity. Results using this approach were highly correlated with soil moisture results obtained from simulations based on the measured CLL values. This research provides strong evidence that the effects of SSCs on wheat yields are accounted for by the effects of these constraints on wheat CLL values. The study also produced two satisfactory methods for simulating the effects of SSCs on CLL and on grain yield. While Cl and ESP proved to be effective indices of SSCs, EC was not effective due to the confounding effect of the presence of gypsum in some of these soils. This study provides the tools necessary for investigating the effects of SSCs on wheat crop yields and natural resource management (NRM) issues such as runoff, recharge, and nutrient loss through simulation studies. It also facilitates investigation of suggested agronomic adaptations to SSCs.
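The mechanism tested in this abstract (SSCs raising the CLL and thereby shrinking PAWC) can be made concrete with the standard layer-sum definition of PAWC. A minimal sketch; the soil profile values are hypothetical, chosen only to show the direction of the effect:

```python
# PAWC as the layer sum of (DUL - CLL) * thickness; raising CLL in
# SSC-affected subsoil layers shrinks PAWC. Profile values are hypothetical.

def pawc_mm(layers):
    """Plant-available water capacity (mm) from (DUL, CLL, thickness_mm)
    tuples, where DUL and CLL are volumetric water fractions."""
    return sum((dul - cll) * thickness for dul, cll, thickness in layers)

# Same profile with and without a chloride/sodium-affected layer below 0.5 m:
unconstrained = [(0.45, 0.25, 300), (0.45, 0.27, 300), (0.44, 0.30, 400)]
constrained = [(0.45, 0.25, 300), (0.45, 0.27, 300), (0.44, 0.38, 400)]

print(pawc_mm(unconstrained))  # deepest layer fully extractable
print(pawc_mm(constrained))    # higher subsoil CLL -> smaller PAWC
```

The study's two approaches then either predict the deeper CLL values from the 0.3-0.5 m reference layer plus an SSC index, or leave CLL alone and scale the water-extraction coefficient kl instead.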
Abstract:
Water regulations have decreased irrigation water supplies in Nebraska and in other areas of the U.S. Great Plains. When the available water is not enough to meet crop water requirements over the entire growing cycle, it becomes critical to know the irrigation timing that will maximize yields and profits. This study evaluated the effect of the timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. In both years, all treatments resulted in no crop stress during the vegetative period and stress during the reproductive stages, which affected ETc, DM, yield, WUE, and IWUE. Among treatments, ETc varied by 7.2 and 18.8%; yield by 17 and 33%; WUE by 12 and 22%; and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the whole plant, grain, and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but it did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R²) with yield. This positive correlation decreased considerably for irrigation applied in August and became negative for irrigation applied in September.
The strongest positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 from crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August, and September was a good strategy, producing the highest yields in 2005 but not in 2006. Applying a larger proportion of the allocation in July was a good strategy in both years, whereas applying a large proportion of the allocation in September had the opposite result. The different results obtained between years indicate that flexible irrigation scheduling techniques should be adopted, rather than fixed timing strategies.
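The soil water deficit factor Ks referenced in this abstract is conventionally computed from root-zone depletion following FAO-56. A minimal sketch; p = 0.55 is the FAO-56 tabulated depletion fraction for maize, and the numeric inputs are illustrative, not the study's data:

```python
# FAO-56 water stress coefficient Ks from root-zone depletion.
# p = 0.55 is the FAO-56 tabulated depletion fraction for maize;
# all numeric inputs below are illustrative.

def ks(taw_mm, depletion_mm, p=0.55):
    """Ks = 1 while depletion Dr <= RAW (= p * TAW);
    beyond RAW, Ks falls linearly to 0 at Dr = TAW."""
    raw = p * taw_mm
    if depletion_mm <= raw:
        return 1.0
    return max(0.0, (taw_mm - depletion_mm) / (taw_mm - raw))

print(ks(150, 60))             # depletion within RAW: no stress
print(round(ks(150, 120), 2))  # depletion beyond RAW: Ks < 1
```

Tracking Ks weekly, as the study did, shows when in the season a given depletion actually translates into yield loss.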
Abstract:
This two-year study examined the impacts of feral pig diggings on five ecological indicators: seedling survival, surface litter, subsurface plant biomass, earthworm biomass and soil moisture content. Twelve recovery exclosures were established in two habitats (characterised by wet and dry soil moisture) by fencing off areas of previous pig diggings. A total of 0.59 ha was excluded from further pig diggings and compared with 1.18 ha of unfenced control areas. Overall, seedling numbers increased 7% within the protected exclosures and decreased 37% within the unprotected controls over the two-year study period. A significant temporal interaction was found in the dry habitat, with seedling survival increasing with increasing time of protection from diggings. Feral pig diggings had no significant effect on surface litter biomass, subsurface plant biomass, earthworm biomass or soil moisture content.
Abstract:
The present study set out to test, through field and simulation studies, the hypothesis that the incorporation of short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), in a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated, large-plot experiments were established at five commercial properties using the properties' own machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study of various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea), and burgundy bean (Macroptilium bracteatum). After the legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment, and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P in the subsurface soil (20 kg P/ha). The lablab grown at the commercial sites yielded between 3 and 6 t/ha of forage over 2-3 month periods, whereas the following wheat crop, with no applied fertiliser, yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, yet its profitability was slightly higher than that of the wheat following fallow because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and the additional costs associated with fallow management and in-crop herbicide applications in a fallow-wheat system.
The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern using economic (PreCAPS) and biophysical (Agricultural Production Systems sIMulator, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and the simulation studies indicated that ~50% of the time a wheat crop would not be planted, or would fail to produce a profitable crop (grain yield less than 1 t/ha), because of low and unreliable winter rainfall. Forage lablab in summer, by contrast, would produce a profitable crop (forage yield of more than 3 t/ha) ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. 90%). Opportunistic double-cropping of lablab in summer and wheat in winter is also viable and profitable in 50% of years. The simulation studies also indicated that opportunistic lablab-wheat cropping can reduce the potential runoff+drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.
Abstract:
Medium bedding sand, which is commonly available in coastal sedimentary deposits, and a marine polychaete worm from Moreton Bay recently classified as Perinereis helleri (Nereididae) were deployed in a simple, low-maintenance sand filter design that potentially has application at large scale. Previous work had shown that this physical and biological combination can provide a new option for saline wastewater treatment, since the worms help to prevent the sand filter from blocking with organic debris and offer a profitable by-product. To test the application of this new concept in a commercial environment, six 1.84 m² polychaete-assisted sand filters were experimentally tested for their ability to treat wastewater from a semi-intensive prawn culture pond. Polychaetes produced exclusively on the waste nutrients that collected in these gravity-driven sand filters were assessed for their production levels and nutritional contents. Water parameters studied included temperature, salinity, pH, dissolved oxygen (DO), oxidation/reduction potential (redox), suspended solids, chlorophyll a, biological oxygen demand (BOD), and common forms of nitrogen and phosphorus. Pond water that had percolated through the sand bed had significantly lower pH, DO, and redox levels compared with inflow water. Suspended solids and chlorophyll a levels were consistently more than halved by the process. Reductions in BOD appeared dependent on regular subsurface flows. Only marginal reductions in total nitrogen and phosphorus were documented, but their forms were altered in a potentially useful way: dissolved forms (ammonia and orthophosphate) were generated by the process, and this remineralisation also seemed to be accentuated by intermittent flow patterns. Flow rates of approximately 1,500 L m⁻² d⁻¹ were achieved, suggesting that a 1 ha polychaete bed of this nature could treat the discharge from a 10 ha semi-intensive prawn farm.
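The scale-up suggested at the end of the paragraph above is simple unit arithmetic: at roughly 1,500 L m⁻² d⁻¹, a 1 ha (10,000 m²) bed passes about 15 ML per day. A quick check; the function name and unit choices are mine, for illustration only:

```python
# Scale-up of the measured percolation rate to a 1 ha bed.
# Function name and units are illustrative, not from the paper.

def daily_treatment_volume_m3(bed_area_ha, flow_l_per_m2_day):
    """Daily volume (m^3) a sand bed can pass at a given areal flow rate."""
    area_m2 = bed_area_ha * 10_000   # 1 ha = 10,000 m^2
    litres = area_m2 * flow_l_per_m2_day
    return litres / 1_000            # 1 m^3 = 1,000 L

print(daily_treatment_volume_m3(1, 1500))  # 15,000 m^3 (15 ML) per day
```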
Sixteen weeks after stocking the sand beds with one-month-old P. helleri, over 3.6 kg of polychaete biomass (wet weight) was recovered from the trial. Production on a sand bed area basis was 328 g m⁻². Similar (P>0.05) overall biomass production was found for the two stocking densities tested (2000 and 6000 m⁻²; n = 3), but survival was lower and more worms were graded as small (<0.6 g) at the higher density (28.2 ± 1.5% and approx. 88%, respectively) than at the lower density (46.8 ± 4.4% and approx. 76%, respectively). On a weight-for-weight basis, about half of the worm biomass produced was generally suitable for use as bait. The nutritional contents of the harvested worms were analysed for the different stocking densities and graded sizes. These factors did not significantly affect their percentages of dry matter (DM) (18.23 ± 0.57%), ash (19.77 ± 0.80% of DM), or gross energy (19.39 ± 0.29 MJ kg⁻¹ DM) (n = 12). Although stocking density did not affect the worms' nitrogen and phosphorus contents, small worms had a higher mean proportion of nitrogen and phosphorus (10.57 ± 0.17% and 0.70 ± 0.01% of DM, respectively) than large worms (9.99 ± 0.12% and 0.65 ± 0.01% of DM, respectively) (n = 6). More lipid was present in large worms grown at the medium density (11.20 ± 0.19%) than at the high density (9.50 ± 0.31%), and less was generally found in small worms (7.1-7.6% of DM). Mean cholesterol and total phospholipid levels were 5.24 ± 0.15 mg g⁻¹ and 13.66 ± 2.15 mg g⁻¹ DM, respectively (n = 12). Of the specific phospholipids tested, phosphatidyl-serine and sphingomyelin were below detection limits (<0.05 mg g⁻¹), whilst mean levels of phosphatidyl-ethanolamine, phosphatidyl-inositol, phosphatidyl-choline, and lysophosphatidyl-choline were 6.89 ± 1.09, 0.89 ± 0.26, 4.04 ± 1.17, and 1.84 ± 0.37 mg g⁻¹, respectively (n = 12).
Culture density generally had a more pronounced effect on phospholipid contents than did worm size. By contrast, worm size had a more pronounced effect on total fatty acid contents, with large worms containing significantly higher (P<0.001) levels on a DM basis (46.88 ± 2.46 mg g⁻¹) than smaller worms (27.76 ± 1.28 mg g⁻¹). A very broad range of fatty acids was detected, with palmitic acid being the most heavily represented class (up to 14.23 ± 0.49 mg g⁻¹ DM, or 27.28 ± 0.22% of total fatty acids). Other heavily represented classes included stearic acid (7.4-8.8%), vaccenic acid (6.8-7.8%), arachidonic acid (3.5-4.4%), eicosapentaenoic acid (9.9-13.8%), and docosenoic acid (5.7-7.0%). Stocking density did not affect (P>0.05) the levels of amino acids in polychaete DM, but on a weight-for-weight basis there was generally less of each amino acid tested in large worms than in small worms. This difference was significant (P<0.05) for the most heavily represented classes: glutamic acid (73-77 mg g⁻¹), aspartic acid (50-54 mg g⁻¹), and glycine (46-53 mg g⁻¹). These results demonstrate how this polychaete species can be planted and sorted at harvest according to various strategies aimed at providing biomass with specific physical and nutritional qualities for different uses.
Abstract:
Increased sediment and nutrient losses resulting from unsustainable grazing management in the Burdekin River catchment are major threats to water quality in the Great Barrier Reef Lagoon. To test the effects of grazing management on soil and nutrient loss, five 1 ha mini-catchments were established in 1999 under different grazing strategies on a sedimentary landscape near Charters Towers. Reference samples were also collected from watercourses in the Burdekin catchment during major flow events. Soil and nutrient loss were relatively low across all grazing strategies due to a combination of good cover, low slope, and low rainfall intensities. Total soil loss varied from 3 to 20 kg ha⁻¹ per event, while losses of N and P ranged from 10 to 1900 g ha⁻¹ and from 1 to 71 g ha⁻¹ per event, respectively. Water quality of runoff was considered moderate across all strategies, with relatively low levels of total suspended sediment (range: 8-1409 mg L⁻¹), total N (range: 101-4000 µg L⁻¹), and total P (range: 14-609 µg L⁻¹). However, treatment differences are likely to emerge with time as the impacts of the different grazing strategies on land condition become more apparent. Samples collected opportunistically from rivers and creeks during flow events displayed significantly higher levels of total suspended sediment (range: 10-6010 mg L⁻¹), total N (range: 650-6350 µg L⁻¹), and total P (range: 50-1500 µg L⁻¹) than those collected at the grazing trial. These differences can largely be attributed to variation in slope, geology, and cover between the grazing trial and the different catchments. In particular, watercourses draining hillier, grano-diorite landscapes with low cover had markedly higher sediment and nutrient loads than those draining flatter, sedimentary landscapes. These preliminary data suggest that on relatively flat, sedimentary landscapes, extensive cattle grazing is compatible with achieving water quality targets, provided high levels of ground cover are maintained.
In contrast, sediment and nutrient loss under grazing on more erodible land types is cause for serious concern. Long-term empirical research and monitoring will be essential to quantify the impacts of changed land management on water quality in the spatially and temporally variable Burdekin River catchment.