8 results for surface effects

in eResearch Archive - Queensland Department of Agriculture


Relevance:

40.00%

Publisher:

Abstract:

High-value fruit crops are exposed to a range of environmental conditions that can reduce fruit quality. Solar injury (SI), or sunburn, is a common disorder in tropical, sub-tropical, and temperate climates and is related to: 1) high fruit surface temperature; 2) high visible light intensity; and 3) ultraviolet radiation (UV). Positional changes caused by increasing fruit weight, or abrupt changes resulting from summer pruning, limb breakage, or other canopy damage, can expose fruit to solar radiation levels, surface temperatures, and UV doses higher than those to which they are adapted. In our studies, we examined the effects of high fruit surface temperature, saturating photosynthetically active radiation (PAR), and short-term UV exposure on chlorophyll fluorescence, respiration, and photosynthesis of fruit peel tissues from tropical and temperate fruit, in a simulation of these acute environmental changes. All tropical fruits (citrus, macadamia, avocado, pineapple, and custard apple) and the apple cultivars 'Gala', 'Gold Rush', and 'Granny Smith' increased dark respiration (A0) when exposed to UV, suggesting that UV repair mechanisms were induced. The maximum quantum efficiency of photosystem II (Fv/Fm) and the quantum efficiency of photosystem II (ΦII) were unaffected, indicating no adverse effects on photosystem II (PSII). In contrast, 'Braeburn' apple showed reduced Fv/Fm with no increase in A0 on all sampling dates. A consistent pattern emerged across all studies: when Fv/Fm was unaffected by UV treatment, A0 increased significantly; conversely, when Fv/Fm was reduced by UV treatment, A0 was unaffected. This pattern suggests that when UV repair mechanisms are effective, PSII is adequately protected, and that this protection comes at the cost of higher respiration.
However, when the UV repair mechanisms are ineffective, not only is PSII damaged, but the repair mechanisms themselves suffer additional short-term damage, indicated by the absence of a respiratory increase to provide energy for repair.

Relevance:

30.00%

Publisher:

Abstract:

The first larval instar has been identified as a critical stage for population mortality in Lepidoptera, yet because of the small body size of these larvae, the factors that contribute to mortality under field conditions are still not clear. Dispersal behaviour has been suggested as a significant but overlooked factor contributing to mortality in first-instar lepidopteran larvae. The impact that leaving the host plant has on the mortality rate of Helicoverpa armigera neonates was examined in field crops and laboratory trials. This study examined: (1) the effects of soil surface temperature, and the level of shade within the crop, on the mortality of neonates on the soil after dropping off from the host plant; (2) the percentage of neonates that dropped off from a host plant and landed on the soil; and (3) the effects of exposure to different soil surface temperatures on the development and mortality of neonates. The findings showed that: (1) on the soil, surface temperatures above 43°C were lethal for neonates, and exposure to these temperatures contributed greatly to the overall mortality rate observed; however, the fate of neonates on the soil varied significantly depending on canopy closure within the crop; (2) at least 15% of neonates dropped off from the host plant and landed on the soil, meaning that the proportion of neonates exposed to these conditions is not trivial; and (3) 30 min of exposure to soil surface temperatures approaching the lethal level (>43°C) had no significant negative effects on the development and mortality of larvae through to the second instar. Overall, leaving the plant through drop-off contributes to first-instar mortality in crops with open canopies; however, survival of neonates that have lost contact with a host plant is possible, and becomes more likely later in the crop growing season.

Relevance:

30.00%

Publisher:

Abstract:

Soil water repellency occurs widely in horticultural and agricultural soils when they are very dry. The gradual accumulation and breakdown of surface organic matter over time produces wax-like organic acids, which coat soil particles and prevent uniform entry of water into the soil. Water repellency is usually managed by regular surfactant applications. The term 'surfactant' derives from SURFace ACTive AgeNT; surfactants act by reducing the surface tension of water, allowing it to penetrate and wet the soil more easily and completely. This practice improves water use efficiency, both by requiring less water to wet the soil and by capturing rainfall and irrigation more effectively and rapidly. It also reduces nutrient losses through run-off, erosion, or leaching; these nutrients have the potential to pollute the surrounding environment and watercourses. This project investigated potential improvements to standard practices (product combination and scheduling) for surfactant use to overcome localised dry spots on water-repellent soils and thus improve turf quality and water use efficiency. Weather conditions for the duration of the trial prevented the identification of improved practices in terms of combination and scheduling. However, the findings support previous research showing that soil surfactants decreased the time for water to infiltrate dry soil samples taken from a previously severely hydrophobic site. Data will continue to be collected from this trial site on a private contractual basis, with the hope that improvements to standard practices will be observed during the drier winter months, when moisture availability is a limiting factor for turfgrass growth and quality.

Relevance:

30.00%

Publisher:

Abstract:

The impact of three cropping histories (sugarcane, maize and soybean) and two tillage practices (conventional tillage and direct drill) on plant-parasitic and free-living nematodes in the following sugarcane crop was examined in a field trial at Bundaberg. Soybean reduced populations of lesion nematode (Pratylenchus zeae) and root-knot nematode (Meloidogyne javanica) in comparison with previous crops of sugarcane or maize, but increased populations of spiral nematode (Helicotylenchus dihystera) and maintained populations of dagger nematode (Xiphinema elongatum). However, the effect of soybean on P. zeae and M. javanica was no longer apparent 15 weeks after planting sugarcane, while later in the season, populations of these nematodes following soybean were as high as or higher than after maize or sugarcane. Populations of P. zeae were initially reduced by cultivation but, due to strong resurgence, tended to be higher in conventionally tilled than direct drill plots at the end of the plant crop. Even greater tillage effects were observed with M. javanica and X. elongatum, as populations of these nematodes were significantly higher in conventionally tilled than direct drill plots late in the season. Populations of free-living nematodes in the upper 10 cm of soil were initially highest following soybean, but after 15, 35 and 59 weeks were lower than after sugarcane and contained fewer omnivorous and predatory nematodes. Conventional tillage increased populations of free-living nematodes in soil in comparison with direct drill, but was also detrimental to omnivorous and predatory nematodes. These results suggest that crop rotation and tillage not only affect plant-parasitic nematodes directly, but also have indirect effects by impacting the natural enemies that regulate nematode populations. More than 2 million nematodes/m² were often present in crop residues on the surface of direct drill plots.
Bacterial-feeding nematodes were predominant in residues early in the decomposition process, but fungal-feeding nematodes predominated after 15 weeks. This indicates that fungi become an increasingly important component of the detritus food web as decomposition proceeds, and that the rate of nutrient cycling decreases with time. Correlations between total numbers of free-living nematodes and mineral N concentrations in crop residues and surface soil suggested that the free-living nematode community may provide an indication of the rate of mineralisation of N from organic matter.

Relevance:

30.00%

Publisher:

Abstract:

Historical stocking methods of continuous, season-long grazing of pastures, with little account taken of growing conditions, have caused some degradation within grazed landscapes in northern Australia. Alternative stocking methods have been implemented to address this degradation and raise the productivity and profitability of the principal livestock, cattle. Because information comparing stocking methods is limited, an evaluation was undertaken to quantify the effects of stocking methods on pastures, soils and grazing capacity. The approach was to monitor existing stocking methods on nine commercial beef properties in north and south Queensland. Environments included native and exotic pastures and eucalypt (lighter soil) and brigalow (heavier soil) land types. Breeding and growing cattle were grazed under each method. The owners/managers, formally trained in pasture and grazing management, made all management decisions affecting the study sites. Three stocking methods were compared: continuous (with rest), extensive rotation and intensive rotation (commonly referred to as 'cell grazing'). Two or three stocking methods were examined on each property: in total, 21 methods (seven continuous, six extensive rotations and eight intensive rotations) were monitored over 74 paddocks between 2006 and 2009. Pasture and soil surface measurements were made in the autumns of 2006, 2007 and 2009, while paddock grazing was analysed from property records for the period 2006 to 2009. The first 2 years had drought conditions (average rainfall decile 3.4) but were followed by 2 years of above-average rainfall. There were no consistent differences between stocking methods across all sites over the 4 years for herbage mass, plant species composition, total and litter cover, or landscape function analysis (LFA) indices.
There were large responses to rainfall in the last 2 years, with mean herbage mass in the autumn increasing from 1970 kg DM ha⁻¹ in 2006-07 to 3830 kg DM ha⁻¹ in 2009. Over the same period, ground and litter cover and LFA indices increased. Across all sites and 4 years, mean grazing capacity was similar for the three stocking methods. There were, however, significant differences in grazing capacity between stocking methods at four sites, but these differences were not consistent between stocking methods or sites. Both the continuous and intensive rotation methods supported the highest average annual grazing capacity at different sites. The results suggest that cattle producers can obtain similar ecological responses and carry similar numbers of livestock under any of the three stocking methods.

Relevance:

30.00%

Publisher:

Abstract:

An estimated 110 Mt of dust is eroded by wind from the Australian land surface each year, most of which originates from the arid and semi-arid rangelands. Livestock production is thought to increase the susceptibility of the rangelands to wind erosion by reducing vegetation cover and modifying surface soil stability. However, research is yet to quantify the impacts of grazing land management on the erodibility of the Australian rangelands, or to determine how these impacts vary among land types and over time. We present a simulation analysis that links a pasture growth and animal production model (GRASP) to the Australian Land Erodibility Model (AUSLEM) to evaluate the impacts of stocking rate, stocking strategy and land condition on the erodibility of four land types in western Queensland, Australia. Our results show that declining land condition, overstocking, and the use of inflexible stocking strategies have the potential to increase land erodibility and amplify accelerated soil erosion. However, land erodibility responses to grazing are complex, influenced by land type sensitivities to different grazing strategies and by local climate characteristics. Our simulations show that land types that are more resilient to livestock grazing tend to be least susceptible to accelerated wind erosion. Increases in land erodibility occur most often during climatic transitions, when vegetation cover is most sensitive to grazing pressure; grazing effects are limited during extreme wet and dry periods, when the influence of climate on vegetation cover is strongest. Our research provides the opportunity to estimate the effects of different land management practices across a range of land types, and a better understanding of the mechanisms of accelerated erosion resulting from pastoral activities. The approach could also support assessment of land erodibility at a broader scale, particularly if combined with wind erosion models.

Relevance:

30.00%

Publisher:

Abstract:

Extensive cattle grazing is the dominant land use in northern Australia. It has been suggested that grazing intensity and rainfall have profound effects on the dynamics of soil nutrients in northern Australia's semi-arid rangelands, yet previous studies have found positive, neutral and negative effects of grazing pressure on soil nutrients. These inconsistencies could be due to short-term experiments that do not capture the slow dynamics of some soil nutrients or the effects of interannual variability in rainfall. In a long-term cattle grazing trial in northern Australia on a Brown Sodosol–Yellow Kandosol complex, we analysed soil organic matter and mineral nitrogen in surface soils (0–10 cm depth) 11, 12 and 16 years after trial establishment, on experimental plots representing moderate stocking (stocked at the long-term carrying capacity for the region) and heavy stocking (stocked at twice the long-term carrying capacity). Higher soil organic matter was found under heavy stocking, although grazing treatment had little effect on mineral and total soil nitrogen. Interannual variability had a large effect on soil mineral nitrogen, but not on soil organic matter, suggesting that soil nitrogen levels observed in this soil complex may be affected by other indirect pathways, such as climate. The effect of interannual variability in rainfall, and the effects of other soil types, need to be explored further.

Relevance:

30.00%

Publisher:

Abstract:

Prescribed fire is one of the most widely used management tools for reducing fuel loads in managed forests. However, the long-term effects of repeated prescribed fires on soil carbon (C) and nitrogen (N) pools are poorly understood. This study aimed to investigate how different fire frequency regimes influence C and N pools in surface soils (0–10 cm). A prescribed fire field experiment in a wet sclerophyll forest, established in 1972 in southeast Queensland, was used in this study. The fire frequency regimes comprised long unburnt (NB), burnt every 2 years (2yrB) and burnt every 4 years (4yrB), with four replications. Compared with the NB treatment, the 2yrB treatment lowered soil total C by 44%, total N by 54%, HCl-hydrolysable C and N by 48% and 59%, KMnO4-oxidizable C by 81%, microbial biomass C and N by 42% and 33%, cumulative CO2–C by 28%, NaOCl-non-oxidizable C and N by 41% and 51%, and charcoal-C by 17%, respectively. The 4yrB and NB treatments showed no significant differences for these soil C and N pools. All labile, biologically active, recalcitrant and total soil C and N pools were correlated positively with each other and with soil moisture content, but negatively with soil pH. The C:N ratios of the different C and N pools were greater in the burned treatments than in the NB treatment. This study highlights that prescribed burning at a four-year interval is a more sustainable management practice for this subtropical forest ecosystem.