7 results for EFFECTIVE DIELECTRIC RESPONSE

in eResearch Archive - Queensland Department of Agriculture


Relevance: 30.00%

Abstract:

In coastal waters and estuaries, seagrass meadows are often subject to light deprivation over short time scales (days to weeks) in response to increased turbidity from anthropogenic disturbances. Seagrasses may exhibit negative physiological responses to light deprivation and suffer stress, or tolerate such stresses through photo-adaptation of physiological processes allowing more efficient use of low light. Pulse Amplitude Modulated (PAM) fluorometry has been used to rapidly assess changes in photosynthetic responses along in situ gradients in light. In this study, however, light is experimentally manipulated in the field to examine the photosynthesis of Halophila ovalis and Zostera capricorni. We aimed to evaluate the tolerance of these seagrasses to short-term light reductions. The seagrasses were subjected to four light treatments (0, 5, 60, and 90% shading) for a period of 14 days. In both species, as shading increased, the photosynthetic variables decreased significantly (P < 0.05), by up to 40% for maximum electron transport rates (ETRmax) and 70% for saturating irradiances (Ek). Photosynthetic efficiencies (α) and effective quantum yields (ΔF/Fm′) increased significantly (P < 0.05), in both species, for 90% shaded plants compared with 0% shaded plants. H. ovalis was more sensitive to 90% shading than Z. capricorni, showing greater reductions in ETRmax, indicative of a reduced photosynthetic capacity. An increase in Ek, Fm′ and ΔF/Fm′ for H. ovalis and Z. capricorni under 90% shading suggested an increase in photochemical efficiency and a more efficient use of low photon flux, consistent with photo-acclimation to shading. Similar responses were found along a depth gradient from 0 to 10 m, where depth-related changes in ETRmax and Ek in H. ovalis implied a strong difference in irradiance history between depths of 0 and 5-10 m. The results suggest that H. ovalis is more vulnerable to light deprivation than Z. capricorni and that H. ovalis, at depths of 5-10 m, would be more vulnerable to light deprivation than intertidal populations. Both species showed a strong degree of photo-adaptation to light manipulation that may enable them to tolerate and adapt to short-term reductions in light. These consistent responses to changes in light suggest that photosynthetic variables can be used to rapidly assess the status of seagrasses when subjected to sudden and prolonged periods of reduced light.
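
The photosynthetic variables reported here are typically derived from PAM rapid light curves. As a minimal sketch (not the study's own analysis, with invented irradiance and yield values), relative electron transport rate is commonly taken as rETR = ΔF/Fm′ × PAR × 0.5, a saturation curve such as the Jassby-Platt model is fitted, and Ek follows as ETRmax/α:

# Minimal sketch: deriving ETRmax, alpha and Ek from PAM rapid-light-curve data.
# Assumes the common conventions rETR = (dF/Fm') * PAR * 0.5 (half of absorbed photons to PSII)
# and the Jassby & Platt (1976) saturation model ETR = ETRmax * tanh(alpha * E / ETRmax).
# Irradiance steps and yields below are illustrative, not data from the study.
import numpy as np
from scipy.optimize import curve_fit

par = np.array([0, 25, 50, 100, 200, 400, 800, 1200])                   # umol photons m-2 s-1
yield_eff = np.array([0.70, 0.66, 0.61, 0.52, 0.38, 0.24, 0.13, 0.09])  # effective quantum yield dF/Fm'

retr = yield_eff * par * 0.5                                             # relative electron transport rate

def jassby_platt(E, etr_max, alpha):
    return etr_max * np.tanh(alpha * E / etr_max)

(etr_max, alpha), _ = curve_fit(jassby_platt, par, retr, p0=[50.0, 0.3])
ek = etr_max / alpha                                                     # saturating irradiance Ek = ETRmax / alpha

print(f"ETRmax = {etr_max:.1f}, alpha = {alpha:.2f}, Ek = {ek:.0f} umol photons m-2 s-1")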

Relevance: 30.00%

Abstract:

Nutrient mass balances have been used to assess a variety of land resource scenarios at various scales. They are widely used as a simple basis for policy, planning, and regulatory decisions, but it is not clear how accurately they reflect reality. This study provides a critique of broad-scale nutrient mass balances, with particular application to the use of beef lot-feeding manure as fertiliser in Queensland. Mass balances completed at the district and farm scale were found to misrepresent actual manure management behaviour and, potentially, the risk of nutrient contamination of water resources. The difficulties of handling stockpiled manure and concerns about soil compaction mean that manure is spread thickly over a few paddocks at a time and not evenly across a whole farm. Consequently, higher nutrient loads were applied to a single paddock less frequently than annually. This resulted in years with excess nitrogen, phosphorus, and potassium remaining in the soil profile. This conclusion was supported by evidence of significant nutrient movement in several of the soil profiles studied. Spreading manure is profitable, but maximum returns can be associated with increased risk of nutrient leaching relative to conventional inorganic fertiliser practices. Bio-economic simulations found this increased risk where manure was applied to supply crop nitrogen requirements (the practice of the case study farms, 200-5000 head lot-feeders). Thus, the use of broad-scale mass balances can be misleading because paddock management is spatially heterogeneous, and this leads to increased local potential for nutrient loss. In response to the effect of spatial heterogeneity, policy makers who intend to use mass balance techniques to estimate the potential for nutrient contamination should apply these techniques conservatively.
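
A toy calculation (assumed figures, not the study's data) illustrates the core point: averaging a manure application over the whole farm can show a nutrient deficit while the few paddocks that actually received the manure carry a large surplus:

# Illustrative sketch (invented numbers): why a whole-farm nutrient balance can hide
# paddock-scale surpluses when feedlot manure is spread thickly on a few paddocks at a time.
manure_n_applied = 20000.0     # kg N spread on the farm in one year (assumed)
farm_area_ha = 1000.0          # whole-farm cropped area (assumed)
spread_area_ha = 100.0         # paddocks actually receiving manure that year (assumed)
crop_n_removal = 120.0         # kg N/ha removed in harvested grain (assumed)

farm_scale_balance = manure_n_applied / farm_area_ha - crop_n_removal
paddock_scale_balance = manure_n_applied / spread_area_ha - crop_n_removal

print(f"Farm-scale balance:    {farm_scale_balance:+.0f} kg N/ha/yr")    # looks like a deficit
print(f"Paddock-scale balance: {paddock_scale_balance:+.0f} kg N/ha/yr") # large local surplus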

Relevance: 30.00%

Abstract:

Liquid forms of phosphorus (P) have been shown to be more effective than granular P for promoting cereal growth in alkaline soils with high levels of free calcium carbonate on Eyre Peninsula, South Australia. However, the advantage of liquid over granular forms of P fertiliser has not been fully investigated across the wide range of soils used for grain production in Australia. A glasshouse pot experiment tested whether liquid P fertilisers were more effective for growing spring wheat (Triticum aestivum L.) than granular P (monoammonium phosphate) in 28 soils from across Australia with soil pH (H2O) ranging from 5.2 to 8.9. Application of liquid P resulted in greater shoot biomass, as measured after 4 weeks' growth (mid to late tillering, Feekes growth stage 2-3), than granular P in 3 of the acidic to neutral soils and in 3 alkaline soils. Shoot dry matter responses of spring wheat to applied liquid or granular P were related to soil properties to determine if any of the properties predicted superior yield responses to liquid P. The calcium carbonate content of soil was the only soil property that significantly contributed to predicting when liquid P was more effective than granular P. Five soil P test procedures (Bray, Colwell, resin, isotopically exchangeable P, and diffusive gradients in thin films (DGT)) were used to measure soil test P on subsamples of soil collected before the experiment started. These soil test values were then related to the dry matter shoot yields to assess their ability to predict wheat yield responses to P applied as liquid or granular P. All 5 soil test procedures provided a reasonable prediction of dry matter responses to applied P as either liquid or granular P, with the resin P test having a slightly greater predictive capacity on the range of soils tested. The findings of this investigation suggest that liquid P fertilisers do have some potential applications in non-calcareous soils and confirm current recommendations for use of liquid P fertiliser to grow cereal crops in highly calcareous soils. Soil P testing procedures require local calibration for response to the P source that is going to be used to amend P deficiency.
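
As a rough illustration of the kind of comparison described (all values invented, not the trial results), the liquid-versus-granular advantage can be expressed per soil as a shoot dry matter ratio and screened against free calcium carbonate content:

# Illustrative sketch (assumed numbers): flagging soils where liquid P outperformed granular
# monoammonium phosphate (MAP) for shoot dry matter, alongside soil carbonate content.
import numpy as np

soils = ["acid-1", "neutral-1", "calcareous-1", "calcareous-2"]
caco3_pct = np.array([0.0, 0.5, 8.0, 25.0])          # free calcium carbonate (%)
dm_liquid = np.array([1.9, 2.1, 2.4, 2.2])           # shoot dry matter with liquid P (g/pot)
dm_granular = np.array([1.8, 2.0, 1.7, 1.3])         # shoot dry matter with granular MAP (g/pot)

advantage = dm_liquid / dm_granular                   # >1 means liquid P gave more dry matter
for name, caco3, adv in zip(soils, caco3_pct, advantage):
    flag = "liquid P superior" if adv > 1.1 else "no clear advantage"
    print(f"{name:13s} CaCO3 {caco3:4.1f}%  liquid/granular = {adv:.2f}  -> {flag}")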

Relevance: 30.00%

Abstract:

Understanding plant demography and plant response to herbivory is critical to the selection of effective weed biological control agents. We adopt the metaphor of 'filters' to suggest how agent prioritisation may be improved to narrow our choices down to those likely to be most effective in achieving the desired weed management outcome. Models can serve to capture our level of knowledge (or ignorance) about our study system, and we illustrate how one type of modelling approach (matrix models) may be useful in identifying the weak link in a plant life cycle, using a hypothetical example and an actual weed example (Parkinsonia aculeata). Once the vulnerable stage has been identified, we propose that studying plant response to herbivory (simulated and/or actual) can help identify the guilds of herbivores to which a plant is most likely to succumb. Taking only potentially effective agents through the filter of host specificity may improve the chances of releasing safe and effective agents. The methods we outline may not always lead us definitively to the successful agent(s), but such an empirical, data-driven approach will make the basis for agent selection explicit and provide testable hypotheses once agents are released.
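
A minimal sketch of the matrix-model filter described here, using a hypothetical three-stage weed life cycle rather than the actual Parkinsonia aculeata parameterisation: the dominant eigenvalue of the projection matrix gives the population growth rate, and elasticities flag the life-cycle transitions whose suppression would most reduce it:

# Hypothetical three-stage weed life cycle (seedbank, juvenile, adult); rates are illustrative only.
# Elasticities of the growth rate (lambda) identify the 'weak link' transitions that an effective
# biological control agent would need to attack.
import numpy as np

# Columns = stage now, rows = stage next year.
A = np.array([
    [0.30, 0.00, 60.0],   # seed survival in the seedbank; adult fecundity (seeds per adult)
    [0.05, 0.40, 0.00],   # germination/establishment; juvenile stasis
    [0.00, 0.20, 0.90],   # maturation; adult survival
])

eigvals = np.linalg.eigvals(A)
lam = eigvals.real.max()              # dominant eigenvalue = asymptotic population growth rate

# Elasticities via a 1% proportional perturbation of each non-zero transition.
elasticity = np.zeros_like(A)
for i, j in zip(*np.nonzero(A)):
    P = A.copy()
    P[i, j] *= 1.01
    lam_p = np.linalg.eigvals(P).real.max()
    elasticity[i, j] = (lam_p - lam) / lam / 0.01

print(f"lambda = {lam:.2f}")
print("elasticity matrix (largest entries mark the most influential transitions):")
print(np.round(elasticity, 3))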

Relevance: 30.00%

Abstract:

Fumigation of stored grain with phosphine (PH3) is used widely to control the lesser grain borer Rhyzopertha dominica. However, development of high-level resistance to phosphine in this species threatens control. Effective resistance management relies on knowledge of the expression of resistance in relation to dosage at all life stages. Therefore, we determined the mode of inheritance of phosphine resistance and the strength of the resistance phenotype at each developmental stage. We achieved this by comparing mortality and developmental delay between a strongly resistant strain (R-strain), a susceptible strain (S-strain) and their F1 progenies. Resistance was a maternally inherited, semi-dominant trait in the egg stage but was inherited as an autosomal, incompletely recessive trait in larvae and pupae. The rank order of developmental tolerance in both the sensitive and resistant strains was eggs > pupae > larvae. Comparison of published values for the response of adult R. dominica relative to our results from immature stages reveals that the adult stage of the S-strain is more sensitive to phosphine than are larvae. This situation is reversed in the R-strain, as the adult stage is much more resistant to phosphine than even the most tolerant immature stage. Phosphine resistance factors at LC50 were 400× for eggs, 87× for larvae and 181× for pupae with respect to reference susceptible strain (S-strain) adults, indicating that tolerance conferred by a particular immature stage neither strongly nor reliably interacts with the genetic resistance element. Developmental delay relative to unfumigated control insects was observed in 93% of resistant pupae, 86% of resistant larvae and 41% of resistant eggs. Increased delay in development and the toxicity response to phosphine exposure were both incompletely recessive. We show that resistance to phosphine has pleiotropic effects and that the expression of these effects varies with genotype and throughout the life history of the insect.
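
Resistance factors of the kind quoted above are usually obtained by fitting dose-mortality curves to each strain and taking the ratio of the LC50 estimates. A minimal sketch with invented bioassay data (not the paper's) and a simple two-parameter log-logistic fit:

# Illustrative sketch: estimating LC50 for a resistant and a susceptible strain and expressing
# resistance as the ratio LC50(R)/LC50(S). Concentrations and mortalities are made up.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, lc50, slope):
    # Two-parameter log-logistic mortality curve.
    return 1.0 / (1.0 + (lc50 / conc) ** slope)

conc = np.array([0.005, 0.01, 0.02, 0.05, 0.1, 0.5, 1.0, 2.0])        # mg/L phosphine
mort_s = np.array([0.05, 0.20, 0.55, 0.90, 0.98, 1.00, 1.00, 1.00])   # susceptible strain
mort_r = np.array([0.00, 0.00, 0.02, 0.05, 0.10, 0.45, 0.70, 0.95])   # resistant strain

(lc50_s, _), _ = curve_fit(log_logistic, conc, mort_s, p0=[0.02, 2.0])
(lc50_r, _), _ = curve_fit(log_logistic, conc, mort_r, p0=[0.5, 2.0])

print(f"LC50 susceptible = {lc50_s:.3f} mg/L, LC50 resistant = {lc50_r:.3f} mg/L")
print(f"resistance factor = {lc50_r / lc50_s:.0f}x")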

Relevance: 30.00%

Abstract:

More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY, (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia's main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted over that period were reflected in increasing potential yields with research era, rising from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, to 4.3 t/ha in New South Wales (NSW) in the 1990s, and to 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein data (~45% of all experiments) established that grain yield, and not grain N content, was the major driver of crop N demand and CV90. Subsets of data were used to explore the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. These analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
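
As a sketch of how a critical value such as CV90 can be derived (illustrative numbers only, not the BFDC calibrations), a response curve is fitted between relative yield and pre-plant nitrate-N and solved for the soil test value giving 90% relative yield:

# Illustrative sketch: fit a Mitscherlich-type curve RY = 100 * (1 - exp(-c * nitrateN)) to
# invented trial data and solve for the nitrate-N value giving RY = 90 (the CV90).
import numpy as np
from scipy.optimize import curve_fit

nitrate_n = np.array([15, 25, 35, 50, 65, 80, 100, 130, 160])   # kg nitrate-N/ha, 0-0.6 m depth
rel_yield = np.array([42, 55, 68, 80, 86, 91, 95, 97, 99])      # relative yield (%), Y0/Ymax * 100

def mitscherlich(x, c):
    return 100.0 * (1.0 - np.exp(-c * x))

(c,), _ = curve_fit(mitscherlich, nitrate_n, rel_yield, p0=[0.02])
cv90 = -np.log(1.0 - 0.90) / c            # solve 90 = 100 * (1 - exp(-c * x)) for x

print(f"fitted c = {c:.4f}; CV90 = {cv90:.0f} kg nitrate-N/ha in the top 0.6 m")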

Relevance: 30.00%

Abstract:

A recent report to the Australian Government identified concerns relating to Australia's capacity to respond to a medium to large outbreak of foot-and-mouth disease (FMD). To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario that included 62 infected premises in five different states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product/equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months. This unconstrained response was estimated to involve 2724 personnel. Unlimited personnel was considered unrealistic, and therefore the course of the outbreak was modelled using three levels of staffing, and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication in 6 months. Deployment of an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication in 3 months to 68%, and to 100% in 6 months. Deployment of further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas coupled with the baseline personnel resources increased the probability of eradication in 3 months to 74% and to 100% in 6 months. This required 25 vaccination teams commencing 12 days into the control program, increasing to 50 vaccination teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected premises operations was equally effective in reducing the outbreak size and duration.
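
AusSpread itself is a detailed spatial simulation, but the resourcing question it was used to answer can be caricatured with a very simple stochastic model: assumed spread and resolution rates (invented here, not calibrated to FMD) drive the estimated probability of eradication within 3 or 6 months for a given number of response personnel:

# Highly simplified Monte Carlo sketch (not AusSpread): probability of eradicating an outbreak
# within a time horizon as a function of response personnel. All rates are invented.
import numpy as np

def simulate_outbreak(personnel, rng, days=180, seed_ips=62, spread_rate=0.04, staff_per_ip=250):
    # Return the day the last infected premises (IP) is resolved, or days + 1 if not eradicated.
    active = seed_ips
    capacity = personnel // staff_per_ip              # IPs the response can resolve per day
    for day in range(1, days + 1):
        new = rng.binomial(active, spread_rate)       # new IPs seeded by currently active IPs
        active = active + new - min(active, capacity)
        if active == 0:
            return day
    return days + 1

def eradication_probability(personnel, horizon_days, runs=500, seed=1):
    rng = np.random.default_rng(seed)
    outcomes = [simulate_outbreak(personnel, rng) for _ in range(runs)]
    return float(np.mean([d <= horizon_days for d in outcomes]))

for staff in (600, 1200, 2400):
    p3 = eradication_probability(staff, 90)
    p6 = eradication_probability(staff, 180)
    print(f"{staff:4d} personnel: P(eradicated in 3 months) = {p3:.2f}, in 6 months = {p6:.2f}")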