115 results for Losses Typology
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Rainfall simulation experiments were carried out to measure runoff and soil-water fluxes of suspended solids, total nitrogen, total phosphorus, dissolved organic carbon and total iron from sites in Pinus plantations on the coastal lowlands of south-eastern Queensland subjected to various operations (treatments). The operations investigated were cultivated and nil-cultivated site preparation, fertilised site preparation, clearfall harvesting and prescribed burning; these treatments were compared with an 8-y-old established plantation. Flow-weighted mean concentrations of total nitrogen and total phosphorus in surface runoff from the cultivated and nil-cultivated site-preparation, clearfall harvest, prescribed burning and 8-y-old established plantation treatments were very similar. However, both the soil water and the runoff from the fertilised site-preparation treatment contained more nitrogen (N) and phosphorus (P) than the other treatments, with 3.10 mg N/L and 4.32 mg P/L (4 and 20 times more) in the runoff. Dissolved organic carbon concentrations in runoff from the nil-cultivated site-preparation and prescribed burn treatments were elevated. Iron concentrations were highest in runoff from the nil-cultivated site-preparation and 8-y-old established plantation treatments. Concentrations of suspended solids in runoff were higher from the cultivated site-preparation and prescribed burn treatments, reflecting the greater disturbance of surface soil at these sites. The concentrations of all analytes were highest in initial runoff from plots and generally decreased with time. Total nitrogen (mean 7.28, range 0.11-13.27 mg/L) and total phosphorus (mean 11.60, range 0.06-83.99 mg/L) concentrations in soil water were between 2 and 10 times greater than in surface runoff, which highlights the potential for nutrient fluxes in interflow (i.e. in the soil above the water table) through the general plantation area. Implications for forest management are discussed, along with results of larger catchment-scale studies.
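The flow-weighted mean concentrations reported above are a standard water-quality statistic: the total analyte mass carried during the event divided by the total flow volume, so samples taken at high flow carry more weight. A minimal sketch of the calculation, with hypothetical sample values (none of the numbers below are from the study):

```python
# Flow-weighted mean concentration (FWMC):
# FWMC = sum(c_i * q_i * dt_i) / sum(q_i * dt_i)

def flow_weighted_mean(concs, flows, intervals):
    """concs in mg/L, flows in L/min, intervals in min (illustrative units)."""
    mass = sum(c * q * dt for c, q, dt in zip(concs, flows, intervals))
    volume = sum(q * dt for q, dt in zip(flows, intervals))
    return mass / volume

# Hypothetical runoff samples: concentration falls as the event proceeds,
# mirroring the pattern reported above (highest analyte levels in initial runoff).
concs = [4.0, 2.5, 1.2, 0.8]   # mg N/L
flows = [2.0, 6.0, 5.0, 3.0]   # L/min
intervals = [5, 5, 5, 5]       # min

print(f"FWMC = {flow_weighted_mean(concs, flows, intervals):.2f} mg N/L")
```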
Abstract:
Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, may have the potential to eutrophy sensitive coastal marine habitats nearby. A case-study of the potential extent of such losses was investigated in a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms at combined intensities and durations that were 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) frequencies of rainfall in the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms. Therefore, the largest part of each rainfall event was attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal loading of nitrogen in surface runoff when compared to other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) generated in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway. Concentrations of TSN and TDN in the waterway were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated the occurrence of nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the current specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway. Therefore, while the estimate of annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen and the proximity of such waterways (8 km) to the coastline may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.
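Annual load estimates of this kind follow from the runoff fraction and the weighted average concentration. A back-of-envelope sketch of that arithmetic, using the reported runoff fraction (5.3%) and TDN concentration (1.11 mg N/L); the annual depth of runoff-generating rainfall is not stated above, so the 450 mm used here is an assumed figure, and sediment-bound N is ignored for simplicity:

```python
# Dissolved N load in runoff, L (kg N/ha) = P * f * C * 0.01, where
# P = rainfall depth (mm), f = runoff fraction, C = TDN (mg N/L).
# 1 mm of water on 1 ha is 10,000 L and 1 kg = 1e6 mg, hence the 0.01 factor.

def dissolved_n_load(rain_mm, runoff_frac, tdn_mg_per_l):
    return rain_mm * runoff_frac * tdn_mg_per_l * 0.01  # kg N/ha

# Reported values: runoff 5.3% of episodic rainfall, TDN 1.11 mg N/L.
# rain_mm=450 is an assumption for illustration, not a value from the study.
load = dissolved_n_load(rain_mm=450, runoff_frac=0.053, tdn_mg_per_l=1.11)
print(f"~{load:.2f} kg N/ha.year (dissolved fraction only)")
```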
Abstract:
This paper quantifies gaseous N losses due to ammonia volatilisation and denitrification under controlled conditions at 30°C and 75% to 150% of field capacity (FC). Biosolids were mixed with two contrasting soils from subtropical Australia at a rate designed to meet crop N requirements for irrigated cotton or maize (i.e., equivalent to 180 kg N/ha). In the first experiment, aerobically (AE) and anaerobically (AN) digested biosolids were mixed into a heavy Vertosol soil and then incubated for 105 days. Ammonia volatilisation over 72 days accounted for less than 4% of the applied NH4-N, but 24% (AN) to 29% (AE) of the total applied biosolids N was lost through denitrification in 105 days. In the second experiment, AN biosolids with and without added polyacrylamide polymer were mixed with either a heavy Vertosol or a lighter Red Ferrosol and then incubated for 98 days. The N loss was higher from the Vertosol (16-29% of total N applied) than from the Red Ferrosol (7-10% of total N applied), while addition of polymer to the biosolids increased N loss from 7 to 10% in the Red Ferrosol and from 16 to 29% in the Vertosol. A major product of the denitrification process was N2 gas, accounting for >90% of the emitted N gases in both experiments. Our findings demonstrate that denitrification could be a major pathway of gaseous N losses under warm and moist conditions.
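Loss percentages of this kind come from a simple mass balance: cumulative emitted N divided by the N applied with the biosolids (180 kg N/ha here). A minimal sketch, with illustrative cumulative emission totals rather than the experiment's measured values:

```python
# Gaseous N loss as a percentage of N applied: cumulative emitted N
# (NH3 from traps, N2 + N2O from headspace analysis) over the application.

N_APPLIED = 180.0  # kg N/ha equivalent, the application rate stated above

def loss_percent(cumulative_emitted_kg_ha):
    return 100.0 * cumulative_emitted_kg_ha / N_APPLIED

# Illustrative cumulative emissions over an incubation (not measured values):
denitrified = 45.0   # kg N/ha as N2 + N2O
volatilised = 3.5    # kg N/ha as NH3

print(f"denitrification loss: {loss_percent(denitrified):.0f}% of applied N")
print(f"volatilisation loss:  {loss_percent(volatilised):.0f}% of applied N")
```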
Abstract:
Although rust (caused by Puccinia purpurea) is a common disease in Australian grain sorghum crops, particularly late in the growing season (April onwards), its potential to reduce yield has not been quantified. Field trials were conducted in Queensland between 2003 and 2005 to evaluate the effect of sorghum rust on grain yield of two susceptible sorghum hybrids (Tx610 and Pride). Rust was managed from 28-35 days after sowing until physiological maturity by applying oxycarboxin (1 kg active ingredient/100 L of water/ha) every 10 days. When data were combined for the hybrids, yield losses ranged from 3.2% in 2003 to 13.1% in 2005, but differences in yield between the sprayed and unsprayed treatments were statistically significant (P ≤ 0.05) only in 2005. Final area under the disease progress curve (AUDPC) values reflected the yield losses in each year. The higher yield loss in 2005 can be attributed primarily to the early development of the rust epidemic and the higher inoculum levels in spreader plots at the time of planting of the trials.
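AUDPC is conventionally computed by trapezoidal integration of disease severity over the assessment dates. A minimal sketch; the assessment days and severity scores below are hypothetical, not data from these trials:

```python
# Area under the disease progress curve (AUDPC), trapezoidal rule:
# AUDPC = sum over assessments of ((y_i + y_{i+1}) / 2) * (t_{i+1} - t_i)

def audpc(times, severities):
    """times in days after sowing, severities as % leaf area rusted."""
    return sum((severities[i] + severities[i + 1]) / 2 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

# Hypothetical rust severity assessments on an unsprayed plot:
days = [35, 45, 55, 65, 75, 85]
severity = [0.5, 2.0, 8.0, 20.0, 38.0, 55.0]  # %

print(f"AUDPC = {audpc(days, severity):.0f} %-days")
```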
Abstract:
Ammonia volatilisation from manure materials within poultry sheds can adversely affect production, and also represents a loss of fertiliser value from the spent litter. This study sought to compare the ability of alum and bentonite to decrease volatilisation losses of ammonia from spent poultry litter. An in-vessel volatilisation trial with air flushing, ammonia collection, and ammonia analysis was conducted over 64 days to evaluate the mitigation potential of these two materials. Water-saturated spent litter was incubated at 25°C in untreated condition (control) or with three treatments: an industry-accepted rate of alum [4% Al2(SO4)3·18H2O by litter dry mass; ALUM], air-dry bentonite (127% by dry mass; BENT), or water-saturated bentonite (also at 127% by dry mass; SATBENT). A high proportion (62%) of the nitrogen contained in the untreated spent litter was volatilised. Bentonite additions were superior to alum additions at retaining spent litter ammonia (nitrogen losses: 15%, SATBENT; 34%, BENT; 54%, ALUM). Where production considerations favour comparably high rates of bentonite addition (e.g. where the litter is to be re-formulated as a fertiliser), this clay has potential to decrease ammonia volatilisation in-shed, in spent litter stockpiles or in formulated products, without the detrimental effect that alum has on phosphorus availability.
Abstract:
Clays could underpin a viable agricultural greenhouse gas (GHG) abatement technology given their affinity for nitrogen and carbon compounds. We provide the first investigation into the efficacy of clays to decrease agricultural nitrogen GHG emissions (i.e., N2O and NH3). Via laboratory experiments using an automated closed-vessel analysis system, we tested the capacity of two clays (vermiculite and bentonite) to decrease N2O and NH3 emissions and organic carbon losses from livestock manures (beef, pig, poultry, and egg layer) incorporated into an agricultural soil. Clay addition levels varied, with a maximum of 1:1 clay to manure (dry weight). Cumulative gas emissions were modelled using the biological logistic function, with 15 of 16 treatments successfully fitted (P < 0.05) by this model. When assessing all of the manures together, NH3 emissions were lower (×2) at the highest clay addition level compared with no clay addition, but this difference was not significant (P = 0.17). Nitrous oxide emissions were significantly lower (×3; P < 0.05) at the highest clay addition level compared with no clay addition. When assessing manures individually, we observed generally decreasing trends in NH3 and N2O emissions with increasing clay addition, albeit with widely varying statistical significance between manure types. Most of the treatments also showed strong evidence of increased C retention with increasing clay additions, with up to 10 times more carbon retained in treatments containing clay compared with treatments containing none. This preliminary assessment of the efficacy of clays to mitigate agricultural GHG emissions indicates strong promise.
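A logistic function for cumulative emissions is typically of the form y(t) = A / (1 + exp(-k(t - t0))), with A the asymptotic total emission, k the rate constant, and t0 the time of maximum emission rate. A minimal curve-fitting sketch against synthetic data; the model form is standard, but the data and parameter values below are invented for illustration and are not from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic model for cumulative gas emission.
def logistic(t, A, k, t0):
    return A / (1.0 + np.exp(-k * (t - t0)))

# Synthetic cumulative N2O data (mg N2O-N/kg) with noise, illustration only.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 30)                        # days of incubation
y = logistic(t, A=12.0, k=0.2, t0=20.0) + rng.normal(0, 0.3, t.size)

# Fit the three parameters from the observed cumulative curve.
params, _ = curve_fit(logistic, t, y, p0=[10.0, 0.1, 15.0])
A, k, t0 = params
print(f"fitted: A={A:.1f} mg/kg, k={k:.2f}/day, t0={t0:.1f} days")
```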
Abstract:
Herbicide runoff from cropping fields has been identified as a threat to the Great Barrier Reef ecosystem. A field investigation was carried out to monitor the changes in runoff water quality resulting from four different sugarcane cropping systems that included different herbicides and contrasting tillage and trash management practices. These were: (i) Conventional - tillage of beds and inter-rows, with residual herbicides used; (ii) Improved - only the beds were tilled (zonal), with reduced residual herbicide use; (iii) Aspirational - minimum tillage (one pass of a single-tine ripper before planting) with trash mulch, no residual herbicides and a legume intercrop after cane establishment; and (iv) New Farming System (NFS) - minimum tillage as in the Aspirational practice, with a grain legume rotation and a combination of residual and knockdown herbicides. Results suggest soil and trash management had a larger effect on herbicide losses in runoff than the physico-chemical properties of the herbicides. Improved practices with 30% lower atrazine application rates than used in conventional systems reduced runoff volumes by 40% and atrazine loss by 62%. There was a 2-fold variation in atrazine and >10-fold variation in metribuzin loads in runoff water between reduced tillage systems differing in soil disturbance and surface residue cover from the previous rotation crops, despite the same herbicide application rates. The elevated risk of offsite losses from herbicides was illustrated by the high concentrations of diuron (14 µg/L) recorded in runoff that occurred >2.5 months after herbicide application in a first ratoon crop. A cropping system employing less persistent non-selective herbicides and an inter-row soybean mulch resulted in no residual herbicide contamination in runoff water, but recorded 12.3% lower yield compared to the Conventional practice. These findings reveal a trade-off between achieving good water quality with minimal herbicide contamination and maintaining farm profitability with good weed control.
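Event herbicide loads of this kind are the product of event runoff volume and flow-weighted concentration. A minimal sketch comparing two practices; the runoff depths and concentrations are illustrative numbers chosen to mirror the reported 40% runoff and 62% atrazine-load reductions, not the trial's data:

```python
# Event herbicide load (g/ha) = runoff depth (mm) * concentration (ug/L) * 0.01
# (1 mm over 1 ha = 10,000 L; 1 g = 1e6 ug).

def event_load(runoff_mm, conc_ug_per_l):
    return runoff_mm * conc_ug_per_l * 0.01  # g/ha

# Illustrative atrazine figures for one runoff event under two practices:
conventional = event_load(runoff_mm=25.0, conc_ug_per_l=30.0)
improved = event_load(runoff_mm=15.0, conc_ug_per_l=19.0)  # ~40% less runoff

reduction = 100.0 * (1.0 - improved / conventional)
print(f"conventional: {conventional:.1f} g/ha, improved: {improved:.1f} g/ha "
      f"({reduction:.0f}% lower load)")
```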
Abstract:
Climate change and carbon (C) sequestration are a major focus of research in the twenty-first century. Globally, soils store about 300 times the amount of C that is released per annum through the burning of fossil fuels (Schulze and Freibauer 2005). Land clearing and the introduction of agricultural systems have led to rapid declines in soil C reserves. The recent introduction of conservation agricultural practices has not reversed the decline in soil C content, although it has minimised the rate of decline (Baker et al. 2007; Hulugalle and Scott 2008). Lal (2003) estimated the quantum of C pools in the atmosphere, terrestrial ecosystems, and oceans and reported a “missing C” component in the world C budget. Though not yet proven, this could be linked to C losses through runoff and soil erosion (Lal 2005) and a lack of C accounting in inland water bodies (Cole et al. 2007). Land management practices to minimise microbial respiration and soil organic C (SOC) decline, such as minimum tillage or no tillage, were extensively studied in the past, but the soil erosion and runoff studies monitoring those management systems focused on other nutrients such as nitrogen (N) and phosphorus (P).
Abstract:
Fortunately, plants have developed highly effective mechanisms with which to defend themselves when attacked by potentially disease-causing microorganisms. If they had not, they would succumb to the many pathogenic fungi, bacteria, viruses, nematodes and insect pests, and disease would prevail. These natural defence systems of plants can be deliberately activated to provide some protection against the major pathogens responsible for causing severe yield losses in agricultural and horticultural crops. This is the basis of what is known as ‘induced’ or ‘acquired’ disease resistance in plants. Although the phenomenon of induced resistance has been known amongst plant pathologists for over 100 years, its inclusion in pest and disease management programmes has been a relatively recent development, i.e. within the last 5 years. This review will discuss very briefly some of the characteristics of the induced resistance phenomenon, outline some of the advantages of and limitations to its implementation, and provide some examples within a postharvest pathology context. Finally, some approaches being investigated by the fruit pathology team at DPI Indooroopilly and collaborators will be outlined.
Abstract:
Castration of male beef cattle is advantageous for management; however, pre-pubertal (early) castration results in comparative losses in growth rate (Jago et al., 1996). Post-pubertal (late) castration may maintain growth rate but can lead to management problems. The behavioural differences between early castrates (9 months) and late castrates (18 months) that may have an effect on growth rate were studied.
Abstract:
Root-knot nematodes (Meloidogyne spp.) are obligate, sedentary endoparasites that infect many plant species, causing large economic losses worldwide. Available nematicides are being banned due to their toxicity or ozone-depleting properties, and alternative control strategies are urgently required. We have produced transgenic tobacco (Nicotiana tabacum) plants expressing different dsRNA hairpin structures targeting a root-knot nematode (Meloidogyne javanica) putative transcription factor, MjTis11. We provide evidence that MjTis11 was consistently silenced in nematodes feeding on the roots of transgenic plants. The observed silencing was specific for MjTis11, with other sequence-unrelated genes being unaffected in the nematodes. Those transgenic plants able to induce silencing of MjTis11 also showed the presence of small interfering RNAs. Even though down-regulation of MjTis11 did not result in a lethal phenotype, this study demonstrates the feasibility of silencing root-knot nematode genes by expressing dsRNA in the host plant. Host-delivered RNA interference-triggered (HD-RNAi) silencing of parasite genes provides a novel disease resistance strategy with wide biotechnological applications. The potential of HD-RNAi is not restricted to parasitic nematodes but could be adapted to control other plant-feeding pests.
Abstract:
Twelve years ago our understanding of ratoon stunting disease (RSD) was confined almost exclusively to diagnosis of the disease and control via farm hygiene, with little understanding of the biology of the interaction between the causal agent (Leifsonia xyli subsp. xyli) and the host plant sugarcane (Saccharum spp. hybrids). Since then, research has focused on developing the molecular tools to dissect L. xyli subsp. xyli, so that better control strategies can be developed to prevent losses from RSD. Within this review, we give a brief overview of the progression in research on L. xyli subsp. xyli and highlight future challenges. After a brief historical background on RSD, we discuss the development of molecular tools such as transformation and transposon mutagenesis and discuss the apparent lack of genetic diversity within the L. xyli subsp. xyli world population. We go on to discuss the sequencing of the genome of L. xyli subsp. xyli, describe the key findings and suggest some future research based on known deficiencies that will capitalise on this tremendous knowledge base to which we now have access.
Abstract:
Fusarium wilt of cotton, caused by the fungus Fusarium oxysporum Schlechtend. f. sp. vasinfectum (Atk.) Snyd. & Hans., was first identified in 1892 in cotton growing in sandy acid soils in Alabama (8). Although the disease was soon discovered in other major cotton-producing areas, it did not become global until the end of the next century. After its original discovery, Fusarium wilt of cotton was reported in Egypt (1902) (30), India (1908) (60), Tanzania (1954) (110), California (1959) (33), Sudan (1960) (44), Israel (1970) (27), Brazil (1978) (5), China (1981) (17), and Australia (1993) (56). In addition to a worldwide distribution, Fusarium wilt occurs in all four of the domesticated cottons, Gossypium arboreum L., G. barbadense L., G. herbaceum L., and G. hirsutum L. (4,30). Disease losses in cotton are highly variable within a country or region. In severely infested fields planted with susceptible cultivars, yield losses can be high. In California, complete crop losses in individual fields have been observed (R. M. Davis, unpublished). Disease loss estimates prepared by the National Cotton Disease Council indicate losses of over 109,000 bales (227 kg or 500 lb each) in the United States in 2004 (12).
Abstract:
Equid herpesvirus 1 (EHV1) is a major pathogen of equids worldwide, causing considerable losses to the horse industry. A variety of techniques, including PCR, have been used to diagnose EHV1. Some of these PCRs were used in combination with other techniques such as restriction enzyme analysis (REA) or hybridisation, making them cumbersome for routine diagnostic testing and increasing the chances of cross-contamination. Furthermore, they involve the use of suspected carcinogens such as ethidium bromide and ultraviolet light. In this paper, we describe a real-time PCR that uses minor groove-binding (MGB) probe technology for the diagnosis of EHV1. This technique does not require post-PCR manipulations, thereby reducing the risk of cross-contamination. Most importantly, the technique is specific; it was able to differentiate EHV1 from the closely related member of the Alphaherpesvirinae, equid herpesvirus 4 (EHV4). It was not reactive with common opportunistic pathogens such as Escherichia coli, Klebsiella oxytoca, Pseudomonas aeruginosa and Enterobacter agglomerans, which are often involved in abortion. Similarly, it did not react with equine pathogens such as Streptococcus equi, Streptococcus equisimilis, Streptococcus zooepidemicus, Taylorella equigenitalis and Rhodococcus equi, which also cause abortion. The results obtained with this technique agreed with results from published PCR methods. The assay was sensitive enough to detect EHV1 sequences in paraffin-embedded tissues and clinical samples, and was more sensitive than virus isolation. This test will be useful for the routine diagnosis of EHV1 on the basis of its specificity, sensitivity, ease of performance and rapidity.