905 results for Embryonic Mortality, Fungal Infection, Habitat Choice, Soil pH, Terrestrial Nesting
Abstract:
There are approximately 90,000 ha of grapes in Brazil, including wine, juice and table grapes. American varieties (Isabella, Niagara, Ives) comprise the largest part of Brazilian viticulture and are destined for wine, juice and table grape production. In Southern Brazil, these varieties are grown mainly in non-grafted vineyards. Grape phylloxera is common on the roots of these varieties; however, the insect is not regarded as a serious problem there. Leaf galls are common on V. vinifera cultivars, particularly Cabernet Sauvignon, and this infestation can be severe in some years, causing defoliation. No information is available on leaf damage by the insect in relation to vineyard production and longevity. New selections from a breeding program aimed at developing hybrids for wine production are highly susceptible to damage from leaf-galling phylloxera. When leaf galling is severe, growers spray pyrethroid and neonicotinoid insecticides; however, in many situations, secondary mite outbreaks can also damage the crop as a consequence of these foliar applications of broad-spectrum insecticides. Studies on the genetic diversity of grape phylloxera strains in Brazil and their association with vine damage and secondary fungal infection must be conducted to clarify the importance of this pest to Brazilian viticulture.
Abstract:
Polar regions are the most important soil carbon reservoirs on Earth. Monitoring soil carbon storage in the context of a changing global climate may indicate possible effects of climate change on terrestrial environments. To this end, we need to understand the dynamics of soil organic matter in relation to its chemical characteristics. We evaluated the influence of the chemical characteristics of humic substances on the mineralization of soil organic matter in selected Maritime Antarctic soils. A laboratory assay was carried out with soils from five locations on King George Island. We determined the contents of total organic carbon, oxidizable carbon fractions of soil organic matter, and humic substances. Two in situ field experiments were carried out during two summers to evaluate CO2-C emissions in relation to soil temperature variations. Soil organic matter in Maritime Antarctic soils occurs in overall low amounts and has a low degree of humification and reduced microbial activity. CO2-C emissions showed a significant exponential relationship with temperature, suggesting a sharp increase in CO2-C emissions under a warming scenario, and Q10 values (the factor by which emissions increase for a 10°C rise in soil temperature) were higher than values reported elsewhere. The temperature sensitivity of CO2-C emissions was significantly correlated with the humification degree of soil organic matter and with microbial activity for these Antarctic soils. © 2012 Antarctic Science Ltd.
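The exponential emission-temperature relationship and the Q10 metric lend themselves to a short worked sketch. The Python example below uses invented illustrative data (the study's measurements are not reproduced here); it fits R = a·e^(bT) and derives Q10 = e^(10b), the standard identity for an exponential temperature response.

```python
import numpy as np

# Illustrative (invented) soil temperatures (degrees C) and CO2-C
# emission rates; these are placeholders, not the study's data.
temp = np.array([0.5, 2.0, 4.5, 7.0, 9.5, 12.0])
emission = np.array([8.1, 10.4, 14.9, 21.0, 30.2, 43.5])

# Fit the exponential model R = a * exp(b * T) by ordinary least squares
# on log-transformed rates: ln(R) = ln(a) + b * T.
b, ln_a = np.polyfit(temp, np.log(emission), 1)

# For an exponential response, Q10 = exp(10 * b): the factor by which
# emission increases for a 10 degree C rise in soil temperature.
q10 = np.exp(10.0 * b)
print(f"b = {b:.3f} per degree C, Q10 = {q10:.2f}")
```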
Abstract:
Small-scale coffee producers worldwide remain vulnerable to price fluctuations after the 1999–2003 coffee crisis. One way to increase the economic resilience of small-scale farmers is to produce a higher-value product, such as quality coffee. There is growing demand in coffee-producing and coffee-importing countries for user-friendly tools that facilitate the marketing of quality coffee. The purpose of this study is to develop a prototypical quality-coffee marketing tool in the form of a GIS model that identifies regions suitable for producing quality coffee in a country not usually associated with it, Honduras. Maps of areas for growing quality coffee were produced from information on climate, soils, topography, areas vulnerable to environmental degradation, the locations of current quality-coffee farms, and infrastructure. The maps depict suitable coffee-growing land in portions of eight western Honduran departments.
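A GIS suitability model of this kind is, at its core, a reclassify-and-overlay operation on raster layers. The Python sketch below shows that general pattern on synthetic arrays; the layers, thresholds and weights are invented for illustration and are not the criteria used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)  # toy raster grid standing in for real GIS layers

# Synthetic input layers (placeholders for climate/soil/topography data).
elevation_m = rng.uniform(400, 1800, shape)
rainfall_mm = rng.uniform(800, 2400, shape)
slope_deg = rng.uniform(0, 45, shape)

# Reclassify each layer to a 0-1 suitability score (assumed thresholds).
s_elev = ((elevation_m >= 1000) & (elevation_m <= 1600)).astype(float)
s_rain = ((rainfall_mm >= 1200) & (rainfall_mm <= 2200)).astype(float)
s_slope = np.clip(1.0 - slope_deg / 30.0, 0.0, 1.0)

# Weighted overlay; the weights are assumptions and sum to 1.
suitability = 0.4 * s_elev + 0.4 * s_rain + 0.2 * s_slope
print(f"cells scoring >= 0.8: {(suitability >= 0.8).mean():.1%}")
```

In a real application the synthetic arrays would be replaced by co-registered rasters, and the degradation-vulnerability and infrastructure layers would enter as additional masks or weighted terms.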
Abstract:
Economic losses resulting from disease development can be reduced by accurate and early detection of plant pathogens. Early detection can provide the grower with useful information on optimal crop rotation patterns, varietal selection, appropriate control measures, harvest date and post-harvest handling. Classical methods for the isolation of pathogens are commonly applied only after disease symptoms appear. This frequently delays the application of control measures at potentially important periods in crop production. This paper describes the application of both antibody- and DNA-based systems to monitor the infection risk posed by airborne and soil-borne fungal pathogens, and the use of this information with mathematical models that describe disease risk as a function of environmental parameters.
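As a hedged illustration of how environmental parameters can drive such a risk model, the sketch below uses a generic logistic form. The coefficients and predictors are invented placeholders; the paper's actual models are not reproduced here.

```python
import math

# Placeholder coefficients for a generic logistic risk model
# (invented for illustration, not fitted values from the paper).
B0, B_TEMP, B_WET = -6.0, 0.25, 0.08

def infection_risk(temp_c: float, leaf_wetness_h: float) -> float:
    """Probability that conditions favour infection on a given day."""
    z = B0 + B_TEMP * temp_c + B_WET * leaf_wetness_h
    return 1.0 / (1.0 + math.exp(-z))

# A warm, wet day versus a cool, dry day:
print(f"{infection_risk(18, 12):.2f} vs {infection_risk(8, 2):.2f}")
```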
Abstract:
BACKGROUND: Invasive fungal infections (IFIs) are a major cause of morbidity and mortality among organ transplant recipients. Multicenter prospective surveillance data to determine disease burden and secular trends are lacking. METHODS: The Transplant-Associated Infection Surveillance Network (TRANSNET) is a consortium of 23 US transplant centers, including 15 that contributed to the organ transplant recipient dataset. We prospectively identified IFIs among organ transplant recipients from March 2001 through March 2006 at these sites. To explore trends, we calculated the 12-month cumulative incidence among 9 sequential cohorts. RESULTS: During the surveillance period, 1208 IFIs were identified among 1063 organ transplant recipients. The most common IFIs were invasive candidiasis (53%), invasive aspergillosis (19%), cryptococcosis (8%), non-Aspergillus molds (8%), endemic fungi (5%), and zygomycosis (2%). Median times to onset of candidiasis, aspergillosis, and cryptococcosis were 103, 184, and 575 days, respectively. Among a cohort of 16,808 patients who underwent transplantation between March 2001 and September 2005 and were followed through March 2006, a total of 729 IFIs were reported among 633 persons. One-year cumulative incidences of a first IFI were 11.6%, 8.6%, 4.7%, 4.0%, 3.4%, and 1.3% for small bowel, lung, liver, heart, pancreas, and kidney transplant recipients, respectively. One-year incidence was highest for invasive candidiasis (1.95%) and aspergillosis (0.65%). Trend analysis showed a slight increase in cumulative incidence from 2002 to 2005. CONCLUSIONS: We detected a slight increase in IFIs during the surveillance period. These data provide important insights into the timing and incidence of IFIs among organ transplant recipients, which can help to focus effective prevention and treatment strategies.
Abstract:
Climatic and land-use changes have significant consequences for the distribution of tree species, both through natural dispersal processes and following management prescriptions. Responses to these changes will be expressed most strongly in seedlings near current species range boundaries. In northern temperate forest ecosystems, where changes are already being observed, ectomycorrhizal fungi contribute significantly to successful tree establishment. We hypothesised that communities of fungal symbionts might therefore play a role in facilitating, or limiting, host seedling range expansion. To test this hypothesis, the ectomycorrhizal communities of interior Douglas-fir and interior lodgepole pine seedlings were analysed in a common greenhouse environment following growth in five soils collected along an ecosystem gradient. Currently, Douglas-fir’s natural distribution encompasses three of the five soils, whereas lodgepole pine’s extends much further north. Host filtering was evident amongst the 29 fungal species encountered: 7 were shared, 9 were exclusive to Douglas-fir and 13 were exclusive to lodgepole pine. Seedlings of both host species formed symbioses with each soil fungal community; thus, Douglas-fir did so even where the soils came from outside its current distribution. However, these latter communities displayed significant taxonomic and functional differences from those found within the host’s distribution, indicative of habitat filtering. In contrast, lodgepole pine fungal communities displayed high functional similarity across the soil gradient. Taxonomic and/or functional shifts in Douglas-fir fungal communities may prove ecologically significant during the predicted northward migration of this species, especially in combination with changes in climate and management operations, such as seed transfer across geographical regions for forestry purposes.
Abstract:
The fungus Paracoccidioides brasiliensis has been isolated from nine-banded armadillos (Dasypus novemcinctus) in different regions where paracoccidioidomycosis (PCM) is endemic. The link between PCM and these animals has provided the first valuable clue in the effort to elucidate the ecological niche of P. brasiliensis. The present study was aimed at correlating P. brasiliensis infection in armadillos with local ecological features and, if possible, the presence of the fungus in the soil in the Botucatu hyperendemic area of PCM. In this region the mean temperature ranges from 14.8 to 25.8°C and the annual average precipitation is 1520 mm. The sites where 10 infected animals (positive group) were collected were studied and compared with the sites where five uninfected animals were found. The occurrence of the fungus in soil samples collected from the positive armadillos' burrows and foraging sites was investigated by the indirect method of animal inoculation. Environmental data from the sites of animal capture, such as temperature, rainfall, altitude, vegetation, soil composition, presence of water and proximity to urban areas, were recorded. All 37 soil samples collected from the sites yielded negative fungal cultures. Positive animals were found much more frequently in sites with disturbed vegetation, such as riparian forests and artificial Eucalyptus or Pinus forests, at altitudes below 800 m, near water sources. The soil at the sites of positive animals was mainly sandy, with medium to low concentrations of organic matter. The pH was mainly acidic at all the sites, although the concentrations of aluminum cations (H+Al) were lower at the sites where positive animals were found. Positive armadillos were also captured at sites very close to urban areas. Our data and previous studies indicate that P. brasiliensis occurs preferentially in humid, shady, disturbed forests in strong association with armadillos.
Abstract:
Proliferative kidney disease (PKD) is an emerging disease threatening wild salmonid populations. In temperature-controlled aquaria, PKD can cause mortality rates of up to 85% in rainbow trout. So far, no data on PKD-related mortality in wild brown trout Salmo trutta fario are available. The aim of this study was to investigate mortality rates and pathology in brown trout kept in a cage within a natural river habitat known to harbor Tetracapsuloides bryosalmonae. Young-of-the-year (YOY) brown trout, free of T. bryosalmonae, were exposed in the River Wutach, in northeastern Switzerland, during 3 summer months. Samples of wild brown trout caught by electrofishing near the cage location were examined in parallel. The incidence of PKD in cage-exposed animals (69%) was not significantly different from the disease prevalence in wild fish (82% and 80% at the upstream and downstream locations, respectively). The mortality of cage-exposed animals, however, was as low as 15%. At the termination of the exposure experiment, surviving fish showed histological lesions typical of PKD regression, suggesting that many YOY brown trout survive the initial infection. Our results from the River Wutach suggest that PKD in brown trout does not always result in high mortality under natural conditions.
Abstract:
Background: Reducing rates of healthcare-acquired infection has been identified by the Australian Commission on Safety and Quality in Health Care as a national priority. One of the goals is the prevention of central venous catheter-related bloodstream infection (CR-BSI). At least 3,500 cases of CR-BSI occur annually in Australian hospitals, resulting in unnecessary deaths and costs to the healthcare system of between $25.7 million and $95.3 million. Two approaches to preventing these infections have been proposed: the use of antimicrobial catheters (A-CVCs), or a catheter care and management ‘bundle’. Given finite healthcare budgets, decisions about the optimal infection control policy require consideration of the effectiveness and value for money of each approach. Objectives: The aim of this research is to use a rational economic framework to inform efficient infection control policy relating to the prevention of CR-BSI in the intensive care unit. It addresses three questions relating to decision-making in this area: (1) Is additional investment in activities aimed at preventing CR-BSI an efficient use of healthcare resources? (2) What is the optimal infection control strategy from among the two major approaches that have been proposed to prevent CR-BSI? (3) What uncertainty is there in this decision, and can a research agenda to improve decision-making in this area be identified? Methods: A decision-analytic, model-based economic evaluation was undertaken to identify an efficient approach to preventing CR-BSI in Queensland Health intensive care units. A Markov model describing the epidemiology and prognosis of CR-BSI was developed in conjunction with a panel of clinical experts. The model was parameterised using data systematically identified from the published literature and extracted from routine databases. The quality of the data used in the model, its validity to clinical experts and its sensitivity to modelling assumptions were assessed. Two separate economic evaluations were conducted. The first compared all commercially available A-CVCs alongside uncoated catheters to identify which was cost-effective for routine use. The uncertainty in this decision was estimated, along with the value of collecting further information to inform the decision. The second evaluation compared the use of A-CVCs to a catheter care bundle. We were unable to estimate the cost of the bundle because it is unclear what the full resource requirements for its implementation are, and what their value would be in an Australian context. We therefore undertook a threshold analysis to identify the cost and effectiveness thresholds at which a hypothetical bundle would dominate the use of A-CVCs under various clinical scenarios. Results: In the first evaluation of A-CVCs, the findings from the baseline analysis, in which uncertainty is not considered, show that the use of any of the four A-CVCs will result in health gains accompanied by cost savings. The minocycline/rifampicin (MR) catheters dominate the baseline analysis, generating 1.64 QALYs and cost savings of $130,289 per 1,000 catheters. With uncertainty, and based on current information, the MR catheters remain the optimal decision and return the highest average net monetary benefit ($948 per catheter) relative to all other catheter types. This conclusion was robust to all scenarios tested; however, the probability of error in this conclusion is high: 62% in the baseline scenario.
Using a value of $40,000 per QALY, the expected value of perfect information associated with this decision is $7.3 million. An analysis of the expected value of perfect information for individual parameters suggests that it may be worthwhile for future research to focus on providing better estimates of the mortality attributable to CR-BSI and the effectiveness of both SPC and CH/SSD (int/ext) catheters. In the second evaluation, of the catheter care bundle relative to A-CVCs, the results that do not consider uncertainty indicate that a bundle must achieve a relative risk of CR-BSI of 0.45 or lower to be cost-effective relative to MR catheters. If the bundle can reduce rates of infection from 2.5% to effectively zero, it is cost-effective relative to MR catheters if national implementation costs are less than $2.6 million ($56,610 per ICU). If the bundle can achieve a relative risk of 0.34 (comparable to that reported in the literature), it is cost-effective relative to MR catheters if costs over an 18-month period are below $613,795 nationally ($13,343 per ICU). Once uncertainty in the decision is considered, the cost threshold for the bundle increases to $2.2 million. Therefore, if each of the 46 Level III ICUs could implement an 18-month catheter care bundle for less than $47,826, this approach would be cost-effective relative to A-CVCs. However, the uncertainty is substantial, and the probability of error in concluding that the bundle is the cost-effective approach at a cost of $2.2 million is 89%. Conclusions: This work highlights that infection control to prevent CR-BSI is an efficient use of healthcare resources in the Australian context. If there is no further investment in infection control, an opportunity cost is incurred: the potential for a more efficient healthcare system. Minocycline/rifampicin catheters are the optimal choice of antimicrobial catheter for routine use in Australian Level III ICUs; however, if a catheter care bundle implemented in Australia were as effective as those used in the large studies in the United States, it would be preferred over the catheters if it could be implemented for less than $47,826 per Level III ICU. Uncertainty in this decision is very high and arises from multiple sources. There are likely greater costs to this uncertainty for A-CVCs, which may carry hidden costs, than for a catheter care bundle, which is more likely to provide indirect benefits to clinical practice and patient safety. Research into the mortality attributable to CR-BSI, the effectiveness of SPC and CH/SSD (int/ext) catheters, and the cost and effectiveness of a catheter care bundle in Australia should be prioritised to reduce uncertainty in this decision. This thesis provides the economic evidence to inform one area of infection control, but there are many other infection control decisions for which information about the cost-effectiveness of competing interventions does not exist. This work highlights some of the challenges and benefits of generating and using economic evidence for infection control decision-making, and provides support for commissioning more research into the cost-effectiveness of infection control.
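The headline figures above can be tied together with the standard net-monetary-benefit arithmetic. The sketch below uses only numbers quoted in the abstract and the textbook definition NMB = λ × ΔQALYs − Δcost; it is a back-of-envelope check, not a reproduction of the thesis model (the $948-per-catheter figure comes from the probabilistic analysis, which this sketch does not attempt).

```python
# Back-of-envelope decision arithmetic using figures from the abstract.
WTP_PER_QALY = 40_000            # willingness to pay per QALY ($)

# Baseline result for MR catheters, per 1,000 catheters:
qalys_gained = 1.64
incremental_cost = -130_289      # a cost saving is a negative cost

# Net monetary benefit = lambda * QALYs - incremental cost.
nmb_per_1000 = WTP_PER_QALY * qalys_gained - incremental_cost
print(f"baseline NMB per 1,000 catheters: ${nmb_per_1000:,.0f}")

# Bundle cost threshold once uncertainty is considered:
national_threshold = 2_200_000
n_level_iii_icus = 46
print(f"per-ICU threshold: ${national_threshold / n_level_iii_icus:,.0f}")
```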
Abstract:
Fatty acid methyl ester (FAME) profiles, together with Biolog substrate utilization patterns, were used in conjunction with measurements of other soil chemical and microbiological properties to describe differences in soil microbial communities induced by increased salinity and alkalinity in grass/legume pastures at three sites in south-eastern South Australia. Total ester-linked FAMEs (EL-FAMEs) and phospholipid-linked FAMEs (PL-FAMEs) were also compared for their ability to detect differences between the soil microbial communities. The level of salinity and alkalinity in affected areas of the pastures showed seasonal variation, being greater in summer than in winter. At the time of sampling for the chemical and microbiological measurements (winter), only the affected soil at site 1 was significantly saline. The affected soils at all three sites had lower organic C and total N concentrations than the corresponding non-affected soils. At site 1, microbial biomass, CO2-C respiration and the rate of cellulose decomposition were also lower in the affected soil than in the non-affected soil. Biomarker fatty acids present in both the EL- and PL-FAME profiles indicated a lower ratio of fungal to bacterial fatty acids in the saline-affected soil at site 1. Analysis of Biolog substrate utilization patterns indicated that the bacterial community in the affected soil at site 1 utilized fewer carbon substrates and had lower functional diversity than the corresponding community in the non-affected soil. In contrast, increased alkalinity, of major importance at sites 2 and 3, had no effect on microbial biomass, the rate of cellulose decomposition or functional diversity, but was associated with significant differences in the relative amounts of several fatty acids in the PL-FAME profiles, indicative of a shift towards a bacteria-dominated community. Despite differences in the number and relative amounts of fatty acids detected, principal component analyses of the EL- and PL-FAME profiles were equally capable of separating the affected and non-affected soils at all three sites. Redundancy analysis of the FAME data showed that organic C, microbial biomass, electrical conductivity and bicarbonate-extractable P were significantly correlated with variation in the EL-FAME profiles, whereas pH, electrical conductivity, NH4-N, CO2-C respiration and the microbial quotient were significantly correlated with variation in the PL-FAME profiles. Redundancy analysis of the Biolog data indicated that cation exchange capacity and bicarbonate-extractable K were significantly correlated with variation in the Biolog substrate utilization patterns.
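The ordination step described here follows a standard pattern: a samples-by-fatty-acids matrix reduced by principal component analysis. The sketch below, in Python with scikit-learn, uses random placeholder data in place of real FAME profiles to show the mechanics only.

```python
import numpy as np
from sklearn.decomposition import PCA

# Random placeholder matrices standing in for relative FAME abundances
# (rows = soil samples, columns = fatty acids); not real data.
rng = np.random.default_rng(1)
affected = rng.normal(loc=1.0, scale=0.2, size=(6, 20))
non_affected = rng.normal(loc=1.4, scale=0.2, size=(6, 20))
profiles = np.vstack([affected, non_affected])

pca = PCA(n_components=2)
scores = pca.fit_transform(profiles)

# If composition differs systematically, the two groups separate
# along the leading principal component.
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
print("mean PC1 (affected vs non-affected):",
      round(scores[:6, 0].mean(), 2), round(scores[6:, 0].mean(), 2))
```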
Abstract:
In south-eastern Queensland, Australia, sorghum planted in early spring usually escapes attack by sorghum midge, Stenodiplosis sorghicola. Experiments were conducted to better understand the role of winter diapause in the population dynamics of this pest. Emergence patterns of adult midge from diapausing larvae on the soil surface and at various depths were investigated from spring to autumn of 1987/88–1989/90. From 1987/88 to 1989/90, 89%, 65% and 98% of adult emergence, respectively, occurred during November and December. Adult emergence from larvae diapausing on the soil surface was severely reduced by high mortality attributed to surface soil temperatures in excess of 40°C, with much of this mortality occurring between mid-September and mid-October. Emergence of adults from the soil surface was considerably delayed in the 1988/89 season compared with larvae buried at 5 or 10 cm, which had similar emergence patterns in all three seasons. In 1989/90, when a 1-cm-deep treatment was included, adult emergence from this treatment was 392% higher than from the deeper treatments. Only in one year (1989/90) did some diapausing larvae on the surface fail to emerge by the end of summer: 28.0% of the surface larvae remained in diapause, whereas only 0.8% of the buried larvae did. We conclude that this pattern of emergence explains why spring plantings of sorghum in south-eastern Queensland usually escape sorghum midge attack.
Abstract:
Natural biological suppression of soil-borne diseases is a function of the activity and composition of soil microbial communities. Interactions between soil microbes and phytopathogens can occur prior to crop sowing and/or in the rhizosphere, subsequently influencing both plant growth and productivity. Research on suppressive microbial communities has concentrated on bacteria, although fungi can also influence soil-borne disease. Fungi were analyzed in co-located soils 'suppressive' or 'non-suppressive' for disease caused by Rhizoctonia solani AG 8 at two sites in South Australia, using 454 pyrosequencing targeting the fungal 28S LSU rRNA gene. DNA was extracted from a minimum of 125 g of soil per replicate to reduce micro-scale community variability, and from soil samples taken at sowing and from the rhizosphere at 7 weeks, to cover the peak Rhizoctonia infection period. A total of ∼994,000 reads were classified into 917 genera covering 54% of the RDP Fungal Classifier database, a high diversity for an alkaline, low-organic-matter soil. Statistical analyses and community ordinations revealed significant differences in fungal community composition between suppressive and non-suppressive soils and between soil types/locations. The majority of differences associated with suppressive soils were attributed to fewer than 40 genera, including a number of endophytic species with plant-pathogen suppression potential and mycoparasites such as Xylaria spp. Non-suppressive soils were dominated by Alternaria, Gibberella and Penicillium. Pyrosequencing generated a detailed description of fungal community structure and identified candidate taxa that may influence pathogen-plant interactions in stable disease suppression. © 2014 Penton et al.
Abstract:
The ubiquitous fungal pathogen Macrophomina phaseolina is best known for causing charcoal rot and premature death when host plants are subject to post-flowering stress. Overseas reports of M. phaseolina causing a rapid rot during the sprouting of Australian mungbean seed prompted an investigation of the possible modes of seed infection. Isolations from serial portions of 10 mungbean plants naturally infected with the pathogen revealed that most plants carried discrete portions of infected tissue separated by apparently healthy tissue. The results of these studies, together with molecular analysis of isolates collected from infected tissue on two of the plants, suggested that aerial infection of above-ground parts by different isolates is common. Inoculations of roots and above-ground parts of mungbean plants at nine temperature × soil moisture incubation combinations, and of detached green pods, strongly supported the concept that seed infection results from infection of pods by microsclerotia rather than from hyphae growing systemically through the plant after root or stem infection. This proposal is reinforced by anecdotal evidence that high levels of seed infection are common when rainfall occurs during pod fill, and by the isolation of M. phaseolina from soil peds collected on pods of mungbean plants in the field. However, other experiments showed that when inoculum was placed within 130 mm of a green developing pod and a herbicide containing paraquat and diquat was sprayed on the inoculated plants, M. phaseolina was capable of some systemic growth from vegetative tissue into the pods and seeds.
Abstract:
A significantly increased water regime can lead to inundation of rivers, creeks and surrounding floodplains, and thus affect the temporal dynamics of both the extant vegetation and the dormant but viable soil seed bank of riparian corridors. This study documented changes in the soil seed bank along riparian corridors before and after a major flood event in January 2011 in southeast Queensland, Australia. The study site was a major watercourse (Mooleyember Creek) near Roma, central Queensland, that was impacted by the extreme flood event and where baseline ecological data on riparian seed-bank populations had previously been collected in 2007, 2008 and 2009. After the major flood event, we collected further soil samples from the same locations in spring/summer (November–December 2011) and in early autumn (March 2012). The soils were then exposed to adequate warmth and moisture under glasshouse conditions, and the emerging seedlings were identified taxonomically. Flooding increased seed-bank abundance but decreased its species richness and diversity. However, the flood's impact was smaller than year-to-year variation, though greater than seasonal variation. Seeds of trees and shrubs were few in the soil and were negatively affected by the flood; those of herbaceous species and graminoids were numerous and proliferated after the flood. Seed banks of weedy and/or exotic species were no more affected by the flood than those of native and/or non-invasive species. Overall, the studied riparian zone showed evidence of a quick recovery of its seed bank over time, and can be considered resilient to an extreme flood event.