975 results for mosquito-borne disease
Abstract:
Cardiovascular diseases (CVD) are major contributors to morbidity and mortality worldwide. Several interacting environmental, biochemical, and genetic risk factors can increase disease susceptibility. While some of the genes involved in the etiology of CVD are known, many are yet to be discovered. During the last few decades, scientists have searched for these genes with genome-wide linkage and association methods, and with more targeted candidate gene studies. This thesis investigates variation within the upstream transcription factor 1 (USF1) gene locus in relation to CVD risk factors, atherosclerosis, and the incidence and prevalence of CVD. This candidate gene was first identified in Finnish families ascertained for familial combined hyperlipidemia, a common dyslipidemia predisposing to coronary heart disease. The gene encodes a ubiquitously expressed transcription factor regulating the expression of several genes of lipid and glucose metabolism, inflammation, and endothelial function. First, we examined the association between USF1 variants and several CVD risk factors, such as lipid phenotypes, body composition measures, and metabolic syndrome, in two prospective population cohorts. Our data suggested that USF1 contributes to these CVD risk factors at the population level. Notably, the associations with quantitative measurements were mostly detected among study subjects with CVD or metabolic syndrome, suggesting complex interactions between USF1 effects and the pathophysiological state of an individual. Second, we investigated how variation at the USF1 locus contributes to atherosclerotic lesions of the coronary arteries and abdominal aorta. For this, we used two study samples of middle-aged men with detailed measurements of atherosclerosis obtained at autopsy. USF1 variation was significantly associated with the areas of several types of lesions, especially with calcification of the arteries. Next, we tested what effect the USF1 risk variants have on sudden cardiac death and the incidence of CVD. The atherosclerosis-associated risk variant increased the risk of sudden cardiac death in the same study subjects. Furthermore, USF1 alleles were associated with the incidence of CVD in the Finnish population follow-up cohorts. These associations were especially prominent among women, suggesting a sex-specific effect, which has also been detected in subsequent studies. Finally, as some of the low-yield DNA samples of the Finnish follow-up study cohort needed to be whole-genome amplified (WGA) prior to genotyping, we evaluated whether the resulting WGA genotypes were of good quality. Although the samples giving genotype discrepancies could not be detected before genotyping with standard laboratory quality control methods, our results suggested that enhanced quality control at the time of genotyping could identify such samples. In addition, combining two WGA reactions into one pooled DNA sample for genotyping markedly reduced the number of discrepancies and the number of samples showing them. In conclusion, USF1 seems to have a role in the etiology of CVD. Additional studies are warranted to identify functional variants and to study interactions between USF1 and other genetic or environmental factors. This USF1 study, and other studies in which some samples yield little DNA, can benefit from whole-genome amplification of the low-yield samples prior to genotyping. Careful quality control procedures are, however, needed in WGA genotyping.
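As a rough illustration of the WGA quality evaluation described above, the sketch below tabulates per-sample concordance between genotypes called from whole-genome-amplified DNA and from genomic DNA of the same individual. The toy data, helper name and the 98% flagging threshold are illustrative assumptions, not the thesis pipeline.

```python
# Minimal sketch of per-sample genotype concordance between WGA and genomic-DNA
# calls. Sample data and the 98% flagging threshold are illustrative, not the
# criteria used in the thesis.

def concordance(wga_calls, gdna_calls):
    """Return (fraction concordant, n compared) over markers typed in both."""
    shared = [m for m in wga_calls if m in gdna_calls
              and wga_calls[m] != "NN" and gdna_calls[m] != "NN"]
    if not shared:
        return None, 0
    agree = sum(wga_calls[m] == gdna_calls[m] for m in shared)
    return agree / len(shared), len(shared)

# Toy genotypes: marker -> call ("NN" = missing)
wga  = {"rs1": "AG", "rs2": "GG", "rs3": "NN", "rs4": "AA"}
gdna = {"rs1": "AG", "rs2": "AG", "rs3": "TT", "rs4": "AA"}

rate, n = concordance(wga, gdna)
if rate is not None and rate < 0.98:       # illustrative QC threshold
    print(f"flag sample: {rate:.1%} concordance over {n} markers")
```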
Abstract:
Europe was declared malaria free in 1975. The disappearance of malaria has traditionally been attributed to numerous deliberate actions such as vector control, the screening of houses, and more efficient medication. Malaria, however, disappeared from many countries, like Finland, before any countermeasures had even started. The aim of this thesis is to study the population ecology of P. vivax and its interaction with the human host and the vector. By finding the factors that contributed to the extinction of vivax malaria, it might be possible to improve the modern strategy against P. vivax. The parasite was studied with data from Finland, which provides the longest time series (1749-2008) of malaria statistics in the world. The malaria vectors, Anopheles messeae and A. beklemishevi, are still common species in the country. The eradication of vivax malaria is difficult because the parasite has a dormant stage that can cause a relapse long after a primary infection. It was shown here that P. vivax is able to detect the presence of a potential vector: the dormant stage is activated even by the bite of an uninfected Anopheles mosquito. This optimizes the chances for the Plasmodium to reach a mosquito vector for sexual reproduction. The longevity of the dormant stage was shown to be at least nine years. The parasite spends several years in its human host, and the behaviour of the human carrier had a profound impact on the decline of the disease in Finland. Spring malaria epidemics could be explained by a warm previous summer. Neither annual nor summer mean temperature had any impact on the long-term malaria trend. Malaria disappeared slowly from Finland without mosquito control. The sociological change from extended families to nuclear families led to decreased household size. The decreased household size correlated strongly with the decline of malaria and led to an increased isolation of the subpopulations of P. vivax, whose habitat consisted of the bedrooms in which human carriers slept together with the overwintering vectors. The isolation of the parasite ultimately led to the extinction of vivax malaria. Metapopulation models adapted to local conditions should therefore be implemented as a tool for settlement planning and socio-economic development and become an integrated part of the fight against malaria.
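The isolation argument above can be made concrete with a Levins-type metapopulation model, in which occupied "patches" (bedrooms shared by carriers and overwintering vectors) are colonized at rate c and go extinct at rate e; occupancy collapses to zero once c falls below e. The sketch below uses purely illustrative parameter values, not estimates fitted to the Finnish series.

```python
# Levins metapopulation model: dp/dt = c*p*(1-p) - e*p.
# Equilibrium occupancy is 1 - e/c, so p -> 0 whenever c < e.
# Parameter values are purely illustrative.

def simulate(c, e, p0=0.3, dt=0.01, years=200):
    p = p0
    for _ in range(int(years / dt)):
        p += (c * p * (1.0 - p) - e * p) * dt
    return p

for c in (0.30, 0.15, 0.05):   # colonization rate falls as households shrink
    print(f"c={c:.2f}, e=0.10 -> occupancy after 200 y: {simulate(c, 0.10):.3f}")
```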
Abstract:
As for other complex diseases, linkage analyses of schizophrenia (SZ) have produced evidence for numerous chromosomal regions, with inconsistent results reported across studies. The presence of locus heterogeneity appears likely and may reduce the power of linkage analyses if homogeneity is assumed. In addition, when multiple heterogeneous datasets are pooled, inter-sample variation in the proportion of linked families (alpha) may diminish the power of the pooled sample to detect susceptibility loci, in spite of the larger sample size obtained. We compare the significance of linkage findings obtained using allele-sharing LOD scores (LOD(exp)), which assume homogeneity, and heterogeneity LOD scores (HLOD) in European American and African American NIMH SZ families. We also pool these two samples and evaluate the relative power of the LOD(exp) and two different heterogeneity statistics. One of these (HLOD-P) estimates the heterogeneity parameter alpha only in aggregate data, while the second (HLOD-S) determines alpha separately for each sample. In separate and combined data, we show consistently improved performance of HLOD scores over LOD(exp). Notably, genome-wide significant evidence for linkage is obtained at chromosome 10p in the European American sample using a recessive HLOD score. When the two samples are combined, linkage at the 10p locus also achieves genome-wide significance under HLOD-S, but not HLOD-P. Using HLOD-S, improved evidence for linkage was also obtained for a previously reported region on chromosome 15q. In linkage analyses of complex disease, power may be maximised by routinely modelling locus heterogeneity within individual datasets, even when multiple datasets are combined to form larger samples.
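The heterogeneity LOD score referred to above is based on the admixture model, HLOD(α) = Σᵢ log₁₀[α·10^LODᵢ + (1 − α)], maximized over the linked proportion α. The sketch below shows how HLOD-S (α estimated separately per sample) and HLOD-P (a single α in the aggregate data) could be computed from per-family LOD scores; the family LOD values are made up for illustration and are not the NIMH data.

```python
# Admixture heterogeneity LOD: HLOD(a) = sum_i log10(a*10**LOD_i + 1 - a),
# maximized over the linked proportion a in [0, 1]. Family LOD scores below are
# made up for illustration.
import math

def hlod(family_lods, grid=1001):
    best_a, best = 0.0, float("-inf")
    for k in range(grid):
        a = k / (grid - 1)
        score = sum(math.log10(a * 10**lod + (1.0 - a)) for lod in family_lods)
        if score > best:
            best_a, best = a, score
    return best_a, best

ea = [1.2, -0.4, 0.9, -0.1, 0.7]      # hypothetical European American families
aa = [-0.3, 0.2, -0.5, 0.6, -0.2]     # hypothetical African American families

# HLOD-S: estimate alpha separately per sample, then add the maximized scores.
hlod_s = hlod(ea)[1] + hlod(aa)[1]
# HLOD-P: a single alpha estimated in the aggregate data.
hlod_p = hlod(ea + aa)[1]
print(f"HLOD-S = {hlod_s:.2f}, HLOD-P = {hlod_p:.2f}")
```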
Abstract:
Rabbit haemorrhagic disease is a major tool for the management of introduced, wild rabbits in Australia. However, new evidence suggests that rabbits may be developing resistance to the disease. Rabbits sourced from wild populations in central and southeastern Australia, and domestic rabbits for comparison, were experimentally challenged with a low oral dose (60 ID50) of commercially available Czech CAPM 351 virus, the original strain released in Australia. Levels of resistance to infection were generally higher than for unselected domestic rabbits and also differed (0-73% infection rates) between wild populations. Resistance was lower in populations from cooler, wetter regions and also low in arid regions, with the highest resistance seen in zones of moderate rainfall. These findings suggest that the external influences of non-pathogenic calicivirus in cooler, wetter areas and of poor recruitment in arid populations may affect the rate at which resistance develops in Australia.
Abstract:
Cotton bunchy top (CBT) disease has caused significant yield losses in Australia and is now managed by control of its vector, the cotton aphid (Aphis gossypii). Its mode of transmission and similarities in symptoms to cotton Blue Disease suggested it may also be caused by a luteovirus or related virus. Degenerate primers to conserved regions of the genomes of the family Luteoviridae were used to amplify viral cDNAs from CBT-affected cotton leaf tissue that were not present in healthy plants. Partial genome sequence of a new virus (Cotton bunchy top virus, CBTV) was obtained spanning part of the RNA-dependent-RNA-polymerase (RdRP), all of the coat protein and part of the aphid-transmission protein. CBTV sequences could be detected in viruliferous aphids able to transmit CBT, but not aphids from non-symptomatic plants, indicating that it is associated with the disease and may be the causal agent. All CBTV open-reading frames had their closest similarity to viruses of the genus Polerovirus. The partial RdRP had 90 % amino acid identity to the RdRP of Cotton leafroll dwarf virus (CLRDV) that causes cotton blue disease, while other parts of the genome were more similar to other poleroviruses. The sequence similarity and genome organization of CBTV suggest that it should be considered a new member of the genus Polerovirus. This partial genome sequence of CBTV opens up the possibility for developing diagnostic tests for detection of the virus in cotton plants, aphids and weeds as well as alternative strategies for engineering CBT resistance in cotton plants through biotechnology. © 2012 Australasian Plant Pathology Society Inc.
Abstract:
My work describes two sectors of the human bacterial environment: 1. The sources of exposure to infectious non-tuberculous mycobacteria. 2. Bacteria in dust, reflecting the airborne bacterial exposure in environments protecting from or predisposing to allergic disorders. Non-tuberculous mycobacteria (NTM) transmit to humans and animals from the environment. Infection by NTM in Finland has increased during the past decade to exceed that by Mycobacterium tuberculosis. Among farm animals, porcine mycobacteriosis is the predominant NTM disease in Finland. Symptoms of mycobacteriosis are found in 0.34% of slaughtered pigs. Soil and drinking water are suspected as sources for humans, and bedding materials for pigs. To achieve quantitative data on the sources of human and porcine NTM exposure, methods for the quantitation of environmental NTM are needed. We developed a quantitative real-time PCR method, utilizing primers targeted at the 16S rRNA gene of the genus Mycobacterium. With this method, I found high contents of mycobacterial DNA, 10⁶ to 10⁷ genome equivalents per gram, in Finnish sphagnum peat, sandy soils and mud. A similar result was obtained by a method based on the Mycobacterium-specific hybridization of 16S rRNA. Since rRNA is found mainly in live cells, this result shows that the DNA detected by qPCR mainly represented live mycobacteria. Next, I investigated the occurrence of environmental mycobacteria in the bedding materials obtained from 5 pig farms with a high prevalence (>4%) of mycobacteriosis. When I used the same qPCR methods for quantification as for the soils, I found that the piggery samples contained non-mycobacterial DNA that was amplified in spite of several mismatches with the primers. I therefore improved the qPCR assay by designing Mycobacterium-specific detection probes. Using the probe qPCR assay, I found 10⁵ to 10⁷ genome equivalents of mycobacterial DNA in unused bedding materials and up to 1000-fold more in the bedding collected after use in the piggery. This result shows that there was a source of mycobacteria in the bedding materials purchased by the piggery and that mycobacteria increased in the bedding materials during use in the piggery. Allergic diseases have reached epidemic proportions in urbanized countries. At the same time, childhood in a rural environment or simple living conditions appears to protect against allergic disorders. Exposure to immunoreactive microbial components in rural environments seems to prevent allergies. I searched for differences in the bacterial communities of two indoor dusts: an urban house dust shown to possess immunoreactivity of the TH2 type and a farm barn dust with TH1 activity. The immunoreactivities of the dusts were revealed by my collaborators, in vitro in human dendritic cells and in vivo in mouse. The dusts had accumulated for >10 years in the respiratory zone (>1.5 m above the floor), thus reflecting the long-term content of airborne bacteria at the two sites. I investigated these dusts by cloning and sequencing of bacterial 16S rRNA genes from the DNA contained in the dust. From the TH2-active urban house dust, I isolated 139 16S rRNA gene clones. The most prevalent genera among the clones were Corynebacterium (5 species, 34 clones), Streptococcus (8 species, 33 clones), Staphylococcus (5 species, 9 clones) and Finegoldia (1 species, 9 clones). Almost all of these species are known as colonizers of the human skin and oral cavity.
Species of Corynebacterium and Streptococcus have been reported to contain anti-inflammatory lipoarabinomannans and immunoreactive beta-glucans, respectively. Streptococcus mitis, found in the urban house dust, is known as an inducer of TH2-polarized immunity, characteristic of allergic disorders. I isolated 152 DNA clones from the TH1-active farm barn dust and found species quite different from those found in the urban house dust. Among others, I found DNA clones representing Bacillus licheniformis, Acinetobacter lwoffii and Lactobacillus, each of which was recently reported to possess anti-allergy immunoreactivity. Moreover, the farm barn dust contained dramatically higher bacterial diversity than the urban house dust. Exposure to this dust thus stimulated the human dendritic cells with multiple microbial components. Such stimulation was reported to promote TH1 immunity. The biodiversity in dust may thus be connected to its immunoreactivity. Furthermore, the bacterial biomass in the farm barn dust consisted mainly of live intact bacteria. In the urban house dust only ~1% of the biomass appeared as intact bacteria, as judged by microscopy. Fragmented microbes may possess bioactivity different from that of intact cells. This was recently shown for moulds. If this is also valid for bacteria, the different immunoreactivities of the two dusts may be explained by the intactness of dustborne bacteria. Based on these results, we offer three factors potentially contributing to the polarized immunoreactivities of the two dusts: (i) the species composition, (ii) the biodiversity and (iii) the intactness of the dustborne bacterial biomass. The risk of childhood atopic diseases is 4-fold lower in the Russian than in the Finnish Karelia. This difference across the country border is not explainable by different geo-climatic factors or genetic susceptibilities of the two populations. Instead, the explanation must be lifestyle-related. It has already been reported that the microbiological quality of drinking water differs on the two sides of the border. In collaboration with allergists, I investigated dusts collected from homes in the Russian Karelia and in the Finnish Karelia. I found that bacterial 16S rRNA genes cloned from the Russian Karelian dusts (10 homes, 234 clones) predominantly represented Gram-positive taxa (the phyla Actinobacteria and Firmicutes, 67%). The Russian Karelian dusts contained nine-fold more muramic acid (60 to 70 ng mg⁻¹) than the Finnish Karelian dusts (3 to 11 ng mg⁻¹). Among the DNA clones isolated from the Finnish side (n=231), Gram-negative taxa (40%) outnumbered the Gram-positives (34%). Out of the 465 DNA clones isolated from the Karelian dusts, 242 were assigned to cultured, validly described bacterial species. In Russian Karelia, animal-associated species, e.g. Staphylococcus and Macrococcus, were numerous (27 clones, 14 unique species). This finding may connect to the difference in the prevalence of allergy, as childhood contacts with pets and farm animals have been connected with low allergy risk. Plant-associated bacteria and plant-borne 16S rRNA genes (chloroplast) were frequent among the DNA clones isolated from the Finnish Karelia, indicating components originating from plants.
In conclusion, my work revealed three major differences between the bacterial communities in the Russian and in the Finnish Karelian homes: (i) the high prevalence of Gram-positive bacteria on the Russian side and of Gram-negative bacteria on the Finnish side, (ii) the rich presence of animal-associated bacteria on the Russian side, and (iii) the prevalence of plant-associated bacteria on the Finnish side. One or several of these factors may connect to the differences in the prevalence of allergy.
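As an aside on the quantitative real-time PCR used in the work above, the sketch below shows the standard-curve conversion such an assay typically relies on to turn Ct values into genome equivalents per gram. The slope, intercept, Ct values, volumes and sample mass are illustrative assumptions, not the calibration used in this work.

```python
# Minimal sketch of converting qPCR Ct values to genome equivalents per gram of
# sample via a log-linear standard curve. Slope, intercept, Ct values and the
# dilution factors are illustrative, not the calibration used in the thesis.

def genome_equivalents(ct, slope=-3.32, intercept=38.0):
    """Standard curve: Ct = slope*log10(GE) + intercept -> GE per reaction."""
    return 10 ** ((ct - intercept) / slope)

def ge_per_gram(ct, elution_volume_ul=100.0, template_ul=5.0, sample_grams=0.5):
    per_reaction = genome_equivalents(ct)
    return per_reaction * (elution_volume_ul / template_ul) / sample_grams

for ct in (18.5, 22.0, 25.5):
    print(f"Ct {ct:>4}: ~{ge_per_gram(ct):.2e} genome equivalents per gram")
```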
Abstract:
The current Ebola virus disease (EVD) epidemic in West Africa is unprecedented in scale, and Sierra Leone is the most severely affected country. The case fatality risk (CFR) and hospitalization fatality risk (HFR) were used to characterize the severity of infections in confirmed and probable EVD cases in Sierra Leone. Proportional hazards regression models were used to investigate factors associated with the risk of death in EVD cases. In total, there were 17 318 EVD cases reported in Sierra Leone from 23 May 2014 to 31 January 2015. Of the probable and confirmed EVD cases with a reported final outcome, a total of 2536 deaths and 886 recoveries were reported. CFR and HFR estimates were 74·2% [95% credibility interval (CrI) 72·6–75·5] and 68·9% (95% CrI 66·2–71·6), respectively. Risks of death were higher in the youngest (0–4 years) and oldest (≥60 years) age groups, and in the calendar month of October 2014. Sex and occupational status did not significantly affect the mortality of EVD. The CFR and HFR estimates of EVD were very high in Sierra Leone.
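For orientation, the sketch below computes a naive case-fatality risk from the reported outcome counts, with a simulation-based 95% credible interval under a uniform Beta prior. The published CFR and HFR estimates involve additional adjustment, so the values will not match exactly.

```python
# Naive case-fatality risk among cases with a known final outcome, with a
# simulation-based 95% credible interval from a Beta(1, 1) prior. This is a
# simplified sketch; the published estimates account for more than this.
import random

deaths, recoveries = 2536, 886          # reported final outcomes (from the abstract)
cfr = deaths / (deaths + recoveries)

random.seed(1)
draws = sorted(random.betavariate(deaths + 1, recoveries + 1) for _ in range(20000))
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"naive CFR = {cfr:.1%} (95% CrI {lo:.1%}-{hi:.1%})")
```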
Abstract:
In 1955 a severe wilt disease occurring on ginger in the Near North Coast district of Queensland was incorrectly attributed to infection by a Fusarium sp., and later shown to be caused by a strain of Ralstonia solanacearum, now reclassified as R. sequeirae. The disease was brought from China into Australia on latently infected rhizomes, and possibly also with associated soil. Several DNA-based diagnostic methods have shown that the pathogen causing bacterial wilt of ginger in parts of China is indistinguishable from the pathogen uniquely associated with the disease in Queensland. © 2012 Australasian Plant Pathology Society Inc.
Abstract:
TRFLP (terminal restriction fragment length polymorphism) was used to assess whether management practices that improved disease suppression and/or yield in a 4-year ginger field trial were related to changes in soil microbial community structure. Bacterial and fungal community profiles were defined by the presence and abundance of terminal restriction fragments (TRFs), where each TRF represents one or more species. Results indicated that inclusion of an organic amendment and minimum tillage increased the relative diversity of dominant fungal populations in a system-dependent way. Inclusion of an organic amendment increased bacterial species richness in the pasture treatment. Redundancy analysis showed shifts in microbial community structure associated with different management practices, and treatments grouped according to TRF abundance in relation to yield and disease incidence. ANOVA also indicated that the abundance of certain TRFs was significantly affected by farming system management practices, and a number of these TRFs were also correlated with yield or disease suppression. Further analyses are required to determine whether the identified TRFs can be used as general or soil-type-specific bio-indicators of productivity (increased and decreased) and of Pythium myriotylum suppressiveness.
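TRF-based comparisons of the kind described above rest on per-sample summaries such as richness and Shannon diversity. The sketch below computes both from TRF abundance profiles; the profiles and TRF names are made up for illustration.

```python
# Sketch of richness and Shannon diversity computed from a TRF abundance profile
# (one sample = counts or peak areas per terminal restriction fragment).
# The profiles below are made up for illustration.
import math

def richness(profile):
    return sum(1 for v in profile.values() if v > 0)

def shannon(profile):
    total = sum(profile.values())
    return -sum((v / total) * math.log(v / total) for v in profile.values() if v > 0)

amended = {"TRF_87": 120, "TRF_145": 60, "TRF_212": 45, "TRF_300": 30, "TRF_355": 15}
bare    = {"TRF_87": 240, "TRF_145": 20, "TRF_212": 0,  "TRF_300": 5,  "TRF_355": 0}

for name, profile in (("organic amendment", amended), ("conventional", bare)):
    print(f"{name}: richness={richness(profile)}, H'={shannon(profile):.2f}")
```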
Abstract:
The aim of the current study was to investigate whether polymerase chain reaction amplification of 16S ribosomal (r)RNA and a putative hemolysin gene operon, hhdBA, can be used to monitor live pigs for the presence of Haemophilus parasuis and predict the virulence of the strains present. Nasal cavity swabs were taken from 30 live, healthy, 1- to 8-week-old pigs on a weekly cycle from a commercial Thai nursery pig herd. A total of 27 of these pigs (90%) tested positive for H. parasuis as early as week 1 of age. None of the H. parasuis-positive samples from healthy pigs was positive for the hhdBA genes. At the same pig nursery, swab samples from nasal cavity, tonsil, trachea, and lung, and exudate samples from pleural/peritoneal cavity were taken from 30 dead pigs displaying typical pathological lesions consistent with Glasser disease. Twenty-two of 140 samples (15.7%) taken from 30 diseased pigs yielded a positive result for H. parasuis. Samples from the exudate (27%) yielded the most positive results, followed by lung, tracheal swab, tonsil, and nasal swab, respectively. Out of 22 positive samples, 12 samples (54.5%) harbored hhdA and/or hhdB genes. Detection rates of hhdA were higher than hhdB. None of the H. parasuis-positive samples taken from nasal cavity of diseased pigs tested positive for hhdBA genes. More work is required to determine if the detection of hhdBA genes is useful for identifying the virulence potential of H. parasuis field isolates.
Abstract:
Since its initial description as a Th2 cytokine antagonistic to interferon-gamma and granulocyte-macrophage colony-stimulating factor, many studies have shown various anti-inflammatory actions of interleukin-10 (IL-10), and its role in infection as a key regulator of innate immunity. Studies have shown that IL-10 induced in response to microorganisms and their products plays a central role in shaping pathogenesis. IL-10 appears to function as both sword and shield in the response to varied groups of microorganisms in its capacity to mediate protective immunity against some organisms but increase susceptibility to other infections. The nature of IL-10 as a pleiotropic modulator of host responses to microorganisms is explained, in part, by its potent and varied effects on different immune effector cells which influence antimicrobial activity. A new understanding of how microorganisms trigger IL-10 responses is emerging, along with recent discoveries of how IL-10 produced during disease might be harnessed for better protective or therapeutic strategies. In this review, we summarize studies from the past 5 years that have reported the induction of IL-10 by different classes of pathogenic microorganisms, including protozoa, nematodes, fungi, viruses and bacteria, and discuss the impact of this induction on the persistence and/or clearance of microorganisms in the host.
Abstract:
B. cereus is a gram-positive bacterium that possesses two different forms of life: the large, rod-shaped cells (ca. 0.002 mm by 0.004 mm) that are able to propagate, and the small (0.001 mm), oval-shaped spores. The spores can survive in almost any environment for up to centuries without nourishment or water. They are insensitive to most agents that normally kill bacteria: heating for up to several hours at 90 ºC, radiation, disinfectants, and extremely alkaline (≥ pH 13) and acidic (≤ pH 1) environments. The spores are highly hydrophobic, which makes them tend to stick to all kinds of surfaces: steel, plastics and live cells. In favorable conditions the spores of B. cereus may germinate into vegetative cells capable of producing food poisoning toxins. The toxins can be heat-labile proteins formed after ingestion of the contaminated food, inside the gastrointestinal tract (diarrhoeal toxins), or a heat-stable peptide formed in the food (the emesis-causing toxin, cereulide). Cereulide cannot be inactivated in foods by cooking or any other procedure applicable to food. Cereulide in consumed food causes serious illness in humans, even fatalities. In this thesis, B. cereus strains originating from different kinds of foods and environments and from 8 different countries were inspected for their capability of forming cereulide. Of the 1041 isolates from soil, animal feed, water, air, used bedding, grass, dung and equipment, only 1.2% were capable of producing cereulide, whereas of the 144 isolates originating from foods, 24% were cereulide producers. Cereulide was detected by two methods: by its toxicity towards mammalian cells (sperm assay) and by its peculiar chemical structure using liquid chromatography-mass spectrometry equipment. B. cereus is known as one of the most frequent bacteria occurring in food. Most foods contain more than one kind of B. cereus. When 100 randomly selected isolates of B. cereus from commercial infant foods (dry formulas) were tested, 11% of these produced cereulide. Considering a frequent content of 10³ to 10⁴ cfu (colony forming units) of B. cereus per gram of infant food formula (dry), it appears likely that most servings (200 ml, 30 g of the powder reconstituted with water) may contain cereulide producers. When a reconstituted infant formula was inoculated with >10⁵ cfu of cereulide-producing B. cereus per ml and left at room temperature, cereulide accumulated to food poisoning levels (>0.1 mg of cereulide per serving) within 24 hours. Paradoxically, the amount of cereulide (per g of food) increased 10- to 50-fold when the food was diluted 4- to 15-fold with water. The amount of cereulide produced strongly depended on the composition of the formula: most toxin was formed in formulas with cereals mixed with milk, and least toxin in formulas based on milk only. In spite of the aggressive cleaning practices executed by the modern dairy industry, certain genotypes of B. cereus appear to colonise the silo tanks. In this thesis, four strategies explaining the survival of their spores in dairy silos were identified. First, high survival (≤1.5 log kill in 15 min) in the hot alkaline (pH >13) wash liquid used at the dairies for cleaning-in-place. Second, efficient adherence of the spores to stainless steel from cold water. Third, a cereulide-producing group with spores characterized by slow germination in rich medium and well-preserved viability when exposed to heating at 90 ºC. Fourth, spores capable of germinating at 8 ºC and possessing the psychrotolerance gene, cspA.
There were indications that spores highly resistant to hot 1% sodium hydroxide may be effectively inactivated by hot 0.9% nitric acid. Eight out of the 14 dairy silo tank isolates possessing hot-alkali-resistant spores were capable of germinating and forming biofilm in whole milk, a capability not previously reported for B. cereus. In this thesis it was shown that cereulide-producing B. cereus was capable of inhibiting the growth of cereulide non-producing B. cereus occurring in the same food. This phenomenon, called antagonism, has long been known to exist between B. cereus and other microbial species, e.g. various species of Bacillus, gram-negative bacteria and plant-pathogenic fungi. In this thesis, intra-species antagonism of B. cereus was shown for the first time. This brother-killing did not depend on the cereulide molecule, as some of the cereulide non-producers were also potent antagonists. Interestingly, the antagonistic clades were most frequently found among isolates from food implicated in human illness. The antagonistic property was therefore proposed in this thesis as a novel virulence factor that increases the human morbidity of the species B. cereus, in particular of the cereulide producers.
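As a back-of-envelope check of the serving estimate given in the abstract above, the sketch below combines the quoted figures (10³ to 10⁴ cfu per gram, 30 g of powder per serving, 11% of food isolates producing cereulide). Treating colony-forming units as independent draws from the isolate pool overstates the certainty when a food carries only a few strains, so this is an upper-bound style illustration.

```python
# Back-of-envelope check of the claim that a typical serving of reconstituted
# infant formula is likely to contain at least one cereulide-producing B. cereus.
# Treats each cfu as an independent draw from the isolate pool, which is a
# simplification when a food carries only a few strains. Figures from the abstract.
import math

fraction_producers = 0.11
grams_per_serving = 30

for cfu_per_gram in (1e3, 1e4):
    expected_producers = cfu_per_gram * grams_per_serving * fraction_producers
    p_at_least_one = 1.0 - math.exp(-expected_producers)   # Poisson approximation
    print(f"{cfu_per_gram:.0e} cfu/g -> expected producers per serving ≈ "
          f"{expected_producers:.0f}, P(at least one) ≈ {p_at_least_one:.3f}")
```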
Abstract:
Emerging zoonoses threaten global health, yet the processes by which they emerge are complex and poorly understood. Nipah virus (NiV) is an important threat owing to its broad host and geographical range, high case fatality, potential for human-to-human transmission and lack of effective prevention or therapies. Here, we investigate the origin of the first identified outbreak of NiV encephalitis in Malaysia and Singapore. We analyse data on livestock production from the index site (a commercial pig farm in Malaysia) prior to and during the outbreak, on Malaysian agricultural production, and from surveys of NiV's wildlife reservoir (flying foxes). Our analyses suggest that repeated introduction of NiV from wildlife changed infection dynamics in pigs. Initial viral introduction produced an explosive epizootic that drove itself to extinction but primed the population for enzootic persistence upon reintroduction of the virus. The resultant within-farm persistence permitted regional spread and increased the number of human infections. This study refutes an earlier hypothesis that anomalous El Nino Southern Oscillation-related climatic conditions drove emergence and suggests that priming for persistence drove the emergence of a novel zoonotic pathogen. Thus, we provide empirical evidence for a causative mechanism previously proposed as a precursor to widespread infection with H5N1 avian influenza and other emerging pathogens.
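One ingredient of the persistence mechanism described above can be shown with a toy calculation: routine replacement of pigs rebuilds the susceptible fraction after the first epizootic burns out, so a reintroduced virus can transmit again once R_eff = R0 × s(t) exceeds 1. The R0, post-epizootic susceptible fraction and turnover rate below are illustrative values, not estimates from the study.

```python
# Toy calculation: recovery of the susceptible fraction on a farm after the first
# epizootic, as immune pigs are replaced by new susceptibles at rate mu per day.
# All parameter values are illustrative, not estimates from the study.
import math

r0 = 3.0    # within-farm basic reproduction number (illustrative)
s0 = 0.06   # susceptible fraction left right after the first wave (illustrative)
mu = 0.004  # fraction of the herd replaced per day (~24% per 2 months)

for day in (0, 30, 90, 180, 365):
    s = 1.0 - (1.0 - s0) * math.exp(-mu * day)   # immune pigs replaced by susceptibles
    print(f"day {day:3d}: susceptible fraction {s:.2f}, R_eff {r0 * s:.2f}")
```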
Abstract:
This research aimed to develop and evaluate pre- and postharvest management strategies to reduce stem end rot (SER) incidence and extend the saleable life of 'Carabao' mango fruits in the Southern Philippines. Preharvest management focused on the development and improvement of a fungicide spray program, while postharvest management aimed to develop alternative interventions aside from hot water treatment (HWT). In field evaluations, the systemic fungicides azoxystrobin (Amistar 25SC), tebuconazole (Folicur 25WP), carbendazim (Goldazim 500SC), difenoconazole (Score 250SC) and azoxystrobin+difenoconazole (Amistar Top) reduced blossom blight severity and improved fruit setting and retention, resulting in higher fruit yield, but failed to sufficiently suppress SER incidence. Based on these findings, an improved fungicide spray program was developed, taking into account the infection process of SER pathogens and fungicide resistance. Timely application of a protectant (mancozeb) and systemic fungicides (azoxystrobin, carbendazim and difenoconazole) during the most critical stages of mango flower and fruit development ensured higher harvestable fruit yield and minimally lowered SER incidence. Control of SER was also achieved by employing postharvest treatments such as HWT (52-55°C for 10 min), which significantly prolonged the saleable life of mango fruits. However, extended hot water treatment (EHWT; 46°C pulp temperature for 15 min), rapid heat treatment (RHT; 59°C for 30-60 sec), fungicide dips and promising biological control agents failed to satisfactorily reduce SER and prolong saleable life. In contrast, the integration of the improved spray program as a preharvest management practice with postharvest treatments such as HWT and fungicide dips (azoxystrobin, 150-175 ppm; carbendazim, 312.5 ppm; and tebuconazole, 125-156 ppm) significantly reduced disease and extended marketable life for up to 8 days.
Abstract:
A recent report to the Australian Government identified concerns relating to Australia's capacity to respond to a medium to large outbreak of foot-and-mouth disease (FMD). To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario that included 62 infected premises in five different states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product/equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months. This unconstrained response was estimated to involve 2724 personnel. Unlimited personnel was considered unrealistic, and therefore the course of the outbreak was modelled using three levels of staffing, and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication in 6 months. Deployment of an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication in 3 months to 68%, and to 100% in 6 months. Deployment of further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas, coupled with the baseline personnel resources, increased the probability of eradication in 3 months to 74% and to 100% in 6 months. This required 25 vaccination teams commencing 12 days into the control program and increasing to 50 vaccination teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected-premises operations was equally effective in reducing the outbreak size and duration.
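To illustrate how eradication probabilities of this kind can be estimated, the sketch below runs a heavily simplified stochastic outbreak model in which daily response capacity limits how many infected premises can be stamped out, and repeated runs give the probability of eradication within a deadline. It is not AusSpread, and all parameters are illustrative.

```python
# Heavily simplified stand-in for a stochastic outbreak simulation (NOT AusSpread):
# each day, every active infected premises seeds a new one with some probability,
# while response teams can stamp out a limited number of premises per day.
# Repeating the run many times estimates P(eradication within a deadline).
# All parameters are illustrative.
import random

def p_eradication(days=90, capacity=12, seed_ips=62, spread=0.15,
                  runs=500, cap=500, rng=None):
    rng = rng or random.Random(7)
    successes = 0
    for _ in range(runs):
        active = seed_ips
        for _day in range(days):
            new = sum(1 for _ in range(active) if rng.random() < spread)
            active = max(active - capacity, 0) + new
            if active == 0:
                successes += 1
                break
            if active > cap:          # treat a runaway outbreak as a failed run
                break
    return successes / runs

for capacity in (8, 12, 16):
    print(f"capacity {capacity:2d} premises/day -> "
          f"P(eradication within 90 days) ≈ {p_eradication(capacity=capacity):.2f}")
```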