909 results for Epidemic encephalitis.
Abstract:
The aim of this study was to investigate HIV-1 molecular diversity and the epidemiological profile of HIV-1-infected patients from Ribeirao Preto, Brazil. A nested PCR followed by sequencing of a 302-base-pair fragment of the env gene (C2-V3 region) was performed on samples from HIV-1-positive patients. A total of 45 sequences were aligned, with final manual adjustments. The phylogenetic analyses showed a high prevalence of HIV-1 subtype B in the studied population (97.8%), with only one sample yielding an F1 subtype. Viral genotyping prediction showed that CCR5 tropism was the most prevalent in the studied cohort. Geno2pheno analysis predicted R5 and CXCR4 usage in 69% and 31% of samples, respectively. There was no statistically significant difference in either viral load or CD4(+) T cell count when the R5 and X4 prediction groups were compared. Moreover, the GPGR tetramer was the most common V3 loop core motif identified in the HIV-1 strains studied (34.1%), followed by GWGR, identified in 18.1% of the samples. The high proportion of subtype B in this Brazilian population reinforces the nature of the HIV epidemic in Brazil and corroborates previous data obtained in the Brazilian HIV-infected population.
Abstract:
There are 3 strains of Encephalitozoon cuniculi that occur in mammals. Strain III is associated with clinical disease in dogs, although some dogs can be asymptomatic carriers and excrete spores in their urine. Several cases of human E. cuniculi infection caused by strain III have been observed in immunocompromised patients, indicating that E. cuniculi should be considered a zoonotic agent. Encephalitozoon cuniculi can cause fatal disease in maternally infected or young dogs. Clinical signs in these animals included blindness, encephalitis, retarded growth rate, and nephritis. Encephalitozoon cuniculi has also been associated with primary renal failure in adult dogs. The present study used the direct agglutination test (DAT, cut-off 1:50) and the indirect fluorescent antibody test (IFAT, cut-off 1:10) to examine the prevalence of antibodies to E. cuniculi in dogs from Brazil and Colombia. Using the DAT, 31 (27.4%) of 113 dogs from Brazil and 47 (18.5%) of 254 dogs from Colombia were seropositive. Nine (14.3%) of 63 dogs from Brazil and 18 (35.3%) of 51 dogs from Colombia were seropositive by the IFAT. These results indicate that dogs from Brazil and Colombia are exposed to E. cuniculi.
Abstract:
Yellow fever virus (YFV) was isolated from Haemagogus leucocelaenus mosquitoes during an epizootic in 2001 in Rio Grande do Sul State in southern Brazil. In October 2008, a yellow fever outbreak was reported there, with nonhuman primate deaths and human cases. This latter outbreak led to intensification of surveillance measures for early detection of YFV and support for vaccination programs. We report entomologic surveillance in 2 municipalities that recorded nonhuman primate deaths. Mosquitoes were collected at ground level, identified, and processed for virus isolation and molecular analyses. Eight YFV strains were isolated (7 from pools of Hg. leucocelaenus mosquitoes and another from Aedes serratus mosquitoes); 6 were sequenced, and they grouped in the YFV South American genotype I. The results confirmed the role of Hg. leucocelaenus mosquitoes as the main YFV vector in southern Brazil and suggest that Ae. serratus mosquitoes may have a potential role as a secondary vector.
Abstract:
By means of numerical simulations and epidemic analysis, the transition point of the stochastic asynchronous susceptible-infected-recovered model on a square lattice is found to be c_0 = 0.1765005(10), where c is the probability a chosen infected site spontaneously recovers rather than tries to infect one neighbor. This point corresponds to an infection/recovery rate of lambda_c = (1 - c_0)/c_0 = 4.66571(3) and a net transmissibility of (1 - c_0)/(1 + 3c_0) = 0.538410(2), which falls between the rigorous bounds of the site and bond thresholds. The critical behavior of the model is consistent with the two-dimensional percolation universality class, but local growth probabilities differ from those of dynamic percolation cluster growth, as is demonstrated explicitly.
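As an illustration of the update rule this abstract describes, here is a minimal Python sketch (not the authors' code): repeatedly pick a random infected site; with probability c it recovers, otherwise it attempts to infect one randomly chosen nearest neighbor. The lattice size, seeds, and the values of c scanned are illustrative choices only.

```python
# Minimal sketch of the asynchronous stochastic SIR model on a square
# lattice with periodic boundaries; c near the reported threshold
# c_0 = 0.1765005(10) should separate small and large outbreaks.
import random

def run_sir(L=128, c=0.1765, seed=None):
    rng = random.Random(seed)
    S, I, R = 0, 1, 2
    grid = [[S] * L for _ in range(L)]
    x0 = L // 2
    grid[x0][x0] = I
    infected = [(x0, x0)]          # list of currently infected sites
    while infected:
        k = rng.randrange(len(infected))
        x, y = infected[k]
        if rng.random() < c:       # spontaneous recovery
            grid[x][y] = R
            infected[k] = infected[-1]   # swap-remove from the list
            infected.pop()
        else:                      # attempt to infect one random neighbor
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            nx, ny = (x + dx) % L, (y + dy) % L
            if grid[nx][ny] == S:
                grid[nx][ny] = I
                infected.append((nx, ny))
    return sum(row.count(R) for row in grid)   # final outbreak size

# Mean outbreak size below, near, and above the reported threshold.
for c in (0.17, 0.1765, 0.183):
    sizes = [run_sir(c=c, seed=s) for s in range(20)]
    print(c, sum(sizes) / len(sizes))
```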
Abstract:
Susceptible-infective-removed (SIR) models are commonly used for representing the spread of contagious diseases. A SIR model can be described in terms of a probabilistic cellular automaton (PCA), where each individual (corresponding to a cell of the PCA lattice) is connected to others by a random network favoring local contacts. Here, this framework is employed for investigating the consequences of applying vaccine against the propagation of a contagious infection, by considering vaccination as a game, in the sense of game theory. In this game, the players are the government and the susceptible newborns. In order to maximize their own payoffs, the government attempts to reduce the costs for combating the epidemic, and the newborns may be vaccinated only when infective individuals are found in their neighborhoods and/or the government promotes an immunization program. As a consequence of these strategies supported by cost-benefit analysis and perceived risk, numerical simulations show that the disease is not fully eliminated and the government implements quasi-periodic vaccination campaigns. (C) 2011 Elsevier B.V. All rights reserved.
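By way of illustration only, a toy Python sketch of the two decision rules summarized above: a newborn is vaccinated when infectives are found in its neighborhood and/or a government campaign is running, and the government triggers campaigns from a cost-benefit threshold. The function names, payoff constants and perceived-risk weight are assumptions for illustration, not the paper's specification.

```python
# Toy decision rules for the vaccination game; all parameters invented.
import random

def newborn_vaccinates(neighbors, campaign_on, perceived_risk=0.8, rng=random):
    """Vaccinate if a campaign is active, or if infectives are nearby
    (weighted by the newborn's perceived risk of infection)."""
    if campaign_on:
        return True
    sees_infective = any(state == "I" for state in neighbors)
    return sees_infective and rng.random() < perceived_risk

def government_campaign(prevalence, epidemic_cost=10.0, campaign_cost=3.0):
    """Run a campaign only when the expected cost of the epidemic
    outweighs the cost of the campaign (a crude cost-benefit rule)."""
    return prevalence * epidemic_cost > campaign_cost

# Example round: 30% prevalence triggers a campaign, so this newborn
# is vaccinated regardless of its own neighborhood.
campaign = government_campaign(prevalence=0.30)
print(campaign, newborn_vaccinates(["S", "S", "I"], campaign))
```

Under rules of this kind, campaigns switch on and off as prevalence crosses the cost threshold, which is consistent with the quasi-periodic campaigns the simulations report.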
Abstract:
The effects of copper sprays on annual and polyetic progress of citrus canker, caused by Xanthomonas citri subsp. citri, in the presence of the Asian citrus leafminer (Phyllocnistis citrella), were evaluated in a study conducted in a commercial orchard in northwest Parana state, Brazil, where citrus canker is endemic. Nonlinear monomolecular, logistic and Gompertz models were fitted to monthly disease incidence data (proportion of leaves with symptoms) for each treatment for three seasons. The logistic model provided the best estimate of disease progress for all years and treatments evaluated, and logistic parameter estimates were used to describe polyetic disease dynamics. Although citrus canker incidence increased during each of the seasons studied, it decreased over the whole study period, more so in copper-treated trees than in water-sprayed controls. Copper treatment reduced disease incidence compared with controls in every year, especially 2004-2005, when incidence was ca. 10-fold higher in controls than in treated plots (estimated asymptote values 0.82 and 0.07, respectively). Copper treatment also reduced estimated initial disease incidence and epidemic growth rates every year.
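For readers unfamiliar with the logistic disease-progress model named above, a minimal Python sketch of fitting its common three-parameter form (asymptote K, initial incidence y0, rate r) to incidence data; the data points, starting values and bounds below are invented for illustration and are not the study's measurements.

```python
# Fit a three-parameter logistic disease-progress curve to monthly
# incidence (proportion of leaves with symptoms); data are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, y0, r):
    """Logistic disease progress: incidence rises from y0 toward K."""
    return K / (1.0 + ((K - y0) / y0) * np.exp(-r * t))

t = np.arange(0, 12)                      # months after first assessment
y = np.array([0.01, 0.02, 0.03, 0.06, 0.10, 0.18,
              0.30, 0.45, 0.58, 0.68, 0.74, 0.78])  # invented incidence
(K, y0, r), _ = curve_fit(logistic, t, y, p0=[0.8, 0.01, 0.5],
                          bounds=([0, 1e-6, 0], [1, 1, 5]))
print(f"asymptote K={K:.2f}, initial y0={y0:.3f}, rate r={r:.2f}/month")
```

Comparing fitted asymptotes and rates between treatments, as the authors do, is what allows statements such as "estimated asymptote values 0.82 and 0.07".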
Abstract:
Thirty-five lymph node samples were taken from animals with macroscopic lesions consistent with Mycobacterium bovis infection. The animals were identified by postmortem examination in an abattoir in the northwestern region of the state of Parana, Brazil. Twenty-two of the animals had previously been found to be tuberculin skin test positive. Tissue samples were decontaminated by Petroff's method and processed for acid-fast bacilli staining, culture in Stonebrink and Lowenstein-Jensen media, and DNA extraction. Lymph node DNA samples were amplified by PCR in the absence and presence (inhibitor controls) of DNA extracted from M. bovis culture. Mycobacterium bovis was identified in 14 (42.4%) lymph node samples by both PCR and culture. The frequency of PCR-positive results (54.5%) was similar to that of culture-positive results (51.5%, P > 0.05). The percentage of PCR-positive lymph nodes increased from 39.4% (13/33) to 54.5% (18/33) when samples that were initially PCR-negative were reanalysed using 2.5 µl DNA (two samples) and 1:2 diluted DNA (three samples). PCR sensitivity was affected by inhibitors and by the amount of DNA in the clinical samples. Our results indicate that direct detection of M. bovis in lymph nodes by PCR may be a fast and useful tool for bovine tuberculosis epidemic management in the region.
Abstract:
Background and aims - The colons of patients with pneumatosis cystoides coli produce excessive H-2. Exposure to alkyl halides could explain this. Six consecutive patients who had pneumatosis cystoides coli while taking chloral hydrate (1-5+ g/day) are reported. Patients 2 and 3 were investigated after they had ceased chloral hydrate treatment. One produced methane, the other did not. (Pneumatosis cystoides coli patients are non-methanogenic according to the literature.) Both had overnight fasting breath H-2 of less than 10 ppm. A literature review disclosed just one patient who was using chloral at the time of diagnosed pneumatosis cystoides coli, but an epidemic of the disease in workers exposed to trichloroethylene. Methods - (i) In vitro experiments with human faeces: chloral or closely related alkyl halides were added to anaerobic faecal cultures derived from four methane-producing and three non-methanogenic human subjects. H-2 and CH4 gases were measured. (ii) In vivo animal experiment: chloral hydrate was added to the drinking water of four Wistar rats, and faecal H-2 was compared with that of control rats. Results - Alkyl halides increased H-2 up to 900 times in methanogenic and 10 times in non-methanogenic faecal cultures. The K-i of chloral was 0.2 mM. Methanogenesis was inhibited in concert with the increase in net H-2. In the rat experiment, chloral hydrate increased H-2 10 times but did not cause pneumatosis. Conclusions - Chloral and trichloroethylene are alkyl halides chemically similar to chloroform, a potent inhibitor of H-2 consumption by methanogens and acetogens. These bacteria are the most important H-2-consuming species in the colon. It is postulated that exposure to these alkyl halides increases net H-2 production, which sets the scene for counterperfusion supersaturation and the formation of gas cysts. In recent times, very low prescribing rates for chloral have caused primary pneumatosis cystoides to become extremely rare. As with primary pneumatosis, secondary pneumatosis cystoides, which occurs if there is small bowel bacterial overgrowth distal to a proximally located gut obstruction, is predicted by counterperfusion supersaturation. Inherent unsaturation due to metabolism of O-2 is a safety factor, which could explain why gas bubbles do not form more often in tissue with high H-2 tension.
Abstract:
This review describes the Australian decline in all-cause mortality, 1788-1990, and compares this with declines in Europe and North America. The period until the 1870s shows characteristic 'crisis mortality', attributable to epidemics of infectious disease. A decline in overall mortality is evident from 1880. A precipitous fall occurs in infant mortality from 1900, similar to that in European countries. Infant mortality continues downward during this century (except during the 1930s), with periods of accelerated decline during the 1940s (antibiotics) and early 1970s. Maternal mortality remains high until a precipitous fall in 1937, coinciding with the arrival of sulphonamide. Excess mortality due to the 1919 influenza epidemic is evident. Artefactual falls in mortality occur in 1930, and for men during the war of 1939-1945. Stagnation in overall mortality decline during the 1930s and 1945-1970 is evident for adult males, and during 1960-1970 for adult females. A decline in mortality is registered in both sexes from 1970, particularly in the middle and older age groups, with narrowing of the sex differential. The mortality decline in Australia is broadly similar to those of the United Kingdom and several European countries, although an Australian advantage during the last century and the first part of this century may have been due to less industrialisation, lower population density and better nutrition. Australia shows no war-related interruptions in the mortality decline. Australian mortality patterns from 1970 are also similar to those observed in North America and European countries (including the United Kingdom, but excluding Eastern Europe).
Abstract:
This review describes the changes in the composition of mortality by major attributed cause during the Australian mortality decline this century. The principal categories employed were: infectious diseases, nonrheumatic cardiovascular disease, external causes, cancer, 'other' causes and ill-defined conditions. The data were age-adjusted. Besides registration problems (which also affect all-cause mortality), artefacts due to changes in diagnostic designation and coding are evident. The most obvious trends over the period are the decline in infectious disease mortality (half the decline over 1907-1990 occurs before 1949), and the epidemic of circulatory disease mortality, which appears to commence around 1930, peaks during the 1950s and 1960s, and declines from 1970 to 1990 (to a rate half that at the peak). Mortality from cancer remains static for females after 1907, but increases steadily for males, reaching a plateau in the mid-1980s (owing to trends in lung cancer); trends in cancers of individual sites are diverse. External cause mortality declines after 1970. The decline in total mortality to 1930 is associated with declines in infection and 'other' causes. Stagnation of the mortality decline in 1930-1940 and 1946-1970 for males is a consequence of contemporaneous movements in opposite directions of infection mortality (decrease) and circulatory disease and cancer mortality (increase). In females, declines in infections and 'other' causes of death exceed the increase in circulatory disease mortality until 1960, followed by stability in all major causes of death to 1970. The overall mortality decline since 1970 is a consequence of reductions in circulatory disease, 'other' cause, external cause and infection mortality, despite the increase in cancer mortality (for males).
Abstract:
Coronary heart disease is a leading cause of death in Australia, with the Coalfields district of New South Wales having one of the country's highest rates. Identification of the Coalfields epidemic in the 1970s led to the formation of a community awareness program in the late 1980s (the healthy heart support group), followed by a more intense community action program in 1990, the Coalfields Healthy Heartbeat (CHHB). CHHB is a coalition of community members, local government officers, health workers and university researchers. We evaluate the CHHB program, examining both the nature and sustainability of the heart health activities undertaken, as well as trends in risk factor levels and rates of coronary events in the Coalfields in comparison with nearby local government areas. Process data reveal difficulties mobilising the community as a whole; activities had to be selected for interested subgroups such as families of heart disease patients, school children, retired people and women concerned with family nutrition and body maintenance. Outcome data show a significantly larger reduction in case fatality for Coalfields men (although nonfatal heart attacks did not decline), while changes in risk factor levels were comparable with surrounding areas. We explain positive responses to the CHHB by schools, heart attack survivors and women interested in body maintenance in terms of the meaning these subgroups find in health promotion discourses based on their embodied experiences. When faced with a threat to one's identity, health discourse suddenly becomes meaningful, along with the regimens for health improvement. General public disinterest in heart health promotion is examined in the context of historical patterns of outsiders criticising the lifestyle of miners, an orientation toward communal rather than individual responsibility for health (i.e. community-'owned' emergency services and hospitals) and anger about risks from environmental hazards imposed by industrialists. (C) 1999 Elsevier Science Ltd. All rights reserved.
Abstract:
Spending by aid agencies on emergencies has quadrupled over the last decade, to over US$ 6 billion. To date, cost-effectiveness has seldom been considered in the prioritization and evaluation of emergency interventions. The sheer volume of resources spent on humanitarian aid and the chronicity of many humanitarian interventions call for more attention to be paid to the issue of 'value for money'. In this paper we present data from a major humanitarian crisis, an epidemic of visceral leishmaniasis (VL) in war-torn Sudan. The special circumstances provided us, in retrospect, with unusually accurate data on excess mortality, the costs of the intervention and its effects, thus allowing us to express cost-effectiveness as the cost per Disability Adjusted Life Year (DALY) averted. The cost-effectiveness ratio of US$ 18.40 per DALY (uncertainty range between US$ 13.53 and US$ 27.63) places the treatment of VL in Sudan among health interventions considered 'very good value for money' (interventions of less than US$ 25 per DALY). We discuss the usefulness of this analysis to the internal management of the VL programme, the procurement of funds for the programme and, more generally, priority setting in humanitarian relief interventions. We feel that in evaluations of emergency interventions attempts could be made more often to perform cost-effectiveness analyses, including the use of DALYs, provided that the outcomes of these analyses are seen in the broad context of the emergency situation and its consequences for the affected population. This paper provides a first contribution to what is hoped to become an international database of cost-effectiveness studies of health outcomes such as the DALY.
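As a minimal worked sketch of the ratio this abstract reports, the cost per DALY averted is simply total programme cost divided by DALYs averted. The cost and DALY totals below are hypothetical placeholders chosen only to reproduce the published ratio of about US$ 18.40, not the programme's actual figures.

```python
# Cost-effectiveness as cost per DALY averted; inputs are invented.
def cost_per_daly(total_cost_usd, dalys_averted):
    return total_cost_usd / dalys_averted

ratio = cost_per_daly(total_cost_usd=1_840_000, dalys_averted=100_000)
print(f"US$ {ratio:.2f} per DALY averted")   # US$ 18.40
# Interventions under roughly US$ 25 per DALY fall in the
# 'very good value for money' band the authors cite.
```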
Abstract:
Objective: To determine the effectiveness of twice-weekly directly observed therapy (DOT) for tuberculosis (TB) in HIV-infected and uninfected patients, irrespective of their previous treatment history. Also to determine the predictive value of 2-3 month smears on treatment outcome. Methods: Four hundred and sixteen new and 113 previously treated adults with culture positive pulmonary TB (58% HIV infected, 9% combined drug resistance) in Hlabisa, South Africa. Daily isoniazid (H), rifampicin (R), pyrazinamide (Z) and ethambutol (E) given in hospital (median 17 days), followed by HRZE twice a week to 2 months and HR twice a week to 6 months in the community. Results: Outcomes at 6 months among the 416 new patients were: transferred out 2%; interrupted treatment 17%; completed treatment 3%; failure 2%; and cured 71%. Outcomes were similar among HIV-infected and uninfected patients except for death (6 versus 2%; P = 0.03). Cure was frequent among adherent HIV-infected (97%; 95% CI 94-99%) and uninfected (96%; 95% CI 92-99%) new patients. Outcomes were similar among previously treated and new patients, except for death (11 versus 4%; P = 0.01), and cure among adherent previously treated patients 97% (95% CI 92-99%) was high. Smear results at 2 months did not predict the final outcome. Conclusion: A twice-weekly rifampicin-containing drug regimen given under DOT cures most adherent patients irrespective of HIV status and previous treatment history. The 2 month smear may be safely omitted. Relapse rates need to be determined, and an improved system of keeping treatment interrupters on therapy is needed. Simplified TB treatment may aid implementation of the DOTS strategy in settings with high TB caseloads secondary to the HIV epidemic. (C) 1999 Lippincott Williams & Wilkins.
Abstract:
OBJECTIVE: Although little studied in developing countries, multidrug-resistant tuberculosis (MDR-TB) is considered a major threat. We report the molecular epidemiology, clinical features and outcome of an emerging MDR-TB epidemic. METHODS: In 1996 all tuberculosis suspects in the rural Hlabisa district, South Africa, had sputum cultured, and drug susceptibility patterns of mycobacterial isolates were determined. Isolates with MDR-TB (resistant to both isoniazid and rifampicin) were DNA fingerprinted by restriction fragment length polymorphism (RFLP) using IS6110 and polymorphic guanine-cytosine-rich sequence-based (PGRS) probes. Patients with MDR-TB were traced to determine outcome. Data were compared with results from a survey of drug susceptibility done in 1994. RESULTS: The rate of MDR-TB among smear-positive patients increased six-fold from 0.36% (1/275) in 1994 to 2.3% (13/561) in 1996 (P = 0.04). A further eight smear-negative cases were identified in 1996 from culture, six of whom had not been diagnosed with tuberculosis. MDR disease was clinically suspected in only five of the 21 cases (24%). Prevalence of primary and acquired MDR-TB was 1.8% and 4.1%, respectively. Twelve MDR-TB cases (67%) were in five RFLP-defined clusters. Among 20 traced patients, 10 (50%) had died, five had active disease (25%) and five (25%) were apparently cured. CONCLUSIONS: The rate of MDR-TB has risen rapidly in Hlabisa, apparently due to both reactivation disease and recent transmission. Many patients were not diagnosed with tuberculosis and many were not suspected of drug-resistant disease, and outcome was poor.
Abstract:
Recent El Niño events have stimulated interest in the development of modeling techniques to forecast extremes of climate and related health events. Previous studies have documented associations between specific climate variables (particularly temperature and rainfall) and outbreaks of arboviral disease. In some countries, such diseases are sensitive to El Niño. Here we describe a climate-based model for the prediction of Ross River virus epidemics in Australia. From a literature search and data on case notifications, we determined in which years there were epidemics of Ross River virus in southern Australia between 1928 and 1998. Predictor variables were monthly Southern Oscillation index values for the year of an epidemic or lagged by 1 year. We found that in southeastern states, epidemic years were well predicted by monthly Southern Oscillation index values in January and September in the previous year. The model forecasts that there is a high probability of epidemic Ross River virus in the southern states of Australia in 1999. We conclude that epidemics of arboviral disease can, at least in principle, be predicted on the basis of climate relationships.
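A minimal sketch of a classifier in the spirit described above, using the two predictors the abstract identifies (January and September Southern Oscillation index values of the previous year) to predict epidemic years. The SOI values, labels and the choice of logistic regression are invented placeholders for illustration, not the study's 1928-1998 series or its actual method.

```python
# Toy epidemic-year classifier from lagged SOI predictors; data invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# rows: [SOI_january_lag1, SOI_september_lag1]; labels: 1 = epidemic year
X = np.array([[12.0, 8.0], [-5.0, -3.0], [15.0, 11.0],
              [-8.0, -10.0], [6.0, 14.0], [-12.0, -7.0]])
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)
# probability of an epidemic next season, given this year's SOI values
print(model.predict_proba([[10.0, 9.0]])[0, 1])
```

Whether high or low SOI favors epidemics is determined by the fitted coefficients; with real notification data, the sign and strength of that relationship would come out of the fit rather than being assumed.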