111 results for Epidemic encephalitis.
Abstract:
Background and aims-The colons of patients with pneumatosis cystoides coli produce excessive H-2. Exposure to alkyl halides could explain this. Six consecutive patients who had pneumatosis cystoides coli while taking chloral hydrate (1-5+ g/day) are reported. Patients 2 and 3 were investigated after they had ceased chloral hydrate treatment. One produced methane, the other did not. (Pneumatosis cystoides coli patients are non-methanogenic according to the literature.) Both had overnight fasting breath H-2 of less than 10 ppm. A literature review disclosed just one patient who was using chloral at the time of diagnosed pneumatosis cystoides coli, but an epidemic of the disease in workers exposed to trichloroethylene. Methods-(i) In vitro experiments with human faeces: chloral or closely related alkyl halides were added to anaerobic faecal cultures derived from four methane-producing and three non-methanogenic human subjects. H-2 and CH4 gases were measured. (ii) In vivo animal experiment: chloral hydrate was added to the drinking water of four Wistar rats, and faecal H-2 compared with that of control rats. Results-Alkyl halides increased H-2 up to 900 times in methanogenic and 10 times in non-methanogenic faecal cultures. The K-i of chloral was 0.2 mM. Methanogenesis was inhibited in concert with the increase in net H-2. In the rat experiment, chloral hydrate increased H-2 10 times, but did not cause pneumatosis. Conclusions-Chloral and trichloroethylene are alkyl halides chemically similar to chloroform, a potent inhibitor of H-2 consumption by methanogens and acetogens. These bacteria are the most important H-2-consuming species in the colon. It is postulated that exposure to these alkyl halides increases net H-2 production, which sets the scene for counterperfusion supersaturation and the formation of gas cysts. In recent times, very low prescribing rates for chloral have caused primary pneumatosis cystoides to become extremely rare.
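The reported K-i of 0.2 mM for chloral can be read through standard inhibition kinetics. As a minimal sketch, assuming simple competitive Michaelis-Menten inhibition (the abstract does not state the inhibition mechanism) and hypothetical placeholder values for Vmax, Km and substrate concentration:

```python
def rate(s_mM, i_mM, vmax=1.0, km_mM=1.0, ki_mM=0.2):
    """Michaelis-Menten rate with a competitive inhibitor.

    ki_mM = 0.2 is the K-i reported for chloral; vmax, km_mM and
    s_mM are hypothetical placeholders, not values from the study.
    """
    return vmax * s_mM / (km_mM * (1 + i_mM / ki_mM) + s_mM)

v_uninhibited = rate(s_mM=1.0, i_mM=0.0)  # -> 0.5
v_at_ki = rate(s_mM=1.0, i_mM=0.2)        # inhibitor at K-i doubles apparent Km
print(v_uninhibited, v_at_ki)
```

The point of the sketch is only what a K-i in the low-millimolar range implies: chloral concentrations around 0.2 mM already halve the enzyme's apparent substrate affinity, consistent with strong inhibition of H-2-consuming methanogens at pharmacological doses.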
As with primary pneumatosis, secondary pneumatosis cystoides, which occurs if there is small bowel bacterial overgrowth distal to a proximally located gut obstruction, is predicted by counterperfusion supersaturation. Inherent unsaturation due to metabolism of O-2 is a safety factor, which could explain why gas bubbles do not form more often in tissue with high H-2 tension.
Abstract:
This review describes the Australian decline in all-cause mortality, 1788-1990, and compares this with declines in Europe and North America. The period until the 1870s shows characteristic 'crisis mortality', attributable to epidemics of infectious disease. A decline in overall mortality is evident from 1880. A precipitous fall occurs in infant mortality from 1900, similar to that in European countries. Infant mortality continues downward during this century (except during the 1930s), with periods of accelerated decline during the 1940s (antibiotics) and early 1970s. Maternal mortality remains high until a precipitous fall in 1937 coinciding with the arrival of sulphonamide. Excess mortality due to the 1919 influenza epidemic is evident. Artefactual falls in mortality occur in 1930, and for men during the war of 1939-1945. Stagnation in overall mortality decline during the 1930s and 1945-1970 is evident for adult males, and during 1960-1970 for adult females. A decline in mortality is registered in both sexes from 1970, particularly in middle and older age groups, with narrowing of the sex differential. The mortality decline in Australia is broadly similar to those of the United Kingdom and several European countries, although an Australian advantage during last century and the first part of this century may have been due to less industrialisation, lower population density and better nutrition. Australia shows no war-related interruptions in the mortality decline. Australian mortality patterns from 1970 are also similar to those observed in North America and European countries (including the United Kingdom, but excluding Eastern Europe).
Abstract:
This review describes the changes in composition of mortality by major attributed cause during the Australian mortality decline this century. The principal categories employed were: infectious diseases, nonrheumatic cardiovascular disease, external causes, cancer, 'other' causes and ill-defined conditions. The data were age-adjusted. Besides registration problems (which also affect all-cause mortality), artefacts due to changes in diagnostic designation and coding are evident. The most obvious trends over the period are the decline in infectious disease mortality (half the decline 1907-1990 occurs before 1949), and the epidemic of circulatory disease mortality which appears to commence around 1930, peaks during the 1950s and 1960s, and declines from 1970 to 1990 (to a rate half that at the peak). Mortality for cancer remains static for females after 1907, but increases steadily for males, reaching a plateau in the mid-1980s (owing to trends in lung cancer); trends in cancers of individual sites are diverse. External cause mortality declines after 1970. The decline in total mortality to 1930 is associated with decline in infection and 'other' causes. Stagnation of mortality decline in 1930-1940 and 1946-1970 for males is a consequence of contemporaneous movements in opposite directions of infection mortality (decrease) and circulatory disease and cancer mortality (increase). In females, declines in infections and 'other' causes of death exceed the increase in circulatory disease mortality until 1960, followed by stability in all major causes of death to 1970. The overall mortality decline since 1970 is a consequence of a reduction in circulatory disease, 'other' cause, external cause and infection mortality, despite the increase in cancer mortality (for males).
Abstract:
Coronary heart disease is a leading cause of death in Australia, with the Coalfields district of New South Wales having one of the country's highest rates. Identification of the Coalfields epidemic in the 1970s led to the formation of a community awareness program in the late 1980s (the healthy heart support group) followed by a more intense community action program in 1990, the Coalfields Healthy Heartbeat (CHHB). CHHB is a coalition of community members, local government officers, health workers and University researchers. We evaluate the CHHB program, examining both the nature and sustainability of heart health activities undertaken, as well as trends in risk factor levels and rates of coronary events in the Coalfields in comparison with nearby local government areas. Process data reveal difficulties mobilising the community as a whole; activities had to be selected for interested subgroups such as families of heart disease patients, school children, retired people and women concerned with family nutrition and body maintenance. Outcome data show a significantly larger reduction in case fatality for Coalfields men (although nonfatal heart attacks did not decline), while changes in risk factor levels were comparable with surrounding areas. We explain positive responses to the CHHB by schools, heart attack survivors and women interested in body maintenance in terms of the meaning these subgroups find in health promotion discourses based on their embodied experiences. When faced with a threat to one's identity, health discourse suddenly becomes meaningful, along with the regimens for health improvement. General public disinterest in heart health promotion is examined in the context of historical patterns of outsiders criticising the lifestyle of miners, an orientation toward communal rather than individual responsibility for health (i.e., community 'owned' emergency services and hospitals) and anger about risks from environmental hazards imposed by industrialists.
(C) 1999 Elsevier Science Ltd. All rights reserved.
Abstract:
Spending by aid agencies on emergencies has quadrupled over the last decade, to over US$ 6 billion. To date, cost-effectiveness has seldom been considered in the prioritization and evaluation of emergency interventions. The sheer volume of resources spent on humanitarian aid and the chronicity of many humanitarian interventions call for more attention to be paid to the issue of 'value for money'. In this paper we present data from a major humanitarian crisis, an epidemic of visceral leishmaniasis (VL) in war-torn Sudan. The special circumstances provided us, in retrospect, with unusually accurate data on excess mortality, costs of the intervention and its effects, thus allowing us to express cost-effectiveness as the cost per Disability Adjusted Life Year (DALY) averted. The cost-effectiveness ratio, of US$ 18.40 per DALY (uncertainty range between US$ 13.53 and US$ 27.63), places the treatment of VL in Sudan among health interventions considered 'very good value for money' (interventions of less than US$ 25 per DALY). We discuss the usefulness of this analysis to the internal management of the VL programme, the procurement of funds for the programme, and more generally, to priority setting in humanitarian relief interventions. We feel that in evaluations of emergency interventions attempts could be made more often to perform cost-effectiveness analyses, including the use of DALYs, provided that the outcomes of these analyses are seen in the broad context of the emergency situation and its consequences for the affected population. This paper provides a first contribution to what is hoped to become an international database of cost-effectiveness studies of health outcomes such as the DALY.
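The cost-effectiveness measure used above is a simple ratio. A minimal sketch, where the programme cost and DALYs-averted figures are hypothetical placeholders (only the resulting US$ 18.40 per DALY and the US$ 25 'very good value' threshold come from the abstract):

```python
def cost_per_daly(total_cost_usd, dalys_averted):
    """Cost-effectiveness ratio: US$ spent per DALY averted."""
    return total_cost_usd / dalys_averted

# Hypothetical cost and effect figures chosen to reproduce the ratio
ratio = cost_per_daly(total_cost_usd=1_840_000, dalys_averted=100_000)
print(f"US$ {ratio:.2f} per DALY averted")  # US$ 18.40 per DALY averted

VERY_GOOD_VALUE_THRESHOLD = 25.0  # US$/DALY, per the abstract
print(ratio < VERY_GOOD_VALUE_THRESHOLD)    # True
```

The uncertainty range quoted (US$ 13.53 to 27.63) straddles this threshold at its upper end, which is worth bearing in mind when reading the 'very good value' classification.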
Abstract:
Objective: To determine the effectiveness of twice-weekly directly observed therapy (DOT) for tuberculosis (TB) in HIV-infected and uninfected patients, irrespective of their previous treatment history. Also to determine the predictive value of 2-3 month smears on treatment outcome. Methods: Four hundred and sixteen new and 113 previously treated adults with culture positive pulmonary TB (58% HIV infected, 9% combined drug resistance) in Hlabisa, South Africa. Daily isoniazid (H), rifampicin (R), pyrazinamide (Z) and ethambutol (E) given in hospital (median 17 days), followed by HRZE twice a week to 2 months and HR twice a week to 6 months in the community. Results: Outcomes at 6 months among the 416 new patients were: transferred out 2%; interrupted treatment 17%; completed treatment 3%; failure 2%; and cured 71%. Outcomes were similar among HIV-infected and uninfected patients except for death (6 versus 2%; P = 0.03). Cure was frequent among adherent HIV-infected (97%; 95% CI 94-99%) and uninfected (96%; 95% CI 92-99%) new patients. Outcomes were similar among previously treated and new patients, except for death (11 versus 4%; P = 0.01), and cure among adherent previously treated patients 97% (95% CI 92-99%) was high. Smear results at 2 months did not predict the final outcome. Conclusion: A twice-weekly rifampicin-containing drug regimen given under DOT cures most adherent patients irrespective of HIV status and previous treatment history. The 2 month smear may be safely omitted. Relapse rates need to be determined, and an improved system of keeping treatment interrupters on therapy is needed. Simplified TB treatment may aid implementation of the DOTS strategy in settings with high TB caseloads secondary to the HIV epidemic. (C) 1999 Lippincott Williams & Wilkins.
Abstract:
OBJECTIVE: Although little studied in developing countries, multidrug-resistant tuberculosis (MDR-TB) is considered a major threat. We report the molecular epidemiology, clinical features and outcome of an emerging MDR-TB epidemic. METHODS: In 1996 all tuberculosis suspects in the rural Hlabisa district, South Africa, had sputum cultured, and drug susceptibility patterns of mycobacterial isolates were determined. Isolates with MDR-TB (resistant to both isoniazid and rifampicin) were DNA fingerprinted by restriction fragment length polymorphism (RFLP) using IS6110 and polymorphic guanine-cytosine-rich sequence-based (PGRS) probes. Patients with MDR-TB were traced to determine outcome. Data were compared with results from a survey of drug susceptibility done in 1994. RESULTS: The rate of MDR-TB among smear-positive patients increased six-fold from 0.36% (1/275) in 1994 to 2.3% (13/561) in 1996 (P = 0.04). A further eight smear-negative cases were identified in 1996 from culture, six of whom had not been diagnosed with tuberculosis. MDR disease was clinically suspected in only five of the 21 cases (24%). Prevalence of primary and acquired MDR-TB was 1.8% and 4.1%, respectively. Twelve MDR-TB cases (67%) were in five RFLP-defined clusters. Among 20 traced patients, 10 (50%) had died, five had active disease (25%) and five (25%) were apparently cured. CONCLUSIONS: The rate of MDR-TB has risen rapidly in Hlabisa, apparently due to both reactivation disease and recent transmission. Many patients were not diagnosed with tuberculosis and many were not suspected of drug-resistant disease, and outcome was poor.
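The headline six-fold increase follows directly from the two survey proportions quoted in the abstract (1/275 smear-positive patients in 1994, 13/561 in 1996):

```python
# Rate of MDR-TB among smear-positive patients, from the abstract
r_1994 = 1 / 275    # approximately 0.36%
r_1996 = 13 / 561   # approximately 2.3%

fold_increase = r_1996 / r_1994
print(f"{r_1994:.2%} -> {r_1996:.2%}: {fold_increase:.1f}-fold increase")
```

The exact rate ratio is about 6.4, which the abstract rounds to "six-fold"; the quoted P = 0.04 for the comparison comes from the study itself and is not reproduced here.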
Abstract:
Recent El Nino events have stimulated interest in the development of modeling techniques to forecast extremes of climate and related health events. Previous studies have documented associations between specific climate variables (particularly temperature and rainfall) and outbreaks of arboviral disease. In some countries, such diseases are sensitive to El Nino. Here we describe a climate-based model for the prediction of Ross River virus epidemics in Australia. From a literature search and data on case notifications, we determined in which years there were epidemics of Ross River virus in southern Australia between 1928 and 1998. Predictor variables were monthly Southern Oscillation index values for the year of an epidemic or lagged by 1 year. We found that in southeastern states, epidemic years were well predicted by monthly Southern Oscillation index values in January and September in the previous year. The model forecasts that there is a high probability of epidemic Ross River virus in the southern states of Australia in 1999. We conclude that epidemics of arboviral disease can, at least in principle, be predicted on the basis of climate relationships.
Abstract:
Objective: To measure prevalence and model incidence of HIV infection. Setting: 2013 consecutive pregnant women attending public sector antenatal clinics in 1997 in Hlabisa health district, South Africa. Historical seroprevalence data, 1992-1995. Methods: Serum remaining from syphilis testing was tested anonymously for antibodies to HIV to determine seroprevalence. Two models, allowing for differential mortality between HIV-positive and HIV-negative people, were used. The first used serial seroprevalence data to estimate trends in annual incidence. The second, a maximum likelihood model, took account of changing force of infection and age-dependent risk of infection, to estimate age-specific HIV incidence in 1997. Multiple logistic regression provided adjusted odds ratios (OR) for risk factors for prevalent HIV infection. Results: Estimated annual HIV incidence increased from 4% in 1992/1993 to 10% in 1996/1997. In 1997, highest age-specific incidence was 16% among women aged between 20 and 24 years. In 1997, overall prevalence was 26% (95% confidence interval [CI], 24%-28%) and at 34% was highest among women aged between 20 and 24 years. Young age (<30 years; odds ratio [OR], 2.1; p = .001), unmarried status (OR 2.2; p = .001) and living in less remote parts of the district (OR 1.5; p = .002) were associated with HIV prevalence in univariate analysis. Associations were less strong in multivariate analysis. Partner's migration status was not associated with HIV infection. Substantial heterogeneity of HIV prevalence by clinic was observed (range 17%-31%; test for trend, p = .001). Conclusions: This community is experiencing an explosive HIV epidemic. Young, single women in the more developed parts of the district would form an appropriate cohort to test, and benefit from, interventions such as vaginal microbicides and HIV vaccines.
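Univariate odds ratios like the OR of 2.1 for young age are computed from a 2x2 table of exposure against infection status. A minimal sketch of the method; the cell counts below are hypothetical (chosen only to reproduce an OR of 2.1), not the study's actual data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
    a = exposed, outcome+;   b = exposed, outcome-
    c = unexposed, outcome+; d = unexposed, outcome-
    """
    return (a * d) / (b * c)

# Hypothetical counts: HIV+ / HIV- among women <30 vs >=30 years
or_young_age = odds_ratio(a=300, b=700, c=120, d=588)
print(f"OR = {or_young_age:.1f}")  # OR = 2.1
```

The adjusted ORs in the abstract come instead from multiple logistic regression, which is why the associations weaken in the multivariate analysis: part of each crude association is explained by the other covariates.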
Abstract:
OBJECTIVE: To compare the HIV/AIDS epidemics in Australia and sub-Saharan Africa, to outline reasons for differences, and to consider implications for the Asia and Pacific region. METHODS: Comparison of key indicators of the epidemic in Australia, and Africa viewed largely through the experience of the Hlabisa health district, South Africa. RESULTS: To the end of 1997, for all Australia, the estimated cumulative number of HIV infections was approximately 19,000, whereas in Hlabisa 31,000 infections are estimated to have occurred. Compared with the low and declining incidence of HIV in Australia (<1%), estimated incidence in Hlabisa rose to 10% in 1997. In all, 94% of Australian infections have been amongst men; in Hlabisa equal numbers of males and females are infected. Consequently, whereas 3000 children were perinatally exposed to HIV in Hlabisa in 1998 alone, 160 Australian children have been exposed this way. In Australia, HIV-related disease is characterised by opportunistic infection whereas in Hlabisa tuberculosis and wasting dominate. Surveys among gay men in Sydney and Melbourne indicate >80% of HIV infected people receive antiretroviral therapy whereas in Hlabisa these drugs are not available. IMPLICATIONS: It seems possible that Asia and the Pacific will experience a similar HIV/AIDS epidemic to that in Africa. Levels of HIV are already high in parts of Asia, and social conditions in parts of the region might be considered ripe for the spread of HIV. As Australia strengthens economic and political ties within the region, so should more be done to help Pacific and Asian neighbours to prevent and respond to the HIV epidemic.
Abstract:
Ecological extinction caused by overfishing precedes all other pervasive human disturbance to coastal ecosystems, including pollution, degradation of water quality, and anthropogenic climate change. Historical abundances of large consumer species were fantastically large in comparison with recent observations. Paleoecological, archaeological, and historical data show that time lags of decades to centuries occurred between the onset of overfishing and consequent changes in ecological communities, because unfished species of similar trophic level assumed the ecological roles of overfished species until they too were overfished or died of epidemic diseases related to overcrowding. Retrospective data not only help to clarify underlying causes and rates of ecological change, but they also demonstrate achievable goals for restoration and management of coastal ecosystems that could not even be contemplated based on the limited perspective of recent observations alone.
Abstract:
Canine parasitic zoonoses pose a continuing public health problem, especially in developing countries and communities that are socioeconomically disadvantaged. Our study combined the use of conventional and molecular epidemiological tools to determine the role of dogs in transmission of gastrointestinal (GI) parasites such as hookworms, Giardia and Ascaris in a parasite-endemic tea-growing community in northeast India. A highly sensitive and specific molecular tool was developed to detect and differentiate the zoonotic species of canine hookworm eggs directly from faeces. This allowed epidemiological screening of canine hookworm species in this community to be conducted with ease and accuracy. The zoonotic potential of canine Giardia was also investigated by characterising Giardia duodenalis recovered from humans and dogs living in the same locality and households at three different loci. Phylogenetic and epidemiological analysis provided compelling evidence to support the zoonotic transmission of canine Giardia. Molecular tools were also used to identify the species of Ascaris egg present in over 30% of dog faecal samples. The results demonstrated the role of dogs as a significant disseminator and environmental contaminator of Ascaris lumbricoides in communities where promiscuous defecation practices exist. Our study demonstrated the usefulness of combining conventional and molecular parasitological and epidemiological tools to help solve unresolved relationships with regards to parasitic zoonoses.