91 results for Control of Chagas disease
at University of Queensland eSpace - Australia
Abstract:
The objective of this study is to determine if a Chagas disease protocol starting with a serological survey is as reliable at identifying insect-infested areas as one using the gold standard entomological survey. The study found that health center areas infested with Rhodnius prolixus were identified using a threshold seroprevalence of 0.1%. The serological survey took half the time and was 30% less expensive than the entomological survey. Developing countries with limited resources may find this strategy useful in combating Chagas disease. This strategy also identifies seropositive children, which facilitates their treatment.
Abstract:
The possibility of controlling vector-borne disease through the development and release of transgenic insect vectors has recently gained popular support and is being actively pursued by a number of research laboratories around the world. Several technical problems must be solved before such a strategy could be implemented: genes encoding refractory traits (traits that render the insect unable to transmit the pathogen) must be identified, a transformation system for important vector species has to be developed, and a strategy to spread the refractory trait into natural vector populations must be designed. Recent advances in this field of research make it seem likely that this technology will be available in the near future. In this paper we review recent progress in this area as well as argue that care should be taken in selecting the most appropriate disease system with which to first attempt this form of intervention. Much attention is currently being given to the application of this technology to the control of malaria, transmitted by Anopheles gambiae in Africa. While malaria is undoubtedly the most important vector-borne disease in the world and its control should remain an important goal, we maintain that the complex epidemiology of malaria together with the intense transmission rates in Africa may make it unsuitable for the first application of this technology. Diseases such as African trypanosomiasis, transmitted by the tsetse fly, or unstable malaria in India may provide more appropriate initial targets to evaluate the potential of this form of intervention.
Abstract:
Objective: To estimate the number of coronary events that could be prevented in Australia each year by the use of preventive and therapeutic strategies targeted to subgroups of the population based on their levels of risk and need. Methods: Estimates of risk reduction from the published literature, prevalence estimates of elevated risk factor levels from the 1995 National Health Survey and treatment levels from the Australian collaborating centres in the World Health Organization's MONICA Project were used to calculate numbers of coronary events preventable among men and women aged 35-79 years in Australia. Results: Approximately 14,000 coronary events could be avoided each year if the mean level of cholesterol in the population was reduced by 0.5 mmol/L, smoking prevalence was halved and prevalence of physical inactivity was reduced to 25%. This represents a reduction in coronary events of about 40%. Even with less optimistic targets, a reduction of 20% could be attained, while the achievement of some internationally recommended targets could lead to almost 50% reduction. In the short term, aggressive medical treatment of people with elevated levels of risk factors and established coronary disease offers the greatest opportunity for reducing coronary events. Conclusion: A comprehensive approach to reduce levels of behavioural and biological risk factors and improve the use of effective treatment could lead to a large reduction in coronary event rates. In the long term, primary prevention - especially to reduce smoking, lower cholesterol levels and increase exercise - has the potential to reduce the population levels of risk and hence contain the national cost of coronary disease.
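The quoted figures imply a baseline event count that the abstract does not state directly. A minimal arithmetic sketch (the 14,000 and the percentage reductions are quoted from the abstract; the derived baseline is an inference, not study data):

```python
# Rough arithmetic check of the figures quoted in the abstract above.
events_avoided = 14_000    # coronary events preventable per year (quoted)
reduction_fraction = 0.40  # "a reduction in coronary events of about 40%" (quoted)

# Implied annual baseline number of coronary events (derived, not stated):
baseline_events = events_avoided / reduction_fraction
print(round(baseline_events))  # 35000

# The "less optimistic" 20% scenario on the same implied baseline:
print(round(baseline_events * 0.20))  # 7000
```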
Abstract:
There is now considerable evidence to suggest that non-demented people with Parkinson's disease (PD) experience difficulties using the morphosyntactic aspects of language. It remains unclear, however, at precisely which point in the processing of morphosyntax, these difficulties emerge. The major objective of the present study was to examine the impact of PD on the processes involved in accessing morphosyntactic information in the lexicon. Nineteen people with PD and 19 matched control subjects participated in the study which employed on-line word recognition tasks to examine morphosyntactic priming for local grammatical dependencies that occur both within (e.g. is going) and across (e.g. she gives) phrasal boundaries (Experiments 1 and 2, respectively). The control group evidenced robust morphosyntactic priming effects that were consistent with the involvement of both pre- (Experiment 1) and post-lexical (Experiment 2) processing routines. Whilst the participants with PD also recorded priming for dependencies within phrasal boundaries (Experiment 1), priming effects were observed over an abnormally brief time course. Further, in contrast to the controls, the PD group failed to record morphosyntactic priming for constructions that crossed phrasal boundaries (Experiment 2). The results demonstrate that attentionally mediated mechanisms operating at both the pre- and post-lexical stages of processing are able to contribute to morphosyntactic priming effects. In addition, the findings support the notion that, whilst people with PD are able to access morphosyntactic information in a normal manner, the time frame in which this information remains available for processing is altered. Deficits may also be experienced at the post-lexical integrational stage of processing.
Abstract:
Age is a critical determinant of the ability of most arthropod vectors to transmit a range of human pathogens. This is due to the fact that most pathogens require a period of extrinsic incubation in the arthropod host before pathogen transmission can occur. This developmental period for the pathogen often comprises a significant proportion of the expected lifespan of the vector. As such, only a small proportion of the population that is oldest contributes to pathogen transmission. Given this, strategies that target vector age would be expected to obtain the most significant reductions in the capacity of a vector population to transmit disease. The recent identification of biological agents that shorten vector lifespan, such as Wolbachia, entomopathogenic fungi and densoviruses, offer new tools for the control of vector-borne diseases. Evaluation of the efficacy of these strategies under field conditions will be possible due to recent advances in insect age-grading techniques. Implementation of all of these strategies will require extensive field evaluation and consideration of the selective pressures that reductions in vector longevity may induce on both vector and pathogen.
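The abstract's claim that only the oldest vectors transmit can be illustrated with the standard survivorship term from vectorial-capacity models: with daily survival probability p, the fraction of vectors surviving an extrinsic incubation period of n days is p**n. A sketch with hypothetical parameter values (not figures from the paper):

```python
# Survivorship term from standard vectorial-capacity models (illustrative):
# the fraction of vectors living through the extrinsic incubation period.
p = 0.90  # assumed daily survival probability (hypothetical)
n = 10    # assumed extrinsic incubation period in days (hypothetical)

surviving_fraction = p ** n
print(f"{surviving_fraction:.2f}")  # 0.35 -- only about a third live long enough

# Shortening vector lifespan (lowering p) cuts this fraction sharply,
# which is why life-shortening agents such as Wolbachia are attractive:
print(f"{0.80 ** n:.2f}")  # 0.11
```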
Abstract:
The extensive antigenic variation phenomena African trypanosomes display in their mammalian host have hampered efforts to develop effective vaccines against trypanosomiasis. Human disease management aims largely to treat infected hosts by chemotherapy, whereas control of animal diseases relies on reducing tsetse populations as well as on drug therapy. The control strategies for animal diseases are carried out and financed by livestock owners, who have an obvious economic incentive. Sustaining largely insecticide-based control at a local level and relying on drugs for treatment of infected hosts for a disease for which there is no evidence of acquired immunity could prove extremely costly in the long run. It is more likely that a combination of several methods in an integrated, phased and area-wide approach would be more effective in controlling these diseases and subsequently improving agricultural output. New approaches that are environmentally acceptable, efficacious and affordable are clearly desirable for control of various medically and agriculturally important insects including tsetse. Here, Serap Aksoy and colleagues discuss molecular genetic approaches to modulate tsetse vector competence.
Abstract:
Blood disease of banana is confirmed for the first time from Irian Jaya, Indonesia, using the polymerase chain reaction.
Abstract:
Background. Increased life expectancy in men during the last thirty years is largely due to the decrease in mortality from cardiovascular disease in the age group 29-69 yr. This change has resulted in a change in the disease profile of the population, with conditions such as aneurysm of the abdominal aorta (AAA) becoming more prevalent. The advent of endoluminal treatment for AAA has encouraged prophylactic intervention and fuelled the argument to screen for the disease. The feasibility of inserting an endoluminal graft is dependent on the morphology and growth characteristics of the aneurysm. This study used data from a randomized controlled trial of ultrasound screening for AAA in men aged 65-83 yr in Western Australia for the purpose of determining the norms of the living anatomy in the pressurized infrarenal aorta. Aims. To examine (1) the diameters of the infra-renal aorta in aneurysmal and non-aneurysmal cases, (2) the implications for treatment modalities, with particular reference to endoluminal grafting, which is most dependent on normal and aneurysmal morphology, and (3) any evidence to support the notion that northern Europeans are predisposed to aneurysmal disease. Methods. Using ultrasound, a randomized controlled trial was established in Western Australia to assess the value of a screening program in males aged 65-83 yr. The infra-renal aorta was defined as aneurysmal if the maximum diameter was 30 mm or more. Aortic diameter was modelled both as a continuous (in mm) and as a binary outcome variable, for those men who had an infra-renal diameter of 30 mm or more. ANOVA and linear regression were used for modelling aortic diameter as a continuum, while chi-square analysis and logistic regression were used in comparing men with and without the diagnosis of AAA. Findings. By December 1998, 19,583 men had been invited to undergo ultrasound screening for AAA; 12,203 accepted the invitation (corrected response fraction 70.8%).
The prevalence of AAA increased with age from 4.8% at 65 yr to 10.8% at 80 yr (χ² = 77.9, df = 3, P<0.001). The median (IQR) diameter for the non-aneurysmal group was 21.4 mm (3.3 mm) and there was an increase (χ² = 76.0, df = 1, P<0.001) in the diameter of the infra-renal aorta with age. Since 27 mm is the 95th centile for the non-aneurysmal infra-renal aorta, a diameter of 30 mm or more is justified as defining an aneurysm. The risk of AAA was higher in men of Australian (OR = 1.0) and northern European origin (OR = 1.0, 95% CL: 0.9, 1.2) compared with those of Mediterranean origin (OR = 0.5, 99% CL: 0.4, 0.7). Conclusion. Although screening has not yet been shown to reduce mortality from AAA, these population-based data assist the understanding of aneurysmal disease and the further development and use of endoluminal grafts for this condition. (C) 2001 Published by Elsevier Science Ltd on behalf of The International Society for Cardiovascular Surgery.
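The abstract's binary case definition (maximum infra-renal diameter of 30 mm or more, chosen because it exceeds the 27 mm 95th centile of the non-aneurysmal distribution) can be sketched as follows; the example measurements are hypothetical, not study data:

```python
# Minimal sketch of the binary case definition used in the abstract above.
AAA_THRESHOLD_MM = 30.0  # chosen above the 95th centile (27 mm) of normals

def is_aaa(infrarenal_diameter_mm: float) -> bool:
    """Binary outcome variable used in the chi-square/logistic analyses."""
    return infrarenal_diameter_mm >= AAA_THRESHOLD_MM

# Hypothetical ultrasound measurements in mm (illustrative only):
scans = [21.4, 24.0, 27.0, 30.0, 33.5, 22.1]
cases = [d for d in scans if is_aaa(d)]
prevalence = len(cases) / len(scans)
print(cases)  # [30.0, 33.5]
print(f"{prevalence:.2f}")  # 0.33
```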