943 results for leprosy detection rate
Abstract:
Graduate Program in Animal Science (Zootecnia) - FMVZ
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
We analyze free elementary particles with a rest mass m and total energy E
Abstract:
Background: To establish the best methodology for diagnosis and management of patients with solid and complex renal masses by comparing the costs and benefits of different imaging methods, and to improve the differential diagnosis of these benign and malignant lesions, particularly by investigating tumour calcifications. Methods: We performed a prospective study on 31 patients with solid or complex masses by submitting them to Abdominal Ultrasonography (US), Doppler Ultrasonography of the renal mass (US Dop), Computed Tomography (CT), and Magnetic Resonance Imaging (MRI). Results: We found 28 patients with malignant and three with benign masses. Of the 28 malignant masses, 17 showed calcifications at CT; 16 were central and one was of the pure peripheral curvilinear ("egg shell") type. Excretory Urography (IVP) had a significantly lower detection rate for central calcifications than both US and CT. Benign and malignant masses appeared as described in the literature, with US, CT and MRI showing high sensitivity and specificity in renal tumor diagnosis. The exception was US Dop, for which we obtained lower sensitivity for the characterization of malignant tumor flow. Conclusions: In this series we were surprised to find that CT revealed central calcifications in 51.6% of patients, all with malignant lesions, while the literature reports a frequency of calcification in renal cell carcinoma between 8% and 22% in studies using abdominal films and IVP. This finding is of great importance when we consider that these calcifications occur particularly in malignant neoplasms. As a result of comparing these different imaging methods, we have developed a better methodology for renal tumor investigation.
Abstract:
We estimated the population density of the Helmeted Curassow (Pauxi pauxi) in Tama National Park (TNP), Colombia, using visual counts between December 2006 and December 2008. We used six line transects (1 km each) equitably distributed in natural forest between 800 and 1,200 m asl in the southern part of the park. The sampling effort was 588 hrs over a total distance of 490 km, with a detection rate of 0.06 records/hr and an encounter rate of 0.08 individuals/km. Only solitary individuals were recorded (n = 40); the estimated density was 4.8 individuals/km². Most detections occurred in the lower strata of the forest (floor and sub-canopy), where hunters take advantage of this behavior for successful harvest. The southern sector of TNP becomes important in the dry season. Our study suggests a large population persists in TNP, but harvesting activities, including removal of eggs, chicks, and juveniles as well as hunting of adults, are affecting the reproductive rate and population of the species. Received 6 June 2011. Accepted 2 February 2012.
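As a back-of-envelope check on the survey arithmetic above, a minimal sketch (illustrative only, not the authors' code) reproduces the encounter rate from the reported totals; note that the density figure additionally requires an estimated effective strip width, which the abstract does not give.

```python
# Line-transect encounter-rate arithmetic using the totals reported
# in the abstract above (40 individuals over 490 km of transects).
individuals = 40
distance_km = 490.0
effort_hours = 588.0

encounter_rate = individuals / distance_km  # individuals per km walked
print(round(encounter_rate, 2))  # 0.08, matching the abstract
```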
Abstract:
Abstract Background: Direct smear examination with Ziehl-Neelsen (ZN) staining for the diagnosis of pulmonary tuberculosis (PTB) is cheap and easy to use, but its low sensitivity is a major drawback, particularly in HIV-seropositive patients. As such, new tools for laboratory diagnosis are urgently needed to improve the case detection rate, especially in regions with a high prevalence of TB and HIV. Objective: To evaluate the performance of two in-house PCR (polymerase chain reaction) assays, a PCR dot-blot methodology (PCR dot-blot) and PCR agarose gel electrophoresis (PCR-AG), for the diagnosis of PTB in HIV-seropositive and HIV-seronegative patients. Methods: A prospective study was conducted (from May 2003 to May 2004) in a TB/HIV reference hospital. Sputum specimens from 277 PTB suspects were tested by Acid-Fast Bacilli (AFB) smear, culture and the in-house PCR assays (PCR dot-blot and PCR-AG), and their performances were evaluated. Positive cultures combined with the definition of clinical pulmonary TB were employed as the gold standard. Results: The overall prevalence of PTB was 46% (128/277); among HIV-seropositive patients, prevalence was 54.0% (40/74). The sensitivity and specificity of PCR dot-blot were 74% (95% CI: 66.1%-81.2%) and 85% (95% CI: 78.8%-90.3%), and of PCR-AG were 43% (95% CI: 34.5%-51.6%) and 76% (95% CI: 69.2%-82.8%), respectively. For HIV-seropositive and HIV-seronegative samples, the sensitivities of PCR dot-blot (72% vs 75%; p = 0.46) and PCR-AG (42% vs 43%; p = 0.54) were similar. Among HIV-seronegative patients and PTB suspects, ROC analysis gave the following areas under the curve: AFB smear 0.837, culture 0.926, PCR dot-blot 0.801 and PCR-AG 0.599. In HIV-seropositive patients, these area values were 0.713, 0.900, 0.789 and 0.595, respectively. Conclusion: The results of this study demonstrate that the in-house PCR dot-blot may be an improvement for ruling out a PTB diagnosis in PTB suspects assisted at hospitals with a high prevalence of TB/HIV.
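The confidence intervals quoted above are standard binomial intervals for proportions; a minimal sketch (assuming a normal-approximation interval and the reported denominator of 128 confirmed PTB cases, with ~95 dot-blot positives implied by the 74% sensitivity) shows the calculation. The abstract's slightly tighter bounds (66.1%-81.2%) suggest the authors may have used an exact or Wilson interval instead.

```python
import math

def binomial_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# Sensitivity of PCR dot-blot: ~95 of 128 confirmed PTB cases positive
# (95/128 ~ 74%, consistent with the abstract).
sens, lo, hi = binomial_ci(95, 128)
print(f"{sens:.0%} (95% CI: {lo:.1%}-{hi:.1%})")  # ~74% (66.6%-81.8%)
```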
Abstract:
Introduction: Entamoeba histolytica infections were investigated in residents of the Ariquemes and Monte Negro municipalities in Rondônia State, Brazil. Methods: Stool samples from 216 individuals were processed by the spontaneous sedimentation method and analyzed by microscopy for detection of the E. histolytica/E. dispar complex, followed by an immunoassay using an enzyme-linked immunosorbent assay-based kit for the E. histolytica stool antigen. Results: E. histolytica/E. dispar cysts were present in 61% (50/82) and 44% (59/134) of the samples from Ariquemes and Monte Negro, respectively, a significant difference in the occurrence of infection between the two populations [p < 0.05; χ² = 5.2; odds ratio = 2.0 (1.1-3.6)]. The E. histolytica antigen detection rate was 36.6% (30/82) for stool samples from Ariquemes and 19.4% (26/134) for stool samples from the residents of Monte Negro. The rate of occurrence of amoebiasis was significantly higher in the population from Ariquemes [p < 0.05; χ² = 7.8; odds ratio = 2.4 (1.2-4.7)]. Discussion: Due to the high occurrence of E. histolytica infection diagnosed in the region and the unavailability in local clinics of a test to distinguish between the two Entamoeba species, physicians should consider treating E. histolytica/E. dispar infections. Conclusion: The results indicate that E. histolytica infection is highly endemic in the studied areas.
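The χ² and odds ratio quoted for the cyst findings follow from a 2×2 table of cyst-positive vs cyst-negative samples in the two municipalities (50/82 in Ariquemes, 59/134 in Monte Negro). A minimal sketch, assuming a Yates-corrected χ² (which reproduces the reported 5.2), is:

```python
def yates_chi2_and_or(a, b, c, d):
    """Yates-corrected chi-square and odds ratio for the 2x2 table
    [[a, b], [c, d]] (rows: groups; columns: positive/negative)."""
    n = a + b + c + d
    chi2 = n * (abs(a * d - b * c) - n / 2) ** 2 / (
        (a + b) * (c + d) * (a + c) * (b + d)
    )
    odds_ratio = (a * d) / (b * c)
    return chi2, odds_ratio

# Ariquemes: 50 cyst-positive, 32 negative; Monte Negro: 59 positive, 75 negative.
chi2, odds = yates_chi2_and_or(50, 32, 59, 75)
print(round(chi2, 1), round(odds, 1))  # 5.2 2.0, matching the abstract
```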
Abstract:
The 9p21 locus encodes important proteins involved in cell cycle regulation and apoptosis, containing the p16/CDKN2A (cyclin-dependent kinase inhibitor 2a) tumor suppressor gene and two other related genes, p14/ARF and p15/CDKN2B. This locus is a major target of inactivation in the pathogenesis of a number of human tumors, both solid and haematologic, and is also a frequent site of loss or deletion in acute lymphoblastic leukemia (ALL), with reported frequencies ranging from 18% to 45% [1]. To explore, at high resolution, the frequency and size of alterations affecting this locus in adult BCR-ABL1-positive ALL and to investigate their prognostic value, 112 patients (101 de novo and 11 relapse cases) were analyzed by genome-wide single nucleotide polymorphism arrays and candidate-gene deep exon sequencing. Paired diagnosis-relapse samples were further available and analyzed for 19 (19%) cases. CDKN2A/ARF and CDKN2B genomic alterations were identified in 29% and 25% of newly diagnosed patients, respectively. Deletions were monoallelic in 72% of cases, and in 43% the minimal overlapping region of the lost area spanned only the CDKN2A/2B gene locus. Analysis at the time of relapse showed a borderline-significant increase in the detection rate of CDKN2A/ARF loss (47%) compared to diagnosis (p = 0.06). Point mutations within the 9p21 locus were found at a very low level, with only one non-synonymous substitution in exon 2 of CDKN2A. Finally, correlation with clinical outcome showed that deletions of CDKN2A/B are significantly associated with poor outcome in terms of overall survival (p = 0.0206), disease-free survival (p = 0.0010) and cumulative incidence of relapse (p = 0.0014). Inactivation of the 9p21 locus by genomic deletions is a frequent event in BCR-ABL1-positive ALL. Deletions are frequently acquired at leukemia progression and act as a poor prognostic marker.
Abstract:
Background: Systematic biopsy is considered the gold standard for detecting prostate cancer, although a relevant proportion of prostate cancers goes undiagnosed. With this study, we sought to answer whether elastography-guided biopsies can improve prostate cancer detection compared with the gold-standard systematic biopsy. Materials and methods: In a prospective study, 152 patients underwent 12-core systematic prostate biopsy. With the patient in the left lateral position, one lateral and one medial core were taken from each of six predefined prostate segments. Elastographically suspicious areas were additionally biopsied in a targeted fashion. p<0.05 was considered statistically significant. Results: Prostate cancer was diagnosed in 62 of the 152 patients (40.8%). The detection rate of systematic biopsy was 39.5% (60/152), and that of elastography 29.6% (45/152). Systematic biopsy was thus significantly superior to elastography-guided biopsy (p=0.039). However, the probability of detecting a cancer focus with a single prostate core was 3.7 times higher for the elastography-guided biopsies than for the systematic biopsies. The sensitivity of elastography was 72.6% and the specificity 66.6%. The positive predictive value for elastography was 60%, the negative predictive value 78%. The combination of systematic biopsy and elastography-guided biopsies offered the highest detection rate. In the right half of the prostate (48%) we recorded twice as many elastographically false-positive findings as in the left half (25%). Furthermore, false-positive findings were most frequent in the prostate apex (46%) and least frequent in the prostate base (29%). Conclusion: In our study, elastography-guided biopsy was significantly inferior to systematic biopsy (p=0.039).
The combination of systematic biopsy with elastography-guided biopsies offered the highest detection rate and can therefore be recommended. The anomalies in the segment-based analysis and a possible influence of patient positioning must be examined in further studies.
Abstract:
OBJECTIVES: The aim of this study was to compare the utility of susceptibility weighted imaging (SWI) with the established diagnostic techniques CT and fluid-attenuated inversion recovery (FLAIR) in their capacity to detect subarachnoid hemorrhage (SAH), and further to compare the combined SWI/FLAIR MRI data with CT to evaluate whether MRI is more accurate than CT. METHODS: Twenty-five patients with acute SAH underwent CT and MRI within 6 days after symptom onset. The underlying pathology for SAH was head trauma (n=9), ruptured aneurysm (n=6), ruptured arteriovenous malformation (n=2), and spontaneous bleeding (n=8). SWI, FLAIR, and CT data were analyzed. The anatomical distribution of SAH was subdivided into 8 subarachnoid regions: three peripheral cisterns (frontal-parietal, temporal-occipital, sylvian), two central cisterns and spaces (interhemispheric, intraventricular), and the perimesencephalic, posterior fossa, and superior cerebellar cisterns. RESULTS: SAH was detected in a total of 146 subarachnoid regions. CT identified 110 (75.3%), FLAIR 127 (87%), and SWI 129 (88.4%) involved regions. Combined FLAIR and SWI identified all 146 detectable regions (100%). FLAIR was sensitive for frontal-parietal, temporal-occipital and sylvian cistern SAH, while SWI was particularly sensitive for interhemispheric and intraventricular hemorrhage. CONCLUSIONS: By combining SWI and FLAIR, MRI yields a distinctly higher detection rate for SAH than CT alone, particularly due to their complementary detection characteristics in different anatomical regions. The detection strength of SWI is high in central areas, whereas FLAIR shows a better detection rate in peripheral areas.
Abstract:
PURPOSE: The low diagnostic yield of vitrectomy specimen analysis in chronic idiopathic uveitis (CIU) has been attributed, in older studies, to the complex nature of the underlying disease and to methodologic and tissue-immanent factors. In an attempt to evaluate the impact of recently acquired analytic methods, the authors assessed the current diagnostic yield in CIU. METHODS: Retrospective analysis of consecutive vitrectomy specimens from patients with chronic endogenous uveitis (n = 56) in whom extensive systemic workup had not revealed a specific diagnosis (idiopathic) and medical treatment had not resulted in a satisfying clinical situation. Patients with acute postoperative endophthalmitis served as a basis for methodologic comparison (Group 2; n = 21). RESULTS: In CIU, a specific diagnosis was provided in 17.9% and a specific diagnosis was excluded in 21.4%. In 60.7%, the laboratory investigations were inconclusive. In postoperative endophthalmitis, microbiological culture established the infectious agent in 47.6%. In six of eight randomly selected cases, eubacterial PCR identified bacterial DNA, confirming the culture results in three, remaining negative in two with a positive culture, and being positive in three no-growth specimens. A double-negative result never occurred, suggesting a very high detection rate when both tests were applied. CONCLUSIONS: The diagnostic yield of vitrectomy specimen analysis has not been improved in recent years by currently routinely applied methods, in contrast to the significantly improved sensitivity of combined standardized culture and PCR analysis in endophthalmitis. Consequently, the low diagnostic yield in CIU has to be attributed to insufficient understanding of the underlying pathophysiologic mechanisms.
Abstract:
This article is a systematic review of whether everyday exposure to radiofrequency electromagnetic fields (RF-EMF) causes symptoms, and whether some individuals are able to detect low-level RF-EMF (below the ICNIRP [International Commission on Non-Ionizing Radiation Protection] guidelines). Peer-reviewed articles published before August 2007 were identified by means of a systematic literature search. Meta-analytic techniques were used to pool the results from studies investigating the ability to discriminate active from sham RF-EMF exposure. RF-EMF discrimination was investigated in seven studies including a total of 182 self-declared electromagnetic hypersensitive (EHS) individuals and 332 non-EHS individuals. The pooled correct field detection rate was 4.2% better than expected by chance (95% CI: -2.1 to 10.5). There was no evidence that EHS individuals could detect the presence or absence of RF-EMF better than other persons. There was little evidence that short-term exposure to a mobile phone or base station causes symptoms, based on the results of eight randomized trials investigating 194 EHS and 346 non-EHS individuals in a laboratory. Some of the trials provided evidence for the occurrence of nocebo effects. In population-based studies, an association between symptoms and exposure to RF-EMF in the everyday environment was repeatedly observed. This review showed that the large majority of individuals who claim to be able to detect low-level RF-EMF are not able to do so under double-blind conditions. If such individuals exist, they represent a small minority and have not been identified yet. The available observational studies do not allow differentiation between biophysical effects of EMF and nocebo effects.
Abstract:
PURPOSE: We sought to identify causative nongenetic and genetic risk factors for the bladder exstrophy-epispadias complex. MATERIALS AND METHODS: A total of 237 families with the bladder exstrophy-epispadias complex were invited to participate in the study, and information was obtained from 214 families, mainly from European countries. RESULTS: Two families showed familial occurrence. Male predominance was found among all subgroups comprising epispadias, classic bladder exstrophy and cloacal exstrophy, with male-to-female ratios of 1.4:1, 2.8:1 and 2.0:1, respectively (p = 0.001). No association with parental age, maternal reproductive history or periconceptional maternal exposure to alcohol, drugs, chemical noxae, radiation or infections was found. However, periconceptional maternal exposure to smoking was significantly more common in patients with cloacal exstrophy than in the combined group of patients with epispadias/classic bladder exstrophy (p = 0.009). Only 16.8% of mothers followed the current recommendations of periconceptional folic acid supplementation, and 17.6% had started supplementation before 10 weeks of gestation. Interestingly, in the latter group mothers of patients with cloacal exstrophy were more compliant with folic acid supplementation than were mothers of the combined group of patients with epispadias/classic bladder exstrophy (p = 0.037). Furthermore, mothers of children with cloacal exstrophy knew significantly more often prenatally that their child would have a congenital malformation than did mothers of children with epispadias/classic bladder exstrophy (p <0.0001). CONCLUSIONS: Our study corroborates the hypothesis that epispadias, classic bladder exstrophy and cloacal exstrophy are causally related, representing a spectrum of the same developmental defect, with a small risk of recurrence within families. 
Embryonic exposure to maternal smoking appears to increase the severity, whereas periconceptional folic acid supplementation does not seem to alleviate it. There is a disproportionate prenatal ultrasound detection rate between severe and mild phenotypes, possibly because imaging of full bladders is neglected in favor of a focus on neural tube defects.
Abstract:
BACKGROUND: Newborns with hypoplastic left heart syndrome (HLHS) or right heart syndrome or other malformations with a single-ventricle physiology and associated hypoplasia of the great arteries continue to be a challenge in terms of survival. The vast majority of these forms of congenital heart defects arise from abnormal morphogenesis during early intrauterine development and can be diagnosed accurately by fetal echocardiography. Early knowledge of these conditions not only permits a better understanding of the progression of these malformations but also encourages some researchers to explore new minimally invasive therapeutic options with a view to early pre- and postnatal cardiac palliation. DATA SOURCES: The PubMed database was searched with the terms "congenital heart defects", "fetal echocardiography" and "neonatal cardiac surgery". RESULTS: At present, early prenatal detection is applied to monitor pregnancy and avoid intrauterine cardiac decompensation. In principle, the majority of congenital heart defects can be diagnosed by prenatal echocardiography, and the detection rate is 85%-95% at tertiary perinatal centers. The majority, particularly of complex congenital lesions, show a steadily progressive course, including subsequent secondary phenomena such as arrhythmias or myocardial insufficiency. Prenatal treatment of an abnormal fetus is therefore an area of perinatal medicine that is undergoing very dynamic development. Early postnatal treatment has been established for some time, whereas prenatal intervention or palliation is at best at an experimental stage in individual cases. CONCLUSION: The upcoming expansion of fetal cardiac intervention to ameliorate critically progressive fetal lesions intensifies the need to address issues about the adequacy of technological assessment and patient selection, as well as the morbidity of those who undergo these procedures.
Abstract:
The aim of the present study was to evaluate the potential of diagnostic tests based on interferon-gamma inducible protein (IP)-10 and monocyte chemotactic protein (MCP)-2, and to compare their performance with the QuantiFERON TB Gold In-Tube test (QFT-IT; Cellestis, Carnegie, Australia). IP-10 and MCP-2 were determined in supernatants from whole blood stimulated with Mycobacterium tuberculosis-specific antigens. Samples were obtained from 80 patients with culture- and/or PCR-proven tuberculosis (TB), and 124 unexposed healthy controls: 86 high school students and 38 high school staff. IP-10 and MCP-2 test cut-offs were established based on receiver operating characteristic curve analysis. TB patients produced significantly higher levels (median) of IP-10 (2,158 pg/mL) and MCP-2 (379 pg/mL) compared with interferon (IFN)-gamma (215 pg/mL). The QFT-IT, IP-10 and MCP-2 tests detected 81%, 83% and 71% of the TB patients; 0%, 3% and 0% of the high school students; and 0%, 16% and 3% of the staff, respectively. Agreement between tests was high (>89%). By combining the IP-10 and IFN-gamma tests, the detection rate among TB patients increased to 90% without a significant increase in positive responders among the students. In conclusion, IP-10 and MCP-2 responses to Mycobacterium tuberculosis-specific antigens could be used to diagnose infection. Combining IP-10 and IFN-gamma may be a simple approach to increase the detection rate of Mycobacterium tuberculosis-specific in vitro tests.
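The combined IP-10/IFN-gamma figure above is consistent with an either-positive decision rule (a patient counts as detected if at least one test is positive); the abstract does not state the rule explicitly, so this is an assumption. A toy sketch with hypothetical per-patient results (not the study's data) illustrates it:

```python
# Hypothetical test results per patient (illustrative only; the study's
# individual-level data are not given in the abstract).
patients = [
    {"ip10": True,  "ifn_gamma": True},
    {"ip10": True,  "ifn_gamma": False},
    {"ip10": False, "ifn_gamma": True},
    {"ip10": False, "ifn_gamma": False},
]

# Either-positive rule: detected if IP-10 OR IFN-gamma is positive.
combined_positive = [p["ip10"] or p["ifn_gamma"] for p in patients]
detection_rate = sum(combined_positive) / len(patients)
print(detection_rate)  # 0.75 in this toy example
```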