142 results for "Probability of detection"
Abstract:
OBJECTIVE: To assess the incidence of problems requiring reprogramming of atrioventricular pacemakers during long-term follow-up, and the causes of this procedure. METHODS: During the period from May '98 to December '99, 657 patients were retrospectively studied. An actuarial curve for the event "reprogramming of the stimulation mode" was drawn. RESULTS: The follow-up period ranged from 12 to 178 months (mean = 81 months). Eighty-two (12.4%) patients underwent reprogramming of the stimulation mode as follows: 63 (9.5%) changed to VVI,(R/C); 10 (1.5%) changed to DVI,C; 6 (0.9%) changed to VDD,C; and 3 (0.5%) changed to DOO. The causes of reprogramming were as follows: arrhythmia conducted by the pacemaker in 39 (37.6%) patients; loss of atrial sensing or capture, or both, in 39 (38.6%) patients; and microfracture of the atrial electrode in 5 (4.9%) patients. The probability of remaining free from stimulation-mode reprogramming after 15 years was 58%. CONCLUSION: In long-term follow-up, the atrioventricular pacemaker showed a low incidence of complications and a high probability of remaining in the DDD,C mode; the most common cause of reprogramming was arrhythmia conducted by the pacemaker.
Abstract:
OBJECTIVE: To analyze the incidence of intraventricular and atrioventricular conduction defects associated with acute myocardial infarction, and the resulting in-hospital mortality, during the era of thrombolytic therapy. METHODS: Observational study of a cohort of 929 consecutive patients with acute myocardial infarction. Multivariate analysis was performed by logistic regression. RESULTS: Logistic regression showed a greater incidence of bundle branch block in male sex (odds ratio = 1.87, 95% CI = 1.02-3.42), age over 70 years (odds ratio = 2.31, 95% CI = 1.68-5.00), and anterior localization of the infarction (odds ratio = 1.93, 95% CI = 1.03-3.65). There was a greater incidence of complete atrioventricular block in inferior infarcts (odds ratio = 2.59, 95% CI = 1.30-5.18) and in the presence of cardiogenic shock (odds ratio = 3.90, 95% CI = 1.43-10.65). Use of a thrombolytic agent was associated with a tendency toward a lower occurrence of bundle branch block (odds ratio = 0.68) and a greater occurrence of complete atrioventricular block (odds ratio = 1.44). The presence of bundle branch block (odds ratio = 2.45, 95% CI = 1.14-5.28) and of complete atrioventricular block (odds ratio = 13.59, 95% CI = 5.43-33.98) was associated with a high and independent probability of in-hospital death. CONCLUSION: During the current era of thrombolytic therapy and in this population, intraventricular disturbances of electrical conduction and complete atrioventricular block were associated with a high and independent risk of in-hospital death during acute myocardial infarction.
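For readers unfamiliar with the measures reported above, a minimal sketch of how an odds ratio and its 95% confidence interval can be computed from a 2×2 table (Woolf's method) is shown below; the counts are purely illustrative and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts, not taken from the study
print(odds_ratio_ci(40, 400, 25, 464))
```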
Abstract:
OBJECTIVE: To determine the characteristics associated with dropout among patients followed up in a Brazilian outpatient clinic specializing in hypertension. METHODS: Planned prospective cohort study of patients who were prescribed antihypertensive treatment after an extensive initial evaluation. The following parameters were analyzed: sex, age, educational level, duration of disease, pressure level used for classifying the patient, previous treatment, physical activity, smoking, alcohol consumption, family history of hypertension, and target-organ damage. RESULTS: We studied 945 hypertensive patients, 533 (56%) of whom dropped out of the follow-up. The mean age was 52.3±12.9 years. The highest probabilities of dropping out of the follow-up were associated with current smoking, relative risk of 1.46 (1.04-2.06); educational level equal to or below 5 years of schooling, relative risk of 1.52 (1.11-2.08); and hypertension duration below 5 years, relative risk of 1.78 (1.28-2.48). Increasing age was associated with a higher probability of remaining in follow-up, with a relative risk of 0.98 (0.97-0.99). CONCLUSION: We identified a group at risk of dropping out of the follow-up, comprising patients with a lower educational level, a recent diagnosis of hypertension, and current smoking. We believe that measures ensuring adherence to treatment should be directed to this group of patients.
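A similar minimal sketch for the relative risk reported above, with the standard log-method confidence interval; the counts are hypothetical and do not come from this cohort.

```python
import math

def relative_risk_ci(a, n1, c, n0, z=1.96):
    """Relative risk and 95% CI (log method).
    a/n1 = events/total in the exposed group, c/n0 = events/total in the unexposed group."""
    rr = (a / n1) / (c / n0)
    se_log_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n0)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Hypothetical counts (dropouts among smokers vs. non-smokers), not the study's data
print(relative_risk_ci(120, 200, 413, 745))
```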
Abstract:
PURPOSE: Upright tilt-table testing (UTT) is a useful method for identifying patients with neurocardiogenic syncope, but its role in the evaluation of therapeutic efficacy is controversial. The aim of this study was to determine the correlation between a negative UTT after the introduction of therapy (acute efficacy) and symptom recurrence during follow-up (chronic efficacy). METHODS: We studied 56 severely symptomatic patients (age 27±19 years) with recurrent (7±12 episodes) neurocardiogenic syncope (positive UTT). Once empirical pharmacological therapy was initiated, all patients underwent another UTT (therapeutic evaluation test - TET). Therapy was not modified on the basis of TET results. The probability of symptom recurrence was analyzed with the Kaplan-Meier method and compared by log-rank test between patients with negative and positive TET. RESULTS: A negative UTT after therapy was related to a significantly lower probability of recurrence during follow-up (4.9% versus 52.4% at 12 months, P<0.0001). CONCLUSION: A good correlation exists between the acute and long-term efficacy of pharmacological therapy for neurocardiogenic syncope, so serial UTT may be considered a good method for identifying an effective therapeutic strategy.
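A minimal sketch of the Kaplan-Meier estimate named above, written in plain Python with invented follow-up times rather than the study's data.

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per patient; events: 1 = recurrence observed, 0 = censored."""
    n = len(times)
    recurrences = Counter(t for t, e in zip(times, events) if e)
    at_risk = n
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = recurrences.get(t, 0)
        if d:
            surv *= 1 - d / at_risk          # step down at each event time
            curve.append((t, surv))
        at_risk -= sum(1 for x in times if x == t)  # remove events and censorings at t
    return curve

# Hypothetical follow-up data: (months, recurrence flag)
print(kaplan_meier([2, 3, 3, 5, 8, 12, 12], [1, 0, 1, 1, 0, 1, 0]))
```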
Abstract:
OBJECTIVE - To assess mortality and the psychological repercussions of the prolonged waiting time for candidates for heart surgery. METHODS - From July 1999 to May 2000, using a standardized questionnaire, we carried out standardized interviews and semi-structured psychological interviews with 484 patients with coronary heart disease, 121 patients with valvular heart disease, and 100 patients with congenital heart disease. RESULTS - The mortality coefficients (deaths per 100 patients/year) were as follows: patients with coronary heart disease, 5.6; patients with valvular heart disease, 12.8; and patients with congenital heart disease, 3.1 (p<0.0001). The survival curve was lower in patients with valvular heart disease than in patients with coronary heart disease or congenital heart disease (p<0.001). The accumulated probability of not undergoing surgery was higher in patients with valvular heart disease than in the other patients (p<0.001), and, among the patients with valvular heart disease, this probability was higher in females than in males (p<0.01). Several patients experienced intense anxiety and attributed adaptive problems in their affective, professional, and social lives to not undergoing surgery. CONCLUSION - Mortality was high, and even higher among the patients with valvular heart disease, with negative psychological and social repercussions.
Abstract:
OBJECTIVE: To assess the clinical significance of transient ischemic dilation of the left ventricle during myocardial perfusion scintigraphy with stress/rest sestamibi. METHODS: The study retrospectively analyzed 378 patients who underwent myocardial perfusion scintigraphy with stress/rest sestamibi, 340 of whom had a low probability of having ischemia and 38 had significant transient defects. Transient ischemic dilation was automatically calculated using Autoquant software. Sensitivity, specificity, and the positive and negative predictive values were established for each value of transient ischemic dilation. RESULTS: The values of transient ischemic dilation for the groups of low probability and significant transient defects were, respectively, 1.01 ± 0.13 and 1.18 ± 0.17. The values of transient ischemic dilation for the group with significant transient defects were significantly greater than those obtained for the group with a low probability (P<0.001). The greatest positive predictive values, around 50%, were obtained for the values of transient ischemic dilation above 1.25. CONCLUSION: The results suggest that transient ischemic dilation assessed using the stress/rest sestamibi protocol may be useful to separate patients with extensive myocardial ischemia from those without ischemia.
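As an illustration of how sensitivity, specificity, and predictive values can be derived for a given transient ischemic dilation (TID) cutoff, a short sketch follows; the TID ratios and labels are hypothetical, with only the 1.25 cutoff taken from the abstract.

```python
def diagnostic_metrics(tid_values, has_defect, cutoff):
    """Sensitivity, specificity, PPV and NPV for a TID cutoff.
    tid_values: TID ratio per patient; has_defect: True if significant transient defect."""
    tp = sum(1 for v, d in zip(tid_values, has_defect) if v >= cutoff and d)
    fp = sum(1 for v, d in zip(tid_values, has_defect) if v >= cutoff and not d)
    fn = sum(1 for v, d in zip(tid_values, has_defect) if v < cutoff and d)
    tn = sum(1 for v, d in zip(tid_values, has_defect) if v < cutoff and not d)
    sens = tp / (tp + fn) if tp + fn else float("nan")
    spec = tn / (tn + fp) if tn + fp else float("nan")
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    npv = tn / (tn + fn) if tn + fn else float("nan")
    return sens, spec, ppv, npv

# Hypothetical patients, evaluated at the 1.25 cutoff mentioned in the abstract
tid = [0.95, 1.02, 1.10, 1.22, 1.27, 1.31, 1.18]
defect = [False, False, False, False, True, True, False]
print(diagnostic_metrics(tid, defect, 1.25))
```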
Abstract:
Background: According to some international studies, patients with acute coronary syndrome (ACS) and an increased left atrial volume index (LAVI) have a worse long-term prognosis. However, Brazilian studies confirming this prediction are still lacking. Objective: To evaluate LAVI as a predictor of major cardiovascular events (MCE) in patients with ACS during a 365-day follow-up. Methods: Prospective cohort of 171 patients diagnosed with ACS whose LAVI was calculated within 48 hours after hospital admission. According to LAVI, two groups were categorized: normal LAVI (≤ 32 mL/m²) and increased LAVI (> 32 mL/m²). Both groups were compared regarding clinical and echocardiographic characteristics, in- and out-of-hospital outcomes, and occurrence of MCE within 365 days. Results: Increased LAVI was observed in 78 patients (45%) and was associated with older age, higher body mass index, hypertension, history of myocardial infarction and previous angioplasty, and lower creatinine clearance and ejection fraction. During hospitalization, acute pulmonary edema was more frequent in patients with increased LAVI (14.1% vs. 4.3%, p = 0.024). After discharge, the occurrence of the combined outcome for MCE was higher (p = 0.001) in the group with increased LAVI (26%) than in the normal LAVI group (7%) [RR (95% CI) = 3.46 (1.54-7.73) vs. 0.80 (0.69-0.92)]. In Cox regression, increased LAVI increased the probability of MCE (HR = 3.08, 95% CI = 1.28-7.40, p = 0.012). Conclusion: Increased LAVI is an important predictor of MCE in a one-year follow-up.
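A minimal sketch of a Cox regression yielding a hazard ratio for increased LAVI, assuming the lifelines Python library and a hypothetical data layout; the column names and values are illustrative, not the study's data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical data: follow-up time (days), MCE flag, and LAVI group (1 = increased LAVI)
df = pd.DataFrame({
    "time":      [365, 120, 365, 200, 365, 90, 365, 310],
    "mce":       [0,   1,   0,   1,   0,   1,  0,   1],
    "lavi_high": [0,   1,   0,   1,   1,   1,  0,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="mce")
# exp(coef) in the summary is the hazard ratio for increased LAVI
cph.print_summary()
```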
Frequency of Cardiovascular Involvement in Familial Amyloidotic Polyneuropathy in Brazilian Patients
Abstract:
Background: Familial amyloidotic polyneuropathy (FAP) is a rare disease diagnosed in Brazil and worldwide. The frequency of cardiovascular involvement in Brazilian FAP patients is unknown. Objective: To detect the frequency of cardiovascular involvement and correlate the cardiovascular findings with the modified polyneuropathy disability (PND) score. Methods: In a national reference center, 51 patients were evaluated with clinical examination, electrocardiography (ECG), echocardiography (ECHO), and 24-hour Holter monitoring. Patients were classified according to the modified PND score and divided into groups: PND 0, PND I, PND II, and PND > II (which included PND IIIa, IIIb, and IV). We chose the classification tree as the statistical method to analyze the association between the findings of the cardiac tests and the neurological classification (PND). Results: ECG abnormalities were present in almost two-thirds of the FAP patients, whereas ECHO abnormalities occurred in around one-third of them. All patients with an abnormal ECHO also had an abnormal ECG, but the opposite did not apply. The classification tree identified ECG and ECHO as relevant variables (p < 0.001 and p = 0.08, respectively). The probability of a patient with a normal ECG being allocated to the PND 0 group was over 80%. When both ECG and ECHO were abnormal, this probability was null. Conclusions: Brazilian patients with FAP frequently have ECG abnormalities. ECG is an appropriate test to discriminate asymptomatic carriers of the mutation from those who develop the disease, and ECHO contributes to this discrimination.
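A short sketch of a classification tree over binary ECG/ECHO findings, using scikit-learn and invented patient labels merely to illustrate the method described above; it is not the authors' fitted tree.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical encoding: 1 = abnormal ECG / abnormal ECHO; target is the PND group label
X = [[0, 0], [0, 0], [1, 0], [1, 0], [1, 1], [1, 1], [0, 0], [1, 1]]
y = ["PND 0", "PND 0", "PND I", "PND II", "PND > II", "PND > II", "PND 0", "PND I"]

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(X, y)
# Print the learned splits on ECG and ECHO
print(export_text(tree, feature_names=["abnormal_ECG", "abnormal_ECHO"]))
```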
Abstract:
In order to upgrade the reliability of xenodiagnosis, attention has been directed towards the population dynamics of the parasite, with particular interest in the following factors: 1. Parasite density, which by itself is not a research objective but, by giving an accurate portrayal of parasite development and multiplication, has been incorporated into the screening of bugs for xenodiagnosis. 2. On the assumption that food availability might increase parasite density, bugs from xenodiagnosis have been refed at biweekly intervals on chicken blood. 3. Infectivity rates and positives harbouring large parasite yields were based on gut infections, in which the parasite population, comprising all developmental forms, was more abundant and easier to detect than in fecal infections, thus minimizing the probability of recording false negatives. 4. Since parasite density, low in the first 15 days of infection, increases rapidly over the following 30 days, an interval of 45 days has been adopted for the routine examination of bugs from xenodiagnosis. By following the enumerated measures, all aiming to reduce false-negative cases, we are getting closer to a reliable xenodiagnostic procedure. Upgrading the efficacy of xenodiagnosis also depends on the xenodiagnostic agent. Of 9 investigated vector species, Panstrongylus megistus deserves top priority as a xenodiagnostic agent. Its extraordinary capability to support fast development and vigorous multiplication of the few parasites ingested from a host with chronic Chagas' disease is revealed by the strikingly close infectivity rates of 91.2% vs. 96.4% among bugs engorged from the same host in the chronic and acute phases of the disease, respectively (Table V), the latter corresponding to an estimated 12.3 × 10³ parasites in the circulation at the time of xenodiagnosis, as reported previously by the authors (1982).
Abstract:
We tested experimentally the effects of the presence of non-susceptible hosts on the infection of the vector Triatoma infestans with Trypanosoma cruzi. The experiment consisted of two treatments: with chickens, including two chickens (non-susceptible hosts) and two infected guinea pigs (susceptible hosts), and without chickens, including only two infected guinea pigs. The hosts were held unrestrained in individual metal cages inside a closed tulle chamber. A total of 200 uninfected T. infestans third-instar nymphs were released in each replica, collected on day 14, and examined for infection and blood meal sources on days 32-36. The additional presence of chickens, relative to infected guinea pigs alone: (a) significantly modified the spatial distribution of bugs; (b) significantly increased the likelihood of having a detectable blood meal on any host and of molting to the next instar; (c) did not affect the bugs' probability of death by predation; and (d) significantly decreased the overall percentage of T. infestans infected with T. cruzi. The bugs collected from inside or close to the guinea pigs' cages showed a higher infection rate (71-88%) than those collected from the chickens' cages (22-32%). Mixed blood meals on chickens and guinea pigs were detected in 12-21% of bugs. Although the presence of chickens would decrease the overall percentage of infected bugs in short-term experiments, the high rate of host change of T. infestans would make this difference fade out if longer exposure times had been provided.
Abstract:
Infection with the protozoan Trypanosoma cruzi is widespread in Latin America, from Mexico in the north to Argentina and Chile in the south. The second most important route of acquiring the infection is blood transfusion. Although most Latin American countries have laws, decrees, or norms making the screening of blood donors for infectious diseases, including T. cruzi, mandatory (El Salvador and Nicaragua do not have laws on the subject), enforcement is usually absent or very lax. Analysis of serologic surveys of T. cruzi antibodies in blood donors published in 1993, indicating the number of donors and the screening coverage for T. cruzi in ten countries of Central and South America, showed that the probability of receiving a potentially infected transfusion unit varied from 1,096 per 10,000 transfusions in Bolivia, the highest, to 13.02 and 13.86 per 10,000 transfusions in Honduras and Venezuela, respectively, where screening coverage was 100%. On the other hand, the probability of transmitting a T. cruzi-infected unit was 219/10,000 in Bolivia, 24/10,000 in Colombia, 17/10,000 in El Salvador, and around 2-12/10,000 for the seven other countries. The infectivity risk, defined as the likelihood of being infected when receiving an infected transfusion unit, was assumed to be 20% for T. cruzi. On this basis, estimates of the absolute number of infections induced by transfusion were 832, 236, and 875 in Bolivia, Chile, and Colombia, respectively; in all the other countries they varied from seven in Honduras to 85 in El Salvador. Since 1993, the situation has improved. At that time only Honduras and Venezuela screened 100% of donors, whereas seven countries (Argentina, Colombia, El Salvador, Honduras, Paraguay, Uruguay, and Venezuela) did so in 1996. In Central America, without information from Guatemala, the screening of donors for T. cruzi prevented the transfusion of 1,481 infected units and the potential infection of 300 individuals in 1996. In the same year, in seven countries of South America, screening prevented the transfusion of 36,017 infected units and 7,201 potential cases of transfusional infection.
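The per-unit risks above can be approximated under the assumption that they scale with donor seroprevalence, the unscreened fraction of donations, and the stated 20% infectivity risk per infected unit; the sketch below uses hypothetical inputs and is not the authors' exact calculation.

```python
def transfusion_risk(seroprevalence, screening_coverage, infectivity=0.20):
    """Approximate probability that a transfused unit is potentially infected,
    assuming unscreened units carry the donor-population seroprevalence, and
    the expected infections per transfused unit (20% infectivity, as stated)."""
    p_infected_unit = seroprevalence * (1 - screening_coverage)
    return p_infected_unit, p_infected_unit * infectivity

# Hypothetical inputs: 5% seroprevalence, 60% screening coverage, per 10,000 transfusions
p_unit, p_infection = transfusion_risk(0.05, 0.60)
print(p_unit * 10_000, "potentially infected units per 10,000 transfusions")
print(p_infection * 10_000, "expected transfusion-acquired infections per 10,000")
```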
Abstract:
Antimalarial drugs, including the antifolate pyrimethamine-sulfadoxine (PS), can modulate the prevalence and intensity of gametocytaemia following treatment of acute malaria infections. They may also directly influence the transmission and spread of drug insensitivity. Little is known of the effects of co-trimoxazole (Co-T), another antifolate antimalarial, on gametocytes in children with acute malaria infections. We compared the effects of Co-T and PS on the prevalence and intensities of gametocytaemia and on gametocyte sex ratios in 102 children aged 0.5-12 years presenting with acute, uncomplicated falciparum malaria. Compared with pre-treatment, both drugs significantly increased gametocyte carriage after the initiation of treatment. However, gametocyte carriage was significantly lower on day 14 in those treated with Co-T than with PS. A significant increase in gametocytaemia with time occurred in PS- but not in Co-T-treated children. A Kaplan-Meier survival curve of the cumulative probability of remaining gametocyte-free in children who were agametocytaemic at enrolment showed that, by day 7 of follow-up, children treated with PS had a significantly higher propensity to have developed gametocytes than Co-T-treated children (log-rank statistic 5.35, df = 1, P = 0.02). Gametocyte sex ratio changes were similar following treatment with both drugs. PS and Co-T treatment of acute malaria infections in children from this endemic area is associated with significant increases in the prevalence and intensity of gametocytaemia, but these effects are more marked in those treated with PS than with Co-T.
Abstract:
This report describes the development of a SYBR Green I-based real-time polymerase chain reaction (PCR) protocol for detection of Schistosoma mansoni DNA on the ABI Prism 7000 instrument. Primers targeting the gene encoding the SSU rRNA were designed to amplify DNA from S. mansoni with high specificity in a real-time quantitative PCR system. The limit of detection of parasite DNA was 10 fg of purified genomic DNA, less than the equivalent of one parasite cell (genome ≈ 580 fg DNA). The efficiency was 0.99 and the correlation coefficient (R²) was 0.97. When different copy numbers of the target amplicon were used as standards, the assay could detect at least 10 copies of the specific target. The primers were designed to amplify a 106 bp DNA fragment (Tm 83°C). The assay was highly specific for S. mansoni and did not recognize DNA from closely related non-schistosome trematodes. The real-time PCR allowed accurate quantification of S. mansoni DNA, and no time-consuming post-PCR detection of amplification products by gel electrophoresis was required. The assay is potentially able to quantify S. mansoni DNA (and, indirectly, parasite burden) in a variety of samples, such as snail tissue, serum and feces from patients, and cercaria-infested water. Thus, these PCR protocols have potential as tools for monitoring schistosome transmission and for the quantitative diagnosis of human infection.
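A short sketch of how amplification efficiency and R² are commonly derived from a qPCR standard curve (Ct versus log10 of the copy number); the Ct values below are invented for illustration, not the reported curve.

```python
import numpy as np

# Hypothetical standard curve: Ct values measured for 10-fold dilutions of the target
copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.8])

slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1 / slope) - 1      # efficiency of 1.0 means perfect doubling per cycle
r = np.corrcoef(np.log10(copies), ct)[0, 1]
print(f"slope={slope:.2f}, efficiency={efficiency:.2f}, R^2={r**2:.3f}")
```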
Abstract:
Different urban structures might affect the life-history parameters of Aedes aegypti and, consequently, dengue transmission. Container productivity, probability of daily survival (PDS), and dispersal rates were estimated for mosquito populations in a high-income neighbourhood of Rio de Janeiro. Results were contrasted with those previously found in a suburban district, as well as those recorded in a slum. After inspecting 1,041 premises, domestic drains and discarded plastic pots were identified as the most productive containers, collectively holding up to 80% of the total pupae. In addition, three cohorts of dust-marked Ae. aegypti females were released and recaptured daily using BGS-Traps, sticky ovitraps, and backpack aspirators in 50 randomly selected houses; the recapture rate ranged from 5% to 12.2% within cohorts. PDS was determined by two models and ranged from 0.607 to 0.704 (exponential model) and from 0.659 to 0.721 (non-linear model). The mean distance travelled varied from 57 to 122 m, with a maximum dispersal of 263 m. Overall, lower infestation indexes and adult female survival were observed in the high-income neighbourhood, suggesting a lower dengue transmission risk in comparison with the suburban area and the slum. Since the results show that urban structure can influence mosquito biology, specific control strategies might be used in order to achieve cost-effective Ae. aegypti control.
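A minimal sketch of the exponential model for PDS from daily recaptures of marked females, assuming the standard regression of log recaptures on day; the recapture counts are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical daily recaptures of marked females after a single release
day = np.array([1, 2, 3, 4, 5, 6, 7])
recaptured = np.array([14, 10, 8, 5, 4, 3, 2])

# Exponential model: recaptures decline at a constant daily survival rate,
# so regressing ln(recaptures) on day gives PDS = exp(slope)
slope, intercept = np.polyfit(day, np.log(recaptured), 1)
pds = np.exp(slope)
life_expectancy = 1 / -np.log(pds)       # average life expectancy in days under the same model
print(f"PDS={pds:.3f}, life expectancy={life_expectancy:.1f} days")
```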
Abstract:
The diagnosis of single-lesion paucibacillary leprosy remains a challenge. Reviews by expert dermatopathologists and quantitative polymerase chain reaction (qPCR) results obtained from 66 single-plaque biopsy samples were compared. Histological findings were graded as high (HP), medium (MP), or low (LP) probability of leprosy, or as another dermatopathy (OD). Mycobacterium leprae-specific genes were detected using qPCR. The biopsies of 47 out of 57 clinically diagnosed patients who received multidrug therapy were classified as HP/MP, eight of which were qPCR-negative. In the LP/OD group (n = 19), two out of eight untreated patients showed positive qPCR results. In the absence of typical histopathological features, qPCR may be used to aid the final patient diagnosis, thus reducing overtreatment and delays in diagnosis.