62 results for PROGNOSTIC INDICATOR
Abstract:
OBJECTIVE: To assess survival of patients undergoing cerebral cardiopulmonary resuscitation maneuvers and to identify prognostic factors for short-term survival. METHODS: Prospective study of patients undergoing cardiopulmonary resuscitation maneuvers. RESULTS: The study included 150 patients. Spontaneous circulation was re-established in 88 (58%) patients, and 42 (28%) were discharged from the hospital. The number of patients needed to treat to save one life at 12 months was 3.4. The presence of ventricular fibrillation or tachycardia (VF/VT) as the initial rhythm, shorter durations of cardiopulmonary resuscitation maneuvers and of cardiopulmonary arrest, and higher mean blood pressure (BP) prior to cardiopulmonary arrest were independent predictors of re-establishment of spontaneous circulation and of hospital discharge. The odds ratios for hospital discharge were as follows: 6.1 (95% confidence interval [CI] = 2.7-13.6) when the initial rhythm was VF/VT; 9.4 (95% CI = 4.1-21.3) when the duration of cerebral cardiopulmonary resuscitation was < 15 min; 9.2 (95% CI = 3.9-21.3) when the duration of cardiopulmonary arrest was < 20 min; and 5.7 (95% CI = 2.4-13.7) when BP was > 70 mmHg. CONCLUSION: The presence of VF/VT as the initial rhythm, shorter durations of cerebral cardiopulmonary resuscitation and of cardiopulmonary arrest, and higher BP prior to cardiopulmonary arrest were independent predictors of better prognosis.
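The odds ratios with 95% confidence intervals reported above can be reproduced from 2x2 tables of outcome by predictor. Below is a minimal Python sketch using a Wald interval; the counts passed in are hypothetical, since the abstract does not report the raw tables.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed & event, b = exposed & no event,
    c = unexposed & event, d = unexposed & no event."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts only -- not the study's raw data
print(odds_ratio_ci(30, 20, 12, 88))
```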
Abstract:
Background: The TIMI score for ST-segment elevation myocardial infarction (STEMI) was created and validated specifically for this clinical scenario, whereas the GRACE score is generic to any type of acute coronary syndrome. Objective: To identify which of the two scores, TIMI or GRACE, has better prognostic performance in patients with STEMI. Methods: We included 152 individuals consecutively admitted for STEMI. The TIMI and GRACE scores were tested for their discriminatory ability (C-statistic) and calibration (Hosmer-Lemeshow) in relation to in-hospital death. Results: The TIMI score distributed patients evenly across the low-, intermediate- and high-risk ranges (39%, 27% and 34%, respectively), whereas the GRACE score concentrated patients at low risk (80%, 13% and 7%, respectively). Case fatality was 11%. The C-statistic of the TIMI score was 0.87 (95% CI = 0.76 to 0.98), similar to that of GRACE (0.87, 95% CI = 0.75 to 0.99), p = 0.71. The TIMI score showed satisfactory calibration (χ2 = 1.4, p = 0.92), substantially better than that of the GRACE score (χ2 = 14, p = 0.08). This difference in calibration is reflected in the incidence observed across the low-, intermediate- and high-risk ranges: 0%, 4.9% and 25%, respectively, for the TIMI score, versus 2.4%, 25% and 73% for GRACE, whose intermediate-risk range showed an inappropriately high incidence. Conclusion: Although the scores show similar discriminatory capacity for in-hospital death, the TIMI score was better calibrated than GRACE. These findings need to be validated in populations with different risk profiles.
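Discrimination (C-statistic) and calibration (Hosmer-Lemeshow) of a risk score can be checked as sketched below. The data are simulated stand-ins rather than the study cohort, and the decile-based Hosmer-Lemeshow implementation is one common variant, not necessarily the one used by the authors.

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from scipy.stats import chi2

def hosmer_lemeshow(y_true, y_prob, groups=10):
    """Hosmer-Lemeshow chi-square over risk deciles (illustrative variant)."""
    order = np.argsort(y_prob)
    y_true, y_prob = np.asarray(y_true)[order], np.asarray(y_prob)[order]
    stat = 0.0
    for idx in np.array_split(np.arange(len(y_prob)), groups):
        obs = y_true[idx].sum()          # observed events in the decile
        exp = y_prob[idx].sum()          # expected events in the decile
        n = len(idx)
        stat += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-9)
    return stat, chi2.sf(stat, groups - 2)

# Simulated outcomes and predicted risks (not the study data)
rng = np.random.default_rng(42)
y = rng.binomial(1, 0.11, 500)
p_score = np.clip(0.05 + 0.4 * y + rng.normal(0, 0.1, 500), 0.001, 0.999)

print("C-statistic:", roc_auc_score(y, p_score))
print("Hosmer-Lemeshow chi2, p:", hosmer_lemeshow(y, p_score))
```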
Abstract:
Abstract Background: BNP has been extensively evaluated to determine short- and intermediate-term prognosis in patients with acute coronary syndrome, but its role in long-term mortality is not known. Objective: To determine the very long-term prognostic role of B-type natriuretic peptide (BNP) for all-cause mortality in patients with non-ST-segment elevation acute coronary syndrome (NSTEACS). Methods: A cohort of 224 consecutive patients with NSTEACS, seen prospectively in the Emergency Department, had BNP measured on arrival to establish prognosis and underwent a median 9.34-year follow-up for all-cause mortality. Results: Unstable angina was diagnosed in 52.2% and non-ST-segment elevation myocardial infarction in 47.8%. Median admission BNP was 81.9 pg/mL (interquartile range = 22.2-225), and mortality was correlated with increasing BNP quartiles: 14.3%, 16.1%, 48.2% and 73.2% (p < 0.0001). The ROC curve identified 100 pg/mL as the best BNP cut-off value for mortality prediction (area under the curve = 0.789, 95% CI = 0.723-0.854), and this cut-off was a strong predictor of late mortality: BNP < 100 = 17.3% vs. BNP ≥ 100 = 65.0%, RR = 3.76 (95% CI = 2.49-5.63, p < 0.001). On logistic regression analysis, age > 72 years (OR = 3.79, 95% CI = 1.62-8.86, p = 0.002), BNP ≥ 100 pg/mL (OR = 6.24, 95% CI = 2.95-13.23, p < 0.001) and estimated glomerular filtration rate (OR = 0.98, 95% CI = 0.97-0.99, p = 0.049) were independent predictors of late mortality. Conclusions: BNP measured at hospital admission in patients with NSTEACS is a strong, independent predictor of very long-term all-cause mortality. This study raises the hypothesis that BNP should be measured in all patients with NSTEACS at the index event for long-term risk stratification.
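Selecting a biomarker cut-off from an ROC curve, as described above, is often done with Youden's J statistic. The sketch below uses simulated BNP values and outcomes whose distributions are assumptions, not the study data; the choice of Youden's J is likewise an assumption about how the cut-off was derived.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Simulated stand-ins for admission BNP (pg/mL) and late death (0/1)
rng = np.random.default_rng(0)
death = rng.binomial(1, 0.37, 224)
bnp = np.where(death == 1,
               rng.lognormal(5.3, 0.9, 224),   # higher BNP among non-survivors
               rng.lognormal(3.8, 0.9, 224))

fpr, tpr, thresholds = roc_curve(death, bnp)
best = np.argmax(tpr - fpr)                    # Youden's J picks the cut-off
print("AUC:", roc_auc_score(death, bnp))
print("Best cut-off (pg/mL):", thresholds[best])
```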
Abstract:
Abstract Background: Pulmonary hypertension is associated with poor prognosis in heart failure. However, non-invasive diagnosis is still challenging in clinical practice. Objective: We sought to assess the prognostic utility of non-invasive estimation of pulmonary vascular resistance (PVR) by cardiovascular magnetic resonance to predict adverse cardiovascular outcomes in heart failure with reduced ejection fraction (HFrEF). Methods: Prospective three-year registry of patients with left ventricular ejection fraction (LVEF) < 40% recently admitted for decompensated heart failure. PVR was calculated from the right ventricular ejection fraction and the average velocity of the pulmonary artery estimated during cardiac magnetic resonance. Readmission for heart failure and all-cause mortality were considered adverse events at follow-up. Results: 105 patients (mean LVEF 26.0 ± 7.7%, ischemic etiology 43%) were included. Patients with adverse events at long-term follow-up had higher values of PVR (6.93 ± 1.9 vs. 4.6 ± 1.7 estimated Wood units (eWu), p < 0.001). In multivariate Cox regression analysis, PVR ≥ 5 eWu (cut-off value according to the ROC curve) was independently associated with an increased risk of adverse events at 9-month follow-up (HR 2.98; 95% CI 1.12-7.88; p < 0.03). Conclusions: In patients with HFrEF, PVR ≥ 5.0 eWu is associated with a significantly worse clinical outcome at follow-up. Non-invasive estimation of PVR by cardiac magnetic resonance might be useful for risk stratification in HFrEF, irrespective of etiology, presence of late gadolinium enhancement or LVEF.
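A Cox proportional-hazards model with a dichotomized PVR covariate, as used above, could be fitted as in the following sketch. The cohort here is entirely simulated (sample size, follow-up times and covariate values are assumptions), and the lifelines library is assumed to be available; the abstract does not state which software was used.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated cohort standing in for the registry (illustrative values only)
rng = np.random.default_rng(1)
n = 105
pvr_high = rng.binomial(1, 0.4, n)               # indicator: PVR >= 5 eWu
lvef = rng.normal(26, 7.7, n)                    # LVEF (%)
months = rng.exponential(12 / (1 + 2 * pvr_high))  # shorter times when PVR is high
event = rng.binomial(1, 0.5, n)                  # readmission or death (0/1)

df = pd.DataFrame({"months": months, "event": event,
                   "pvr_ge_5": pvr_high, "lvef": lvef})
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
cph.print_summary()   # hazard ratios with 95% CIs for pvr_ge_5 and lvef
```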
Abstract:
An indirect antibody-detection ELISA using virus-infected cultured cells as antigen and a micro virus neutralisation test read with the aid of EIA (NT-EIA) were used for the antigenic characterization of Jatobal virus (BeAn 423380). Jatobal virus was characterized as a member of the family Bunyaviridae, genus Bunyavirus, Simbu serogroup. ELISA using infected cultured cells as antigen is a sensitive and reliable method for the identification of viruses and has many advantages over conventional antibody-capture ELISAs and other tests: it eliminates solid-phase coating with virus and laborious antigen preparation, and it permits screening of large numbers of virus antisera faster and more easily than by complement fixation (CF), haemagglutination inhibition (HAI) or plaque-reduction neutralisation tests (NT). ELISA and NT read with the aid of EIA can be applied to viruses that do not produce a cytopathogenic effect. Both techniques are applicable to the identification of viruses that grow in mosquito cells.
Abstract:
The aim of this study was to investigate the correlation of the mycobacteria growth indicator tube (MGIT) and E-test methods with the proportion method for Mycobacterium tuberculosis. Forty clinical isolates were tested. MGIT and E-test results for the first-line antituberculous drugs correlated with those of the proportion method. Our results suggest that the MGIT and E-test methods can be used routinely instead of the proportion method.
Abstract:
Small mammals are found naturally infected by Schistosoma mansoni, which is a confounding factor for schistosomiasis control programs in endemic areas. The aims of this study were: to investigate S. mansoni infection rates in the water-rat Nectomys squamipes over four years in endemic areas of Sumidouro, state of Rio de Janeiro, using a mark-recapture technique; to compare two diagnostic methods for schistosomiasis; and to evaluate the effect of chemotherapy of the infected human population on rodent infection rates. S. mansoni infection rates in rodents increased when rodent population sizes were lower. Coprology and serology results showed the same trends over time and were correlated. Serology could detect recent infection, including infections that were false negatives by coprology. Chemotherapy in humans did not interrupt rodent infection. Rodents can increase schistosomiasis transmission where it already exists; they probably maintain the transmission cycle in nature and, because they are highly susceptible to infection, can be considered biological indicators of the parasite's transmission sites. The importance of water-rats in the transmission dynamics of the S. mansoni infection cycle may differ between areas, and they can be considered important wild reservoirs of this human disease.
Abstract:
Chagas heart disease (CHD) results from infection with the protozoan parasite Trypanosoma cruzi and is the leading cause of infectious myocarditis worldwide. It poses a substantial public health burden due to high morbidity and mortality. CHD is also the most serious and frequent manifestation of chronic Chagas disease and appears in 20-40% of infected individuals 10-30 years after the original acute infection. In recent decades, numerous clinical and experimental investigations have shown that a low-grade but incessant parasitism, along with an accompanying immunological response [either parasite-driven (most likely) or autoimmune-mediated], plays an important role in producing myocardial damage in CHD. At the same time, primary neuronal damage and microvascular dysfunction have been described as ancillary pathogenic mechanisms. Conduction system disturbances, atrial and ventricular arrhythmias, congestive heart failure, systemic and pulmonary thromboembolism and sudden cardiac death are the most common clinical manifestations of chronic Chagas cardiomyopathy. Management of CHD aims to relieve symptoms, identify markers of unfavourable prognosis and treat those individuals at increased risk of disease progression or death. This article reviews the pathophysiology of myocardial damage, discusses the value of current risk stratification models and proposes an algorithm to guide mortality risk assessment and therapeutic decision-making in patients with CHD.
Abstract:
Chagas disease is a pleomorphic clinical entity that has several unique features. The aim of this study is to summarise some of the recent contributions from our research group to knowledge of the morbidity and prognostic factors in Chagas heart disease. A retrospective study suggested that ischaemic stroke associated with left ventricular (LV) apical thrombi is the first clinical manifestation of Chagas disease observed in a large proportion of patients. LV function and left atrial volume (LAV) are independent risk factors for ischaemic cerebrovascular events during follow-up of Chagas heart disease patients. Pulmonary congestion in Chagas-related dilated cardiomyopathy is common but usually mild. Although early right ventricular (RV) involvement has been described, we have shown by Doppler echocardiography that RV dysfunction is evident almost exclusively when it is associated with left ventricle dilatation and functional impairment. In addition, RV dysfunction is a powerful predictor of survival in patients with heart failure secondary to Chagas disease. We have also demonstrated that LAV provides incremental prognostic information independent of clinical data and conventional echocardiographic parameters that predict survival.
Abstract:
School-aged children (6-15 years) from the endemic area of Pernambuco were evaluated both as a target group for and as an indicator of schistosomiasis control in the community. Parasitological data were drawn from baseline stool surveys of whole populations that were obtained to diagnose Schistosoma mansoni infection. Nineteen representative localities were selected for assessing the prevalence of schistosomiasis among individuals in the following age groups: 0-5, 6-15, 16-25, 26-40 and 41-80 years. For each locality, the prevalence in each age group was compared to that of the overall population using contingency table analysis. To select a reference group, the operational difficulties of conducting residential surveys were considered. School-aged children may be considered the group of choice as the reference group for the overall population for the following reasons: (i) the prevalence of schistosomiasis in this age group had the highest correlation with the prevalence in the overall population (r = 0.967), (ii) this age group is particularly vulnerable to infection and plays an important role in parasite transmission and (iii) school-aged children are the main target of the World Health Organization in terms of helminth control. The Schistosomiasis Control Program should consider school-aged children both as a reference group for assessing the need for intervention at the community level and as a target group for integrated health care actions of the Unified Health System that are focused on high-risk groups.
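The correlation between age-group and overall prevalence, and the contingency-table comparison mentioned above, can be computed as sketched here. The prevalences and counts are illustrative, not the study's figures, and Pearson correlation with a chi-square test is an assumption about the specific statistics used.

```python
import numpy as np
from scipy.stats import pearsonr, chi2_contingency

# Illustrative prevalences (%) per locality: 6-15-year-olds vs. whole population
prev_children = np.array([12, 35, 8, 50, 22, 41, 15, 60, 5, 30])
prev_overall  = np.array([10, 30, 7, 45, 20, 38, 12, 55, 4, 27])
r, p = pearsonr(prev_children, prev_overall)
print(f"r = {r:.3f}, p = {p:.4f}")

# Contingency-table comparison of one age group vs. the rest, in one locality
#            infected  not infected
table = np.array([[35, 65],          # 6-15 years
                  [30, 170]])        # remaining population
stat, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```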
Abstract:
The diagnosis of meningitic angiostrongyliasis (MA) is based on clinical criteria. A lumbar puncture is used as a diagnostic tool, but it is an invasive procedure. Blood eosinophil levels are also assessed and used in the diagnosis of this disease. We enrolled 47 patients with serologically proven MA and 131 controls with intestinal parasite infections. An absolute eosinophil count model was found to be the best marker for MA. An eosinophil count of more than 798 cells yielded sensitivity, specificity, positive predictive and negative predictive values of 76.6%, 80.2%, 58.1% and 90.5%, respectively. These data support the use of testing for high blood eosinophil levels as a diagnostic tool for MA in individuals who are at risk for this disease.
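The reported sensitivity, specificity, PPV and NPV follow directly from a 2x2 table. In the sketch below, the counts are reconstructed to be consistent with the reported proportions (47 cases, 131 controls); they are not taken from the published table.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv  = tp / (tp + fp)
    npv  = tn / (tn + fn)
    return sens, spec, ppv, npv

# Approximate counts consistent with the reported proportions (illustrative)
tp, fn = 36, 11        # ~76.6% sensitivity among 47 MA patients
tn, fp = 105, 26       # ~80.2% specificity among 131 controls
print([round(x, 3) for x in diagnostic_metrics(tp, fp, fn, tn)])
```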
Abstract:
The emerging resistance to artemisinin derivatives reported in South-East Asia led us to assess the efficacy of artemether-lumefantrine as the first-line therapy for uncomplicated Plasmodium falciparum infections in Suriname. This drug assessment was performed according to the recommendations of the World Health Organization in 2011. The decreasing number of malaria cases in Suriname, which are currently limited to migrating populations and gold miners, precludes any conclusions on artemether efficacy because adequate numbers of patients with 28-day follow-up data are difficult to obtain. Therefore, the prevalence of day 3 parasitaemia in a 2011 study was compared to that in a 2005/2006 study to detect the emergence of resistance to artemether. The same protocol was used in both studies, and artemether-lumefantrine was the study drug. Of 48 evaluable patients in 2011, 15 (31%) still had parasitaemia on day 3, compared to one (2%) of 45 evaluable patients in 2005/2006. Overall, the 11 evaluable patients in the 2011 study who were followed up until day 28 had negative slides, and similar findings were obtained in all 38 evaluable patients in the 2005/2006 study. The significantly increased incidence of parasite persistence on day 3 may be an indication of emerging resistance to artemether.
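The day-3 parasitaemia proportions reported above (15/48 in 2011 vs. 1/45 in 2005/2006) can be compared with an exact test on the 2x2 table; the use of Fisher's exact test here is our choice, not a method stated in the abstract.

```python
from scipy.stats import fisher_exact

# Day-3 parasitaemia counts taken from the abstract
table = [[15, 48 - 15],    # 2011: positive, negative
         [1, 45 - 1]]      # 2005/2006: positive, negative
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, p = {p_value:.4f}")
```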
Abstract:
This study proposes a method for the direct and simultaneous determination of Ca2+ and Mg2+ in soil extracts using a calcium ion-selective electrode combined with complexometric titration (ISE-CT). The results were compared to those obtained by the conventional analytical techniques of complexometric titration (CT) and flame atomic absorption spectrometry (FAAS). There were no significant differences in the determination of Ca2+ and Mg2+ in comparison with CT and FAAS at a 95% confidence level. Additionally, the results of this method were more precise and accurate than those of the interlaboratory control (IC).
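A paired comparison of two analytical methods at the 95% confidence level can be sketched as follows. The paired measurements are made up for illustration, and a paired t-test is assumed as the comparison; the abstract does not specify which statistical test was applied.

```python
import numpy as np
from scipy.stats import ttest_rel

# Illustrative paired Ca2+ results on the same soil extracts (hypothetical values)
ise_ct = np.array([2.10, 3.45, 1.80, 4.20, 2.95, 3.10])   # proposed ISE-CT method
faas   = np.array([2.05, 3.50, 1.85, 4.15, 3.00, 3.05])   # reference FAAS method
t, p = ttest_rel(ise_ct, faas)
print(f"t = {t:.2f}, p = {p:.3f}  (p > 0.05: no significant difference at 95% confidence)")
```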
Abstract:
The interactions between soil invertebrates and environmental variations are still relatively unknown in the assessment of soil quality. The objective of this study was to evaluate soil quality in areas under different soil management systems, using soil fauna as an indicator, in Além Paraíba, Minas Gerais, Brazil. The soil invertebrate community was sampled using pitfall traps, in the dry and rainy seasons, in areas with five vegetation types (acacia, mimosa, eucalyptus, pasture and secondary forest). The abundance of organisms, total and average richness, Shannon's diversity index, Pielou's evenness index and the change index V were determined. The fauna was most abundant in the areas of secondary forest and mimosa plantations in the dry season (111.3 and 31.7 individuals per trap per day, respectively). In the rainy season, the abundance of organisms in the three vegetation types did not differ. The highest values of average and total richness were recorded in the secondary forest in the dry season and in the mimosa stand in the rainy season. Shannon's index ranged from 1.57 in the areas with acacia and eucalyptus in the rainy season to 3.19 in the eucalyptus area in the dry season. The evenness index was highest in the forest stands (eucalyptus, acacia and mimosa) in the dry season, but in the rainy season it was higher in the pasture and secondary forest than in the forest stands. The change index V indicated that the percentage of extremely inhibited groups was lowest in the area with mimosa in both the dry and the rainy season (36% and 23%, respectively). Of all forest stands, the mimosa area had the most abundant soil fauna.
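Shannon's diversity index and Pielou's evenness are computed from group counts as in the sketch below. The pitfall-trap counts are hypothetical, and natural logarithms are assumed; the abstract does not state which log base was used, which affects the numerical range of the index.

```python
import numpy as np

def shannon_pielou(counts):
    """Shannon diversity H' (natural log) and Pielou evenness J = H'/ln(S)."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()   # relative abundance of each group
    h = -(p * np.log(p)).sum()              # Shannon index
    j = h / np.log(len(p))                  # Pielou evenness
    return h, j

# Hypothetical pitfall-trap counts per taxonomic group in one area/season
counts = [120, 45, 30, 18, 9, 5, 3]
h, j = shannon_pielou(counts)
print(f"H' = {h:.2f}, J = {j:.2f}")
```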