123 results for Long QT Syndrome
in SciELO Saúde Pública - SP
Abstract:
No reports testing the efficacy of the QT/RR ratio <1/2 for detecting a normal QTc interval were found in the literature. The objective of the present study was to determine whether a QT/RR ratio ≤1/2 can be considered equivalent to a normal QTc and to compare the QT and QTc intervals measured and calculated clinically and by a computerized electrocardiograph. A total of 140 QT/RR ratios from 28 successive electrocardiograms obtained from 28 consecutive patients in a tertiary-level teaching hospital were analyzed clinically by 5 independent observers and by a computerized electrocardiograph. The QT/RR ratio provided 56% sensitivity and 78% specificity, with an area under the receiver operating characteristic curve of 75.8% (95%CI: 0.68 to 0.84). The differences in QT and QTc interval measurements between clinical and computerized evaluation were 0.01 ± 0.03 s (95%CI: 0.04-0.02) and 0.01 ± 0.04 s (95%CI: -0.05-0.03), respectively. The QT and QTc values measured clinically and by a computerized electrocardiograph were similar. The QT/RR ratio ≤1/2 was not a satisfactory index for QTc evaluation because it could not predict a normal QTc value.
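As an illustration of the two quantities being compared, the minimal Python sketch below computes QTc with Bazett's formula (QTc = QT/√RR) and evaluates the bedside QT/RR ≤ 1/2 rule for a hypothetical tracing. Bazett's correction and the example values are assumptions for illustration only; the abstract does not state which correction formula the electrocardiograph applied. The example shows how the rule can call a tracing "normal" while the corrected QTc is already borderline, consistent with the modest sensitivity reported above.

import math

def qtc_bazett(qt_s, rr_s):
    # Heart-rate-corrected QT (Bazett): QTc = QT / sqrt(RR), intervals in seconds
    return qt_s / math.sqrt(rr_s)

def qt_within_half_rr(qt_s, rr_s):
    # Bedside rule under test: QT occupies no more than half of the RR interval
    return qt_s <= rr_s / 2

# Illustrative tracing (not study data): QT = 0.40 s, RR = 0.80 s (heart rate 75 bpm)
qt, rr = 0.40, 0.80
print(round(qtc_bazett(qt, rr), 3))   # 0.447 s -- borderline by common 0.44-0.45 s cut-offs
print(qt_within_half_rr(qt, rr))      # True -- the rule would still call this tracing "normal"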
Abstract:
Background: BNP has been extensively evaluated for determining short- and intermediate-term prognosis in patients with acute coronary syndrome, but its role in long-term mortality is not known. Objective: To determine the very long-term prognostic role of B-type natriuretic peptide (BNP) for all-cause mortality in patients with non-ST-segment elevation acute coronary syndrome (NSTEACS). Methods: A cohort of 224 consecutive patients with NSTEACS, prospectively seen in the Emergency Department, had BNP measured on arrival to establish prognosis and underwent a median 9.34-year follow-up for all-cause mortality. Results: Unstable angina was diagnosed in 52.2% and non-ST-segment elevation myocardial infarction in 47.8%. Median admission BNP was 81.9 pg/mL (interquartile range: 22.2-225), and the mortality rate increased across BNP quartiles: 14.3%, 16.1%, 48.2%, and 73.2% (p < 0.0001). The ROC curve identified 100 pg/mL as the best BNP cut-off value for mortality prediction (area under the curve = 0.789, 95% CI = 0.723-0.854), and this cut-off was a strong predictor of late mortality: BNP < 100 = 17.3% vs. BNP ≥ 100 = 65.0%, RR = 3.76 (95% CI = 2.49-5.63, p < 0.001). On logistic regression analysis, age > 72 years (OR = 3.79, 95% CI = 1.62-8.86, p = 0.002), BNP ≥ 100 pg/mL (OR = 6.24, 95% CI = 2.95-13.23, p < 0.001) and estimated glomerular filtration rate (OR = 0.98, 95% CI = 0.97-0.99, p = 0.049) were independent predictors of late mortality. Conclusions: BNP measured at hospital admission in patients with NSTEACS is a strong, independent predictor of very long-term all-cause mortality. These findings raise the hypothesis that BNP should be measured in all patients with NSTEACS at the index event for long-term risk stratification.
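The reported relative risk for the 100 pg/mL cut-off follows directly from the two late-mortality proportions quoted above; a minimal Python check of that arithmetic (using only the figures in the abstract) is:

# Late all-cause mortality by admission BNP stratum, as reported above
mortality_bnp_low = 0.173    # BNP < 100 pg/mL
mortality_bnp_high = 0.650   # BNP >= 100 pg/mL

relative_risk = mortality_bnp_high / mortality_bnp_low
print(round(relative_risk, 2))   # 3.76, matching the reported RR (95% CI 2.49-5.63)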
Abstract:
INTRODUCTION: Steroid-resistant idiopathic nephrotic syndrome (SRINS) in children is one of the leading causes of progression to chronic kidney disease stage V (CKD V)/end-stage renal disease (ESRD). OBJECTIVE: The aim of this retrospective study was to evaluate the efficacy of immunosuppressive drugs (IS) and to identify risk factors for progression to ESRD in this population. METHODS: Clinical and biochemical variables at presentation, early or late steroid resistance, histological pattern, and response to cyclosporine A (CsA) and cyclophosphamide (CP) were reviewed in 136 children with SRINS. The analyzed outcome was progression to ESRD. Univariate and multivariate Cox regression analyses were performed. RESULTS: Median age at onset was 5.54 years (0.67-17.22) and median follow-up time was 6.1 years (0.25-30.83). Early steroid resistance was observed in 114 patients and late resistance in 22. Resistance to CP and CsA was 62.9% and 35%, respectively. At the last follow-up, 57 patients had reached ESRD. Renal survival rates were 71.5%, 58.4%, 55.3%, 35.6%, and 28.5% at 5, 10, 15, 20, and 25 years, respectively. Univariate analysis showed that older age at onset, early steroid resistance, hematuria, hypertension, focal segmental glomerulosclerosis (FSGS), and resistance to IS were risk factors for ESRD. Cox proportional-hazards regression identified CsA resistance and FSGS as the only predictors of ESRD. CONCLUSION: Our findings showed that CsA resistance and FSGS were risk factors for ESRD.
Abstract:
Schistosomal nephropathy has long been related to the hepatosplenic form of schistosomiasis. Over the last few years, 24 patients with hepatointestinal schistosomiasis and the nephrotic syndrome were studied. To evaluate a possible etiologic role of schistosomiasis in the development of the nephropathy, this group was compared with a group of 37 patients with idiopathic nephrotic syndrome. The two groups showed different distributions of histologic lesions: in the group with schistosomiasis, mesangial proliferative glomerulonephritis was significantly more prevalent (33.3%), whereas in the control group membranous glomerulonephritis predominated (32.4%). On immunofluorescence, IgM was positive in 94.4% of the patients with schistosomiasis versus 55.0% in the control group (p<0.01). In the group with schistosomiasis, 8 patients had mesangial proliferative glomerulonephritis and 5 had membranoproliferative glomerulonephritis; in both histological types, immunofluorescence showed granular IgM and C3 deposits in the glomeruli. The data in this study suggest that mesangial proliferative and membranoproliferative glomerulonephritis, with glomerular granular IgM and C3 deposits, represent the renal lesions of schistosomiasis-associated nephropathy.
Abstract:
Background: According to international studies, patients with acute coronary syndrome (ACS) and an increased left atrial volume index (LAVI) have a worse long-term prognosis; however, Brazilian studies confirming this prediction are still lacking. Objective: To evaluate LAVI as a predictor of major cardiovascular events (MCE) in patients with ACS during a 365-day follow-up. Methods: Prospective cohort of 171 patients diagnosed with ACS whose LAVI was calculated within 48 hours of hospital admission. Patients were categorized into two groups according to LAVI: normal LAVI (≤ 32 mL/m2) and increased LAVI (> 32 mL/m2). The groups were compared regarding clinical and echocardiographic characteristics, in- and out-of-hospital outcomes, and occurrence of MCE within 365 days. Results: Increased LAVI was observed in 78 patients (45%) and was associated with older age, higher body mass index, hypertension, history of myocardial infarction and previous angioplasty, and lower creatinine clearance and ejection fraction. During hospitalization, acute pulmonary edema was more frequent in patients with increased LAVI (14.1% vs. 4.3%, p = 0.024). After discharge, the combined outcome of MCE was more frequent (p = 0.001) in the group with increased LAVI (26%) than in the normal LAVI group (7%) [RR (95% CI) = 3.46 (1.54-7.73) vs. 0.80 (0.69-0.92)]. In Cox regression, increased LAVI was associated with a higher probability of MCE (HR = 3.08, 95% CI = 1.28-7.40, p = 0.012). Conclusion: Increased LAVI is an important predictor of MCE over a one-year follow-up.
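For readers wanting to reproduce the grouping rule, the short Python sketch below indexes a left atrial volume to body surface area and applies the 32 mL/m2 cut-off used in the study. The Mosteller BSA formula and the patient values are assumptions for illustration only; the abstract does not specify how BSA was obtained.

import math

LAVI_CUTOFF = 32.0   # mL/m2, threshold used in the study to separate normal from increased LAVI

def bsa_mosteller(height_cm, weight_kg):
    # Body surface area in m2, Mosteller formula (an assumption; the abstract does not state the method)
    return math.sqrt(height_cm * weight_kg / 3600.0)

def classify_lavi(la_volume_ml, height_cm, weight_kg):
    lavi = la_volume_ml / bsa_mosteller(height_cm, weight_kg)
    group = "increased" if lavi > LAVI_CUTOFF else "normal"
    return lavi, group

# Hypothetical patient (not from the study): LA volume 68 mL, height 170 cm, weight 80 kg
lavi, group = classify_lavi(68.0, 170.0, 80.0)
print(round(lavi, 1), group)   # 35.0 increased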
Abstract:
No significant difference was demonstrated in the altered circadian blood pressure pattern between the pituitary-dependent and adrenal forms of Cushing's syndrome before surgery. The effect of therapy, however, proved to be different. In pituitary-dependent Cushing's syndrome, the mesor was normalized more conspicuously for systolic than for diastolic blood pressure. In Cushing's syndrome due to adrenal adenoma, the systolic and diastolic blood pressure mesors were even significantly "overnormalized" after treatment, being 11 to 27 and 2 to 13 mmHg (95% confidence) lower, respectively, than the corresponding mesors in controls. There was no difference between the two forms in the effect of treatment on blood pressure amplitudes, which remained significantly lower than in controls. Finally, after treatment of the pituitary-dependent form, acrophase patterns were partly normalized only for diastolic blood pressure, while both systolic and diastolic blood pressure acrophases were normalized in the treated adrenal form. In conclusion, complete normalization of the daily blood pressure profile was not achieved in either form of the syndrome. This may be one of the reasons why long-term survival after surgical cure of hypercortisolism is lower than expected.
Abstract:
Improving the course and outcome of patients with acute respiratory distress syndrome presents a challenge. By understanding the immune status of a patient, physicians can consider manipulating proinflammatory systems more rationally. In this context, corticosteroids could be a therapeutic tool in the armamentarium against acute respiratory distress syndrome. Corticosteroid therapy has been studied in three situations: prevention in high-risk patients, early treatment with high-dose, short-course therapy, and prolonged therapy in unresolving cases. There are differences between the corticosteroid trials of the past and recent trials: today, treatment starts 2-10 days after disease onset in patients who have failed to improve, whereas in the past the corticosteroid doses employed were 5-140 times higher than those used now. Additionally, past treatment consisted of administering one to four doses of methylprednisolone (30 mg/kg) every 6 h, whereas in the newer trials treatment is prolonged for as long as necessary (2 mg·kg⁻¹·day⁻¹, given every 6 h). The variable response to corticosteroid treatment could be attributed to the heterogeneous biochemical and molecular mechanisms activated in response to different initial insults. Numerous factors need to be taken into account when corticosteroids are used to treat acute respiratory distress syndrome: the specificity of inhibition, the duration and degree of inhibition, and the timing of inhibition. The major continuing problem is when to administer corticosteroids and how to monitor their use. The inflammatory mechanisms are continuous and cyclic, sometimes causing deterioration or improvement of lung function. This article reviews the mechanisms of action of corticosteroids and the results of experimental and clinical studies on the use of corticosteroids in acute respiratory distress syndrome.
Abstract:
Inflammatory markers have been associated with clinical outcome in patients with acute coronary syndrome (ACS). The present study evaluated the role of high-sensitivity C-reactive protein (CRP) measurements as a predictor of late cardiovascular outcomes after ACS. One hundred and ninety-nine patients with ACS admitted to a Coronary Care Unit from March to November 2002 were included and reassessed clinically after approximately 3 years. Clinical variables and CRP levels were evaluated as predictors of major cardiovascular events (MACE, defined as the occurrence of cardiac death, ischemic stroke or myocardial infarction) and of mortality. Statistical analyses included Cox multivariable analysis and Kaplan-Meier survival curves. Of the 199 patients, 11 (5.5%) died within 1 month. Of the 188 remaining patients, 22 died after a mean follow-up of 2.9 ± 0.5 years. Baseline CRP levels of patients with MACE (N = 57) were significantly higher than those of patients with no events (median = 0.67 mg/L, 25th-75th percentiles = 0.32-1.99 mg/L vs. median = 0.45 mg/L, 25th-75th percentiles = 0.24-0.83 mg/L; P < 0.001). Patients with CRP levels >3 mg/L had significantly lower survival than the other two groups (1-3 and <1 mg/L; P = 0.001, log-rank test). The odds ratio for MACE was 7.41 (2.03-27.09) for patients with CRP >3 mg/L compared with those with CRP <1 mg/L; for death from any cause, the hazard ratio was 4.58 (1.93-10.86). High CRP levels predicted worse long-term outcomes (MACE and death from any cause) in patients with ACS.
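A minimal sketch of the kind of analysis described above (Kaplan-Meier survival by CRP stratum plus a log-rank comparison) is shown below using the Python lifelines package. The choice of lifelines is purely an assumption of tooling, the study does not say what software was used, and the data are simulated for illustration; they are not the study data.

import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n_per_group = 60
follow_up_years = 3.0

def simulate_group(event_probability):
    # Simulated follow-up times and event indicators (1 = event, 0 = censored); illustration only
    times = rng.uniform(0.1, follow_up_years, n_per_group)
    events = (rng.random(n_per_group) < event_probability).astype(int)
    return times, events

groups = {
    "CRP < 1 mg/L": simulate_group(0.10),
    "CRP 1-3 mg/L": simulate_group(0.15),
    "CRP > 3 mg/L": simulate_group(0.40),
}

kmf = KaplanMeierFitter()
for label, (times, events) in groups.items():
    kmf.fit(times, event_observed=events, label=label)
    print(label, "estimated survival at last observed time:",
          round(float(kmf.survival_function_.iloc[-1, 0]), 2))

t_low, e_low = groups["CRP < 1 mg/L"]
t_high, e_high = groups["CRP > 3 mg/L"]
result = logrank_test(t_low, t_high, event_observed_A=e_low, event_observed_B=e_high)
print("log-rank p-value, CRP < 1 vs > 3 mg/L:", round(result.p_value, 4))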
Abstract:
Damping-off is a nursery disease of great economic importance in papaya, and seed treatment may be an effective control measure. The aim of this work was to evaluate the quality of papaya seeds treated with fungicides and stored under two environmental and packaging conditions, and to assess the efficiency of the fungicide treatments in controlling damping-off caused by Rhizoctonia solani. Papaya seeds were treated with the fungicides Captan, Tolylfluanid, and the mixture Tolylfluanid + Captan (all commercial wettable-powder formulations); seeds of the control group were not treated. The seeds were stored for nine months under two conditions: packed in aluminum-coated paper and kept at 7 ± 1 °C, or packed in permeable kraft paper and kept in a non-controlled environment. At the beginning of storage and every three months thereafter, seed quality (germination and vigor tests), emergence rate index, plant height, dry mass, and pre- and post-emergence damping-off (in contaminated and mycelium-free substrates) were analyzed. Both storage conditions and the fungicide treatments preserved germination and seed vigor. In the infested substrate, seedling emergence was favored by the fungicides, but in post-emergence the fungicides alone did not control damping-off caused by R. solani. Symptoms of damping-off were not observed in the clean substrate. The results showed that fungicide treatments may be used to pretreat papaya seed for long-term storage and commercialization.
Abstract:
The long-lived flowers of orchids increase the chances of pollination and thus the reproductive success of the species. However, a question arises: does the efficiency of pollination, expressed as fruit set, vary with flower age? The objective of this study was to verify whether the flower age of Corymborkis flava (Sw.) Kuntze affects pollination efficiency. The following hypotheses were tested: 1) the fruit set of older flowers is lower than that of younger ones; 2) morphological observations (perianth and stigmatic area), the stigma receptivity test using a hydrogen peroxide solution, and hand-pollination tests are equally effective in defining the period of stigmatic receptivity. Flowers were found to be receptive from the first to the fourth day of anthesis. Fruit set of older flowers (third and fourth day) was lower than that of younger flowers. Morphological observations, the stigma receptivity test, and hand-pollinations were equally effective in defining the period of stigmatic receptivity. However, to evaluate the maximum degree of stigma receptivity of orchid species with long-lived flowers, we recommend hand-pollinations beyond the period of receptivity.
Abstract:
OBJECTIVE: To assess the personal autonomy of long-stay psychiatric inpatients, to identify patients who could be discharged, and to evaluate the impact of sociodemographic variables, social functioning, and physical disabilities on their autonomy. METHODS: A total of 584 long-stay individuals of a psychiatric hospital (96% of the hospital population) in Southern Brazil were assessed between July and August 2002. The following instruments, adapted to the Brazilian context, were used: the Independent Living Skills Survey, the Social Behavior Schedule, and a questionnaire for assessing physical disability. RESULTS: Patients showed severe impairment of personal autonomy, especially concerning money management, work-related skills and leisure, food preparation, and use of transportation. Autonomy deterioration was associated with length of stay (OR=1.02), greater physical disability (OR=1.54; p=0.01), and male gender (OR=3.11; p<0.001). The estimated risk of autonomy deterioration was 23 times greater among individuals with severe impairment of social functioning (95% CI: 10.67-49.24). CONCLUSIONS: The inpatients studied showed serious impairment of autonomy, and these deficits should be taken into consideration when planning their discharge. Assessing patients' functional abilities and autonomy helps identify their care needs and evaluate their actual possibilities of social reinsertion.
Abstract:
OBJECTIVE: To describe the demographic profile, social functioning, and quality of life of a population of long-stay patients in a psychiatric hospital. METHODS: A study was carried out in Porto Alegre, Southern Brazil, in 2002. A total of 584 (96%) long-stay patients were assessed by means of the following instruments: the World Health Organization Quality of Life instrument, the Social Behavior Schedule, the Independent Living Skills Survey, the Brief Psychiatric Rating Scale, and a questionnaire for assessing physical disability. RESULTS: The average hospital stay was 26 years (SD: 15.8), and 46.6% of inpatients had no physical disability. Patients' social functioning skills and autonomy were largely impaired. Few of them (27.7%) answered the quality-of-life instrument, and those who did showed significant impairments in all domains. The Brief Psychiatric Rating Scale showed a low prevalence of positive symptoms in this population. CONCLUSIONS: The institutionalized population studied presented significantly impaired social functioning, autonomy, and quality of life. These aspects need to be taken into consideration when planning their deinstitutionalization.