27 results for "Long-term follow-up study"
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (Digital Library of Intellectual Production of the University of São Paulo)
Abstract:
Assessing the efficacy of implantable cardioverter-defibrillators (ICD) in patients with Chagas' heart disease (ChHD) and identifying the clinical predictors of mortality and ICD shock during long-term follow-up. ChHD is associated with ventricular tachyarrhythmias and an increased risk of sudden cardiac death. Although ChHD is a common form of cardiomyopathy among ICD users in Latin America, little is known about the efficacy of ICD therapy in this population. The study cohort included 116 consecutive patients with ChHD and an ICD implanted for secondary prevention. Of the 116 patients, 83 (72%) were men; the mean age was 54 +/- 10.7 years. Several clinical variables were tested in a multivariate Cox model for predicting long-term mortality. The average follow-up was 45 +/- 32 months. New York Heart Association class I-II developed in 83% of patients. The mean left ventricular ejection fraction was 42 +/- 16% at implantation. Of the 116 patients, 58 (50%) had appropriate shocks and 13 (11%) had inappropriate therapy. A total of 31 patients died (7.1% annual mortality rate). New York Heart Association class III (hazard ratio [HR] 3.09, 95% confidence interval 1.37 to 6.96, p = 0.0064) was a predictor of a worse prognosis. The left ventricular ejection fraction (HR 0.972, 95% confidence interval 0.94 to 0.99, p = 0.0442) and low cumulative right ventricular pacing (HR 0.23, 95% confidence interval 0.11 to 0.49, p = 0.0001) were predictors of better survival. The left ventricular diastolic diameter was an independent predictor of appropriate shock (HR 1.032, 95% confidence interval 1.004 to 1.060, p = 0.025). In conclusion, in a long-term follow-up, ICD efficacy for secondary prevention of sudden cardiac death in patients with ChHD was marked by a favorable annual rate of all-cause mortality (7.1%); 50% of the cohort received appropriate shock therapy.
New York Heart Association class III and left ventricular ejection fraction were independent predictors of worse prognosis, and low cumulative right ventricular pacing defined better survival. (C) 2012 Elsevier Inc. All rights reserved. (Am J Cardiol 2012;110:1040-1045)
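The Cox-model hazard ratios reported above are per-unit multipliers of the hazard, so effects for larger changes compound on the log scale. A minimal sketch in Python (the 10-point LVEF difference below is a hypothetical illustration, not a figure from the study):

```python
import math

def scaled_hr(hr_per_unit: float, units: float) -> float:
    """Scale a per-unit hazard ratio to a multi-unit change;
    hazard ratios multiply on the log scale."""
    return math.exp(units * math.log(hr_per_unit))

# The abstract reports HR 0.972 per 1-point increase in LVEF.
# A hypothetical 10-point higher LVEF would then correspond to:
print(round(scaled_hr(0.972, 10), 3))  # -> 0.753, i.e. ~25% lower hazard
```

The same scaling applies to any other per-unit estimate, such as the left ventricular diastolic diameter effect on appropriate shock.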
Abstract:
Background and Objectives: Patients who survive acute kidney injury (AKI), especially those with partial renal recovery, have a higher long-term mortality risk. However, there is no consensus on the best time to assess renal function after an episode of AKI, nor agreement on the definition of renal recovery. In addition, only limited data regarding predictors of recovery are available. Design, Setting, Participants, & Measurements: From 1984 to 2009, 84 adult survivors of AKI were followed by the same nephrologist (RCRMA) for a median time of 4.1 years. Patients were seen at least once each year after discharge until end stage renal disease (ESRD) or death. At each consultation, serum creatinine was measured and the glomerular filtration rate estimated. Renal recovery was defined as a glomerular filtration rate >= 60 mL/min/1.73 m2. A multiple logistic regression was performed to evaluate factors independently associated with renal recovery. Results: The median length of follow-up was 50 months (30-90 months). All patients had stabilized their glomerular filtration rates by 18 months, and 83% of them stabilized earlier, by 12 months. Renal recovery occurred in 16 patients (19%) at discharge and in 54 (64%) by 18 months. Six patients died and four progressed to ESRD during the follow-up period. Age (OR 1.09, p < 0.0001) and serum creatinine at hospital discharge (OR 2.48, p = 0.007) were independent factors associated with non-recovery of renal function. AKI severity, evaluated by peak serum creatinine and need for dialysis, was not associated with non-recovery. Conclusions: Renal recovery should be evaluated no earlier than one year after an AKI episode. Nephrology referral should be considered mainly for older patients and those with elevated serum creatinine at hospital discharge.
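The abstract estimates the glomerular filtration rate from serum creatinine but does not name the equation used. As a hedged illustration only, the classic Cockcroft-Gault formula estimates creatinine clearance (a related but distinct quantity, not necessarily the study's equation; the patient values below are hypothetical):

```python
def cockcroft_gault(age_y: float, weight_kg: float,
                    scr_mg_dl: float, female: bool = False) -> float:
    """Cockcroft-Gault estimated creatinine clearance (mL/min):
    (140 - age) * weight / (72 * SCr), times 0.85 for women."""
    crcl = (140 - age_y) * weight_kg / (72.0 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

# Hypothetical AKI survivor: 60 y, 70 kg, serum creatinine 1.0 mg/dL
print(round(cockcroft_gault(60, 70, 1.0), 1))  # -> 77.8
```

Modern practice typically prefers body-surface-normalized equations (e.g. MDRD or CKD-EPI), which report in mL/min/1.73 m2, matching the units of the abstract's recovery threshold.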
Abstract:
Objectives. Adductor spasmodic dysphonia (ADSD) is a focal laryngeal dystonia that greatly compromises the quality of life of affected patients. It is a severe vocal disorder characterized by spasms of the laryngeal muscles during speech, producing phonatory breaks and a forced, strained, strangled voice. Its symptoms result from involuntary and intermittent contractions of the thyroarytenoid muscle during speech, which cause the vocal folds to strain, pressing against each other and increasing glottic resistance. Botulinum toxin injection remains the gold-standard treatment. However, because injections must be repeated periodically, leading to voice quality instability, a more definitive procedure would be desirable. In this pilot study we report the long-term vocal quality results of endoscopic laser thyroarytenoid myoneurectomy. Study Design. Prospective study. Methods. Surgery was performed in 15 patients (11 females and 4 males), aged between 29 and 73 years, diagnosed with ADSD. The Voice Handicap Index (VHI) was obtained before and after surgery (median 31 months postoperatively). Results. A significant improvement in VHI was observed after surgery compared with baseline values (P = 0.001). The median and interquartile range were 99 and 13 for preoperative VHI, and 24 and 42 for postoperative VHI. Subjective improvement of voice as assessed by the patients showed a median improvement of 80%. Conclusions. Because long-term follow-up showed significant improvement of voice quality, this innovative surgical technique seems a satisfactory alternative treatment for ADSD patients who seek a definitive improvement of their condition.
Abstract:
OBJECTIVE: The significance of pretransplant donor-specific antibodies for long-term patient outcomes is a subject of debate. This study evaluated the impact of the presence or absence of donor-specific antibodies after kidney transplantation on short- and long-term graft outcomes. METHODS: We analyzed the frequency and dynamics of pretransplant donor-specific antibodies following renal transplantation from a randomized trial conducted from 2002 to 2004 and correlated these findings with patient outcomes through 2009. Transplants were performed against a complement-dependent T- and B-cell-negative crossmatch. Pre- and posttransplant sera were available for 94 of the 118 patients (80%). Antibodies were detected using a solid-phase, single-bead assay (Luminex), and all tests were performed simultaneously. RESULTS: Sixteen patients exhibited pretransplant donor-specific antibodies, but only 3 of them (19%) developed antibody-mediated rejection, and 2 of these experienced early graft losses. Excluding these 2 losses, 6 of 14 patients exhibited donor-specific antibodies at the final follow-up exam, whereas 8 (57%) exhibited complete clearance of the donor-specific antibodies. Five other patients developed "de novo" posttransplant donor-specific antibodies. Death-censored graft survival was similar in patients with pretransplant donor-specific and non-donor-specific antibodies after a mean follow-up period of 70 months. CONCLUSION: Pretransplant donor-specific antibodies with a negative complement-dependent cytotoxicity crossmatch are associated with a risk of antibody-mediated rejection, although survival rates are similar once patients survive the first months after receiving the graft. Our data also suggest that early posttransplant donor-specific antibody monitoring should increase knowledge of antibody dynamics and their impact on long-term graft outcome.
Abstract:
Background: Although linear growth during childhood may be affected by early-life exposures, few studies have examined whether the effects of these exposures linger on during school age, particularly in low- and middle-income countries. Methods: We conducted a population-based longitudinal study of 256 children living in the Brazilian Amazon, aged 0.1 y to 5.5 y in 2003. Data regarding socioeconomic and maternal characteristics, infant feeding practices, morbidities, and birth weight and length were collected at baseline of the study (2003). Child body length/height was measured at baseline and at follow-up visits (in 2007 and 2009). Restricted cubic splines were used to construct average height-for-age Z score (HAZ) growth curves, yielding estimated HAZ differences among exposure categories at ages 0.5 y, 1 y, 2 y, 5 y, 7 y, and 10 y. Results: At baseline, median age was 2.6 y (interquartile range, 1.4 y-3.8 y), and mean HAZ was -0.53 (standard deviation, 1.15); 10.2% of children were stunted. In multivariable analysis, children in households above the household wealth index median were 0.30 Z taller at age 5 y (P = 0.017), and children whose families owned land were 0.34 Z taller by age 10 y (P = 0.023), when compared with poorer children. Mothers in the highest tertile for height had children whose HAZ were significantly higher compared with those of children from mothers in the lowest height tertile at all ages. Birth weight and length were positively related to linear growth throughout childhood; by age 10 y, children weighing >3500 g at birth were 0.31 Z taller than those weighing 2501 g to 3500 g (P = 0.022) at birth, and children measuring >= 51 cm at birth were 0.51 Z taller than those measuring <= 48 cm (P = 0.005). Conclusions: Results suggest socioeconomic background is a potentially modifiable predictor of linear growth during the school-aged years.
Maternal height and child's anthropometric characteristics at birth are positively associated with HAZ up until child age 10 y.
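A height-for-age Z score (HAZ) expresses a child's height in standard-deviation units relative to a growth reference for age and sex. A minimal sketch (the reference median and SD below are illustrative placeholders, not WHO table values):

```python
def height_for_age_z(height_cm: float, ref_median_cm: float,
                     ref_sd_cm: float) -> float:
    """Height-for-age Z score against a growth reference."""
    return (height_cm - ref_median_cm) / ref_sd_cm

# Hypothetical child measuring 105 cm against an assumed
# reference median of 110 cm with SD 4.5 cm:
z = height_for_age_z(105.0, 110.0, 4.5)
print(round(z, 2))  # -> -1.11
print(z < -2)       # stunting cutoff (HAZ < -2) -> False
```

The abstract's "0.30 Z taller" style of contrast is a difference between two such scores, so it is already adjusted for age and sex through the reference.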
Abstract:
Background: Bariatric surgery influences the intake and absorption of nutrients. Serum concentrations of vitamin C and myeloperoxidase (MPO) and oral clinical manifestations were examined in patients two years after Roux-en-Y gastric bypass (RYGB). Methods: Prospective clinical study with a control group (CG; n = 26), assessed only once, and a bariatric group (BG; n = 26), assessed at baseline and at 12 and 24 months after surgery. The mean ages in the CG and BG were 37.8 +/- 1.51 and 39.6 +/- 1.93 years, respectively, and their body mass indices were 22.07 +/- 0.29 and 45.62 +/- 1.46 kg/m2, respectively. Results: At 12 months after surgery, increased episodes of vomiting (P < .001) and dental hypersensitivity (P = .012) were observed, with a reduction in saliva buffering capacity of 21.3 +/- 2.9% (P = .004). At 24 months after RYGB, we detected a significant reduction in serum vitamin C (32.9 +/- 5.3%, P < .001), and MPO values were higher than in the basal period (P = .032). With regard to oral hygiene habits, 92.3% of patients reported frequent tooth brushing and 96.1% used fluoride, figures that were similar across the two years. However, dental hypersensitivity (P = .048) was significantly increased compared with baseline. Conclusions: The results demonstrated that vitamin C deficiency and increased vomiting after RYGB for morbid obesity may contribute to increased periodontal disease. The fact that it is impossible to determine which factors (diet, poor compliance with supplementation, vomiting, poor oral hygiene) contributed to the dental problems in these patients is a shortcoming of the report. (Nutr Clin Pract. 2012; 27: 114-121)
HPV clearance in postpartum period of HIV-positive and negative women: a prospective follow-up study
Abstract:
Abstract Background HPV persistence is a key determinant of cervical carcinogenesis. The influence of the postpartum period on HPV clearance has been debated. This study aimed to assess HPV clearance in later pregnancy and postpartum among HIV-positive and HIV-negative women. Methods We conducted a follow-up study of 151 HPV-positive women, with or without HIV coinfection, in 2007–2010. After the baseline assessment, all women were retested for HPV infection using PCR in later pregnancy and after delivery. Multivariable logistic regressions assessed the putative association of covariates with HPV status between each pair of successive visits. Results Seventy-one women (47%) eliminated HPV between the baseline visit and their second or third visits. HIV-positive women took significantly longer (7.0 ± 3.8 months) to clear HPV than those not infected by HIV (5.9 ± 3.0 months). HPV clearance was significantly more likely to take place after delivery than during pregnancy (84.5% vs. 15.5%). Conclusions Both HIV-positive and HIV-negative women presented a significant reduction in HPV infection during the postpartum period. HIV-positive status was associated with a longer time to clear HPV infection in pregnant women.
Abstract:
Background. Brazil conducted mass immunization of women of childbearing age in 2001 and 2002. Surveillance was initiated for vaccination of women during pregnancy to monitor the effects of rubella vaccination on fetal outcomes. Methods. Women vaccinated while pregnant or prior to conception were reported to the surveillance system. Susceptibility to rubella infection was determined by anti-rubella immunoglobulin (Ig) M and IgG immunoassays. Susceptible women were observed through delivery. Live-born infants were tested for anti-rubella IgM antibody; IgM-seropositive newborns were tested for viral shedding and observed for 12 months for signs of congenital rubella syndrome. Incidence of congenital rubella infection was calculated using data from 7 states. Results. A total of 22 708 cases of rubella vaccination during pregnancy or prior to conception were reported nationwide, 20 536 (90%) of which were from 7 of 27 states in Brazil. Of these, 2332 women were susceptible to rubella infection at vaccination. Sixty-seven (4.1%) of 1647 newborns had rubella IgM antibody (incidence rate, 4.1 congenital infections per 100 susceptible women vaccinated during pregnancy [95% confidence interval, 3.2–5.1]). None of the infants infected with rubella vaccine virus was born with congenital rubella syndrome. Conclusions. As rubella elimination goals are adopted worldwide, evidence of rubella vaccine safety aids in planning and implementation of mass adult immunization.
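The reported incidence is simply events per susceptible vaccinated women, scaled per 100. The point estimate can be reproduced with a normal-approximation (Wald) confidence interval; the abstract's 3.2–5.1 interval was presumably computed with a slightly different method (e.g. an exact binomial CI), so the bounds below differ marginally:

```python
import math

def rate_per_100_with_ci(events: int, n: int, z: float = 1.96):
    """Proportion per 100 with a Wald 95% confidence interval."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return (round(100 * p, 1),
            round(100 * (p - z * se), 1),
            round(100 * (p + z * se), 1))

# 67 rubella IgM-positive newborns among 1647 tested
print(rate_per_100_with_ci(67, 1647))  # -> (4.1, 3.1, 5.0)
```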
Abstract:
To evaluate the feasibility, safety, and potential beneficial effects of left cardiac sympathetic denervation (LCSD) in systolic heart failure (HF) patients. In this prospective, randomized pilot study, inclusion criteria were New York Heart Association (NYHA) functional class II or III, left ventricular ejection fraction (LVEF) <= 40%, sinus rhythm, and resting heart rate >= 65 b.p.m., despite optimal medical therapy (MT). Fifteen patients were randomly assigned either to MT alone or MT plus LCSD. The primary endpoint was safety, measured by mortality in the first month of follow-up and morbidity according to pre-specified criteria. Secondary endpoints were exercise capacity, quality of life, LVEF, muscle sympathetic nerve activity (MSNA), brain natriuretic peptide (BNP) levels, and 24 h Holter mean heart rate before and after 6 months. We studied clinical effects in long-term follow-up. Ten patients underwent LCSD. There were no adverse events attributable to surgery. In the LCSD group, LVEF improved from 25 +/- 6.6% to 33 +/- 5.2% (P = 0.03); 6 min walking distance improved from 167 +/- 35 to 198 +/- 47 m (P = 0.02). The Minnesota Living with Heart Failure Questionnaire (MLWHFQ) physical dimension score changed from 21 +/- 5 to 15 +/- 7 (P = 0.06). The remaining analysed variables were unchanged. During 848 +/- 549 days of follow-up, in the MT group three patients either died or underwent cardiac transplantation (CT), while in the LCSD group six were alive without CT. LCSD was feasible and seemed to be safe in systolic HF patients. Its beneficial effects warrant the development of a larger randomized trial. Trial registration: NCT01224899.
Abstract:
Objective: Optimal surgical treatment of patients with transposition of the great arteries (TGA), ventricular septal defect (VSD), and pulmonary stenosis (PS) remains a matter of debate. This study evaluated the clinical outcome and right ventricle outflow tract performance in the long-term follow-up of patients subjected to pulmonary root translocation (PRT) as part of their surgical repair. Methods: From April 1994 to December 2010, we operated on 44 consecutive patients (median age, 11 months). All had malposition of the great arteries as follows: TGA with VSD and PS (n = 33); double-outlet right ventricle with subpulmonary VSD (n = 7); double-outlet right ventricle with atrioventricular septal defect (n = 1); and congenitally corrected TGA with VSD and PS (n = 3). The surgical technique consisted of PRT from the left ventricle to the right ventricle after construction of an intraventricular tunnel that diverted blood flow from the left ventricle to the aorta. Results: The mean follow-up time was 72 +/- 52.1 months. There were 3 (6.8%) early deaths and 1 (2.3%) late death. Kaplan-Meier survival was 92.8% and reintervention-free survival was 82.9% at 12 years. Repeat echocardiographic data showed nonlinear growth of the pulmonary root and good performance of the valve at 10 years. Only 4 patients required reinterventions owing to right ventricular outflow tract problems. Conclusions: PRT is a good surgical alternative for treatment of patients with TGA complexes, VSD, and PS, with acceptable operative risk, high long-term survival, and few reinterventions. Most patients had adequate pulmonary root growth and performance. (J Thorac Cardiovasc Surg 2012;143:1292-8)
Abstract:
Purpose: Few reports have evaluated cumulative survival rates of extraoral rehabilitation and peri-implant soft tissue reaction at long-term follow-up. The objective of this study was to evaluate implant and prosthesis survival rates and the soft tissue reactions around the extraoral implants used to support craniofacial prostheses. Materials and Methods: A retrospective study was performed of patients who received implants for craniofacial rehabilitation from 2003 to 2010. Two outcome variables were considered: implant and prosthetic success. The following predictor variables were recorded: gender, age, implant placement location, number and size of implants, irradiation status in the treated field, date of prosthesis delivery, soft tissue response, and date of last follow-up. A statistical model was used to estimate survival rates and associated confidence intervals. We randomly selected 1 implant per patient for analysis. Data were analyzed using the Kaplan-Meier method and log-rank test to compare survival curves. Results: A total of 150 titanium implants were placed in 56 patients. The 2-year overall implant survival rates were 94.1% for auricular implants, 90.9% for nasal implants, 100% for orbital implants, and 100% for complex midfacial implants (P = .585). The implant survival rates were 100% for implants placed in irradiated patients and 94.4% for those placed in nonirradiated patients (P = .324). The 2-year overall prosthesis survival rates were 100% for auricular implants, 90.0% for nasal implants, 92.3% for orbital implants, and 100% for complex midfacial implants (P = .363). The evaluation of the peri-implant soft tissue response showed that 15 patients (26.7%) had a grade 0 soft tissue reaction, 30 (53.5%) had grade 1, 6 (10.7%) had grade 2, and 5 (8.9%) had grade 3. 
Conclusions: Craniofacial rehabilitation with extraoral implants is a safe, reliable, and predictable method to restore the patient's normal appearance. (C) 2012 American Association of Oral and Maxillofacial Surgeons J Oral Maxillofac Surg 70:1551-1557, 2012
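The survival rates above were obtained with the Kaplan-Meier method, which multiplies the conditional probabilities of surviving past each observed event time. A self-contained sketch on hypothetical follow-up data (not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate.
    times: follow-up duration; events: 1 = failure, 0 = censored.
    Returns [(time, S(t))] at each distinct failure time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # failures at t
        c = sum(1 for tt, _ in data if tt == t)   # all leaving at t
        if d > 0:
            s *= (n_at_risk - d) / n_at_risk
            curve.append((t, s))
        n_at_risk -= c
        i += c
    return curve

# Hypothetical implant follow-up in months (0 = still in function)
times  = [6, 12, 12, 18, 24, 24, 24]
events = [1,  0,  1,  0,  0,  0,  1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))  # 6 0.857 / 12 0.714 / 24 0.476
```

The log-rank test mentioned in the abstract then compares two or more such curves by contrasting observed and expected failures at each event time.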
Abstract:
Ischemia/reperfusion (I/R) injury remains a major cause of graft dysfunction, which impacts short- and long-term follow-up. Hyperbaric oxygen therapy (HBO), through plasma oxygen transport, is currently used as an alternative treatment for ischemic tissues. The aim of this study was to analyze the effects of HBO in a kidney I/R injury model in rats and its capacity to reduce the harmful effects of I/R. The renal I/R model was obtained by occluding the bilateral renal pedicles with nontraumatic vascular clamps for 45 minutes, followed by 48 hours of reperfusion. HBO therapy was delivered in a hyperbaric chamber (2.5 atmospheres absolute). Animals underwent two sessions of 60 minutes each, at 6 hours and 20 hours after initiation of reperfusion. Male Wistar rats (n = 38) were randomized into four groups: sham, sham-operated rats; sham+HBO, sham-operated rats exposed to HBO; I/R, animals submitted to I/R; and I/R+HBO, I/R rats exposed to HBO. Blood, urine, and kidney tissue were collected for biochemical, histologic, and immunohistochemical analyses. The histopathological evaluation of the ischemic injury used a grading scale of 0 to 4. HBO attenuated renal dysfunction after ischemia, characterized by a significant decrease in blood urea nitrogen (BUN), serum creatinine, and proteinuria in the I/R+HBO group compared with I/R alone. In parallel, tubular function was improved, resulting in significantly lower fractional excretions of sodium and potassium. Kidney sections from the I/R+HBO group showed significantly lower acute kidney injury scores compared with the I/R group. HBO treatment significantly diminished proliferative activity in I/R (P < .05). There was no significant difference in macrophage infiltration or heme oxygenase-1 expression. In conclusion, HBO attenuated renal dysfunction in a kidney I/R injury model, with a decrease in BUN, serum creatinine, proteinuria, and fractional excretion of sodium and potassium, associated with reduced histological damage.
Abstract:
Context: Jansen's metaphyseal chondrodysplasia (JMC) is a rare autosomal dominant disorder caused by activating mutations in the PTH type 1 receptor (PTH1R; PTH/PTHrP receptor), leading to chronic hypercalcemia and hypercalciuria. Hypophosphatemia is also a hallmark of JMC, and recently, increased fibroblast growth factor 23 (FGF23) levels have been reported in this syndrome. Hypercalcemia has been associated with increased cardiovascular risk; however, cardiovascular disease has not been extensively investigated in JMC patients. Objective: The aim of the study was to describe the long-term follow-up of a JMC patient with regard to the management of hypercalciuria, the evaluation of FGF23 levels under bisphosphonate treatment, and the investigation of cardiovascular repercussions of chronic hypercalcemia. Results: The diagnosis of JMC was confirmed by molecular analysis (p.H223R mutation in PTH1R). The patient was followed from 5 to 27 yr of age. Asymptomatic nephrolithiasis was diagnosed at 18 yr of age, prompting pharmacological management of hypercalciuria. Treatment with alendronate reduced hypercalciuria; however, normocalciuria was only obtained with the addition of a thiazide diuretic. Serum FGF23 levels, measured under alendronate treatment, were repeatedly within the normal range. Subclinical cardiovascular disease was investigated when the patient was 26 yr old, after 19 yr of sustained mild hypercalcemia; carotid and vertebral artery ultrasonography was normal, as was coronary computed tomography angiography (calcium score = 0). Conclusion: The long-term follow-up of our JMC patient has provided insight on therapeutic strategies to control hypercalciuria, on the potential effects of alendronate on FGF23 levels, and on the lack of detectable cardiovascular disease at young adulthood after prolonged exposure to hypercalcemia. (J Clin Endocrinol Metab 97: 1098-1103, 2012)
Abstract:
A serological follow-up study was carried out on 27 children (1–12 years old) with visceral and/or ocular toxocariasis, after treatment with thiabendazole. A total of 159 serum samples were collected over a period ranging from 22 to 116 months. Enzyme-linked immunosorbent assays (IgG, IgA, and IgE ELISA) were standardized, using excretory–secretory antigens obtained from second-stage larvae of a Toxocara canis culture. The sensitivity found for the IgG, IgA, and IgE ELISA, as determined in visceral toxocariasis patients, was 100%, 47.8%, and 78.3%, respectively. Approximately 84% of the patients presented single or multiple parasitosis, as diagnosed by stool examination, yet such variables did not appear to affect the anti-Toxocara immune response. Titers of specific IgE antibody showed a significant decrease during the first year after treatment, followed by a decrease in the IgA titers in the second year, and in the IgG titers from the fourth year onwards. Sera from all patients presented high-avidity IgG antibodies, indicating that they were in the chronic phase of the disease. Moreover, 1 year after treatment, the levels of leukocytes, eosinophils, and anti-A isohemagglutinin in patients decreased significantly. The present data suggest that IgE antibodies plus eosinophil counts are helpful parameters for patient follow-up after chemotherapy.
Abstract:
Porokeratosis is a primary keratinizing disorder of unknown etiology. This disorder is characterized by the presence of centrifugally enlarging hyperkeratotic plaques, associated with the histopathological hallmark of cornoid lamellae. Genital porokeratosis is extremely rare. No more than thirty cases have been reported in the literature, including only one case of linear porokeratosis confined to the genital area. This case report describes a patient with genital linear porokeratosis, who was successfully treated with cryotherapy. Over two years of follow-up, the lesion improved and there was no evidence of recurrence or signs of malignant transformation. Nevertheless, there is a need for long-term follow-up data on recurrence and malignant transformation.