114 results for Randomized-trial
Abstract:
Background-Prasugrel is a novel thienopyridine that reduces new or recurrent myocardial infarctions (MIs) compared with clopidogrel in patients with acute coronary syndrome undergoing percutaneous coronary intervention. This effect must be balanced against an increased bleeding risk. We aimed to characterize the effect of prasugrel with respect to the type, size, and timing of MI using the universal classification of MI. Methods and Results-We studied 13 608 patients with acute coronary syndrome undergoing percutaneous coronary intervention randomized to prasugrel or clopidogrel and treated for 6 to 15 months in the Trial to Assess Improvement in Therapeutic Outcomes by Optimizing Platelet Inhibition With Prasugrel-Thrombolysis in Myocardial Infarction (TRITON-TIMI 38). Each MI underwent supplemental classification as spontaneous, secondary, or sudden cardiac death (types 1, 2, and 3) or procedure related (types 4 and 5), and events occurring early and after 30 days were examined. Prasugrel significantly reduced the overall risk of MI (7.4% versus 9.7%; hazard ratio [HR], 0.76; 95% confidence interval [CI], 0.67 to 0.85; P < 0.0001). This benefit was present for procedure-related MIs (4.9% versus 6.4%; HR, 0.76; 95% CI, 0.66 to 0.88; P = 0.0002) and nonprocedural (type 1, 2, or 3) MIs (2.8% versus 3.7%; HR, 0.72; 95% CI, 0.59 to 0.88; P = 0.0013), and was consistent across MI size, including MIs with a biomarker peak >= 5 times the reference limit (HR, 0.74; 95% CI, 0.64 to 0.86; P = 0.0001). In landmark analyses starting at 30 days, patients treated with prasugrel had a lower risk of any MI (2.9% versus 3.7%; HR, 0.77; P = 0.014), including nonprocedural MI (2.3% versus 3.1%; HR, 0.74; 95% CI, 0.60 to 0.92; P = 0.0069). Conclusion-Treatment with prasugrel compared with clopidogrel for up to 15 months in patients with acute coronary syndrome undergoing percutaneous coronary intervention significantly reduces the risk of MIs that are procedure related and spontaneous and those that are small and large, including new MIs occurring during maintenance therapy. (Circulation. 2009; 119: 2758-2764.)
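As background for the landmark analyses cited above: a landmark analysis restarts follow-up at a chosen time point (here, day 30) among patients who are still event-free at that point, and the hazard ratio is then estimated from the new time origin. The sketch below is a purely illustrative Python example using the lifelines package; the data frame and its columns (time_to_mi, mi_event, prasugrel) are hypothetical placeholders, not TRITON-TIMI 38 variables.

```python
import pandas as pd
from lifelines import CoxPHFitter


def landmark_cox(df: pd.DataFrame, landmark_day: float = 30.0) -> CoxPHFitter:
    """Fit a Cox model restricted to patients still event-free at the landmark."""
    # Keep only patients who have not had an MI before the landmark day.
    at_risk = df[df["time_to_mi"] > landmark_day].copy()
    # Reset the time origin so follow-up starts at the landmark (day 30).
    at_risk["time_from_landmark"] = at_risk["time_to_mi"] - landmark_day
    cph = CoxPHFitter()
    cph.fit(at_risk[["time_from_landmark", "mi_event", "prasugrel"]],
            duration_col="time_from_landmark", event_col="mi_event")
    return cph


# Usage with a hypothetical data frame: the hazard ratio for prasugrel versus
# clopidogrel is exp(coefficient) of the treatment indicator.
# model = landmark_cox(df); print(model.hazard_ratios_)
```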
Abstract:
Objectives We evaluated demographic, clinical, and angiographic factors influencing the selection of coronary artery bypass graft (CABG) surgery versus percutaneous coronary intervention (PCI) in diabetic patients with multivessel coronary artery disease (CAD) in the BARI 2D (Bypass Angioplasty Revascularization Investigation in Type 2 Diabetes) trial. Background Factors guiding selection of mode of revascularization for patients with diabetes mellitus and multivessel CAD are not clearly defined. Methods In the BARI 2D trial, the selected revascularization strategy, CABG or PCI, was based on physician discretion and declared independently of randomization to either immediate or deferred revascularization if clinically warranted. We analyzed factors favoring selection of CABG versus PCI in 1,593 diabetic patients with multivessel CAD enrolled between 2001 and 2005. Results Selection of CABG over PCI was declared in 44% of patients and was driven by angiographic factors including triple vessel disease (odds ratio [OR]: 4.43), left anterior descending stenosis >= 70% (OR: 2.86), proximal left anterior descending stenosis >= 50% (OR: 1.78), total occlusion (OR: 2.35), and multiple class C lesions (OR: 2.06) (all p < 0.005). Nonangiographic predictors of CABG included age >= 65 years (OR: 1.43, p = 0.011) and non-U.S. region (OR: 2.89, p = 0.017). Absence of prior PCI (OR: 0.45, p < 0.001) and the availability of drug-eluting stents conferred a lower probability of choosing CABG (OR: 0.60, p = 0.003). Conclusions The majority of diabetic patients with multivessel disease were selected for PCI rather than CABG. Preference for CABG over PCI was largely based on angiographic features related to the extent, location, and nature of CAD, as well as geographic, demographic, and clinical factors. (Bypass Angioplasty Revascularization Investigation in Type 2 Diabetes [BARI 2D]; NCT00006305) (J Am Coll Cardiol Intv 2009;2:384-92) (C) 2009 by the American College of Cardiology Foundation
Abstract:
One thousand and fifteen male Ross broiler chickens, from 37 to 49 days of age, were used to evaluate the effects of different digestible lysine levels in the experimental diets. A completely randomized design was used, with 5 digestible lysine levels (0.90, 0.95, 1.00, 1.05, and 1.10%), each with 7 replicates and 29 birds per experimental unit. The diets were isoenergetic, with 3,250 kcal of ME/kg, and isonitrogenous, with 18% crude protein, based on corn and soybean meal. Weight gain, feed intake, feed conversion, carcass characteristics, cut yields, and body nutrient composition and deposition were evaluated. Among the performance traits, digestible lysine levels influenced only feed conversion, which improved linearly with the dietary lysine level. Among the carcass characteristics and cut yields, only abdominal fat increased quadratically with the lysine levels. Digestible lysine levels had a quadratic effect on carcass mineral matter content but did not influence the chemical composition of the viscera and blood. A trend toward a linear increase in carcass and empty-body protein deposition with increasing digestible lysine level was nevertheless observed. The performance results indicate that the digestible lysine level in diets for male broilers from 37 to 49 days of age should be equal to or greater than 1.10%.
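The linear and quadratic responses to dietary lysine described above are commonly assessed by regressing each trait on the lysine level across the five diets. The Python sketch below is illustrative only, assuming a simple polynomial regression; the data are synthetic, not the trial's measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pen-level feed conversion ratios (FCR): 5 digestible lysine
# levels with 7 replicates each, mirroring the design described above.
levels = np.repeat([0.90, 0.95, 1.00, 1.05, 1.10], 7)
rng = np.random.default_rng(0)
df = pd.DataFrame({"lysine": levels,
                   "fcr": 2.3 - 0.3 * levels + rng.normal(0, 0.05, levels.size)})

linear = smf.ols("fcr ~ lysine", data=df).fit()
quadratic = smf.ols("fcr ~ lysine + I(lysine ** 2)", data=df).fit()

# A significant negative linear coefficient corresponds to the reported linear
# improvement in feed conversion; a significant quadratic term would indicate
# a curvilinear response, as reported for abdominal fat.
print(linear.params["lysine"], quadratic.pvalues["I(lysine ** 2)"])
```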
Abstract:
Background: We evaluated the effectiveness of a school-based intervention on the promotion of physical activity among high school students in Brazil: the Saude no Boa project. Methods: A school-based, randomized trial was carried out in 2 Brazilian cities: Recife (northeast) and Florianopolis (south). Ten schools in each city were matched by size and location, and randomized into intervention or control groups. The intervention included environmental/organizational changes, physical activity education, and personnel training and engagement. Students aged 15 to 24 years were evaluated at baseline and 9 months later (end of school year). Results: Although similar at baseline, after the intervention the control group reported significantly fewer days per week accumulating 60+ minutes of moderate-to-vigorous physical activity (MVPA) than the intervention group (2.6 versus 3.3, P < .001). The prevalence of inactivity (0 days per week) rose in the control group and decreased in the intervention group. The odds ratio for engaging in physical activity at least once per week associated with the intervention was 1.83 (95% CI = 1.24-2.71) in the unadjusted analysis and 1.88 (95% CI = 1.27-2.79) after controlling for gender. Conclusion: The Saude no Boa intervention was effective at reducing the prevalence of physical inactivity. The possibility of expanding the intervention to other locations should be considered.
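The gender-adjusted odds ratio reported above is typically obtained by adding gender to a logistic regression of weekly activity on group assignment and exponentiating the treatment coefficient. The sketch below is only an illustration of that calculation; the toy data and variable names are hypothetical, not the Saude no Boa dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# active: 1 if the student reports physical activity at least once per week;
# group: 1 = intervention school, 0 = control school; female: 1 = female.
df = pd.DataFrame({
    "active": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1],
    "group":  [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1],
    "female": [0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1, 0],
})

unadjusted = smf.logit("active ~ group", data=df).fit(disp=False)
adjusted = smf.logit("active ~ group + female", data=df).fit(disp=False)

# exp(coefficient) is the odds ratio for the intervention; the second model
# parallels the abstract's OR of 1.88 after controlling for gender.
print(np.exp(unadjusted.params["group"]), np.exp(adjusted.params["group"]))
```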
Abstract:
Continued assessment of temporal trends in mortality and epidemiology of specific cardiovascular diseases in South America is needed to provide a scientific basis for rational allocation of limited healthcare resources, introduction of strategies to reduce risk, and prediction of the future burden of cardiovascular disease. The epidemiology of cardiomyopathies, adult valve disease and heart failure (HF) in South America is reviewed here. Diseases of the circulatory system are the main cause of death based on data covering about 50% of the South American population. Among the cardiovascular causes of death, cerebrovascular disease is predominant, followed by ischaemic heart disease, other heart diseases and hypertensive disease. Of note, cerebrovascular disease is the main cause of death in women, and race also influences cardiovascular mortality rates. HF is the most important cardiovascular cause of hospital admission, with ischaemic, idiopathic dilated cardiomyopathic, valvular, hypertensive and chagasic aetiologies. Mortality due to HF is also high, especially that owing to Chagas' disease. HF and aetiologies associated with HF are responsible for 6.3% of deaths. Rheumatic fever is the leading cause of valvular heart disease. These findings have important public health implications: the allocation of healthcare resources and strategies to reduce the risk of HF should also consider controlling Chagas' disease and rheumatic fever in South American countries.
Abstract:
Introduction: Treatment of severe bacterial peritonitis, especially by videolaparoscopy, is still a matter of investigation. The aim of the present study was to evaluate the effect of videolaparoscopic and laparotomy access, with or without antibiotics, on the outcome of severe bacterial peritonitis in rats. Materials and Methods: Sixty-four male Wistar rats were equally assigned to 8 groups: Sham surgery (SHAM), SHAM+antibiotics (SHAM+AB), cecal ligation and puncture (CLP), CLP+AB, CLP+videolaparoscopy (VLAP), CLP+laparotomy (LAP), VLAP+AB, and LAP+AB. All treated animals underwent evaluation of bacteremia, white cell counts, and cytokine levels: interleukin (IL)-1, IL-6, and tumor necrosis factor-alpha (TNF-alpha). The groups treated with antibiotics received gentamicin and metronidazole. Survival was monitored over a period of 7 days. Results: Peritonitis induced by CLP was severe, with IL-1, IL-6, and TNF-alpha levels and lethality significantly higher than in the SHAM group. IL-6 levels in the VLAP group were significantly higher than in the CLP and VLAP+AB groups, and TNF-alpha levels in the VLAP and LAP+AB groups were significantly higher than in the LAP group. Survival time was significantly longer in the CLP+AB and VLAP+AB groups than in the CLP group. There was no significant difference in bacteremia or lethality rates between the approaches used to treat peritonitis. Conclusions: Although laparoscopic access itself exacerbates the inflammatory response, its combination with antibiotics minimizes this effect and prolongs survival. However, all of the approaches used for treating severe peritonitis, whether applied alone or in combination, have an equivalent influence on bacteremia and lethality rates.
Abstract:
Our objective was to compare the frequency, degree, and location of perineal trauma during spontaneous delivery with or without perineal injections of hyaluronidase (HAase). This was a randomized, controlled pilot study conducted in a midwife-led hospital birth center in São Paulo, Brazil. Primiparous women (N = 139) were randomly assigned to an intervention group (HAase injection, n = 71) or to a control group (no injection, n = 68). Significant differences were noted between the two groups in the frequency of perineal trauma (intervention, 39.4%; control, 76.5%), degree of spontaneous laceration (intervention, 0.0%; control, 82.4%), and laceration located in the posterior region of the perineum (intervention, 54.2%; control, 84.3%). When episiotomy and second-degree lacerations were considered together and women with an intact perineum were excluded from the analysis, the difference between the groups was no longer significant. With the use of the HAase enzyme, the relative risk was 0.5 for perineal trauma and 0.0 for second-degree lacerations. The present findings suggest that perineal injection of HAase prevented perineal trauma. These findings provide a strong rationale for a larger follow-up study.
Abstract:
BACKGROUND: Guidelines for red blood cell (RBC) transfusions exist; however, transfusion practices vary among centers. This study aimed to analyze transfusion practices and the impact of patient and institutional characteristics on the indications for RBC transfusions in preterm infants. STUDY DESIGN AND METHODS: RBC transfusion practices were investigated in a multicenter prospective cohort of preterm infants with a birth weight of less than 1500 g born at eight public university neonatal intensive care units of the Brazilian Network on Neonatal Research. Variables associated with any RBC transfusion were analyzed by logistic regression analysis. RESULTS: Of 952 very-low-birth-weight infants, 532 (55.9%) received at least one RBC transfusion. The percentages of transfused neonates were 48.9, 54.5, 56.0, 61.2, 56.3, 47.8, 75.4, and 44.7%, respectively, for Centers 1 through 8. The number of transfusions during the first 28 days of life was higher in Centers 4 and 7 than in the other centers. After 28 days, the number of transfusions decreased, except in Center 7. Multivariate logistic regression analysis showed a higher likelihood of transfusion in infants with late-onset sepsis (odds ratio [OR], 2.8; 95% confidence interval [CI], 1.8-4.4), intraventricular hemorrhage (OR, 9.4; 95% CI, 3.3-26.8), intubation at birth (OR, 1.7; 95% CI, 1.0-2.8), need for an umbilical catheter (OR, 2.4; 95% CI, 1.3-4.4), days on mechanical ventilation (OR, 1.1; 95% CI, 1.0-1.2), oxygen therapy (OR, 1.1; 95% CI, 1.0-1.1), parenteral nutrition (OR, 1.1; 95% CI, 1.0-1.1), and birth center (p < 0.001). CONCLUSIONS: The need for RBC transfusions in very-low-birth-weight preterm infants was associated with clinical conditions and with the birth center. The distribution of the number of transfusions during the hospital stay may be used as a measure of neonatal care quality.
Abstract:
Objective: Alterations in selenium (Se) status may result in suboptimal amounts of selenoproteins, which have been associated with increased oxidative stress levels. The Pro198Leu polymorphism at the glutathione peroxidase-1 (GPx1) gene is supposed to be functional. The response of Se status, GPx activity, and levels of DNA damage to a Se supplementation trial between the genotypes related to that polymorphism was investigated. Methods: A randomized trial was conducted with 37 morbidly obese women. Participants consumed one Brazil nut, which provided approximately 290 μg of Se a day, for 8 wk. Blood Se concentrations, erythrocyte GPx activity, and DNA damage levels were measured at baseline and at 8 wk. The results were compared by genotypes. Results: The genotype frequencies were 0.487, 0.378, and 0.135 for Pro/Pro (the wild-type genotype), Pro/Leu, and Leu/Leu, respectively. At baseline, 100% of the subjects were Se deficient, and after the supplementation, there was an improvement in plasma Se (P < 0.001 for Pro/Pro and Pro/Leu, P < 0.05 for Leu/Leu), erythrocyte Se (P = 0.00 for Pro/Pro and Pro/Leu, P < 0.05 for Leu/Leu), and GPx activity (P = 0.00 for Pro/Pro, P < 0.00001 for Pro/Leu, P < 0.001 for Leu/Leu). In addition, the Pro/Pro group showed a decrease in DNA damage after Brazil nut consumption compared with baseline (P < 0.005), and those levels were higher in Leu/Leu subjects compared with those with the wild-type genotype (P < 0.05). Conclusion: Consumption of one unit of Brazil nuts daily effectively increases Se status and increases GPx activity in obese women, regardless of GPx1 Pro198Leu polymorphism. However, the evaluated biomarkers showed distinct results in response to the supplementation when the polymorphism was considered. (c) 2011 Elsevier Inc. All rights reserved.
Abstract:
Autologous hematopoietic stem cell transplantation (HSCT) has proved effective for treating hematological malignancies. However, some patients fail to mobilize HSCs. It is known that the microenvironment may undergo damage after allogeneic HSCT, yet little is known about how chemotherapy and growth factors contribute to this damage. We studied stromal layer formation (SLF) and its velocity before and after HSC mobilization through long-term bone marrow cultures from 22 patients and 10 healthy donors. Patients' SLF was similar at pre-mobilization (12/22) and post-mobilization (9/20), whereas in controls it occurred more often at pre-mobilization (9/10; p=0.03). SLF velocity was higher at pre-mobilization than at post-mobilization in both groups. Leukemias and multiple myeloma showed faster growth of SLF than lymphomas at post-mobilization, the latter being similar to controls. These findings could be explained by fewer uncommitted HSCs in controls than in patients at post-mobilization. Control HSCs may migrate more in response to mobilization, resulting in a reduced population of those cells.
Abstract:
To discuss and share knowledge around advances in the care of patients with thrombotic disorders, the Third International Symposium of Thrombosis and Anticoagulation was held in São Paulo, Brazil, from October 14-16, 2010. This scientific program was developed by clinicians for clinicians, and was promoted by four major clinical research institutes: the Brazilian Clinical Research Institute, the Duke Clinical Research Institute of the Duke University School of Medicine, the Canadian VIGOUR Centre, and the Uppsala Clinical Research Center. Comprising 3 days of academic presentations and open discussion, the symposium had as its primary goal to educate, motivate, and inspire internists, cardiologists, hematologists, and other physicians by convening national and international visionaries, thought-leaders, and dedicated clinician-scientists. This paper summarizes the symposium proceedings.
Abstract:
Purpose: The aim of this study is to evaluate the relationship between timing of renal replacement therapy (RRT) in severe acute kidney injury and clinical outcomes. Methods: This was a prospective multicenter observational study conducted at 54 intensive care units (ICUs) in 23 countries enrolling 1238 patients. Results: Timing of RRT was stratified into "early" and "late" by median urea and creatinine at the time RRT was started. Timing was also categorized temporally from ICU admission into early (<2 days), delayed (2-5 days), and late (>5 days). Renal replacement therapy timing by serum urea showed no significant difference in crude (63.4% for urea <= 24.2 mmol/L vs 61.4% for urea >24.2 mmol/L; odds ratio [OR], 0.92; 95% confidence interval [CI], 0.73-1.15; P = .48) or covariate-adjusted mortality (OR, 1.25; 95% CI, 0.91-1.70; P = .16). When stratified by creatinine, late RRT was associated with lower crude (53.4% for creatinine >309 μmol/L vs 71.4% for creatinine <= 309 μmol/L; OR, 0.46; 95% CI, 0.36-0.58; P < .0001) and covariate-adjusted mortality (OR, 0.51; 95% CI, 0.37-0.69; P < .001). However, for timing relative to ICU admission, late RRT was associated with greater crude (72.8% vs 62.3% vs 59%, P < .001) and covariate-adjusted mortality (OR, 1.95; 95% CI, 1.30-2.92; P = .001). Overall, late RRT was associated with a longer duration of RRT and stay in hospital and greater dialysis dependence. Conclusion: Timing of RRT, a potentially modifiable factor, might exert an important influence on patient survival. However, this largely depended on its definition. Late RRT (days from admission) was associated with a longer duration of RRT, longer hospital stay, and higher dialysis dependence. (C) 2009 Elsevier Inc. All rights reserved.
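The two timing definitions compared above can be expressed mechanically: a split at the cohort median biomarker value when RRT was started, and a temporal split by days from ICU admission. The short Python sketch below is illustrative only; the data frame and column names are hypothetical, with only the thresholds taken from the abstract.

```python
import pandas as pd


def classify_timing(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Biomarker-based definition: "early" if urea at RRT start is at or below
    # the cohort median (24.2 mmol/L in this cohort), "late" otherwise.
    out["timing_urea"] = (out["urea_at_rrt"] > out["urea_at_rrt"].median()).map(
        {False: "early", True: "late"})
    # Temporal definition relative to ICU admission: <2, 2-5, and >5 days.
    out["timing_icu"] = pd.cut(out["days_from_icu_admission"],
                               bins=[-float("inf"), 2, 5, float("inf")],
                               labels=["early (<2 d)", "delayed (2-5 d)", "late (>5 d)"],
                               right=False)
    return out
```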
Abstract:
Background-Randomized trials that studied clinical outcomes after percutaneous coronary intervention (PCI) with bare metal stenting versus coronary artery bypass grafting (CABG) are underpowered to properly assess safety end points such as death, stroke, and myocardial infarction. Pooling data from randomized controlled trials increases the statistical power and allows better assessment of the treatment effect in high-risk subgroups. Methods and Results-We performed a pooled analysis of 3051 patients in 4 randomized trials evaluating the relative safety and efficacy of PCI with stenting and CABG at 5 years for the treatment of multivessel coronary artery disease. The primary end point was the composite end point of death, stroke, or myocardial infarction. The secondary end point was the occurrence of major adverse cardiac and cerebrovascular events: death, stroke, myocardial infarction, and repeat revascularization. We tested for heterogeneity of treatment effect in patient subgroups. At 5 years, the cumulative incidence of death, myocardial infarction, and stroke was similar in patients randomized to PCI with stenting versus CABG (16.7% versus 16.9%, respectively; hazard ratio, 1.04; 95% confidence interval, 0.86 to 1.27; P = 0.69). Repeat revascularization, however, occurred significantly more frequently after PCI than CABG (29.0% versus 7.9%, respectively; hazard ratio, 0.23; 95% confidence interval, 0.18 to 0.29; P < 0.001). Major adverse cardiac and cerebrovascular events were significantly higher in the PCI than the CABG group (39.2% versus 23.0%, respectively; hazard ratio, 0.53; 95% confidence interval, 0.45 to 0.61; P < 0.001). No heterogeneity of treatment effect was found in the subgroups, including diabetic patients and those presenting with 3-vessel disease. Conclusions-In this pooled analysis of 4 randomized trials, PCI with stenting was associated with a long-term safety profile similar to that of CABG. However, as a result of persistently lower repeat revascularization rates in the CABG patients, overall major adverse cardiac and cerebrovascular event rates were significantly lower in the CABG group at 5 years.
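The subgroup heterogeneity testing mentioned above is commonly implemented by adding a treatment-by-subgroup interaction term to the survival model fitted on pooled patient-level data. The sketch below is illustrative only; the data frame and its columns (time, event, pci, diabetes) are hypothetical, not the four-trial dataset, and the lifelines package is assumed to be available.

```python
import pandas as pd
from lifelines import CoxPHFitter


def interaction_p_value(pooled: pd.DataFrame) -> float:
    """Test heterogeneity of the PCI-vs-CABG effect across a binary subgroup."""
    df = pooled.copy()
    df["pci_x_diabetes"] = df["pci"] * df["diabetes"]
    cph = CoxPHFitter()
    cph.fit(df[["time", "event", "pci", "diabetes", "pci_x_diabetes"]],
            duration_col="time", event_col="event")
    # The p-value of the interaction term tests whether the treatment effect
    # differs between diabetic and non-diabetic patients.
    return float(cph.summary.loc["pci_x_diabetes", "p"])
```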
Abstract:
Acute kidney injury (AKI) is now well recognized as an independent risk factor for increased morbidity and mortality particularly when dialysis is needed. Although renal replacement therapy (RRT) has been used in AKI for more than five decades, there is no standard methodology to predict which AKI patients will need dialysis and who will recover renal function without requiring dialysis. The lack of consensus on what parameters should guide the decision to start dialysis has led to a wide variation in dialysis utilization. A contributing factor is the lack of studies in the modern era evaluating the relationship of timing of dialysis initiation and outcomes. Although listed as one of the top priorities in research on AKI, timing of dialysis initiation has not been included as a factor in large, randomized controlled trials in this area. In this review we will discuss the criteria that have been used to define early vs. late initiation in previous studies on dialysis initiation. In addition, we propose a patient-centered approach to define early and late initiation that could serve as framework for managing patients and for future studies in this area.
Abstract:
Background and purpose: To evaluate biochemical control and treatment-related toxicity in patients with localized adenocarcinoma of the prostate treated with high-dose-rate brachytherapy (HDRB) combined with conventional 2D or 3D-conformal external beam irradiation (EBI). Material and methods: Four hundred and three patients were treated between December 2000 and March 2004. HDRB was delivered with three fractions of 5.5-7 Gy with a single implant, followed by 45 Gy delivered with 2D or 3D-conformal EBI. Results: The median follow-up was 48.4 months. Biochemical failure (BF) occurred in 9.6% according to both the ASTRO and Phoenix consensus criteria. Mean time to relapse was 13 and 26 months, respectively. The 5-year BF-free survival using the ASTRO criteria was 94.3%, 86.9% and 86.6% for the low-, intermediate- and high-risk groups, respectively; using the Phoenix criteria, 92.4%, 88.0% and 85.3%, respectively. The only predictive factor for BF in the multivariate analysis by both the ASTRO and Phoenix criteria was the presence of prostate nodules detected by digital palpation, and patients younger than 60 years presented a higher chance of failure using the Phoenix criteria only. Conclusions: The treatment scheme is feasible and safe, with good efficacy. (C) 2011 Elsevier Ireland Ltd. All rights reserved. Radiotherapy and Oncology 98 (2011) 169-174.