64 results for Prospective Randomized Trial
Abstract:
Objective: Early treatment in sepsis may improve outcome. The aim of this study was to evaluate how the delay in starting resuscitation influences the severity of sepsis and the treatment needed to achieve hemodynamic stability. Design: Prospective, randomized, controlled experimental study. Setting: Experimental laboratory in a university hospital. Subjects: Thirty-two anesthetized and mechanically ventilated pigs. Interventions: Pigs were randomly assigned (n = 8 per group) to a nonseptic control group or to one of three groups in which fecal peritonitis (peritoneal instillation of 2 g/kg autologous feces) was induced and a 48-hr period of protocolized resuscitation was started 6 (ΔT-6 hrs), 12 (ΔT-12 hrs), or 24 (ΔT-24 hrs) hrs later. We evaluated the impact of these delays on disease severity, need for resuscitation, and the development of sepsis-associated organ and mitochondrial dysfunction. Measurements and Main Results: Any delay in starting resuscitation was associated with progressive signs of hypovolemia and increased plasma levels of interleukin-6 and tumor necrosis factor-alpha prior to resuscitation. Delaying resuscitation increased cumulative net fluid balances (2.1 +/- 0.5 mL/kg/hr, 2.8 +/- 0.7 mL/kg/hr, and 3.2 +/- 1.5 mL/kg/hr, respectively, for groups ΔT-6 hrs, ΔT-12 hrs, and ΔT-24 hrs; p < .01) and norepinephrine requirements during the 48-hr resuscitation protocol (0.02 +/- 0.04 μg/kg/min, 0.06 +/- 0.09 μg/kg/min, and 0.13 +/- 0.15 μg/kg/min; p = .059), decreased maximal brain mitochondrial complex II respiration (p = .048), and tended to increase mortality (p = .08). Muscle tissue adenosine triphosphate decreased in all groups (p < .01), with the lowest values at the end in groups ΔT-12 hrs and ΔT-24 hrs. Conclusions: Increasing the delay between sepsis initiation and resuscitation increases disease severity, need for resuscitation, and sepsis-associated brain mitochondrial dysfunction.
Our results support the concept of a critical window of opportunity in sepsis resuscitation. (Crit Care Med 2012; 40:2841-2849)
Abstract:
Background and objectives: Longitudinal, prospective, randomized, blinded trial to assess the influence of the pleural drain (non-toxic PVC) insertion site on lung function and postoperative pain of patients undergoing coronary artery bypass grafting during the first three days post-surgery and immediately after chest tube removal. Method: Thirty-six patients scheduled for elective myocardial revascularization with cardiopulmonary bypass (CPB) were randomly allocated into two groups: SX group (subxiphoid drain) and IC group (intercostal drain). Spirometry, arterial blood gases, and pain scores were recorded. Results: Thirty-one patients were analyzed, 16 in the SX group and 15 in the IC group. Postoperative (PO) spirometric values were higher in the SX group than in the IC group (p < 0.05), indicating less influence of pleural drain location on breathing. PaO2 on the second PO day increased significantly in the SX group compared with the IC group (p < 0.0188). The intensity of pain before and after spirometry was lower in the SX group than in the IC group (p < 0.005). Spirometric values increased significantly in both groups after chest tube removal. Conclusion: A drain inserted in the subxiphoid region causes less change in lung function and less discomfort, allowing better recovery of respiratory parameters.
Abstract:
Background: The importance of complete revascularization remains unclear and contradictory. This investigation compares the effect of complete revascularization on 10-year survival of patients with stable multivessel coronary artery disease (CAD) who were randomly assigned to percutaneous coronary intervention (PCI) or coronary artery bypass grafting (CABG). Methods and Results: This is a post hoc analysis of the Second Medicine, Angioplasty, or Surgery Study (MASS II), a randomized trial comparing treatments in patients with stable multivessel CAD and preserved systolic ventricular function. We analyzed patients who underwent surgery (CABG) or stent angioplasty (PCI). Survival free of overall mortality was compared between patients who underwent complete (CR) or incomplete revascularization (IR). Of the 408 patients randomly assigned to mechanical revascularization, 390 (95.6%) underwent the assigned treatment; complete revascularization was achieved in 224 patients (57.4%), 63.8% of those in the CABG group and 36.2% in the PCI group (P = 0.001). The IR group had more prior myocardial infarction than the CR group (56.2% vs. 39.2%, P = 0.01). During a 10-year follow-up, survival free of cardiovascular mortality was significantly different between the 2 groups (CR, 90.6% versus IR, 84.4%; P = 0.04). This was mainly driven by increased cardiovascular-specific mortality in individuals with incomplete revascularization submitted to PCI (P = 0.05). Conclusions: Our study suggests that, over a 10-year follow-up, CR was associated with reduced cardiovascular mortality compared with IR, mainly because of higher cardiovascular-specific mortality among individuals with incomplete revascularization submitted to PCI.
Abstract:
OBJECTIVE: This study sought to compare the effects and outcomes of two ophthalmic viscosurgical devices, 1.6% hyaluronic acid/4.0% chondroitin sulfate and 2.0% hydroxypropylmethylcellulose, during phacoemulsification. METHODS: This prospective, randomized clinical trial comprised 78 eyes (39 patients) that underwent phacoemulsification performed by the same surgeon using a standardized technique. Patients were randomly assigned to receive either 1.6% hyaluronic acid/4.0% chondroitin sulfate or 2.0% hydroxypropylmethylcellulose in the first eye. The other eye was treated later and received the other viscoelastic agent. Preoperative and postoperative examinations (5, 24 and 48 hours; 7 and 14 days; 3 and 6 months) included measurements of the total volume of the ophthalmic viscosurgical device, ultrasound and washout times to completely remove the ophthalmic viscosurgical device, intraocular pressure, central corneal thickness and best-corrected visual acuity. The corneal endothelial cell count was measured at baseline and at six months postoperatively. ClinicalTrials.gov: NCT01387620. RESULTS: There were no statistically significant differences between groups in terms of cataract density or ultrasound time. However, it took longer to remove 2.0% hydroxypropylmethylcellulose than 1.6% hyaluronic acid/4.0% chondroitin sulfate, and the amount of viscoelastic material used was greater in the 2.0% hydroxypropylmethylcellulose group. In addition, the best-corrected visual acuity was significantly better in the hyaluronic acid/chondroitin sulfate group, but this preferable outcome was only observed at 24 hours after the operation. There were no statistically significant differences between the two ophthalmic viscosurgical devices regarding the central corneal thickness or intraocular pressure measurements at any point in time. The corneal endothelial cell count was significantly higher in the hyaluronic acid/chondroitin sulfate group.
CONCLUSION: The ophthalmic viscosurgical device consisting of 1.6% hyaluronic acid/4.0% chondroitin sulfate was more efficient during phacoemulsification and was easier to remove after intraocular lens (IOL) implantation than 2.0% hydroxypropylmethylcellulose. In addition, the corneal endothelial cell count was significantly higher following the use of hyaluronic acid/chondroitin sulfate than with hydroxypropylmethylcellulose, indicating better protection of the corneal endothelium.
Abstract:
The results of several studies assessing dialysis dose have dampened the enthusiasm of clinicians for considering dialysis dose as a modifiable factor influencing outcomes in patients with acute kidney injury. Powerful evidence from two large, multicenter trials indicates that increasing the dialysis dose, measured as hourly effluent volume, has no benefit in continuous renal replacement therapy (CRRT). However, some important operational characteristics that affect delivered dose were not evaluated. Effluent volume does not correspond to the actual delivered dose, as a decline in filter efficacy reduces solute removal during therapy. We believe that providing accurate parameters of delivered dose could improve the delivery of a prescribed dose and refine the assessment of the effect of dose on outcomes in critically ill patients treated with CRRT.
Abstract:
OBJECTIVES: This prospective, randomized, experimental study aimed to investigate the influence of general training strategies on the motor recovery of Wistar rats with moderate contusive spinal cord injury. METHODS: A total of 51 Wistar rats were randomized into five groups: control, maze, ramp, runway, and sham (laminectomy only). The rats underwent spinal cord injury at the T9-T10 levels using the NYU Impactor. Each group except the control group was trained for 12 minutes twice a week for two weeks before and five weeks after the spinal cord injury. Functional motor recovery was assessed with the Basso, Beattie, and Bresnahan scale on the first postoperative day and then once a week for five weeks. The animals were euthanized, and the spinal cords were collected for histological analysis. RESULTS: The ramp and maze groups showed earlier and greater functional improvement than the control and runway groups. Unexpectedly, however, all groups eventually reached outcomes similar to those of the control group, which recovered spontaneously. There were no histological differences in the injured area between the trained and control groups. CONCLUSION: Short-term benefits can be associated with a specific training regime; however, the same training was ineffective at maintaining superior long-term recovery. These results might support new considerations before hospital discharge of patients with spinal cord injuries.
Abstract:
BACKGROUND Vorapaxar is a new oral protease-activated receptor 1 (PAR-1) antagonist that inhibits thrombin-induced platelet activation. METHODS In this multinational, double-blind, randomized trial, we compared vorapaxar with placebo in 12,944 patients who had acute coronary syndromes without ST-segment elevation. The primary end point was a composite of death from cardiovascular causes, myocardial infarction, stroke, recurrent ischemia with rehospitalization, or urgent coronary revascularization. RESULTS Follow-up in the trial was terminated early after a safety review. After a median follow-up of 502 days (interquartile range, 349 to 667), the primary end point occurred in 1031 of 6473 patients receiving vorapaxar versus 1102 of 6471 patients receiving placebo (Kaplan-Meier 2-year rate, 18.5% vs. 19.9%; hazard ratio, 0.92; 95% confidence interval [CI], 0.85 to 1.01; P=0.07). A composite of death from cardiovascular causes, myocardial infarction, or stroke occurred in 822 patients in the vorapaxar group versus 910 in the placebo group (14.7% and 16.4%, respectively; hazard ratio, 0.89; 95% CI, 0.81 to 0.98; P=0.02). Rates of moderate and severe bleeding were 7.2% in the vorapaxar group and 5.2% in the placebo group (hazard ratio, 1.35; 95% CI, 1.16 to 1.58; P<0.001). Intracranial hemorrhage rates were 1.1% and 0.2%, respectively (hazard ratio, 3.39; 95% CI, 1.78 to 6.45; P<0.001). Rates of nonhemorrhagic adverse events were similar in the two groups. CONCLUSIONS In patients with acute coronary syndromes, the addition of vorapaxar to standard therapy did not significantly reduce the primary composite end point but significantly increased the risk of major bleeding, including intracranial hemorrhage. (Funded by Merck; TRACER ClinicalTrials.gov number, NCT00527943.)
Abstract:
Background The effect of intensified platelet inhibition for patients with unstable angina or myocardial infarction without ST-segment elevation who do not undergo revascularization has not been delineated. Methods In this double-blind, randomized trial, in a primary analysis involving 7243 patients under the age of 75 years receiving aspirin, we evaluated up to 30 months of treatment with prasugrel (10 mg daily) versus clopidogrel (75 mg daily). In a secondary analysis involving 2083 patients 75 years of age or older, we evaluated 5 mg of prasugrel versus 75 mg of clopidogrel. Results At a median follow-up of 17 months, the primary end point of death from cardiovascular causes, myocardial infarction, or stroke among patients under the age of 75 years occurred in 13.9% of the prasugrel group and 16.0% of the clopidogrel group (hazard ratio in the prasugrel group, 0.91; 95% confidence interval [CI], 0.79 to 1.05; P = 0.21). Similar results were observed in the overall population. The prespecified analysis of multiple recurrent ischemic events (all components of the primary end point) suggested a lower risk for prasugrel among patients under the age of 75 years (hazard ratio, 0.85; 95% CI, 0.72 to 1.00; P = 0.04). Rates of severe and intracranial bleeding were similar in the two groups in all age groups. There was no significant between-group difference in the frequency of nonhemorrhagic serious adverse events, except for a higher frequency of heart failure in the clopidogrel group. Conclusions Among patients with unstable angina or myocardial infarction without ST-segment elevation, prasugrel did not significantly reduce the frequency of the primary end point, as compared with clopidogrel, and similar risks of bleeding were observed. (Funded by Eli Lilly and Daiichi Sankyo; TRILOGY ACS ClinicalTrials.gov number, NCT00699998.)
Abstract:
Across the Americas and the Caribbean, nearly 561,000 slide-confirmed malaria infections were reported officially in 2008. The nine Amazonian countries accounted for 89% of these infections; Brazil and Peru alone contributed 56% and 7% of them, respectively. Local populations of the relatively neglected parasite Plasmodium vivax, which currently accounts for 77% of the regional malaria burden, are extremely diverse genetically and geographically structured. At a time when malaria elimination is placed on the public health agenda of several endemic countries, it remains unclear why malaria proved so difficult to control in areas of relatively low levels of transmission such as the Amazon Basin. We hypothesize that asymptomatic parasite carriage and massive environmental changes that affect vector abundance and behavior are major contributors to malaria transmission in epidemiologically diverse areas across the Amazon Basin. Here we review available data supporting this hypothesis and discuss their implications for current and future malaria intervention policies in the region. Given that locally generated scientific evidence is urgently required to support malaria control interventions in Amazonia, we briefly describe the aims of our current field-oriented malaria research in rural villages and gold-mining enclaves in Peru and a recently opened agricultural settlement in Brazil.
The Applicability of Auriculotherapy with Needles or Seeds to Reduce Stress in Nursing Professionals
Abstract:
This randomized clinical trial was performed to evaluate the stress levels of the nursing staff of a hospital and to analyze the effectiveness of auriculotherapy with needles and seeds. The 75 participants, with medium (44/58.7%) and high (31/41.3%) scores on the Stress Symptoms List, were divided into three groups (control, needles, and seeds); the intervention groups received eight sessions on the Shenmen, Kidney, and Brainstem points and were evaluated at baseline, at the fourth and eighth sessions, and at a 15-day follow-up session. Analysis of variance (ANOVA) showed significant differences among the groups at the third assessment (F = 3.963/P = 0.023) and at follow-up (F = 6.136/P = 0.003); these differences occurred between the control and needle groups. The seeds and needles groups both showed within-group differences (p < 0.05) at the second assessment. In conclusion, auriculotherapy reduced stress in the nursing staff, with needles showing better results than seeds for high scores and maintaining the effects for 15 days.
Abstract:
OBJECTIVES: Although elderly persons with chronic atrial fibrillation have more comorbidities that could limit indications for chronic anticoagulant use, few studies have focused on the risk of falls within this particular group. To evaluate the predictors of the risk of falls among elderly patients with chronic atrial fibrillation, a cross-sectional, observational study was performed. METHODS: From 295 consecutive patients aged 60 years or older with a history of atrial fibrillation who were enrolled within the last 2 years in the cardiogeriatrics outpatient clinic of the Instituto do Coracao do Hospital das Clinicas da Faculdade de Medicina da Universidade de Sao Paulo, 107 took part in this study. Their mean age was 77.9 +/- 6.4 years, and 62 were women. They were divided into two groups: a) no history of falls in the previous year and b) a history of one or more falls in the previous year. Data regarding the history of falls and social, demographic, anthropometric, and clinical information were collected. Multidimensional assessment instruments and questionnaires were applied. RESULTS: At least one fall was reported in 55 patients (51.4%). Among them, 27 (49.1%) presented recurrent falls, with body lesions in 90.4% and fractures in 9.1% of the cases. Multivariate logistic regression showed that self-reported difficulty maintaining balance, use of amiodarone, and diabetes were independent variables associated with the risk of falls, with a sensitivity of 92.9% and a specificity of 44.9%. CONCLUSION: In a group of elderly patients with chronic atrial fibrillation who were relatively independent and able to attend an outpatient clinic, the occurrence of falls, with recurrence and clinical consequences, was high. Difficulty maintaining balance, the use of amiodarone, and a diagnosis of diabetes mellitus were independent predictors of the risk of falls. Thus, simple clinical data predicted falls better than objective functional tests.
Abstract:
To investigate the potential role of vitamin or mineral supplementation on the risk of head and neck cancer (HNC), we analyzed individual-level pooled data from 12 case-control studies (7,002 HNC cases and 8,383 controls) participating in the International Head and Neck Cancer Epidemiology consortium. There were a total of 2,028 oral cavity cancer, 2,465 pharyngeal cancer, 874 unspecified oral/pharynx cancer, 1,329 laryngeal cancer and 306 overlapping HNC cases. Odds ratios (OR) and 95% confidence intervals (CIs) for self-reported ever use of any vitamins, multivitamins, vitamin A, vitamin C, vitamin E, and calcium, beta-carotene, iron, selenium and zinc supplements were assessed. We further examined frequency, duration and cumulative exposure of each vitamin or mineral when possible and stratified by smoking and drinking status. All ORs were adjusted for age, sex, race/ethnicity, study center, education level, pack-years of smoking, frequency of alcohol drinking and fruit/vegetable intake. A decreased risk of HNC was observed with ever use of vitamin C (OR = 0.76, 95% CI = 0.59-0.96) and with ever use of calcium supplements (OR = 0.64, 95% CI = 0.42-0.97). The inverse association with HNC risk was also observed for 10 or more years of vitamin C use (OR = 0.72, 95% CI = 0.54-0.97) and more than 365 tablets of cumulative calcium intake (OR = 0.36, 95% CI = 0.16-0.83), but linear trends were not observed for the frequency or duration of any supplement intake. We did not observe any strong associations between vitamin or mineral supplement intake and the risk of HNC.
Abstract:
Purpose: There is no consensus on the optimal method to measure delivered dialysis dose in patients with acute kidney injury (AKI). The use of direct dialysate-side quantification of dose, in preference to formal blood-based urea kinetic modeling and simplified blood urea nitrogen (BUN) methods, has been recommended for dose assessment in critically ill patients with AKI. We evaluated six different blood-side and dialysate-side methods for dose quantification. Methods: We examined data from 52 critically ill patients with AKI requiring dialysis. All patients were treated with pre-dilution CVVHDF and regional citrate anticoagulation. Delivered dose was calculated using blood-side and dialysate-side kinetics. Filter function was assessed during the entire course of therapy by calculating the ratio of dialysis fluid urea nitrogen (FUN) to BUN every 12 hours. Results: Median daily treatment time was 1,413 min (1,260-1,440). The median observed effluent volume per treatment was 2,355 mL/h (2,060-2,863) (p < 0.001). Urea mass removal rate was 13.0 +/- 7.6 mg/min. Both EKR (r² = 0.250; p < 0.001) and K-D (r² = 0.409; p < 0.001) correlated well with actual solute removal. EKR and K-D declined over time, in parallel with the decrease in filter function assessed by the FUN/BUN ratio. Conclusions: Effluent rate (mL/kg/h) can provide only an empirical estimate of delivered dose in CRRT. For clinical practice, we recommend that the delivered dose be measured and expressed as K-D. EKR also constitutes a good method for dose comparisons over time and across modalities.
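The dialysate-side logic described in this abstract can be illustrated with a minimal sketch. It assumes the standard simplified relations (dialysate-side clearance as effluent urea mass removal divided by blood concentration, and the FUN/BUN ratio as a marker of filter efficacy); the function names and the sample values are hypothetical and not taken from the study.

```python
def dialysate_side_clearance(fun_mg_dl, effluent_l_per_h, bun_mg_dl):
    """Dialysate-side urea clearance K_D (L/h).

    Urea nitrogen mass leaving in the effluent per hour, divided by the
    blood-side concentration driving its removal. The concentration units
    cancel, so K_D carries the units of the effluent flow.
    """
    return fun_mg_dl * effluent_l_per_h / bun_mg_dl

def filter_efficacy(fun_mg_dl, bun_mg_dl):
    """FUN/BUN ratio: close to 1.0 for a fresh filter, falling as the
    membrane loses efficacy (e.g., from clotting)."""
    return fun_mg_dl / bun_mg_dl

# Hypothetical 12-hourly effluent samples over one filter's life,
# at a constant BUN of 50 mg/dL and effluent rate of 2.4 L/h:
bun = 50.0
for fun in [48.0, 45.0, 38.0, 30.0]:  # FUN in mg/dL
    kd = dialysate_side_clearance(fun, 2.4, bun)
    print(f"FUN/BUN = {filter_efficacy(fun, bun):.2f}, K_D = {kd:.2f} L/h")
```

The sketch shows why effluent volume alone overestimates delivered dose: as FUN/BUN drops, the same effluent rate removes less urea, so K_D declines even though the prescribed effluent rate is unchanged.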
Abstract:
OBJECTIVE: The significance of pretransplant donor-specific antibodies for long-term patient outcomes is a subject of debate. This study evaluated the impact of the presence or absence of donor-specific antibodies after kidney transplantation on short- and long-term graft outcomes. METHODS: We analyzed the frequency and dynamics of pretransplant donor-specific antibodies following renal transplantation in a randomized trial conducted from 2002 to 2004 and correlated these findings with patient outcomes through 2009. Transplants were performed against a complement-dependent T- and B-cell-negative crossmatch. Pre- and posttransplant sera were available from 94 of the 118 patients (80%). Antibodies were detected using a solid-phase, single-bead assay (Luminex®), and all tests were performed simultaneously. RESULTS: Sixteen patients exhibited pretransplant donor-specific antibodies, but only 3 of them (19%) developed antibody-mediated rejection, and 2 of these experienced early graft losses. Excluding these 2 losses, 6 of 14 patients exhibited donor-specific antibodies at the final follow-up exam, whereas 8 (57%) showed complete clearance of the donor-specific antibodies. Five other patients developed de novo posttransplant donor-specific antibodies. Death-censored graft survival was similar in patients with pretransplant donor-specific and non-donor-specific antibodies after a mean follow-up period of 70 months. CONCLUSION: Pretransplant donor-specific antibodies with a negative complement-dependent cytotoxicity crossmatch are associated with a risk of antibody-mediated rejection, although survival rates are similar once patients pass the first months after receiving the graft. Our data also suggest that early posttransplant monitoring of donor-specific antibodies should increase knowledge of antibody dynamics and their impact on long-term graft outcome.
Abstract:
BACKGROUND The safety and efficacy of adding antiretroviral drugs to standard zidovudine prophylaxis in infants of mothers with human immunodeficiency virus (HIV) infection who did not receive antenatal antiretroviral therapy (ART) because of late identification are unclear. We evaluated three ART regimens in such infants. METHODS Within 48 hours after their birth, we randomly assigned formula-fed infants born to women with a peripartum diagnosis of HIV type 1 (HIV-1) infection to one of three regimens: zidovudine for 6 weeks (zidovudine-alone group), zidovudine for 6 weeks plus three doses of nevirapine during the first 8 days of life (two-drug group), or zidovudine for 6 weeks plus nelfinavir and lamivudine for 2 weeks (three-drug group). The primary outcome was HIV-1 infection at 3 months in infants uninfected at birth. RESULTS A total of 1684 infants were enrolled in the Americas and South Africa (566 in the zidovudine-alone group, 562 in the two-drug group, and 556 in the three-drug group). The overall rate of in utero transmission of HIV-1 on the basis of Kaplan-Meier estimates was 5.7% (93 infants), with no significant differences among the groups. Intrapartum transmission occurred in 24 infants in the zidovudine-alone group (4.8%; 95% confidence interval [CI], 3.2 to 7.1), as compared with 11 infants in the two-drug group (2.2%; 95% CI, 1.2 to 3.9; P=0.046) and 12 in the three-drug group (2.4%; 95% CI, 1.4 to 4.3; P=0.046). The overall transmission rate was 8.5% (140 infants), with an increased rate in the zidovudine-alone group (P=0.03 for the comparisons with the two- and three-drug groups). On multivariate analysis, zidovudine monotherapy, a higher maternal viral load, and maternal use of illegal substances were significantly associated with transmission. The rate of neutropenia was significantly increased in the three-drug group (P < 0.001 for both comparisons with the other groups).
CONCLUSIONS In neonates whose mothers did not receive ART during pregnancy, prophylaxis with a two- or three-drug ART regimen is superior to zidovudine alone for the prevention of intrapartum HIV transmission; the two-drug regimen has less toxicity than the three-drug regimen. (Funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development [NICHD] and others; ClinicalTrials.gov number, NCT00099359.)