83 results for Hospital units


Relevance: 20.00%

Abstract:

CTX-M-encoding genes and their genetic environment were characterized in Klebsiella spp. strains isolated in 2000 and 2006. CTX-M-2 variants were predominant in Klebsiella pneumoniae strains, which showed greater variability in bla(CTX-M) genes, integrons, and plasmids in 2006 than in strains collected in 2000. CTX-M-9-producing Klebsiella oxytoca identified in 2000 represented clonal dissemination. (C) 2010 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Abstract:

Objective: To describe an outbreak of imipenem-resistant, metallo-beta-lactamase-producing Pseudomonas aeruginosa (enzyme type bla SPM-1) spread by horizontal transmission among patients admitted to a mixed adult ICU. Methods: A case-control study was carried out, including 47 patients (cases) and 122 patients (controls) admitted to the mixed ICU of a university hospital in Minas Gerais, Brazil, from November 2003 to July 2005. The infection site, risk factors, mortality, antibiotic susceptibility, metallo-beta-lactamase (MBL) production, enzyme type, and clonal diversity were analyzed. Results: A temporal/spatial relationship was detected in most patients (94%), overall mortality was 55.3%, and pneumonia was the predominant infection (85%). The majority of isolates (95%) were resistant to imipenem and other antibiotics, except for polymyxin; MBL production was detected in 76.7%. Only bla SPM-1 (33%) was identified in the 15 specimens analyzed. In addition, 4 clones were identified, with a predominance of clones A (61.5%) and B (23.1%). On multivariate analysis, advanced age, mechanical ventilation, tracheostomy, and previous imipenem use were significant risk factors for imipenem-resistant P. aeruginosa infection. Conclusions: Clonal dissemination of MBL-producing P. aeruginosa strains with a spatial/temporal relationship disclosed problems in hospital infection control practice, low adherence to hand hygiene, and empirical antibiotic use. (C) 2008 Elsevier Espana, S.L. All rights reserved.

Relevance: 20.00%

Abstract:

Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach used to discontinue continuous renal replacement therapy may affect patient outcomes, yet there is a lack of information on how and under what conditions it is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days; they were classified as the "success" group, and the remaining 216 patients were classified as the "repeat-RRT" (renal replacement therapy) group. Patients in the "success" group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) than patients in the "repeat-RRT" group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per µmol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve for predicting successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine. The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of stopping continuous renal replacement therapy was the most important predictor of successful discontinuation, especially when it occurred without the administration of diuretics. (Crit Care Med 2009; 37:2576-2582)
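
As a rough illustration of the kind of analysis this abstract describes (not the study's own code), the sketch below shows how the discriminative ability of a single continuous predictor such as urine output, and its odds ratio per 100 mL/day, could be computed. All data, effect sizes, and variable names here are synthetic and purely illustrative.

# Minimal sketch, assuming synthetic data: AUC and odds ratio for a single
# continuous predictor (urine output) of successful CRRT discontinuation.
import numpy as np
from sklearn.metrics import roc_auc_score
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 529
success = rng.integers(0, 2, size=n)            # 1 = no further RRT for >= 7 days (synthetic)
# Hypothetical urine output (mL/day) in the 24 h before stopping CRRT,
# higher on average in the "success" group, as reported in the abstract.
urine_output = np.where(success == 1,
                        rng.normal(900, 400, n),
                        rng.normal(400, 300, n)).clip(min=0)

auc = roc_auc_score(success, urine_output)
print(f"AUC for urine output as a predictor of successful discontinuation: {auc:.3f}")

# Logistic regression with the predictor scaled per 100 mL/day, so that the
# exponentiated coefficient is an odds ratio per 100 mL/day increase.
X = sm.add_constant(urine_output / 100.0)
fit = sm.Logit(success, X).fit(disp=False)
print(f"odds ratio per 100 mL/day increase: {np.exp(fit.params[1]):.3f}")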

Relevance: 20.00%

Abstract:

Purpose: The aim of this study was to evaluate the relationship between the timing of renal replacement therapy (RRT) in severe acute kidney injury and clinical outcomes. Methods: This was a prospective multicenter observational study conducted at 54 intensive care units (ICUs) in 23 countries, enrolling 1238 patients. Results: Timing of RRT was stratified into "early" and "late" by median urea and creatinine at the time RRT was started. Timing was also categorized temporally from ICU admission into early (<2 days), delayed (2-5 days), and late (>5 days). RRT timing by serum urea showed no significant difference in crude (63.4% for urea ≤24.2 mmol/L vs 61.4% for urea >24.2 mmol/L; odds ratio [OR], 0.92; 95% confidence interval [CI], 0.73-1.15; P = .48) or covariate-adjusted mortality (OR, 1.25; 95% CI, 0.91-1.70; P = .16). When stratified by creatinine, late RRT was associated with lower crude (53.4% for creatinine >309 µmol/L vs 71.4% for creatinine ≤309 µmol/L; OR, 0.46; 95% CI, 0.36-0.58; P < .0001) and covariate-adjusted mortality (OR, 0.51; 95% CI, 0.37-0.69; P < .001). However, for timing relative to ICU admission, late RRT was associated with greater crude (72.8% vs 62.3% vs 59%, P < .001) and covariate-adjusted mortality (OR, 1.95; 95% CI, 1.30-2.92; P = .001). Overall, late RRT was associated with a longer duration of RRT, a longer hospital stay, and greater dialysis dependence. Conclusion: Timing of RRT, a potentially modifiable factor, might exert an important influence on patient survival; however, this largely depended on its definition. Late RRT (in days from admission) was associated with a longer duration of RRT, a longer hospital stay, and higher dialysis dependence. (C) 2009 Elsevier Inc. All rights reserved.
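
To make the "early versus late by cohort median" stratification concrete, the sketch below shows how such strata and a crude mortality odds ratio with a 95% confidence interval could be computed. It is not the study's analysis; the biomarker values and outcomes are synthetic.

# Minimal sketch, assuming synthetic data: stratify RRT timing as "early"/"late"
# by the cohort median of a biomarker and compute a crude mortality odds ratio.
import numpy as np

rng = np.random.default_rng(1)
n = 1238
urea = rng.gamma(shape=4.0, scale=6.0, size=n)      # hypothetical mmol/L values
died = rng.integers(0, 2, size=n).astype(bool)      # hypothetical hospital mortality

late = urea > np.median(urea)                       # "late" = above the cohort median

a = np.sum(late & died)      # late, died
b = np.sum(late & ~died)     # late, survived
c = np.sum(~late & died)     # early, died
d = np.sum(~late & ~died)    # early, survived

or_crude = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)          # Woolf standard error of log(OR)
ci = np.exp(np.log(or_crude) + np.array([-1.96, 1.96]) * se_log_or)
print(f"crude OR (late vs early): {or_crude:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")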

Relevance: 20.00%

Abstract:

Aims: There remains significant concern about the long-term safety of drug-eluting stents (DES), whereas bare metal stents (BMS) have been used safely for over two decades. There is therefore a pressing need to explore alternative strategies for reducing restenosis with BMS. This study was designed to examine whether IVUS-guided cutting balloon angioplasty (CBA) with BMS could achieve restenosis rates similar to DES. Methods and results: In the randomised REstenosis reDUction by Cutting balloon angioplasty Evaluation (REDUCE III) study, 521 patients were divided into four groups based on device and IVUS use before BMS implantation (IVUS-CBA-BMS: 137 patients; Angio-CBA-BMS: 123; IVUS-BA-BMS: 142; and Angio-BA-BMS: 119). At follow-up, the IVUS-CBA-BMS group had a significantly lower restenosis rate (6.6%) than the other groups (p=0.016). We performed a quantitative coronary angiography (QCA) based matched comparison between an IVUS-guided CBA-BMS strategy (REDUCE III) and a DES strategy (Rapamycin-Eluting Stent Evaluation At Rotterdam Cardiology Hospital, the RESEARCH study). We matched the presence of diabetes, vessel size, and lesion severity by QCA. Restenosis (>50% diameter stenosis at follow-up) and target vessel revascularisation (TVR) were examined. The QCA-matched comparison yielded 120 paired lesions. While acute gain was significantly greater with IVUS-CBA-BMS than with DES (1.65±0.41 mm vs. 1.28±0.57 mm, p=0.001), late loss was significantly less with DES than with IVUS-CBA-BMS (0.03±0.42 mm vs. 0.80±0.47 mm, p=0.001). However, no difference was found in restenosis rates (IVUS-CBA-BMS: 6.6% vs. DES: 5.0%, p=0.582) or TVR (6.6% and 6.6%, respectively). Conclusions: An IVUS-guided CBA-BMS strategy yielded restenosis rates similar to those achieved with DES and provided an effective alternative to the use of DES.

Relevance: 20.00%

Abstract:

Introduction: Mild head trauma (MHT) is defined as a transient neurological deficit after trauma, with a history of impairment or loss of consciousness lasting less than 15 min and/or posttraumatic amnesia, and a Glasgow Coma Scale score between 13 and 15 on hospital admission. We evaluated 50 MHT patients 18 months after the trauma, addressing signs and symptoms of post-concussion syndrome, quality of life, and the presence of anxiety and depression. We correlated those findings with S100B protein levels and the cranial CT scan performed at hospital admission after the trauma. Method: Patients were asked to fill out questionnaires assessing quality of life (SF36), anxiety and depression (HADS), and signs and symptoms of post-concussion syndrome. For the control group, we asked the patients' household members, who had no history of head trauma of any type, to answer the same questionnaires for comparison. Results: The total quality of life index for patients with MHT was 58.16 (±5), lower than the 73.47 (±4) of the control group. Twenty patients (55.2%) and four controls (11.1%) were depressed. Seventeen patients (47.2%) presented anxiety, whereas only eight controls (22.2%) were considered anxious. Victims of MHT complained more frequently of loss of balance, dry mouth, pain in the arms, loss of memory, and dizziness than their respective controls (p < 0.05). We found no correlation between the presence of these signs and symptoms, quality of life, or anxiety and depression and S100B protein levels or the presence of injury on the cranial CT performed at hospital admission. Conclusion: Even 18 months after the trauma, MHT is associated with a higher incidence of post-concussion syndrome symptoms, lower quality of life, and more anxiety than in controls. (C) 2007 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

OBJECTIVES: The aim of this study was to determine the impact of endovascular surgery versus open vascular technique training in a Brazilian teaching service. DESIGN: Cross-sectional study. SETTING: Hospital das Clinicas, Faculty of Medicine, University of Sao Paulo, a tertiary institutional hospital in Brazil. PARTICIPANTS: We reviewed 1,040 arterial operations performed during 2 distinct time periods: January 1995 to December 1996 and January 2006 to December 2007. Based on the disease treated, the procedures were classified into the following 5 groups: abdominal aortic aneurysms (AAA), aorto-iliac obstructive disease (AI), obstructive disease of the femoropopliteal-tibial segment (FP), carotid disease (C), and others (O). The operations were also divided into an endovascular surgery (ES) group and an open surgery (OS) group. We compared the number of open and endovascular procedures for each arterial disease group during both periods. RESULTS: During the 2006-2007 period, 654 patients were treated surgically, whereas over the 1995-1996 period, 386 arterial operations were performed. A significant increase in endovascular procedures (p < 0.001) was found from the 1995-1996 period to the 2006-2007 period (35 vs 351, respectively) across all groups, whereas open surgery showed a slight increase in the number of procedures in the AAA and O groups only. In the 1995-1996 period, OS was the primary surgical method for all groups, but in the 2006-2007 time frame, OS was performed more frequently than ES only in the AAA and O groups. Considering all vascular disease groups, OS was the technique used in 90.9% (351 of 386) of the operations during 1995-1996, whereas in 2006-2007, OS was performed in only 46.3% (303 of 654) of the procedures. CONCLUSIONS: The increase in the number of ES procedures observed over the past decade has had little impact on the OS procedures performed at our medical center and has not harmed open surgical training. (J Surg 68:19-23. (C) 2011 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

Relevance: 20.00%

Abstract:

Context: Perioperative red blood cell transfusion is commonly used to address anemia, an independent risk factor for morbidity and mortality after cardiac operations; however, evidence regarding optimal blood transfusion practice in patients undergoing cardiac surgery is lacking. Objective: To define whether a restrictive perioperative red blood cell transfusion strategy is as safe as a liberal strategy in patients undergoing elective cardiac surgery. Design, Setting, and Patients: The Transfusion Requirements After Cardiac Surgery (TRACS) study, a prospective, randomized, controlled clinical noninferiority trial conducted between February 2009 and February 2010 in an intensive care unit at a university hospital cardiac surgery referral center in Brazil. Consecutive adult patients (n=502) who underwent cardiac surgery with cardiopulmonary bypass were eligible; analysis was by intention to treat. Intervention: Patients were randomly assigned to a liberal strategy of blood transfusion (to maintain a hematocrit ≥30%) or to a restrictive strategy (hematocrit ≥24%). Main Outcome Measure: Composite end point of 30-day all-cause mortality and severe morbidity (cardiogenic shock, acute respiratory distress syndrome, or acute renal injury requiring dialysis or hemofiltration) occurring during the hospital stay. The noninferiority margin was predefined at -8% (ie, an 8% minimal clinically important increase in occurrence of the composite end point). Results: Hemoglobin concentrations were maintained at a mean of 10.5 g/dL (95% confidence interval [CI], 10.4-10.6) in the liberal-strategy group and 9.1 g/dL (95% CI, 9.0-9.2) in the restrictive-strategy group (P<.001). A total of 198 of 253 patients (78%) in the liberal-strategy group and 118 of 249 (47%) in the restrictive-strategy group received a blood transfusion (P<.001). Occurrence of the primary end point was similar between groups (10% liberal vs 11% restrictive; between-group difference, 1% [95% CI, -6% to 4%]; P=.85). Independent of transfusion strategy, the number of transfused red blood cell units was an independent risk factor for clinical complications or death at 30 days (hazard ratio for each additional unit transfused, 1.2 [95% CI, 1.1-1.4]; P=.002). Conclusion: Among patients undergoing cardiac surgery, the use of a restrictive perioperative transfusion strategy compared with a more liberal strategy resulted in noninferior rates of the combined outcome of 30-day all-cause mortality and severe morbidity. Trial Registration: clinicaltrials.gov Identifier: NCT01021631. JAMA. 2010;304(14):1559-1567. www.jama.com
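
To illustrate what the predefined -8% noninferiority margin means in practice, the sketch below performs a simple normal-approximation check of the difference in composite event rates against that margin. It uses the proportions and group sizes quoted in the abstract, but it is only an approximation; the trial's actual statistical analysis may have differed.

# Minimal sketch, assuming a normal approximation: noninferiority check of a
# difference in event proportions against a predefined margin of -8%.
from math import sqrt

p_liberal, n_liberal = 0.10, 253          # composite end point, liberal strategy
p_restrictive, n_restrictive = 0.11, 249  # composite end point, restrictive strategy
margin = -0.08                            # predefined noninferiority margin

diff = p_liberal - p_restrictive
se = sqrt(p_liberal * (1 - p_liberal) / n_liberal
          + p_restrictive * (1 - p_restrictive) / n_restrictive)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

print(f"difference = {diff:+.1%}, 95% CI ({lo:+.1%}, {hi:+.1%})")
print("noninferior (lower CI bound above the -8% margin):", lo > margin)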

Relevance: 20.00%

Abstract:

Background: Physical activity (PA) has proven benefits in the primary prevention of heart diseases such as heart failure (HF). Although this is well known, the PA habits of HF patients and physicians' advice about PA have been poorly described. The aim of this study was to investigate whether physicians were advising HF patients to exercise and to quantify patients' exercise profiles in a complex cardiology hospital. Methods: All 131 HF patients (80 male, average age 53±10 years, NYHA class I-V, left ventricular ejection fraction 35±11%; 35 ischemic, 35 idiopathic, 32 hypertensive, and 29 with Chagas disease) came to the hospital for a routine HF check-up. On this occasion, after seeing the physician, we asked the patients whether the physician had advised them about PA. We then asked them to fill in the International Physical Activity Questionnaire (IPAQ) Short Form to classify their PA level. Results: Our data showed a significant difference between patients who had received any kind of PA advice from physicians (36%) and those who had not (64%, p<0.0001). Using the IPAQ criteria, of the 36% of patients who had received advice, 12.4% were classified as low and 23.6% as moderate. Of the 64% of patients who did not receive advice, 26.8% were classified as low and 37.2% as moderate. Etiology (except Chagas disease), functional class, ejection fraction, sex, and age did not influence the PA profile. Conclusions: Physicians at a tertiary cardiology hospital were not giving patients satisfactory advice regarding PA. Our data support the need to strengthen exercise encouragement by physicians and for complementary studies in this area. (Cardiol J 2010; 17, 2: 143-148)

Relevance: 20.00%

Abstract:

Background - The effect of prearrest left ventricular ejection fraction (LVEF) on outcome after cardiac arrest is unknown. Methods and Results - During a 26-month period, Utstein-style data were prospectively collected on 800 consecutive inpatient adult index cardiac arrests in an observational, single-center study at a tertiary cardiac care hospital. Prearrest echocardiograms were performed on 613 patients (77%) at 11±14 days before the cardiac arrest. Outcomes among patients with normal or nearly normal prearrest LVEF (≥45%) were compared with those of patients with moderate or severe dysfunction (LVEF <45%) by chi-square and logistic regression analyses. Survival to discharge was 19% in patients with normal or nearly normal LVEF compared with 8% in those with moderate or severe dysfunction (adjusted odds ratio, 4.8; 95% confidence interval, 2.3 to 9.9; P<0.001) but did not differ with regard to sustained return of spontaneous circulation (59% versus 56%; P=0.468) or 24-hour survival (39% versus 36%; P=0.550). Postarrest echocardiograms were performed on 84 patients within 72 hours after the index cardiac arrest; the LVEF decreased 25% in those with normal or nearly normal prearrest LVEF (60±9% to 45±14%; P<0.001) and decreased 26% in those with moderate or severe dysfunction (31±7% to 23±6%; P<0.001). For all patients, prearrest beta-blocker treatment was associated with higher survival to discharge (33% versus 8%; adjusted odds ratio, 3.9; 95% confidence interval, 1.8 to 8.2; P<0.001). Conclusions - Moderate and severe prearrest left ventricular systolic dysfunction was associated with substantially lower rates of survival to hospital discharge compared with normal or nearly normal function.
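
As a rough illustration of how an adjusted odds ratio like those quoted above is obtained (not the study's own model or data), the sketch below fits a logistic regression of survival to discharge on prearrest LVEF category with one hypothetical covariate (prearrest beta-blocker use). All values are synthetic.

# Minimal sketch, assuming synthetic data: adjusted odds ratios from a logistic
# regression of survival to discharge on LVEF category and beta-blocker use.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 613
normal_lvef = rng.integers(0, 2, size=n)        # 1 = prearrest LVEF >= 45% (synthetic)
beta_blocker = rng.integers(0, 2, size=n)       # 1 = prearrest beta-blocker use (synthetic)
linpred = -2.5 + 1.2 * normal_lvef + 1.0 * beta_blocker
survived = (rng.random(n) < 1 / (1 + np.exp(-linpred))).astype(int)

X = sm.add_constant(np.column_stack([normal_lvef, beta_blocker]))
fit = sm.Logit(survived, X).fit(disp=False)
or_lvef, or_bb = np.exp(fit.params[1:3])        # exponentiated coefficients = odds ratios
print(f"adjusted OR, normal vs reduced LVEF: {or_lvef:.2f}")
print(f"adjusted OR, prearrest beta-blocker use: {or_bb:.2f}")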

Relevance: 20.00%

Abstract:

This is a retrospective observational study of clinical and epidemiologic data from bloodstream yeast infections over 5 years (2004-2008) in a tertiary-care hospital. During this period, there were 52 such infections, at a rate of 2.4 per 1,000 hospital admissions. Non-C. albicans Candida species and other genera were responsible for 82% of infections, with C. tropicalis and C. parapsilosis being the most common. In 2008 no C. albicans infections occurred. Several uncommon fungal pathogens were observed, including Trichosporon asahii, Rhodotorula spp. and Candida zeylanoides. Of 16 isolates tested, 3 (19%) were resistant to fluconazole, including one C. zeylanoides (MIC 8 µg/ml) and one C. tropicalis (MIC 16 µg/ml) isolate, as well as intrinsically resistant C. krusei. All isolates tested were susceptible to itraconazole (n = 7) and amphotericin B (n = 8). Yeast infections were associated with severe underlying diseases, mainly hematological/solid cancers (71%), hospitalization in the ICU (41%), central venous catheters (80%), and use of antimicrobials (94%). The overall mortality rate was 50%. Our finding of a predominance of non-C. albicans Candida species infection with uncommon yeasts, and fluconazole resistance, suggests the need for continuous surveillance of fungemia and of antibiotic susceptibility trends, in order to adopt treatment strategies applicable to particular healthcare institutions.

Relevance: 20.00%

Abstract:

Background: This study evaluated the impact of 2 models of educational intervention on rates of central venous catheter-associated bloodstream infections (CVC-BSIs). Methods: This was a prospective observational study conducted between January 2005 and June 2007 in 2 medical intensive care units (designated ICU A and ICU B) in a large teaching hospital. The study was divided into 3 periods: baseline (only rates were evaluated), preintervention (questionnaire to evaluate knowledge of health care workers [HCWs] and observation of CVC care in both ICUs), and intervention (in ICU A, a tailored, continuous intervention; in ICU B, a single lecture). The preintervention and intervention periods for each ICU were compared. Results: During the preintervention period, 940 CVC-days were evaluated in ICU A and 843 CVC-days were evaluated in ICU B. During the intervention period, 2175 CVC-days were evaluated in ICU A and 1694 CVC-days were evaluated in ICU B. Questions regarding CVC insertion, disinfection during catheter manipulation, and use of an alcohol-based product during dressing application were answered correctly by 70%-100% of HCWs. Nevertheless, HCWs' adherence to these practices in the preintervention period was low for CVC handling and dressing, hand hygiene (6%-35%), and catheter hub disinfection (45%-68%). During the intervention period, HCWs' adherence to hand hygiene was 48%-98%, and adherence to hub disinfection was 82%-97%. CVC-BSI rates declined in both units. In ICU A, this decrease was progressive and sustained, from 12 CVC-BSIs/1000 CVC-days at baseline to 0 after 9 months. In ICU B, the rate initially dropped from 16.2 to 0 CVC-BSIs/1000 CVC-days but then increased to 13.7 CVC-BSIs/1000 CVC-days. Conclusion: A personal, customized, continuous intervention seems to develop a "culture of prevention" and is more effective than a single intervention, leading to a sustained reduction in infection rates.
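
For readers unfamiliar with the metric used above, the sketch below shows how a device-associated infection rate per 1000 catheter-days is calculated. The infection count in the example is hypothetical; only the denominator (2175 CVC-days in ICU A during the intervention period) comes from the abstract.

# Minimal sketch, illustrative only: infection rate per 1000 catheter-days.
def bsi_rate_per_1000_catheter_days(n_infections: int, catheter_days: int) -> float:
    """Central venous catheter-associated BSI rate per 1000 CVC-days."""
    return 1000.0 * n_infections / catheter_days

# Hypothetical example: 5 infections over 2175 observed CVC-days.
print(f"{bsi_rate_per_1000_catheter_days(5, 2175):.1f} CVC-BSIs per 1000 CVC-days")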

Relevance: 20.00%

Abstract:

To describe the effect of active surveillance to control vancomycin-resistant enterococci (VRE) after an outbreak, 549 surveillance rectal cultures were performed in 308 patients (35% positive). An educational intervention to prevent transmission was implemented. Infection and colonization by vancomycin-resistant Enterococcus faecalis decreased, but Enterococcus faecium persisted despite control measures. Infections by VR E. faecalis fell to zero in 2008. We observed difficulties in controlling colonization with measures guided mainly by surveillance cultures, as well as differences between the responses of E. faecium and E. faecalis.

Relevance: 20.00%

Abstract:

Cervical cancer is a serious public health problem among women in developing countries because of the absence or ineffectiveness of screening programs. Several barriers to accessing medical care and the inequity of the public health system in a continental country like Brazil limit the implementation of adequate programs to prevent cervical cancer. Therefore, the aim of this study was to evaluate the results of using a mobile unit (MU) for cervical cancer screening. From May 2003 to May 2004, cervical cancer screening was offered to women aged 20-69 years residing in 19 municipal districts of the Barretos county region, in Sao Paulo. Out of the 9,560 examinations available, 2,964 (31%) women underwent screening. The average distance traveled by the MU was 45 km. The average time spent by women in the MU to complete the questionnaire and undergo the exam was 20 minutes. It was observed that 17.0% of the women screened had never had the test or had not had it repeated within the last 3 years. This negative response was more common among women aged 20 to 29 years and 60 to 69 years and among women with less schooling and lower socioeconomic status (P < 0.05). The MU can significantly overcome the chronic accessibility deficiencies of the public health system, offering these women the opportunity to participate in screening programs. Diagn. Cytopathol. 2010;38:727-730. (C) 2009 Wiley-Liss, Inc.

Relevance: 20.00%

Abstract:

Objective: To test the hypothesis that red blood cell (RBC) transfusions in preterm infants are associated with increased intra-hospital mortality. Study design: Variables associated with death were studied with Cox regression analysis in a prospective cohort of preterm infants with birth weight <1500 g in the Brazilian Network on Neonatal Research. Intra-hospital death and death after 28 days of life were analyzed as dependent variables. Independent variables were infant demographic and clinical characteristics and RBC transfusions. Results: Of 1077 infants, 574 (53.3%) received at least one RBC transfusion during the hospital stay. The mean number of transfusions per infant was 3.3±3.4, with 2.1±2.1 in the first 28 days of life. Intra-hospital death occurred in 299 neonates (27.8%), and 60 infants (5.6%) died after 28 days of life. After adjusting for confounders, the relative risk of death during the hospital stay was 1.49 in infants who received at least one RBC transfusion in the first 28 days of life, compared with infants who did not receive a transfusion. The risk of death after 28 days of life was 1.89 times higher in infants who received more than two RBC transfusions during their hospital stay, compared with infants who received one or two transfusions. Conclusion: Transfusion was associated with an increased risk of death, and transfusion guidelines should consider the risks and benefits of transfusion. (J Pediatr 2011; 159: 371-6)
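
As a rough illustration of the kind of Cox regression this abstract refers to (not the cohort's actual analysis), the sketch below fits a proportional hazards model relating a transfusion indicator to intra-hospital death, adjusted for a single hypothetical covariate (birth weight). It assumes the third-party lifelines package is available; all data are synthetic, and the study's real model adjusted for a larger set of clinical confounders.

# Minimal sketch, assuming synthetic data and the lifelines package:
# Cox proportional hazards model for intra-hospital death versus RBC transfusion.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 1077
df = pd.DataFrame({
    "transfused": rng.integers(0, 2, size=n),              # >= 1 RBC transfusion in first 28 days (synthetic)
    "birth_weight": rng.normal(1100, 250, size=n),          # grams (hypothetical covariate)
    "days_in_hospital": rng.exponential(40, size=n).round().clip(1, 180),
    "died": rng.integers(0, 2, size=n),                     # 1 = intra-hospital death (synthetic)
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_in_hospital", event_col="died")
cph.print_summary()                                         # exp(coef) column gives the hazard ratios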