Abstract:
Introduction: Pediatric percutaneous renal biopsy (Bx) is a routine procedure in pediatric nephrology to obtain renal tissue for histological study. We evaluated the safety, efficacy, indications and renal findings of this procedure at a tertiary care pediatric university hospital and compared our findings with the literature. Methods: Retrospective study based on medical records from January 1993 to June 2006. Results: In the study period, 305 Bx were performed in 262 patients, 127 (48.5%) male, aged 9.8 +/- 4.2 years. A 16-gauge needle was used in 56/305 Bx and an 18-gauge needle in 252/305 Bx (82.6%). 56.1% of Bx were performed under sedation plus local anesthesia and 43.9% under general anesthesia. The number of punctures per Bx was 3.1 +/- 1.3. Minor complications occurred in 8.6% of procedures. The 16-gauge needle caused a higher frequency of renal hematomas (p = 0.05). The number of glomeruli per puncture was >= 5 in 96.7% and >= 7 in 92%. The number of glomeruli per puncture and the frequency of complications did not differ according to the type of anesthesia used. A renal pathology diagnosis was achieved in 93.1% of Bx. The main indications for Bx were nephrotic syndrome (NS), lupus nephritis (LN) and hematuria (HE). The diagnoses of minimal change disease (MCD) (61.3%), class V LN (35.6%) and IgA nephropathy (26.3%) predominated in NS, LN and HE patients, respectively. Conclusion: Pediatric real-time ultrasound-guided percutaneous renal biopsy was safe and effective. The main clinical indications for Bx were NS and LN; the predominant renal pathology diagnoses were MCD and class V LN.
Abstract:
Background: Double-balloon enteroscopy (DBE) allows evaluation of and therapy for various small bowel diseases. In this series, the outcome of a 4-year experience at a tertiary teaching hospital in Brazil is reported. Methods: A total of 457 consecutive DBE were performed in 418 patients from August 2004 to August 2008. Ninety-three patients with various indications, in whom the aim was not the evaluation of suspected small bowel mucosal disease, were excluded, leaving 364 DBE in 325 patients for analysis. Data on clinical features, endoscopic findings, therapy and complications were collected retrospectively. Results: Among the 364 DBE performed in 325 patients, 143/325 were male (44%) and 182/325 female (56%), with a mean age of 48.6 +/- 15.7 years (range 17-89). Mean investigation time was 64 +/- 22 min (range 35-135). The depth of insertion beyond the ligament of Treitz was 230 +/- 85 cm (range 30-500) by the antegrade approach and 140 +/- 75 cm (range 0-320) by the retrograde approach. Total enteroscopy was achieved in 41.66% of the attempts (30 of 72 patients). Overall diagnostic yield was 54.95% (200 of 364 procedures), ranging from 0 to 100% depending on the indication. Angiodysplasia was the main diagnosis, in 24.5% (49 of 200 procedures), and endoscopic treatment, including biopsies, hemostasis, tattooing and polypectomy, was performed in 65.38% (238 of 364 procedures). No major complications were reported. Conclusions: DBE is a feasible, safe and well-tolerated procedure allowing endoscopic therapy. Selection of indications increases its diagnostic yield. Copyright (C) 2009 S. Karger AG, Basel
Abstract:
BACKGROUND: Transanal endoscopic microsurgery may represent an appropriate diagnostic and therapeutic procedure in selected patients with distal rectal cancer following neoadjuvant chemoradiation. Even though this procedure has been associated with low rates of postoperative complications, patients undergoing neoadjuvant chemoradiation seem to be at increased risk for suture line dehiscence. In this setting, we compared the clinical outcomes of patients undergoing transanal endoscopic microsurgery with and without neoadjuvant chemoradiation. METHODS: Thirty-six consecutive patients were treated by transanal endoscopic microsurgery at a single institution. Twenty-three patients underwent local excision after neoadjuvant chemoradiation therapy for rectal adenocarcinoma, and 13 patients underwent local excision without any neoadjuvant treatment for benign and malignant rectal tumors. Chemoradiation therapy included 50.4 to 54 Gy and 5-fluorouracil-based chemotherapy. All patients underwent transanal endoscopic microsurgery with primary closure of the rectal defect. Complication (immediate and late) and readmission rates were compared between groups. RESULTS: Overall, median hospital stay was 2 days. The immediate (30-day) complication rate was 44% for grade II/III complications. Patients undergoing neoadjuvant chemoradiation therapy were more likely to develop grade II/III immediate complications (56% vs 23%; P = .05). Overall, the 30-day readmission rate was 30%. Wound dehiscence was significantly more frequent among patients undergoing neoadjuvant chemoradiation therapy (70% vs 23%; P = .03), and these patients were at significantly higher risk of requiring readmission (43% vs 7%; P = .02). CONCLUSION: Transanal local excision using an endoscopic microsurgical approach may result in significant postoperative morbidity, wound dehiscence, and readmission, in particular because of rectal pain secondary to wound dehiscence.
In this setting, the benefits of this minimally invasive approach, whether for diagnostic or therapeutic purposes, become restricted to highly selected patients who can potentially avoid a major operation but will still face a significantly morbid and painful procedure.
Abstract:
The purpose of the present article was to present the series operated on by a Liver Transplant Group from the interior of the State of Sao Paulo, Brazil. Sixty patients were transplanted from May 2001 to May 2007. Thirty percent of the patients had alcoholic cirrhosis, 18.3% had hepatitis C virus-induced cirrhosis, 10% had cirrhosis induced by both hepatitis C virus and alcohol, 6% had hepatitis B virus-induced cirrhosis, 13.3% had cryptogenic cirrhosis, 8.3% had autoimmune cirrhosis, 13.3% had familial amyloidotic polyneuropathy (FAP), and 13.3% had hepatocellular carcinomas. The series was divided by a chronological criterion into two periods, A (n = 42) and B (n = 18), with the latter group operated on under the Model for End-stage Liver Disease (MELD) criterion. Sixty-nine percent were men. Age ranged from 14 to 66 years. Period A included 12% Child A, 59.2% Child B, 24% Child C, and 4.8% FAP patients; period B comprised 22.2% Child A, 11.1% Child B, 33.3% Child C, and 33.3% FAP patients. MELD scores ranged from 8 to 35 for period A and from 14 to 31 for period B. Intraoperative mortality was 2/42 patients for period A and 0/18 for period B. Overall postoperative mortality was 40% for period A (35% among Child B and C patients and 5% among FAP and Child A patients; P < .05) and 16.6% for period B (11.1% among Child B patients and 5.5% among FAP patients). Retransplantation was required in 3.3% of patients due to hepatic artery thrombosis. Actual postoperative survival was 60% during period A and 83.3% during period B, with an overall survival rate of 67% for the two periods. The present results show levels of postoperative mortality (especially during period B) and survival rates similar to those reported by several other centers in Brazil.
Abstract:
Background: Disease management programs (DMPs) have been developed to address the high morbidity, mortality and costs of congestive heart failure (CHF). Most studies have focused on intensive programs in academic centers. Washington County Hospital (WCH) in Hagerstown, MD, the primary referral center for a semirural county, established a CHF DMP in 2001 with standardized documentation of screening and participation. Linkage to electronic records and state vital statistics enabled examination of the CHF population, including individuals participating in and those ineligible for the program. Methods: All WCH inpatients discharged alive with a CHF International Classification of Diseases, Ninth Revision code in any position of the hospital discharge list were included. Results: Of 4,545 consecutive CHF admissions, only 10% enrolled, and of those only 52.2% made a call. Enrollment in the program was related to age (OR 0.64 per decade older, 95% CI 0.58-0.70), CHF as the main reason for admission (OR 3.58, 95% CI 2.4-4.8), previous admission for CHF (OR 1.14, 95% CI 1.09-1.2), and shorter hospital stay (OR 0.94 per day longer, 95% CI 0.87-0.99). Among DMP participants, mortality rates were lowest in the first month (80/1,000 person-years) and increased subsequently. The opposite mortality trend occurred in the nonenrolled groups, with first-month mortality of 814/1,000 person-years among refusers and even higher among the ineligible (1,569/1,000 person-years). This difference remained significant after adjustment. Readmission rates were lower among participants who called consistently (adjusted incidence rate ratio 0.62, 95% CI 0.52-0.77). Conclusion: Only a small and highly selected group participated in a low-intensity DMP for CHF in a community-based hospital. Design of DMPs should incorporate these strong selective factors to maximize program impact. (Am Heart J 2009;158:459-66.)
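The mortality figures above are expressed per 1,000 person-years, a rate that normalizes deaths by total follow-up time. As a minimal sketch with hypothetical counts (the abstract reports only the final rates, not the underlying numerators and denominators):

```python
def rate_per_1000_person_years(deaths, person_years):
    """Mortality rate expressed per 1,000 person-years of follow-up."""
    return deaths / person_years * 1000

# Hypothetical counts: 8 deaths observed over 100 person-years of follow-up
print(rate_per_1000_person_years(8, 100))  # 80.0, i.e. 80/1,000 person-years
```

Expressing rates this way is what allows the first-month mortality of participants (80/1,000 person-years) to be compared directly with the much shorter-exposure refuser and ineligible groups.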
Abstract:
The Brazilian emergency system is being reorganized as a hierarchy in the region of Ribeirao Preto, state of Sao Paulo. We found an increased occupational risk for tuberculosis in this region's tertiary reference center: nurse technicians (incidence rate [IR] 526.3/100,000) had a risk of tuberculosis 12.6 times (95% confidence interval [CI], 2.57-37.23) that of the city population (41.8/100,000 inhabitants). The system reorganization will have to make the centers adequate to deal with this problem. (C) 2008 Elsevier Inc. All rights reserved.
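The 12.6-fold risk quoted above is the ratio of the two incidence rates reported in the text; a minimal sketch of that calculation (the confidence interval requires the underlying case counts, which the abstract does not give):

```python
def incidence_rate_ratio(rate_exposed, rate_reference):
    """Ratio of two incidence rates measured on the same scale
    (here, cases per 100,000 per year)."""
    return rate_exposed / rate_reference

# Rates from the text: nurse technicians vs. the city population
irr = incidence_rate_ratio(526.3, 41.8)
print(round(irr, 1))  # 12.6, matching the reported risk estimate
```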
Abstract:
Background & aim: Many foodborne disease outbreaks are caused by foods prepared in hospital Food Service and Nutrition Units, affecting hospitalized patients who, in most cases, are immunocompromised and therefore at higher risk of severe worsening of their clinical status. The aim of this study was to determine the variations in temperature and the time-temperature factor of hospital diets. Methods: The time and temperature for the preparation of 4 diets of modified consistency were determined on 5 nonconsecutive days in a hospital Diet and Nutrition Unit, at the end of preparation and during the holding, portioning and distribution periods, at 3 sites: the first, the middle and the last to receive the diets. Results and discussion: All foods reached an adequate temperature at the end of cooking, but temperature varied significantly from the holding period to the end of distribution, characterizing critical periods for microorganism proliferation. During holding, temperatures that presented a risk were reached by 16.7% of the meats and 59% of the salads in the general diet, by 16.7% of the garnishes in the bland diet, and by 20% of the meats and garnishes in the viscous diet. The same occurred at the end of distribution for 100% of the hot samples and the salads and for 61% of the desserts. None of the preparations remained at a risk temperature for a time exceeding that established by law. Conclusion: The exposure to inadequate temperature did not last long enough to pose a risk to patients.
Abstract:
Background: Acute kidney injury (AKI) is a frequent complication in hospitalized patients, especially those in intensive care units (ICUs). The RIFLE classification might be a valid prognostic factor for critically ill cancer patients. The present study aimed to evaluate the discriminatory capacity of RIFLE versus other general prognostic scores in predicting hospital mortality in critically ill cancer patients. Methods: This is a single-center study conducted in a cancer-specialized ICU in Brazil. All 288 patients hospitalized from May 2006 to June 2008 were included. RIFLE classification, APACHE II, SOFA, and SAPS II scores were calculated, and areas under the receiver operating characteristic curve (AROC) and multiple logistic regression were computed using hospital mortality as the outcome. Results: AKI, defined by RIFLE criteria, was observed in 156 (54.2%) patients. The distribution of patients with any degree of AKI was: risk, n = 96 (33.3%); injury, n = 30 (10.4%); and failure, n = 30 (10.4%). Mortality was 13.6% for non-AKI patients, 49% for RIFLE 'R' patients, 62.3% for RIFLE 'I' patients, and 86.8% for RIFLE 'F' patients (p = 0.0006). Logistic regression analysis showed that RIFLE criteria, APACHE II, SOFA, and SAPS II were independent factors for mortality in this population. The discrimination of RIFLE was good (AROC 0.801, 95% CI 0.748-0.854) but inferior to that of APACHE II (AROC 0.940, 95% CI 0.915-0.966), SOFA (AROC 0.910, 95% CI 0.876-0.943), and SAPS II (AROC 0.869, 95% CI 0.827-0.912). Conclusion: AKI is a frequent complication in ICU patients with cancer. RIFLE was inferior to commonly used prognostic scores for predicting mortality in this cohort of patients. Copyright (C) 2011 S. Karger AG, Basel
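The AROC values compared above measure discrimination: the probability that a prognostic score ranks a randomly chosen patient who died above a randomly chosen survivor. A stdlib-only sketch of this equivalence (toy data, not the study's), using the Mann-Whitney formulation of the AUC:

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case (label 1)
    receives a higher score than a randomly chosen negative case
    (label 0), counting ties as 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: a severity score that perfectly separates deaths (1)
# from survivors (0) has AUC 1.0; chance-level discrimination is 0.5.
died = [1, 1, 0, 0, 0]
severity = [30, 25, 20, 10, 5]
print(auc(died, severity))  # 1.0
```

On this scale, the reported AROC of 0.940 for APACHE II versus 0.801 for RIFLE is what justifies the conclusion that RIFLE discriminates less well in this cohort.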
Abstract:
This study investigated the etiology of canine ehrlichiosis and clinical and epidemiological data possibly associated with the infection in 70 dogs suspected of ehrlichiosis attended at the Veterinary Hospital of the Sao Paulo State University in Botucatu city during 2001 and 2002. Dogs were evaluated by clinical-epidemiological and hematological data and by molecular analysis through partial amplification and DNA sequencing of the ehrlichial dsb gene. E. canis DNA was amplified and sequenced in 28 (40.0%) dogs. Dogs younger than 12 months showed significantly higher infection rates (65.0%; P < 0.05). Diarrhea, apathy, and anorexia were the major clinical signs, observed in 55.2% (P = 0.05), 47.0% (P > 0.05), and 42.4% (P > 0.05) of the PCR-positive dogs, respectively. Twenty-five anemic (<5.5 x 10(6) RBC/microL) and 8 leukopenic (<5.5 x 10(3) WBC/microL) dogs were PCR-positive (P > 0.05). All 28 PCR-positive dogs showed thrombocytopenia (<175 x 10(3) platelets/microL), a statistically significant finding (P < 0.05). E. canis was the only Ehrlichia species found in dogs in the studied region, with higher infection rates in younger dogs, and infection was statistically associated with thrombocytopenia.
Abstract:
A pilot survey of injury presentations to a public hospital emergency department was undertaken to determine patterns of alcohol use in this population. Of the 402 injury presentations in the study period, 236 injury cases were interviewed, of whom 45% (n=107) and 29% (n=69) had consumed alcohol within 24 and 6 hours prior to injury, respectively. Mean age was 35.1 years for all injury presentations and 32.6 years for alcohol injury cases. For both injury groups, males were significantly younger than females. Recent alcohol ingestion was three times more common among male than female injury presentations, but females drank at significantly lower levels. Of males who had consumed alcohol within 6 hours prior to injury, nearly 70% were drinking at NHMRC harmful levels and 61% had drunk more than eight standard drinks. Overall, alcohol-involved injury cases commonly occurred among low-income, single males around 30 years of age who were regular heavy drinkers, who had been drinking heavily in licensed premises prior to their injury, and who sustained injury through intentional harm. In addition, one in five of the alcohol-involved injury cases were aged 15-18 years, i.e. below the legal age of purchase. The high proportion of hazardous and harmful drinkers among those who had consumed alcohol within the last 6 hours, and in the injury sample overall, highlights the need for further research to explore the relationship between the occurrence of injury and the drinking patterns and environments associated with injury. Further research is also required to assess the efficacy of early and brief interventions for alcohol and drug use within the emergency ward setting. This information would enable appropriate public health interventions to be initiated.
Abstract:
Australian isolates of vancomycin-resistant enterococci (VRE) have been widely scattered geographically, predominantly polyclonal and of the VanB phenotype. Forty-nine VRE were isolated from 47 patients in our hospital from October 1996 to December 1999. Forty-four of these VRE were Enterococcus faecium with a vanA glycopeptide resistance genotype. Four isolates were pathogenic. Thirty-five VRE were from an outbreak in the Renal and Infectious Diseases Units over a four-month period. Pulsed-field gel electrophoresis (PFGE) demonstrated that 41 of the 49 VRE were indistinguishable or closely related. Enhanced environmental cleaning, strict contact isolation of colonized patients and reduced inpatient admissions terminated the epidemic. Cohorting of methicillin-resistant Staphylococcus aureus (MRSA)-positive patients was restricted because VRE patients occupied the isolation facilities. This resulted in a statistically significant increase in MRSA infections across the hospital. VRE epidemics can influence the epidemiology of other nosocomial pathogens when infection control resources are exhausted. (C) 2001 The Hospital Infection Society.