997 results for: reference center for social assistance
Abstract:
INTRODUCTION: Hypoplastic left heart syndrome (HLHS) is a major cause of cardiac death during the first week of life. The hybrid approach is a reliable, reproducible treatment option for patients with HLHS. Herein we report our results using this approach, focusing on its efficacy, safety, and late outcome. METHODS: We reviewed prospectively collected data on patients treated for HLHS using a hybrid approach between July 2007 and September 2014. RESULTS: Nine patients had a stage 1 hybrid procedure, with seven undergoing a comprehensive stage 2 procedure. One patient completed the Fontan procedure. Five patients underwent balloon atrial septostomy after the hybrid procedure; in three patients, a stent was placed across the atrial septum. There were three deaths: two early after the hybrid procedure and one early after stage 2 palliation. Overall survival was 66%. CONCLUSIONS: In our single-center series, the hybrid approach for HLHS yields intermediate results comparable to those of the Norwood strategy. The existence of dedicated teams for the diagnosis and management of these patients, preferably in high-volume centers, is of major importance in this condition.
Abstract:
In a liver transplant (LT) center, treatments with Prometheus were evaluated. The main outcomes considered were 1- and 6-month survival. Methods. During the study period, 74 patients underwent treatment with Prometheus; 64 were enrolled, with a mean age of 51 ± 13 years; 47 men underwent 212 treatments (mean, 3.02 per patient). The parameters evaluated were age, sex, and laboratory (liver enzymes, ammonia) and clinical (Model for End-Stage Liver Disease and Child-Turcotte-Pugh score) data. Results. Death occurred in 23 patients (35.9%) during the hospitalization period, 20 patients (31.3%) underwent liver transplantation, and 21 were discharged. LT was performed in 4 patients with acute liver failure (ALF, 23.7%), in 7 patients with acute-on-chronic liver failure (AoCLF, 43.7%), and in 6 patients with liver disease after LT (30%). Seven patients who underwent LT died (35%). In the multivariate analysis, older age (P = .015), higher international normalized ratio (INR) (P = .019), and acute liver failure (P = .039) were independently associated with an adverse 1-month clinical outcome. On the other hand, older age (P = .011) and acute kidney injury (AKI) (P = .031) at presentation were both related to a worse 6-month outcome. For patients with ALF and AoCLF we did not observe the same differences. Conclusions. In this cohort, older age was the most important parameter defining 1- and 6-month survival, although higher INR and presence of ALF were important for 1-month survival and AKI for 6-month survival. No difference was observed between patients who underwent LT and those who did not.
Abstract:
Objective. To assess the incidence of infectious complications after liver transplantation (LT). Design. A retrospective, single-center study. Materials and Methods. Patients undergoing LT from January 2008 to December 2011 were considered. The exclusion criterion was death occurring in the first 48 hours after LT. We determined the site of infection and the bacterial isolates, and collected and compared recipient variables, graft variables, surgical data, and post-LT clinical data. Results. Of the 492 patients who underwent LT and the 463 considered for this study, 190 (Group 1, 41%) developed at least 1 infection, with 298 infections detected. Of these, 189 microorganisms were isolated, 81 (51%) gram-positive bacteria (most frequently Staphylococcus spp). Biliary infections were more frequent (mean time of 160.4 ± 167.7 days after LT); from 3 months after LT onward, gram-negative bacteria were observed (57%). Patients with infections after LT presented lower aminotransferase levels but higher requirements for blood transfusions, intraoperative vasopressors, hemodialysis, and hospital stay. Operative and cold ischemia times were similar. Conclusion. We found a 41% incidence of infections over a 2-year follow-up after LT. Gram-positive bacteria were more frequently isolated; however, gram-negative bacteria were commonly isolated later. Clinical data after LT were more relevant for the development of infections. Donor variables should be considered in future analyses.
Abstract:
INTRODUCTION: The aim of this preliminary work is to analyze the clinical features of 52 patients with a functioning transplanted kidney for >25 years (all first transplants and all deceased-donor recipients) and to compare them with a similar, though more complete, study from Hôpital Necker, Paris (2012). METHODS: Mean graft survival is 12.7% at 25 years and 10% at 30 years. The current mean serum creatinine concentration is 1.3 mg/L. We analyzed recipient age (mean, 35.9 years) and gender (29 men and 23 women). Donor age was 26.7 ± 10.3 years. Seven patients (13.4%) were transplanted with 1 HLA mismatch, 42.3% with 2 mismatches, and 44.2% with 3 mismatches. Mean cold ischemia time was 15.45 ± 7.7 hours. Of the recipients, 76% had immediate graft function; 38% experienced 1 acute rejection episode, and 4 patients had 2 rejection crises. The initial immunosuppressive regimen was azathioprine (AZA) + prednisolone (Pred) in 14 patients, cyclosporin (CSA) + Pred in 13 patients, and CSA + AZA + Pred in 25 patients. Of these patients, 19% maintained their initial regimen, and 54% (28 patients) were very stable on a mixed CSA regimen for >25 years. RESULTS: We present the major complications (diabetes, neoplasia, and hepatitis C virus positivity). CONCLUSION: Our results in deceased-donor kidney recipients at >25 years are similar to those of the mixed population (deceased and living donors) presented by the Necker group, although 54% of our patients remain on CSA immunosuppression, contradicting the idea that its use is incompatible with good long-term kidney function in transplant recipients.
Abstract:
INTRODUCTION: New scores have been developed and validated in the US for in-hospital mortality risk stratification in patients undergoing coronary angioplasty: the National Cardiovascular Data Registry (NCDR) risk score and the Mayo Clinic Risk Score (MCRS). We sought to validate these scores in a European population with acute coronary syndrome (ACS) and to compare their predictive accuracy with that of the GRACE risk score. METHODS: In a single-center ACS registry of patients undergoing coronary angioplasty, we used the area under the receiver operating characteristic curve (AUC), a graphical representation of observed vs. expected mortality, and net reclassification improvement (NRI)/integrated discrimination improvement (IDI) analysis to compare the scores. RESULTS: A total of 2148 consecutive patients were included, mean age 63 years (SD 13), 74% male and 71% with ST-segment elevation ACS. In-hospital mortality was 4.5%. The GRACE score showed the best AUC (0.94, 95% CI 0.91-0.96) compared with NCDR (0.87, 95% CI 0.83-0.91, p=0.0003) and MCRS (0.85, 95% CI 0.81-0.90, p=0.0003). In model calibration analysis, GRACE showed the best predictive power. With GRACE, patients were more often correctly classified than with MCRS (NRI 78.7, 95% CI 59.6-97.7; IDI 0.136, 95% CI 0.073-0.199) or NCDR (NRI 79.2, 95% CI 60.2-98.2; IDI 0.148, 95% CI 0.087-0.209). CONCLUSION: The NCDR and Mayo Clinic risk scores are useful for risk stratification of in-hospital mortality in a European population of patients with ACS undergoing coronary angioplasty. However, the GRACE score remains the preferred option.
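As an illustrative aside (not part of the study above), the AUC used to compare the three risk scores has a simple rank-based interpretation: it is the probability that a randomly chosen patient who died received a higher score than a randomly chosen survivor, counting ties as one half. The helper and data below are hypothetical, purely to show the computation:

```python
def auc(scores, labels):
    # AUC as the Mann-Whitney statistic: the fraction of
    # (event, non-event) pairs where the event is scored higher,
    # with ties counted as 1/2.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical data: 1 = in-hospital death, 0 = survival,
# with two competing risk scores for the same six patients.
labels  = [1, 1, 0, 0, 0, 0]
score_a = [0.9, 0.7, 0.6, 0.3, 0.2, 0.1]  # perfectly ranks all deaths higher
score_b = [0.8, 0.4, 0.6, 0.5, 0.2, 0.1]  # misranks one death

print(auc(score_a, labels))  # 1.0
print(auc(score_b, labels))  # 0.75
```

A higher-AUC score, such as GRACE in the registry above, ranks more of these pairs correctly; formal comparison of two AUCs on the same patients would additionally need a paired test such as DeLong's method.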
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Finance from the NOVA – School of Business and Economics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics
Abstract:
The four studies in this article introduce a questionnaire to measure the Strength of the HRM System (HRMSQ), a multidimensional construct theoretically developed by Bowen and Ostroff (2004). Strength of the HRM System is a set of process characteristics that lead to effectiveness in conveying signals to employees that allow them to create a shared meaning of desired and appropriate work behaviours. Nine characteristics are suggested, grouped into three features: Distinctiveness, Consistency, and Consensus. Study 1 developed and tested a questionnaire in a sample of workers from five different sectors. Study 2 cross-validated the measure in a sample of civil servants in a municipality. These two studies used performance appraisal as the reference HRM practice and led to a short version of the HRMSQ. Studies 3 and 4 extend the HRMSQ to several common HRM practices. The HRMSQ is tested in two samples: call-center workers and workers from several private and public organizations (Study 3). In Study 4 the questionnaire is refined and tested with a sample from a hotel chain, and finally cross-validated with two other samples, in the insurance and battery sectors, leading to a longer version of the HRMSQ. Content analysis of several interviews with human resource managers and the Rasch model (1960, 1961, 1980) were used to define and select the indicators of the questionnaire. Convergent, discriminant, and predictive validity of the measure are tested. The results of the four studies highlight the complexity of the relationships between the proposed characteristics and support the validity of a parsimonious measure of Strength of the HRM System.
Abstract:
Dissertation presented to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics
Abstract:
A survey was conducted to determine the most common hospital accidents involving biologically contaminated material among students at the Medical College of the Federal University of Minas Gerais. Six hundred ninety-four students (between the fifth and twelfth semesters of the course) answered the questionnaire individually. Three hundred forty-nine accidents were reported. The accident rate was 33.9% in the third semester of the course and increased over time, reaching 52.3% in the last semester. Sixty-three percent of the accidents were needlestick or sharp-object injuries; 18.3% were mucous membrane exposures; 16.6% were skin exposures; and 1.7% were simultaneous skin and mucous membrane exposures. The contaminating substances were blood (88.3%), vaginal secretion (1.7%), and others (9.1%). The parts of the body most frequently affected were the hands (67%), eyes (18.9%), mouth (1.7%), and others (6.3%). The procedures being performed when the accidents occurred were suturing (34.1%), applying anesthesia (16.6%), assisting surgery (8.9%), disposing of needles (8.6%), assisting delivery (6.3%), and others (25.9%). Forty-nine percent of those involved reported the accident to the accident control department; of these, 29.2% did not receive adequate medical assistance. Eight percent of those involved used antiretroviral drugs; of these, 86% discontinued the treatment on learning that the source patient's ELISA test was HIV-negative, 6.4% discontinued due to side effects, and 16% completed the treatment.