814 results for Retrospective cohort analysis


Relevance:

100.00%

Publisher:

Abstract:

Injecting drug use (IDU) before and after liver transplantation (LT) is poorly described. The aim of this study was to quantify relapse and survival in this population and to describe the causes of mortality after LT.


It is unknown whether body-mass index (BMI) and commonly defined BMI categories are associated with mortality in patients with septic shock.


Background In Switzerland there are about 150,000 equestrians. Horse-related injuries, including head and spinal injuries, are frequently treated at our level I trauma centre. Objectives To analyse injury patterns, protective factors, and risk factors related to horse riding, and to define groups of safer riders and those at greater risk. Methods We present a retrospective survey and a case-control survey conducted at a tertiary trauma centre in Bern, Switzerland. Injured equestrians from July 2000 to June 2006 were retrospectively classified by injury pattern and neurological symptoms. Injured equestrians from July to December 2008 were prospectively collected using a questionnaire with 17 variables. The same questionnaire was applied to non-injured controls. Multiple logistic regression was performed, and combined risk factors were calculated using inference trees. Results Retrospective survey: a total of 528 injuries occurred in 365 patients. The injury pattern was as follows: extremities (32%: upper 17%, lower 15%), head (24%), spine (14%), thorax (9%), face (9%), pelvis (7%) and abdomen (2%). Two injuries were fatal; one case resulted in quadriplegia and one in paraplegia. Case-control survey: 61 patients and 102 controls (patients: 72% female, 28% male; controls: 63% female, 37% male) were included. Falls were most frequent (65%), followed by horse kicks (19%) and horse bites (2%). Variables significantly associated with being a control were older age (p = 0.015), male gender (p = 0.04) and holding a diploma in horse riding (p = 0.004). Inference trees revealed typical groups less and more likely to suffer injury. Conclusions Riding experience and having passed a diploma in horse riding appear to be protective factors. Educational level and injury risk should be graded within an educational level-injury risk index.


There are more than 10 million prison inmates throughout the world, and this number is increasing continuously. Prisoners are a particularly vulnerable minority group with special healthcare needs and demands on healthcare services and providers. The aim of this study was to give an overview of prisoners' healthcare problems leading to emergency department admission, in order to make recommendations to help optimise treatment of this target group.


INTRODUCTION: The paucity of data on resource use in critically ill patients with hematological malignancy and on these patients' perceived poor outcome can lead to uncertainty over the extent to which intensive care treatment is appropriate. The aim of the present study was to assess the amount of intensive care resources needed for, and the effect of treatment of, hemato-oncological patients in the intensive care unit (ICU) in comparison with a nononcological patient population with a similar degree of organ dysfunction. METHODS: A retrospective cohort study of 101 ICU admissions of 84 consecutive hemato-oncological patients and 3,808 ICU admissions of 3,478 nononcological patients over a period of 4 years was performed. RESULTS: As assessed by Therapeutic Intervention Scoring System points, resource use was higher in hemato-oncological patients than in nononcological patients (median (interquartile range), 214 (102 to 642) versus 95 (54 to 224), P < 0.0001). Severity of disease at ICU admission was a less important predictor of ICU resource use than the necessity for specific treatment modalities. Hemato-oncological patients and nononcological patients with similar admission Simplified Acute Physiology Score scores had the same ICU mortality. In hemato-oncological patients, improvement of organ function within the first 48 hours of the ICU stay was the best predictor of 28-day survival. CONCLUSION: The presence of a hemato-oncological disease per se is associated with higher ICU resource use, but not with increased mortality. If withdrawal of treatment is considered, this decision should not be based on admission parameters but rather on the evolution of organ dysfunction.


INTRODUCTION: Despite the key role of hemodynamic goals, there are few data addressing the question as to which hemodynamic variables are associated with outcome or should be targeted in cardiogenic shock patients. The aim of this study was to investigate the association between hemodynamic variables and cardiogenic shock mortality. METHODS: Medical records and the patient data management system of a multidisciplinary intensive care unit (ICU) were reviewed for patients admitted because of cardiogenic shock. In all patients, the hourly variable time integral of hemodynamic variables during the first 24 hours after ICU admission was calculated. If hemodynamic variables were associated with 28-day mortality, the hourly variable time integral of drops below clinically relevant threshold levels was computed. Regression models and receiver operating characteristic analyses were calculated. All statistical models were adjusted for age, admission year, mean catecholamine doses and the Simplified Acute Physiology Score II (excluding hemodynamic counts) in order to account for the influence of age, changes in therapies during the observation period, the severity of cardiovascular failure and the severity of the underlying disease on 28-day mortality. RESULTS: One hundred and nineteen patients were included. Cardiac index (CI) (P = 0.01) and cardiac power index (CPI) (P = 0.03) were the only hemodynamic variables separately associated with mortality. The hourly time integral of CI drops <3, 2.75 (both P = 0.02) and 2.5 (P = 0.03) L/min/m2 was associated with death but not that of CI drops <2 L/min/m2 or lower thresholds (all P > 0.05). The hourly time integral of CPI drops <0.5-0.8 W/m2 (all P = 0.04) was associated with 28-day mortality but not that of CPI drops <0.4 W/m2 or lower thresholds (all P > 0.05).
CONCLUSIONS: During the first 24 hours after intensive care unit admission, CI and CPI are the most important hemodynamic variables separately associated with 28-day mortality in patients with cardiogenic shock. A CI of 3 L/min/m2 and a CPI of 0.8 W/m2 were most predictive of 28-day mortality. Since our results must be considered hypothesis-generating, randomized controlled trials are required to evaluate whether targeting these levels as early resuscitation endpoints can improve mortality in cardiogenic shock.
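The "hourly variable time integral of drops" below a threshold can be pictured as accumulating, hour by hour, how far a reading falls short of the target. A minimal sketch, assuming hourly cardiac index (CI) readings and not the authors' actual code:

```python
# Illustrative sketch of an hourly time integral of drops below a
# threshold: for hourly CI readings, sum the deficit below the
# threshold across hours, giving exposure in (L/min/m2) x h below
# target. Data and function name are hypothetical.
def time_integral_below(hourly_values, threshold):
    """Sum of (threshold - value) over hours where value < threshold."""
    return sum(threshold - v for v in hourly_values if v < threshold)

# Example with made-up 24-hour CI data and the 3 L/min/m2 threshold
ci = [3.2, 2.8, 2.5, 3.1] + [3.0] * 20
deficit = time_integral_below(ci, 3.0)  # (3.0-2.8) + (3.0-2.5)
```

A larger deficit means more time spent further below the target, which is the exposure the study relates to 28-day mortality.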


Dog bites in humans are a complex problem, involving both public health and animal welfare. The primary aim of this study was to examine primary and secondary presentations related to dog bite injuries in adults. Methods. We retrospectively assessed all adult patients admitted with a dog bite injury to the Emergency Department of Bern University Hospital. Results. A total of 431 patients were eligible for the study. Forty-nine (11.4%) of all patients were admitted with secondary presentations. Bites to the hands were most common (177, 41.1%). All patients (47, 100%) with secondary presentations were admitted because of signs of infection. The median time since the dog bite was 3.8 days (SD 3.9, range 1–21). Thirty-one patients had already been treated with antibiotics; coamoxicillin was the most common primary antibiotic therapy (27/47 patients, 57.4%). Patients with injuries to the hand were at increased risk of secondary presentations (OR 2.08, 95% CI 1.21–3.55, p < 0.006). Conclusion. Dog bite injuries to the hands are a major problem. They often lead to infectious complications. Immediate antibiotic therapy should be carefully evaluated for each patient.


The purpose of this study was to examine the success rate of paramedian palatal Orthosystem first- and second-generation implants used for anchorage in orthodontic treatment in patients treated by one experienced orthodontist. The records of 143 patients (90 female, 53 male; median age: 15.7 years, range: 10.2-50.9) receiving 145 palatal implants of the first or second generation (Orthosystem, Straumann AG, Basel, Switzerland) were examined. All the palatal implants were placed in a paramedian palatal location by three experienced surgeons. Stable implants were orthodontically loaded after a healing period of 3 months. Of the 145 inserted paramedian palatal implants, only seven (4.8%) were not considered stable after insertion. All the successfully osseointegrated implants remained stable during orthodontic treatment. Paramedian palatal implants are highly reliable and effective devices for obtaining skeletal anchorage for orthodontic treatment. This study has shown that the paramedian location is a good alternative to the median location.


BACKGROUND Assessment of the proportion of patients with well controlled cardiovascular risk factors underestimates the proportion of patients receiving high quality of care. Evaluating whether physicians respond appropriately to poor risk factor control gives a different picture of quality of care. We assessed physician response to poor cardiovascular risk factor control, as well as markers of potential overtreatment, in Switzerland, a country with universal healthcare coverage but without systematic quality monitoring, annual report cards on quality of care or financial incentives to improve quality. METHODS We performed a retrospective cohort study of 1002 randomly selected patients aged 50-80 years from four university primary care settings in Switzerland. For hypertension, dyslipidemia and diabetes mellitus, we first measured proportions in control, then assessed therapy modifications among those in poor control. "Appropriate clinical action" was defined as a therapy modification, or a return to control without therapy modification, within 12 months among patients with baseline poor control. Potential overtreatment of these conditions was defined as intensive treatment among low-risk patients with optimal target values. RESULTS 20% of patients with hypertension, 41% with dyslipidemia and 36% with diabetes mellitus were in control at baseline. When appropriate clinical action in response to poor control was integrated into measuring quality of care, 52 to 55% had appropriate quality of care. Over 12 months, therapy was modified in 61% of patients with baseline poor control for hypertension, 33% for dyslipidemia, and 85% for diabetes mellitus. Increases in the number of drug classes (28-51%) and in drug doses (10-61%) were the most common therapy modifications. Patients with target organ damage and higher baseline values were more likely to receive appropriate clinical action. We found low rates of potential overtreatment: 2% for hypertension, 3% for diabetes mellitus and 3-6% for dyslipidemia. CONCLUSIONS In primary care, evaluating whether physicians respond appropriately to poor risk factor control, in addition to assessing proportions in control, provides a broader view of the quality of care than relying solely on measures of proportions in control. Such measures could be more clinically relevant and acceptable to physicians than simply reporting levels of control.
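The "appropriate clinical action" definition above is a simple classification rule. A minimal sketch, assuming per-patient boolean flags (the function names are hypothetical, not the study's code):

```python
# Sketch of the "appropriate clinical action" measure: among patients
# with poor control at baseline, care counts as appropriate if therapy
# was modified within 12 months, or the risk factor returned to
# control without modification.
def appropriate_action(modified_within_12m, returned_to_control_12m):
    return modified_within_12m or returned_to_control_12m

# Share of appropriate care in a cohort of baseline poor-control
# patients, each given as (modified, returned_to_control)
def appropriate_share(patients):
    flags = [appropriate_action(m, r) for (m, r) in patients]
    return sum(flags) / len(flags)
```

Folding this rule into the quality measure is what moves the study's headline figures from 20-41% "in control" to 52-55% "appropriate quality of care".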


Cisplatin, a major antineoplastic drug used in the treatment of solid tumors, is a known nephrotoxin. This retrospective cohort study evaluated the prevalence and severity of cisplatin nephrotoxicity in 54 children and its impact on height and weight. We recorded the weight, height, serum creatinine, and electrolytes in each cisplatin cycle and after 12 months of treatment. Nephrotoxicity was graded as follows: normal renal function (Grade 0); asymptomatic electrolyte disorders, including an increase in serum creatinine up to 1.5 times the baseline value (Grade 1); need for electrolyte supplementation for <3 months and/or an increase in serum creatinine of 1.5 to 1.9 times baseline (Grade 2); increase in serum creatinine of 2 to 2.9 times baseline or need for electrolyte supplementation for more than 3 months after treatment completion (Grade 3); and increase in serum creatinine ≥3 times baseline or renal replacement therapy (Grade 4). Nephrotoxicity was observed in 41 subjects (75.9%). Grade 1 nephrotoxicity was observed in 18 patients (33.3%), Grade 2 in 5 patients (9.2%), and Grade 3 in 18 patients (33.3%). None had Grade 4 nephrotoxicity. Patients with nephrotoxicity were younger and received a higher cisplatin dose; they also had impaired longitudinal growth, manifested as a statistically significant worsening of the height Z score at 12 months after treatment. We used a multiple logistic regression model with the delta of the height Z score (baseline to 12 months) as the dependent variable in order to adjust for the main confounding variables: germ cell tumor, cisplatin total dose, serum magnesium levels at 12 months, gender, and nephrotoxicity grade. Patients with Grade 1 nephrotoxicity were at higher risk of not growing (OR 5.1, 95% CI 1.07-24.3, P = 0.04). The cisplatin total dose had a significant negative relationship with magnesium levels at 12 months (Spearman r = -0.527, P < 0.001).
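The grading scheme above maps cleanly onto a decision ladder, checked from the most severe grade down. A sketch using the abstract's thresholds (the helper itself is illustrative, not the authors' code):

```python
# Hypothetical helper reproducing the nephrotoxicity grading described
# in the abstract. creatinine_ratio is serum creatinine divided by its
# baseline value; supplementation_months is the duration of electrolyte
# supplementation after treatment completion.
def nephrotoxicity_grade(creatinine_ratio, supplementation_months=0,
                         electrolyte_disorder=False,
                         renal_replacement=False):
    if renal_replacement or creatinine_ratio >= 3.0:
        return 4  # Grade 4: >=3x baseline or renal replacement therapy
    if creatinine_ratio >= 2.0 or supplementation_months > 3:
        return 3  # Grade 3: 2-2.9x baseline or supplementation >3 months
    if creatinine_ratio >= 1.5 or supplementation_months > 0:
        return 2  # Grade 2: 1.5-1.9x baseline and/or supplementation <3 months
    if electrolyte_disorder or creatinine_ratio > 1.0:
        return 1  # Grade 1: asymptomatic disorders, up to 1.5x baseline
    return 0      # Grade 0: normal renal function
```

Checking severe criteria first means a patient meeting several criteria is assigned the highest applicable grade, which matches how such ordinal toxicity scales are usually applied.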


INTRODUCTION The incidence of cancer increases with age and, owing to changing demographics, we are increasingly confronted with treating bladder cancer in old patients. We report our results in patients >75 years of age who underwent open radical cystectomy (RC) and urinary diversion. MATERIAL AND METHODS From January 2000 to March 2013, a consecutive series of 224 old patients with complete follow-up who underwent RC and urinary diversion (ileal orthotopic bladder substitute [OBS], ileal conduit [IC], or ureterocutaneostomy [UCST]) were included in this retrospective single-center study. End points were the 90-day complication rates (Clavien-Dindo classification), 90-day mortality rates, overall and cancer-specific survival rates, and continence rates (OBS). RESULTS Median age was 79.2 years (range: 75.1-91.6); 35 of the 224 patients (17%) received an OBS, 178 of the 224 patients (78%) an IC, and 11 of the 224 patients (5%) a UCST. The 90-day complication rate was 54.3% in the OBS group (major: Clavien grade 3-5: 22.9%, minor: Clavien grade 1-2: 31.4%), 56.7% in the IC group (major: 27%, minor: 29.8%), and 63.6% in the UCST group (major: 36.4%, minor: 27.3%); P = 0.001. The 90-day mortality was 0% in the OBS group, 13% in the IC group, and 10% in the UCST group (P = 0.077). The Glasgow prognostic score was an independent predictor of all survival parameters assessed, including 90-day mortality. Median follow-up was 22 months. Overall and cancer-specific survival were 90 and 98, 47 and 91, and 11 and 12 months for OBS, IC, and UCST, respectively. In OBS patients, daytime continence was considered dry in 66% and humid in 20% of patients. Nighttime continence was dry in 46% and humid in 26% of patients. CONCLUSION With careful patient selection, oncological and functional outcome after RC can be good in old patients. Old age as the sole criterion should not preclude the indication for RC or the option of OBS. In old patients undergoing OBS, satisfactory continence results can be achieved.


OBJECTIVES Randomized clinical trials that enroll patients in critical or emergency care (acute care) setting are challenging because of narrow time windows for recruitment and the inability of many patients to provide informed consent. To assess the extent that recruitment challenges lead to randomized clinical trial discontinuation, we compared the discontinuation of acute care and nonacute care randomized clinical trials. DESIGN Retrospective cohort of 894 randomized clinical trials approved by six institutional review boards in Switzerland, Germany, and Canada between 2000 and 2003. SETTING Randomized clinical trials involving patients in an acute or nonacute care setting. SUBJECTS AND INTERVENTIONS We recorded trial characteristics, self-reported trial discontinuation, and self-reported reasons for discontinuation from protocols, corresponding publications, institutional review board files, and a survey of investigators. MEASUREMENTS AND MAIN RESULTS Of 894 randomized clinical trials, 64 (7%) were acute care randomized clinical trials (29 critical care and 35 emergency care). Compared with the 830 nonacute care randomized clinical trials, acute care randomized clinical trials were more frequently discontinued (28 of 64, 44% vs 221 of 830, 27%; p = 0.004). Slow recruitment was the most frequent reason for discontinuation, both in acute care (13 of 64, 20%) and in nonacute care randomized clinical trials (7 of 64, 11%). Logistic regression analyses suggested the acute care setting as an independent risk factor for randomized clinical trial discontinuation specifically as a result of slow recruitment (odds ratio, 4.00; 95% CI, 1.72-9.31) after adjusting for other established risk factors, including nonindustry sponsorship and small sample size. 
CONCLUSIONS Acute care randomized clinical trials are more vulnerable to premature discontinuation than nonacute care randomized clinical trials and have an approximately four-fold higher risk of discontinuation due to slow recruitment. These results highlight the need for strategies to reliably prevent and resolve slow patient recruitment in randomized clinical trials conducted in the critical and emergency care setting.
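As a back-of-the-envelope check on the discontinuation comparison, the crude odds ratio can be computed directly from the counts reported above (28/64 acute vs 221/830 nonacute). Note this is illustrative only: the paper's OR of 4.00 refers specifically to discontinuation due to slow recruitment and is adjusted for other risk factors, so it differs from this crude estimate.

```python
# Crude odds ratio from a 2x2 table: odds of the event in group A
# divided by odds of the event in group B. Not the paper's adjusted
# logistic regression estimate.
def odds_ratio(events_a, total_a, events_b, total_b):
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# 28/64 acute care vs 221/830 nonacute care trials discontinued
crude_or = odds_ratio(28, 64, 221, 830)  # roughly 2.1
```

The gap between this crude estimate and the adjusted 4.00 illustrates why the authors adjusted for sponsorship and sample size, and why the adjusted figure applies to slow-recruitment discontinuation specifically.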


BACKGROUND Drug resistance is a major barrier to successful antiretroviral treatment (ART). Therefore, it is important to monitor time trends at a population level. METHODS We included 11,084 ART-experienced patients from the Swiss HIV Cohort Study (SHCS) between 1999 and 2013. The SHCS is highly representative and includes 72% of patients receiving ART in Switzerland. Drug resistance was defined as the presence of at least one major mutation in a genotypic resistance test. To estimate the prevalence of drug resistance, data for patients with no resistance test were imputed based on each patient's risk of harboring drug-resistant viruses. RESULTS The emergence of new drug resistance mutations declined dramatically, from 401 to 23 patients between 1999 and 2013. The upper estimated prevalence limit of drug resistance among ART-experienced patients decreased from 57.0% in 1999 to 37.1% in 2013. The prevalence of three-class resistance decreased from 9.0% to 4.4% and was always <0.4% for patients who initiated ART after 2006. Most patients actively participating in the SHCS in 2013 with drug-resistant viruses initiated ART before 1999 (59.8%). Nevertheless, in 2013, 94.5% of patients who initiated ART before 1999 had good remaining treatment options based on the Stanford algorithm. CONCLUSION HIV-1 drug resistance among ART-experienced patients in Switzerland is a well-controlled relic from the pre-combination ART era. Emergence of drug resistance can be virtually stopped with new potent therapies and close monitoring.