Abstract:
Hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) is a nitramine compound that has been used heavily by the military as an explosive. Manufacturing, use, and disposal of RDX have led to several contamination sites across the United States. RDX is both persistent in the environment and a threat to human health, making its remediation vital. The use of plants to extract RDX from the soil and metabolize it once it is in the plant tissue is being considered as a possible solution. In the present study, the tropical grass Chrysopogon zizanioides (vetiver) was grown hydroponically in the presence of RDX at three concentration levels: 0.3, 1.1, and 2.26 ppm. The uptake of RDX was quantified by high-performance liquid chromatography (HPLC) analysis of media samples taken every 6 hr during the first 24 hr and then daily over a 30-day experimental period. A rapid decrease in RDX concentration in the media of both controls and plant treatments was seen within the first 18 hours of the experiment, with the greatest loss of RDX over time occurring within the first 6 hours of exposure. The loss was similar in both controls and plant exposures and was possibly attributable to rapid uptake by the containers. A plant from one treatment at each of the three concentrations was harvested on Days 10, 20, and 30 of the experiment and extracted to determine the localization of RDX within the tissue and potentially identify any metabolites on the basis of differing retention times. Of the treatments containing 0.3, 1.1, and 2.26 ppm RDX, 13.1%, 18.3%, and 24.2%, respectively, was quantified in vetiver extracts, with the majority of the RDX localized to the roots. All remaining plants were harvested on Day 30 of the experiment. A total of three plants exposed to each concentration level, as well as the control, were extracted and analyzed with HPLC to determine the amount of RDX taken up, the localization of RDX within the plant tissue, and potentially identify any metabolites.
Phytotoxicity of RDX to vetiver was also monitored. While a loss in biomass was observed in plants exposed to all the different concentrations of RDX, control plants grown in media without RDX showed the greatest biomass loss of all the treatments. There was also little variation in chlorophyll content between the different RDX concentration treatments. This preliminary greenhouse study of RDX uptake by Chrysopogon zizanioides will help indicate the potential of vetiver to serve as a plant system in the phytoremediation of RDX.
Abstract:
OBJECTIVE: Treatment of central and paracentral pulmonary embolism in patients with hemodynamic compromise remains a subject of debate, and no consensus exists regarding the best method: thrombolytic agents, catheter-based thrombus aspiration or fragmentation, or surgical embolectomy. We reviewed our experience with emergency surgical pulmonary embolectomy. METHODS: Between January 2000 and March 2007, 25 patients (17 male; mean age, 60 years) underwent emergency open embolectomy for central and paracentral pulmonary embolism. Eighteen patients presented in cardiogenic shock, 8 of whom had cardiac arrest and required cardiopulmonary resuscitation. All patients underwent operation with mild hypothermic cardiopulmonary bypass. Concomitant procedures were performed in 8 patients (3 coronary artery bypass grafts, 2 patent foramen ovale closures, 4 ligations of the left atrial appendage, 3 removals of a right atrial thrombus). Follow-up was 96% complete, with a median of 2 years (range, 2 months to 6 years). RESULTS: All patients survived the procedure, but 2 patients died in the hospital on postoperative days 1 (intracerebral bleeding) and 11 (multiorgan failure), accounting for a 30-day mortality of 8% (95% confidence interval: 0.98%-26%). Four patients died later of their underlying disease. Pre- and postoperative echocardiographic pressure measurements demonstrated a reduction of the pulmonary hypertension to half of the systemic pressure values or less. CONCLUSION: Surgical pulmonary embolectomy is an excellent option for patients with major pulmonary embolism and can be performed with minimal mortality and morbidity. Even patients who present with cardiac arrest and require preoperative cardiopulmonary resuscitation show satisfactory results.
Immediate surgical relief of the obstruction favorably influences pulmonary pressure and the recovery of right ventricular function, and remains the treatment of choice for patients with massive central and paracentral embolism with hemodynamic and respiratory compromise.
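The 30-day mortality above corresponds to 2 deaths among 25 patients, and the reported interval is consistent with an exact (Clopper-Pearson) binomial 95% CI of roughly 1% to 26%. A minimal standard-library sketch of that calculation (the function names and bisection scheme are illustrative, not from the paper):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided CI for a binomial proportion, found by bisection."""
    def solve(keep_low, lo=0.0, hi=1.0):
        for _ in range(100):                     # bisect to high precision
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if keep_low(mid) else (lo, mid)
        return (lo + hi) / 2
    # lower limit: p at which P(X >= k) equals alpha/2
    lower = 0.0 if k == 0 else solve(lambda p: 1 - binom_cdf(k - 1, n, p) < alpha / 2)
    # upper limit: p at which P(X <= k) equals alpha/2
    upper = 1.0 if k == n else solve(lambda p: binom_cdf(k, n, p) > alpha / 2)
    return lower, upper

lo, hi = clopper_pearson(2, 25)
print(f"30-day mortality 2/25 = {2/25:.0%}, 95% CI {lo:.2%} to {hi:.2%}")
```

An exact interval is the right choice here: with only 2 events, the normal approximation to the binomial is unreliable.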
Abstract:
The toxicity of long-term immunosuppressive therapy has become a major concern in the long-term follow-up of heart transplant recipients. In this respect, the quality of renal function is undoubtedly linked to cyclosporin A (CsA) drug levels. In cardiac transplantation, specific CsA trough levels have historically been maintained between 250 and 350 micrograms/L in many centers, without direct evidence for the necessity of such high levels while using triple-drug immunosuppression. This retrospective analysis compares the incidence of acute and chronic graft rejection, as well as overall mortality, between groups of patients with high (250 to 350 micrograms/L) and low (150 to 250 micrograms/L) specific CsA trough levels. A total of 332 patients who underwent heart transplantation between October 1985 and October 1992 with a minimum follow-up of 30 days were included in this study (46 women and 276 men; age, 44 +/- 12 years; mean follow-up, 1,122 +/- 777 days). Standard triple-drug immunosuppression included first-year specific CsA target trough levels of 250 to 300 micrograms/L. Patients were grouped according to their average creatinine level in the first postoperative year (group I, < 130 mumol/L, n = 234; group II, > or = 130 mumol/L, n = 98). The overall 5-year survival excluding the early 30-day mortality was 92% (group I, 216/232) and 91% (group II, 89/98), with 75% of the mortality due to chronic rejection. The rate of rejection for the entire follow-up period was similar in both groups (first year: group I, 3.2 +/- 2.6 rejections/patient/year; group II, 3.6 +/- 2.7 rejections/patient/year; p = not significant).
Abstract:
BACKGROUND: Transcatheter aortic valve implantation (TAVI) for high-risk and inoperable patients with severe aortic stenosis is an emerging procedure in cardiovascular medicine. Little is known of the impact of TAVI on renal function. METHODS: We retrospectively analysed renal baseline characteristics and outcomes in 58 patients, including 2 patients on chronic haemodialysis, undergoing TAVI at our institution. Acute kidney injury (AKI) was defined according to the RIFLE classification. RESULTS: Fifty-eight patients with severe symptomatic aortic stenosis not considered suitable for conventional surgical valve replacement, with a mean age of 83 +/- 5 years, underwent TAVI. Two patients died during transfemoral valve implantation and two more in the first month after TAVI, resulting in a 30-day mortality of 6.9%. Vascular access was transfemoral in 46 patients and transapical in 12. Estimated glomerular filtration rate (eGFR) increased in 30 patients (56%). Fifteen patients (28%) developed AKI, of whom four had to be dialysed temporarily and one remained on chronic renal replacement therapy. Risk factors for AKI comprised, among others, transapical access, number of blood transfusions, postinterventional thrombocytopenia and systemic inflammatory response syndrome (SIRS). CONCLUSIONS: TAVI is feasible in patients with a high burden of comorbidities and in patients with pre-existing end-stage renal disease who would otherwise not be considered candidates for conventional aortic valve replacement. Although eGFR improved in more than half of the patients, this benefit was associated with a risk of postinterventional AKI. Future investigations should define preventive measures against peri-procedural kidney injury.
Abstract:
BACKGROUND: A complete remission is essential for prolonging survival in patients with acute myeloid leukemia (AML). Daunorubicin is a cornerstone of the induction regimen, but the optimal dose is unknown. In older patients, it is usual to give daunorubicin at a dose of 45 to 50 mg per square meter of body-surface area. METHODS: Patients in whom AML or high-risk refractory anemia had been newly diagnosed and who were 60 to 83 years of age (median, 67) were randomly assigned to receive cytarabine, at a dose of 200 mg per square meter by continuous infusion for 7 days, plus daunorubicin for 3 days, either at the conventional dose of 45 mg per square meter (411 patients) or at an escalated dose of 90 mg per square meter (402 patients); this treatment was followed by a second cycle of cytarabine at a dose of 1000 mg per square meter every 12 hours [DOSAGE ERROR CORRECTED] for 6 days. The primary end point was event-free survival. RESULTS: The complete remission rates were 64% in the group that received the escalated dose of daunorubicin and 54% in the group that received the conventional dose (P=0.002); the rates of remission after the first cycle of induction treatment were 52% and 35%, respectively (P<0.001). There was no significant difference between the two groups in the incidence of hematologic toxic effects, 30-day mortality (11% and 12% in the two groups, respectively), or the incidence of moderate, severe, or life-threatening adverse events (P=0.08). Survival end points in the two groups did not differ significantly overall, but patients in the escalated-treatment group who were 60 to 65 years of age, as compared with the patients in the same age group who received the conventional dose, had higher rates of complete remission (73% vs. 51%), event-free survival (29% vs. 14%), and overall survival (38% vs. 23%). 
CONCLUSIONS: In patients with AML who are older than 60 years of age, escalation of the dose of daunorubicin to twice the conventional dose, with the entire dose administered in the first induction cycle, effects a more rapid response and a higher response rate than does the conventional dose, without additional toxic effects. (Current Controlled Trials number, ISRCTN77039377; and Netherlands National Trial Register number, NTR212.)
Abstract:
BACKGROUND: Splenic involvement in amyloidosis is rather frequent (5-10%). An atraumatic rupture of the affected spleen is, however, an extremely rare event. We report on a patient with undiagnosed amyloidosis who underwent emergency splenectomy for atraumatic splenic rupture. METHODS: Review of the literature and identification of 31 patients, including our own case report, with atraumatic splenic rupture in amyloidosis. Analysis of the clinical presentation, the surgical management, the nomenclature and the definition of predisposing factors for splenic rupture. RESULTS: We identified 15 women and 16 men (mean age 53.3 +/- 12.4 years; median 52; range 27-82 years) with an atraumatic splenic rupture. Easy skin bruisability and factor X deficiency were detected in four (13%) and five patients (16%), respectively. The diagnosis of splenic rupture was made by computed tomography (n = 12), ultrasound (n = 5), exploratory laparotomy (n = 9) or autopsy (n = 4). All patients underwent surgery (n = 27) or autopsy (n = 4). Amyloidosis had been previously diagnosed in nine patients (29%). In the remaining 22 patients (71%), the atraumatic splenic rupture was the initial manifestation of amyloidosis. Twenty-five patients (81%) suffered from primary (AL) and four patients (13%) from secondary (AA) amyloidosis. In two patients, the type of amyloidosis was not specified. Moderate splenomegaly was a common feature (68%), and the characteristic intraoperative finding was an extended subcapsular hematoma with a limited parenchymal laceration (65%). In five patients with known amyloidosis (16%), the atraumatic splenic rupture was closely associated with autologous stem-cell transplantation (ASCT). Three patients were suffering from multiple myeloma (10%). Biopsy-proven amyloidotic liver involvement was present in 14 patients (45%), which led to atraumatic liver rupture in two patients. The splenic rupture-related 30-day mortality was 26% (8/31).
CONCLUSIONS: Atraumatic splenic rupture in amyloidosis is associated with a high 30-day mortality. It occurs predominantly in patients with previously undiagnosed amyloidosis. A moderate splenomegaly, coagulation abnormalities (easy skin bruisability, factor X deficiency) and treatment of amyloidosis with ASCT are considered predisposing factors for an atraumatic splenic rupture.
Abstract:
BACKGROUND: Stage IIIB non-small-cell lung cancer (NSCLC) is usually thought to be unresectable and is managed with chemotherapy, with or without radiotherapy. However, selected patients might benefit from surgical resection after neoadjuvant chemotherapy and radiotherapy. The aim of this multicentre, phase II trial was to assess the efficacy and toxicity of neoadjuvant chemotherapy and radiotherapy followed by surgery in patients with technically operable stage IIIB NSCLC. METHODS: Between September, 2001, and May, 2006, patients with pathologically proven and technically resectable stage IIIB NSCLC were sequentially treated with three cycles of neoadjuvant chemotherapy (cisplatin with docetaxel), immediately followed by accelerated concomitant boost radiotherapy (44 Gy in 22 fractions) and definitive surgery. The primary endpoint was event-free survival at 12 months. Efficacy analyses were done by intention to treat. This trial is registered with ClinicalTrials.gov, number NCT00030810. FINDINGS: 46 patients were enrolled, with a median age of 60 years (range 28-70); 13 (28%) patients had N3 disease and 36 (78%) had T4 disease. All patients received chemotherapy; 35 (76%) patients received radiotherapy. The main toxicities during chemotherapy were neutropenia (25 patients [54%] at grade 3 or 4) and febrile neutropenia (nine [20%]); the main toxicity after radiotherapy was oesophagitis (ten patients [29%]; nine grade 2, one grade 3). 35 patients (76%) underwent surgery, including pneumonectomy in 17 patients. A complete (R0) resection was achieved in 27 patients. Peri-operative complications occurred in 14 patients, including two deaths (30-day mortality 5.7%). Seven patients required a second surgical intervention. Pathological mediastinal downstaging was seen in 11 of the 28 patients who had lymph-node involvement at enrolment, and a complete pathological response was seen in six patients. Event-free survival at 12 months was 54% (95% CI 39-67).
After a median follow-up of 58 months, the median overall survival was 29 months (95% CI 16.1-NA), with survival at 1, 3, and 5 years of 67% (95% CI 52-79), 47% (32-61), and 40% (24-55). INTERPRETATION: A treatment strategy of neoadjuvant chemotherapy and radiotherapy followed by surgery is feasible in selected patients. Toxicity is considerable, but manageable. Survival compares favourably with historical results of combined treatment for less advanced stage IIIA disease. FUNDING: Swiss Group for Clinical Cancer Research (SAKK) and an unrestricted educational grant by Sanofi-Aventis (Switzerland).
Abstract:
The heifer development project was a five-year project conducted on the site of the former Jackson County Farm north of Andrew, Iowa, for four years and on an area producer's farm for the fifth year. Heifers arrived around December 1 each year, and the average number of heifers each year was 43, with a low of 37 and a high of 47. After a 30+ day warm-up period, the heifers were put on a 112-day test from early January to late April. They were fed a shelled corn and legume-grass hay ration consisting of between 13% and 14% crude protein and a range of .44 to .58 megacal/pound of NEg over the five years. During the 112-day test, heifers gained 1.86, 1.78, 1.5, 1.63 and 2.2 pounds per day, respectively, for the years 1992 through 1996. The actual average breeding weight was less than the target weight in three years by 5, 12 and 22 pounds and exceeded the target weight in two years by 17 and 28 pounds. Estrus synchronization used a combination of MGA feeding and Lutalyse injection. Heifers were heat detected and bred 12 hours later for a three-day period. On the fourth day, all heifers not bred were mass inseminated. Heifers then ran with the cleanup bull for 58 days. The average synchronization response rate during the project was 79%. The overall pregnancy rates based on September pregnancy exams averaged 92%. The five-year average total cost per head for heifer development was $286.18, or about $.85 per day. Feed and pasture costs averaged 61% of the total costs.
Abstract:
The heifer development project took place over the past four years on the site of the former Jackson County Farm north of Andrew, Iowa. Heifers arrived around December 1, with 38 heifers delivered for 1992, 44 for 1993, 46 for 1994, and 47 for 1995. After a 30+ day warm-up period, the heifers were put on a 112-day test from early January to late April. They were fed a shelled corn and legume-grass hay ration consisting of between 13% and 14% crude protein and .48, .58, .44, and .54 megacal/pound of NEg, respectively, for the years 1992-1995. During the 112-day test, heifers gained 1.86, 1.78, 1.5, and 1.63 pounds per day, respectively, for the years 1992 through 1995. The 1995 heifers averaged 853 pounds at breeding (22 pounds under target weight). This compares with previous years, in which the breeding weight was less than the target weight in two years by 5 and 12 pounds and exceeded the target weight in one year by 17 pounds. Estrus synchronization used a combination of MGA feeding and Lutalyse injection. Heifers were heat detected and bred 12 hours later for a three-day period. On the fourth day, all heifers not bred were mass inseminated. Heifers then ran with the cleanup bull for 58 days. The synchronization response rate in 1995 was 83%, which compares with the previous three-year average of 77%. The overall pregnancy rates based on September pregnancy exams were 94.6% in 1992, 93% in 1993, 91% in 1994, and 91.5% in 1995. Development costs for the 326 days in 1995 totaled $269.14 per heifer. This compares with the average of $286.92 for the three previous years. The four-year average total cost per head for heifer development was $282.48, or about $.84 per day. Feed and pasture costs represented 58% of the total costs, or $.49 per day.
Abstract:
Respiratory disease resulting from infection of calves with Haemophilus somnus (H. somnus) is an annual occurrence in fall calves at the McNay Farm. Previous observations of skin test reactivity to H. somnus antigens suggested a role for this phenomenon in the pathogenesis of the disease. Groups of calves, about 90 days of age, were vaccinated with four different commercial H. somnus vaccines, and serum levels of H. somnus antibodies were determined. Antibodies of the IgG and IgE classes were detected with ELISA procedures conducted on sera collected before and after vaccination. Most of the calves had detectable H. somnus IgE-class antibodies at the start of the experiment, but IgG-class antibodies were minimal. Antibodies of both classes increased in nonvaccinated and vaccinated calves during the 30-day period of experimentation. However, the level of IgE-class antibodies in vaccinates was lower than in controls, suggesting that vaccination may limit the IgE response.
Abstract:
INTRODUCTION Low systolic blood pressure (SBP) is an important secondary insult following traumatic brain injury (TBI), but its exact relationship with outcome is not well characterised. Although an SBP of <90mmHg represents the threshold for hypotension in consensus TBI treatment guidelines, recent studies suggest redefining hypotension at higher levels. This study therefore aimed to fully characterise the association between admission SBP and mortality to further inform resuscitation endpoints. METHODS We conducted a multicentre cohort study using data from the largest European trauma registry. Consecutive adult patients with AIS head scores >2 admitted directly to specialist neuroscience centres between 2005 and July 2012 were studied. Multilevel logistic regression models were developed to examine the association between admission SBP and 30-day inpatient mortality. Models were adjusted for confounders, including age and severity of injury, and accounted for differential quality of hospital care. RESULTS 5057 patients were included in complete case analyses. Admission SBP demonstrated a smooth U-shaped association with outcome in a bivariate analysis, with increasing mortality at both lower and higher values and no evidence of any threshold effect. Adjusting for confounding slightly attenuated the association between mortality and SBP at levels <120mmHg, and abolished the relationship for higher SBP values. Case-mix-adjusted odds of death were 1.5 times greater at <120mmHg, doubled at <100mmHg, tripled at <90mmHg, and six times greater at SBP<70mmHg, p<0.01. CONCLUSIONS These findings indicate that TBI studies should model SBP as a continuous variable and may suggest that current TBI treatment guidelines, which use a cut-off for hypotension at SBP<90mmHg, should be reconsidered.
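The continuous U-shaped SBP-mortality relationship described above can be modelled by entering SBP into a logistic regression with a quadratic term rather than dichotomising at a threshold. A sketch on synthetic data (the cohort size, coefficients and the 130mmHg nadir are invented for illustration; the study itself used multilevel models with confounder adjustment):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cohort (illustrative only, not the registry data):
# mortality risk is U-shaped in admission SBP with a nadir near 130 mmHg.
n = 20_000
sbp = rng.normal(130, 25, n)
true_logit = -3.0 + 0.0008 * (sbp - 130) ** 2
died = rng.random(n) < 1 / (1 + np.exp(-true_logit))

# Model SBP as a continuous variable with a quadratic term,
# instead of a binary hypotension cut-off.
x = (sbp - 130) / 25                       # centred and scaled
X = np.column_stack([np.ones(n), x, x * x])

beta = np.zeros(3)
for _ in range(25):                        # Newton-Raphson for the MLE
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    grad = X.T @ (died - p)
    hess = (X * W[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)

print("quadratic (curvature) coefficient:", beta[2])  # > 0 implies a U-shape
```

A positive quadratic coefficient means predicted mortality rises on both sides of the nadir, reproducing the U-shape without imposing any threshold.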
Abstract:
BACKGROUND Overlapping first-generation sirolimus- and paclitaxel-eluting stents are associated with persistent inflammation, fibrin deposition and delayed endothelialisation in preclinical models, and with adverse angiographic and clinical outcomes--including death and myocardial infarction (MI)--in clinical studies. OBJECTIVES To establish whether there are any safety concerns with newer-generation drug-eluting stents (DES). DESIGN Propensity score adjustment of baseline anatomical and clinical characteristics was used to compare clinical outcomes (Kaplan-Meier estimates) between patients implanted with overlapping DES (Resolute zotarolimus-eluting stent (R-ZES) or R-ZES/other DES) and patients without overlapping DES. Additionally, angiographic outcomes for overlapping R-ZES and everolimus-eluting stents were evaluated in the randomised RESOLUTE All-Comers Trial. SETTING Patient-level data from five controlled studies of the RESOLUTE Global Clinical Program evaluating the R-ZES were pooled. Enrollment criteria were generally unrestrictive. PATIENTS 5130 patients. MAIN OUTCOME MEASURES 2-year clinical outcomes and 13-month angiographic outcomes. RESULTS 644 of 5130 patients (12.6%) in the RESOLUTE Global Clinical Program underwent overlapping DES implantation. Implantation of overlapping DES was associated with an increased frequency of MI and more complex/calcified lesion types at baseline. Adjusted in-hospital, 30-day and 2-year clinical outcomes indicated comparable cardiac death (2-year overlap vs non-overlap: 3.0% vs 2.1%, p=0.36), major adverse cardiac events (13.3% vs 10.7%, p=0.19), target-vessel MI (3.9% vs 3.4%, p=0.40), clinically driven target-vessel revascularisation (7.7% vs 6.5%, p=0.32), and definite/probable stent thrombosis (1.4% vs 0.9%, p=0.28). 13-month adjusted angiographic outcomes were comparable between overlapping and non-overlapping DES.
CONCLUSIONS Overlapping newer generation DES are safe and effective, with comparable angiographic and clinical outcomes--including repeat revascularisation--to non-overlapping DES.
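The clinical outcomes above are reported as Kaplan-Meier estimates. For reference, the product-limit estimator can be sketched in a few lines of plain Python; the follow-up times and censoring flags below are invented toy data, not from the RESOLUTE program:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  : follow-up time per patient
    events : 1 if the event occurred at that time, 0 if censored
    Returns a list of (time, S(t)) pairs at each distinct event time.
    """
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        if d:                                  # the curve steps only at events
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)   # events + censorings leave
    return curve

# Toy data: 10 patients, months of follow-up
times  = [2, 3, 3, 5, 8, 8, 12, 16, 21, 24]
events = [1, 1, 0, 1, 0, 1, 0, 1, 0, 0]
for t, s in kaplan_meier(times, events):
    print(f"S({t:2d} months) = {s:.3f}")
```

Censored patients shrink the risk set without stepping the curve, which is what lets estimates from cohorts with uneven follow-up be compared at a common horizon such as 2 years.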
Abstract:
BACKGROUND Acute cardiogenic shock after myocardial infarction is associated with high in-hospital mortality attributable to persisting low cardiac output. The Impella-EUROSHOCK registry evaluates the safety and efficacy of the Impella-2.5 percutaneous left-ventricular assist device in patients with cardiogenic shock after acute myocardial infarction. METHODS AND RESULTS This multicenter registry retrospectively included 120 patients (63.6±12.2 years; 81.7% male) with cardiogenic shock from acute myocardial infarction receiving temporary circulatory support with the Impella-2.5 percutaneous left-ventricular assist device. The primary end point was mortality at 30 days. The secondary end points were the change in plasma lactate after the institution of hemodynamic support, the rate of early major adverse cardiac and cerebrovascular events, and long-term survival. Thirty-day mortality was 64.2% in the study population. After Impella-2.5 implantation, lactate levels decreased from 5.8±5.0 mmol/L to 4.7±5.4 mmol/L (P=0.28) and 2.5±2.6 mmol/L (P=0.023) at 24 and 48 hours, respectively. Early major adverse cardiac and cerebrovascular events were reported in 18 (15%) patients. Major bleeding at the vascular access site, hemolysis, and pericardial tamponade occurred in 34 (28.6%), 9 (7.5%), and 2 (1.7%) patients, respectively. Age >65 years and an admission lactate level >3.8 mmol/L were identified as predictors of 30-day mortality. After 317±526 days of follow-up, survival was 28.3%. CONCLUSIONS In patients with acute cardiogenic shock from acute myocardial infarction, Impella-2.5 treatment is feasible and results in a reduction of lactate levels, suggesting improved organ perfusion. However, 30-day mortality remains high in these patients.
This likely reflects the last-resort character of Impella-2.5-application in selected patients with a poor hemodynamic profile and a greater imminent risk of death. Carefully conducted randomized controlled trials are necessary to evaluate the efficacy of Impella-2.5-support in this high-risk patient group.
Abstract:
The planning of refractive surgical interventions is a challenging task. Numerical modeling has been proposed as a solution to support surgical intervention and predict visual acuity, but validation on patient-specific interventions has been missing. The purpose of this study was to validate numerical predictions of the post-operative corneal topography induced by the incisions required for cataract surgery. The corneal topography of 13 patients was assessed preoperatively and postoperatively (1-day and 30-day follow-up) with a Pentacam tomography device. The preoperatively acquired corneal geometry – anterior, posterior and pachymetry data – was used to build patient-specific finite element models. For each patient, the effects of the cataract incisions were simulated numerically, and the resulting corneal surfaces were compared to the clinical postoperative measurements at the 1-day and 30-day follow-ups. Results showed that the model was able to reproduce the experimental measurements with an error on the surgically induced sphere of 0.38 D one day postoperatively and 0.19 D 30 days postoperatively. The standard deviation of the surgically induced cylinder was 0.54 D at the first postoperative day and 0.38 D 30 days postoperatively. The prediction errors in surface elevation and curvature were below the topography measurement device accuracy of ±5 μm and ±0.25 D at the 30-day follow-up. These results show that finite element simulations of corneal biomechanics can predict the corneal topography after cataract surgery to within the accuracy of the topography measurement device. We conclude that numerical simulation can become a valuable tool for planning corneal incisions in cataract surgery and other ophthalmosurgical procedures, in order to optimize patients' refractive outcome and visual function.
Drug-related emergency department visits by elderly patients presenting with non-specific complaints
Abstract:
BACKGROUND Since drug-related emergency department (ED) visits are common among older adults, the objectives of our study were to identify the frequency of drug-related problems (DRPs) among patients presenting to the ED with non-specific complaints (NSCs), such as generalized weakness, and to evaluate the responsible drug classes. METHODS Delayed-type cross-sectional diagnostic study with a prospective 30-day follow-up in the ED of the University Hospital Basel, Switzerland. From May 2007 until April 2009, all non-trauma patients presenting to the ED with an Emergency Severity Index (ESI) of 2 or 3 were screened, and were included if they presented with non-specific complaints. After complete 30-day follow-up had been obtained, two outcome assessors reviewed all available information, judged whether the initial presentation was a DRP, and compared their judgment with the initial ED diagnosis. Acute morbidity ("serious condition") was allocated to individual cases according to predefined criteria. RESULTS The study population consisted of 633 patients with NSCs. Median age was 81 years (IQR 72-87), and the mean Charlson comorbidity index was 2.5 (IQR 1-4). DRPs were identified in 77 of the 633 cases (12.2%). At the initial assessment, only 40% of the DRPs were correctly identified. 64 of the 77 identified DRPs (83%) fulfilled the criteria for a "serious condition". Polypharmacy and certain drug classes (thiazides, antidepressants, benzodiazepines, anticonvulsants) were associated with DRPs. CONCLUSION Elderly patients with non-specific complaints need to be screened systematically for drug-related problems. TRIAL REGISTRATION ClinicalTrials.gov: NCT00920491.