833 results for Burns and scalds -- Patients -- Rehabilitation. Burns and scalds in children.


Relevance: 100.00%

Abstract:

Background: Patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment, and long, costly hospital stays due to suboptimal initial triage and site-of-care decisions. Accurate ED triage should focus not only on initial treatment priority, but also on prediction of medical risk and nursing needs, to improve site-of-care decisions and to simplify early discharge management. Different triage scores have been proposed, such as the Manchester triage system (MTS). Yet these scores focus only on treatment priority, have suboptimal performance and lack validation in the Swiss health care system. Because the MTS will be introduced into clinical routine at the Kantonsspital Aarau, we propose a large prospective cohort study to optimize initial patient triage. Specifically, the aim of this trial is to derive a three-part triage algorithm to better predict (a) treatment priority; (b) medical risk and thus need for in-hospital treatment; and (c) post-acute care needs of patients at the most proximal time point of ED admission. Methods/design: Prospective, observational, multicenter, multi-national cohort study. We will include all consecutive medical patients seeking ED care in this observational registry. There will be no exclusions except for non-adult and non-medical patients. Vital signs will be recorded, and leftover blood samples will be stored for later batch analysis of blood markers. Upon ED admission, the post-acute care discharge score (PACD) will be recorded. Attending ED physicians will adjudicate triage priority based on all available results at the time of ED discharge to the medical ward. Patients will be reassessed daily during the hospital course for medical stability and readiness for discharge from the nurses' and, if involved, social workers' perspectives. To assess outcomes, data from electronic medical records will be used, and all patients will be contacted 30 days after hospital admission to assess vital and functional status, re-hospitalization, satisfaction with care and quality-of-life measures. We aim to include between 5000 and 7000 patients over one year of recruitment to derive the three-part triage algorithm. The respective main endpoints were defined as (a) initial triage priority (high vs. low priority), adjudicated by the attending ED physician at ED discharge; (b) adverse 30-day outcome (death or intensive care unit admission) within 30 days following ED admission, to assess patients' risk and thus need for in-hospital treatment; and (c) post-acute care needs after hospital discharge, defined as transfer of the patient to a post-acute care institution, for early recognition and planning of post-acute care needs. Other outcomes are time to first physician contact, time to initiation of adequate medical therapy, time to social worker involvement, length of hospital stay, reasons for discharge delays, patients' satisfaction with care, overall hospital costs and patients' care needs after returning home. Discussion: Using a reliable initial triage system to estimate initial treatment priority, need for in-hospital treatment and post-acute care needs is an innovative and persuasive approach for a more targeted and efficient management of medical patients in the ED. The proposed interdisciplinary, multi-national project has unprecedented potential to improve initial triage decisions and optimize resource allocation to the sickest patients from admission to discharge. The algorithms derived in this study will be compared, in a later randomized controlled trial, against a usual-care control group in terms of resource use, length of hospital stay, overall costs, and patient outcomes: mortality, re-hospitalization, quality of life and satisfaction with care.
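
To make the derivation step concrete, here is a minimal sketch in Python of how a three-part algorithm of this kind might be built: one classifier per endpoint, each trained on admission-time predictors. All variable and column names are hypothetical, and the protocol does not specify a modelling method, so plain logistic regression stands in as a placeholder.

# Hypothetical sketch: three binary classifiers, one per study endpoint,
# trained on admission-time predictors (vital signs, PACD score).
# Column names are illustrative, not taken from the study protocol.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("ed_registry.csv")  # hypothetical registry export

predictors = ["heart_rate", "systolic_bp", "spo2", "temperature", "pacd_score"]
endpoints = {
    "high_triage_priority": "physician-adjudicated priority (high vs. low)",
    "adverse_30d_outcome": "death or ICU admission within 30 days",
    "post_acute_care_need": "transfer to a post-acute care institution",
}

for endpoint, label in endpoints.items():
    model = LogisticRegression(max_iter=1000)
    auc = cross_val_score(model, df[predictors], df[endpoint],
                          cv=5, scoring="roc_auc").mean()
    print(f"{label}: cross-validated AUC = {auc:.2f}")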

Relevance: 100.00%

Abstract:

BACKGROUND & AIMS: Age is frequently discussed as a negative host factor for achieving a sustained virological response (SVR) to antiviral therapy of chronic hepatitis C. However, elderly patients often show advanced fibrosis/cirrhosis, a known negative predictive factor. The aim of this study was to assess age as an independent predictive factor during antiviral therapy. METHODS: Overall, 516 hepatitis C patients were treated with pegylated interferon-α and ribavirin, 66 of whom were ≥60 years of age. We analysed the impact of host factors (age, gender, fibrosis, haemoglobin, previous hepatitis C treatment) and viral factors (genotype, viral load) on SVR per therapy course by performing generalized estimating equations (GEE) regression modelling, a matched pair analysis and a classification tree analysis. RESULTS: Overall, SVR per therapy course was 42.9 and 26.1%, respectively, in young and elderly patients with hepatitis C virus (HCV) genotypes 1/4/6. The corresponding figures for HCV genotypes 2/3 were 74.4 and 84%. In the GEE model, age had no significant influence on achieving SVR. In matched pair analysis, SVR was not different between young and elderly patients (54.2 and 55.9%, respectively; P = 0.795 in the binomial test). In classification tree analysis, age was not a relevant splitting variable. CONCLUSIONS: Age is not a significant predictive factor for achieving SVR when relevant confounders are taken into account. As life expectancy in Western Europe at age 60 is more than 20 years, it is reasonable to treat chronic hepatitis C in selected elderly patients with relevant fibrosis or cirrhosis but without major concomitant diseases, as SVR improves survival and reduces carcinogenesis.
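
For readers unfamiliar with GEE regression, the following is a minimal sketch of the kind of model described (statsmodels assumed; column names are hypothetical). The patient-level grouping accounts for the fact that SVR is analysed per therapy course and some patients contribute several courses.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("hcv_courses.csv")  # hypothetical: one row per therapy course

# GEE with an exchangeable working correlation: repeated therapy courses
# of the same patient are not independent observations.
model = smf.gee(
    "svr ~ age + gender + fibrosis + haemoglobin + prior_treatment"
    " + genotype + viral_load",
    groups="patient_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())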

Relevance: 100.00%

Abstract:

BACKGROUND Limited data exist on the longitudinal crestal bone changes around teeth compared with implants in partially edentulous patients. This study sought to compare the 10-year radiographic crestal bone changes (bone level [BL]) around teeth and implants in periodontally compromised (PCPs) and periodontally healthy (PHPs) patients. METHODS A total of 120 patients were evaluated for the radiographic crestal BL around dental implants and adjacent teeth at the time of implant crown insertion and at the 10-year follow-up. Sixty patients had a previous history of periodontitis (PCPs), and the remaining 60 were PHPs. In each category (PCP and PHP), two different implant systems were used. The mean BL change at the implant and at the adjacent tooth at the interproximal area was calculated by subtracting the radiographic crestal BL at the time of crown cementation from the radiographic crestal BL at the 10-year follow-up. RESULTS At 10 years after therapy, the survival rate ranged from 80% to 95% across implant subgroups, whereas it was 100% for the adjacent teeth. In all eight different patient categories evaluated, teeth demonstrated a significantly more stable radiographic BL compared with adjacent dental implants (teeth BL, 0.44 ± 0.23 mm; implant BL, 2.28 ± 0.72 mm; P <0.05). Radiographic BL changes around teeth seemed not to be influenced by the presence or absence of advanced bone loss (≥3 mm) at the adjacent implants. CONCLUSIONS Natural teeth yielded better long-term results with respect to survival rate and marginal BL changes compared with dental implants. Moreover, these findings extend to teeth with an initially reduced periodontal attachment level, provided adequate periodontal treatment and maintenance are performed. As a consequence, the decision to extract a tooth for periodontal reasons in favor of a dental implant should be carefully considered in partially edentulous patients.
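
The BL-change computation described in the methods is simple arithmetic; a tiny illustration with made-up values follows, assuming pandas.

import pandas as pd

# Hypothetical illustration of the bone-level (BL) change described above:
# 10-year radiographic BL minus BL at crown cementation, per site.
records = pd.DataFrame({
    "site":        ["implant", "adjacent_tooth"],
    "bl_baseline": [0.3, 0.2],   # mm at crown cementation (illustrative values)
    "bl_10y":      [2.6, 0.6],   # mm at the 10-year follow-up
})
records["bl_change"] = records["bl_10y"] - records["bl_baseline"]
print(records)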

Relevance: 100.00%

Abstract:

OBJECT After subarachnoid hemorrhage (SAH), seizure occurs in up to 26% of patients. The impact of seizure on outcome has been studied, yet its impact on grading is unknown. The authors evaluated the impact of early-onset seizures (EOS) on grading of spontaneous SAH and on outcome. METHODS This retrospective analysis included consecutive patients with SAH who were treated at the NeuroCenter, Inselspital, University Hospital Bern, Switzerland, between January 2005 and December 2010. Demographic data, clinical data, and reports of EOS were recorded. The EOS were defined as seizures occurring within 24 hours after ictus. Patients were graded according to the World Federation of Neurosurgical Societies (WFNS) scale pre- and postresuscitation and dichotomized into good (WFNS I-III) and poor (WFNS IV-V) grades. Outcome was assessed at 6 months by using the modified Rankin Scale (mRS); an mRS score of 0-3 was considered a good outcome and an mRS score of 4-6 was considered a poor outcome. RESULTS Forty-one of 425 patients with SAH had EOS. Twenty-seven of those 41 patients (65.9%) had a poor WFNS grade. Twenty-eight (68.3%) achieved a good outcome, 11 (26.8%) had a poor outcome, and 2 (4.9%) were lost to follow-up. Early-onset seizures were proven in 9 of 16 electroencephalograms. The EOS were associated with poor WFNS grade (OR 2.81, 97.5% CI 1.14-7.46; p = 0.03) and good outcome (OR 4.01, 97.5% CI 1.63-10.53; p = 0.03). Increasing age, hydrocephalus, intracerebral hemorrhage, and intraventricular hemorrhage were associated with poor WFNS grade, whereas only age, intracerebral hemorrhage (p < 0.001), and poor WFNS grade (p < 0.001) were associated with poor outcome. CONCLUSIONS Patients with EOS were classified significantly more often in a poor grade initially, but then they significantly more often achieved a good outcome. The authors conclude that EOS can negatively influence grading. This might influence decision making for the care of patients with SAH, so grading of patients with EOS should be interpreted with caution.
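
A hedged sketch of the association analysis reported above: logistic regression of poor WFNS grade on EOS and the other covariates, reported as odds ratios with the 97.5% confidence intervals used in the abstract (statsmodels assumed; column names are hypothetical, with binary variables coded 0/1).

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sah_cohort.csv")  # hypothetical; columns are illustrative

# Logistic regression of poor WFNS grade (IV-V) on early-onset seizures (EOS).
# conf_int(alpha=0.025) yields the 97.5% CI quoted in the abstract.
fit = smf.logit("poor_wfns ~ eos + age + hydrocephalus + ich + ivh", data=df).fit()
or_ci = np.exp(fit.conf_int(alpha=0.025).loc["eos"])
print(f"EOS odds ratio: {np.exp(fit.params['eos']):.2f}, "
      f"97.5% CI {or_ci[0]:.2f}-{or_ci[1]:.2f}")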

Relevance: 100.00%

Abstract:

OBJECTIVE Whether or not a high risk of falls increases the risk of bleeding in patients receiving anticoagulants remains a matter of debate. METHODS We conducted a prospective cohort study involving 991 patients ≥ 65 years of age who received anticoagulants for acute venous thromboembolism (VTE) at nine Swiss hospitals between September 2009 and September 2012. The study outcomes were the time to a first episode of major bleeding and the time to a first episode of clinically relevant nonmajor bleeding. We determined the associations between the risk of falls and the time to a first episode of bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and anticoagulation as a time-varying covariate. RESULTS Four hundred fifty-eight of 991 patients (46%) were at high risk of falls. The mean duration of follow-up was 16.7 months. Patients at high risk of falls had a higher incidence of major bleeding (9.6 vs. 6.6 events/100 patient-years; P = 0.05) and a significantly higher incidence of clinically relevant nonmajor bleeding (16.7 vs. 8.3 events/100 patient-years; P < 0.001) than patients at low risk of falls. After adjustment, a high risk of falls was associated with clinically relevant nonmajor bleeding [subhazard ratio (SHR) = 1.74, 95% confidence interval (CI) = 1.23-2.46], but not with major bleeding (SHR = 1.24, 95% CI = 0.83-1.86). CONCLUSION In elderly patients who receive anticoagulants because of VTE, a high risk of falls is significantly associated with clinically relevant nonmajor bleeding, but not with major bleeding. Whether or not a high risk of falls is a reason against providing anticoagulation beyond 3 months should be based on patient preferences and the risk of VTE recurrence.
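
As a hedged illustration of the competing-risk idea (lifelines assumed; column names hypothetical): the Aalen-Johansen estimator gives the cumulative incidence of a first bleeding episode while treating death as a competing event. The study's adjusted analysis used competing risk regression (subhazard ratios), which this unadjusted sketch does not reproduce.

import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("vte_cohort.csv")  # hypothetical columns: time_months, event

# Cumulative incidence of a first bleeding episode, treating death as a
# competing event (event code 1 = bleeding, 2 = death, 0 = censored).
ajf = AalenJohansenFitter()
ajf.fit(df["time_months"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())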

Relevance: 100.00%

Abstract:

BACKGROUND/AIMS Clinical differentiation between organic hypersomnia and non-organic hypersomnia (NOH) is challenging. We aimed to determine the diagnostic value of sleepiness and performance tests in patients with excessive daytime sleepiness (EDS) of organic and non-organic origin. METHODS We conducted a retrospective comparison of the multiple sleep latency test (MSLT), pupillography, and the Steer Clear performance test in three patient groups complaining of EDS: 19 patients with NOH, 23 patients with narcolepsy (NAR), and 46 patients with mild to moderate obstructive sleep apnoea syndrome (OSAS). RESULTS As required by the inclusion criteria, all patients had Epworth Sleepiness Scale (ESS) scores >10. The mean sleep latency in the MSLT indicated mild objective sleepiness in NOH (8.1 ± 4.0 min) and OSAS (7.2 ± 4.1 min), but more severe sleepiness in NAR (2.5 ± 2.0 min). The difference between NAR and the other two groups was significant; the difference between NOH and OSAS was not. In the Steer Clear performance test, NOH patients performed worst (error rate = 10.4%), followed by NAR (8.0%) and OSAS patients (5.9%; p = 0.008). The difference between OSAS and the other two groups was significant, but not that between NOH and NAR. The pupillary unrest index was found to be highest in NAR (11.5), followed by NOH (9.2) and OSAS (7.4; n.s.). CONCLUSION A high error rate in the Steer Clear performance test along with mild sleepiness in an objective sleepiness test (MSLT) in a patient with subjective sleepiness (ESS) is suggestive of NOH. This disproportionately high error rate in NOH may be caused by factors unrelated to sleep pressure, such as anergia and reduced attention and motivation, which affect performance but not conventional sleepiness measurements.
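
The concluding pattern can be written down as a rule of thumb. The following function is purely illustrative, with thresholds loosely inferred from the group means above; it is a sketch, not a validated diagnostic criterion.

# Hypothetical heuristic for suspected non-organic hypersomnia (NOH):
# subjective sleepiness (ESS > 10) plus only mild objective sleepiness
# on the MSLT, combined with a disproportionately high Steer Clear error
# rate. All thresholds are illustrative assumptions.
def suggests_noh(ess_score: float, msl_minutes: float,
                 steer_clear_error_rate: float) -> bool:
    subjective_sleepiness = ess_score > 10           # inclusion criterion above
    mild_objective_sleepiness = msl_minutes >= 5     # clearly above narcolepsy range
    high_error_rate = steer_clear_error_rate > 0.10  # near the NOH group mean (10.4%)
    return subjective_sleepiness and mild_objective_sleepiness and high_error_rate

print(suggests_noh(ess_score=14, msl_minutes=8.1, steer_clear_error_rate=0.104))  # True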

Relevance: 100.00%

Abstract:

BACKGROUND Distinct populations of neutrophils have been identified based on the expression of intercellular adhesion molecule 1 (ICAM1, CD54) and chemokine receptor 1 (CXCR1, interleukin 8 receptor α). AIM We analyzed the expression of vascular endothelial growth factor receptor 1 (VEGFR1), a physiological negative regulator of angiogenesis, on distinct populations of neutrophils from the blood of patients before and after adjuvant chemotherapy for breast cancer. MATERIALS AND METHODS Neutrophil populations were distinguished as reverse transmigrated (ICAM1(high)/CXCR1(low)), naïve (ICAM1(low)/CXCR1(high)), or tissue-resident neutrophils (ICAM1(low)/CXCR1(low)), and their VEGFR1 expression was quantified. RESULTS Reverse transmigrated ICAM1(high)/CXCR1(low) neutrophilic granulocytes decreased significantly after chemotherapy, and these were also the cells with the highest mean fluorescence intensity for VEGFR1. CONCLUSION Chemotherapy mainly reduces the number of reverse transmigrated, long-lived ICAM1(high)/CXCR1(low) VEGFR1-expressing neutrophils. The decrease of antiangiogenic VEGFR1 may have a potential impact on tumour angiogenesis in patients undergoing adjuvant chemotherapy.
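
A hedged sketch of the gating logic implied by the population definitions (hypothetical thresholds and column names; real gates would come from isotype or FMO controls):

import pandas as pd

# Hypothetical gating sketch: classify neutrophils into the three populations
# named above from per-cell ICAM1 and CXCR1 fluorescence intensities.
# Gate values are placeholders, not from the study.
ICAM1_GATE, CXCR1_GATE = 1000.0, 1000.0

def classify(icam1: float, cxcr1: float) -> str:
    if icam1 >= ICAM1_GATE and cxcr1 < CXCR1_GATE:
        return "reverse_transmigrated"   # ICAM1(high)/CXCR1(low)
    if icam1 < ICAM1_GATE and cxcr1 >= CXCR1_GATE:
        return "naive"                   # ICAM1(low)/CXCR1(high)
    if icam1 < ICAM1_GATE and cxcr1 < CXCR1_GATE:
        return "tissue_resident"         # ICAM1(low)/CXCR1(low)
    return "unclassified"

cells = pd.read_csv("facs_events.csv")   # hypothetical per-cell export
cells["population"] = [classify(i, c) for i, c in zip(cells["icam1"], cells["cxcr1"])]
print(cells["population"].value_counts())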

Relevance: 100.00%

Abstract:

BACKGROUND Age and female gender have been found to be predisposing factors for hyponatremia in patients taking thiazides. OBJECTIVE To investigate whether serum sodium and potassium are related to age and gender, and to determine the prevalence rates of electrolyte disorders, in a large population of patients presenting to the emergency department of a university hospital. METHODS In this retrospective analysis we gathered data on age, gender and current diuretic medication of all patients admitted to the emergency department of a large university hospital with measurement of serum sodium and potassium between January 1, 2009 and December 31, 2010. Prevalence rates of and risk factors for electrolyte disorders were calculated on the basis of these data. RESULTS A total of 20,667 patients were included in the analysis. Serum sodium levels declined significantly with increasing age while serum potassium rose, independent of diuretic medication at presentation. The prevalence of hyponatremia increased from 2.3% in patients aged 16-21 years to 16.9% in patients aged >80 years, and that of hyperkalemia from 0.8% to 10.4%. In the regression analysis, age >60 years was a predictor for the presence of hyponatremia and hyperkalemia, as was current use of diuretic medication. Male gender was associated with a decreased prevalence of hyponatremia and hypokalemia, while it was a predictor of hyperkalemia. CONCLUSIONS Sodium levels were lower with increasing age, independent of diuretic intake, while potassium levels were higher. We found dramatically increasing prevalences of hyponatremia and hyperkalemia with increasing age, while no such effect could be found for hypernatremia and hypokalemia.
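
A small pandas sketch of the prevalence-by-age computation described above (hypothetical column names; the sodium and potassium cut-offs are common laboratory reference limits, not values taken from the paper):

import pandas as pd

df = pd.read_csv("ed_electrolytes.csv")  # hypothetical: age, sodium, potassium

# Prevalence of hyponatremia (Na < 135 mmol/L) and hyperkalemia
# (K > 5.0 mmol/L) per age band, mirroring the age trend reported above.
df["age_band"] = pd.cut(df["age"], bins=[16, 21, 40, 60, 80, 120])
prevalence = df.groupby("age_band", observed=True).agg(
    hyponatremia=("sodium", lambda s: (s < 135).mean()),
    hyperkalemia=("potassium", lambda k: (k > 5.0).mean()),
)
print(prevalence.round(3))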

Relevance: 100.00%

Abstract:

OBJECTIVE To assess safety up to 1 year of follow-up associated with prasugrel and clopidogrel use in a prospective cohort of patients with acute coronary syndromes (ACS). METHODS Between 2009 and 2012, 2286 patients invasively managed for ACS were enrolled in the multicentre Swiss ACS Bleeding Cohort, among whom 2148 patients received either prasugrel or clopidogrel according to current guidelines. Patients with ST-elevation myocardial infarction (STEMI) preferentially received prasugrel, while those with non-STEMI, a history of stroke or transient ischaemic attack, age ≥75 years, or weight <60 kg received clopidogrel or a reduced dose of prasugrel to comply with the prasugrel label. RESULTS After adjustment using propensity scores, the primary end point of clinically relevant bleeding events (defined as the composite of Bleeding Academic Research Consortium, BARC, type 3, 4 or 5 bleeding) at 1 year occurred at a similar rate in both patient groups (prasugrel/clopidogrel: 3.8%/5.5%). Stratified analyses in subgroups, including patients with STEMI, yielded a similar safety profile. After adjusting for baseline variables, no relevant differences in major adverse cardiovascular and cerebrovascular events were observed at 1 year (prasugrel/clopidogrel: cardiac death 2.6%/4.2%, myocardial infarction 2.7%/3.8%, revascularisation 5.9%/6.7%, stroke 1.0%/1.6%). Of note, this study was not designed to compare efficacy between prasugrel and clopidogrel. CONCLUSIONS In this large prospective ACS cohort, patients treated with prasugrel according to current guidelines (ie, patients without cerebrovascular disease, advanced age or low body weight) had a similar safety profile compared with patients treated with clopidogrel. CLINICAL TRIAL REGISTRATION NUMBER SPUM-ACS: NCT01000701; COMFORTABLE AMI: NCT00962416.
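
As a hedged illustration of propensity-score adjustment (scikit-learn assumed; column names hypothetical): model the probability of receiving prasugrel from baseline covariates, then weight patients by the inverse probability of the treatment they actually received. The cohort's exact adjustment procedure may have differed in detail.

import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("acs_cohort.csv")  # hypothetical columns below
# prasugrel: 1 if the patient received prasugrel, else 0 (clopidogrel)

covariates = ["age", "weight", "stemi", "prior_stroke_tia", "diabetes"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["prasugrel"])
ps = ps_model.predict_proba(df[covariates])[:, 1]
df["iptw"] = df["prasugrel"] / ps + (1 - df["prasugrel"]) / (1 - ps)

# Weighted rate of clinically relevant bleeding (BARC 3-5) per arm:
rates = df.groupby("prasugrel").apply(
    lambda g: (g["barc_3_5"] * g["iptw"]).sum() / g["iptw"].sum()
)
print(rates)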

Relevance: 100.00%

Abstract:

BACKGROUND & AIMS Pegylated interferon is still the backbone of hepatitis C treatment and may cause thrombocytopenia, leading to dose reductions, early discontinuation, and eventually worse clinical outcome. We assessed associations between interferon-induced thrombocytopenia and bleeding complications, interferon dose reductions, early treatment discontinuation, as well as SVR and long-term clinical outcome. METHODS All consecutive patients with chronic HCV infection and biopsy-proven advanced hepatic fibrosis (Ishak 4-6) who initiated interferon-based therapy between 1990 and 2003 in 5 large hepatology units in Europe and Canada were included. RESULTS Overall, 859 treatments were administered to 546 patients. Baseline platelets (in 10⁹/L) were normal (≥150) in 394 (46%) treatments; thrombocytopenia was moderate (75-149) in 324 (38%) and severe (<75) in 53 (6%) treatments. Thrombocytopenia-induced interferon dose reductions occurred in 3 (1%), 46 (16%), and 15 (30%) treatments, respectively (p<0.001); interferon was discontinued due to thrombocytopenia in 1 (<1%), 8 (3%), and 8 (16%) treatments, respectively (p<0.001). In total, 104 bleeding events were reported during 53 treatments. Only two severe bleeding complications occurred. Multivariate analysis showed that cirrhosis and a platelet count below 50 × 10⁹/L were associated with on-treatment bleeding. Among thrombocytopenic patients, those attaining SVR had a lower occurrence of liver failure (p<0.001), hepatocellular carcinoma (p<0.001), liver-related death or liver transplantation (p<0.001), and all-cause mortality (p=0.001) compared to patients without SVR. CONCLUSIONS Even in thrombocytopenic patients with chronic HCV infection and advanced hepatic fibrosis, on-treatment bleedings are generally mild. SVR was associated with a marked reduction in cirrhosis-related morbidity and mortality, especially in patients with baseline thrombocytopenia.
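
The platelet strata used above translate directly into a categorisation; a minimal pandas illustration with made-up counts:

import pandas as pd

# Baseline platelet strata from the abstract (counts in 10^9/L):
# normal >= 150, moderate thrombocytopenia 75-149, severe < 75.
# The example counts are illustrative.
platelets = pd.Series([210, 130, 60, 155, 90], name="platelets")
strata = pd.cut(
    platelets,
    bins=[0, 75, 150, float("inf")],
    right=False,  # left-closed intervals: [0,75), [75,150), [150,inf)
    labels=["severe (<75)", "moderate (75-149)", "normal (>=150)"],
)
print(strata.value_counts())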

Relevance: 100.00%

Abstract:

INTRODUCTION Patients admitted to intensive care following surgery for faecal peritonitis present particular challenges in terms of clinical management and risk assessment. Collaborating surgical and intensive care teams need shared perspectives on prognosis. We aimed to determine the relationship between dynamic assessment of trends in selected variables and outcomes. METHODS We analysed trends in physiological and laboratory variables during the first week of intensive care unit (ICU) stay in 977 patients at 102 centres across 16 European countries. The primary outcome was 6-month mortality. Secondary endpoints were ICU, hospital and 28-day mortality. For each trend, Cox proportional hazards (PH) regression analyses, adjusted for age and sex, were performed for each endpoint. RESULTS Trends over the first 7 days of the ICU stay independently associated with 6-month mortality were worsening thrombocytopaenia (mortality: hazard ratio (HR) = 1.02; 95% confidence interval (CI), 1.01 to 1.03; P <0.001) and renal function (total daily urine output: HR = 1.02; 95% CI, 1.01 to 1.03; P <0.001; Sequential Organ Failure Assessment (SOFA) renal subscore: HR = 0.87; 95% CI, 0.75 to 0.99; P = 0.047), maximum bilirubin level (HR = 0.99; 95% CI, 0.99 to 0.99; P = 0.02) and Glasgow Coma Scale (GCS) SOFA subscore (HR = 0.81; 95% CI, 0.68 to 0.98; P = 0.028). Changes in renal function (total daily urine output and renal component of the SOFA score), GCS component of the SOFA score, total SOFA score and worsening thrombocytopaenia were also independently associated with secondary outcomes (ICU, hospital and 28-day mortality). We detected the same pattern when we analysed trends on days 2, 3 and 5. Dynamic trends in all other measured laboratory and physiological variables, and in radiological findings, changes in respiratory support, renal replacement therapy and inotrope and/or vasopressor requirements, were not retained as independently associated with outcome in multivariate analysis. CONCLUSIONS Only deterioration in renal function, thrombocytopaenia and SOFA score over the first 2, 3, 5 and 7 days of the ICU stay were consistently associated with mortality at all endpoints. These findings may help to inform clinical decision making in patients with this common cause of critical illness.
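
A hedged sketch of the trend analysis (lifelines assumed; column names hypothetical): a Cox proportional hazards model of 6-month mortality on one first-week trend variable, adjusted for age and sex as described.

import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("peritonitis_icu.csv")  # hypothetical one-row-per-patient export

# Cox PH for 6-month mortality against a first-week trend variable
# (e.g. day-1-to-day-7 change in platelet count), adjusted for age and sex.
# Column names are assumptions; sex is assumed coded 0/1.
cph = CoxPHFitter()
cph.fit(
    df[["followup_days", "died_6m", "platelet_trend", "age", "sex"]],
    duration_col="followup_days",
    event_col="died_6m",
)
cph.print_summary()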

Relevance: 100.00%

Abstract:

BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20% or 40% of patients in seven cohorts of patients starting ART in South Africa, and plotted cut-offs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia and the Asia-Pacific. FINDINGS 31,450 adult patients were included in the derivation cohorts and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African, from 64% to 93% in the Zambian, and from 73% to 96% in the Asia-Pacific cohorts. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia and from 37% to 71% in the Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia and from 0.77 to 0.92 in the Asia-Pacific. INTERPRETATION CD4-based risk charts with optimal cut-offs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
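
A hedged sketch of how such a targeted-testing rule can be evaluated (NumPy and scikit-learn assumed; the risk scores and outcomes below are simulated placeholders): test the top X% of patients by predicted risk, then compute positive predictive value and sensitivity among those tested, plus the overall AUC.

import numpy as np
from sklearn.metrics import roc_auc_score

# Simulated placeholder data standing in for per-patient failure risk scores
# and observed virologic failure; not study data.
rng = np.random.default_rng(0)
risk = rng.random(1000)
failure = rng.random(1000) < risk * 0.4

for fraction_tested in (0.10, 0.20, 0.40):
    cutoff = np.quantile(risk, 1 - fraction_tested)
    tested = risk >= cutoff
    ppv = failure[tested].mean()                      # PPV among those tested
    sensitivity = failure[tested].sum() / failure.sum()
    print(f"test {fraction_tested:.0%}: PPV={ppv:.2f}, sensitivity={sensitivity:.2f}")

print("AUC:", round(roc_auc_score(failure, risk), 2))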

Relevance: 100.00%

Abstract:

Concentrations of atmospheric noble gases (neon, argon, krypton, and xenon) dissolved in groundwaters from northern Oman indicate that the average ground temperature during the Late Pleistocene (15,000 to 24,000 years before present) was 6.5 ± 0.6°C lower than that of today. Stable oxygen and hydrogen isotopic groundwater data show that the origin of atmospheric water vapor changed from a primarily southern, Indian Ocean source during the Late Pleistocene to a dominantly northern, Mediterranean source today. The reduced northern water vapor source is consistent with a drier Last Glacial Maximum through much of northern Africa and Arabia.

Relevance: 100.00%

Abstract:

BACKGROUND To evaluate, in patients with aggressive periodontitis (AgP), the effect of nonsurgical periodontal treatment in conjunction with either additional administration of systemic antibiotics (AB) or application of photodynamic therapy (PDT) on the gingival crevicular fluid (GCF) concentration of matrix metalloproteinases 8 and 9 (MMP-8 and -9). METHODS Thirty-six patients with AgP were included in the study. Patients were randomly assigned to treatment with either scaling and root planing (SRP) followed by systemic administration of AB (amoxicillin plus metronidazole) or SRP + PDT. The analysis of MMP-8 and -9 GCF concentrations was performed at baseline and at 3 and 6 months after treatment. The nonparametric Mann-Whitney U test was used for comparisons between groups; see the sketch after this paragraph. Changes from baseline to 3 and 6 months were analyzed with Friedman's ANOVA with Kendall's index of consistency. RESULTS In the AB group, patients showed a statistically significant (p = 0.01) decrease in MMP-8 GCF levels at both 3 and 6 months post treatment. In the PDT group, the change in MMP-8 GCF levels was not statistically significant. Both groups showed a decrease in MMP-9 levels at 3 and 6 months; however, this change did not reach statistical significance. CONCLUSIONS Within the limits of the present study, it may be suggested that in patients with AgP, nonsurgical periodontal therapy in conjunction with adjunctive systemic administration of amoxicillin and metronidazole is more effective in reducing GCF MMP-8 levels than the adjunctive use of PDT.
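
For reference, the between-group comparison named in the methods (the nonparametric Mann-Whitney U test) looks like this in Python with SciPy; the concentration values are made up for illustration.

import numpy as np
from scipy.stats import mannwhitneyu

# Illustrative MMP-8 GCF concentrations for the two arms; not study data.
ab_group  = np.array([12.1, 8.4, 6.3, 9.0, 7.7, 5.9])      # SRP + antibiotics
pdt_group = np.array([11.8, 10.9, 12.5, 9.8, 13.1, 10.2])  # SRP + PDT

stat, p = mannwhitneyu(ab_group, pdt_group, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")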