352 results for "Impact of ICT"


Relevance: 100.00%

Abstract:

OBJECTIVES: To assess the impact of neoadjuvant chemoradiotherapy (NCRT) on anastomotic leakage (AL) and other postoperative outcomes after esophageal cancer (EC) resection. BACKGROUND: Conflicting data have emerged from randomized studies regarding the impact of NCRT on AL. METHODS: Among 2944 consecutive patients operated on for EC between 2000 and 2010 in 30 European centers, patients treated by NCRT followed by surgery (n = 593) were compared with those treated by primary surgery (n = 1487). Multivariable analyses and propensity score matching were used to compensate for the differences in some baseline characteristics. RESULTS: Patients in the NCRT group were younger, with a higher prevalence of male sex, malnutrition, advanced tumor stage, squamous cell carcinoma, and surgery after 2005 when compared with the primary surgery group. Postoperative AL rates were 8.8% versus 10.6% (P = 0.220), and 90-day postoperative mortality and morbidity rates were 9.3% versus 7.2% (P = 0.110) and 33.4% versus 32.1% (P = 0.564), respectively. Pulmonary complication rates did not differ between groups (24.6% vs 22.5%; P = 0.291), whereas chylothorax (2.5% vs 1.2%; P = 0.020), cardiovascular complications (8.6% vs 0.1%; P = 0.037), and thromboembolic events (8.6% vs 6.0%; P = 0.037) were higher in the NCRT group. After propensity score matching, AL rates were 8.8% versus 11.3% (P = 0.228), with more chylothorax (2.5% vs 0.7%; P = 0.030) and a trend toward more cardiovascular and thromboembolic events in the NCRT group (P = 0.069). Predictors of AL were high American Society of Anesthesiologists scores, supracarinal tumor location, and cervical anastomosis, but not NCRT. CONCLUSIONS: Neoadjuvant chemoradiotherapy does not have an impact on the AL rate after EC resection (NCT 01927016).
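
The propensity score matching mentioned above can be illustrated with a minimal sketch. This is not the study's actual procedure: the greedy 1:1 nearest-neighbour pairing, the scores, and the 0.05 caliper are all assumptions chosen for illustration.

```python
# Hypothetical sketch of 1:1 nearest-neighbour propensity-score matching
# with a caliper, of the kind used to balance baseline characteristics
# between the NCRT and primary-surgery groups. Scores are invented.

def match_on_propensity(treated, controls, caliper=0.05):
    """Greedily pair each treated score with the closest unused control
    score; pairs farther apart than the caliper are discarded."""
    available = sorted(controls)
    pairs = []
    for t in sorted(treated):
        best = min(available, key=lambda c: abs(c - t), default=None)
        if best is not None and abs(best - t) <= caliper:
            pairs.append((t, best))
            available.remove(best)
    return pairs

pairs = match_on_propensity([0.30, 0.55, 0.90], [0.28, 0.52, 0.57, 0.10])
```

Here the treated score 0.90 has no control within the caliper and is dropped, mimicking how unmatched patients are excluded from the matched analysis.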


The aim of this study is to investigate the influence of unusual writing positions on a person's signature, in comparison to a standard writing position. Ten writers were asked to sign their signature six times in each of four different writing positions, including the standard one. In order to take into consideration the effect of day-to-day variation, this same process was repeated over 12 sessions, giving a total of 288 signatures per subject. The signatures were collected simultaneously in off-line and on-line acquisition modes, using an interactive tablet and a ballpoint pen. Unidimensional variables (height-to-width ratio; time with or without in-air displacement) and time-dependent variables (pressure; X and Y coordinates; altitude and azimuth angles) were extracted from each signature. For the unidimensional variables, the position effect was assessed through ANOVA and Dunnett contrast tests. For the time-dependent variables, the signatures were compared using dynamic time warping, and the position effect was evaluated through classification by linear discriminant analysis. Both types of variables provided similar results: no general tendency regarding the position factor could be highlighted. The influence of the position factor varies according to the subject as well as the variable studied. The impact of the session factor was shown to mask the impact that could be ascribed to the writing position factor. Indeed, the day-to-day variation has a greater effect than the position factor on the studied signature variables. The results of this study suggest guidelines for best practice in the area of signature comparisons and demonstrate the importance of a signature collection procedure covering an adequate number of sampling sessions, with a sufficient number of samples per session.
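
Dynamic time warping, used above to compare time-dependent signature variables of unequal length, is a standard algorithm; a minimal sketch follows. The sequences are invented, and absolute difference is assumed as the local cost.

```python
# Minimal dynamic time warping (DTW) distance between two 1-D signals,
# e.g. pressure or X/Y coordinate traces of two signatures that were
# written at different speeds and therefore have different lengths.

def dtw_distance(a, b):
    """Cost of the optimal monotonic alignment between sequences a and b,
    using absolute difference as the local cost."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

Two traces that differ only in timing, such as `[1, 2, 3]` and `[1, 2, 2, 3]`, align with zero cost, which is exactly why DTW suits signatures whose dynamics stretch and compress between repetitions.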


This study examines trends and geographical differences in total and live birth prevalence of trisomies 21, 18 and 13 with regard to increasing maternal age and prenatal diagnosis in Europe. Twenty-one population-based EUROCAT registries covering 6.1 million births between 1990 and 2009 participated. Trisomy cases included live births, fetal deaths from 20 weeks gestational age and terminations of pregnancy for fetal anomaly. We present correction to 20 weeks gestational age (ie, correcting early terminations for the probability of fetal survival to 20 weeks) to allow for artefactual screening-related differences in total prevalence. Poisson regression was used. The proportion of births in the population to mothers aged 35+ years in the participating registries increased from 13% in 1990 to 19% in 2009. Total prevalence per 10 000 births was 22.0 (95% CI 21.7-22.4) for trisomy 21, 5.0 (95% CI 4.8-5.1) for trisomy 18 and 2.0 (95% CI 1.9-2.2) for trisomy 13; live birth prevalence was 11.2 (95% CI 10.9-11.5) for trisomy 21, 1.04 (95% CI 0.96-1.12) for trisomy 18 and 0.48 (95% CI 0.43-0.54) for trisomy 13. There was an increase in total and total corrected prevalence of all three trisomies over time, mainly explained by increasing maternal age. Live birth prevalence remained stable over time. For trisomy 21, there was a three-fold variation in live birth prevalence between countries. The rise in maternal age has led to an increase in the number of trisomy-affected pregnancies in Europe. Live birth prevalence has remained stable overall. Differences in prenatal screening and termination between countries lead to wide variation in live birth prevalence.
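
As a rough illustration of how such prevalence figures and confidence intervals are obtained, here is a sketch using the normal approximation to the Poisson distribution. The case count is invented (chosen so the rate reproduces the abstract's 22.0 per 10,000 over 6.1 million births); it is not the EUROCAT figure.

```python
# Prevalence per 10,000 births with an approximate 95% CI, assuming the
# case count is Poisson-distributed (normal approximation). The case
# count below is illustrative, not taken from the EUROCAT registries.

import math

def prevalence_per_10000(cases, births):
    rate = 10_000 * cases / births
    half_width = 1.96 * 10_000 * math.sqrt(cases) / births
    return rate, rate - half_width, rate + half_width

rate, lo, hi = prevalence_per_10000(cases=13_420, births=6_100_000)
```

With denominators in the millions the interval is narrow, which matches the tight CIs reported for trisomy 21 in the abstract.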


Purpose: Newer antiepileptic drugs (AED) are increasingly prescribed and seem to have comparable efficacy to classical AED in patients living with epilepsy; however, their impact on status epilepticus (SE) prognosis has received little attention. Method: In our prospective SE registry (2006-10) we assessed the use of newer AED (for this purpose: levetiracetam, pregabalin, topiramate, lacosamide) over time, and its relationship to outcome (return to clinical baseline conditions, new handicap, or death). We adjusted for recognized SE outcome predictors (Status Epilepticus Severity Score, STESS; potentially fatal etiology) and the use of >2 AED for a given SE episode. Result: Newer AED were used more often towards the end of the study period (42% versus 30% of episodes), and more frequently in severe and difficult-to-treat episodes. However, after adjustment for SE etiology, STESS, and medication number, newer AED were independently related to a reduced likelihood of return to baseline (p < 0.01), but not to increased mortality. STESS and etiology were robustly related to both outcomes (p < 0.01 for each), while prescription of >2 AED was only related to a lower chance of return to baseline (p = 0.03). Conclusion: Despite the increase in the use of newer AED, our findings suggest that SE prognosis has not improved. This appears similar to recent analyses of patients with refractory epilepsy, and corroborates the hypothesis that SE prognosis is mainly determined by its biological background. Since newer AED are more expensive, prospective trials showing their superiority (at least regarding side effects) appear mandatory to justify their use in this setting.


QUESTION UNDER STUDY: Hospitals transferring patients retain responsibility until admission to the new health care facility. We define safe transfer conditions, based on appropriate risk assessment, and evaluate the impact of this strategy as implemented at our institution. METHODS: An algorithm defining transfer categories according to destination, equipment, monitoring, and medication was developed and tested prospectively over 6 months. Conformity with the algorithm criteria was assessed for every transfer and transfer category. After introduction of a transfer coordination centre with transfer nurses, the algorithm was implemented and the same survey was carried out over 1 year. RESULTS: Over the whole study period, the number of transfers increased by 40%, chiefly by ambulance from the emergency department to other hospitals and private clinics. Transfers to rehabilitation centres and nursing homes were reassigned to conventional vehicles. The percentage of patients requiring equipment during transfer, such as an intravenous line, decreased from 34% to 15%, while oxygen or i.v. drug requirements remained stable. The percentage of transfers considered below theoretical safety decreased from 6% to 4%, while 20% of transfers were considered safer than necessary. A substantial number of planned transfers could be "downgraded" by mutual agreement to a lower degree of supervision, and the system was stable on a short-term basis. CONCLUSION: A coordinated transfer system based on an algorithm determining transfer categories, developed on the basis of simple but valid medical and nursing criteria, reduced unnecessary ambulance transfers and treatment during transfer, and increased adequate supervision.
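
The category-assigning algorithm itself is not specified in the abstract; the sketch below only shows the general shape such a rule set could take. The category names, inputs, and rules are all hypothetical.

```python
# Hypothetical rule-based transfer categorization in the spirit of the
# abstract: destination, on-board monitoring and medication needs decide
# between a supervised ambulance, a plain ambulance, or a conventional
# vehicle. All categories and thresholds are invented for illustration.

def transfer_category(destination, needs_monitoring, has_iv_line, iv_drugs):
    if needs_monitoring or iv_drugs:
        return "ambulance with nurse"       # highest supervision level
    if has_iv_line or destination == "hospital":
        return "ambulance"
    return "conventional vehicle"           # e.g. rehab centre, nursing home
```

A rule set like this makes the "downgrading" described above auditable: each transfer's assigned category can be compared against the minimum safe one.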


QUESTION UNDER STUDY: Thirty-day readmissions can be classified as potentially avoidable (PARs) or not avoidable (NARs) by following a specific algorithm (SQLape®). We wanted to assess the financial impact of the Swiss-DRG system, which regroups some readmissions occurring within 18 days after discharge within the initial hospital stay, on PARs at our hospital. METHODS: First, PARs were identified from all hospitalisations recorded in 2011 at our university hospital. Second, 2012 Swiss-DRG readmission rules were applied, regrouped readmissions (RR) were identified, and their financial impact computed. Third, RRs were classified as potentially avoidable (PARRs), not avoidable (NARRs), and others causes (OCRRs). Characteristics of PARR patients and stays were retrieved, and the financial impact of PARRS was computed. RESULTS: A total of 36,777 hospitalisations were recorded in 2011, of which 3,140 were considered as readmissions (8.5%): 1,470 PARs (46.8%) and 1,733 NARs (53.2%). The 2012 Swiss-DRG rules would have resulted in 910 RRs (2.5% of hospitalisations, 29% of readmissions): 395 PARRs (43% of RR), 181 NARRs (20%), and 334 OCRRs (37%). Loss in reimbursement would have amounted to CHF 3.157 million (0.6% of total reimbursement). As many as 95% of the 395 PARR patients lived at home. In total, 28% of PARRs occurred within 3 days after discharge, and 58% lasted less than 5 days; 79% of the patients were discharged home again. Loss in reimbursement would amount to CHF 1.771 million. CONCLUSION: PARs represent a sizeable number of 30-day readmissions, as do PARRs of 18-day RRs in the 2012 Swiss DRG system. They should be the focus of attention, as the PARRs represent an avoidable loss in reimbursement.


Treatment costs for some patients are extremely high and might suggest that medical care was inadequate. As hospital financing systems move towards reimbursement by diagnostic groups, it is essential to assess whether inadequate care is provided, to try to identify these patients upon admission, and to make sure that their outcome is good. For the years 1995 and 1997, treatment costs exceeding the average cost of their APDRG category by 6 standard deviations were identified, and the charts of the 50 patients with the highest variable costs were analyzed. The total number of patients with such extreme costs fell from 391 in 1995 to 328 in 1997 (-16%). For the 50 most expensive patients, long stays in several services were frequent, but 90% of these patients left the hospital alive, and about half were discharged directly home. They presented an important variability in diagnoses and operations, but no evidence of inadequate care. Thus, patients qualified as extreme from an economic perspective cannot be qualified as such from a medical perspective, and their outcome is good. To face the pressure linked with the change in financing system, hospitals must develop an internal review system for assessing the adequacy of care, based on clinical characteristics, if they want to guarantee good quality of care and identify potentially inadequate practice. [Authors]
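
The outlier rule described above, flagging stays whose cost exceeds the APDRG category mean by 6 standard deviations, reduces to a few lines; the cost vector here is invented.

```python
# Flag economically "extreme" stays: cost > category mean + 6 SD,
# as in the abstract. The cost list is invented for illustration.

from statistics import mean, stdev

def extreme_stays(costs, k=6):
    m, s = mean(costs), stdev(costs)
    return [c for c in costs if c > m + k * s]

# 100 routine stays at 10 (arbitrary units) and one very expensive stay
flagged = extreme_stays([10] * 100 + [500])
```

Note that the outlier inflates the standard deviation it is compared against, so a 6-SD cutoff flags only the most drastic cases, consistent with the small counts (391 and 328) reported.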


BACKGROUND: Pain is a major issue after burns, even when large doses of opioids are prescribed. The study focused on the impact of a pain protocol using hypnosis on pain intensity, anxiety, clinical course, and costs. METHODS: All patients admitted to the ICU, aged >18 years, with an ICU stay >24 h, accepting to try hypnosis, and treated according to the standardized pain protocol were included. Pain was scaled on the Visual Analog Scale (VAS) (mean of daily multiple recordings), and basal and procedural opioid doses were recorded. Clinical outcome and economic data were retrieved from hospital charts and the information system, respectively. Treated patients were matched with controls for sex, age, and burned surface area. FINDINGS: Forty patients were admitted from 2006 to 2007: 17 met exclusion criteria, leaving 23 patients, who were matched with 23 historical controls. Altogether patients were 36 ± 14 years old and burned over 27 ± 15% of body surface area (BSA). The first hypnosis session was performed after a median of 9 days. The protocol resulted in the early delivery of higher opioid doses/24 h (p < 0.0001), followed by a later reduction with lower pain scores (p < 0.0001), less procedure-related anxiety, fewer procedures under anaesthesia, reduced total grafting requirements (p = 0.014), and lower hospital costs per patient. CONCLUSION: A pain protocol including hypnosis reduced pain intensity, improved opioid efficiency, reduced anxiety, and improved wound outcome while reducing costs. The protocol-guided use of opioids improved patient care without side effects, while hypnosis had significant psychological benefits.


INTRODUCTION: Reduced cerebral perfusion pressure (CPP) may worsen secondary damage and outcome after severe traumatic brain injury (TBI); however, the optimal management of CPP is still debated. STUDY HYPOTHESIS: We hypothesized that the impact of CPP on outcome is related to the brain tissue oxygen tension (PbtO2) level and that reduced CPP may worsen TBI prognosis when it is associated with brain hypoxia. DESIGN: Retrospective analysis of a prospective database. METHODS: We analyzed 103 patients with severe TBI who underwent continuous PbtO2 and CPP monitoring for an average of 5 days. For each patient, the duration of reduced CPP (<60 mm Hg) and brain hypoxia (PbtO2 <15 mm Hg for >30 min [1]) was calculated with a linear interpolation method, and the relationship between CPP and PbtO2 was analyzed with Pearson's linear correlation coefficient. Outcome at 30 days was assessed with the Glasgow Outcome Score (GOS), dichotomized as good (GOS 4-5) versus poor (GOS 1-3). Multivariable associations with outcome were analyzed with stepwise forward logistic regression. RESULTS: Reduced CPP (n = 790 episodes; mean duration 10.2 ± 12.3 h) was observed in 75 (74%) patients and was frequently associated with brain hypoxia (46/75; 61%). Episodes where reduced CPP was associated with normal brain oxygen did not differ significantly between patients with poor versus those with good outcome (8.2 ± 8.3 vs. 6.5 ± 9.7 h; P = 0.35). In contrast, time where reduced CPP occurred simultaneously with brain hypoxia was longer in patients with poor than in those with good outcome (3.3 ± 7.4 vs. 0.8 ± 2.3 h; P = 0.02). Outcome was significantly worse in patients who had both reduced CPP and brain hypoxia (61% had GOS 1-3 vs. 17% in those with reduced CPP but no brain hypoxia; P < 0.01). Patients in whom a positive CPP-PbtO2 correlation (r > 0.3) was found were also more likely to have a poor outcome (69 vs. 31% in patients with no CPP-PbtO2 correlation; P < 0.01). Brain hypoxia was an independent risk factor for poor prognosis (odds ratio for favorable outcome of 0.89 [95% CI 0.79-1.00] per hour spent with a PbtO2 <15 mm Hg; P = 0.05, adjusted for CPP, age, GCS, Marshall CT and APACHE II). CONCLUSIONS: Low CPP may significantly worsen outcome after severe TBI when it is associated with brain tissue hypoxia. PbtO2-targeted management of CPP may optimize TBI therapy and improve the outcome of head-injured patients.
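
The linear-interpolation step used to compute time spent below a threshold (e.g., CPP <60 mm Hg) from discretely sampled monitor values can be sketched as follows; the sample times, values, and the piecewise-linear assumption between samples are illustrative.

```python
# Hours spent below a threshold, estimating the crossing time between
# consecutive samples by linear interpolation (values assumed to vary
# linearly between sampling points). Sample data are invented.

def hours_below(times, values, threshold):
    total = 0.0
    samples = list(zip(times, values))
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if v0 < threshold and v1 < threshold:
            total += t1 - t0                     # whole segment below
        elif v0 < threshold or v1 < threshold:
            frac = abs(threshold - v0) / abs(v1 - v0)
            cross = t0 + frac * (t1 - t0)        # interpolated crossing
            total += (cross - t0) if v0 < threshold else (t1 - cross)
    return total

# CPP sampled hourly: dips below 60 mmHg between hours 0.5 and 2.5
duration = hours_below([0, 1, 2, 3], [70, 50, 50, 70], threshold=60)
```

The interpolation matters: simply counting samples below threshold would give 2 h here as well, but for asymmetric dips the two estimates diverge.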


OBJECTIVE: To assess the change in non-compliant items in prescription orders following the implementation of a computerized physician order entry (CPOE) system named PreDiMed. SETTING: The department of internal medicine (39 and 38 beds) in two regional hospitals in Canton Vaud, Switzerland. METHOD: The prescription lines in 100 pre- and 100 post-implementation patients' files were classified according to three modes of administration (medicines for oral or other non-parenteral uses; medicines administered parenterally or via nasogastric tube; pro re nata (PRN), as needed) and analyzed for a number of relevant variables constitutive of medical prescriptions. MAIN OUTCOME MEASURE: The monitored variables depended on the pharmaceutical category and included mainly name of medicine, pharmaceutical form, posology and route of administration, diluting solution, flow rate and identification of prescriber. RESULTS: In 2,099 prescription lines, the total number of non-compliant items was 2,265 before CPOE implementation, or 1.079 non-compliant items per line. Two-thirds of these were due to missing information, and the remaining third to incomplete information. In 2,074 prescription lines post-CPOE implementation, the number of non-compliant items had decreased to 221, or 0.107 non-compliant items per line, a dramatic 10-fold decrease (χ² = 4615; P < 10⁻⁶). Limitations of the computerized system were the risk of erroneous items in some non-prefilled fields and ambiguity due to a field with doses shown on commercial products. CONCLUSION: The deployment of PreDiMed in two departments of internal medicine has led to a major improvement in formal aspects of physicians' prescriptions. Some limitations of the first version of PreDiMed were unveiled and are being corrected.
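
A Pearson chi-square on a 2 × 2 table is the kind of test behind a reported χ² value like the one above; a minimal sketch follows. The counts below are invented, since the abstract does not report the number of compliant items.

```python
# Pearson chi-square statistic for a 2x2 contingency table (no Yates
# correction), e.g. rows = before/after CPOE, columns = non-compliant /
# compliant items. The counts in the example are illustrative only.

def chi_square_2x2(a, b, c, d):
    """Table layout: [[a, b], [c, d]]."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den
```

For a table of 10/90 versus 50/50 the statistic is about 38, far above the 3.84 cutoff for significance at P = 0.05 with one degree of freedom.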


BACKGROUND: Screening tests for subclinical cardiovascular disease, such as markers of atherosclerosis, are increasingly used in clinical prevention to identify individuals at high cardiovascular risk. Being aware of these test results might also enhance patient motivation to change unhealthy behaviors, but the effectiveness of such a screening strategy has been poorly studied. METHODS: The CAROtid plaque Screening trial on Smoking cessation (CAROSS) is a randomized controlled trial in 530 regular smokers aged 40-70 years to test the hypothesis that carotid plaque screening will influence smokers' behavior, with an increased rate of smoking cessation (primary outcome) and improved control of other cardiovascular risk factors (secondary outcomes) after 1-year follow-up. All smokers will receive brief advice for smoking cessation, and will subsequently be randomly assigned to either the intervention group (with plaque screening) or the control group (without plaque screening). Carotid ultrasound will be conducted with a standard protocol. Smokers with at least one carotid plaque will receive pictures of their own plaques with a structured explanation of the general significance of plaques. To ensure equal contact conditions, smokers not undergoing ultrasound and those without plaque will receive a relevant explanation of the risks associated with tobacco smoking. Study outcomes will be compared between smokers randomized to plaque screening and smokers not submitted to plaque screening. SUMMARY: This will be the first trial to assess the impact of carotid plaque screening on 1-year smoking cessation rates and levels of control of other cardiovascular risk factors.


BACKGROUND AND AIMS: Critically ill patients with a complicated evolution are frequently hypermetabolic, catabolic, and at risk of underfeeding. The study aimed at assessing the relationship between energy balance and outcome in critically ill patients. METHODS: Prospective observational study conducted in consecutive patients staying ≥5 days in the surgical ICU of a university hospital. Demographic data, time to feeding, route, energy delivery, and outcome were recorded. Energy balance was calculated as energy delivery minus target. Data are given as means ± SD; linear regressions were computed between energy balance and outcome variables. RESULTS: Forty-eight patients aged 57 ± 16 years were investigated; complete data are available for 669 days. Mechanical ventilation lasted 11 ± 8 days, ICU stay was 15 ± 9 days, and 30-day mortality was 38%. Time to feeding was 3.1 ± 2.2 days. Enteral nutrition was the most frequent route, with 433 days. Mean daily energy delivery was 1090 ± 930 kcal. Combining enteral and parenteral nutrition achieved the highest energy delivery. Cumulated energy balance was -12,600 ± 10,520 kcal and correlated with complications (P < 0.001), already after 1 week. CONCLUSION: Negative energy balances were correlated with an increasing number of complications, particularly infections. Energy debt appears to be a promising tool for nutritional follow-up and should be further tested. Delaying the initiation of nutritional support exposes patients to energy deficits that cannot be compensated later on.
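
The abstract's definition — energy balance as delivery minus target, cumulated over the stay — reduces to a few lines. The daily deliveries and the 25 kcal/kg × 70 kg target below are assumptions, not study data.

```python
# Cumulated energy balance = sum over days of (delivered - target),
# as defined in the abstract. Target and daily deliveries are invented:
# here an assumed 25 kcal/kg/day for a 70 kg patient, with feeding
# ramped up over the first week (the pattern behind "energy debt").

target = 25 * 70                                    # kcal/day, assumed
delivered = [0, 400, 800, 1200, 1500, 1700, 1750]   # kcal/day, invented

daily_balance = [d - target for d in delivered]
cumulative_deficit = sum(daily_balance)             # kcal after one week
```

Even though delivery reaches the target by day 7, the first-week deficit of several thousand kcal persists, which is the point of the conclusion: a slow start cannot be compensated later.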


Profiling miRNA levels in cells with miRNA microarrays is becoming a widely used technique. Although normalization methods for mRNA gene expression arrays are well established, miRNA array normalization has so far not been investigated in detail. In this study we investigate the impact of normalization on data generated with the Agilent miRNA array platform. We have developed a method to select nonchanging miRNAs (invariants) and use them to compute linear regression normalization coefficients or variance stabilizing normalization (VSN) parameters. We compared the invariants normalization to normalization by scaling, quantile, and VSN with default parameters as well as to no normalization using samples with strong differential expression of miRNAs (heart-brain comparison) and samples where only a few miRNAs are affected (by p53 overexpression in squamous carcinoma cells versus control). All normalization methods performed better than no normalization. Normalization procedures based on the set of invariants and quantile were the most robust over all experimental conditions tested. Our method of invariant selection and normalization is not limited to Agilent miRNA arrays and can be applied to other data sets including those from one color miRNA microarray platforms, focused gene expression arrays, and gene expression analysis using quantitative PCR.
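
The invariant-based normalization idea can be sketched as follows. This is a simplification, not the authors' algorithm: invariance is reduced to a rank-difference cutoff between sample and reference, and the arrays are invented.

```python
# Simplified invariant-based array normalization: select probes whose
# rank barely changes between a sample and a reference, fit a linear
# regression on those invariants, and rescale the whole sample with it.
# The rank-shift criterion and the data are illustrative assumptions.

def normalize_by_invariants(sample, reference, max_rank_shift=1):
    def ranks(xs):
        order = sorted(range(len(xs)), key=xs.__getitem__)
        return {i: r for r, i in enumerate(order)}

    rs, rr = ranks(sample), ranks(reference)
    inv = [i for i in range(len(sample)) if abs(rs[i] - rr[i]) <= max_rank_shift]

    # least-squares fit: reference ~ slope * sample + intercept, on invariants
    xs = [sample[i] for i in inv]
    ys = [reference[i] for i in inv]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return [slope * v + intercept for v in sample]
```

Fitting only on invariants is what protects the normalization from being dragged by genuinely differentially expressed miRNAs, the failure mode of naive global scaling.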


Bipolar disorder has a major deleterious impact on many aspects of a patient's functioning and health-related quality of life. Although the formal measurement of these deficits has been neglected until recently, many well-designed trials now include an assessment of functioning and health-related quality of life using one or more rating scales. This review describes recent developments in the measurement of functioning and health-related quality of life in bipolar disorder, and discusses the evidence that medications that improve symptoms in bipolar disorder also offer clinically relevant benefits in functioning and health-related quality of life. Direct comparisons of the benefits of medications including atypical antipsychotics are problematic due to differences in trial populations, study durations and rating scales. Data from quetiapine trials indicate that this medication offers prompt and sustained improvement of functioning in patients with mania and enhancement of health-related quality of life in patients with bipolar depression, to accompany the significant improvements in mood episodes.