39 results for decision support


Relevance:

60.00%

Publisher:

Abstract:

1.6 STRUCTURE OF THIS THESIS
- Chapter 1 presents the motivations of this dissertation by illustrating two gaps in the current body of knowledge that are worth filling, describes the research problem addressed by this thesis, and presents the research methodology used to achieve this goal.
- Chapter 2 reviews the existing literature, showing that environment analysis is a vital strategic task, that it should be supported by adapted information systems, and that there is thus a need for a conceptual model of the environment that provides a reference framework for better integrating the various existing methods and a more formal definition of the various aspects to support the development of suitable tools.
- Chapter 3 proposes a conceptual model that specifies the various environmental aspects that are relevant for strategic decision making and how they relate to each other, and defines them in a more formal way that is better suited to information systems development.
- Chapter 4 is dedicated to the evaluation of the proposed model through its application to a concrete environment, in order to assess its suitability for describing the current conditions and potential evolution of a real environment and to get an idea of its usefulness.
- Chapter 5 goes a step further by assembling a toolbox describing a set of methods that can be used to analyze the various environmental aspects put forward by the model, and by providing more detailed specifications for a number of them to show how our model can be used to facilitate their implementation as software tools.
- Chapter 6 describes a prototype of a strategic decision support tool that allows the analysis of some aspects of the environment that are not well supported by existing tools, namely the relationships between multiple actors and issues. The usefulness of this prototype is evaluated on the basis of its application to a concrete environment.
- Chapter 7 finally concludes this thesis by summarizing its various contributions and by proposing further interesting research directions.

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVES: Reassessment of ongoing antibiotic therapy is an important step towards appropriate use of antibiotics. This study was conducted to evaluate the impact of a short questionnaire designed to encourage reassessment of intravenous antibiotic therapy after 3 days. PATIENTS AND METHODS: Patients hospitalized on the surgical and medical wards of a university hospital and treated with an intravenous antibiotic for 3-4 days were randomly allocated to either an intervention or a control group. The intervention consisted of mailing to the physician in charge of the patient a three-item questionnaire referring to possible adaptation of the antibiotic therapy. The primary outcome was the time elapsed from randomization until a first modification of the initial intravenous antibiotic therapy. It was compared between the two groups using Cox proportional-hazards modelling. RESULTS: One hundred and twenty-six eligible patients were randomized to the intervention group and 125 to the control group. Time to modification of intravenous antibiotic therapy was 14% shorter in the intervention group (adjusted hazard ratio for modification 1.28, 95% CI 0.99-1.67, P = 0.06). It was significantly shorter in the intervention group compared with a similar group of 151 patients observed during a 2-month period preceding the study (adjusted hazard ratio 1.17, 95% CI 1.03-1.32, P = 0.02). CONCLUSION: The results suggest that a short questionnaire, easily adaptable to automation, has the potential to foster reassessment of antibiotic therapy.
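
The primary-outcome comparison described above (time to first modification of intravenous therapy, compared with a Cox proportional-hazards model) can be illustrated with a minimal sketch; the data file, column names and covariates below are hypothetical, since the abstract does not give the actual analysis code or adjustment set.

```python
# Minimal sketch of the primary-outcome analysis described above, using the
# lifelines library. Column names (days_to_modification, modified, intervention,
# age, surgical_ward) and the data file are hypothetical illustrations.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("antibiotic_reassessment.csv")  # hypothetical per-patient data

cph = CoxPHFitter()
cph.fit(
    df[["days_to_modification", "modified", "intervention", "age", "surgical_ward"]],
    duration_col="days_to_modification",  # time from randomization to first modification
    event_col="modified",                 # 1 if therapy was modified, 0 if censored
)
cph.print_summary()  # hazard ratio for 'intervention' corresponds to the reported 1.28
```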

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: A simple prognostic model could help identify patients with pulmonary embolism who are at low risk of death and are candidates for outpatient treatment. METHODS: We randomly allocated 15,531 retrospectively identified inpatients who had a discharge diagnosis of pulmonary embolism from 186 Pennsylvania hospitals to derivation (67%) and internal validation (33%) samples. We derived our rule to predict 30-day mortality using classification tree analysis and patient data routinely available at initial examination as potential predictor variables. We used data from a European prospective study to externally validate the rule among 221 inpatients with pulmonary embolism. We determined mortality and nonfatal adverse medical outcomes across derivation and validation samples. RESULTS: Our final model consisted of 10 patient factors (age ≥ 70 years; history of cancer, heart failure, chronic lung disease, chronic renal disease, and cerebrovascular disease; and clinical variables of pulse rate ≥ 110 beats/min, systolic blood pressure < 100 mm Hg, altered mental status, and arterial oxygen saturation < 90%). Patients with none of these factors were defined as low risk. The 30-day mortality rates for low-risk patients were 0.6%, 1.5%, and 0% in the derivation, internal validation, and external validation samples, respectively. The rates of nonfatal adverse medical outcomes were less than 1% among low-risk patients across all study samples. CONCLUSIONS: This simple prediction rule accurately identifies patients with pulmonary embolism who are at low risk of short-term mortality and other adverse medical outcomes. Prospective validation of this rule is important before its implementation as a decision aid for outpatient treatment.
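
For illustration, the rule described above can be applied as a simple check that none of the ten factors is present; the sketch below paraphrases the abstract's criteria and is not the authors' validated implementation.

```python
# Illustrative implementation of the prediction rule described above:
# a patient is classified as low risk when none of the ten factors is present.
# Field names are chosen here for readability; they are not from the original study code.
from dataclasses import dataclass

@dataclass
class PEPatient:
    age: int
    history_of_cancer: bool
    heart_failure: bool
    chronic_lung_disease: bool
    chronic_renal_disease: bool
    cerebrovascular_disease: bool
    pulse_rate: int                 # beats/min
    systolic_bp: int                # mm Hg
    altered_mental_status: bool
    oxygen_saturation: float        # arterial O2 saturation, %

def is_low_risk(p: PEPatient) -> bool:
    risk_factors = [
        p.age >= 70,
        p.history_of_cancer,
        p.heart_failure,
        p.chronic_lung_disease,
        p.chronic_renal_disease,
        p.cerebrovascular_disease,
        p.pulse_rate >= 110,
        p.systolic_bp < 100,
        p.altered_mental_status,
        p.oxygen_saturation < 90,
    ]
    return not any(risk_factors)

# Example: a 55-year-old without comorbidities and with normal vital signs is low risk
print(is_low_risk(PEPatient(55, False, False, False, False, False, 88, 125, False, 96.0)))  # True
```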

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: Physicians need a specific risk-stratification tool to facilitate safe and cost-effective approaches to the management of patients with cancer and acute pulmonary embolism (PE). The objective of this study was to develop a simple risk score for predicting 30-day mortality in patients with PE and cancer by using measures readily obtained at the time of PE diagnosis. METHODS: Investigators randomly allocated 1,556 consecutive patients with cancer and acute PE from the international multicenter Registro Informatizado de la Enfermedad TromboEmbólica to derivation (67%) and internal validation (33%) samples. The external validation cohort for this study consisted of 261 patients with cancer and acute PE. Investigators compared 30-day all-cause mortality and nonfatal adverse medical outcomes across the derivation and two validation samples. RESULTS: In the derivation sample, multivariable analyses produced the risk score, which contained six variables: age > 80 years, heart rate ≥ 110/min, systolic BP < 100 mm Hg, body weight < 60 kg, recent immobility, and presence of metastases. In the internal validation cohort (n = 508), the 22.2% of patients (113 of 508) classified as low risk by the prognostic model had a 30-day mortality of 4.4% (95% CI, 0.6%-8.2%) compared with 29.9% (95% CI, 25.4%-34.4%) in the high-risk group. In the external validation cohort, the 18% of patients (47 of 261) classified as low risk by the prognostic model had a 30-day mortality of 0%, compared with 19.6% (95% CI, 14.3%-25.0%) in the high-risk group. CONCLUSIONS: The developed clinical prediction rule accurately identifies low-risk patients with cancer and acute PE.
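
To make the reported interval concrete, the following sketch reproduces the 95% CI around the low-risk 30-day mortality in the internal validation cohort, assuming 5 deaths among 113 patients and a simple normal-approximation (Wald) interval; the exact CI method used by the investigators is not stated in the abstract.

```python
# Reproducing the reported 30-day mortality CI for the low-risk group of the
# internal validation cohort (4.4%, 95% CI 0.6%-8.2%), assuming 5 deaths among
# 113 low-risk patients and a normal-approximation (Wald) interval.
from math import sqrt

deaths, n = 5, 113           # 5/113 ≈ 4.4%
p = deaths / n
se = sqrt(p * (1 - p) / n)
lower, upper = p - 1.96 * se, p + 1.96 * se
print(f"{p:.1%} (95% CI {lower:.1%} to {upper:.1%})")  # ≈ 4.4% (95% CI 0.6% to 8.2%)
```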

Relevance:

60.00%

Publisher:

Abstract:

Introduction: Blood doping (BD) is the use of erythropoietic stimulating agents (ESAs) and/or transfusion to increase aerobic performance in athletes. Direct toxicologic techniques are insufficient to unmask sophisticated doping protocols. The hematological module of the ABP (Athlete Biological Passport, World Anti-Doping Agency) associates decision support technology and expert assessment to indirectly detect the hematological effects of BD. Methods: The ABP module is based on blood parameters, under strict pre-analytical and analytical rules for collection, storage and transport at 2-12°C, with internal and external QC. Accuracy, reproducibility and interlaboratory harmonization fulfill forensic standards. Blood samples are collected in competition and out-of-competition. Primary parameters for longitudinal monitoring are hemoglobin (HGB), reticulocyte percentage (RET), and the OFF score, an indicator of suppressed erythropoiesis, calculated as HGB (g/L) - 60·√(RET%). Statistical calculation predicts individual expected limits by probabilistic inference. Secondary parameters are RBC, HCT, MCHC, MCH, MCV, RDW and IRF. ABP profiles flagged as atypical are reviewed by experts in hematology, pharmacology, sports medicine or physiology, and classified as normal, suspect (to target), likely due to BD, or likely due to pathology. Results: Thousands of athletes worldwide are currently monitored. Since 2010, at least 35 athletes have been sanctioned and others are being prosecuted on the sole basis of an abnormal ABP, with a 240% increase in positivity to direct tests for ESAs thanks to improved targeting of suspicious athletes (WADA data). Specific doping scenarios have been identified by the experts (Table and Figure). Figure. Typical HGB and RET profiles in two highly suspicious athletes. A. Sample 2: simultaneous increases in HGB and RET (likely ESA stimulation) in a male. B. Samples 3, 6 and 7: "OFF" picture, with high HGB and low RET, in a female. Sample 10: normal HGB and increased RET (ESA or blood withdrawal). Conclusions: The ABP is a powerful tool for indirect doping detection, based on the recognition of specific, unphysiological changes triggered by blood doping. The effect of factors of heterogeneity, such as sex and altitude, must also be considered. Schumacher YO, et al. Drug Test Anal 2012, 4:846-853. Sottas PE, et al. Clin Chem 2011, 57:969-976.
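
The OFF score formula appears garbled in the original text; the sketch below uses the commonly published form, HGB (g/L) - 60·√(RET%), which seems to be what was intended, and shows how an "OFF" picture (high HGB with suppressed RET, as after blood withdrawal or cessation of ESA use) raises the score.

```python
# OFF score as commonly published for the Athlete Biological Passport:
# OFF = HGB (g/L) - 60 * sqrt(RET%). This is the standard form and may differ
# slightly from the authors' exact notation, which is garbled in the abstract.
from math import sqrt

def off_score(hgb_g_per_l: float, ret_percent: float) -> float:
    return hgb_g_per_l - 60.0 * sqrt(ret_percent)

# Example: HGB 165 g/L with suppressed reticulocytes (0.3%) gives a high OFF score,
# the "OFF" pattern described above; more typical values give a lower score.
print(off_score(165.0, 0.3))   # ≈ 132.1
print(off_score(145.0, 1.2))   # ≈ 79.3
```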

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: Maintaining therapeutic concentrations of drugs with a narrow therapeutic window is a complex task. Several computer systems have been designed to help doctors determine optimum drug dosage. Significant improvements in health care could be achieved if computer advice improved health outcomes and could be implemented in routine practice in a cost-effective fashion. This is an updated version of an earlier Cochrane systematic review, by Walton et al, published in 2001. OBJECTIVES: To assess whether computerised advice on drug dosage has beneficial effects on the process or outcome of health care. SEARCH STRATEGY: We searched the Cochrane Effective Practice and Organisation of Care Group specialised register (June 1996 to December 2006), MEDLINE (1966 to December 2006), EMBASE (1980 to December 2006), hand-searched the journal Therapeutic Drug Monitoring (1979 to March 2007) and the Journal of the American Medical Informatics Association (1996 to March 2007), as well as reference lists from primary articles. SELECTION CRITERIA: Randomised controlled trials, controlled trials, controlled before-and-after studies and interrupted time series analyses of computerised advice on drug dosage were included. The participants were health professionals responsible for patient care. The outcomes were: any objectively measured change in the behaviour of the health care provider (such as changes in the dose of drug used); any change in the health of patients resulting from computerised advice (such as adverse reactions to drugs). DATA COLLECTION AND ANALYSIS: Two reviewers independently extracted data and assessed study quality. MAIN RESULTS: Twenty-six comparisons (23 articles) were included (compared with fifteen comparisons in the original review), covering a wide range of drugs in inpatient and outpatient settings. Interventions usually targeted doctors, although some studies attempted to influence prescriptions by pharmacists and nurses. Although all studies used reliable outcome measures, their quality was generally low. Computerised advice for drug dosage gave significant benefits by:
1. increasing the initial dose (standardised mean difference 1.12, 95% CI 0.33 to 1.92);
2. increasing serum concentrations (standardised mean difference 1.12, 95% CI 0.43 to 1.82);
3. reducing the time to therapeutic stabilisation (standardised mean difference -0.55, 95% CI -1.03 to -0.08);
4. reducing the risk of toxic drug levels (rate ratio 0.45, 95% CI 0.30 to 0.70);
5. reducing the length of hospital stay (standardised mean difference -0.35, 95% CI -0.52 to -0.17).
AUTHORS' CONCLUSIONS: This review suggests that computerised advice for drug dosage has some benefits: it increased the initial dose of drug, increased serum drug concentrations and led to more rapid therapeutic control. It also reduced the risk of toxic drug levels and the length of time spent in hospital. However, it had no effect on adverse reactions. In addition, there was no evidence to suggest that particular decision support technical features (such as integration into a computerised physician order entry system) or aspects of the organisation of care (such as the setting) could optimise the effect of computerised advice.

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: The writing of prescriptions is an important aspect of medical practice. This activity presents some specific problems, given the danger of misinterpretation and dispensing errors in community pharmacies. The objective of this study was to determine the evolution of the prescription practice and writing quality in the outpatient clinics of our paediatric university hospital. METHODS: Copies of prescriptions written by physicians were collected from community pharmacies in the region of our hospital for a two-month period in 2005 and 2010. They were analysed according to standard criteria, including both formal and pharmaceutical aspects. RESULTS: A total of 597 handwritten prescriptions were reviewed in 2005 and 633 in 2010. They contained 1,456 drug prescriptions in 2005 and 1,348 in 2010. Fifteen drugs accounted for 80% of all prescriptions, and the most common drugs were paracetamol and ibuprofen. A higher proportion of drugs were prescribed as International Nonproprietary Names (INN) or generics in 2010 (24.7%) compared with 2005 (20.9%). Of the drug prescriptions examined, 55.5% were incomplete in 2005 and 69.2% in 2010. Moreover, in 2005, 3.2% were legible only with difficulty, 22.9% were ambiguous, and 3.0% contained an error. These proportions rose to 5.2%, 27.8%, and 6.8%, respectively, in 2010. CONCLUSION: This study showed that fifteen different drugs represented the majority of prescriptions, that a quarter of them were prescribed as INN or generics in 2010, and that handwritten prescriptions contained numerous omissions and preventable errors. In our hospital, computerised prescribing coupled with advanced decision support is eagerly awaited.

Relevance:

60.00%

Publisher:

Abstract:

The new recommendations on the pharmacological treatment of type 2 diabetes have introduced two important changes. The first is to have common strategies between the European and American diabetes societies. The second, which is certainly the most significant, is to promote a patient-centred approach, suggesting therapies that take into account the patient's preferences and the use of decision support tools. The individual approach integrates six factors: the capacity and motivation of the patient to manage his illness and its treatment, the risk of hypoglycemia, life expectancy, the presence of co-morbidities and vascular complications, as well as the financial resources of the patient and of the healthcare system. Treatment guidelines for cardiovascular risk reduction in diabetic patients remain the last point to be developed.

Relevance:

60.00%

Publisher:

Abstract:

In recent years many clinical prediction rules (CPRs) have been developed. Before a CPR can be used in clinical practice, different methodological steps are necessary, from the development of the score through internal and external validation to the impact study. Before using a CPR in daily practice, family doctors have to verify how the rule was developed and whether this was done in a population similar to the one in which they would use it. The aim of this paper is to describe the development of a CPR and to discuss the advantages and risks related to the use of CPRs, in order to help family doctors in their choice of scores for use in their daily practice.

Relevance:

60.00%

Publisher:

Abstract:

The research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). As a benchmark model, a simple k-nearest neighbor algorithm is considered. PNN is a neural network reformulation of well-known nonparametric principles of probability density modeling using a kernel density estimator and Bayesian optimal or maximum a posteriori decision rules. PNN is well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNNs is that they can easily be used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVMs were successfully applied to different environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper both simulated and real data case studies (low- and high-dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the algorithms applied.
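
A compact sketch of the PNN idea described above follows: a Gaussian kernel density estimate per class, combined with class priors and a maximum a posteriori decision. Bandwidth handling is simplified relative to a full PNN, and the data are synthetic.

```python
# Minimal probabilistic neural network (PNN) in the sense described above:
# a Parzen (Gaussian kernel) density estimate per class, combined with class
# priors and a maximum a posteriori decision. Bandwidth sigma is a user choice;
# real applications would tune it (e.g. by cross-validation).
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=1.0):
    classes = np.unique(y_train)
    posteriors = np.zeros((len(X_test), len(classes)))
    for j, c in enumerate(classes):
        Xc = X_train[y_train == c]
        prior = len(Xc) / len(X_train)
        # squared Euclidean distances between each test point and the class samples
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        density = np.exp(-d2 / (2.0 * sigma**2)).mean(axis=1)
        posteriors[:, j] = prior * density
    return classes[np.argmax(posteriors, axis=1)], posteriors

# Tiny synthetic 2D example (two spatial clusters)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
labels, _ = pnn_predict(X, y, np.array([[0.5, 0.2], [3.8, 4.1]]), sigma=0.8)
print(labels)  # expected: [0 1]
```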

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: The Marburg Heart Score (MHS) aims to assist GPs in safely ruling out coronary heart disease (CHD) in patients presenting with chest pain, and to guide management decisions. AIM: To investigate the diagnostic accuracy of the MHS in an independent sample and to evaluate the generalisability to new patients. DESIGN AND SETTING: Cross-sectional diagnostic study with delayed-type reference standard in general practice in Hesse, Germany. METHOD: Fifty-six German GPs recruited 844 males and females aged ≥ 35 years, presenting between July 2009 and February 2010 with chest pain. Baseline data included the items of the MHS. Data on the subsequent course of chest pain, investigations, hospitalisations, and medication were collected over 6 months and were reviewed by an independent expert panel. CHD was the reference condition. Measures of diagnostic accuracy included the area under the receiver operating characteristic curve (AUC), sensitivity, specificity, likelihood ratios, and predictive values. RESULTS: The AUC was 0.84 (95% confidence interval [CI] = 0.80 to 0.88). For a cut-off value of 3, the MHS showed a sensitivity of 89.1% (95% CI = 81.1% to 94.0%), a specificity of 63.5% (95% CI = 60.0% to 66.9%), a positive predictive value of 23.3% (95% CI = 19.2% to 28.0%), and a negative predictive value of 97.9% (95% CI = 96.2% to 98.9%). CONCLUSION: Considering the diagnostic accuracy of the MHS, its generalisability, and ease of application, its use in clinical practice is recommended.
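
As a worked illustration of the reported measures of diagnostic accuracy, the sketch below derives them from a 2x2 table of test result versus reference diagnosis; the counts are hypothetical, chosen only to be roughly consistent with the accuracy figures above, and are not the study data.

```python
# How the reported measures (sensitivity, specificity, predictive values,
# likelihood ratios) follow from a 2x2 table of score result vs. reference CHD
# diagnosis. The counts below are made up for illustration only.
def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "LR+": sens / (1 - spec),
        "LR-": (1 - sens) / spec,
    }

# Hypothetical counts for an MHS cut-off of 3 (positive = score >= 3)
print(diagnostic_accuracy(tp=82, fp=270, fn=10, tn=470))
```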

Relevance:

60.00%

Publisher:

Abstract:

Introduction: The writing of prescriptions is an important aspect of medical practice. Since 2006, the Swiss authorities have decided to impose incentives to prescribe generic drugs. The objectives of this study were 1) to determine the evolution of the outpatient prescription practice in our paediatric university hospital during 2 periods separated by 5 years; 2) to assess the writing quality of outpatient prescriptions during the same period. Materials & Methods: Design: Copies of prescriptions written by physicians were collected twice from community pharmacies in the region of our hospital, for a 2-month period in 2005 and 2010. They were analysed according to standard criteria regarding both formal and pharmaceutical aspects. Drug prescriptions were classified as a) complete when all criteria for safety were fulfilled, b) ambiguous when there was a danger of a dispensing error because of one or more missing criteria, or c) containing an error. Setting: Paediatric university hospital. Main outcome measures: Proportion of generic drugs; outpatient prescription writing quality. Results: A total of 651 handwritten prescriptions were reviewed in 2005 and 693 in 2010. They contained 1570 drug prescriptions in 2005 (2.4 ± 1.2 drugs per patient) and 1462 in 2010 (2.1 ± 1.1). The most common drugs were paracetamol, ibuprofen, and sodium chloride. A higher proportion of drugs were prescribed as generic names or generics in 2010. Formal data regarding the physicians and the patients were almost complete, except for the patients' weight. Of the drug prescriptions, 48.5% were incomplete, 11.3% were ambiguous, and 3.0% contained an error in 2005. These proportions rose to 64.2%, 15.5% and 7.4%, respectively, in 2010. Discussion, Conclusion: This study showed that physicians' prescriptions comprised numerous omissions and errors with minimal potential for harm. Computerised prescribing coupled with advanced decision support is eagerly awaited. Disclosure of Interest: None declared.

Relevance:

60.00%

Publisher:

Abstract:

It is estimated that around 230 people die each year due to radon (222Rn) exposure in Switzerland. 222Rn occurs mainly in closed environments like buildings and originates primarily from the subjacent ground. It therefore depends strongly on geology and shows substantial regional variations. Correct identification of these regional variations would lead to a substantial reduction of the 222Rn exposure of the population, based on appropriate construction of new buildings and mitigation of existing ones. Prediction of indoor 222Rn concentrations (IRC) and identification of 222Rn-prone areas is however difficult, since IRC depend on a variety of variables such as building characteristics, meteorology, geology and anthropogenic factors. The present work aims at the development of predictive models and the understanding of IRC in Switzerland, taking into account a maximum of information in order to minimize the prediction uncertainty. The predictive maps will be used as a decision-support tool for 222Rn risk management. The construction of these models is based on different data-driven statistical methods, in combination with geographical information systems (GIS). In a first phase we performed univariate analyses of IRC for different variables, namely the detector type, building category, foundation, year of construction, the average outdoor temperature during measurement, altitude and lithology. All variables showed significant associations with IRC. Buildings constructed after 1900 showed significantly lower IRC compared with earlier constructions, and we observed a further drop of IRC after 1970. In addition, we found an association of IRC with altitude. With regard to lithology, we observed the lowest IRC in sedimentary rocks (excluding carbonates) and sediments, and the highest IRC in the Jura carbonates and igneous rocks. The IRC data were systematically analyzed for potential bias due to spatially unbalanced sampling of measurements. In order to facilitate the modeling and the interpretation of the influence of geology on IRC, we developed an algorithm based on k-medoids clustering which permits the definition of geological classes that are coherent in terms of IRC. We also performed a soil gas 222Rn concentration (SRC) measurement campaign in order to determine the predictive power of SRC with respect to IRC, and found that the use of SRC for IRC prediction is limited. The second part of the project was dedicated to predictive mapping of IRC using models which take into account the multidimensionality of the process of 222Rn entry into buildings. We used kernel regression and ensemble regression trees for this purpose. We could explain up to 33% of the variance of the log-transformed IRC over all of Switzerland, which is a good performance compared with former attempts at IRC modeling in Switzerland. As predictor variables we considered geographical coordinates, altitude, outdoor temperature, building type, foundation, year of construction and detector type. Ensemble regression trees like random forests make it possible to determine the role of each IRC predictor in a multidimensional setting; we found spatial information like geology, altitude and coordinates to have a stronger influence on IRC than building-related variables like foundation type, building type and year of construction. Based on kernel estimation we developed an approach to determine the local probability of IRC exceeding 300 Bq/m3. In addition, we developed a confidence index in order to provide an estimate of the uncertainty of the map.
All methods allow easy creation of tailor-made maps for different building characteristics. Our work is an essential step towards a 222Rn risk assessment which accounts at the same time for different architectural situations as well as geological and geographical conditions. For the communication of the 222Rn hazard to the population, we recommend making use of the probability map based on kernel estimation. The communication of the 222Rn hazard could, for example, be implemented via a web interface where users specify the characteristics and coordinates of their home in order to obtain the probability of exceeding a given IRC, with a corresponding index of confidence. Taking into account the health effects of 222Rn, our results have the potential to substantially improve the estimation of the effective dose from 222Rn delivered to the Swiss population.
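
One of the modelling steps described above (an ensemble regression tree model of log-transformed IRC, with feature importances to compare the roles of spatial and building-related predictors) could look roughly like the sketch below; the data file, column names and predictor coding are hypothetical simplifications of the original setup.

```python
# Minimal sketch of one modelling step described above: a random forest
# regression of log-transformed indoor radon concentration (IRC) on spatial and
# building-related predictors, with feature importances to compare their roles.
# Column names and the data file are hypothetical; the original predictor coding
# (e.g. of geology classes) is more elaborate than shown here.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

df = pd.read_csv("irc_measurements.csv")          # hypothetical measurement table
predictors = ["x", "y", "altitude", "outdoor_temp",
              "building_type", "foundation_type", "construction_year"]
X = pd.get_dummies(df[predictors], columns=["building_type", "foundation_type"])
y = np.log(df["irc_bq_m3"])

rf = RandomForestRegressor(n_estimators=500, random_state=0)
print("R^2 (cross-validated):", cross_val_score(rf, X, y, cv=5).mean())

rf.fit(X, y)
for name, imp in sorted(zip(X.columns, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")   # spatial variables expected to rank high, as reported
```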

Relevance:

60.00%

Publisher:

Abstract:

Automatic environmental monitoring networks reinforced by wireless communication technologies provide large and ever-increasing volumes of data nowadays. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are machine learning algorithms, which are aimed at robust non-parametric modelling of non-linear dependencies from empirical data. The computational efficiency of the data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered in particular in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topography of mountainous regions, meteorological processes are strongly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. A particular illustration of the developed methodology concerns the mapping of temperatures (including situations of Föhn and temperature inversion) given the measurements taken from the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
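
A short sketch of the support-vector step for topo-climatic temperature mapping from DEM-derived features follows; file names, feature names and hyperparameters are hypothetical, and the original study also used data-driven feature selection and artificial neural networks, which are not shown here.

```python
# Sketch of the support-vector step for topo-climatic mapping: support vector
# regression of measured temperature on coordinates and DEM-derived terrain
# features, then prediction on a regular grid. File and column names are
# hypothetical illustrations, not the original study's data or settings.
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

stations = pd.read_csv("meteo_stations.csv")   # hypothetical station measurements
grid = pd.read_csv("dem_grid.csv")             # hypothetical prediction grid with the same features

features = ["x", "y", "elevation", "slope", "curvature"]  # DEM-derived terrain features
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(stations[features], stations["temperature"])

grid["temperature_pred"] = model.predict(grid[features])
grid.to_csv("temperature_map.csv", index=False)  # the resulting topo-climatic map
```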