961 results for medical applications
Abstract:
So-called online Voting Advice Applications (VAAs) have become very popular across Europe. Millions of voters use them as an aid in deciding which party to vote for. Despite this popularity, there are only very few studies on the impact of these tools on individual electoral choice. On the basis of the Swiss VAA smartvote, we present first findings on the question of whether VAAs have a direct impact on the actual vote of their users. Indeed, we find strong evidence that Swiss voters were affected by smartvote. However, our findings are somewhat contrary to the results of previous studies from other countries. Furthermore, the quality of the data available for such studies needs to be improved. Future studies should pay attention both to the improvement of the available data and to the explanation of the large variance in findings between specific European countries.
Abstract:
Ethics, particularly in its theological form, faces formidable challenges today. Constantly solicited by the public and the media, it calls for an intelligence of faith, an analytical capacity, a mobilization of reason and an engagement of the emotions. The present work aims to combine three requirements: theoretical, figurative and practical. The theoretical requirement takes up afresh the question of the foundations of ethics, at the interface of rationality, faith and theology. The figurative requirement, not unrelated to the genealogical approach, clarifies what is at stake in the dialogue the theologian conducts with various forms of philosophical reflection. The practical requirement, finally, ties the threads, never lost from view, back to the experience and existence of human beings and societies, on the basis of several exemplary cases of applied ethics: the status of the embryo, the understanding of illness and health, the definition of death, organ transplantation, social engagement, drug addiction, etc.
Abstract:
The recent advances in sequencing technologies have given all microbiology laboratories access to whole genome sequencing. Providing that tools for the automated analysis of sequence data and databases for associated meta-data are developed, whole genome sequencing will become a routine tool for large clinical microbiology laboratories. Indeed, the continuing reduction in sequencing costs and the shortening of the 'time to result' makes it an attractive strategy in both research and diagnostics. Here, we review how high-throughput sequencing is revolutionizing clinical microbiology and the promise that it still holds. We discuss major applications, which include: (i) identification of target DNA sequences and antigens to rapidly develop diagnostic tools; (ii) precise strain identification for epidemiological typing and pathogen monitoring during outbreaks; and (iii) investigation of strain properties, such as the presence of antibiotic resistance or virulence factors. In addition, recent developments in comparative metagenomics and single-cell sequencing offer the prospect of a better understanding of complex microbial communities at the global and individual levels, providing a new perspective for understanding host-pathogen interactions. Being a high-resolution tool, high-throughput sequencing will increasingly influence diagnostics, epidemiology, risk management, and patient care.
Abstract:
We prove a criterion for the irreducibility of an integral group representation ρ over the fraction field of a noetherian domain R in terms of suitably defined reductions of ρ at prime ideals of R. As applications, we give irreducibility results for universal deformations of residual representations, with special attention to universal deformations of residual Galois representations associated with modular forms of weight at least 2.
Abstract:
Occupational exposure assessment is an important stage in the management of chemical exposures. Few direct measurements are carried out in workplaces, and exposures are often estimated based on expert judgements.
There is therefore a major need for simple, transparent tools to help occupational health specialists define exposure levels. The aim of the present research is to develop and improve modelling tools in order to predict exposure levels. In a first step, a survey was made among professionals to define their expectations of modelling tools (types of results, models and potential observable parameters). It was found that models are rarely used in Switzerland and that exposures are mainly estimated from the past experience of the expert. Moreover, chemical emissions and their dispersion near the source were considered key parameters. Experimental and modelling studies were also performed in some specific cases in order to test the flexibility and drawbacks of existing tools. In particular, models were applied to assess occupational exposure to CO in different situations, and the results were compared with the exposure levels found in the literature for similar situations. Further, exposure to waterproofing sprays was studied as part of an epidemiological study on a Swiss cohort. In this case, some laboratory investigations were undertaken to characterize the waterproofing overspray emission rate. A classical two-zone model was used to assess the aerosol dispersion in the near and far field during spraying. Experiments were also carried out to better understand the processes of emission and dispersion of tracer compounds, focusing on the characterization of near-field exposure. An experimental set-up was developed to perform simultaneous measurements at several points using direct-reading instruments. It was mainly found that, from a statistical point of view, the compartmental theory makes sense, but that the attribution to a given compartment could not be done by simple geometric considerations.
In a further step, the experimental data were completed by observations made in about 100 different workplaces, including exposure measurements and observation of predefined determinants. The various data obtained were used to improve an existing two-compartment exposure model. A tool was developed to include specific determinants in the choice of the compartment, thus largely improving the reliability of the predictions. All these investigations helped improve our understanding of modelling tools and identify their limitations. The integration of more accessible determinants, in accordance with experts' needs, may indeed enhance model application in field practice. Moreover, by increasing the quality of modelling tools, this research will not only encourage their systematic use, but might also improve the conditions in which expert judgements take place, and therefore the protection of workers' health.
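The classical two-zone (near-field/far-field) model used in this work has a simple closed form at steady state: the far-field concentration is the emission rate over the room ventilation rate, and the near field adds an increment governed by the interzonal airflow. A minimal sketch, where the symbol names and all numeric values are illustrative assumptions, not values from the study:

```python
def two_zone_steady_state(G: float, Q: float, beta: float) -> tuple:
    """Steady-state concentrations of the classical two-zone model.

    G    -- contaminant emission rate (e.g. mg/min)
    Q    -- room ventilation rate (e.g. m^3/min)
    beta -- airflow between the near field and the far field (m^3/min)

    Far field:  C_FF = G / Q
    Near field: C_NF = G / Q + G / beta  (far field plus near-field increment)
    """
    c_ff = G / Q
    c_nf = c_ff + G / beta
    return c_nf, c_ff

# Illustrative values only: 10 mg/min source, 20 m^3/min ventilation,
# 5 m^3/min interzonal airflow.
c_nf, c_ff = two_zone_steady_state(G=10.0, Q=20.0, beta=5.0)
print(c_nf, c_ff)  # near field 2.5 mg/m^3, far field 0.5 mg/m^3
```

The structure makes the model's key property visible: a worker close to the source sees the far-field background plus a term that depends only on the emission rate and the interzonal airflow, which is why those two determinants matter so much in the survey responses.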
Abstract:
Background: The modulation of energetic homeostasis by pollutants has recently emerged as a potential contributor to the onset of metabolic disorders. Diethylhexyl phthalate (DEHP) is a widely used industrial plasticizer to which humans are widely exposed. Phthalates can activate the three peroxisome proliferator-activated receptor (PPAR) isotypes in cellular models and induce peroxisome proliferation in rodents. Objectives: In this study, we aimed to evaluate the systemic and metabolic consequences of DEHP exposure, which have so far remained unexplored, and to characterize the underlying molecular mechanisms of action. Methods: As a proof of concept and mechanism, genetically engineered mouse models of PPARs were exposed to high doses of DEHP, followed by metabolic and molecular analyses. Results: DEHP-treated mice were protected from diet-induced obesity via PPARalpha-dependent activation of hepatic fatty acid catabolism, whereas the activity of neither PPARbeta nor PPARgamma was affected. However, the lean phenotype observed in response to DEHP in wild-type mice was surprisingly abolished in PPARalpha-humanized mice. These species differences are associated with a different pattern of coregulator recruitment. Conclusion: These results demonstrate that DEHP exerts species-specific metabolic actions that rely to a large extent on PPARalpha signaling, and they highlight the metabolic importance of the species-specific activation of PPARalpha by xenobiotic compounds.
Editor's Summary: Diethylhexyl phthalate (DEHP) is an industrial plasticizer used in cosmetics, medical devices, food packaging, and other applications. Evidence that DEHP metabolites can activate peroxisome proliferator-activated receptors (PPARs) involved in fatty acid oxidation (PPARalpha and PPARbeta) and in adipocyte function and insulin resistance (PPARgamma) has raised concerns about potential effects of DEHP on metabolic homeostasis.
In rodents, PPARalpha activation also induces hepatic peroxisome proliferation, but this response to PPARalpha activation is not observed in humans. Feige et al. (p. 234) evaluated the systemic and metabolic consequences of high-dose oral DEHP in combination with a high-fat diet in wild-type mice and genetically engineered mouse PPAR models. The authors report that mice exposed to DEHP gained less weight than controls, without modifying their feeding behavior; they also exhibited lower triglyceride levels, smaller adipocytes, and improved glucose tolerance compared with controls. These effects, which were observed in mice fed both high-fat and standard diets, appeared to be mediated by PPARalpha-dependent activation of hepatic fatty acid catabolism, without apparent involvement of PPARbeta or PPARgamma. However, mouse models that expressed human (versus mouse) PPARalpha tended to gain more weight on a high-fat diet than their DEHP-unexposed counterparts. The authors conclude that these findings support species-specific metabolic effects of DEHP mediated by PPARalpha activation.
Abstract:
A review article in The New England Journal of Medicine relates that almost a century ago, Abraham Flexner, a research scholar at the Carnegie Foundation for the Advancement of Teaching, undertook an assessment of medical education in the 155 medical schools in operation in the United States and Canada. Flexner's report emphasized the nonscientific approach of American medical schools to preparation for the profession, which contrasted with the university-based system of medical education in Germany. At the core of Flexner's view was the notion that formal analytic reasoning, the kind of thinking integral to the natural sciences, should hold pride of place in the intellectual training of physicians. This idea was pioneered at Harvard University, the University of Michigan, and the University of Pennsylvania in the 1880s, but was most fully expressed in the educational program at Johns Hopkins University, which Flexner regarded as the ideal for medical education. (...)
Abstract:
BACKGROUND: QT interval prolongation carries an increased risk of torsade de pointes and death. AIM: We sought to determine the prevalence of QT prolongation in medical inpatients and to identify determinants of this condition. METHODS: We enrolled consecutive patients who were admitted to the internal medicine ward and who had an electrocardiogram performed within 24 h of admission. We collected information on baseline patient characteristics and the use of QT-prolonging drugs. Two blinded readers manually measured the QT intervals. QT intervals were corrected for heart rate using the traditional Bazett formula and the linear regression-based Framingham formula. We used logistic regression to identify patient characteristics and drugs that were independently associated with QTc prolongation. RESULTS: Of 537 inpatients, 22.3% had a prolonged QTc based on the Bazett formula. The adjusted odds for QTc prolongation based on the Bazett correction were significantly higher in patients who had liver disease (OR 2.9, 95% CI: 1.5-5.6), hypokalaemia (OR 3.3, 95% CI: 1.9-5.6) and who were taking ≥1 QT-prolonging drug at admission (OR 1.7, 95% CI: 1.1-2.6). Overall, 50.8% of patients with QTc prolongation received additional QT-prolonging drugs during hospitalisation. CONCLUSIONS: The prevalence of QTc prolongation was high among medical inpatients but depended on the method used to correct for heart rate. The use of QT-prolonging drugs, hypokalaemia and liver disease increased the risk of QTc prolongation. Many patients with QTc prolongation received additional QT-prolonging drugs during hospitalisation, further increasing the risk of torsade de pointes and death.
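The two rate-correction formulas named in the Methods are standard: Bazett divides QT by the square root of the RR interval, while Framingham applies a linear regression-based adjustment. A minimal sketch with QT and RR in seconds; the example values are illustrative, not data from the study:

```python
import math

def qtc_bazett(qt_s: float, rr_s: float) -> float:
    """Bazett correction: QTc = QT / sqrt(RR), QT and RR in seconds."""
    return qt_s / math.sqrt(rr_s)

def qtc_framingham(qt_s: float, rr_s: float) -> float:
    """Framingham linear correction: QTc = QT + 0.154 * (1 - RR)."""
    return qt_s + 0.154 * (1.0 - rr_s)

# Example: QT = 0.40 s at heart rate 75 bpm (RR = 60/75 = 0.80 s)
qt, rr = 0.40, 0.80
print(round(qtc_bazett(qt, rr), 3))      # 0.447
print(round(qtc_framingham(qt, rr), 3))  # 0.431
```

The example also shows why the abstract notes that prevalence "depended on the method used": at fast heart rates (RR < 1 s) Bazett's square-root correction yields longer QTc values than the linear Framingham formula, so it classifies more patients as prolonged.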
Abstract:
BACKGROUND: Chest pain is a common complaint in primary care, with coronary heart disease (CHD) being the most concerning of many potential causes. Systematic reviews on the sensitivity and specificity of symptoms and signs summarize the evidence about which of them are most useful in making a diagnosis. Previous meta-analyses are dominated by studies of patients referred to specialists. Moreover, as the analysis is typically based on study-level data, the statistical analyses in these reviews are limited while meta-analyses based on individual patient data can provide additional information. Our patient-level meta-analysis has three unique aims. First, we strive to determine the diagnostic accuracy of symptoms and signs for myocardial ischemia in primary care. Second, we investigate associations between study- or patient-level characteristics and measures of diagnostic accuracy. Third, we aim to validate existing clinical prediction rules for diagnosing myocardial ischemia in primary care. This article describes the methods of our study and six prospective studies of primary care patients with chest pain. Later articles will describe the main results. METHODS/DESIGN: We will conduct a systematic review and IPD meta-analysis of studies evaluating the diagnostic accuracy of symptoms and signs for diagnosing coronary heart disease in primary care. We will perform bivariate analyses to determine the sensitivity, specificity and likelihood ratios of individual symptoms and signs and multivariate analyses to explore the diagnostic value of an optimal combination of all symptoms and signs based on all data of all studies. We will validate existing clinical prediction rules from each of the included studies by calculating measures of diagnostic accuracy separately by study. DISCUSSION: Our study will face several methodological challenges. First, the number of studies will be limited. 
Second, the investigators of original studies defined some outcomes and predictors differently. Third, the studies did not collect the same standard clinical data set. Fourth, missing data, varying from partly missing to fully missing, will have to be dealt with. Despite these limitations, we aim to summarize the available evidence regarding the diagnostic accuracy of symptoms and signs for diagnosing CHD in patients presenting with chest pain in primary care. REVIEW REGISTRATION: Centre for Reviews and Dissemination (University of York): CRD42011001170.
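The accuracy measures the protocol plans to compute (sensitivity, specificity, likelihood ratios) all derive from a 2x2 table of symptom presence versus the reference-standard diagnosis. A minimal sketch; the counts are made up for illustration and are not data from any included study:

```python
def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity and likelihood ratios from a 2x2 table.

    tp/fn: diseased patients with a positive/negative finding
    fp/tn: non-diseased patients with a positive/negative finding
    """
    sens = tp / (tp + fn)          # P(finding present | disease present)
    spec = tn / (tn + fp)          # P(finding absent  | disease absent)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": sens / (1 - spec),  # factor by which a positive finding raises the odds
        "LR-": (1 - sens) / spec,  # factor by which a negative finding lowers the odds
    }

# Hypothetical counts for one symptom against a CHD reference standard
m = diagnostic_accuracy(tp=80, fp=30, fn=20, tn=170)
print(m)  # sensitivity 0.80, specificity 0.85, LR+ ~5.33, LR- ~0.24
```

In an IPD meta-analysis these quantities are estimated jointly across studies (e.g. with a bivariate random-effects model, since sensitivity and specificity are correlated), but each study's contribution reduces to a table like this one.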
Abstract:
The assessment of medical technologies has to answer several questions, ranging from safety and effectiveness to complex economic, social, and health policy issues. The type of data needed to carry out such an evaluation depends on the specific questions to be answered, as well as on the stage of development of a technology. Basically, two types of data may be distinguished: (a) general demographic, administrative, or financial data that were not collected specifically for technology assessment; (b) data collected with respect either to a specific technology or to a disease or medical problem. On the basis of a pilot inquiry in Europe and bibliographic research, the following categories of type (b) databases have been identified: registries, clinical databases, banks of factual and bibliographic knowledge, and expert systems. Examples of each category are discussed briefly. The following aims for further research and practical goals are proposed: criteria for the minimal data set required, improvements to registries and clinical data banks, and the development of an international clearinghouse to enhance the diffusion of information on both existing databases and available reports on medical technology assessments.
Abstract:
With the growing volume of magnetic resonance scans being generated, radiology services need applications that ease remote access to the data and to the tools they use for data extraction when making their diagnoses. The goal of this project is to study and integrate into PICNIC (PIC NeuroImaging Center), the web platform of the Medical Imaging group at PIC, a set of applications for the study and processing of neuroimaging, implementing the software tools on the PIC grid platform.