11 results for Rotary engines at Université de Lausanne, Switzerland
Abstract:
Purpose: The purpose of our multidisciplinary study was to define a pragmatic and secure alternative to the creation of a national centralised medical record, one that could gather together the different parts of a patient's medical record scattered across the different hospitals where the patient has been treated, without any risk of breaching confidentiality. Methods: We first analyse the reasons for the failure, and the dangers, of centralisation (the difficulty of defining a European patient identifier, of agreeing a common standard for the contents of the medical record, and of protecting the data) and then propose an alternative that uses the existing available data, on the premise that setting up a safe though imperfect system may be better than continuing the quest for a mythically perfect information system that two decades of searching have still not produced. Results: We describe the functioning of Medical Record Search Engines (MRSEs), which rely on pseudonymisation of the patient's identity. Upon an MD's request, the MRSE will retrieve and provide all the available information concerning a patient who has been hospitalised in different hospitals, without ever having access to the patient's identity. The drawback of this system is that the medical practitioner must then read all of the information, create his or her own synthesis, and possibly discard superfluous data. Conclusions: Faced with the difficulties and risks of setting up a centralised medical record system, a system that gathers all of the available information concerning a patient could be of great interest. This low-cost, pragmatic alternative, which could be developed quickly, should be taken into consideration by health authorities.
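As a minimal sketch of the retrieval step described above (not the authors' implementation), assume each hospital keeps an index keyed by an opaque pseudonym rather than by the patient's identity, and the MRSE simply unions whatever fragments it finds. All hospital names, pseudonym values and document labels below are invented:

    # Hypothetical per-hospital indexes keyed by pseudonym, never by name.
    hospital_indexes = {
        "hopital_A": {"a41f9c": ["discharge_summary_2011", "imaging_2011"]},
        "hopital_B": {"a41f9c": ["lab_results_2013"]},
    }

    def mrse_retrieve(pseudonym: str) -> list[str]:
        """Return every record fragment filed under this pseudonym, across
        all participating hospitals, without ever touching the identity."""
        fragments: list[str] = []
        for index in hospital_indexes.values():
            fragments.extend(index.get(pseudonym, []))
        return fragments

    print(mrse_retrieve("a41f9c"))  # the MD then reads, synthesises, prunes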
Abstract:
One of the main uses of the Internet today is finding information. Unfortunately, the amount of data available on the Internet is growing exponentially, creating what can be considered a nearly infinite and ever-evolving network with no discernible structure. This rapid growth has raised the question of how to find the most relevant information. Many different techniques have been introduced to address information overload, including search engines, the Semantic Web, and recommender systems, among others. Recommender systems are computer-based techniques used to reduce information overload and to recommend products likely to interest a user, given some information about the user's profile. This technique is mainly used in e-Commerce to suggest items that fit a customer's purchasing tendencies. The use of recommender systems for e-Government is a research topic intended to improve the interaction among public administrations, citizens, and the private sector by reducing information overload on e-Government services. More specifically, e-Democracy aims to increase citizens' participation in democratic processes through the use of information and communication technologies. In this chapter, an architecture for a recommender system that uses fuzzy clustering methods for e-Elections is introduced. In addition, a comparison with the smartvote system, a web-based Voting Assistance Application (VAA) used to help voters find the party or candidate most in line with their preferences, is presented.
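The chapter's exact architecture is not reproduced here, so the following is a minimal sketch of one standard fuzzy technique it could rest on: the fuzzy c-means membership formula, applied to match a voter's answer profile against party positions. The issue questions, positions and fuzzifier value are invented:

    import numpy as np

    def fuzzy_memberships(voter: np.ndarray, centers: np.ndarray,
                          m: float = 2.0) -> np.ndarray:
        """Fuzzy-c-means-style membership of one profile in each cluster
        center: u_k = 1 / sum_j (d_k / d_j) ** (2 / (m - 1))."""
        d = np.linalg.norm(centers - voter, axis=1)
        d = np.maximum(d, 1e-12)        # guard against an exact match
        ratios = (d[:, None] / d[None, :]) ** (2.0 / (m - 1.0))
        return 1.0 / ratios.sum(axis=1)

    # Invented positions on four issue questions, scaled to [0, 1].
    centers = np.array([[0.9, 0.1, 0.8, 0.2],    # party A
                        [0.2, 0.8, 0.3, 0.9],    # party B
                        [0.5, 0.5, 0.5, 0.5]])   # party C
    voter = np.array([0.8, 0.2, 0.7, 0.3])
    u = fuzzy_memberships(voter, centers)
    print(u.round(2), "-> recommend", "ABC"[int(np.argmax(u))])

The memberships sum to 1, so unlike a hard ranking they also convey how decisively the recommendation separates the alternatives.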
Abstract:
Purpose: Although young males encounter sexually related concerns, they are largely absent from specialised services. Our objective was to assess whether boys use the internet to find answers to these types of problems and questions. Methods: In the context of a qualitative study assessing young males' barriers to accessing sexual and reproductive health facilities, we conducted two focus groups with 12 boys aged 17-20. Discussions were triggered by presenting four vignettes corresponding to questions posted by 17-20 year old boys and girls on an information website for adolescents (www.ciao.ch), concerning various sexual dysfunction situations. So that participants would not have to talk about their own experience, they were asked what they would do in those cases. Results: In general, the internet was mentioned extensively, both as a means of searching for information through search engines and as a place to ask professionals for advice. Within the hierarchy of consultation possibilities, the internet was given first place as a way to deal with these types of problems, as it presents many advantages: (1) it allows intimacy to be maintained; (2) it is anonymous (use of a pseudonym); (3) it avoids having to confront someone face-to-face with personal problems, which can be embarrassing and challenging for one's pride; (4) it is free; and (5) it is accessible at all times. In other words, participants value the internet as a positive tool for avoiding many of the barriers that prevent offline consultations from taking place. Most participants consider the internet at least a first step in trying to solve a problem, for instance by better defining the seriousness of a problem and judging whether it is worth consulting a doctor. However, despite the internet's positive qualities, they stressed the importance of having specialists answer questions, of trustworthiness, and of being followed up by the same person. Participants suggested that one strategy for breaking down the barriers that keep boys from consulting in face-to-face settings is to offer a consultation on the internet as a first step, which could then guide the person to an in-person consultation if necessary. Conclusions: The internet as a means of obtaining information or consulting received high marks overall. Although the internet cannot replace an in-person consultation, the screen and keyboard have the advantage of not involving a face-to-face encounter, and they raise the possibility of discussing sexual problems anonymously and in private. Internet tools, together with other new technologies, should continue to be developed in a secure manner as a space providing prevention messages and as an easy access door to sexual and reproductive health services for young men, which can then guide youths to appropriate resource persons. Sources of support: This study was supported by the Maurice Chalumeau Foundation, Switzerland.
Abstract:
INTRODUCTION: A pre-existing narrow spinal canal may have an important place in the etiopathogenesis of lumbar spinal stenosis. Consequently, the study of the development of the spinal canal is crucial. The first goal of this work is to conduct a comprehensive literature search and give an essential overview of the development of the spinal canal and the factors it depends on, as studied to date. The second goal is to offer some considerations and hypothesise new leads for clinically useful research. MATERIALS AND METHODS: A bibliographic search was carried out using different search engines: PubMed, Google Scholar, Ovid and Web of Science. Free sources and those available through the University of Lausanne (UNIL) and the Centre Hospitalier Universitaire Vaudois (CHUV) were used. The bibliographic search yielded 114 references, of which 85 were freely accessible and 41 are cited in this work. Most of the references found are in English or French. RESULTS AND DISCUSSION: The spinal canal is bounded principally by the vertebrae, which are of mesodermal origin. The nervous (ectodermal) tissue significantly influences the growth of the canal. The most important structure participating in spinal canal growth, along almost the entire vertebral column, is the neurocentral synchondrosis. The fusion of the posterior half-arches seems to matter less for canal size. Growth is not homogeneous but depends on the vertebral level: timing, rate and growth potential differ by region. In the lumbar segment in particular, there is a craniocaudal tendency that entails greater post-natal catch-up growth for the distal vertebrae. The trefoil shape of the L5 canal is the consequence of a sagittal growth deficiency. The spinal canal shares developmental characteristics with other structures and systems, especially the central nervous system, which may be a consequence of their embryological origin. Presumably, not all related structures would be affected by a growth impairment, because their catch-up potentials differ. Studies have found that narrower spinal canals may be related to cardiovascular and gastrointestinal symptoms, lower thymic function, bone mineral content, dental hypoplasia and Harris lines. Anthropometric correlations found at birth disappear during the paediatric years. Any factor that can affect bone and nervous growth might be relevant. Genetic predisposition is the only factor that can never be changed, but its real impact remains to be ascertained. During the antenatal period, all the elements determining a good supply of blood and oxygen may influence vertebral canal development, for example smoking during pregnancy. Diet is a crucial factor with an impact on both antenatal and postnatal growth; protein intake is the only dietary relationship proven in the bibliographic search of this work. The mechanical effects of changes in locomotion are unknown. Socioeconomic situation affects several of the influencing factors but is difficult to study owing to numerous biases. CONCLUSIONS: Correct growth of the spinal canal is evidently relevant to preventing non-degenerative stenotic conditions, and a "congenitally" narrower canal may aggravate degenerative stenosis; this concerns specific groups of patients. If the size of the canal is highly involved in the pathogenesis of common back pain, a hypothetical measure preventing developmental impairments could have a non-negligible impact on society.
It would be interesting to learn more about the dietary requirements for good spinal canal development. Understanding the relationship between nervous tissue and the vertebrae might also help identify what is needed for ideal development. The importance of genetics, and the post-natal influence of upright stance on canal growth, remain unsolved questions. All these lines of inquiry serve a double purpose: determining whether the incidence of a narrower spinal canal can be decreased and, consequently, finding possible preventive measures. The development of the vertebral canal is a complex subject that ranges over a wide variety of fields, and knowledge of it is an indispensable tool for understanding and hypothesising the influencing factors that might lead to stenotic conditions. Unfortunately, a lack of information makes it difficult to form a complete and satisfactory interdisciplinary picture.
Abstract:
OBJECTIVE: To evaluate the power of various parameters of the vestibulo-ocular reflex (VOR) in detecting unilateral peripheral vestibular dysfunction and in characterising certain inner ear pathologies. STUDY DESIGN: Prospective study of consecutive ambulatory patients presenting with acute onset of peripheral vertigo and spontaneous nystagmus. SETTING: Tertiary referral center. PATIENTS: Seventy-four patients (40 females, 34 males) and 22 normal subjects (11 females, 11 males) were included in the study. Patients were classified into three main diagnoses: vestibular neuritis (40), viral labyrinthitis (22) and Meniere's disease (12). METHODS: VOR function was evaluated by standard caloric and impulse rotary tests (velocity step). A mathematical model of vestibular function was used to characterise the VOR response to rotational stimulation. The diagnostic value of the different VOR parameters was assessed by uni- and multivariable logistic regression. RESULTS: In univariable analysis, caloric asymmetry emerged as the most powerful VOR parameter for identifying a unilateral vestibular deficit, with a boundary limit set at 20%. In multivariable analysis, the combination of caloric asymmetry and rotational time constant asymmetry significantly improved the discriminatory power over caloric testing alone (p<0.0001) and produced a detection score with a correct classification rate of 92.4%. In discriminating labyrinthine diseases, a different combination of VOR parameters was obtained for each diagnosis (p<0.003), supporting the view that VOR characteristics differ between the three inner ear disorders. However, the clinical usefulness of these characteristics in separating the pathologies was limited. CONCLUSION: We propose a powerful logistic model combining the indices of caloric and time constant asymmetry to detect a peripheral vestibular loss, with an accuracy of 92.4%. Based on vestibular data only, discrimination between the different inner ear diseases is statistically possible, which supports the existence of different pathophysiologic changes in labyrinthine pathologies.
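The detection score combines the two asymmetry indices in a multivariable logistic regression. The sketch below fits such a model on synthetic data; the sample values, coefficients and accuracy it prints are illustrative, not those of the study:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 96                                  # sample size is arbitrary here
    deficit = rng.integers(0, 2, n)         # 1 = unilateral peripheral deficit
    caloric = 10 + 25 * deficit + rng.normal(0, 8, n)      # asymmetry, %
    time_const = 5 + 20 * deficit + rng.normal(0, 10, n)   # asymmetry, %
    X = np.column_stack([caloric, time_const])

    # Detection score = sigmoid(b0 + b1*caloric + b2*time_const).
    model = LogisticRegression()
    acc = cross_val_score(model, X, deficit, cv=5).mean()
    print(f"cross-validated correct classification: {acc:.1%}")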
Abstract:
In order to study the various health-influencing parameters related to engineered nanoparticles, as well as to soot emitted by diesel engines, there is an urgent need for appropriate sampling devices and methods for cell exposure studies that simulate the respiratory system and facilitate the associated biological and toxicological tests. The objective of the present work was the further advancement of a Multiculture Exposure Chamber (MEC) into a dose-controlled system for efficient delivery of nanoparticles to cells. It was validated with various types of nanoparticles (diesel engine soot aggregates, engineered nanoparticles for various applications) and with state-of-the-art nanoparticle measurement instrumentation to assess the local deposition of nanoparticles on the cell cultures. The dose of nanoparticles to which cell cultures are exposed during normal operation of the in vitro exposure chamber was evaluated from measurements of the size-specific nanoparticle collection efficiency of a cell-free device. The average efficiency in delivering nanoparticles in the MEC was approximately 82%. Nanoparticle deposition was demonstrated by Transmission Electron Microscopy (TEM). The analysis and design of the MEC employ Computational Fluid Dynamics (CFD) and true-to-geometry representations of nanoparticles, with the aim of assessing the uniformity of nanoparticle deposition among the culture wells. Final testing of the dose-controlled cell exposure system was performed by exposing A549 lung cell cultures to fluorescently labelled nanoparticles. Delivery of aerosolised nanoparticles was demonstrated by visualising the nanoparticle fluorescence in the cell cultures following exposure. The potential of the aerosolised nanoparticles to generate reactive oxygen species (ROS) (e.g. free radicals and peroxides) was also monitored, as a measure of the oxidative stress of the cells, which can cause extensive cellular damage or damage to DNA.
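The dose evaluation rests on a standard definition: the size-specific collection efficiency is the fraction of entering particles that never reach the chamber outlet, i.e. that deposit inside. A minimal sketch of the arithmetic, with invented concentration values (the paper's instrumentation and data are not reproduced here):

    import numpy as np

    # Hypothetical size-resolved number concentrations (particles/cm^3)
    # measured upstream and downstream of the cell-free chamber.
    diameters_nm = np.array([20, 50, 100, 200])
    c_upstream   = np.array([8000, 12000, 9000, 3000])
    c_downstream = np.array([1500,  2100, 1700,  500])

    # Size-specific collection efficiency: 1 - C_down / C_up.
    efficiency = 1.0 - c_downstream / c_upstream
    print(dict(zip(diameters_nm.tolist(), efficiency.round(2).tolist())))

    # Concentration-weighted average delivery efficiency across sizes.
    avg = np.average(efficiency, weights=c_upstream)
    print(f"average delivery efficiency: {avg:.0%}")  # ~82% with these values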
Abstract:
In this article we propose a mixed management of patients' medical records, sharing responsibilities between the patient and the medical practitioner (MP): patients are made responsible for validating their administrative information, and MPs for validating their patients' medical information. Our proposal can be considered a solution to the main problem faced by patients, health practitioners and the authorities, namely gathering and updating the administrative and medical data belonging to a patient in order to accurately reconstitute the patient's medical history. The method is based on two processes. The aim of the first process is to provide a patient's administrative data, in order to know where and when the patient received care (name of the health structure or health practitioner; type of care, outpatient or inpatient). The aim of the second process is to provide a patient's medical information and to validate it under the accountability of the MP, with the patient's help if needed. During these two processes, the patient's privacy is ensured through cryptographic hash functions such as the Secure Hash Algorithm, which allow pseudonymisation of the patient's identity. The proposed Medical Record Search Engines will be able to retrieve and provide, upon a request formulated by the MP, all the available information concerning a patient who has received care in different health structures, without divulging the patient's identity. Our method can lead to more efficient management of personal medical records under the mixed responsibility of the patient and the MP.
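A minimal sketch of the pseudonymisation step, assuming SHA-256 as the hash and a secret salt shared by the participating structures; the choice of identity traits and the salt handling here are illustrative assumptions, not the authors' specification:

    import hashlib

    def pseudonymise(last_name: str, first_name: str,
                     birth_date: str, salt: str) -> str:
        # Hash the identity traits together with a shared secret salt; only
        # this digest is stored and exchanged, never the identity itself.
        material = f"{salt}|{last_name.upper()}|{first_name.upper()}|{birth_date}"
        return hashlib.sha256(material.encode("utf-8")).hexdigest()

    # The same traits always yield the same digest, which is what lets
    # records from different structures be linked without naming the patient.
    p1 = pseudonymise("Doe", "Jane", "1970-05-12", "secret-salt")
    p2 = pseudonymise("Doe", "Jane", "1970-05-12", "secret-salt")
    assert p1 == p2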
Abstract:
This thesis is devoted to the analysis, modelling and visualisation of spatially referenced environmental data using machine learning algorithms. Machine learning can be considered, in a broad sense, a subfield of artificial intelligence particularly concerned with the development of techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted for application to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, non-linear, robust and efficient modelling tools. They can solve classification, regression and probability density modelling problems in high-dimensional spaces composed of spatially referenced informative variables ("geo-features") in addition to geographical coordinates. Moreover, they are ideally suited to implementation as decision-support tools for environmental questions ranging from pattern recognition to modelling and prediction, including automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and most popular machine learning algorithms are presented theoretically and implemented as software tools for the environmental sciences. The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organising maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are treated both according to the traditional geostatistical approach, with experimental variography, and according to the principles of machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and detects the presence of spatial patterns describable by a statistic. The machine learning approach to ESDA is presented through the application of the k-nearest-neighbours method, which is very simple and has excellent interpretation and visualisation qualities. An important part of the thesis deals with topical subjects such as the automatic mapping of spatial data, for which general regression neural networks are proposed as an efficient solution. The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, on which the GRNN significantly outperformed all other methods, particularly in emergency situations. The thesis consists of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used in teaching numerous courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a wide spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radioactive products and heavy metals; the classification of soil types and hydrogeological units; uncertainty mapping for decision support; and the assessment of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualisation were also developed, with care taken to create a friendly, easy-to-use interface.
Machine Learning for geospatial data: algorithms, software tools and case studies. Abstract: The thesis is devoted to the analysis, modelling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modelling tools. They can find solutions to classification, regression and probability density modelling problems in high-dimensional geo-feature spaces composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for the purposes of environmental data mining including pattern recognition, modelling and prediction, as well as automatic data mapping. They are competitive in efficiency with geostatistical models in low-dimensional geographical spaces but are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multilayer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both a traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations which helps in understanding the presence of spatial patterns, at least those described by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualisation properties. An important part of the thesis deals with a current hot topic, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters, structured as theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed during the last 15 years and used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals; classification of soil types and hydrogeological units; decision-oriented mapping with uncertainties; and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
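The GRNN highlighted above is, in essence, Nadaraya-Watson kernel regression: each prediction is a Gaussian-kernel-weighted average of the training targets, with the kernel width sigma as the single free parameter (in practice tuned, e.g., by cross-validation). A minimal sketch on invented toy spatial data, not the SIC 2004 set:

    import numpy as np

    def grnn_predict(X_train, y_train, X_query, sigma=1.0):
        """GRNN (Nadaraya-Watson form): weight every training target by a
        Gaussian kernel of the distance to the query point, then average."""
        d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return (w @ y_train) / w.sum(axis=1)

    # Toy spatial field: 200 scattered samples of a smooth function of (x, y).
    rng = np.random.default_rng(1)
    X = rng.uniform(0, 10, size=(200, 2))
    y = np.sin(X[:, 0]) + 0.5 * np.cos(X[:, 1]) + rng.normal(0, 0.1, 200)
    grid = np.array([[2.0, 3.0], [7.5, 8.0]])
    print(grnn_predict(X, y, grid, sigma=0.8))

Because the only parameter is sigma, the model lends itself to the automatic mapping setting described above, where there is no analyst available to tune a richer model.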
Abstract:
OBJECTIVE: To evaluate web-based information on bipolar disorder and to assess particular content quality indicators. METHODS: Two keywords, "bipolar disorder" and "manic depressive illness", were entered into popular World Wide Web search engines. Websites were assessed with a standardised proforma designed to rate sites on the basis of accountability, presentation, interactivity, readability and content quality. The "Health on the Net" (HON) quality label and DISCERN scale scores were used to verify their efficiency as quality indicators. RESULTS: Of the 80 websites identified, 34 were included. Based on the outcome measures, the content quality of the sites turned out to be good. The content quality of websites dealing with bipolar disorder is significantly explained by readability, accountability and interactivity, as well as by a global score. CONCLUSIONS: The overall content quality of the studied bipolar disorder websites is good.
Abstract:
The objective of this article is to systematically assess the quality of French-language web-based information on alcohol dependence. Using a standardised proforma, the authors analysed the 20 most highly ranked pages identified by three common internet search engines using two keywords; in total, 45 sites were analysed. The authors conclude that the overall quality of the sites was relatively poor, especially regarding the description of possible treatments, though with wide variability. Content quality was not correlated with other aspects of quality such as interactivity, aesthetics or accountability.
Abstract:
Patients with chronic heart failure who are not eligible for heart transplant and whose life expectancy is limited mainly by the heart disease may benefit from mechanical circulatory support, which restores adequate cardiac output and organ perfusion and ultimately improves the patient's clinical condition, quality of life and life expectancy. This treatment is called destination therapy (DT), and we estimate that in Switzerland more than 120 patients per year could benefit from it. In the last 10 years, device design, implantation techniques and prognoses have changed dramatically. The key to successful therapy with a left ventricular assist device is appropriate patient selection, although reliable inclusion and exclusion criteria and the optimal timing for surgical implantation are still being defined. The devices providing the best long-term results are continuous-flow, rotary or axial blood pumps implanted using minimally invasive techniques on a beating heart. These new devices (Thoratec HeartMate II and HeartWare HVAD) have only a single moving part and improved durability, with virtually 10 years' freedom from mechanical failure. In selected patients, overall actuarial survival with DT is 75% at 1 year and 62% at 2 years, with a clear improvement in quality of life compared with medical management alone. Complications include bleeding and infections; their overall incidence is significantly lower than with previous devices, and their management is well defined. DT is evolving into an effective and reasonably cost-effective treatment option for a growing population of patients not eligible for heart transplant, showing encouraging survival rates at 2 years and providing a clear improvement in quality of life. The future is bright for people suffering from chronic heart failure.