956 results for Probabilistic robotics


Relevance:

10.00%

Publisher:

Abstract:

Laparoscopic surgery has become a standard approach for many interventions, including oncologic surgery. Laparoscopic instruments have been developed to allow advanced surgical procedures. Imaging and computer assistance, whether through virtual reality or robotic procedures, will certainly improve access to this surgery.

Relevance:

10.00%

Publisher:

Abstract:

Chronic atrial fibrillation affects millions of people worldwide. Its surgical treatment often fails to restore the transport function of the atrium. This study first introduces the concept of an atrial assist device (AAD) to restore the pump function of the atrium. The AAD is designed to be totally implantable in the human body, with a transcutaneous energy transfer system to recharge the implanted battery. The AAD consists of a motorless pump based on artificial muscle technology, positioned on the external surface of the atrium to compress it and restore its muscular activity. A bench model reproduces the function of a fibrillating atrium to assess the circulatory support that this pump can provide. Atripump (Nanopowers SA, Switzerland) is a dome-shaped, silicone-coated nitinol actuator 5 mm high, sutured onto the external surface of the atrium. A pacemaker-like control unit drives the actuator, which compresses the atrium and provides mechanical support to the blood circulation. Electrical characteristics: the system is composed of one actuator that requires a minimum voltage of 15 V and draws a maximum current of 1.5 A at a 50% duty cycle. The implantable rechargeable battery is made of a cell with the following specifications: nominal cell voltage 4.1 V, voltage after 90% discharge 3.5 V, nominal cell capacity 163 mAh. The bench model consists of an open circuit made of a latex bladder 60 mm in diameter filled with water. The bladder is connected to a vertically positioned tube that is filled to different levels, reproducing changes in cardiac preload. The Atripump is placed on the outer surface of the bladder. Pressure, volume and temperature changes were recorded. The contraction rate was 1 Hz with a power supply of 12 V and 400 mA for 200 ms. Preload ranged from 15 to 21 cmH2O. Maximal silicone membrane temperature was 55 °C and maximal temperature of the liquid environment was 35 °C. The pump produced a maximal work of 16 × 10⁻³ J. Maximal pumped flow was 492 ml min⁻¹. This artificial muscle pump is compact, follows the Starling law and reproduces the hemodynamic performance of a normal atrium. It could represent a new tool to restore the atrial kick in persistent atrial fibrillation.

Relevance:

10.00%

Publisher:

Abstract:

The success of combination antiretroviral therapy is limited by the evolutionary escape dynamics of HIV-1. We used Isotonic Conjunctive Bayesian Networks (I-CBNs), a class of probabilistic graphical models, to describe this process. We employed partial order constraints among viral resistance mutations, which give rise to a limited set of mutational pathways, and we modeled phenotypic drug resistance as monotonically increasing along any escape pathway. Using this model, the individualized genetic barrier (IGB) to each drug is derived as the probability of the virus not acquiring additional mutations that confer resistance. Drug-specific IGBs were combined to obtain the IGB to an entire regimen, which quantifies the virus' genetic potential for developing drug resistance under combination therapy. The IGB was tested as a predictor of therapeutic outcome using between 2,185 and 2,631 treatment change episodes of subtype B infected patients from the Swiss HIV Cohort Study Database, a large observational cohort. Using logistic regression, significant univariate predictors included most of the 18 drugs and single-drug IGBs, the IGB to the entire regimen, the expert rules-based genotypic susceptibility score (GSS), several individual mutations, and the peak viral load before treatment change. In the multivariate analysis, the only genotype-derived variables that remained significantly associated with virological success were GSS and, with 10-fold stronger association, IGB to regimen. When predicting suppression of viral load below 400 cps/ml, IGB outperformed GSS and also improved GSS-containing predictors significantly, but the difference was not significant for suppression below 50 cps/ml. Thus, the IGB to regimen is a novel data-derived predictor of treatment outcome that has potential to improve the interpretation of genotypic drug resistance tests.
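The abstract does not spell out how the drug-specific IGBs are combined into the regimen-level IGB. As a purely illustrative sketch, one might treat the per-drug escape events as independent, so the regimen IGB becomes the product of the drug IGBs; this independence assumption, and all drug names and values below, are hypothetical rather than taken from the paper.

```python
# Hypothetical sketch: combining drug-specific individualized genetic
# barriers (IGBs) into a regimen-level IGB. Each drug-specific IGB is
# read as the probability that the virus does NOT escape that drug;
# treating the escape events as independent is an illustrative
# assumption only, not the authors' stated method.

drug_igbs = {          # hypothetical values for a three-drug regimen
    "TDF": 0.85,
    "3TC": 0.60,
    "EFV": 0.75,
}

def regimen_igb(igbs):
    """Probability that the virus escapes none of the drugs, under
    the (assumed) independence of per-drug escape events."""
    p = 1.0
    for igb in igbs.values():
        p *= igb
    return p

print(f"Regimen IGB: {regimen_igb(drug_igbs):.3f}")  # ~0.383
```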

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Medication errors are defined as any preventable incident that may cause harm to the patient or lead to inappropriate use of medicines while they are under the control of healthcare professionals or of the patient. Errors in the preparation and administration of medication are the most common in the hospital setting and, despite the long chain the drug passes through, the nurse is the last person responsible for the action and therefore plays a very important role in patient safety. Nurses spend 40% of their working day on medication-related tasks. Objective: To determine whether nurses make more errors when working with stock medication distribution systems or with unit-dose medication distribution systems. Methodology: A quantitative, observational and descriptive study in which errors (or opportunities for error) made by the nurse during the preparation and administration of medication will be reported by means of a purpose-built questionnaire. The elements to be identified are: the type of error, its possible causes, its potential severity and who could have prevented it, as well as the type of professional who produced it. Other relevant data are the drug involved, together with its dose and route of administration, and the distribution system used. Sampling and sample: Sampling will be non-probabilistic and by convenience. The researcher will select nurses considered to have the characteristics required to participate in the study, so the sample will consist of the nurses working in unit 40 of the Hospital del Mar, who use a unit-dose medication distribution system, and the nurses working in the emergency department (specifically in the level-two area) of the Hospital del Mar, who work with a stock medication distribution system.

Relevance:

10.00%

Publisher:

Abstract:

The work presented below is a study of physical conditioning applied to football. In addition, we study maximal oxygen uptake (VO2max) levels obtained through an indirect measurement procedure (the Course Navette test), taking into account the number of weekly training sessions. We review differences in the definition of this parameter according to Zintl (1991), Diéguez (1997), and Gómez, Aranda and Ferrer (2010), among others, as well as the reference levels reported by authors such as Zintl (1991), Rivera and Avella (1992), and Sánchez and Salas (2008). To carry this out, we conducted a hypothetico-deductive study based on non-probabilistic sampling. We examined the VO2max values reached during the preparatory period and compared them with their maintenance, improvement or loss after a training period of either one or three weekly sessions. Once the process was completed, we found that the levels reached after the preparatory period (60.87 ± 8.81 ml/kg/min) were higher than at the start of the period (44.38 ± 8.92 ml/kg/min). When comparing the two groups of the sample, we can state that with three weekly sessions the previous levels (57.80 ± 10.16 ml/kg/min) were not maintained relative to the final test (56.73 ± 9.24 ml/kg/min). Finally, in group 2, with a single weekly session, the values fell further, from 63.93 ± 6.24 ml/kg/min to 56.50 ± 7.52 ml/kg/min.

Relevance:

10.00%

Publisher:

Abstract:

Normal and abnormal brains can be segmented by registering the target image with an atlas. Here, an atlas is defined as the combination of an intensity image (template) and its segmented image (the atlas labels). After registering the atlas template and the target image, the atlas labels are propagated to the target image. We define this process as atlas-based segmentation. In recent years, researchers have investigated registration algorithms to match atlases to query subjects and also strategies for atlas construction. In this paper we present a review of the automated approaches for atlas-based segmentation of magnetic resonance brain images. We aim to point out the strengths and weaknesses of atlas-based methods and suggest new research directions. We use two different criteria to present the methods. First, we refer to the algorithms according to their atlas-based strategy: label propagation, multi-atlas methods, and probabilistic techniques. Subsequently, we classify the methods according to their medical target: the brain and its internal structures, tissue segmentation in healthy subjects, tissue segmentation in fetuses, neonates and elderly subjects, and segmentation of damaged brains. A quantitative comparison of the results reported in the literature is also presented.
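As a rough illustration of the label-propagation strategy described above, the following sketch uses SimpleITK with an affine registration only; the file names are hypothetical, and the pipelines reviewed in the paper typically add a nonlinear (deformable) registration stage after this step.

```python
# Hypothetical sketch of atlas-based segmentation by label propagation
# with SimpleITK. File names are made up; an affine transform alone is
# a simplification of the registration methods reviewed in the paper.
import SimpleITK as sitk

target = sitk.ReadImage("target_brain.nii.gz", sitk.sitkFloat32)
template = sitk.ReadImage("atlas_template.nii.gz", sitk.sitkFloat32)
labels = sitk.ReadImage("atlas_labels.nii.gz")  # integer label map

# Register the atlas template (moving) onto the target image (fixed).
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=1.0, minStep=1e-4, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(sitk.CenteredTransformInitializer(
    target, template, sitk.AffineTransform(3),
    sitk.CenteredTransformInitializerFilter.GEOMETRY))
reg.SetInterpolator(sitk.sitkLinear)
transform = reg.Execute(target, template)

# Propagate the atlas labels with nearest-neighbour interpolation so
# that label values are never blended.
segmentation = sitk.Resample(labels, target, transform,
                             sitk.sitkNearestNeighbor, 0)
sitk.WriteImage(segmentation, "target_segmentation.nii.gz")
```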

Relevance:

10.00%

Publisher:

Abstract:

Both Bayesian networks and probabilistic evaluation are gaining increasingly widespread use in many professional fields, including forensic science. Nevertheless, they are subtle topics whose definitional details require careful study. While many sophisticated developments of probabilistic approaches to the evaluation of forensic findings may readily be found in the published literature, there remains a gap with respect to writings that focus on foundational aspects and on how these may be acquired by interested scientists new to these topics. This paper takes this as a starting point to report on how a class of master's students in forensic science learned about Bayesian networks for likelihood ratio-based probabilistic inference procedures. The presentation uses an example that relies on a casework scenario drawn from the published literature, involving a questioned signature. A complicating aspect of that case study, proposed to students in a teaching scenario, is the need to consider multiple competing propositions, a setting that cannot readily be approached within a likelihood ratio-based framework without attending to some additional technical details. Using generic Bayesian network fragments from the existing literature on the topic, course participants were able to track the probabilistic underpinnings of the proposed scenario correctly, both in terms of likelihood ratios and of posterior probabilities. In addition, further study of the example allowed students to derive an alternative Bayesian network structure with a computational output equivalent to existing probabilistic solutions. This practical experience underlines the potential of Bayesian networks to support and clarify foundational principles of probabilistic procedures for forensic evaluation.
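To make the multiple-proposition setting concrete, here is a minimal numerical sketch of the underlying probabilistic mechanics in plain Python, without a Bayesian network toolbox. The priors and likelihoods are invented for illustration and are not the figures of the published case study.

```python
# Illustrative sketch (not the case study's actual figures): posterior
# probabilities for multiple competing propositions about a questioned
# signature, combined via Bayes' theorem. All numbers are hypothetical
# and chosen only to show the mechanics.

priors = {"H1 genuine": 1/3, "H2 simulated forgery": 1/3,
          "H3 different writer": 1/3}
likelihoods = {"H1 genuine": 0.80,           # P(findings | H)
               "H2 simulated forgery": 0.15,
               "H3 different writer": 0.05}

evidence = sum(priors[h] * likelihoods[h] for h in priors)
posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}

# With more than two propositions, a single likelihood ratio compares
# one proposition against the prior-weighted average of the rest:
h1 = "H1 genuine"
rest = sum(priors[h] * likelihoods[h] for h in priors if h != h1)
lr_h1_vs_rest = likelihoods[h1] / (rest / (1 - priors[h1]))

for h, p in posteriors.items():
    print(f"P({h} | findings) = {p:.3f}")
print(f"LR for {h1} vs. the alternatives: {lr_h1_vs_rest:.1f}")  # 8.0
```

With these numbers the posterior mass concentrates on H1, and the likelihood ratio for H1 against the alternatives is 8, which illustrates the additional technical detail noted above: with multiple propositions, the alternatives must be renormalised before a single ratio can be quoted.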

Relevance:

10.00%

Publisher:

Abstract:

The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
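As a hedged illustration of what an automated, objective comparison can look like, the sketch below scores two HPTLC-style intensity profiles with a Pearson correlation. The actual algorithms developed and tested in Part II of the series are more elaborate, and all data here are synthetic.

```python
# Hypothetical sketch of automated ink-profile comparison: normalise
# two HPTLC-style intensity profiles and score their similarity with a
# Pearson correlation. Illustrative only; not the algorithms of the
# cited Part II paper.
import numpy as np

def similarity(profile_a: np.ndarray, profile_b: np.ndarray) -> float:
    """Pearson correlation between two standardised ink profiles."""
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float(np.mean(a * b))

# Toy example: two noisy scans of the same ink vs. a different ink.
x = np.linspace(0, 1, 200)
ink1 = np.exp(-((x - 0.3) ** 2) / 0.01) + 0.5 * np.exp(-((x - 0.7) ** 2) / 0.02)
ink1_rescan = ink1 + np.random.default_rng(0).normal(0, 0.02, x.size)
ink2 = np.exp(-((x - 0.5) ** 2) / 0.01)

print(similarity(ink1, ink1_rescan))  # close to 1 (same ink)
print(similarity(ink1, ink2))         # markedly lower (different ink)
```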

Relevance:

10.00%

Publisher:

Abstract:

Since 2000 and the commercialisation of the Da Vinci robotic system, indications for robotic surgery have been increasing rapidly. Recent publications have demonstrated superior functional outcomes with equal oncologic safety compared with conventional open surgery. Its field of application may extend to nasopharyngeal and skull base surgery. The preliminary results are encouraging. This article reviews the current literature on the role of transoral robotic surgery in head and neck cancer.

Relevance:

10.00%

Publisher:

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

The thesis is devoted to the analysis, modelling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned mainly with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modelling tools. They can find solutions to classification, regression and probability density modelling problems in high-dimensional geo-feature spaces, composed of geographical coordinates and additional relevant spatially referenced features ("geo-features"). They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition to modelling and prediction to automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces.

The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to their software implementation. The main algorithms and models considered are the multilayer perceptron (MultiLayer Perceptron, MLP; a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation.

Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations; it helps to detect the presence of spatial patterns describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualisation properties.

An important part of the thesis deals with the topical problem of automatic mapping of geospatial data. The general regression neural network is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions.

The thesis consists of four chapters: theory, applications, software tools and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals; classification of soil types and hydrogeological units; decision-oriented mapping with uncertainties; and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well, with care taken to create a user-friendly and easy-to-use interface.
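A GRNN is, at its core, Nadaraya-Watson kernel regression, which makes the automatic-mapping idea easy to sketch. The following minimal example uses synthetic coordinates and values; it is an illustration only, not the Machine Learning Office implementation.

```python
# Minimal sketch of a general regression neural network (GRNN) for
# spatial prediction, i.e. Nadaraya-Watson kernel regression with a
# Gaussian kernel. Coordinates and values are synthetic.
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma):
    """Predict values at query points as kernel-weighted averages of
    the training observations (sigma is the kernel width)."""
    # Squared distances between every query and training location.
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ train_z) / w.sum(axis=1)

rng = np.random.default_rng(42)
train_xy = rng.uniform(0, 10, size=(200, 2))    # sampling locations
train_z = np.sin(train_xy[:, 0]) + 0.1 * rng.normal(size=200)
query_xy = np.array([[5.0, 5.0], [1.0, 9.0]])   # prediction locations

print(grnn_predict(train_xy, train_z, query_xy, sigma=0.8))
```

The single hyperparameter sigma (the kernel width) is part of what makes the GRNN attractive for automatic mapping: it can be tuned by cross-validation without user interaction.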

Relevance:

10.00%

Publisher:

Abstract:

MOTIVATION: The analysis of molecular coevolution provides information on the potential functional and structural implication of positions along DNA sequences, and several methods are available to identify coevolving positions using probabilistic or combinatorial approaches. The specific nucleotide or amino acid profile associated with the coevolution process is, however, not estimated; only known profiles, such as the Watson-Crick constraint, are usually considered a priori in current measures of coevolution. RESULTS: Here, we propose a new probabilistic model, Coev, to identify coevolving positions and their associated profile in DNA sequences while incorporating the underlying phylogenetic relationships. The process of coevolution is modeled by a 16 × 16 instantaneous rate matrix that includes rates of transition as well as a profile of coevolution. We used simulated, empirical and illustrative data to evaluate our model and to compare it with a model of 'independent' evolution using the Akaike Information Criterion. We showed that the Coev model is able to discriminate between coevolving and non-coevolving positions and provides better sensitivity and specificity than other available approaches. We further demonstrate that the identification of the profile of coevolution can shed new light on the process of dependent substitution during lineage evolution.
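The model-selection step can be made concrete with a small sketch of the Akaike Information Criterion, AIC = 2k - 2 ln L. The log-likelihoods and parameter counts below are hypothetical placeholders, not values from the paper.

```python
# Sketch of the comparison the abstract describes: Coev vs. an
# 'independent' evolution model, scored by AIC. All numbers are
# hypothetical placeholders.

def aic(log_likelihood: float, n_params: int) -> float:
    return 2 * n_params - 2 * log_likelihood

loglik_coev, k_coev = -1234.5, 6       # hypothetical Coev fit
loglik_indep, k_indep = -1260.2, 4     # hypothetical independent fit

delta = aic(loglik_indep, k_indep) - aic(loglik_coev, k_coev)
if delta > 0:
    print(f"Coev preferred (delta AIC = {delta:.1f})")   # here: 47.4
else:
    print(f"Independent model preferred (delta AIC = {-delta:.1f})")
```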

Relevance:

10.00%

Publisher:

Abstract:

Recent single-cell studies in monkeys (Romo et al., 2004) show that the activity of neurons in the ventral premotor cortex covaries with the animal's decisions in a perceptual comparison task regarding the frequency of vibrotactile events. The firing rate response of these neurons was dependent only on the frequency differences between the two applied vibrations, the sign of that difference being the determining factor for correct task performance. We present a biophysically realistic neurodynamical model that can account for the most relevant characteristics of this decision-making-related neural activity. One of the nontrivial predictions of this model is that Weber's law will underlie the perceptual discrimination behavior. We confirmed this prediction in behavioral tests of vibrotactile discrimination in humans and propose a computational explanation of perceptual discrimination that accounts naturally for the emergence of Weber's law. We conclude that the neurodynamical mechanisms and computational principles underlying the decision-making processes in this perceptual discrimination task are consistent with a fluctuation-driven scenario in a multistable regime.
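Weber's law, which the model predicts for this task, states that the just-noticeable difference in a stimulus grows in proportion to its magnitude, so the Weber fraction stays roughly constant. The sketch below checks this on hypothetical vibrotactile thresholds, not the study's data.

```python
# Weber's law: delta_f / f = k (a constant Weber fraction).
# Hypothetical vibrotactile discrimination thresholds for illustration.
base_freqs = [10, 20, 30, 40]   # Hz, hypothetical base stimuli
jnd = [1.1, 2.0, 3.1, 3.9]      # Hz, hypothetical just-noticeable diffs

weber_fractions = [d / f for f, d in zip(base_freqs, jnd)]
print(weber_fractions)  # roughly constant (~0.10) under Weber's law
```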

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVES: Clinical staging is widespread in medicine - it informs prognosis, clinical course, and treatment, and assists individualized care. Staging places an individual on a probabilistic continuum of increasing potential disease severity, ranging from clinically at-risk or latency stage through first threshold episode of illness or recurrence, and, finally, to late or end-stage disease. The aim of the present paper was to examine and update the evidence regarding staging in bipolar disorder, and how this might inform targeted and individualized intervention approaches. METHODS: We provide a narrative review of the relevant information. RESULTS: In bipolar disorder, the validity of staging is informed by a range of findings that accompany illness progression, including neuroimaging data suggesting incremental volume loss, cognitive changes, and a declining likelihood of response to pharmacological and psychosocial treatments. Staging informs the adoption of a number of approaches, including the active promotion of both indicated prevention for at-risk individuals and early intervention strategies for newly diagnosed individuals, and the tailored implementation of treatments according to the stage of illness. CONCLUSIONS: The nature of bipolar disorder implies the presence of an active process of neuroprogression that is considered to be at least partly mediated by inflammation, oxidative stress, apoptosis, and changes in neurogenesis. It further supports the concept of neuroprotection, in that a diversity of agents have putative effects against these molecular targets. Clinically, staging suggests that the at-risk state or first episode is a period that requires particularly active and broad-based treatment, consistent with the hope that the temporal trajectory of the illness can be altered. Prompt treatment may be potentially neuroprotective and attenuate the neurostructural and neurocognitive changes that emerge with chronicity. Staging highlights the need for interventions at a service delivery level and implementing treatments at the earliest stage of illness possible.

Relevance:

10.00%

Publisher:

Abstract:

"MotionMaker (TM)" is a stationary programmable test and training system for the lower limbs developed at the 'Ecole Polytechnique Federale de Lausanne' with the 'Fondation Suisse pour les Cybertheses'.. The system is composed of two robotic orthoses comprising motors and sensors, and a control unit managing the trans-cutaneous electrical muscle stimulation with real-time regulation. The control of the Functional Electrical Stimulation (FES) induced muscle force necessary to mimic natural exercise is ensured by the control unit which receives a continuous input from the position and force sensors mounted on the robot. First results with control subjects showed the feasibility of creating movements by such closed-loop controlled FES induced muscle contractions. To make exercising with the MotionMaker (TM) safe for clinical trials with Spinal Cord Injured (SCI) volunteers, several original safety features have been introduced. The MotionMaker (TM) is able to identify and manage the occurrence of spasms. Fatigue can also be detected and overfatigue during exercise prevented.

Relevance:

10.00%

Publisher:

Abstract:

We present a seabed profile estimation and following method for close-proximity inspection of 3D underwater structures using autonomous underwater vehicles (AUVs). The method is used to determine a path that allows the AUV to pass its sensors over all points of the target structure, a problem known as coverage path planning. Our profile-following method goes beyond traditional seabed following at a safe altitude and exploits the hovering capabilities of recent AUVs. A range sonar is used to incrementally construct a local probabilistic map of the environment, and estimates of the local profile are obtained via linear regression. Two behavior-based controllers use these estimates to perform horizontal and vertical profile following. We build upon these tools to address coverage path planning for 3D underwater structures, using a (potentially inaccurate) prior map and following cross-section profiles of the target structure. The feasibility of the proposed method is demonstrated with the GIRONA 500 AUV, both in simulation, using synthetic and real-world bathymetric data, and in pool trials.
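The local profile estimation step can be illustrated with a plain least-squares line fit to recent sonar returns. The data below are synthetic, and the actual system first fuses returns into a probabilistic local map before regressing.

```python
# Sketch of local profile estimation: fit a line through recent
# sonar-derived points (least squares) to estimate the local slope of
# the structure being followed. Synthetic data for illustration.
import numpy as np

# Hypothetical along-track distances (m) and sonar-derived depths (m).
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
z = np.array([10.2, 10.5, 10.9, 11.2, 11.6, 11.9])

# Linear regression z ~ slope * x + intercept via least squares.
A = np.vstack([x, np.ones_like(x)]).T
(slope, intercept), *_ = np.linalg.lstsq(A, z, rcond=None)

print(f"local slope: {slope:.2f} m/m, intercept: {intercept:.2f} m")
# A profile-following controller can then command depth/pitch so the
# vehicle tracks this estimated profile at a fixed standoff distance.
```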