147 results for Multi-layer Perceptron

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

Purpose: To describe the indications, surgical procedure and clinical outcome of MLAM transplantation in the treatment of non-traumatic corneal perforations and descemetoceles. Methods: A prospective, non-comparative, interventional case series of eight consecutive patients (mean age 59 years, 6 men and 2 women) with non-traumatic corneal perforations or descemetoceles. The surgery consisted of the MLAM transplantation of a cryopreserved human amniotic membrane. The series included three cases of active herpetic keratitis, one of rosacea, one perforation of a hydrops, one cicatricial pemphigoid, one perforation after an abscess in a corneal graft, and one perforation after proton-beam therapy. The clinical outcome measures included follow-up, integrity of the eye, corneal epithelialization, inflammation and neovascularization, and integration of the MLAM. Stromal thickness was followed precisely with the slit lamp. A corneal graft was performed in one patient after the MLAM, allowing microscopic investigation of the removed MLAM integrated in the cornea. Results: The mean follow-up was 8.78 months (range 3.57 to 30.17). Amniotic membrane transplantation was successful and reduced inflammation in 7 of 8 patients after one procedure. One patient, who presented a large herpetic keratitis epithelial defect with corneal anaesthesia, had his MLAM dissolve after two weeks, with aqueous leakage. The epithelium healed within 3 weeks above 7 MLAMs and remained stable at 3 months in 7 of 8 patients. MLAM opacification gradually disappeared over a few months; however, the stromal layers filling the corneal perforations or covering the descemetoceles remained stable. Conclusions: MLAM transplantation is a safe, effective and useful technique to treat non-traumatic corneal perforations and descemetoceles. It can be performed as an emergency procedure despite the presence of active inflammation or infection.
By facilitating epithelialization and reducing inflammation and neovascularization, it allows corneal surface reconstruction in patients with persistent epithelial defects and corneal melting that would usually end in perforation. For full visual rehabilitation, a delayed penetrating keratoplasty is required.

Relevance: 100.00%

Abstract:

Résumé: This thesis is devoted to the analysis, modelling and visualization of spatially referenced environmental data using machine learning algorithms. Machine learning can broadly be considered a subfield of artificial intelligence concerned in particular with the development of techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted for application to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, non-linear, robust and efficient for modelling. They can solve classification, regression and probability-density modelling problems in high-dimensional spaces composed of spatially distributed informative variables ("geo-features") in addition to geographical coordinates. Moreover, they are ideally suited for implementation as decision-support tools for environmental questions ranging from pattern recognition to modelling and prediction, including automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and most popular machine learning algorithms are presented theoretically and implemented as software for the environmental sciences.
The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organizing maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability-density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated both with the traditional geostatistical approach of experimental variography and according to the principles of machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and allows the detection of spatial patterns describable by a two-point statistic. The machine learning approach to ESDA is presented through the application of the k-nearest-neighbours method, which is very simple and has excellent interpretation and visualization properties. An important part of the thesis deals with topical subjects such as the automatic mapping of spatial data. The general regression neural network is proposed to solve this task efficiently.
The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, on which the GRNN significantly outperformed all other methods, particularly in emergency situations. The thesis is composed of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools, Machine Learning Office. This software collection was developed over the last 15 years and has been used for teaching numerous courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a wide spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radioactive products and heavy metals, the classification of soil types and hydrogeological units, uncertainty mapping for decision support, and the assessment of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualization were also developed, with care taken to create a user-friendly and easy-to-use interface.

Machine Learning for geospatial data: algorithms, software tools and case studies

Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it mainly concerns the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning?
In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression and probability-density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining including pattern recognition, modeling and prediction, as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to their software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations; it helps to detect the presence of spatial patterns, at least those describable by two-point statistics.
A machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a currently hot topic, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters with the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for carrying out fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
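The k-NN method mentioned above for exploratory spatial data analysis can be sketched in a few lines. This is a minimal illustration on invented coordinates and values, not the thesis's Machine Learning Office software; the function name and data are assumptions for the example.

```python
import numpy as np

def knn_predict(coords, values, query, k=3):
    """Predict the value at `query` as the mean of its k nearest samples."""
    dist = np.linalg.norm(coords - query, axis=1)  # Euclidean distances to all samples
    nearest = np.argsort(dist)[:k]                 # indices of the k closest points
    return values[nearest].mean()

# hypothetical monitoring points with measured values
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([1.0, 2.0, 3.0, 4.0])

print(knn_predict(coords, values, np.array([0.1, 0.1]), k=1))  # nearest point is (0, 0) -> 1.0
print(knn_predict(coords, values, np.array([0.5, 0.5]), k=4))  # mean of all four values -> 2.5
```

Varying k trades local detail against smoothness, which is what makes the method so easy to interpret and visualize.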

Relevance: 80.00%

Abstract:

Genetically engineered bioreporters are an excellent complement to traditional methods of chemical analysis. Applying fluorescence flow cytometry to the detection of bioreporter response enables rapid and efficient characterization of a bacterial bioreporter population's response on a single-cell basis. In the present study, intrapopulation response variability was used to obtain higher analytical sensitivity and precision. We analyzed flow cytometric data for an arsenic-sensitive bacterial bioreporter using an artificial-neural-network-based adaptive clustering approach (a single-layer perceptron model). Results for this approach are far superior to those of other methods that we have applied to this fluorescent bioreporter (e.g., the arsenic detection limit is 0.01 microM, substantially lower than for other detection methods/algorithms). The approach is computationally highly efficient and can be implemented in real time, and thus has potential for the future development of high-throughput screening applications.
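The single-layer perceptron named in the abstract is the simplest neural model. Below is a minimal sketch of the classic perceptron learning rule on invented one-dimensional "fluorescence" readings; it is not the paper's adaptive-clustering variant, just the basic model that variant builds on.

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Classic single-layer perceptron: w <- w + lr * (y - yhat) * x."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            yhat = 1.0 if xi @ w > 0 else 0.0  # threshold activation
            w += lr * (yi - yhat) * xi         # update only on mistakes
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(int)

# hypothetical, linearly separable fluorescence intensities
X = np.array([[0.1], [0.2], [0.8], [0.9]])
y = np.array([0, 0, 1, 1])
w = train_perceptron(X, y)
print(predict(w, X))  # -> [0 0 1 1]
```

On linearly separable data the update rule is guaranteed to converge, which is what makes the model fast enough for real-time use.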

Relevance: 80.00%

Abstract:

Many complex systems may be described not by one but by a number of complex networks mapped onto each other in a multi-layer structure. Because of the interactions and dependencies between these layers, the state of a single layer does not necessarily reflect the state of the entire system well. In this paper we study the robustness of five examples of two-layer complex systems: three real-life data sets in the fields of communication (the Internet), transportation (the European railway system), and biology (the human brain), and two models based on random graphs. In order to cover the whole range of features specific to these systems, we focus on two extreme policies of the system's response to failures: no rerouting and full rerouting. Our main finding is that multi-layer systems are much more vulnerable to errors and intentional attacks than they appear from a single-layer perspective.
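The central point, that a coupled system can be far more fragile than either layer suggests, can be illustrated with a toy mutual-percolation sketch: a node survives only if it sits in the largest connected component of both layers. The six-node system and its wiring are invented for the illustration; the paper's analysis uses the real data sets named above.

```python
def largest_cc(nodes, edges):
    """Node set of the largest connected component (BFS on an adjacency dict)."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in adj and b in adj:          # ignore edges touching failed nodes
            adj[a].add(b); adj[b].add(a)
    seen, best = set(), set()
    for n in adj:
        if n in seen:
            continue
        comp, stack = set(), [n]
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u); seen.add(u)
            stack.extend(adj[u] - comp)
        if len(comp) > len(best):
            best = comp
    return best

def mutual_component(alive, edges_a, edges_b):
    """Shrink until the surviving cluster is connected in BOTH layers."""
    while True:
        shrunk = largest_cc(largest_cc(alive, edges_a), edges_b)
        if shrunk == alive:
            return alive
        alive = shrunk

# hypothetical 6-node system: same nodes, different wiring in each layer
edges_a = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]  # layer A: a path
edges_b = [(0, 5), (1, 4), (2, 5), (0, 3), (1, 3)]  # layer B

alive = set(range(6)) - {2}                          # node 2 fails in both layers
print(len(largest_cc(alive, edges_a)))               # single-layer view: 3 nodes still connected
print(len(mutual_component(alive, edges_a, edges_b)))  # coupled view: only 1 survives
```

A single-layer analysis reports a functional cluster of three nodes, but requiring connectivity in both layers collapses it to one, mirroring the paper's finding.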

Relevance: 80.00%

Abstract:

In this paper, aggregate migration patterns during fluid concrete castings are studied through experiments, a dimensionless approach and numerical modeling. The experimental results obtained on two beams show that gravity-induced migration primarily affects the coarsest aggregates, resulting in a decrease of the coarse-aggregate volume fraction with horizontal distance from the pouring point and in a puzzling vertical multi-layer structure. The origin of this multi-layer structure is discussed and analyzed with the help of numerical simulations of free-surface flow. Our results suggest that it originates in the non-Newtonian nature of fresh concrete and that increasing the casting rate should decrease the magnitude of gravity-induced particle migration. (C) 2012 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

The Permo-Triassic crisis was a major turning point in geological history. Following the end-Guadalupian extinction phase, the Palaeozoic biota underwent a steady decline through the Lopingian (Late Permian), resulting in their decimation at the level that is adopted as the Permian-Triassic boundary (PTB). This trend coincided with the greatest Phanerozoic regression. The extinction at the end of the Guadalupian and that marking the end of the Permian are therefore related. The subsequent recovery of the biota occupied the whole of the Early Triassic. Several phases of perturbations in δ13Ccarb occurred through a similar period, from the late Wuchiapingian to the end of the Early Triassic. The Permian-Triassic crisis was therefore protracted, spanning Late Permian and Early Triassic time. The extinction associated with the PTB occurred in two episodes: the main act with a prelude, and an epilogue. The prelude commenced prior to beds 25 and 26 at Meishan and coincided with the end-Permian regression. The main act itself happened in beds 25 and 26 at Meishan. The epilogue occurred in the late Griesbachian and coincided with the second volcanogenic layer (bed 28) at Meishan. The temporal distribution of these episodes constrains the interpretation of mechanisms responsible for the greatest Phanerozoic mass extinction, particularly the significance of a postulated bolide impact that, in our view, may have occurred about 50,000 Myr after the prelude. The prolonged and multi-phase nature of the Permo-Triassic crisis favours mechanisms of the Earth's intrinsic evolution rather than extraterrestrial catastrophe. The most significant regression in the Phanerozoic, the palaeomagnetic disturbance of the Permo-Triassic Mixed Superchron, widespread extensive volcanism, and other events may all be related through deep-seated processes that occurred during the integration of Pangea.
These combined processes could be responsible for the profound changes in marine, terrestrial and atmospheric environments that resulted in the end-Permian mass extinction. Bolide impact is possible but is neither an adequate nor a necessary explanation for these changes.

Relevance: 30.00%

Abstract:

The use of herbicides in agriculture may lead to environmental problems, such as surface-water pollution, with a potential risk for aquatic organisms. The herbicide glyphosate is the most used active ingredient in the world and in Switzerland. In the Lavaux vineyards it is nearly the only molecule applied. This work aimed at studying its fate in soils and its transfer to surface waters, using a multi-scale approach: from the molecular (10⁻⁹ m) and microscopic (10⁻⁶ m) scales to the macroscopic (m) and landscape (10³ m) scales. First of all, an analytical method was developed for the trace-level quantification of this widely used herbicide and its main by-product, aminomethylphosphonic acid (AMPA). Owing to their polar nature, they were derivatized with 9-fluorenylmethyl chloroformate (FMOC-Cl) prior to their concentration and purification by solid-phase extraction. They were then analyzed by ultra-performance liquid chromatography coupled with tandem mass spectrometry (UPLC-MS/MS). The method was tested in different aqueous matrices with spiking tests and validated for matrix-effect correction in relevant environmental samples. Calibration curves established between 10 and 1000 ng/l showed r² values above 0.989, mean recoveries varied between 86 and 133%, and the limits of detection and quantification of the method were as low as 5 and 10 ng/l, respectively. At the parcel scale, two parcels of the Lavaux vineyard area, located near the Lutrive River 6 km east of Lausanne, were monitored to assess to what extent glyphosate and AMPA were retained in the soil or exported to surface waters. They were equipped at their bottom with porous ceramic cups and runoff collectors, which allowed water samples to be retrieved during the 2010 and 2011 growing seasons.
Results revealed that the mobility of glyphosate and AMPA in the unsaturated zone was likely driven by the precipitation regime and by soil characteristics such as slope, porosity structure and differences in layer permeability. Elevated glyphosate and AMPA concentrations were measured at 60 and 80 cm depth at the parcel bottoms, suggesting their infiltration in the upper parts of the parcels and the presence of preferential flow in the studied parcels. Indeed, the succession of rainy days induced the gradual saturation of the soil porosity, leading to rapid infiltration through macropores as well as to the formation of surface runoff. Furthermore, the presence of more impervious weathered marls at 100 cm depth induced throughflows, whose importance for the lateral transport of the herbicide molecules was determined by the steepness of the slope. Large rainfall events (>10 mm/day) clearly exported molecules from the soil top layer, as indicated by high concentrations in runoff samples. A mass balance showed that the total loss (10-20%) occurred mainly through surface runoff (96%) and, to a minor extent, through throughflows in soils (4%), with subsequent exfiltration to surface waters. Observations made in the Lutrive River revealed interesting details of glyphosate and AMPA dynamics in urbanized landscapes such as the Lavaux vineyards. Indeed, besides their physical and chemical properties, herbicide dynamics at the catchment level strongly depend on application rates, the precipitation regime, land use, and the presence of drains or constructed channels. Elevated concentrations, up to 4970 ng/l, observed just after application, confirmed the diffuse export of these compounds from the vineyard area by surface runoff during the main rain events. From April to September 2011, a total load of 7.1 kg was calculated, with 85% coming from vineyards and minor urban sources and 15% from arable crops.
Small vineyard surfaces could generate high concentrations of herbicides and contribute considerably to the total load calculated at the outlet, owing to their steep slopes (~10%). The extrapolated total amount transferred yearly from the Lavaux vineyards to Lake Geneva was 190 kg. At the molecular scale, the possible involvement of dissolved organic matter (DOM) in glyphosate and copper transport was studied using UV/Vis fluorescence spectroscopy. Combined with parallel factor (PARAFAC) analysis, this technique allowed the characterization of DOM in soil and surface-water samples from the studied vineyard area. Glyphosate concentrations were linked to the fulvic-like spectroscopic signature of DOM in soil-water samples, as well as to copper, suggesting the formation of ternary complexes. In surface-water samples, its concentrations were also correlated with those of copper, but not significantly with the fulvic-like signature. Quenching experiments with standards confirmed the field tendencies in the laboratory, with a stronger decrease in fluorescence intensity for the fulvic-like fluorophore than for more aromatic ones. Lastly, based on the maximum concentrations measured in the river, the environmental risk posed by these compounds was assessed using laboratory tests and ecotoxicity data from the literature. In our case, and with the methodology applied, the risk to aquatic species was found to be negligible (RF<1).

Relevance: 20.00%

Abstract:

The objective of this work was to develop an easily applicable technique and a standardized protocol for high-quality post-mortem angiography. This protocol should (1) improve radiological interpretation by decreasing perfusion-related artifacts and by achieving a complete filling of the vascular system, and (2) ease and standardize the execution of the examination. To this aim, 45 human corpses were investigated by post-mortem computed tomography (CT) angiography using different perfusion protocols, a modified heart-lung machine and a new contrast-agent mixture developed specifically for post-mortem investigations. The quality of the CT angiographies was evaluated radiologically by observing the filling of the vascular system, assessing the interpretability of the resulting images, and comparing radiological diagnoses with conventional autopsy conclusions. Post-mortem angiography yielded satisfactory results provided that the volumes of the injected contrast-agent mixture were high enough to completely fill the vascular system. In order to avoid artifacts due to the post-mortem perfusion, a minimum of three angiographic phases and one native scan had to be performed. These findings were taken into account to develop a protocol for high-quality post-mortem CT angiography that minimizes the risk of radiological misinterpretation. The proposed protocol is easily applicable in a standardized way and yields high-quality, radiologically interpretable visualization of the vascular system in post-mortem investigations.

Relevance: 20.00%

Abstract:

The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming at providing the best possible generalization and predictive abilities instead of concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is here learned automatically from data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide efficient means to model local anomalies that typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns. This is a possible limitation of the method for its implementation in early-warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of 137Cs activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
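The two-scale idea can be sketched with a dependency-free stand-in: kernel ridge regression with an RBF kernel plays the role of the SVR (not the paper's exact algorithm), a wide kernel captures the large-scale trend, and a narrow kernel is fitted to the residuals to pick up a local anomaly. All data and parameter values below are invented for the illustration.

```python
import numpy as np

def rbf_fit_predict(x_train, y_train, x_test, gamma, lam):
    """Kernel ridge regression with an RBF kernel; gamma sets the spatial
    scale of the model, lam the regularization strength."""
    K = np.exp(-gamma * (x_train[:, None] - x_train[None, :]) ** 2)
    alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    K_q = np.exp(-gamma * (x_test[:, None] - x_train[None, :]) ** 2)
    return K_q @ alpha

# hypothetical 1-D field: a broad trend plus one localized short-scale anomaly
x = np.linspace(0.0, 10.0, 60)
y = np.sin(0.5 * x) + 0.8 * np.exp(-20.0 * (x - 4.0) ** 2)

trend = rbf_fit_predict(x, y, x, gamma=0.05, lam=1e-2)           # large-scale model
anomaly = rbf_fit_predict(x, y - trend, x, gamma=5.0, lam=1e-4)  # short-scale residual model
combined = trend + anomaly

err_trend = np.mean((y - trend) ** 2)
err_combined = np.mean((y - combined) ** 2)
print(err_combined < err_trend)  # the scale mixture recovers the anomaly the trend misses
```

This residual-fitting scheme mirrors the mixture of short- and large-scale models: without the narrow-kernel component, the local anomaly (the emergency-like feature) is smoothed away.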

Relevance: 20.00%

Abstract:

In recent years, multi-atlas fusion methods have gained significant attention in medical image segmentation. In this paper, we propose a general Markov Random Field (MRF) based framework that can perform edge-preserving smoothing of the labels at the time of fusing the labels itself. More specifically, we formulate the label fusion problem with MRF-based neighborhood priors as an energy minimization problem containing a unary data term and a pairwise smoothness term. We present how existing fusion methods like majority voting, global weighted voting and local weighted voting can be reframed to profit from the proposed framework, generating more accurate segmentations as well as more contiguous segmentations by getting rid of holes and islands. The proposed framework is evaluated for segmenting lymph nodes in 3D head and neck CT images. A comparison of various fusion algorithms is also presented.
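A minimal sketch of the idea, majority voting as the unary term plus a pairwise Potts smoothness term minimized by ICM, can be written for binary labels on a 2-D grid. The energy form and the toy three-atlas example are assumptions for the illustration; the paper works with richer fusion strategies on 3D CT volumes.

```python
import numpy as np

def fuse_labels(atlas_labels, beta=1.0, iters=5):
    """Fuse binary atlas labels by minimizing, via ICM,
    E = sum_i unary(i, l_i) + beta * sum_{i~j} [l_i != l_j]  (4-neighbour grid),
    where unary cost = fraction of atlases disagreeing with label l."""
    n_atlas, h, w = atlas_labels.shape
    votes_1 = atlas_labels.mean(axis=0)           # fraction voting for label 1
    labels = (votes_1 > 0.5).astype(int)          # plain majority voting as init
    unary = np.stack([votes_1, 1.0 - votes_1])    # cost of assigning label 0 / label 1
    for _ in range(iters):                        # ICM: greedy per-pixel updates
        for i in range(h):
            for j in range(w):
                cost = unary[:, i, j].copy()
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        for l in (0, 1):          # Potts penalty for disagreeing neighbours
                            cost[l] += beta * (labels[ni, nj] != l)
                labels[i, j] = int(np.argmin(cost))
    return labels

# hypothetical 3-atlas fusion: two atlases carve a one-pixel "hole" at (1, 1)
a = np.ones((4, 4), int)
b = a.copy(); b[1, 1] = 0
c = a.copy(); c[1, 1] = 0
fused = fuse_labels(np.stack([a, b, c]), beta=1.0)
print(fused)  # the smoothness term fills the hole that majority voting keeps
```

Plain majority voting leaves the isolated hole at (1, 1); the pairwise term makes the hole more expensive than agreeing with all four neighbours, which is exactly the hole-and-island removal described above.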

Relevance: 20.00%

Abstract:

The use of Geographic Information Systems has revolutionized the handling and visualization of geo-referenced data and has underlined the critical role of spatial analysis. The usual tools for this purpose are geostatistics, which are widely used in the Earth sciences. Geostatistics is based on several hypotheses that are not always verified in practice. Artificial Neural Networks (ANN), on the other hand, can a priori be used without special assumptions and are known to be flexible. This paper discusses the application of ANNs to the interpolation of a geo-referenced variable.
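Since the interpolation above is carried out by an ANN, a from-scratch sketch of the smallest such network, a one-hidden-layer MLP trained by gradient descent on a hypothetical 1-D variable, may help fix ideas. The architecture, data and hyper-parameters are invented for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(x, y, hidden=8, lr=0.05, epochs=3000):
    """One-hidden-layer MLP (tanh units, linear output) trained by
    full-batch gradient descent on the squared error."""
    w1 = rng.normal(0.0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
    w2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ w1 + b1)                  # hidden activations
        out = h @ w2 + b2                         # network prediction
        err = out - y                             # output-layer error signal
        g_h = (err @ w2.T) * (1.0 - h ** 2)       # backprop through tanh
        w2 -= lr * (h.T @ err) / len(x); b2 -= lr * err.mean(axis=0)
        w1 -= lr * (x.T @ g_h) / len(x); b1 -= lr * g_h.mean(axis=0)
    return lambda q: np.tanh(q @ w1 + b1) @ w2 + b2

# hypothetical 1-D geo-referenced measurements to interpolate
x = np.linspace(-1.0, 1.0, 20).reshape(-1, 1)
y = np.sin(np.pi * x)

mlp = train_mlp(x, y)
print(float(np.mean((mlp(x) - y) ** 2)))  # training MSE after fitting
```

The trained network is a smooth function of the coordinate, so it can be evaluated at any unsampled location, which is the interpolation use discussed in the paper.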

Relevance: 20.00%

Abstract:

In this paper we unify, simplify, and extend previous work on the evolutionary dynamics of symmetric N-player matrix games with two pure strategies. In such games, gains from switching strategies depend, in general, on how many other individuals in the group play a given strategy. As a consequence, the gain function determining the gradient of selection can be a polynomial of degree N-1. In order to deal with the intricacy of the resulting evolutionary dynamics, we make use of the theory of polynomials in Bernstein form. This theory implies a tight link between the sign pattern of the gains from switching on the one hand and the number and stability of the rest points of the replicator dynamics on the other hand. While this relationship is a general one, it is most informative if gains from switching have at most two sign changes, as is the case for most multi-player matrix games considered in the literature. We demonstrate that previous results for public goods games are easily recovered and extended using this observation. Further examples illustrate how focusing on the sign pattern of the gains from switching obviates the need for a more involved analysis.
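The link drawn above between sign patterns and rest points can be made concrete in a few lines: the gradient of selection is proportional to the gain function G(x) = Σ_k d_k C(N-1, k) x^k (1-x)^(N-1-k), i.e. the polynomial whose Bernstein coefficients are the gains from switching d_k, and the variation-diminishing property of the Bernstein form says G changes sign no more often than the coefficient sequence does. The four-player payoff numbers below are invented for the illustration.

```python
import numpy as np
from math import comb

def gain_function(d):
    """G(x) = sum_k d_k * C(N-1, k) * x**k * (1-x)**(N-1-k): the polynomial
    with Bernstein coefficients d_k, where d_k is the gain from switching
    when k of the N-1 co-players play the focal strategy."""
    n = len(d) - 1                       # degree N-1
    def G(x):
        return sum(d[k] * comb(n, k) * x ** k * (1.0 - x) ** (n - k)
                   for k in range(n + 1))
    return G

def sign_changes(seq):
    """Number of sign changes in a sequence, ignoring zeros."""
    s = [v for v in seq if v != 0]
    return sum(1 for a, b in zip(s, s[1:]) if a * b < 0)

# hypothetical 4-player game: switching pays only with enough co-players on board
d = [-1.0, -0.5, 1.0, 2.0]               # one sign change in the gain sequence
G = gain_function(d)
vals = [G(x) for x in np.linspace(0.001, 0.999, 999)]
print(sign_changes(d), sign_changes(vals))  # -> 1 1
```

Here the single sign change in d forces exactly one interior rest point of the replicator dynamics, so the number and stability of rest points can be read off the sign pattern without solving a degree-(N-1) polynomial.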