42 results for Functional data analysis
Abstract:
Geophysical techniques can help to bridge the inherent gap, with regard to spatial resolution and range of coverage, that plagues classical hydrological methods. This has led to the emergence of the new and rapidly growing field of hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, and their inherent trade-off between resolution and range, the fundamental usefulness of multi-method hydrogeophysical surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database into a unified model of the probed subsurface region that is internally consistent with all available data. To address this problem, we have developed a strategy for hydrogeophysical data integration based on Monte-Carlo-type conditional stochastic simulation that we consider particularly suitable for local-scale studies characterized by high-resolution, high-quality datasets. Monte-Carlo-based optimization techniques are flexible and versatile, can accommodate a wide variety of data and constraints of differing resolution and hardness, and thus have the potential to provide, in a geostatistical sense, highly detailed and realistic models of the pertinent target parameter distributions. Compared to more conventional approaches of this kind, our approach significantly advances the way in which the larger-scale deterministic information resolved by the hydrogeophysical data can be accounted for, an inherently problematic and as yet unresolved aspect of Monte-Carlo-type conditional simulation techniques. We present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to corresponding field data collected at the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
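For readers unfamiliar with the general mechanics of Monte-Carlo-type conditional simulation, the sketch below illustrates the basic idea in a deliberately simplified 1-D form: a simulated-annealing loop perturbs a porosity field so that its experimental variogram approaches a target model, while cells carrying conditioning (log) data stay fixed. This is a generic illustration, not the authors' algorithm; the grid size, the exponential variogram model and all numerical values are assumptions.

```python
# A minimal sketch (not the authors' algorithm) of Monte-Carlo-type conditional
# simulation via simulated annealing: a 1-D porosity field is perturbed by
# swapping values at non-conditioned cells so its experimental variogram
# approaches a target model; cells with "log" data are clamped. All values
# below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 200                                            # grid cells
lags = np.arange(1, 21)                            # lag distances (cells)
target = 0.0004 * (1 - np.exp(-3 * lags / 15.0))   # assumed exponential model

def exp_variogram(z, lags):
    """Experimental semivariogram of a 1-D field at the given lags."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])

# Start from a random field with a plausible porosity histogram (mean 0.25).
z = rng.normal(0.25, 0.02, n)

# Conditioning data: porosity "log" values at known cells stay fixed.
cond_idx = np.array([10, 50, 90, 130, 170])
z[cond_idx] = [0.22, 0.27, 0.24, 0.29, 0.23]
free = np.setdiff1d(np.arange(n), cond_idx)

def objective(z):
    return np.sum((exp_variogram(z, lags) - target) ** 2)

obj, temp = objective(z), 1e-8
for it in range(20000):
    i, j = rng.choice(free, 2, replace=False)
    z[i], z[j] = z[j], z[i]            # propose a swap (histogram preserved)
    new = objective(z)
    if new < obj or rng.random() < np.exp((obj - new) / temp):
        obj = new                      # accept
    else:
        z[i], z[j] = z[j], z[i]        # reject: undo the swap
    temp *= 0.9997                     # cooling schedule

print(f"final variogram misfit: {obj:.3e}")
```

Swapping values rather than resampling them keeps the marginal histogram of the realization fixed, so only the spatial structure is optimized; the conditioning cells never enter the swap set.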
Abstract:
INTRODUCTION: infants hospitalised in neonatology are inevitably exposed to pain repeatedly. Premature infants are particularly vulnerable because they are hypersensitive to pain and demonstrate diminished behavioural responses to it. They are therefore at risk of developing short- and long-term complications if pain remains untreated. CONTEXT: compared to acute pain, there is limited evidence in the literature on prolonged pain in infants; however, its prevalence is reported at between 20 and 40%. OBJECTIVE: this single case study aimed to identify the bio-contextual characteristics of neonates who experienced prolonged pain. METHODS: the study was carried out in the neonatal unit of a tertiary referral centre in Western Switzerland. A retrospective analysis of the profiles of seven infants who experienced prolonged pain was performed using five different data sources. RESULTS: the mean gestational age of the seven infants was 32 weeks. The main diagnoses included prematurity and respiratory distress syndrome. The total observations (N=55) showed that the participants underwent on average 21.8 (SD 6.9) painful procedures per day, estimated to be of moderate to severe intensity. Of the 164 recorded pain scores (2.9 pain assessments/day/infant), 14.6% confirmed acute pain. Among those experiencing acute pain, 16.6% received analgesia and 79.1% received none. CONCLUSION: this study highlighted the difficulty of managing pain in neonates who are exposed to numerous painful procedures. Pain in this population remains under-evaluated and, as a result, undertreated. The results also showed that nursing documentation related to pain assessment is not systematic. Regular assessment and documentation of acute and prolonged pain are recommended; this could be achieved with clear guidelines on the Assessment Intervention Reassessment (AIR) cycle together with validated measures adapted to neonates. Adequate pain assessment is a prerequisite for appropriate pain relief in neonates.
Abstract:
Résumé: This thesis is devoted to the analysis, modeling and visualization of spatially referenced environmental data using machine learning algorithms. Machine learning can broadly be considered a subfield of artificial intelligence concerned in particular with the development of techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted for application to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can solve problems of classification, regression and probability density modeling in high-dimensional spaces composed of spatially referenced informative variables ("geo-features") in addition to geographical coordinates. Moreover, they are ideally suited for implementation as decision-support tools for environmental questions ranging from pattern recognition to modeling and prediction, including automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and popular machine learning algorithms are presented theoretically and implemented as software tools for the environmental sciences. The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence, general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organizing maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated both in the traditional geostatistical approach, using experimental variography, and according to the principles of machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and detects the presence of spatial patterns describable by two-point statistics. The machine learning approach to ESDA is presented through the application of the k-nearest neighbors method, which is very simple and has excellent interpretation and visualization properties. An important part of the thesis deals with topical subjects such as the automatic mapping of spatial data. The general regression neural network is proposed as an efficient solution to this task.
The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, on which the GRNN significantly outperformed all other methods, particularly in emergency situations. The thesis consists of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both for teaching numerous courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a broad spectrum of real-world low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydrogeological units, uncertainty mapping for decision support, and natural hazard (landslide, avalanche) assessment. Complementary tools for exploratory data analysis and visualization were also developed, with care taken to provide a user-friendly, easy-to-use interface. Machine Learning for geospatial data: algorithms, software tools and case studies. Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it mainly concerns the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from theoretical description of the concepts to software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis.
In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations which helps to detect the presence of spatial patterns, at least those describable by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a topical problem, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters and has the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
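As a concrete illustration of the GRNN highlighted above, the following sketch implements its core: Nadaraya-Watson kernel regression with a Gaussian kernel, the single bandwidth being selected by leave-one-out cross-validation. This is a generic sketch, not the Machine Learning Office implementation; the synthetic monitoring data and the isotropic kernel are assumptions.

```python
# A minimal GRNN sketch: Gaussian-kernel-weighted regression with the
# bandwidth sigma tuned by leave-one-out cross-validation. Generic
# illustration; the synthetic data below are assumptions.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """Kernel-weighted average of training targets at each query point."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / (w.sum(axis=1) + 1e-12)

def loo_error(X, y, sigma):
    """Leave-one-out RMSE: each point's own kernel weight is zeroed."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(w, 0.0)
    pred = (w @ y) / (w.sum(axis=1) + 1e-12)
    return np.sqrt(np.mean((pred - y) ** 2))

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(300, 2))            # monitoring-site coordinates
y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + rng.normal(0, 0.1, 300)

sigmas = np.logspace(-1, 1, 25)
best = min(sigmas, key=lambda s: loo_error(X, y, s))   # bandwidth tuning

grid = np.array([[gx, gy] for gx in np.linspace(0, 10, 50)
                          for gy in np.linspace(0, 10, 50)])
z = grnn_predict(X, y, grid, best)               # automatic map on a 50x50 grid
print(f"sigma* = {best:.3f}, map range: [{z.min():.2f}, {z.max():.2f}]")
```

The single tunable parameter is part of what makes the GRNN attractive for automatic mapping: once sigma is chosen by cross-validation, the map requires no further user intervention.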
Abstract:
We conducted this study to determine the relative influence of various mechanical and patient-related factors on the incidence of dislocation after primary total hip arthroplasty (THA). Of 2,023 THAs, 21 patients who had at least 1 dislocation were compared with a control group of 21 patients without dislocation, matched for age, gender, pathology, and year of surgery. Implant positioning, seniority of the surgeon, American Society of Anesthesiologists (ASA) score, and diminished motor coordination were recorded. Data analysis included univariate and multivariate methods. The dislocation risk was 6.9 times higher if total anteversion was not between 40 degrees and 60 degrees, and 10 times higher in patients with high ASA scores. Surgeons should pay attention to the total anteversion (cup plus stem) of the THA, and the ASA score should be part of the preoperative assessment of dislocation risk.
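To make the reported risk figures concrete, the sketch below shows the basic 2x2-table arithmetic behind an odds ratio of the kind reported here (6.9 for total anteversion outside the 40-60 degree window). The counts are invented for illustration; the study itself used matched univariate and multivariate analyses, not this raw calculation.

```python
# Hypothetical illustration of how an odds ratio like the reported 6.9 arises
# from a 2x2 table; the counts below are invented, not the study data.
import math

# rows: total anteversion outside / inside the 40-60 degree window
# cols: dislocation cases / matched controls
a, b = 15, 6     # outside window: cases, controls (hypothetical)
c, d = 6, 15     # inside window: cases, controls (hypothetical)

or_ = (a * d) / (b * c)                          # cross-product odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)            # SE of the log odds ratio
lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```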
Abstract:
The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concentrate on decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The models are based on multilayer perceptron and support vector regression ML algorithms used for modeling long-range spatial trends, followed by sequential simulations of the residuals. ML algorithms deliver nonlinear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of ML algorithms by analyzing the quality and quantity of the spatially structured information they extract from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study on the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be used efficiently in the decision-making process.
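A minimal sketch of the two-step hybrid idea described above: a machine-learning model captures the long-range trend, and stochastic simulation of its residuals restores short-range variability and supports probability mapping. For brevity the sketch uses an MLP trend and unconditional, Cholesky-based simulation with an assumed exponential covariance rather than the paper's sequential simulation; all data and parameters are synthetic assumptions.

```python
# Sketch of the MLRSS idea: ML trend + stochastic simulation of residuals.
# Simplified (unconditional Cholesky simulation, assumed covariance), not the
# paper's algorithm; the synthetic deposition-like data are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

X = rng.uniform(0, 100, size=(400, 2))           # sample coordinates
trend_true = 50 + 0.5 * X[:, 0] - 0.3 * X[:, 1]  # smooth large-scale trend
y = trend_true + rng.normal(0, 5, 400)

# Step 1: ML model for the non-stationary large-scale trend.
mlp = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
mlp.fit(X, y)
resid = y - mlp.predict(X)                       # closer to stationary

# Step 2: simulate residual realizations on a prediction grid, with an
# assumed exponential covariance C(h) = var * exp(-h / range).
grid = np.stack(np.meshgrid(np.linspace(0, 100, 30),
                            np.linspace(0, 100, 30)), -1).reshape(-1, 2)
h = np.linalg.norm(grid[:, None, :] - grid[None, :, :], axis=-1)
C = resid.var() * np.exp(-h / 20.0)              # range = 20 (assumption)
L = np.linalg.cholesky(C + 1e-6 * np.eye(len(grid)))

# 50 realizations = trend + correlated residual draws.
realizations = mlp.predict(grid)[None, :] + (L @ rng.standard_normal((len(grid), 50))).T

# The ensemble supports probability/risk mapping, e.g. exceedance of a
# decision threshold at each grid node.
p_exceed = (realizations > 80.0).mean(axis=0)
print(f"grid fraction with P(exceed) > 0.5: {(p_exceed > 0.5).mean():.1%}")
```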
Abstract:
Linezolid is used off-label to treat multidrug-resistant tuberculosis (MDR-TB) in the absence of systematic evidence. We performed a systematic review and meta-analysis of the efficacy, safety and tolerability of linezolid-containing regimens based on individual data analysis. Twelve studies (11 countries from three continents) reporting complete information on the safety, tolerability and efficacy of linezolid-containing regimens in treating MDR-TB cases were identified following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Meta-analysis was performed using the individual data of 121 patients with a definite treatment outcome (cure, completion, death or failure). Most MDR-TB cases achieved sputum smear (86 (92.5%) out of 93) and culture (100 (93.5%) out of 107) conversion after treatment with individualised regimens containing linezolid (median (interquartile range) times to smear and culture conversion were 43.5 (21-90) and 61 (29-119) days, respectively), and 99 (81.8%) out of 121 patients were treated successfully. No significant differences were detected in the subgroup efficacy analysis (daily linezolid dosage ≤600 mg versus >600 mg). Adverse events were observed in 63 (58.9%) out of 107 patients, of which 54 (68.4%) out of 79 were major adverse events, including anaemia (38.1%), peripheral neuropathy (47.1%), gastrointestinal disorders (16.7%), optic neuritis (13.2%) and thrombocytopenia (11.8%). The proportion of adverse events was significantly higher when the linezolid daily dosage exceeded 600 mg. The study results suggest excellent efficacy but also the need for caution in the prescription of linezolid.
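The pooled proportions quoted above can be given sampling uncertainty with a standard binomial interval; the sketch below applies the Wilson score interval to the conversion and treatment-success counts reported in the abstract. The choice of the Wilson interval is ours, not the paper's.

```python
# Wilson 95% confidence intervals for the pooled proportions reported in the
# abstract. Counts come from the abstract; the interval choice is an assumption.
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

for label, k, n in [("treatment success", 99, 121),
                    ("smear conversion", 86, 93),
                    ("culture conversion", 100, 107)]:
    lo, hi = wilson_ci(k, n)
    print(f"{label}: {k}/{n} = {k/n:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```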
Abstract:
Extensive defects of the pelvis and genitoperineal region are a reconstructive challenge. We discuss a consecutive series of 25 reconstructions with the pedicled anterolateral thigh (ALT) flap including a muscle part of the vastus lateralis (VL) in 23 patients from October 1999 to September 2012. Only surface defects larger than 100 cm² and reconstructions by composite ALT + VL flaps were included in this retrospective analysis. Of the 23 patients, 19 underwent oncologic resection, whereas 4 presented with Fournier gangrene. Three patients did not reach 6 months of follow-up and were excluded from further data analysis. Among the remaining 20 patients (22 reconstructions), the average follow-up period was 14 months (range, 10-18 months). The patients' average age was 60 years, and the average size of the defect was 182 cm². Postoperative complications included 1 (4.5%) flap necrosis out of 22 raised flaps, 1 partial flap necrosis after venous congestion, and 2 cases in which a complementary reconstructive procedure was performed because of a remaining defect or partial flap failure. In 6 cases (27%), peripheral wound dehiscence was treated by debridement followed by split-thickness skin grafting or local advancement flaps. Defect size was significantly related to postoperative complications and increased hospital stay, especially in patients who underwent preoperative radiotherapy. At the end of the follow-up period, long-term, satisfactory coverage was obtained in all patients without functional deficits. This consecutive series of composite ALT + VL flaps shows that, for extended defects, the flap provides an excellent and adjustable muscle mass, is reliable with minimal donor-site morbidity, and can even be designed as a sensate flap.
Abstract:
The enhanced functional sensitivity offered by ultra-high field imaging may significantly benefit simultaneous EEG-fMRI studies, but the concurrent increases in artifact contamination can strongly compromise EEG data quality. In the present study, we focus on EEG artifacts created by head motion in the static B0 field. A novel approach for motion artifact detection is proposed, based on a simple modification of a commercial EEG cap, in which four electrodes are non-permanently adapted to record only magnetic induction effects. Simultaneous EEG-fMRI data were acquired with this setup, at 7T, from healthy volunteers undergoing a reversing-checkerboard visual stimulation paradigm. Data analysis assisted by the motion sensors revealed that, after gradient artifact correction, EEG signal variance was largely dominated by pulse artifacts (81-93%), but contributions from spontaneous motion (4-13%) were still comparable to or even larger than those of actual neuronal activity (3-9%). Multiple approaches were tested to determine the most effective procedure for denoising EEG data incorporating motion sensor information. Optimal results were obtained by applying an initial pulse artifact correction step (AAS-based), followed by motion artifact correction (based on the motion sensors) and ICA denoising. On average, motion artifact correction (after AAS) yielded a 61% reduction in signal power and a 62% increase in VEP trial-by-trial consistency. Combined with ICA, these improvements rose to a 74% power reduction and an 86% increase in trial consistency. Overall, the improvements achieved were well appreciable at single-subject and single-trial levels, and set an encouraging quality mark for simultaneous EEG-fMRI at ultra-high field.
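As an illustration of one step in such a pipeline, the sketch below regresses dedicated motion-sensor reference channels out of simulated EEG by least squares, the same principle as the motion artifact correction described above. It is a toy reconstruction, not the authors' pipeline (which combined AAS pulse-artifact correction, sensor-based motion correction and ICA); all signals and dimensions are assumptions.

```python
# Toy sketch of regression-based motion artifact removal: motion-only
# reference channels are regressed out of the EEG by least squares.
# Synthetic signals; not the study's actual AAS + regression + ICA pipeline.
import numpy as np

rng = np.random.default_rng(3)
fs = 250                                          # sampling rate (assumption)
t = np.arange(0, 10, 1 / fs)                      # 10 s of data

# Four induction-only sensors: slow, drifting motion-related signals.
motion = rng.standard_normal((4, t.size)).cumsum(axis=1)
motion -= motion.mean(axis=1, keepdims=True)

neural = np.sin(2 * np.pi * 10 * t)               # 10 Hz "alpha" component
mixing = rng.normal(0, 0.5, (32, 4))              # motion leakage into 32 channels
eeg = neural + mixing @ motion + rng.normal(0, 0.1, (32, t.size))

# Least-squares fit of each EEG channel on the motion references, then
# subtraction of the fitted motion contribution.
beta, *_ = np.linalg.lstsq(motion.T, eeg.T, rcond=None)   # shape (4, 32)
cleaned = eeg - (motion.T @ beta).T

removed = 1 - cleaned.var(axis=1) / eeg.var(axis=1)
print(f"median variance removed: {np.median(removed):.1%}")
```

Because the reference electrodes record only magnetic induction effects and no neural activity, subtracting their fitted contribution removes motion-related variance without cancelling the EEG of interest, which is the rationale for the modified-cap design described above.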
Abstract:
Essential tremor (ET) is a common movement disorder with an estimated prevalence of 5% in the population aged over 65 years. In spite of intensive efforts, the genetic architecture of ET remains unknown. We used a combination of whole-exome sequencing and targeted resequencing in three ET families. In vitro and in vivo experiments in oligodendrocyte precursor cells and zebrafish were performed to test our findings. Whole-exome sequencing revealed a missense mutation in TENM4 segregating in an autosomal-dominant fashion in an ET family. Subsequent targeted resequencing of TENM4 led to the discovery of two novel missense mutations. Not only did these two mutations segregate with ET in two additional families, but we also observed significant over-transmission of pathogenic TENM4 alleles across the three families. Consistent with a dominant mode of inheritance, in vitro analysis in oligodendrocyte precursor cells showed that the mutant proteins mislocalize. Finally, expression of human mRNA harboring any of the three patient mutations in zebrafish embryos induced defects in axon guidance, confirming a dominant-negative mode of action for these mutations. Our genetic and functional data, corroborated by the existence of a Tenm4 knockout mouse displaying an ET phenotype, implicate TENM4 in ET. Together with previous studies of TENM4 in model organisms, our studies suggest that processes regulating myelination in the central nervous system and axon guidance may be significant contributors to the genetic burden of this disorder.
Abstract:
Sport betting is a lucrative business for bookmakers, for lucky (or wise) punters, but also for governments and for sport. While neither new nor even recent, the deviances linked to sport betting, primarily match-fixing, have gained increased media exposure in the past decade. This exploratory study is a qualitative content analysis of the press coverage of sport betting-related deviances in football in two countries (the UK and France), using in each case two leading national publications over a period of five years. Data analysis indicates mounting coverage of sport betting scandals, with teams, players and criminals increasingly framed as culprits, while authorities and federations primarily assume a positive role. As for the origin of sport betting deviances, French newspapers tend to blame the system (in an abstract way); British newspapers, in contrast, focus more on individual weaknesses, notably greed. This article contributes to the growing body of literature on the importance of these deviances and on the way they are perceived by sport organizations, legislators and the public at large.