942 results for Current Density Mapping Method


Relevance: 30.00%

Abstract:

This work deals with the elaboration of flood hazard maps. These maps reflect the areas prone to floods based on the effects of Hurricane Mitch in the Municipality of Jucuarán, El Salvador. Stream channels located in the coastal range in the SE of El Salvador flow into the Pacific Ocean and generate alluvial fans. The communities that often inhabit these fans can be affected by floods. The geomorphology of these stream basins is characterized by small areas, steep slopes, well-developed regolith and extensive deforestation. These features play a key role in the generation of flash floods. This zone lacks comprehensive rainfall data and gauging stations. The most detailed topographic maps are on a scale of 1:25 000. Given that this scale was not sufficiently detailed, we used aerial photographs enlarged to a scale of 1:8000. The effects of Hurricane Mitch mapped on these photographs were regarded as the reference event. Flood maps have a dual purpose: (1) community emergency plans, and (2) regional land use planning carried out by local authorities. The geomorphological method is based on mapping the geomorphological evidence (alluvial fans, preferential stream channels, erosion and sedimentation, man-made terraces). Following the interpretation of the photographs, this information was validated in the field and complemented by eyewitness reports such as the height of water and flow typology. In addition, community workshops were organized to obtain information about the evolution and impact of the phenomena. The superimposition of this information enabled us to obtain a comprehensive geomorphological map. Another aim of the study was the calculation of the peak discharge using the Manning and paleohydraulic methods and estimates based on geomorphological criteria. The results were compared with those obtained using the rational method. Significant differences in the order of magnitude of the calculated discharges were noted. The rational method underestimated the results owing to short and discontinuous periods of rainfall data, with the result that probabilistic equations cannot be applied. The Manning method yields a wide range of results because of its dependence on the roughness coefficient. The paleohydraulic method yielded higher values than the rational and Manning methods; however, it should be pointed out that bigger boulders could have been moved had they existed. These discharge values are still lower than those obtained from the geomorphological estimates, which are much closer to reality. The flood hazard maps were derived from the comprehensive geomorphological map. Three categories of hazard were established (very high, high and moderate) using flood energy, water height and flow velocity deduced from geomorphological evidence and eyewitness reports.
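The Manning estimate mentioned above is the standard open-channel formula applied to a cross-section reconstructed from high-water marks. The short Python sketch below, with purely hypothetical channel values, illustrates why the result swings so widely with the roughness coefficient n; it is an illustration of the formula, not the authors' calculation.

def manning_discharge(area_m2, hydraulic_radius_m, slope, n):
    # Manning's equation in SI units: Q = (1/n) * A * R**(2/3) * S**(1/2)
    return (1.0 / n) * area_m2 * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Hypothetical cross-section reconstructed from high-water marks:
# A = 12 m^2, R = 0.9 m, energy slope S = 0.04.
q_rough = manning_discharge(12.0, 0.9, 0.04, n=0.07)    # boulder-strewn channel
q_smooth = manning_discharge(12.0, 0.9, 0.04, n=0.035)  # smoother channel
print(f"Q ranges from {q_rough:.0f} to {q_smooth:.0f} m^3/s for the same section")

Halving the assumed roughness doubles the estimated peak discharge, which is exactly the sensitivity the abstract points out.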

Relevance: 30.00%

Abstract:

This article extends the existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses that are prevalent in forensic toxicology. As its main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in blood from car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the 'presence' of target substances in blood. In a first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. The model is then used in a second part to approach, in a rational way, the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?'. This is illustrated through a practical example.
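As a rough sketch of the kind of calculation such a Bayesian model performs (not the article's model), the snippet below computes the posterior probability that the true concentration exceeds the legal threshold, assuming a Normal measurement-error model with known assay standard deviation and a flat prior; the measured value and assay SD are hypothetical.

from scipy.stats import norm

def prob_over_threshold(measured, assay_sd, threshold=1.5):
    # Assuming x ~ N(theta, assay_sd) and a flat prior on theta,
    # the posterior is theta | x ~ N(measured, assay_sd), so the
    # exceedance probability is one minus its CDF at the threshold.
    return 1.0 - norm.cdf(threshold, loc=measured, scale=assay_sd)

# Hypothetical reading: 1.8 ug/L measured, assay SD of 0.25 ug/L.
print(f"P(true THC > 1.5 ug/L | data) = {prob_over_threshold(1.8, 0.25):.3f}")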

Relevance: 30.00%

Abstract:

BACKGROUND: As an important modifiable lifestyle factor in osteoporosis prevention, physical activity has been shown to positively influence bone mass accrual during growth. We have previously shown that a nine-month general school-based physical activity intervention increased bone mineral content (BMC) and density (aBMD) in primary school children. From a public health perspective, a major issue is whether these effects persist into adolescence. We therefore measured BMC and aBMD three years after cessation of the intervention to investigate whether the beneficial short-term effects persisted. METHODS: All children from 28 randomly selected first and fifth grade classes (intervention group (INT): 16 classes, n=297; control group (CON): 12 classes, n=205) who had participated in KISS (Kinder- und Jugendsportstudie) were contacted three years after cessation of the intervention program. The intervention included daily physical education with daily impact-loading activities over nine months. Measurements included anthropometry, vigorous physical activity (VPA) by accelerometers, and BMC/aBMD of the total body, femoral neck, total hip, and lumbar spine by dual-energy X-ray absorptiometry (DXA). Sex- and age-adjusted Z-scores of BMC or aBMD at follow-up were regressed on intervention (1 vs. 0), the respective Z-score at baseline, gender, follow-up height and weight, pubertal stage at follow-up, and previous and current VPA, adjusting for clustering within schools. RESULTS: 377 of 502 (75%) children participated in baseline DXA measurements and, of those, 214 (57%) participated in follow-up. At follow-up, INT showed significantly higher Z-scores of BMC for total body (adjusted group difference: 0.157 units (0.031-0.283); p=0.015), femoral neck (0.205 (0.007-0.402); p=0.042) and total hip (0.195 (0.036 to 0.353); p=0.016), and higher Z-scores of aBMD for total body (0.167 (0.016 to 0.317); p=0.030) compared to CON, representing 6-8% higher values for children in the INT group. No differences were found for the remaining bone parameters. For the subpopulation with baseline VPA (n=163), effect sizes became stronger after baseline VPA adjustment. After adjustment for baseline and current VPA (n=101), intervention effects were no longer significant, although effect sizes remained the same as without adjustment for VPA. CONCLUSION: Beneficial effects on BMC of a nine-month general physical activity intervention appeared to persist over three years. Part of the maintained effects may be explained by current physical activity.
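The regression described in the Methods could be sketched, for instance, as an ordinary least-squares model with school-clustered standard errors. The snippet below is only an illustration of that adjustment; the file and column names are hypothetical and this is not the study's analysis code.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame: one row per child, with follow-up and baseline
# Z-scores, covariates, and the school each child attends.
df = pd.read_csv("kiss_followup.csv")  # hypothetical file name

# Follow-up BMC Z-score regressed on the intervention indicator and covariates,
# with standard errors clustered on school to account for clustering.
model = smf.ols(
    "z_bmc_followup ~ intervention + z_bmc_baseline + gender"
    " + height + weight + pubertal_stage + vpa_baseline + vpa_current",
    data=df,
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school"]})
print(result.summary())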

Relevance: 30.00%

Abstract:

The major problems associated with the use of corticosteroids for the treatment of ocular diseases are their poor intraocular penetration to the posterior segment when administered locally and their secondary side effects when given systemically. To circumvent these problems, more efficient methods and techniques of local delivery are being developed. The purposes of this study were: (1) to investigate the pharmacokinetics of intraocular penetration of hemisuccinate methylprednisolone (HMP) after its delivery using the transscleral Coulomb controlled iontophoresis (CCI) system applied to the eye or after intravenous (i.v.) injection in the rabbit, (2) to test the safety of the CCI system for the treated eyes, and (3) to compare the pharmacokinetic profiles of HMP intraocular distribution after CCI delivery with those after i.v. injection. For each parameter evaluated, six rabbit eyes were used. For the CCI system, two concentrations of HMP (62.5 and 150 mg/ml) and various current intensities and treatment durations were analyzed. In rabbits serving as controls, HMP was infused into the CCI device but no electric current was applied. For i.v. delivery, HMP at 10 mg/kg as a 62.5 mg/ml solution was used. The rabbits were observed clinically for evidence of ocular toxicity. At various time points after the administration of drug, rabbits were killed and intraocular fluids and tissues were sampled for methylprednisolone (MP) concentrations by high-pressure liquid chromatography (HPLC). Histological examinations were performed on six eyes of each group. Among groups that received CCI, the concentrations of MP increased in all ocular tissues and fluids in relation to the current intensities used (0.4, 1.0 and 2.0 mA per 0.5 cm²) and their duration (4 and 10 min). Sustained and highest levels of MP were achieved in the choroid and retina of rabbit eyes treated with the highest current and 10 min duration of CCI. No clinical toxicity or histological lesions were observed following CCI. Negligible amounts of MP were found in ocular tissues in the CCI control group without application of current. Compared to i.v. administration, CCI achieved higher and more sustained tissue concentrations with negligible systemic absorption. These data demonstrate that high levels of MP can be safely achieved in intraocular tissues and fluids of the rabbit eye using CCI. With this system, intraocular tissue levels of MP are higher than those achieved after i.v. injection. Furthermore, if needed, the drug levels achieved with CCI can be modulated as a function of current intensity and duration of treatment. CCI could therefore be used as an alternative method for the delivery of high levels of MP to the intraocular tissues of both the anterior and posterior segments.

Relevance: 30.00%

Abstract:

Background: The expected benefit of transvaginal specimen extraction is reduced incision-related morbidity. Objective: A systematic review of transvaginal specimen extraction in colorectal surgery was carried out to assess this expectation. Method: The following keywords, in various combinations, were searched: NOSE (natural orifice specimen extraction), colorectal, colon surgery, transvaginal, right hemicolectomy, left hemicolectomy, low anterior resection, sigmoidectomy, ileocaecal resection, proctocolectomy, colon cancer, sigmoid diverticulitis and inflammatory bowel diseases. Selection criteria were large bowel resection with transvaginal specimen extraction, laparoscopic approach, human studies and English language; exclusion criteria were experimental studies and laparotomic approach or local excision. All articles published up to February 2011 were included. Results: Twenty-three articles (including a total of 130 patients) fulfilled the search criteria. The primary diagnosis was colorectal cancer in 51% (67) of patients, endometriosis in 46% (60) of patients and other conditions in the remaining patients. A concurrent gynaecological procedure was performed in 17% (22) of patients. One case of conversion to laparotomy was reported. In two patients, transvaginal extraction failed. In left- and right-sided resections, the rate of severe complications was 3.7% and 2%, respectively. Two significant complications, one pelvic seroma and one rectovaginal fistula, were likely to have been related to transvaginal extraction. The degree of follow-up was specified in only one study. Harvested nodes and negative margins were adequate and reported in 70% of oncological cases. Conclusion: Vaginal extraction of a colorectal surgery specimen shows potential benefit, particularly when associated with a gynaecological procedure. Data from prospective randomized trials are needed to support the routine use of this technique.

Relevance: 30.00%

Abstract:

Background: A rapid phage display method for the elucidation of cognate peptide-specific ligands for receptors is described. The approach may be readily integrated into the interface of genomic and proteomic studies to identify biologically relevant ligands. Methods: A gene fragment library from the influenza coat protein haemagglutinin (HA) gene was constructed by treating HA cDNA with DNAse I to create 50-100 bp fragments. These fragments were cloned into plasmid pORFES IV and in-frame inserts were selected. These in-frame fragment inserts were subsequently cloned into a filamentous phage display vector, JC-M13-88, for surface display as fusions to a synthetic copy of gene VIII. Two well characterized antibodies, mAb 12CA5 and pAb 07431, directed against distinct known regions of HA, were used to pan the library. Results: Two linear epitopes, HA peptides 112-126 and 162-173, recognized by mAb 12CA5 and pAb 07431, respectively, were identified as the cognate epitopes. Conclusion: This approach is a useful alternative to conventional methods such as screening of overlapping synthetic peptide libraries or gene fragment expression libraries when searching for precise peptide-protein interactions, and may be applied to functional proteomics.

Relevance: 30.00%

Abstract:

The main objective of this work was to compare two methods of estimating the deposition of pesticide applied by aerial spraying. One hundred and fifty pieces of water-sensitive paper were distributed over an area 50 m long by 75 m wide for sampling droplets sprayed by an aircraft calibrated to apply a spray volume of 32 L/ha. The samples were analysed by a visual microscopic method using an NG 2 Porton graticule and by an image-analyser computer program. The results obtained by the visual microscopic method were the following: volume median diameter, 398±62 μm; number median diameter, 159±22 μm; droplet density, 22.5±7.0 droplets/cm²; and estimated deposited volume, 22.2±9.4 L/ha. The respective values obtained with the computer program were 402±58 μm, 161±32 μm, 21.9±7.5 droplets/cm² and 21.9±9.2 L/ha. Graphs of the spatial distribution of droplet density and deposited spray volume over the area were produced by the computer program.
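As an illustration of how such quantities are derived from the stains on a card (a generic sketch, not the cited image-analysis program), the following Python function computes the volume median diameter, number median diameter, droplet density and deposited volume from a list of measured droplet diameters; the spread-factor correction that converts stain size to droplet size is deliberately omitted.

import numpy as np

def droplet_spectrum(diameters_um, card_area_cm2):
    # diameters_um: droplet diameters in micrometres for one card.
    d = np.sort(np.asarray(diameters_um, dtype=float))
    vol_um3 = np.pi / 6.0 * d ** 3                   # per-droplet volume
    cum_vol = np.cumsum(vol_um3) / vol_um3.sum()     # cumulative volume fraction
    vmd = np.interp(0.5, cum_vol, d)                 # volume median diameter
    nmd = np.median(d)                               # number median diameter
    density = d.size / card_area_cm2                 # droplets per cm^2
    # 1 um^3 = 1e-15 L; 1 ha = 1e8 cm^2, so L/ha = total_L / area_cm2 * 1e8
    deposit_l_per_ha = vol_um3.sum() * 1e-15 / card_area_cm2 * 1e8
    return vmd, nmd, density, deposit_l_per_ha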

Relevance: 30.00%

Abstract:

Increasing anthropogenic pressures call for enhanced knowledge and understanding of the current state of marine biodiversity. This baseline information is pivotal for exploring present trends, detecting future modifications and proposing adequate management actions for marine ecosystems. Coralligenous outcrops are a highly diverse and structurally complex deep-water habitat faced with major threats in the Mediterranean Sea. Despite their ecological, aesthetic and economic value, coralligenous biodiversity patterns are still poorly understood. There is currently no single sampling method that has been demonstrated to be sufficiently representative to ensure adequate community assessment and monitoring in this habitat. We therefore propose a rapid, non-destructive protocol for biodiversity assessment and monitoring of coralligenous outcrops that provides good estimates of their structure and species composition, based on photographic sampling and the determination of presence/absence of macrobenthic species. We used an extensive photographic survey covering several spatial scales (100s of m to 100s of km) within the NW Mediterranean and including two different coralligenous assemblages: the Paramuricea clavata assemblage (PCA) and the Corallium rubrum assemblage (CRA). This approach allowed us to determine the minimal sampling area for each assemblage (5000 cm² for PCA and 2500 cm² for CRA). In addition, we conclude that three replicates provide an optimal sampling effort to maximize the species number and to assess the main biodiversity patterns of the studied assemblages in variability studies requiring replicates. We contend that the proposed sampling approach provides a valuable tool for management and conservation planning, monitoring and research programs focused on coralligenous outcrops, and is potentially applicable in other benthic ecosystems as well.
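Minimal sampling areas and replicate numbers in such photographic protocols are typically read off species-accumulation curves. The sketch below is an assumption about that general technique, not the authors' analysis: it averages the accumulation curve over random orderings of quadrats from a presence/absence matrix.

import numpy as np

def species_accumulation(presence, n_perm=500, seed=0):
    # presence: quadrat-by-species presence/absence matrix (rows = photo quadrats).
    rng = np.random.default_rng(seed)
    presence = np.asarray(presence, dtype=bool)
    n_quadrats = presence.shape[0]
    curves = np.zeros((n_perm, n_quadrats))
    for p in range(n_perm):
        order = rng.permutation(n_quadrats)
        seen = np.cumsum(presence[order], axis=0) > 0   # species seen so far
        curves[p] = seen.sum(axis=1)                    # richness after k quadrats
    return curves.mean(axis=0)

# The minimal sampling area is then the cumulative photographed area at which
# the mean curve flattens (e.g. where one more quadrat adds <5% new species).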

Relevance: 30.00%

Abstract:

Background: Bone health is a concern when treating early-stage breast cancer patients with adjuvant aromatase inhibitors. Early detection of patients (pts) at risk of osteoporosis and fractures may be helpful for starting preventive therapies and selecting the most appropriate endocrine therapy schedule. We present statistical models describing the evolution of lumbar and hip bone mineral density (BMD) in pts treated with tamoxifen (T), letrozole (L) and sequences of T and L. Methods: Available dual-energy X-ray absorptiometry exams (DXA) of pts treated in trial BIG 1-98 were retrospectively collected from Swiss centers. The treatment arms were: A) T for 5 years, B) L for 5 years, C) 2 years of T followed by 3 years of L, and D) 2 years of L followed by 3 years of T. Pts without DXA were used as a control for detecting selection biases. Patients randomized to arm A were subsequently allowed an unplanned switch from T to L. Allowing for variations between DXA machines and centres, two repeated-measures models, using a covariance structure that allows for different times between DXA, were used to estimate changes in hip and lumbar BMD (g/cm²) from trial randomization. Prospectively defined covariates at the time of trial randomization, considered as fixed effects in the multivariable models in an intention-to-treat analysis, were: age, height, weight, hysterectomy, race, known osteoporosis, tobacco use, prior bone fracture, prior hormone replacement therapy (HRT), bisphosphonate use and previous neo-/adjuvant chemotherapy (ChT). Similarly, the T-scores for lumbar and hip BMD measurements were modeled using a per-protocol approach (allowing for the treatment switch in arm A), specifically studying the effect of each therapy on the T-score percentage. Results: A total of 247 out of 546 pts had between 1 and 5 DXA; a total of 576 DXA were collected. The numbers of DXA measurements per arm were: arm A, 133; B, 137; C, 141; and D, 135. The median follow-up time was 5.8 years. Factors significantly positively correlated with lumbar and hip BMD in the multivariate analysis were weight, previous HRT use, neo-/adjuvant ChT, hysterectomy and height. Significantly negatively correlated factors in the models were osteoporosis, treatment arm (B/C/D vs. A), time since endocrine therapy start, age and smoking (current vs. never). Modeling the T-score percentage, the differences from T to L were -4.199% (p = 0.036) and -4.907% (p = 0.025) for the hip and lumbar measurements, respectively, before any treatment switch occurred. Conclusions: Our statistical models describe the lumbar and hip BMD evolution for pts treated with L and/or T. The results for both localisations confirm that, contrary to expectation, the sequential schedules do not seem less detrimental to BMD than L monotherapy. The estimated difference in BMD T-score percentage is at least 4% from T to L.
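A minimal sketch of the kind of repeated-measures model described above, using a random intercept per patient to absorb within-patient correlation between DXA exams; the trial analysis used a more elaborate covariance structure, and every file and variable name here is hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per DXA exam, keyed by patient.
dxa = pd.read_csv("big198_dxa_long.csv")  # hypothetical file name

# Hip BMD change modelled with the covariates listed in the abstract as fixed
# effects and a per-patient random intercept for the repeated measurements.
model = smf.mixedlm(
    "hip_bmd_change ~ arm + age + height + weight + hysterectomy"
    " + known_osteoporosis + tobacco + prior_fracture + prior_hrt"
    " + bisphosphonate + prior_chemo + years_since_start",
    data=dxa,
    groups=dxa["patient_id"],
)
print(model.fit().summary())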

Relevance: 30.00%

Abstract:

Due to the advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather. In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for mapping meteorological variables in complex orography. With the advent of high-resolution digital elevation models, the field of spatial prediction met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold-air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes. Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, making the use of classical geostatistics difficult. The challenges explored during the thesis are manifold. First, the complexity of models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple-kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions. The resulting maps of average wind speeds find applications within renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
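A minimal sketch of the kernel-based regression setting described above, using scikit-learn's SVR to map station wind speeds onto a DEM grid from geographic and terrain features; the feature files, array shapes and hyperparameters are hypothetical placeholders, not the thesis implementation.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical arrays: station coordinates plus DEM-derived terrain features
# (slope, curvature, exposure at several smoothing scales) and the measured
# mean wind speed at each station.
X_stations = np.load("station_features.npy")   # (n_stations, n_features)
y_wind = np.load("station_wind_speed.npy")     # (n_stations,)
X_grid = np.load("grid_features.npy")          # (n_grid_cells, n_features)

# Kernel-based regression of wind speed on geo-features; C and epsilon would
# normally be tuned by cross-validation to control model complexity.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_stations, y_wind)
wind_map = model.predict(X_grid)               # predicted speed on the DEM grid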

Relevance: 30.00%

Abstract:

The objective of this work was to evaluate the efficiency of a new method developed to predict the density and floristic composition of weed communities in field crops. Based on the use of solaria (100 mm transparent plastic tarps lying on the soil) to stimulate weed seedling emergence, the method was tested in Tandil, Argentina, from 1998 to 2001. The system involved corn and sunflower in a commercial no-till system. Major weeds in the experiments included Digitaria sanguinalis, Setaria verticillata and S. viridis, which accounted for 98% of the weed community in the three years of experiments since 1998. Large numbers of Tagetes minuta, Chenopodium album and Ammi majus were present in 2001. Comparison of weed communities under solaria with communities in field crops indicated that the method is useful for predicting the presence and density of some major weed species, at both high and low densities of individuals, in areas of 10 ha using only five solaria. At low weed densities the method is particularly useful for helping to decide the timing of herbicide applications and thus avoid soil contamination.

Relevance: 30.00%

Abstract:

Iowa state, county, and city engineering offices expend considerable effort monitoring the state's approximately 25,000 bridges, most of which span small waterways. In fact, the need for monitoring is greater for bridges over small waterways because scour processes are exacerbated by the close proximity of abutments, piers, channel banks, approach embankments, and other local obstructions. The bridges are customarily inspected biennially by the county road department's bridge inspectors. It is extremely time-consuming and difficult to obtain consistent, reliable, and timely information on bridge-waterway conditions for so many bridges. Moreover, the current approaches to gathering survey information are not uniform, complete, or quantitative. The methodology and associated software (DIGIMAP) developed through the present project enable a non-intrusive means of conducting fast, efficient, and accurate inspection of the waterways in the vicinity of bridges and culverts using a single technique. The technique combines image registration and velocimetry algorithms applied to images acquired with conventional devices at the inspection site. Comparison of current bridge inspection and monitoring methods with the DIGIMAP methodology shows that the new procedure assembles quantitative information on waterway hydrodynamic and morphologic features with considerably reduced effort, time, and cost. It also improves the safety of bridge and culvert inspections conducted during normal and extreme hydrologic events. The data and information are recorded in a digital format, enabling immediate and convenient tracking of waterway changes over short or long time intervals.
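As an illustration of the image-registration building block mentioned above (a generic phase-correlation sketch, not the DIGIMAP code), the function below estimates the integer pixel shift between two frames; a velocimetry step would then track such displacements of water-surface features between successive frames.

import numpy as np

def phase_correlation_shift(ref, img):
    # Estimate the integer (row, col) shift that, applied to img, aligns it
    # with ref, using normalized cross-power spectra (phase correlation).
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(img)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12       # keep phase only
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks beyond half the image size correspond to negative shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))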

Relevance: 30.00%

Abstract:

CONTEXT: In the Health Outcomes and Reduced Incidence with Zoledronic Acid Once Yearly - Pivotal Fracture Trial (HORIZON-PFT), zoledronic acid (ZOL) 5 mg significantly reduced fracture risk. OBJECTIVE: The aim of the study was to identify factors associated with greater efficacy during ZOL 5 mg treatment. DESIGN, SETTING, AND PATIENTS: We conducted a subgroup analysis (preplanned and post hoc) of a multicenter, double-blind, placebo-controlled, 36-month trial in 7765 women with postmenopausal osteoporosis. INTERVENTION: A single infusion of ZOL 5 mg or placebo was administered at baseline, 12, and 24 months. MAIN OUTCOME MEASURES: Primary endpoints were new vertebral fracture and hip fracture. Secondary endpoints were nonvertebral fracture and change in femoral neck bone mineral density (BMD). Baseline risk factor subgroups were age, BMD T-score and vertebral fracture status, total hip BMD, race, weight, geographical region, smoking, height loss, history of falls, physical activity, prior bisphosphonates, creatinine clearance, body mass index, and concomitant osteoporosis medications. RESULTS: Greater ZOL-induced effects on vertebral fracture risk were seen with younger age (treatment-by-subgroup interaction, P = 0.05), normal creatinine clearance (P = 0.04), and body mass index ≥25 kg/m² (P = 0.02). There were no significant treatment-factor interactions for hip or nonvertebral fracture or for change in BMD. CONCLUSIONS: ZOL appeared more effective in preventing vertebral fracture in younger women, overweight/obese women, and women with normal renal function. ZOL had similar effects irrespective of fracture risk factors or femoral neck BMD.

Relevance: 30.00%

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies. The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions for classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to be implemented as predictive engines in decision support systems for the purposes of environmental data mining, including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces.
The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multilayer perceptron (MLP), a workhorse of machine learning; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organising (Kohonen) maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, such as experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations which helps to understand the presence of spatial patterns, at least those described by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a highly topical subject: automatic mapping of geospatial data. General regression neural networks are proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions.
The thesis consists of four chapters with the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessments and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well; the software is user-friendly and easy to use.
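Since the GRNN is, in essence, Nadaraya-Watson kernel regression with a Gaussian kernel, a minimal sketch of automatic spatial mapping with it fits in a few lines; the stations, values, grid and bandwidth below are hypothetical, and in practice the bandwidth sigma would be tuned by cross-validation as described above.

import numpy as np

def grnn_predict(train_xy, train_values, query_xy, sigma):
    # General Regression Neural Network (Nadaraya-Watson kernel regression):
    # each prediction is a Gaussian-weighted average of the training values,
    # with the bandwidth sigma controlling the smoothness of the map.
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=-1)
    weights = np.exp(-d2 / (2.0 * sigma ** 2))
    return weights @ train_values / weights.sum(axis=1)

# Hypothetical usage: interpolate 50 point measurements onto a regular grid.
rng = np.random.default_rng(0)
stations = rng.uniform(0, 100, size=(50, 2))
values = np.sin(stations[:, 0] / 20.0) + 0.1 * rng.standard_normal(50)
gx, gy = np.meshgrid(np.linspace(0, 100, 60), np.linspace(0, 100, 60))
grid = np.column_stack([gx.ravel(), gy.ravel()])
field = grnn_predict(stations, values, grid, sigma=8.0)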

Relevance: 30.00%

Abstract:

The global structural connectivity of the brain, the human connectome, is now accessible at millimeter scale with the use of MRI. In this paper, we describe an approach to map the connectome by constructing normalized whole-brain structural connection matrices derived from diffusion MRI tractography at 5 different scales. Using a template-based approach to match cortical landmarks of different subjects, we propose a robust method that allows (a) the selection of identical cortical regions of interest of desired size and location in different subjects, with identification of the associated fiber tracts, (b) straightforward construction and interpretation of anatomically organized whole-brain connection matrices, and (c) statistical inter-subject comparison of brain connectivity at various scales. The fully automated post-processing steps necessary to build such matrices are detailed in this paper. Extensive validation tests are performed to assess the reproducibility of the method in a group of 5 healthy subjects, and its reliability is also discussed at length in a group of 20 healthy subjects.
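As a rough sketch of what building such a connection matrix involves (an assumption about the general technique, not the paper's pipeline), the function below counts streamlines whose endpoints fall in each pair of regions of a cortical parcellation and normalizes the result into a whole-brain matrix.

import numpy as np

def connection_matrix(streamline_endpoints, parcellation, n_regions):
    # streamline_endpoints: (n_streamlines, 2, 3) voxel indices of both ends;
    # parcellation: 3-D integer label volume (0 = background, 1..n_regions).
    C = np.zeros((n_regions + 1, n_regions + 1))
    for end_a, end_b in streamline_endpoints:
        a = parcellation[tuple(np.asarray(end_a, dtype=int))]
        b = parcellation[tuple(np.asarray(end_b, dtype=int))]
        if a > 0 and b > 0:              # ignore endpoints outside the cortex
            C[a, b] += 1
            if a != b:
                C[b, a] += 1             # keep the matrix symmetric
    C = C[1:, 1:]                        # drop the background row/column
    return C / C.sum()                   # normalized whole-brain matrix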