915 results for building information model
Abstract:
Advanced kernel methods for remote sensing image classification. Devis Tuia, Institut de Géomatique et d'Analyse du Risque, September 2009.
The technical developments of recent years have brought the quantity and quality of digital information to an unprecedented level, as enormous archives of satellite images have become available to users. However, even if these advances open more and more possibilities in the use of digital imagery, they also raise several problems of storage and processing. The latter is considered in this thesis: the processing of very high spatial and spectral resolution images is treated with data-driven approaches relying on kernel methods. In particular, the problem of image classification, i.e. the categorization of an image's pixels into a reduced number of classes reflecting spectral and contextual properties, is studied through the different models presented.
The accent is put on algorithmic efficiency and on the simplicity of the proposed approaches, to avoid overly complex models that practitioners would not adopt. The major challenge of the thesis is to remain close to concrete remote sensing problems without losing methodological interest from the machine learning viewpoint: in this sense, this work aims at building a bridge between the machine learning and remote sensing communities, and all the proposed models have been developed keeping in mind the need for such a synergy. Four models are proposed. First, an adaptive model that learns the relevant image features addresses the high dimensionality and collinearity of image features: a ranking of the variables (the spectral bands) is optimized jointly with the base model, so that only the features relevant to the problem are used by the classifier, automatically yielding both an accurate classifier and a ranking of the relevance of the individual features. The scarcity and unreliability of labeled information are the common root of the second and third models: when confronted with such problems, the user can either construct the labeled set iteratively by direct interaction with the machine, or use the unlabeled data to increase the robustness and quality of the data description. Both solutions have been explored, resulting in two methodological contributions based respectively on active learning and semi-supervised learning. Finally, the more theoretical issue of structured outputs is considered in the last model, which, by integrating output similarity into the model (a source of information not previously considered in remote sensing), opens new challenges and opportunities for remote sensing image processing.
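A minimal sketch of the uncertainty-driven active learning loop described above (the second model), in which the machine queries the user for the labels it is least sure of. Synthetic data and scikit-learn's SVC stand in for the satellite image and the thesis' actual kernel classifier:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# synthetic "pixels": 10 spectral features, 3 land-cover classes
X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)
labeled = list(range(30))                       # small initial training set
pool = [i for i in range(len(X)) if i not in labeled]

for _ in range(10):                             # ten user-machine interactions
    clf = SVC(kernel="rbf", gamma="scale").fit(X[labeled], y[labeled])
    # margin sampling: query the pooled pixel closest to a decision boundary
    margins = np.abs(clf.decision_function(X[pool])).min(axis=1)
    query = pool[int(np.argmin(margins))]
    labeled.append(query)                       # the "oracle" (user) labels it
    pool.remove(query)

print(f"training set grown to {len(labeled)} labeled pixels")
```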
Abstract:
The observation that real complex networks have internal structure has important implications for dynamic processes occurring on such topologies. Here we investigate the impact of community structure on a model of information transfer able to deal with both search and congestion simultaneously. We show that networks with fuzzy community structure are more efficient in terms of packet delivery than those with pronounced community structure. We also propose an alternative packet routing algorithm which takes advantage of the knowledge of communities to improve information transfer, and show that in the context of the model an intermediate level of community structure is optimal. Finally, we show that in a hierarchical network setting, providing knowledge of communities at the level of highest modularity will improve network capacity by the largest amount.
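As an illustration of the idea of community-aware routing (a sketch only, not the paper's algorithm), the following toy example builds a modular graph with networkx and forwards packets greedily, preferring neighbours that share the destination's community:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# a modular toy network: 4 communities of 25 nodes each
G = nx.planted_partition_graph(l=4, k=25, p_in=0.3, p_out=0.02, seed=1)
member = {n: i for i, c in enumerate(greedy_modularity_communities(G)) for n in c}

def route(src, dst, max_hops=100):
    """Local greedy forwarding: deliver if the destination is adjacent,
    otherwise prefer unvisited neighbours in the destination's community,
    falling back on high-degree hubs."""
    path, visited = [src], {src}
    while path[-1] != dst and len(path) < max_hops:
        here = path[-1]
        if G.has_edge(here, dst):
            path.append(dst)
            break
        nbrs = [n for n in G.neighbors(here) if n not in visited] \
               or list(G.neighbors(here))
        same_community = [n for n in nbrs if member[n] == member[dst]]
        nxt = max(same_community or nbrs, key=G.degree)
        path.append(nxt)
        visited.add(nxt)
    return path

path = route(0, 99)
print(f"delivered: {path[-1] == 99}, hops: {len(path) - 1}")
```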
Abstract:
The Bridges Decision Support Model is a geographic information system (GIS) that assembles existing data on archaeological sites, surveys, and their geologic contexts to assess the risk that bridge replacement projects will encounter 13,000- to 150-year-old Native American sites. This project identifies critical variables for assessing prehistoric site potential, examines the quality of available data about those variables, and applies the data to creating a decision support framework for use by the Iowa Department of Transportation (Iowa DOT) and others. An analysis of previous archaeological surveys indicates that subsurface testing to discover buried sites became increasingly common after 1980, but did not become routine until guidelines recommending such testing were adopted in 1993. Even then, the average depth of testing has been relatively shallow: alluvial deposits of sufficient age, laid down in depositional environments conducive to human habitation, are considerably thicker than archaeologists have routinely tested.
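A hypothetical screening rule in the spirit of such a decision support framework might flag projects where datable alluvium extends deeper than previous testing; all column names, thresholds, and figures below are invented for illustration:

```python
import pandas as pd

projects = pd.DataFrame({
    "bridge_id":       ["B-101", "B-102", "B-103"],
    "alluvium_age_yr": [11000, 100, 8000],   # age of datable deposits
    "deposit_depth_m": [4.5, 1.0, 3.2],      # thickness of alluvium
    "tested_depth_m":  [1.2, 1.0, 3.5],      # deepest prior survey test
})

# deposits old enough to hold 13,000- to 150-year-old sites...
in_window = projects["alluvium_age_yr"].between(150, 13000)
# ...that previous surveys have not tested to full depth
undertested = projects["deposit_depth_m"] > projects["tested_depth_m"]
projects["site_risk"] = (in_window & undertested).map({True: "high", False: "low"})
print(projects)
```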
Abstract:
Neuropathic pain is a major health issue and is frequently accompanied by allodynia (painful sensations in response to normally non-painful stimulations) and unpleasant paresthesia/dysesthesia, pointing to alterations in sensory pathways normally dedicated to the processing of non-nociceptive information. Interestingly, mounting evidence indicates that central glial cells are key players in allodynia, partly due to changes in the astrocytic capacity to scavenge extracellular glutamate and gamma-aminobutyric acid (GABA) through changes in their respective transporters (EAAT and GAT). In the present study, we investigated the glial changes occurring in the dorsal column nuclei, the major target of normally innocuous sensory information, in the rat spared nerve injury (SNI) model of neuropathic pain. We report that, together with a robust microglial and astrocytic reaction in the ipsilateral gracile nucleus, the GABA transporter GAT-1 is upregulated with no change in GAT-3 or glutamate transporters. Furthermore, [3H]GABA reuptake on crude synaptosome preparations shows that transporter activity is functionally increased ipsilaterally in SNI rats. This GAT-1 upregulation appears evenly distributed in the gracile nucleus and colocalizes with astrocytic activation. Neither glial activation nor GAT-1 modulation was detected in the cuneate nucleus. Together, the present results point to GABA transport in the gracile nucleus as a putative therapeutic target against abnormal sensory perceptions related to neuropathic pain.
Abstract:
This research describes the process followed to assemble a Social Accounting Matrix for Spain corresponding to the year 2000 (SAMSP00). As argued in the paper, this process attempts to reconcile ESA95 conventions with the requirements of applied general equilibrium modelling. In particular, problems related to the level of aggregation of net taxation data, and to the valuation system used for expressing the monetary value of input-output transactions, have received special attention. Since the adoption of ESA95 conventions, input-output transactions have preferably been valued at basic prices, which imposes additional difficulties on modellers interested in computing applied general equilibrium models. This paper addresses these difficulties by developing a procedure that allows SAM-builders to change the valuation system of input-output transactions conveniently. In addition, this procedure produces new data related to net taxation information.
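As a stylised sketch of what such a revaluation involves (not the paper's actual procedure), the following example moves a small input-output table from purchasers' prices to basic prices by stripping net taxes on products and reallocating trade and transport margins to the trade sector; all figures and rates are invented:

```python
import numpy as np

Z_purch = np.array([[10., 4., 6.],   # intermediate flows, purchasers' prices
                    [ 5., 8., 3.],
                    [ 2., 6., 9.]])
net_tax_rate = np.array([0.10, 0.05, 0.00])  # taxes less subsidies, by product
margin_rate  = np.array([0.08, 0.00, 0.02])  # trade & transport margins, by product
TRADE = 1                                    # row index of the trade sector

taxes   = Z_purch * net_tax_rate[:, None]
margins = Z_purch * margin_rate[:, None]
Z_basic = Z_purch - taxes - margins
Z_basic[TRADE] += margins.sum(axis=0)        # margins become trade-sector output

print(np.round(Z_basic, 2))
print("net taxes by user:", np.round(taxes.sum(axis=0), 2))
```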
Abstract:
AIM: Specific factors responsible for interindividual variability should be identified and their contribution quantified to improve the usefulness of biological monitoring. Among others, age is an easily identifiable determinant that can have an important impact on biological variability. MATERIALS AND METHODS: A compartmental toxicokinetic model developed in previous studies for a series of metallic and organic compounds was applied to the description of age differences. Physiological and metabolic parameters for young males, based on Reference Man information, were taken from preceding studies and modified to account for age, using available information about age differences. RESULTS: Numerical simulation using the kinetic model with the modified parameters indicates important differences due to age in some cases. The expected changes are mostly of the order of 10-20%, but differences up to 50% were observed in some cases. CONCLUSION: These differences appear to depend on the chemical and on the biological entity considered. Further work should be done to improve our estimates of these parameters, for example by considering their uncertainty and variability.
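A deliberately generic one-compartment sketch (not the authors' multi-compartment model) shows how an age-scaled physiological parameter propagates to a biomarker level; the 25% clearance reduction with age is an illustrative assumption:

```python
def steady_state_conc(dose_rate, clearance):
    """Steady-state blood concentration under first-order elimination."""
    return dose_rate / clearance

dose_rate = 1.0              # mg/h, constant exposure
cl_young  = 5.0              # L/h, Reference Man clearance
cl_old    = cl_young * 0.75  # assumed 25% lower clearance at older age

c_young = steady_state_conc(dose_rate, cl_young)
c_old   = steady_state_conc(dose_rate, cl_old)
print(f"relative change with age: {100 * (c_old - c_young) / c_young:+.0f}%")
```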
Abstract:
Understanding and anticipating biological invasions can focus either on traits that favour species invasiveness or on features of the receiving communities, habitats or landscapes that promote their invasibility. Here, we address invasibility at the regional scale, testing whether some habitats and landscapes are more invasible than others by fitting models that relate alien plant species richness to various environmental predictors. We use a multi-model information-theoretic approach to assess invasibility by modelling spatial and ecological patterns of alien invasion in landscape mosaics and testing competing hypotheses of environmental factors that may control invasibility. Because invasibility may be mediated by particular characteristics of invasiveness, we classified alien species according to their C-S-R plant strategies. We illustrate this approach with a set of 86 alien species in Northern Portugal. We first focus on predictors influencing species richness and expressing invasibility and then evaluate whether distinct plant strategies respond to the same or different groups of environmental predictors. We confirmed climate as a primary determinant of alien invasions and as a primary environmental gradient determining landscape invasibility. The effects of secondary gradients were detected only when the area was sub-sampled according to predictions based on the primary gradient. Then, multiple predictor types influenced patterns of alien species richness, with some types (landscape composition, topography and fire regime) prevailing over others. Alien species richness responded most strongly to extreme land management regimes, suggesting that intermediate disturbance induces biotic resistance by favouring native species richness. Land-use intensification facilitated alien invasion, whereas conservation areas hosted few invaders, highlighting the importance of ecosystem stability in preventing invasions. Plants with different strategies exhibited different responses to environmental gradients, particularly when the variations of the primary gradient were narrowed by sub-sampling. Such differential responses of plant strategies suggest using distinct control and eradication approaches for different areas and alien plant groups.
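The multi-model information-theoretic approach can be sketched as fitting competing GLMs of alien species richness and comparing their Akaike weights; the data, predictor names, and candidate models below are synthetic stand-ins, and statsmodels is an assumption:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({"climate": rng.normal(size=n),
                   "land_use": rng.normal(size=n),
                   "fire": rng.normal(size=n)})
lam = np.exp(0.5 + 0.8 * df["climate"] + 0.3 * df["land_use"])
df["richness"] = rng.poisson(lam)              # alien species counts

hypotheses = {"climate only": ["climate"],
              "climate + land use": ["climate", "land_use"],
              "full": ["climate", "land_use", "fire"]}
aic = {}
for name, cols in hypotheses.items():
    X = sm.add_constant(df[cols])
    aic[name] = sm.GLM(df["richness"], X, family=sm.families.Poisson()).fit().aic

# Akaike weights: relative support for each competing hypothesis
delta = {k: v - min(aic.values()) for k, v in aic.items()}
w = {k: np.exp(-d / 2) for k, d in delta.items()}
total = sum(w.values())
for k in aic:
    print(f"{k:20s} AIC={aic[k]:7.1f}  weight={w[k] / total:.2f}")
```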
Abstract:
Information technology will affect academic activities as well as the nature of the higher education sector. Besides assimilating these technologies, this sector will need to meet the requirements of market globalization, and all of these changes will be reflected in the university library. Prospective impacts will affect its structure (emphasis on user services, outsourcing of several services), its financing (growth of consortia to reduce costs), its services (electronic reference, support for distance education programs, intelligent agents) and its clientele (meeting the great demand for higher education, which implies a more diverse user population).
Abstract:
A number of geophysical methods, such as ground-penetrating radar (GPR), have the potential to provide valuable information on hydrological properties in the unsaturated zone. In particular, the stochastic inversion of such data within a coupled geophysical-hydrological framework may allow for the effective estimation of vadose zone hydraulic parameters and their corresponding uncertainties. A critical issue in stochastic inversion is choosing prior parameter probability distributions from which potential model configurations are drawn and tested against observed data. A well-chosen prior should reflect as honestly as possible the initial state of knowledge regarding the parameters and be neither overly specific nor too conservative. In a Bayesian context, combining the prior with available data yields a posterior state of knowledge about the parameters, which can then be used statistically for predictions and risk assessment. Here we investigate the influence of prior information regarding the van Genuchten-Mualem (VGM) parameters, which describe vadose zone hydraulic properties, on the stochastic inversion of crosshole GPR data collected under steady-state, natural-loading conditions. We do this using a Bayesian Markov chain Monte Carlo (MCMC) inversion approach, considering first noninformative uniform prior distributions and then more informative priors derived from soil property databases. For the informative priors, we further explore the effect of including information regarding parameter correlation. Analysis of both synthetic and field data indicates that the geophysical data alone contain valuable information regarding the VGM parameters. However, significantly better results are obtained when we combine these data with a realistic, informative prior.
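A toy Metropolis-Hastings comparison of a noninformative uniform prior against an informative Gaussian prior for a single VGM-like parameter; the forward model, noise level, and prior values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
true_theta = 0.30                                # e.g. log10 of the VGM "n"
data = true_theta + 0.15 * rng.normal(size=5)    # noisy "geophysical" data

def log_like(theta):
    return -0.5 * np.sum((data - theta) ** 2) / 0.15 ** 2

def log_prior_uniform(theta):                    # noninformative
    return 0.0 if 0.0 < theta < 1.0 else -np.inf

def log_prior_informative(theta):                # e.g. from a soil database
    return -0.5 * ((theta - 0.25) / 0.05) ** 2

def mcmc(log_prior, n_iter=20000, step=0.05):
    theta, chain = 0.5, []
    lp = log_like(theta) + log_prior(theta)
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = log_like(prop) + log_prior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis acceptance
            theta, lp = prop, lp_prop
        chain.append(theta)
    return np.array(chain[5000:])                 # discard burn-in

for name, prior in [("uniform", log_prior_uniform),
                    ("informative", log_prior_informative)]:
    c = mcmc(prior)
    print(f"{name:12s} posterior mean={c.mean():.3f}  sd={c.std():.3f}")
```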
Abstract:
For radiotherapy treatment planning of retinoblastoma in childhood, computed tomography (CT) represents the standard method for tumor volume delineation, despite some inherent limitations. The CT scan is very useful in providing information on physical density for dose calculation and morphological volumetric information, but presents a low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is currently used only for diagnosis and follow-up rather than being integrated into treatment planning. Our ultimate goal is the automatic segmentation of the gross tumor volume (GTV) in the 3D US, the segmentation of the organs at risk (OAR) in the CT, and the registration of both. In this paper, we present some preliminary results in this direction. We present a 3D active contour-based segmentation of the eyeball and the lens in CT images; the approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. Then, for the fusion of the 3D CT and US images, we present two approaches: (i) landmark-based transformation, and (ii) object-based transformation that makes use of eyeball contour information in the CT and US images.
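For the landmark-based transformation, a minimal least-squares rigid registration (the Kabsch algorithm) could look as follows; the landmark coordinates are fabricated, and the sketch is a generic method rather than the paper's exact procedure:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ≈ src @ R.T + t."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# four fabricated anatomical landmarks in CT coordinates (mm)
ct = np.array([[0., 0., 0.], [12., 0., 0.], [0., 12., 0.], [0., 0., 12.]])
theta = np.deg2rad(10)                       # simulate the US pose
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.],
                   [np.sin(theta),  np.cos(theta), 0.],
                   [0., 0., 1.]])
us = ct @ R_true.T + np.array([2., -1., 0.5])

R, t = rigid_transform(ct, us)
err = np.abs(ct @ R.T + t - us).max()
print(f"max alignment error: {err:.2e} mm")
```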
Abstract:
The construction of a second metro line (M2) through downtown Lausanne, begun in 2004, provided the opportunity to develop a methodology for microgravity surveys in a disturbed urban environment. Terrain corrections take on a special meaning in such a setting, where many non-geological objects of anthropogenic origin, notably empty basements of all kinds, perturb the gravity measurements. The civil engineering studies for the project provided a large amount of cadastral information, including building outlines, the planned position of the M2 tube, the depths of basements in the vicinity of the corridor, and the geology encountered along it (from the lithological data of geotechnical boreholes). The plan geometry of the basements was derived from the building outlines in a GIS (Geographic Information System), while a door-to-door survey was needed to measure or estimate basement heights. An existing one-metre-grid DEM (Digital Elevation Model) of the city of Lausanne could then be updated with the voids these basements represent.
The gravity measurement cycles were processed in Access databases, allowing greater control of the data, faster processing, and easier retroactive terrain corrections, notably when the topography was updated during construction. The Caroline quarter (between the Bessières bridge and Place de l'Ours) was chosen as the study area because both the pre- and post-excavation phases of the M2 tunnel fell within this thesis, which allowed two gravity surveys (before excavation in summer 2005 and after excavation in summer 2007). These reoccupations enabled us to test our model of the tunnel: by comparing the difference between the two surveys with the gravity response of the tube model, discretized into rectangular prisms, we were able to validate our modeling method. The method we developed allows the shape of the object under consideration to be built in detail, with the possibility of crossing geological interfaces and the topographic surface several times, and can be applied to any linear anthropogenic structure.
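The forward gravity response of a right rectangular prism, the building block used here to discretize the tunnel, has a classical closed form (Nagy, 1966). A sketch, with purely illustrative geometry and density contrast:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def prism_gz(x1, x2, y1, y2, z1, z2, rho):
    """Vertical attraction at the origin of a prism of density contrast
    rho (kg/m^3); coordinates in metres, z positive downward; result in mGal."""
    gz = 0.0
    for i, x in enumerate((x1, x2)):
        for j, y in enumerate((y1, y2)):
            for k, z in enumerate((z1, z2)):
                r = np.sqrt(x * x + y * y + z * z)
                sign = (-1.0) ** (i + j + k)
                gz -= sign * (x * np.log(y + r) + y * np.log(x + r)
                              - z * np.arctan2(x * y, z * r))
    return G * rho * gz * 1e5  # m/s^2 -> mGal

# a 6 m wide, 5 m high, 100 m long "tunnel" void, roof 10 m below the station
print(f"{prism_gz(-3, 3, -50, 50, 10, 15, rho=-2300):.4f} mGal")
```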
Abstract:
Identifying the geographic distribution of populations is a basic yet crucial step in many fundamental and applied ecological projects, as it provides key information on which many subsequent analyses depend. However, this task is often costly and time consuming, especially for rare species, for which most sampling designs prove inefficient; at the same time, rare species are those for which distribution data are most needed if their conservation is to be effective. To enhance fieldwork sampling, model-based sampling (MBS) uses predictions from species distribution models: when looking for the species in areas of high habitat suitability, the chances of finding them should be higher. We thoroughly tested the efficiency of MBS by conducting an extensive survey in the Swiss Alps, assessing the detection rate of three rare and five common plant species. For each species, habitat suitability maps were produced following an ensemble modeling framework combining two spatial resolutions and two modeling techniques. We tested the efficiency of MBS and the accuracy of our models by sampling 240 sites in the field (30 sites × 8 species). Across all species, the MBS approach proved effective. In particular, the MBS design alone led to the discovery of six presence sites of one rare plant, increasing the chances of finding this species from 0 to 50%. For common species, MBS doubled the discovery rate of new populations compared with random sampling. Habitat suitability maps resulting from the combination of the four individual modeling methods predicted the species' distributions well, and more accurately than the individual models. In conclusion, using MBS for fieldwork could efficiently help increase our knowledge of rare species distributions. More generally, we recommend using habitat suitability models to support conservation plans.
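Schematically, MBS amounts to averaging several suitability models into an ensemble and surveying the highest-ranked cells first; the models and landscape below are synthetic, and scikit-learn is an assumption:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
X_known = rng.normal(size=(150, 4))      # env. predictors at known sites
y_known = (X_known[:, 0] + 0.5 * X_known[:, 1]
           + rng.normal(0, 0.5, 150)) > 1  # presence/absence

X_landscape = rng.normal(size=(5000, 4))  # predictors for all candidate cells
# ensemble suitability: mean presence probability over two model types
ensemble = np.mean(
    [m.fit(X_known, y_known).predict_proba(X_landscape)[:, 1]
     for m in (RandomForestClassifier(random_state=0),
               LogisticRegression(max_iter=1000))],
    axis=0)

to_survey = np.argsort(ensemble)[::-1][:30]  # 30 most suitable cells first
print("cells to visit first:", to_survey[:10])
```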
Abstract:
Background: Conventional magnetic resonance imaging (MRI) techniques are highly sensitive for detecting multiple sclerosis (MS) plaques, enabling a quantitative assessment of inflammatory activity and lesion load. In quantitative analyses of focal lesions, manual or semi-automated segmentations have been widely used to compute the total number of lesions and the total lesion volume. These techniques, however, are both challenging and time-consuming, and are also prone to intra-observer and inter-observer variability. Aim: To develop an automated approach to segment brain tissues and MS lesions from brain MRI images, reducing user interaction and providing an objective tool that eliminates inter- and intra-observer variability. Methods: Based on the recent methods developed by Souplet et al. and de Boer et al., we propose a novel pipeline which includes the following steps: bias correction, skull stripping, atlas registration, tissue classification, and lesion segmentation. After the initial pre-processing steps, an MRI scan is automatically segmented into 4 classes: white matter (WM), grey matter (GM), cerebrospinal fluid (CSF) and partial volume. An expectation maximisation method which fits a multivariate Gaussian mixture model to the T1-w, T2-w and PD-w images is used for this purpose. Based on the obtained tissue masks, and using the estimated GM mean and variance, we apply an intensity threshold to the FLAIR image, which provides the lesion segmentation. To improve this initial result, spatial information coming from the neighbouring tissue labels is used to refine the final lesion segmentation. Results: The experimental evaluation was performed using real 1.5 T data sets and the corresponding ground truth annotations provided by expert radiologists. The following values were obtained: 64% true positive (TP) fraction, 80% false positive (FP) fraction, and an average surface distance of 7.89 mm. The results of our approach were quantitatively compared with our implementations of the works of Souplet et al. and de Boer et al., obtaining higher TP and lower FP values. Conclusion: Promising MS lesion segmentation results have been obtained in terms of TP. However, the high number of FP, still a well-known problem of all automated MS lesion segmentation approaches, has to be reduced before such methods can be used in standard clinical practice. Our future work will focus on tackling this issue.
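A simplified sketch of the tissue-classification and thresholding chain described in the Methods (an EM-fitted Gaussian mixture on multichannel intensities, then a GM-based intensity threshold on FLAIR); all intensities are simulated and the threshold factor k is a free parameter, not the published value:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# fake (T1, T2, PD) intensities for three tissue classes: WM, GM, CSF
means = np.array([[0.8, 0.3, 0.4], [0.6, 0.5, 0.6], [0.2, 0.9, 0.8]])
voxels = np.vstack([m + 0.05 * rng.normal(size=(1000, 3)) for m in means])
flair = np.concatenate([rng.normal(mu, 0.05, 1000) for mu in (0.4, 0.6, 0.2)])

# EM fit of a 3-component multivariate Gaussian mixture (tissue classification)
gmm = GaussianMixture(n_components=3, random_state=0).fit(voxels)
labels = gmm.predict(voxels)
gm = labels == np.argsort(gmm.means_[:, 0])[1]  # middle T1 intensity ~ GM

# lesion candidates: FLAIR voxels above GM mean + k standard deviations
k = 3.0
threshold = flair[gm].mean() + k * flair[gm].std()
lesion_candidates = flair > threshold
print(f"threshold={threshold:.2f}, candidate voxels={lesion_candidates.sum()}")
```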
Abstract:
Aim: Species distribution models (SDMs) based on current species ranges underestimate the potential distribution when projected in time and/or space. A multi-temporal model calibration approach has been suggested as an alternative, and we evaluate this using 13,000 years of data. Location: Europe. Methods: We used fossil-based records of presence for Picea abies, Abies alba and Fagus sylvatica and six climatic variables for the period 13,000 to 1000 yr BP. To measure the contribution of each 1000-year time step to the total niche of each species (the niche measured by pooling all the data), we employed a principal components analysis (PCA) calibrated with data over the entire range of possible climates. We then projected both the total niche and the partial niches from single time frames into the PCA space, and tested whether the partial niches were more similar to the total niche than expected at random. Using an ensemble forecasting approach, we calibrated SDMs for each time frame and for the pooled database, projected each model to current climate, and evaluated the results against current pollen data. We also projected all models into the future. Results: Niche similarity between the partial and the total SDMs was almost always statistically significant and increased through time. SDMs calibrated from single time frames gave different results when projected to current climate, providing evidence of a change in the species' realized niches through time. Moreover, they predicted limited climate suitability when compared with the total SDMs. The same results were obtained when projecting to future climates. Main conclusions: The realized climatic niche of species differed for current and future climates when SDMs were calibrated considering different past climates. Building the niche as an ensemble through time represents a way forward to a better understanding of a species' range and its ecology in a changing climate.
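A bare-bones version of the niche comparison step: calibrate a PCA on the full climate space, project pooled and single-time-frame occurrences, and score their overlap. The overlap metric (Schoener's D on gridded densities) is an assumption, and all data are synthetic:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
climate_space = rng.normal(size=(5000, 6))         # entire range of climates
pca = PCA(n_components=2).fit(climate_space)       # calibrated on full space

total_niche = rng.normal(0.3, 0.8, size=(800, 6))  # pooled occurrences
partial     = rng.normal(0.6, 0.5, size=(120, 6))  # one 1000-yr time frame

def density(points, bins=25, lim=(-3, 3)):
    """Occurrence density on a grid in the first two PCA axes."""
    scores = pca.transform(points)
    h, _, _ = np.histogram2d(scores[:, 0], scores[:, 1],
                             bins=bins, range=[lim, lim])
    return h / h.sum()

d_total, d_partial = density(total_niche), density(partial)
schoener_D = 1 - 0.5 * np.abs(d_total - d_partial).sum()
print(f"Schoener's D between partial and total niche: {schoener_D:.2f}")
```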