192 results for Additive hazards

at Université de Lausanne, Switzerland


Relevance:

20.00%

Publisher:

Abstract:

This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases where the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the analysis and modelling of geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns the incorporation of real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools.
To demonstrate the application of machine learning algorithms, several interesting case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazards risk analysis (avalanches, landslides); and assessment of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional geostatistical models.
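The regression/mapping task described above can be illustrated with a small numerical sketch. The code below is not the authors' implementation; it uses plain kernel ridge regression with a Gaussian kernel (a close relative of the SVM regression discussed here) on synthetic spatial data, with all coordinates, covariates and parameters invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 200 sample sites with (x, y) coordinates
# plus an elevation-like covariate, and a noisy nonlinear target field.
X = rng.uniform(0, 10, size=(200, 3))
y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.1 * X[:, 2] + 0.05 * rng.standard_normal(200)

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Kernel ridge regression: solve (K + lam*I) alpha = y.
K = rbf_kernel(X, X)
lam = 1e-3
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_new):
    return rbf_kernel(X_new, X) @ alpha

# Predict on a regular grid to produce a map (third covariate held fixed).
gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
grid = np.column_stack([gx.ravel(), gy.ravel(), np.full(gx.size, 5.0)])
z = predict(grid).reshape(gx.shape)
```

The gridded prediction `z` plays the role of the environmental map discussed in the abstract; in practice the kernel width and regularization would be tuned by cross-validation.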

Relevance:

20.00%

Publisher:

Abstract:

Every year, debris flows cause huge damage in mountainous areas. Due to population pressure in hazardous zones, the socio-economic impact is much higher than in the past. Therefore, the development of indicative susceptibility hazard maps is of primary importance, particularly in developing countries. However, the complexity of the phenomenon and the variability of local controlling factors limit the use of process-based models for a first assessment. A debris flow model has been developed for regional susceptibility assessments using a digital elevation model (DEM) with a GIS-based approach. The automatic identification of source areas and the estimation of debris flow spreading, based on GIS tools, provide a substantial basis for a preliminary susceptibility assessment at a regional scale. One of the main advantages of this model is its workability: in fact, everything is open to the user, from the choice of data to the selection of the algorithms and their parameters. The Flow-R model was tested in three different contexts, two in Switzerland and one in Pakistan, for indicative susceptibility hazard mapping. It was shown that the quality of the DEM is the most important parameter for obtaining reliable propagation results, but also for identifying potential debris flow sources.

Relevance:

20.00%

Publisher:

Abstract:

Dendritic cells (DCs) are central players in immunity, bridging the innate and adaptive arms of the immune system (IS). Interferons (IFNs) are among the most important factors regulating both innate and adaptive immunity. Thus, understanding how type II and type I IFNs modulate the immune-regulatory properties of DCs is a central issue in immunology. In this paper, we address this point in the light of the most recent literature, also highlighting the controversial data reported in the field. According to the wide literature available, type II as well as type I IFNs appear, at the same time, to collaborate, to induce additive effects or overlapping functions, and to counter-regulate each other's effects on DC biology and, in general, on the immune response. Knowledge of these effects has important therapeutic implications for the treatment of infectious/autoimmune diseases and cancer, and indicates strategies for using IFNs as vaccine adjuvants and in DC-based immunotherapeutic approaches.

Relevance:

20.00%

Publisher:

Abstract:

We investigated the role of the number of loci coding for a neutral trait in the release of additive variance for this trait after population bottlenecks. Different bottleneck sizes and durations were tested for various matrices of genotypic values, with initial conditions covering the allele frequency space. We used three different types of matrices. First, we extended Cheverud and Routman's model by defining matrices of "pure" epistasis for three and four independent loci; second, we used genotypic values drawn randomly from uniform, normal, and exponential distributions; and third, we used two models of simple metabolic pathways leading to physiological epistasis. For all these matrices of genotypic values except the dominant metabolic pathway, we find that the release of additive variance increases as the number of loci increases from two to three and four. The amount of additive variance released for a given set of genotypic values is a function of the inbreeding coefficient, independently of the size and duration of the bottleneck. The level of inbreeding necessary to achieve the maximum release of additive variance increases with the number of loci. We find that additive-by-additive epistasis is the type of epistasis most easily converted into additive variance. For a wide range of models, our results show that epistasis, rather than dominance, plays a significant role in the increase of additive variance following bottlenecks.
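The conversion of epistatic into additive variance can be reproduced numerically. The sketch below computes, for a two-locus pure additive-by-additive epistasis matrix in the spirit of Cheverud and Routman's model, the additive variance under Hardy-Weinberg and linkage equilibrium at given allele frequencies: at intermediate frequencies the additive variance is zero, while frequencies perturbed away from 0.5 (as drift during a bottleneck does) release additive variance. The specific frequencies used are illustrative.

```python
import numpy as np

# Pure additive-by-additive epistasis for two biallelic loci
# (rows: AA, Aa, aa; columns: BB, Bb, bb).
G = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 0.0,  0.0],
              [-1.0, 0.0,  1.0]])

def additive_variance(G, p, q):
    """Additive variance of genotypic values G under Hardy-Weinberg and
    linkage equilibrium, with allele frequencies p (locus A) and q
    (locus B), via weighted least-squares regression on allele counts."""
    fA = np.array([p**2, 2*p*(1-p), (1-p)**2])   # genotype freqs, locus A
    fB = np.array([q**2, 2*q*(1-q), (1-q)**2])   # genotype freqs, locus B
    w = np.outer(fA, fB).ravel()                 # 9 two-locus genotype freqs
    g = G.ravel()
    counts = np.array([2, 1, 0])
    X = np.column_stack([np.ones(9),
                         np.repeat(counts, 3),   # copies of allele A
                         np.tile(counts, 3)])    # copies of allele B
    # Weighted least squares: minimize sum_i w_i (g_i - X_i beta)^2.
    sw = np.sqrt(w)
    beta = np.linalg.lstsq(sw[:, None] * X, sw * g, rcond=None)[0]
    fitted = X @ beta                            # breeding values
    mean = w @ g
    return w @ (fitted - mean) ** 2              # variance of breeding values

# At intermediate frequencies all the variance is epistatic...
va_mid = additive_variance(G, 0.5, 0.5)
# ...but drifted frequencies (as after a bottleneck) release additive variance.
va_drift = additive_variance(G, 0.8, 0.3)
```

For these frequencies the released additive variance works out analytically to (2q-1)^2 * 2p(1-p) + (2p-1)^2 * 2q(1-q) = 0.2024, which the regression reproduces.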

Relevance:

20.00%

Publisher:

Abstract:

Patients with glioblastoma (GBM) have variable clinical courses, but the factors underlying this heterogeneity are not understood. To determine whether the presence of the telomerase-independent alternative lengthening of telomeres (ALT) mechanism is a significant prognostic factor for survival, we performed a retrospective analysis of 573 GBM patients. The presence of ALT was identified in paraffin sections using a combination of immunofluorescence for promyelocytic leukemia bodies and telomere fluorescence in situ hybridization. Alternative lengthening of telomeres was present in 15% of the GBM patients. Patients with ALT had longer survival that was independent of age, surgery, and other treatments. Mutations in isocitrate dehydrogenase 1 (IDH1mut) frequently accompanied ALT, and in the presence of both molecular events there was significantly longer overall survival. These data suggest that most ALT+ tumors may be less aggressive proneural GBMs, and that the better prognosis may relate to the set of genetic changes associated with this tumor subtype. Despite improved overall survival of patients treated with the addition of chemotherapy to radiotherapy and surgery, ALT and chemotherapy independently provided a survival advantage, but these factors were not found to be additive. These results suggest a critical need for developing new therapies to target these specific GBM subtypes.
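Survival comparisons of the kind reported here typically rest on Kaplan-Meier estimates (with Cox regression for covariate adjustment). The sketch below implements the Kaplan-Meier estimator from scratch on invented follow-up times for two small hypothetical groups; the numbers bear no relation to the study's actual data.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. times: follow-up in months;
    events: 1 = death observed, 0 = censored. Returns a list of
    (time, survival probability) pairs at each death time."""
    order = np.argsort(times)
    t, e = np.asarray(times)[order], np.asarray(events)[order]
    curve, s = [], 1.0
    for u in np.unique(t[e == 1]):
        at_risk = np.sum(t >= u)
        deaths = np.sum((t == u) & (e == 1))
        s *= 1.0 - deaths / at_risk
        curve.append((u, s))
    return curve

# Hypothetical follow-up data for two small groups (months).
alt_pos = kaplan_meier([30, 42, 55, 60, 72], [1, 1, 0, 1, 0])
alt_neg = kaplan_meier([8, 12, 14, 20, 33], [1, 1, 1, 1, 1])
```

In a study like this one, the two curves would be compared with a log-rank test and the independence of ALT from age and treatment assessed in a multivariable model.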

Relevance:

20.00%

Publisher:

Abstract:

Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual interactions of each gene are believed to play a key role in the stability of the structure. With advances in biology, some effort has been made to develop update functions in Boolean models that incorporate recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. The results of this validation hint at an increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos through the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows regimes to be discriminated in a quantitative way. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model incorporates experimentally derived biological information and recent discoveries, which makes it potentially useful for guiding experimental research.
The update function confers additional realism on the model, while reducing the complexity and the solution space, thus making it easier to investigate.
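A minimal sketch of a threshold-based Boolean update of the kind described above, with a randomly generated signed interaction matrix standing in for a real regulatory network. The tie-breaking rule (keep the current state when promotion and repression balance) and all parameters are our illustrative choices, not necessarily those of the paper. The last function computes one point of a Derrida-style divergence measure: the average one-step spread of two trajectories started one bit-flip apart.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical signed interaction matrix for a small network:
# W[i, j] = +1 if gene j promotes gene i, -1 if it represses it, 0 otherwise.
n = 8
W = rng.choice([-1, 0, 0, 1], size=(n, n))

def threshold_update(state, W):
    """Synchronous threshold update: a gene switches on when its summed
    promoting inputs outweigh the repressing ones, switches off when
    they are outweighed, and keeps its state on a tie."""
    field = W @ state
    new = state.copy()
    new[field > 0] = 1
    new[field < 0] = 0
    return new

def trajectory(state, W, steps=20):
    states = [state.copy()]
    for _ in range(steps):
        states.append(threshold_update(states[-1], W))
    return states

traj = trajectory(rng.integers(0, 2, size=n), W)

def derrida_point(W, trials=200):
    """Average one-step Hamming divergence of state pairs one flip apart."""
    h = 0
    for _ in range(trials):
        s = rng.integers(0, 2, size=n)
        s2 = s.copy()
        s2[rng.integers(n)] ^= 1                 # flip one gene
        h += np.sum(threshold_update(s, W) != threshold_update(s2, W))
    return h / trials
```

A Derrida plot would repeat this for initial perturbations of every size; divergence below the identity line signals the ordered regime, above it the chaotic one.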

Relevance:

20.00%

Publisher:

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Amongst the possible datasets, the DEM is the only one that is really needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained on the basis of lower-quality DEMs with 25 m resolution.
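For readers unfamiliar with Holmgren's multiple-flow-direction algorithm, the sketch below shows its basic weighting: the flow fraction passed to each downslope neighbor is proportional to tan(beta)^x. The dh parameter reflects our reading of the kind of modification described above (raising the central cell before computing gradients so that small DEM variations matter less); the elevations, exponent and parameter names are all invented for illustration.

```python
import numpy as np

def holmgren_weights(z_center, z_neighbors, distances, x=4.0, dh=0.0):
    """Fraction of flow passed to each downslope neighbor under
    Holmgren's weighting: proportional to tan(beta)^x, with upslope
    neighbors receiving nothing. dh raises the central cell before the
    gradient is computed (illustrative smoothing parameter)."""
    tan_beta = (z_center + dh - np.asarray(z_neighbors)) / np.asarray(distances)
    w = np.where(tan_beta > 0, tan_beta, 0.0) ** x
    total = w.sum()
    return w / total if total > 0 else w

# Eight neighbors of a 10 m cell (diagonals are sqrt(2) further away).
d = 10.0 * np.array([1, np.sqrt(2), 1, np.sqrt(2), 1, np.sqrt(2), 1, np.sqrt(2)])
z_nb = np.array([98.0, 97.5, 99.0, 100.5, 101.0, 100.0, 99.5, 98.5])
w = holmgren_weights(100.0, z_nb, d, x=4.0)
```

Large exponents x concentrate the flow in the steepest direction (approaching a single-flow-direction model), while x near 1 spreads it widely; this is the knob that controls channelization.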

Relevance:

20.00%

Publisher:

Abstract:

An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, and provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of the related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions through a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of the application of these methods to ecological modeling.
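As a concrete reminder of what a GLM fit involves, the sketch below fits a binomial GLM (logistic regression) for species presence by iteratively reweighted least squares, the standard GLM fitting algorithm, on synthetic data; a GAM would replace the linear terms with smooth functions of the covariates. All data and coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical species data: presence/absence at 300 sites as a logistic
# function of temperature and precipitation (standardized covariates).
n = 300
temp, precip = rng.standard_normal(n), rng.standard_normal(n)
eta_true = -0.5 + 1.5 * temp - 1.0 * precip
y = rng.random(n) < 1 / (1 + np.exp(-eta_true))

# Binomial GLM with logit link, fitted by iteratively reweighted
# least squares (IRLS).
X = np.column_stack([np.ones(n), temp, precip])
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))        # fitted probabilities
    w = mu * (1 - mu)                        # working weights
    z = X @ beta + (y - mu) / w              # working response
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
```

The recovered coefficients approximate the generating values; predictor selection and diagnostics of the kind reviewed in these papers would then be built on such a fit.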

Relevance:

20.00%

Publisher:

Abstract:

Animals and plants are associated with symbiotic microbes whose roles range from mutualism to commensalism to parasitism. These roles may not only be taxon-specific but also dependent on environmental conditions and host factors. To experimentally test these possibilities, we drew a random sample of adult whitefish from a natural population, bred them in vitro in a full-factorial design in order to separate additive genetic from maternal environmental effects on offspring, and tested the performance of the resulting embryos under different environmental conditions. Enhancing the growth of symbiotic microbes with supplemental nutrients released cryptic additive genetic variance for viability in the fish host. These effects vanished with the concurrent addition of the water mould Saprolegnia ferax. Our findings demonstrate that the heritability of host fitness is environment-specific and critically depends on the interaction between symbiotic microbes.

Relevance:

20.00%

Publisher:

Abstract:

Few subjects have caught the attention of the entire world as much as those dealing with natural hazards. The first decade of this new millennium provides a litany of tragic examples of various hazards that turned into disasters affecting millions of individuals around the globe. The human losses (some 225,000 people) associated with the 2004 Indian Ocean earthquake and tsunami, the economic costs (approximately 200 billion USD) of the 2011 Tohoku Japan earthquake, tsunami and reactor event, and the collective social impacts of the human tragedies experienced during Hurricane Katrina in 2005 all provide repeated reminders that we humans are temporary guests occupying a very active and angry planet. Many examples could be cited here to stress the point that natural events on Earth may, and often do, lead to disasters and catastrophes when humans place themselves in situations of high risk. Few subjects share the true interdisciplinary dependency that characterizes the field of natural hazards. From geology and geophysics to engineering and emergency response to social psychology and economics, the study of natural hazards draws input from an impressive suite of unique and previously independent specializations. Natural hazards provide a common platform for reducing disciplinary boundaries and facilitating a beneficial synergy in the provision of timely and useful information and action on this critical subject matter. As social norms change regarding the concept of acceptable risk, and as human migration leads to an explosion in the number of megacities, coastal over-crowding and unmanaged habitation in precarious environments such as mountainous slopes, the vulnerability of people and their susceptibility to natural hazards increase dramatically.
Coupled with concerns about changing climates, escalating recovery costs, and a growing divergence between more developed and less developed countries, the subject of natural hazards remains at the forefront of issues that affect all people, nations, and environments all the time. This treatise provides a compendium of critical, timely and very detailed information and essential facts regarding the basic attributes of natural hazards and concomitant disasters. The Encyclopedia of Natural Hazards effectively captures and integrates contributions from an international portfolio of almost 300 specialists whose range of expertise addresses over 330 topics pertinent to the field of natural hazards. Disciplinary barriers are overcome in this comprehensive treatment of the subject matter. Clear illustrations and numerous color images enhance the primary aim to communicate and educate. The inclusion of a series of unique "classic case study" events interspersed throughout the volume provides tangible examples linking concepts, issues, outcomes and solutions. These case studies illustrate different but notable recent, historic and prehistoric events that have shaped the world as we now know it. They provide excellent focal points linking the remaining terms in the volume to the primary field of study. This Encyclopedia of Natural Hazards will remain a standard reference of choice for many years.

Relevance:

20.00%

Publisher:

Abstract:

The increasing availability and precision of digital elevation models (DEMs) help in the assessment of landslide-prone areas where only few data are available. The approach is performed in six main steps: DEM creation; identification of geomorphologic features; determination of the main sets of discontinuities; mapping of the most likely dangerous structures; preliminary rock-fall assessment; and estimation of the volumes of large instabilities. The method is applied to two case studies on the Oppstadhornet mountain (730 m alt.): (1) a 10 million m3 slow-moving rockslide and (2) an area prone to potential high-energy rock falls. The orientations of the foliation and of the major discontinuities were determined directly from the DEM. These results are in very good agreement with field measurements. The spatial arrangement of discontinuities and foliation with respect to the topography revealed hazardous structures. Maps of the potential occurrence of these hazardous structures show highly probable sliding areas at the foot of the main landslide and potential rock falls in the eastern part of the mountain.
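The orientation-extraction step can be illustrated with a toy DEM: slope (dip) and aspect (dip direction) follow directly from the elevation gradient. The planar synthetic surface and grid parameters below are invented, and the code assumes the grid's y axis points north; real analyses would apply the same arithmetic to the actual DEM and then compare the extracted orientations with field measurements.

```python
import numpy as np

# Hypothetical 40x40 DEM (m) on a 10 m grid: a planar slope dipping
# toward the east, standing in for a foliation-parallel rock face.
cell = 10.0
yy, xx = np.mgrid[0:40, 0:40]
dem = 500.0 - 0.45 * cell * xx   # elevation drops eastward

# Dip and dip direction from the DEM gradient (y axis assumed north).
dzdy, dzdx = np.gradient(dem, cell)
dip = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
# Aspect: compass bearing of steepest descent (0 = north, 90 = east).
aspect = np.degrees(np.arctan2(-dzdx, -dzdy)) % 360.0
```

For this eastward-dipping plane every cell reports a dip of about 24 degrees toward 090, which is the kind of orientation that would be compared against discontinuity sets measured in the field.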