72 results for Eclipse Modeling Framework (EMF)
Abstract:
PECUBE is a three-dimensional thermal-kinematic code capable of solving the heat production-diffusion-advection equation under a temporally varying surface boundary condition. It was initially developed to assess the effects of time-varying surface topography (relief) on low-temperature thermochronological datasets. Thermochronometric ages are predicted by tracking the time-temperature histories of rock particles ending up at the surface and by combining these with various age-prediction models. In the decade since its inception, the PECUBE code has been under continuous development as its use became wider and addressed different tectonic-geomorphic problems. This paper describes several major recent improvements in the code, including its integration with an inverse-modeling package based on the Neighborhood Algorithm, the incorporation of fault-controlled kinematics, several different ways to address topographic and drainage change through time, the ability to predict subsurface (tunnel or borehole) data, the prediction of detrital thermochronology data and a method to compare these with observations, and the coupling with landscape-evolution (or surface-process) models. Each new development is described together with one or several applications, so that the reader and potential user can clearly assess and make use of the capabilities of PECUBE. We end by describing some developments that are currently underway or should take place in the foreseeable future. (C) 2012 Elsevier B.V. All rights reserved.
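As a concrete illustration of the kind of equation PECUBE solves, here is a minimal one-dimensional explicit finite-difference sketch of heat production-diffusion-advection with fixed-temperature boundaries. All names, grid values, and the discretization are illustrative assumptions, not PECUBE's actual (three-dimensional, transient-topography) scheme.

```python
import numpy as np

def advect_diffuse_1d(T, dz, dt, kappa, vz, H, nsteps):
    """Advance dT/dt = kappa * d2T/dz2 - vz * dT/dz + H explicitly.

    T     : temperatures on a 1-D depth grid; T[0] (surface) and T[-1]
            (base) act as fixed-temperature boundary conditions
    kappa : thermal diffusivity;  vz : vertical advection (exhumation)
            velocity;  H : heat-production term (consistent units assumed)
    """
    T = T.copy()
    for _ in range(nsteps):
        d2T = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2   # diffusion
        dT = (T[2:] - T[:-2]) / (2.0 * dz)               # advection
        T[1:-1] += dt * (kappa * d2T - vz * dT + H)
    return T

# a linear conductive profile between 0 and 800 degrees is a steady
# state of the pure-diffusion problem, so it should remain unchanged
T0 = np.linspace(0.0, 800.0, 41)
T1 = advect_diffuse_1d(T0, dz=1.0, dt=0.2, kappa=1.0, vz=0.0, H=0.0, nsteps=100)
```

Tracking a rock particle's time-temperature path through such a field, and feeding it to an age-prediction model, is the basic idea behind the predicted thermochronometric ages mentioned above.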
Abstract:
1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate with a random number drawn from the normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. 
To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
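The error treatment described in point 2 amounts to jittering each occurrence coordinate with zero-mean Gaussian noise of standard deviation 5 km. A minimal sketch, assuming coordinates on a projected kilometre grid (function and variable names are hypothetical; the study's own degradation procedure may differ in detail):

```python
import numpy as np

def degrade_locations(xy_km, sd_km=5.0, seed=0):
    """Simulate locational error: add to every coordinate a random
    offset drawn from a normal distribution with mean 0 and the given
    standard deviation (5 km in the error treatment described above)."""
    rng = np.random.default_rng(seed)
    return xy_km + rng.normal(0.0, sd_km, size=xy_km.shape)

# three hypothetical occurrence records in projected kilometres
coords = np.array([[100.0, 250.0], [103.5, 248.2], [98.7, 255.0]])
noisy = degrade_locations(coords)
shift_km = np.linalg.norm(noisy - coords, axis=1)  # per-record displacement
```

Models calibrated on `noisy` versus `coords` can then be compared to quantify how robust each distribution-modelling technique is to locational error.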
Abstract:
The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model for the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
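A minimal sketch of two of the objects described above, the asynchronous state transition graph and its terminal strongly connected components ("attractors"), on a hypothetical two-gene toggle switch. This is illustrative Python, not GINsim's implementation, and the naive reachability-based attractor search below does not scale the way the compression methods in the paper do.

```python
from itertools import product

def async_stg(rules):
    """Asynchronous state transition graph of a Boolean network.
    rules: one update function per component, each mapping a full
    state tuple to 0 or 1. Under the asynchronous updating assumption,
    each successor differs from its predecessor in exactly one component."""
    n = len(rules)
    succ = {}
    for state in product((0, 1), repeat=n):
        succ[state] = []
        for i, f in enumerate(rules):
            v = f(state)
            if v != state[i]:
                s = list(state)
                s[i] = v
                succ[state].append(tuple(s))
    return succ

def reach(succ, v):
    """All states reachable from v (including v itself)."""
    seen, stack = {v}, [v]
    while stack:
        for w in succ[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen

def attractors(succ):
    """Terminal SCCs: sets of states from which every reachable state
    can reach back (naive quadratic search, fine for toy networks)."""
    reach_of = {v: reach(succ, v) for v in succ}
    in_attr = {v for v in succ if all(v in reach_of[w] for w in reach_of[v])}
    comps, left = [], set(in_attr)
    while left:
        v = left.pop()
        comp = frozenset(w for w in reach_of[v] if v in reach_of[w])
        comps.append(comp)
        left -= comp
    return comps

# hypothetical toggle switch: two mutually inhibiting genes
rules = [lambda s: 1 - s[1], lambda s: 1 - s[0]]
attrs = attractors(async_stg(rules))   # two stable states expected
```

On this toggle switch the two fixed points (1,0) and (0,1) come out as singleton attractors, which is the classic bistability of mutual inhibition.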
Abstract:
Cell elongation during seedling development is antagonistically regulated by light and gibberellins (GAs). Light induces photomorphogenesis, leading to inhibition of hypocotyl growth, whereas GAs promote etiolated growth, characterized by increased hypocotyl elongation. The mechanism underlying this antagonistic interaction remains unclear. Here we report on the central role of the Arabidopsis thaliana nuclear transcription factor PIF4 (encoded by PHYTOCHROME INTERACTING FACTOR 4) in the positive control of genes mediating cell elongation and show that this factor is negatively regulated by the light photoreceptor phyB (ref. 4) and by DELLA proteins that have a key repressor function in GA signalling. Our results demonstrate that PIF4 is destabilized by phyB in the light and that DELLAs block PIF4 transcriptional activity by binding the DNA-recognition domain of this factor. We show that GAs abrogate such repression by promoting DELLA destabilization, and therefore cause a concomitant accumulation of free PIF4 in the nucleus. Consistent with this model, intermediate hypocotyl lengths were observed in transgenic plants over-accumulating both DELLAs and PIF4. Destabilization of this factor by phyB, together with its inactivation by DELLAs, constitutes a protein interaction framework that explains how plants integrate both light and GA signals to optimize growth and development in response to changing environments.
Multimodel inference and multimodel averaging in empirical modeling of occupational exposure levels.
Abstract:
Empirical modeling of exposure levels has been popular for identifying exposure determinants in occupational hygiene. Traditional data-driven methods used to choose a model on which to base inferences have typically not accounted for the uncertainty linked to the process of selecting the final model. Several new approaches propose making statistical inferences from a set of plausible models rather than from a single model regarded as 'best'. This paper introduces the multimodel averaging approach described in the monograph by Burnham and Anderson. In their approach, a set of plausible models is defined a priori by taking into account the sample size and previous knowledge of variables that influence exposure levels. The Akaike information criterion is then calculated to evaluate the relative support of the data for each model, expressed as an Akaike weight, interpreted as the probability of the model being the best approximating model given the model set. The model weights can then be used to rank models, quantify the evidence favoring one over another, perform multimodel prediction, estimate the relative influence of the potential predictors and estimate multimodel-averaged effects of determinants. The whole approach is illustrated with the analysis of a data set of 1500 volatile organic compound exposure levels collected by the Institute for Work and Health (Lausanne, Switzerland) over 20 years, each concentration having been divided by the relevant Swiss occupational exposure limit and log-transformed before analysis. Multimodel inference represents a promising procedure for modeling exposure levels: it incorporates the notion that several models can be supported by the data and makes it possible to evaluate, to a certain extent, the model selection uncertainty that is seldom mentioned in current practice.
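The core computation, turning AIC scores into Akaike weights and using them for multimodel averaging, can be sketched as follows. The AIC scores and effect estimates are invented for illustration; the study's actual models and values differ.

```python
import math

def akaike_weights(aic):
    """Akaike weights: w_i proportional to exp(-delta_i / 2), where
    delta_i = AIC_i - min(AIC). Each weight is interpreted as the
    probability that model i is the best approximating model in the set."""
    amin = min(aic)
    rel = [math.exp(-(a - amin) / 2.0) for a in aic]
    total = sum(rel)
    return [r / total for r in rel]

# hypothetical AIC scores for three candidate exposure models
w = akaike_weights([210.3, 212.3, 218.9])

# multimodel-averaged effect of one determinant (hypothetical betas,
# e.g. effects on log-transformed standardized exposure levels)
beta_avg = sum(wi * b for wi, b in zip(w, [0.30, 0.28, 0.41]))
```

A delta of 2 between the first two models gives a weight ratio of exp(1), roughly 2.7: modest evidence for the first model rather than a clear winner, which is exactly the situation multimodel averaging is designed to handle.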
Abstract:
Aim: The relative effectiveness of different methods of prevention of HIV transmission is a subject of debate that is renewed with the integration of each new method. The relative weight of values and evidence in decision-making is not always clearly defined. Debate is often confused, as the proponents of different approaches address the issue at different levels of implementation. This paper defines and delineates the successive levels of analysis of effectiveness, and proposes a conceptual framework to clarify debate. Method / Issue: Initially inspired by work on contraceptive effectiveness, a first version of the conceptual framework was published in 1993 with the definition of the Condom Effectiveness Matrix (Spencer, 1993). The framework has since integrated and further developed thinking around distinctions made between efficacy and effectiveness and has been applied to HIV prevention in general. Three levels are defined: theoretical effectiveness (ThE), use-effectiveness (UseE) and population use-effectiveness (PopUseE). For example, abstinence and faithfulness, as proposed in the ABC strategy, have relatively high theoretical effectiveness but relatively low effectiveness at subsequent levels of implementation. The reverse is true of circumcision. Each level is associated with specific forms of scientific enquiry and associated research questions: basic and clinical sciences with ThE; clinical and social sciences with UseE; epidemiology and social, economic and political sciences with PopUseE. Similarly, the focus of investigation moves from biological organisms, to the individual at the physiological and then psychological, social and ecological level, and finally takes as perspective populations and societies as a whole. The framework may be applied to analyse issues concerning any approach.
Hence, regarding consideration of HIV treatment as a means of prevention, examples of issues at each level would be: ThE: achieving adequate viral suppression and non-transmission to partners; UseE: facility and degree of adherence to treatment and medical follow-up; PopUseE: perceived validity of strategy, feasibility of achieving adequate population coverage. Discussion: Use of the framework clarifies the questions that need to be addressed at all levels in order to improve effectiveness. Furthermore, the interconnectedness and complementary nature of research from the different scientific disciplines and the relative contribution of each become apparent. The proposed framework could bring greater rationality to the prevention effectiveness debate and facilitate communication between stakeholders.
Abstract:
The capacity to learn to associate sensory perceptions with appropriate motor actions underlies the success of many animal species, from insects to humans. The evolutionary significance of learning has long been a subject of interest for evolutionary biologists who emphasize the benefit yielded by learning under changing environmental conditions, where it is required to flexibly switch from one behavior to another. However, two unsolved questions are particularly important for improving our knowledge of the evolutionary advantages provided by learning, and are addressed in the present work. First, because it is possible to learn the wrong behavior when a task is too complex, the learning rules and their underlying psychological characteristics that generate truly adaptive behavior must be identified with greater precision, and must be linked to the specific ecological problems faced by each species. A framework for predicting behavior from the definition of a learning rule is developed here. Learning rules capture cognitive features such as the tendency to explore, or the ability to infer rewards associated with unchosen actions. It is shown that these features interact in a non-intuitive way to generate adaptive behavior in social interactions where individuals affect each other's fitness. Such behavioral predictions are used in an evolutionary model to demonstrate that, surprisingly, simple trial-and-error learning is not always outcompeted by more computationally demanding inference-based learning, when population members interact in pairwise social interactions. A second question in the evolution of learning is its link with and relative advantage compared to other simpler forms of phenotypic plasticity. After providing a conceptual clarification on the distinction between genetically determined vs. learned responses to environmental stimuli, a new factor in the evolution of learning is proposed: environmental complexity.
A simple mathematical model shows that a measure of environmental complexity, the number of possible stimuli in one's environment, is critical for the evolution of learning. In conclusion, this work opens roads for modeling interactions between evolving species and their environment in order to predict how natural selection shapes animals' cognitive abilities.
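The contrast drawn above between simple trial-and-error learning and more demanding inference-based learning can be made concrete with a minimal action-value learner: only the chosen action's value estimate is updated from its realized payoff, and a softmax temperature captures the tendency to explore. The payoffs, parameters, and function names are illustrative assumptions, not the thesis's actual model.

```python
import math
import random

def softmax_choice(values, temperature, rng):
    """Choose an action with probability increasing in its estimated
    value; the temperature controls the tendency to explore."""
    mx = max(values)
    weights = [math.exp((v - mx) / temperature) for v in values]
    r = rng.random() * sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(values) - 1

def trial_and_error(payoff, steps, lr=0.1, temperature=0.2, seed=1):
    """Minimal trial-and-error learner: only the chosen action's value
    is updated from its realized payoff; nothing is inferred about the
    payoffs of unchosen actions."""
    rng = random.Random(seed)
    values = [0.0] * len(payoff)
    for _ in range(steps):
        a = softmax_choice(values, temperature, rng)
        values[a] += lr * (payoff[a] - values[a])
    return values

# hypothetical stationary task in which action 1 pays best
v = trial_and_error(payoff=[0.2, 1.0, 0.5], steps=500)
```

An inference-based learner would in addition update its estimates for the unchosen actions; the evolutionary models summarized above compare such rules when payoffs arise from social interactions rather than from a fixed environment.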
Abstract:
Despite the tremendous amount of data collected in the field of ambulatory care, political authorities still lack synthetic indicators to provide them with a global view of health services utilization and costs related to various types of diseases. Moreover, public health indicators fail to provide useful information for physicians' accountability purposes. Our approach is based on the Swiss context, which is characterized by the greatest frequency of medical visits in Europe, the highest rate of growth for care expenditure, and poor public information, but a wealth of structured data (a new fee system was introduced in 2004). The proposed conceptual framework is universal and based on descriptors of six entities: general population, people with poor health, patients, services, resources and effects. We show that most conceptual shortcomings can be overcome and that the proposed indicators can be achieved without threatening privacy protection, using modern cryptographic techniques. Twelve indicators are suggested for the surveillance of the ambulatory care system, almost all based on routinely available data: morbidity, accessibility, relevancy, adequacy, productivity, efficacy (from the points of view of the population, people with poor health, and patients), effectiveness, efficiency, health services coverage and financing. The additional costs of this surveillance system should not exceed Euro 2 million per year (Euro 0.3 per capita).
Abstract:
A factor limiting preliminary rockfall hazard mapping at the regional scale is often the lack of knowledge of potential source areas. Nowadays, high-resolution topographic data (LiDAR) can account for realistic landscape details even at large scale. With such fine-scale morphological variability, quantitative geomorphometric analyses become a relevant approach for delineating potential rockfall instabilities. Using the digital elevation model (DEM)-based "slope families" concept over areas of similar lithology, together with the cliff and scree zones available from the 1:25,000 topographic map, a rockfall susceptibility map was drawn up for the canton of Vaud, Switzerland, in order to provide a relevant hazard overview. Slope surfaces over morphometrically defined threshold angles were considered rockfall source zones. 3D modelling (CONEFALL) was then applied to each of the estimated source zones in order to assess the maximum runout length. Comparisons with known events and other rockfall hazard assessments show good agreement, demonstrating that it is possible to assess rockfall activity over large areas from DEM-based parameters and topographical elements.
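The two DEM-based steps described above, thresholding slope angle to delineate source zones and then estimating maximum runout with a cone (energy-line) method in the spirit of CONEFALL, can be sketched as follows. The toy DEM, the 45° threshold, and the 33° energy-line angle are illustrative assumptions, not the study's calibrated values.

```python
import numpy as np

def slope_deg(dem, cell):
    """Steepest-slope angle (degrees) of a DEM from finite differences."""
    dzdy, dzdx = np.gradient(dem, cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

def cone_runout(dem, cell, sources, phi_deg):
    """Cone-method runout (the idea behind CONEFALL): a cell lies in the
    runout zone of a source if the straight line from the source to the
    cell dips more steeply than the energy-line angle phi."""
    ny, nx = dem.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    out = np.zeros(dem.shape, dtype=bool)
    tanphi = np.tan(np.radians(phi_deg))
    for sy, sx in zip(*np.nonzero(sources)):
        dist = np.hypot(yy - sy, xx - sx) * cell
        drop = dem[sy, sx] - dem
        with np.errstate(divide="ignore", invalid="ignore"):
            out |= (dist > 0) & (drop / dist >= tanphi)
    return out

# toy inclined DEM: a steep face above a gentle footslope (hypothetical values)
cell = 10.0
profile = np.concatenate([np.linspace(500, 200, 6), np.linspace(195, 180, 10)])
dem = np.tile(profile, (5, 1))
slopes = slope_deg(dem, cell)
sources = slopes > 45.0              # morphometric threshold angle
runout = cone_runout(dem, cell, sources, phi_deg=33.0)
```

In practice the threshold angle would come from the "slope families" analysis per lithological unit, and the cliff and scree zones of the 1:25,000 map would screen the source cells before the cone step.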