890 results for phylogeography, consensus approach, ensemble modeling, Pleistocene, ENM, ecological niche modeling
Abstract:
The main objective of this project is to develop new multi-domain models of electric machines for control and fault-diagnosis applications. The electromagnetic model of the induction motor (IM) will be built using the magnetic equivalent circuit (MEC) approach and validated by simulation and by experimental results. As a second step, new mechanical and thermal models of the IM will be developed, with the objective of coupling them to the electromagnetic model; this multi-domain model will make it possible to study the interactions between the domains, and the complete model will then be validated by simulation and experimental results. Finally, the multi-domain model will be used as a tool for testing new control and fault-diagnosis strategies.
Abstract:
Economies are open complex adaptive systems far from thermodynamic equilibrium, and neo-classical environmental economics seems not to be the best way to describe the behaviour of such systems. Standard econometric analysis (i.e. time series) takes a deterministic and predictive approach, which encourages the search for predictive policy to ‘correct’ environmental problems. Rather, it seems that, because of the characteristics of economic systems, an ex-post analysis is more appropriate, which describes the emergence of such systems’ properties, and which sees policy as a social steering mechanism. With this background, some of the recent empirical work published in the field of ecological economics that follows the approach defended here is presented. Finally, the conclusion is reached that a predictive use of econometrics (i.e. time series analysis) in ecological economics should be limited to cases in which uncertainty decreases, which is not the normal situation when analysing the evolution of economic systems. However, that does not mean we should not use empirical analysis. On the contrary, this is to be encouraged, but from a structural and ex-post point of view.
Abstract:
The environmental input-output approach reveals the channels through which the environmental burdens of production activities are transmitted throughout the economy. This paper uses the input-output framework to analyse the changes in Spanish emission multipliers during the period 1995-2000. By decomposing the global changes in multipliers into different components, it is possible to evaluate separately the economic and ecological impacts captured by the environmental input-output model. Specifically, in this study we distinguish between the effects on multipliers caused by changes in emission coefficients (the ecological impacts) and the effects caused by changes in technical coefficients (the economic impacts). Our results show a significant improvement in the ecological impacts of production activities, which contributed negatively to changes in emission multipliers, and a deterioration in the economic impacts, which contributed positively. Together, these two effects led to a small reduction in global multipliers during the period of analysis. Our results also show significant differences in the behaviour of individual sectors in terms of their contribution to multiplier changes: because sectors differ considerably in how, and how strongly, they affect changes in emission levels, the final effect depends largely on the activity considered. Keywords: emission multipliers, multiplier changes, ecological impacts, economic impacts.
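The decomposition described above can be sketched numerically. In the usual environmental IO setup, emission multipliers are m = e(I − A)⁻¹, and the change in m between two years splits exactly into an emission-coefficient (ecological) term and a technical-coefficient (economic) term. The three-sector economy below is entirely invented, not the paper's Spanish data:

```python
import numpy as np

# Hypothetical 3-sector economy. A = technical coefficients,
# e = emission coefficients (emissions per unit of gross output).
A0 = np.array([[0.10, 0.20, 0.05],
               [0.15, 0.10, 0.10],
               [0.05, 0.10, 0.15]])
A1 = np.array([[0.12, 0.18, 0.06],
               [0.14, 0.12, 0.10],
               [0.05, 0.11, 0.14]])
e0 = np.array([0.30, 0.50, 0.20])   # base-year emission coefficients
e1 = np.array([0.25, 0.45, 0.18])   # end-year (cleaner technology)

I = np.eye(3)
L0 = np.linalg.inv(I - A0)          # Leontief inverse, base year
L1 = np.linalg.inv(I - A1)          # Leontief inverse, end year

m0 = e0 @ L0                        # emission multipliers, base year
m1 = e1 @ L1                        # emission multipliers, end year

# Exact additive decomposition: delta_m = (delta_e) L1 + e0 (delta_L)
ecological = (e1 - e0) @ L1         # effect of changing emission coefficients
economic   = e0 @ (L1 - L0)         # effect of changing technical coefficients
assert np.allclose(m1 - m0, ecological + economic)
```

This is one of several exact decompositions (the two terms can also be weighted symmetrically); with the cleaner end-year coefficients above, the ecological term is negative in every sector, as in the paper's finding.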
Abstract:
Identifying key sectors or key locations in an interconnected economy is of paramount importance for improving policy planning and directing economic strategy; hence the relevance of categorizing them, and of evaluating their potential synergies in terms of their global economic thrust. We explain in this paper that standard measures based on gross output do not and cannot capture the relevant impact, because of self-imposed modeling limitations; indeed, common gross-output measures will be systematically downward biased. We argue that an economy-wide computable general equilibrium (CGE) approach provides a modeling platform that overcomes these limitations, since it offers (i) a more comprehensive measure of linkages and (ii) an alternative way of accounting for the relevance of links that is consistent with the standard macromagnitudes of the National Income and Product Accounts.
Ab initio modeling and molecular dynamics simulation of the alpha 1b-adrenergic receptor activation.
Abstract:
This work describes the ab initio procedure employed to build an activation model for the alpha 1b-adrenergic receptor (alpha 1b-AR). The first version of the model was progressively modified and elaborated through a many-step iterative procedure, characterized by experimental validation of the model at each upgrading step. A combined simulation (molecular dynamics) and experimental mutagenesis approach was used to determine the structural and dynamic features characterizing the inactive and active states of alpha 1b-AR. The latest version of the model has been successfully challenged with respect to its ability to interpret and predict the functional properties of a large number of mutants. The iterative approach employed to describe alpha 1b-AR activation in terms of molecular structure and dynamics allows the model to be elaborated further, so as to predict and interpret an ever-increasing number of experimental data.
Abstract:
Among the largest resources for biological sequence data are the expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors, which must be taken into account when analyzing EST sequences computationally. Earlier attempts to model error-prone coding regions have shown good performance in detecting and predicting such regions while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error correction approach to more complex HMMs. We show that our method maintains the performance in detecting coding sequences.
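The decoding machinery behind such models can be illustrated with a toy two-state HMM and Viterbi decoding. This is a deliberately simplified stand-in for the far richer mRNA model described above: the states, alphabet, and probabilities below are invented for the sketch (the coding state is merely GC-biased), and there is no error-correction layer:

```python
import numpy as np

# Toy two-state HMM over nucleotides A, C, G, T (log-space Viterbi).
states = ["noncoding", "coding"]
log_start = np.log([0.5, 0.5])
log_trans = np.log([[0.9, 0.1],      # P(next state | current state)
                    [0.1, 0.9]])
log_emit = np.log([[0.25, 0.25, 0.25, 0.25],   # noncoding: uniform
                   [0.15, 0.35, 0.35, 0.15]])  # coding: GC-biased (invented)
idx = {c: i for i, c in enumerate("ACGT")}

def viterbi(seq):
    """Most probable state path for a nucleotide string."""
    obs = [idx[c] for c in seq]
    n, k = len(obs), len(states)
    dp = np.full((n, k), -np.inf)    # best log-probability ending in state j
    back = np.zeros((n, k), dtype=int)
    dp[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, n):
        for j in range(k):
            scores = dp[t - 1] + log_trans[:, j]
            back[t, j] = np.argmax(scores)
            dp[t, j] = scores[back[t, j]] + log_emit[j, obs[t]]
    path = [int(np.argmax(dp[-1]))]  # backtrack from the best final state
    for t in range(n - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [states[s] for s in reversed(path)]

labels = viterbi("ATATATGCGCGCGC")
```

The HMMs in the work above additionally encode reading-frame structure, start/stop-site submodels, and insertion/deletion error states; the Viterbi recursion itself is unchanged by those extensions.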
Abstract:
Pleistocene glacial and interglacial periods have moulded the evolutionary history of European cold-adapted organisms. The role of the different mountain massifs has, however, not been accurately investigated in the case of high-altitude insect species. Here, we focus on three closely related species of non-flying leaf beetles of the genus Oreina (Coleoptera, Chrysomelidae), which are often found in sympatry within the mountain ranges of Europe. After showing that the species concept as currently applied does not match barcoding results, we show, based on more than 700 sequences from one nuclear and three mitochondrial genes, the role of biogeography in shaping the phylogenetic hypothesis. Dating the phylogeny using an insect molecular clock, we show that the earliest lineages diverged more than 1 Mya and that the main shift in diversification rate occurred between 0.36 and 0.18 Mya. By using a probabilistic approach on the parsimony-based dispersal/vicariance framework (MP-DIVA) as well as a direct likelihood method of state change optimization, we show that the Alps acted as a cross-roads with multiple events of dispersal to and reinvasion from neighbouring mountains. However, the relative importance of vicariance vs. dispersal events on the process of rapid diversification remains difficult to evaluate because of a bias towards overestimation of vicariance in the DIVA algorithm. Parallels are drawn with recent studies of cold-adapted species, although our study reveals novel patterns in diversity and genetic links between European mountains, and highlights the importance of neglected regions, such as the Jura and the Balkanic range.
Abstract:
The application of multi-region environmental input-output (IO) analysis to the problem of accounting for emissions generation (and/or resource use) under different accounting principles has become increasingly common in the ecological and environmental economics literature in particular, with applications at the international and interregional subnational levels. However, while environmental IO analysis is invaluable in accounting for pollution flows in the single time period that the accounts relate to, it is limited when the focus is on modelling the impacts of any marginal change in activity. This is because a conventional demand-driven IO model assumes an entirely passive supply side in the economy (i.e. all supply is infinitely elastic) and is further restricted by the assumption of universal Leontief (fixed proportions) technology implied by the use of the A and multiplier matrices. Where analysis of marginal changes in activity is required, extension from an IO accounting framework to a more flexible interregional computable general equilibrium (CGE) approach, where behavioural relationships can be modelled in a more realistic and theory-consistent manner, is appropriate. Our argument is illustrated by comparing the results of introducing a positive demand stimulus in the UK economy using IO and CGE interregional models of Scotland and the rest of the UK. In the case of the latter, we demonstrate how more theory-consistent modelling of both demand-side and supply-side behaviour at the regional and national levels affects model results, including the impact on the interregional CO2 'trade balance'.
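The demand-driven IO mechanism being criticised above can be sketched in a few lines: with fixed technical coefficients A and fixed emission coefficients e, a demand stimulus Δf propagates as Δx = (I − A)⁻¹Δf, with no price or supply response at all. The two-sector numbers are invented for illustration:

```python
import numpy as np

# Illustrative demand-driven IO impact calculation (fixed-coefficient
# Leontief assumption; a CGE model would instead let prices, wages and
# supply adjust). All values are hypothetical.
A = np.array([[0.15, 0.25],
              [0.20, 0.10]])            # technical coefficients, 2 sectors
e = np.array([0.40, 0.25])             # emissions per unit of gross output
L = np.linalg.inv(np.eye(2) - A)       # Leontief inverse

delta_f = np.array([10.0, 0.0])        # demand stimulus to sector 1 only
delta_x = L @ delta_f                  # induced change in gross outputs
delta_emissions = e @ delta_x          # induced change in emissions
```

Note that `delta_x[0]` exceeds the direct stimulus of 10 because of the multiplier effect; in the CGE comparison described above, supply constraints and relative-price changes would generally dampen this response.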
Abstract:
The globalization of markets, changes in the economic context, and the impact of new information technologies have forced firms to rethink the way they manage their intellectual capital (knowledge management) and human capital (competence management). It is now widely accepted that these assets play a particularly strategic role in the organization. A firm wishing to adopt a policy for managing them faces several problems: a long capitalization process must be carried out, passing through stages such as the identification, extraction, and representation of knowledge and competences. Various knowledge- and competence-management methods exist for this purpose, such as MASK, CommonKADS, and KOD. Unfortunately, these methods are cumbersome to implement, are confined to certain types of knowledge, and are consequently limited in the functionality they can offer. Moreover, competence management and knowledge management are treated as two separate fields, whereas it would be worthwhile to unify the two approaches into one. Competences are in fact very close to knowledge, as the following definition of competence underlines: "a set of knowledge in action in a given context". We therefore chose to base our proposal on the concept of competence. Competence is among the most crucial of a firm's knowledge assets, in particular for avoiding the loss of know-how and for anticipating the firm's future needs, because behind employees' competences lies the effectiveness of the organization.
Furthermore, competence can be used to describe many other organizational concepts, such as jobs, missions, projects, and training. Unfortunately, there is no real consensus on the definition of competence, and the existing definitions, even when fully satisfactory to experts, do not support the construction of an operational system. In our approach, we address competence management by means of a knowledge-management method: by their very nature, knowledge and competence are intimately linked, so such a method is well suited to managing competences. To exploit this knowledge and these competences, we first had to define the organizational concepts in a clear and computable way. On this basis, we propose a methodology for building the firm's various repositories (competences, missions, jobs, and so on). To model these repositories we chose ontologies, because they provide coherent, consensual definitions of the concepts while supporting linguistic diversity. We then map the firm's knowledge (training, missions, jobs, and so on) onto these ontologies so that it can be exploited and disseminated. This approach to knowledge and competence management has led to a tool offering numerous functions, including the management of mobility areas, strategic analysis, directories, and CV management.
Abstract:
1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate with a random number drawn from the normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. 
To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
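The error treatment described in point 2 amounts to perturbing each occurrence coordinate with Gaussian noise of mean zero and a 5 km standard deviation. A minimal sketch, assuming coordinates on a projected grid in kilometres (the occurrence values below are invented; latitude/longitude data would first need conversion to projected units):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical occurrence records as (x, y) coordinates in km.
occurrences = np.array([[120.0, 340.0],
                        [125.5, 338.2],
                        [118.3, 352.9]])

# Degrade locations: add N(0, 5 km) noise independently to each coordinate,
# mimicking georeferencing uncertainty in shared occurrence data.
noise = rng.normal(loc=0.0, scale=5.0, size=occurrences.shape)
degraded = occurrences + noise

# Straight-line displacement of each record from its true location.
displacement = np.linalg.norm(degraded - occurrences, axis=1)
```

In the study's design, models calibrated on `occurrences` (the control treatment) are then compared with models calibrated on `degraded` (the error treatment) across modelling techniques.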
Abstract:
The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model for the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
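The core objects above — an asynchronous state transition graph and its attractors as terminal strongly connected components — can be illustrated on a toy two-gene mutual-inhibition switch. This is a textbook example chosen for the sketch, not the CD4+ T-cell model, and not GINsim's implementation:

```python
from itertools import product

# Toy Boolean network: two mutually inhibiting genes, x' = not y, y' = not x.
def targets(state):
    x, y = state
    return (int(not y), int(not x))

# Asynchronous updating: at most one component changes per transition;
# fixed points get a self-loop so every state has a successor.
def successors(state):
    tgt = targets(state)
    succ = [tuple(tgt[i] if i == j else state[i] for i in range(2))
            for j in range(2) if tgt[j] != state[j]]
    return succ or [state]

states = list(product([0, 1], repeat=2))
edges = {s: successors(s) for s in states}   # the state transition graph

def reachable(src):
    """All states reachable from src (including src itself)."""
    seen, stack = {src}, [src]
    while stack:
        for t in edges[stack.pop()]:
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

reach = {s: reachable(s) for s in states}

# SCC of s = states mutually reachable with s; an SCC is an attractor
# (terminal) when no transition leaves it.
attractors = set()
for s in states:
    scc = frozenset(t for t in reach[s] if s in reach[t])
    if all(t in scc for u in scc for t in edges[u]):
        attractors.add(scc)
```

The switch has two point attractors, (x=0, y=1) and (x=1, y=0) — the bistability that, scaled up, underlies the alternative Th subtypes in the model discussed above. The brute-force enumeration here is exactly what becomes infeasible for large networks, motivating the priority-class, reduction, and compression methods.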
Abstract:
The 2009 International Society of Urological Pathology Consensus Conference in Boston made recommendations regarding the standardization of pathology reporting of radical prostatectomy specimens. Issues relating to surgical margin assessment were coordinated by working group 5. Pathologists agreed that tumor extending close to the 'capsular' margin, yet not to it, should be reported as a negative margin, and that the locations of positive margins should be indicated as posterior, posterolateral, lateral, anterior at the prostatic apex, mid-prostate or base. Other items of consensus included specifying the extent of any positive margin in millimeters of involvement; considering tumor in skeletal muscle at the apical perpendicular margin section, in the absence of accompanying benign glands, to be organ confined; and referring to the proximal and distal margins uniformly as bladder neck and prostatic apex, respectively. Grading of tumor at positive margins was left to the discretion of the reporting pathologists. There was no consensus as to how the surgical margin should be regarded when tumor is present at the inked edge of the tissue in the absence of transected benign glands at the apical margin. Pathologists also did not reach agreement on the reporting approach to benign prostatic glands at an inked surgical margin where no carcinoma is present.
Abstract:
The central issue of this work is the relationship between environmental finiteness and individual liberty. By environmental finiteness one should understand the set of diverse ecological constraints that place limits on human action. These limits are of two general kinds: the availability of natural resources, on the one hand, and the carrying capacity of ecosystems and of the great global biogeochemical cycles, on the other (chapter 1). The thesis defended here is that libertarian and liberal conceptions of liberty conflict with the necessity of taking such limits into account, and that a neo-republican approach is better suited to addressing these ecological issues. Libertarian theories, right-wing as well as left-wing, are unable to accommodate the finiteness of natural resources because they maintain an unlimited right of individuals to appropriate those resources. This is in contradiction with the systemic nature of scarcity and with the absence of substitutes for certain resources indispensable to the pursuit of a decent life (chapters 2 and 3). The liberal doctrine of neutrality, as associated with the harm principle, is in turn unsuitable for addressing global environmental problems such as climate change: the causal mechanisms leading to environmental harm are indirect and diffuse, which prevents the assignment of responsibility at the individual level and thereby undermines the justification of coercive environmental policies (chapter 4). These difficulties stem above all from two characteristic features of these doctrines: their atomistic social ontology and their conception of freedom as liberty of choice.
Philip Pettit's neo-republicanism, by contrast, can address both problems thanks to its holist social ontology and its conception of liberty as non-domination. This theory thus offers a conception of liberty compatible with environmental finiteness, together with the theoretical resources to justify demanding environmental policies without an excessive sacrifice of liberty (chapter 5).
Abstract:
The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis regarding calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of applying IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of quality, sufficiently robust to proceed to retrospective analyses or interlaboratory comparisons.
Abstract:
Habitat destruction and fragmentation are known to strongly affect dispersal by altering the quality of the environment between populations. As a consequence, lower landscape connectivity is expected to enhance extinction risks through a decrease in gene flow and the resulting negative effects of genetic drift, accumulation of deleterious mutations and inbreeding depression. Such phenomena are particularly harmful for amphibian species, which are characterized by disjunct breeding habitats. Because the dispersal behaviour of amphibians is poorly understood, it is crucial to develop new tools that allow us to determine the influence of landscape connectivity on the persistence of populations. In this study, we developed a new landscape genetics approach that aims at identifying land-uses affecting genetic differentiation, without a priori assumptions about associated ecological costs. We surveyed genetic variation at seven microsatellite loci in 19 Alpine newt (Mesotriton alpestris) populations in western Switzerland. Using strips of varying widths that define a dispersal corridor between pairs of populations, we were able to identify land-uses that act as dispersal barriers (i.e. urban areas) and corridors (i.e. forests). Our results suggest that habitat destruction and landscape fragmentation might in the near future affect even common species such as M. alpestris. In addition, by identifying relevant landscape variables influencing population structure without unrealistic assumptions about dispersal, our method offers a simple and flexible investigative tool as an alternative to least-cost models and other approaches.
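The strip-based idea can be caricatured with invented numbers: for each population pair, measure the proportion of each land-use class inside the corridor strip joining the pair, then ask how those proportions co-vary with pairwise genetic differentiation. Everything below is hypothetical (five pairs, two land-use classes, simple slopes), and the real study would use permutation-based inference on distance matrices rather than plain least squares:

```python
import numpy as np

# Hypothetical pairwise data: genetic differentiation (Fst) between
# population pairs, and the proportion of each land-use class inside
# the corridor strip connecting each pair.
fst    = np.array([0.02, 0.10, 0.04, 0.12, 0.03])
urban  = np.array([0.05, 0.60, 0.10, 0.70, 0.05])   # proportion urban
forest = np.array([0.70, 0.10, 0.55, 0.05, 0.65])   # proportion forest

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / (xc * xc).sum()

b_urban = slope(urban, fst)    # positive slope: more urban cover in the
                               # strip goes with stronger differentiation
                               # -> candidate dispersal barrier
b_forest = slope(forest, fst)  # negative slope: more forest goes with
                               # weaker differentiation -> candidate corridor
```

With these invented values, urban cover comes out as a barrier and forest as a corridor, mirroring the study's conclusion; in practice, significance would be assessed with Mantel-type permutation tests, and strip width itself would be varied.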