17 results for GOAL PROGRAMMING APPROACH at Universit


Relevance: 80.00%

Abstract:

This paper presents the current state and development of a prototype web-GIS (Geographic Information System) decision support platform intended for application in natural hazards and risk management, mainly for floods and landslides. The platform uses open-source geospatial software and technologies, particularly the Boundless (formerly OpenGeo) framework and its client-side software development kit (SDK). Its main purpose is to assist experts and stakeholders in the decision-making process for evaluating and selecting different risk management strategies through an interactive participation approach, integrating a web-GIS interface with a decision support tool based on a compromise programming approach. The access rights and functionality of the platform vary depending on the roles and responsibilities of stakeholders in managing risk. The application of the prototype platform is demonstrated on an example case study site, the municipality of Malborghetto Valbruna in north-eastern Italy, where flash floods and landslides are frequent, with major events having occurred in 2003. Preliminary feedback collected from stakeholders in the region is discussed to understand their perspectives on the proposed prototype platform.
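Compromise programming, the technique behind the platform's decision support tool, ranks alternatives by their distance to an ideal point across weighted criteria. A minimal sketch, assuming hypothetical strategy scores and criterion weights (none of these numbers come from the paper):

```python
import numpy as np

def compromise_ranking(scores, weights, p=2):
    """Rank alternatives by weighted Lp distance to the ideal point
    (smaller distance = better compromise). Higher raw scores are
    assumed better on every criterion."""
    scores = np.asarray(scores, dtype=float)
    ideal, anti = scores.max(axis=0), scores.min(axis=0)
    span = np.where(ideal > anti, ideal - anti, 1.0)   # avoid division by zero
    norm = (ideal - scores) / span                     # 0 = ideal, 1 = worst
    dist = (weights * norm ** p).sum(axis=1) ** (1.0 / p)
    return np.argsort(dist)                            # best alternative first

# Three hypothetical risk-management strategies scored on two criteria
# (e.g. cost-effectiveness and stakeholder acceptance)
scores = [[0.9, 0.4], [0.6, 0.8], [0.2, 0.9]]
order = compromise_ranking(scores, weights=np.array([0.6, 0.4]))
```

Here the middle strategy ranks first because it is the balanced compromise; the parameter p controls how heavily shortfalls are penalised (p = 1 sums them, larger p emphasises the worst criterion).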

Relevance: 30.00%

Abstract:

The long-term goal of this research is to develop a program able to automatically segment and categorize textual sequences into discourse types. In this preliminary contribution, we present the construction of an algorithm that takes a segmented text as input and attempts to categorize its sequences as narrative, argumentative, descriptive, and so on. This work also investigates a possible convergence between the typological approach developed in the field of French text and discourse analysis, in particular by Adam (2008) and Bronckart (1997), and unsupervised statistical learning.
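The unsupervised-learning side of such an approach could, for example, cluster segments by surface cues without labelled training data. A toy sketch (the feature choices and counts are invented for illustration; the paper does not specify its algorithm):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical feature vectors for text segments: each row counts surface cues
# (e.g. past-tense verbs, argumentative connectives, descriptive adjectives)
segments = np.array([
    [8, 1, 2],   # narrative-like: many past-tense verbs
    [7, 0, 3],
    [1, 9, 1],   # argumentative-like: many connectives
    [2, 8, 0],
    [0, 1, 9],   # descriptive-like: many adjectives
    [1, 2, 8],
])

# Unsupervised grouping into candidate discourse types; the output labels are
# anonymous clusters, not named categories (naming them needs human inspection)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(segments)
```

With well-separated cues like these, segments sharing a dominant cue fall into the same cluster, which is the convergence between typology and statistics the abstract alludes to.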

Relevance: 30.00%

Abstract:

BACKGROUND & AIMS: Since the publication of the ESPEN guidelines on enteral and parenteral nutrition in the ICU, numerous studies have added information to assist the nutritional management of critically ill patients regarding the recognition of the right population to feed, energy and protein targets, and the route and timing of feeding. METHODS: We reviewed and discussed the literature related to nutrition in the ICU from 2006 until October 2013. RESULTS: It is necessary to identify safe, minimal and maximal amounts for the different nutrients at the different stages of acute illness. These amounts may be specific to the different phases in the time course of the patient's illness. The best approach is to target the energy goal defined by indirect calorimetry. High protein intake (1.5 g/kg/d) is recommended during the early phase of the ICU stay, regardless of the simultaneous calorie intake, as it can reduce catabolism. Later on, high protein intake remains recommended, likely combined with a sufficient amount of energy to avoid proteolysis. CONCLUSIONS: Pragmatic recommendations are proposed to optimize nutritional therapy in practice based on recent publications. However, on some issues there is insufficient evidence to make expert recommendations.
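The early-phase protein recommendation translates into a simple bedside calculation; a minimal sketch for a hypothetical 70 kg patient (the patient weight is invented for illustration):

```python
def protein_target_g_per_day(weight_kg, dose_g_per_kg=1.5):
    """Daily protein target from the early-phase recommendation (1.5 g/kg/d)."""
    return weight_kg * dose_g_per_kg

# Hypothetical 70 kg patient
target = protein_target_g_per_day(70)   # 105.0 g of protein per day
```

The energy goal, by contrast, cannot be computed from weight alone: per the review, it should be measured by indirect calorimetry.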

Relevance: 30.00%

Abstract:

Performing publicly has become increasingly important in a variety of professions, and is associated with performance anxiety in almost all performers. Whereas some performers successfully cope with this anxiety, for others it represents a major problem and can even threaten their career. Musicians, and especially music students, have been shown to be particularly affected by performance anxiety. Therefore, the goal of this PhD thesis was to gain a better understanding of performance anxiety in university music students. More precisely, the first part of this thesis aimed at increasing knowledge on the occurrence, the experience, and the management of performance anxiety (Article 1). The second part aimed at investigating the hypothesis that there is an underlying hyperventilation problem in musicians with a high level of anxiety before a performance. This hypothesis was addressed in two ways: firstly, by investigating the association between the negative affective dimension of music performance anxiety (MPA) and self-perceived physiological symptoms known to co-occur with hyperventilation (Article 2), and secondly, by analyzing this association on the physiological level before a private (audience-free) and a public performance (Article 3). Article 4 places some key variables of Article 3 in a larger context by jointly analyzing the phases before, during, and after performing. The main results of the self-report data show (a) that stage fright is experienced as a problem by one third of the surveyed students, (b) that the students express a considerable need for more help to cope with it, and (c) that there is a positive association between negative feelings of MPA and self-reported hyperventilation complaints before performing. This latter finding was confirmed on the physiological level by a tendency of highly performance-anxious musicians to hyperventilate. Furthermore, psycho-physiological activation increased from the private to the public performance, and was higher during the performances than before or after them. The physiological activation was largely independent of the MPA score. Finally, there was low response coherence between the actual physiological activation and self-reports of instantaneous anxiety, tension, and perceived physiological activation. Given the high proportion of music students who consider stage fright a problem, and their need for more help to cope with it, a better understanding of this phenomenon and its inclusion in the educational process is fundamental to prevent future occupational problems. On the physiological level, breathing exercises might be a good means to decrease, but also to increase, the arousal associated with a public performance in order to reach the optimal level of arousal needed for a good performance.

Relevance: 30.00%

Abstract:

Methods like event history analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, such processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents its main internal drivers, such as the preference for the policy, the effectiveness of the policy, the institutional constraints, and the ideology, and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed by these interdependencies, is thus a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies developing an algorithm and programming it. Once the latter has been developed, we let the different agents interact. Consequently, a phenomenon of diffusion, derived from learning, emerges, meaning that the choice made by an agent is conditional on those made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence, that is, global divergence and local convergence, triggering the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning that not only is time needed for a policy to deploy its effects, but it also takes time for a country to find the best-suited policy.
To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with the theoretical expectations and the empirical evidence.
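The learning mechanism described above can be sketched with a toy agent-based model (an illustrative simplification, not Braun and Gilardi's or the thesis's actual model, and all parameters are invented): agents adopt a policy with a probability that rises with the share of their neighbours that have already adopted it, and cumulative adoption traces out a diffusion curve.

```python
import random

def simulate_diffusion(n=100, steps=60, base=0.01, social=0.5, seed=1):
    """Toy policy diffusion on a ring of n agents: each period a non-adopter
    adopts with probability base + social * (share of its two neighbours
    that adopted), a crude stand-in for the 'learning' mechanism."""
    random.seed(seed)
    adopted = [False] * n
    adopted[0] = True                      # one initial adopter
    curve = []
    for _ in range(steps):
        nxt = adopted[:]
        for i in range(n):
            if adopted[i]:
                continue
            share = (adopted[i - 1] + adopted[(i + 1) % n]) / 2
            if random.random() < base + social * share:
                nxt[i] = True
        adopted = nxt
        curve.append(sum(adopted))         # cumulative adoption over time
    return curve

curve = simulate_diffusion()
```

Because adoption spreads outward from adopters' neighbourhoods, the cumulative curve starts slow, accelerates, then saturates, which is the S-shaped diffusion pattern the thesis discusses.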

Relevance: 30.00%

Abstract:

In the field of perinatality, the development of prenatal diagnosis and neonatal management has been impressive, but has also been accompanied by an increasingly important emotional dimension for parents and professionals. Obstetricians dealing with the difficult breaking of bad news, the uncertainties of prenatal diagnosis, and complex somatic, psychological and social follow-up have to work in a multidisciplinary approach. The securing role of coherent teamwork is recognised by parents as well as by health care providers. This article discusses the interprofessional relationship as an obstetrical goal and gives some landmarks for improving management and collaboration with parents.

Relevance: 30.00%

Abstract:

SUMMARY: Following the complete sequencing of the human genome, the field of nutrition has begun utilizing this vast quantity of information to comprehensively explore the interactions between diet and genes. This approach, coined nutrigenomics, aims to determine the influence of common dietary ingredients on the genome, and attempts to relate the resulting different phenotypes to differences in the cellular and/or genetic response of the biological system. Complementary to defining the biological outcomes of dietary ingredients, however, we must also understand the multiple factors (such as the microbiota, bile, and the function of transporters) that may contribute to the bioavailability, and ultimately the bioefficacy, of these ingredients. The gastrointestinal tract (GIT) is the body's foremost tissue boundary, interacting with nutrients, exogenous compounds and the microbiota, and its condition is shaped by the complex interplay between these environmental factors and genetic elements. To understand GIT nutrient-gene interactions, our goal was to comprehensively elucidate the region-specific gene expression underlying intestinal functions. We found important regional differences in the expression of members of the ATP-binding cassette family of transporters in the mouse intestine, suggesting that the absorption of dietary compounds may vary along the GIT. Furthermore, the influence of the microbiota on host gene expression indicated that this luminal factor predominantly affects immune function and water transport throughout the GIT; however, the identification of region-specific functions suggests distinct host-bacterial interactions along the GIT. These findings thus reinforce the idea that, to understand nutrient bioavailability and GIT function, one must consider the physiologically distinct regions of the gut.
Nutritional molecules absorbed by the enterocytes of the GIT enter the circulation and are selectively absorbed and metabolised by tissues throughout the body; their bioefficacy, however, depends on the unique and shared molecular mechanisms of the various tissues. Using a nutrigenomic approach, we found that the liver and hippocampus of mice fed diets differing in long-chain polyunsaturated fatty acids showed tissue-specific biological responses. Furthermore, we identified stearoyl-CoA desaturase as a hepatic target for arachidonic acid, suggesting a potentially novel molecular mechanism that may protect against diet-induced obesity. In summary, this work begins to unveil the fundamentally important role that nutrigenomics will play in unravelling the molecular mechanisms, and the exogenous factors capable of influencing them, that regulate the bioefficacy of nutritional molecules.

Relevance: 30.00%

Abstract:

Research on achievement goal promotion at university has shown that performance-approach goals are perceived as a means to succeed at university (high social utility) but are not appreciated (low social desirability). We argue that this paradox could explain why research has found that performance-approach goals do not consistently predict academic grades. First-year psychology students answered a performance-approach goal scale under standard, social desirability, and social utility instructions. Participants' grades were recorded at the end of the semester. Results showed that the relationship between performance-approach goals and grades was inhibited as these goals' social desirability increased and facilitated as their social utility increased, revealing that the predictive validity of performance-approach goals depends on social value.
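The reported pattern, a goal-grade link weakened by social desirability and strengthened by social utility, is a moderation effect, typically tested with interaction terms in a regression. A sketch on synthetic data (the coefficients and sample size are invented to reproduce the pattern, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
pag = rng.normal(size=n)   # performance-approach goal endorsement
sd = rng.normal(size=n)    # perceived social desirability of the goals
su = rng.normal(size=n)    # perceived social utility of the goals

# Synthetic grades built so the goal-grade link weakens with desirability
# and strengthens with utility (coefficients chosen for illustration)
grades = 0.3 * pag - 0.2 * pag * sd + 0.4 * pag * su + rng.normal(scale=0.5, size=n)

# Regression with interaction terms: intercept, main effects, PAGxSD, PAGxSU
X = np.column_stack([np.ones(n), pag, sd, su, pag * sd, pag * su])
beta, *_ = np.linalg.lstsq(X, grades, rcond=None)
# beta[4] (PAG x desirability) comes out negative, beta[5] (PAG x utility) positive
```

A negative interaction coefficient means the slope of grades on goals shrinks as the moderator rises, which is exactly the "inhibited by desirability, facilitated by utility" result.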

Relevance: 30.00%

Abstract:

Classroom research on achievement goals has revealed that performance-approach goals (goals to outperform others) positively predict exam performance, whereas performance-avoidance goals (goals not to perform more poorly than others) negatively predict it. Because prior classroom research has primarily utilized multiple-choice exam performance, the first aim of the present study was to extend these findings to a different measure of exam performance (oral examination). The second aim of this research was to test the mediating role of perceived difficulty. Participants were 49 fourth-year psychology students of the University of Geneva. Participants answered a questionnaire assessing their level of performance-approach and performance-avoidance goal endorsement in one of their classes, as well as the perceived difficulty of this class for themselves. Results indicated that performance-approach goals significantly and positively predicted exam grades, and performance-avoidance goals significantly and negatively predicted them. Both of these relationships were mediated by the perceived difficulty of the class for oneself. Thus, the links previously observed between performance goals and exam performance were replicated on an oral exam. Perceived difficulty is discussed as a key dimension responsible for these findings.
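A mediation claim of this kind (goals → perceived difficulty → grades) is commonly tested by combining an a path (predictor to mediator) and a b path (mediator to outcome, controlling for the predictor); the indirect effect is their product, and in ordinary least squares it equals exactly the drop from the total to the direct effect. A sketch on synthetic data (the coefficients are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300
goal = rng.normal(size=n)                                   # performance-avoidance goals
difficulty = 0.5 * goal + rng.normal(scale=0.7, size=n)     # mediator: perceived difficulty
grade = -0.6 * difficulty + rng.normal(scale=0.5, size=n)   # outcome: exam grade

def coef(X, y):
    """Ordinary least-squares coefficients."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b

ones = np.ones(n)
a = coef(np.column_stack([ones, goal]), difficulty)[1]        # a path: goal -> mediator
m = coef(np.column_stack([ones, goal, difficulty]), grade)    # outcome model
b_path, c_direct = m[2], m[1]                                 # b path and direct effect
c_total = coef(np.column_stack([ones, goal]), grade)[1]       # total effect
indirect = a * b_path                                         # mediated (indirect) effect
```

Here the indirect effect is negative (goals raise perceived difficulty, which lowers grades), matching the direction reported for performance-avoidance goals.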

Relevance: 30.00%

Abstract:

Régner, Escribe, and Dupeyrat (2007) recently demonstrated that not only performance-approach and performance-avoidance goals (respectively, the desire to outperform others and not to be outperformed by others) but also mastery goals (the desire to acquire knowledge) are related to social comparison orientation (SCO, the tendency to search for social comparison information). In the present article, we tested the possibility, supported by a multiple-goal perspective, that the link between mastery goals and social comparison depends on the level of performance-approach goals, by examining the interaction effect between mastery and performance-approach goals. This is an important endeavor, as educational settings are rarely free from performance-approach goals, even when mastery goals are promoted. In Study 1, we tested self-set achievement goals (mastery, performance-approach, and performance-avoidance goals) as predictors of SCO; the interaction between mastery and performance-approach goals indicated that the higher the performance-approach goal endorsement, the stronger the link between mastery goals and SCO. In Study 2, we manipulated goal conditions; mastery goals predicted interest in social comparison in the performance-approach goal condition only. Results are discussed in terms of the importance of multiple-goal pursuit in academic settings.

Relevance: 30.00%

Abstract:

In this dissertation, we investigate whether the pursuit of performance-approach goals (i.e., the desire to outperform others and appear talented) facilitates or rather endangers achievement and learning, an issue that is still widely discussed in the achievement goal literature. Four experiments carried out in a laboratory setting provided evidence that performance-approach goals create a divided-attention situation in which cognitive resources are split between task processing and the activation of goal-attainment concerns, which jeopardizes full cognitive immersion in the task. In a second research line, we found evidence that high achievers (i.e., those individuals most used to succeeding) experience, in evaluative contexts, heightened pressure to excel at the task, deriving from concerns about preserving their "high-achiever" status. Finally, a third research line was designed to reconcile the results of our laboratory studies with the overall profile emerging from longitudinal research, which has consistently found performance-approach goals to be a positive predictor of students' test scores. We thus set up a longitudinal study to test whether students' adoption of performance-approach goals in a long-term classroom setting enhances the implementation of strategic study behaviors tactically directed toward goal attainment, hence favoring test performance. Our findings supported this hypothesis, but only for low-achieving students. Taken together, our findings shed new light on the cognitive processes at play during the pursuit of performance-approach goals, and are likely to fuel the debate on whether performance-approach goals should be encouraged in educational settings.

Relevance: 30.00%

Abstract:

Background: Conventional magnetic resonance imaging (MRI) techniques are highly sensitive for detecting multiple sclerosis (MS) plaques, enabling a quantitative assessment of inflammatory activity and lesion load. In quantitative analyses of focal lesions, manual or semi-automated segmentations have been widely used to compute the total number of lesions and the total lesion volume. These techniques, however, are both challenging and time-consuming, and are prone to intra-observer and inter-observer variability. Aim: To develop an automated approach to segment brain tissues and MS lesions from brain MRI images, reducing user interaction and providing an objective tool that eliminates inter- and intra-observer variability. Methods: Based on the recent methods developed by Souplet et al. and de Boer et al., we propose a novel pipeline with the following steps: bias correction, skull stripping, atlas registration, tissue classification, and lesion segmentation. After the initial pre-processing steps, an MRI scan is automatically segmented into four classes: white matter (WM), grey matter (GM), cerebrospinal fluid (CSF) and partial volume. An expectation-maximisation method that fits a multivariate Gaussian mixture model to the T1-w, T2-w and PD-w images is used for this purpose. Based on the obtained tissue masks, and using the estimated GM mean and variance, we apply an intensity threshold to the FLAIR image, which provides the lesion segmentation. To improve this initial result, spatial information coming from the neighbouring tissue labels is used to refine the final lesion segmentation. Results: The experimental evaluation was performed using real 1.5T data sets and the corresponding ground-truth annotations provided by expert radiologists. The following values were obtained: a true positive (TP) fraction of 64%, a false positive (FP) fraction of 80%, and an average surface distance of 7.89 mm. The results of our approach were quantitatively compared with our implementations of the works of Souplet et al. and de Boer et al., obtaining higher TP and lower FP values. Conclusion: Promising MS lesion segmentation results have been obtained in terms of TP. However, the high number of FP, still a well-known problem of all automated MS lesion segmentation approaches, has to be reduced before such methods can be used in standard clinical practice. Our future work will focus on tackling this issue.
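The tissue-classification and thresholding steps can be sketched as follows, using synthetic intensities in place of real co-registered scans (sklearn's GaussianMixture stands in for the pipeline's EM implementation; all intensity values are invented):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic stand-ins for co-registered T1-w/T2-w/PD-w intensities of three tissues
wm = rng.normal([150, 60, 80], 5, size=(500, 3))
gm = rng.normal([100, 90, 110], 5, size=(500, 3))
csf = rng.normal([40, 160, 140], 5, size=(500, 3))
voxels = np.vstack([wm, gm, csf])

# EM fit of a multivariate Gaussian mixture, as in the tissue-classification step
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(voxels)

# Lesion step: threshold a synthetic FLAIR image at GM mean + 3*std.
# (In the pipeline the GM statistics come from the fitted GM component;
# here they are taken from the synthetic GM sample for brevity.)
flair_gm = rng.normal(100, 10, size=2000)       # normal-appearing GM on FLAIR
flair_lesion = rng.normal(180, 10, size=50)     # hyperintense lesion voxels
flair = np.concatenate([flair_gm, flair_lesion])
threshold = flair_gm.mean() + 3 * flair_gm.std()
lesion_mask = flair > threshold
```

The threshold catches the hyperintense lesion voxels while letting through only a few GM voxels, which illustrates why spatial refinement against the neighbouring tissue labels is needed to suppress the remaining false positives.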

Relevance: 30.00%

Abstract:

Strong leadership from heads of state is needed to meet national commitments to the UN political declaration on non-communicable diseases (NCDs) and to achieve the goal of a 25% reduction in premature NCD mortality by 2025 (the 25 by 25 goal). A simple, phased, national response to the political declaration is suggested, with three key steps: planning, implementation, and accountability. Planning entails mobilisation of a multisectoral response to develop and support the national action plan, and to build human, financial, and regulatory capacity for change. Implementation of a few priority, feasible, and cost-effective interventions for the prevention and treatment of NCDs can achieve the 25 by 25 goal with only a few additional financial resources. Accountability incorporates three dimensions: monitoring of progress, reviewing of progress, and appropriate responses to accelerate progress. A national NCD commission or equivalent, independent of government, is needed to ensure that all relevant stakeholders are held accountable for the UN commitments on NCDs.
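As a back-of-the-envelope check of what the 25 by 25 goal implies, a 25% reduction over 15 years (assuming the commonly used 2010 baseline, and a constant proportional rate of decline) works out to roughly a 1.9% drop per year:

```python
# Constant annual percentage decline needed for premature NCD mortality to
# fall by 25% between a 2010 baseline and 2025 (15 years), assuming a
# constant proportional rate of decline
years = 2025 - 2010
annual_decline = 1 - (1 - 0.25) ** (1 / years)   # ~0.019, i.e. about 1.9% per year
```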

Relevance: 30.00%

Abstract:

The goal of this dissertation is to find and provide the basis for a managerial tool that allows a firm to easily express its business logic. The methodological basis for this work is design science, in which the researcher builds an artifact to solve a specific problem. In this case the aim is to provide an ontology that makes it possible to make a firm's business model explicit. In other words, the proposed artifact helps a firm to formally describe its value proposition, its customers, the relationships with them, the necessary intra- and inter-firm infrastructure, and its profit model. Such an ontology is relevant because until now there has been no model that expresses a company's global business logic from a pure business point of view. Previous models essentially take an organizational or process perspective, or cover only parts of a firm's business logic. The four main pillars of the ontology, inspired by management science and by enterprise and process modeling, are product, customer interface, infrastructure, and finance. The ontology is validated by case studies and by a panel of experts and managers. The dissertation also provides a software prototype to capture a company's business model in an information system. The last part of the thesis demonstrates the value of the ontology for business strategy and Information Systems (IS) alignment. Structure of this thesis: The dissertation is structured in nine parts. Chapter 1 presents the motivations of this research, the research methodology with which the goals shall be achieved, and why this dissertation presents a contribution to research. Chapter 2 investigates the origins, the term, and the concept of business models; it defines what is meant by business models in this dissertation, situates them in the context of the firm, and outlines the possible uses of the business model concept. Chapter 3 gives an overview of the research done in the field of business models and enterprise ontologies. Chapter 4 introduces the major contribution of this dissertation: the business model ontology. In this part of the thesis the elements, attributes, and relationships of the ontology are explained and described in detail. Chapter 5 presents a case study of the Montreux Jazz Festival, whose business model was captured by applying the structure and concepts of the ontology; it gives an impression of what a business model description based on the ontology looks like. Chapter 6 shows an instantiation of the ontology in a prototype tool: the Business Model Modelling Language BM2L, an XML-based description language that makes it possible to capture and describe the business model of a firm and has a large potential for further applications. Chapter 7 covers the evaluation of the business model ontology, building on a literature review, a set of interviews with practitioners, and case studies. Chapter 8 gives an outlook on possible future research and applications of the business model ontology; the main areas of interest are the alignment of business and information technology (IT)/information systems (IS) and business model comparison. Finally, Chapter 9 presents some conclusions.
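To give a flavour of what an ontology-based, XML-style description such as BM2L might look like, here is a hypothetical sketch built around the four pillars; the element names and content are invented for illustration and are not the actual BM2L schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified rendering of a business model along the ontology's
# four pillars; element names are illustrative, not the real BM2L vocabulary
bm = ET.Element("businessModel", name="ExampleFirm")
ET.SubElement(bm, "product").text = "Value proposition offered to customers"
ET.SubElement(bm, "customerInterface").text = "Target customers, channels, relationships"
ET.SubElement(bm, "infrastructure").text = "Intra- and inter-firm activities and partners"
ET.SubElement(bm, "finance").text = "Revenue streams and cost structure"

xml_text = ET.tostring(bm, encoding="unicode")
```

An XML serialisation of this kind is what makes the captured model machine-readable, and hence usable for the comparison and IT/IS alignment applications the thesis points to.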

Relevance: 30.00%

Abstract:

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Knowing that the continents move, geologists try to retrace their distribution through the ages.
If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly across the oceanic realm but belong to a larger ensemble combining, all at once, oceanic and continental crust: the tectonic plates. Unfortunately, mainly for historical and technical reasons, this idea has yet to receive a sufficient echo within the reconstruction community. Nevertheless, we are firmly convinced that, by applying specific methods and principles, we can move beyond the traditional "Wegenerian" point of view and, at last, reach true plate tectonics. The main aim of this study is to defend this point of view by exposing, in full detail, our methods and tools. Starting from the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the problem. Using continental assemblies (referred to as "key assemblies") as anchor points distributed over the whole scope of our study (ranging from Eocene to Cambrian time), we develop geodynamic scenarios leading from one assembly to the next, from the past towards the present. In between, lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved through time, follow a consistent geodynamic evolution, and always form an interconnected network in space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides strong constraints on the model.
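Moving a plate "as a single rigid entity" amounts to applying a finite rotation about an Euler pole on the sphere. The abstract does not describe the thesis's actual tooling, so the following is only a minimal sketch of that idea in Python with NumPy; the function names and the Rodrigues-formula implementation are illustrative assumptions, not the author's code:

```python
import numpy as np

def rotation_matrix(pole_lat_deg, pole_lon_deg, angle_deg):
    """Finite rotation about an Euler pole (lat/lon in degrees),
    built with Rodrigues' rotation formula."""
    lat, lon = np.radians([pole_lat_deg, pole_lon_deg])
    # Unit vector of the Euler pole in Earth-centred Cartesian coordinates
    k = np.array([np.cos(lat) * np.cos(lon),
                  np.cos(lat) * np.sin(lon),
                  np.sin(lat)])
    a = np.radians(angle_deg)
    # Skew-symmetric cross-product matrix of k
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)

def rotate_point(lat_deg, lon_deg, R):
    """Apply a rotation matrix to a surface point; returns (lat, lon) in degrees."""
    lat, lon = np.radians([lat_deg, lon_deg])
    p = np.array([np.cos(lat) * np.cos(lon),
                  np.cos(lat) * np.sin(lon),
                  np.sin(lat)])
    q = R @ p
    return (np.degrees(np.arcsin(q[2])),
            np.degrees(np.arctan2(q[1], q[0])))

# Rotating about the north pole is a pure change of longitude:
R = rotation_matrix(90.0, 0.0, 30.0)
print(rotate_point(10.0, 20.0, R))  # latitude unchanged, longitude shifted by +30
```

Every point of a rigid plate is rotated with the same matrix, which is what preserves plate shape between two reconstruction steps; successive finite rotations compose by matrix multiplication.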
This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are modern software tools specifically devoted to storing, analyzing and managing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted the mass of scattered data offered by the geological record into valuable geodynamic information, easily accessible for the creation of reconstructions. At the same time, by programming specific tools, we both facilitated the reconstruction work (task automation) and enhanced the model, greatly increasing the kinematic control of plate motions through plate velocity models. Based on the 340 newly defined terranes, we developed a revised set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset, we can now tackle major issues of modern geology, such as global sea-level variations and climate change. We started by studying one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, throughout the Earth's history, plate rotation poles (which describe plate motions across the Earth's surface) tend to follow a broadly linear distribution along a band running from the Northern Pacific through northern South America, the Central Atlantic, northern Africa and Central Asia up to Japan. Essentially, this means that plates tend to flee this median plane. Barring an unidentified methodological bias, we interpreted this as reflecting a possible secular influence of the Moon on plate motions. The oceanic realms are the cornerstone of our model, and we took particular care to reconstruct them in detail. In this model, the oceanic crust is preserved from one reconstruction to the next.
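The "plate velocity models" mentioned above rest on a standard kinematic relation: a point at position r on a plate rotating with Euler vector ω moves with surface velocity v = ω × r. A minimal sketch under that assumption (function names and units are mine; the thesis's actual velocity-model implementation is not described in the abstract):

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def surface_velocity(pole_lat_deg, pole_lon_deg, rate_deg_per_myr,
                     lat_deg, lon_deg):
    """Speed of a surface point on a plate rotating about an Euler pole
    at the given angular rate, from v = omega x r.
    Returned in km/Myr, which is numerically equal to mm/yr."""
    def unit(lat, lon):
        lat, lon = np.radians([lat, lon])
        return np.array([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)])
    omega = np.radians(rate_deg_per_myr) * unit(pole_lat_deg, pole_lon_deg)
    r = EARTH_RADIUS_KM * unit(lat_deg, lon_deg)
    return np.linalg.norm(np.cross(omega, r))

# 1 deg/Myr about the north pole, point on the equator:
# speed = radius * angular rate = 6371 * pi/180 km/Myr
print(round(surface_velocity(90.0, 0.0, 1.0, 0.0, 0.0), 1))  # 111.2
```

Evaluating this on a grid of plate points is one straightforward way to turn a set of rotation poles into the per-reconstruction velocity fields used as a kinematic check.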
The crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstruct the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering better precision than any previously available.
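Deriving bathymetry from isochron ages relies on an age-depth relation for oceanic lithosphere. The abstract does not say which relation the thesis uses; a common choice for crust younger than roughly 70 Myr is the classic Parsons and Sclater (1977) square-root-of-age law, sketched here purely for illustration:

```python
import math

def seafloor_depth_m(age_myr):
    """Parsons & Sclater (1977) age-depth relation for young oceanic
    lithosphere: depth below sea level grows with the square root of
    crustal age (valid roughly for ages under ~70 Myr)."""
    return 2500.0 + 350.0 * math.sqrt(age_myr)

# Predicted depth (metres) for crust of age 0, 25 and 64 Myr:
print([seafloor_depth_m(t) for t in (0.0, 25.0, 64.0)])  # [2500.0, 4250.0, 5300.0]
```

Applied to every synthetic isochron, a relation of this kind converts the reconstructed age grid of the ocean floor into a 3-D bathymetric surface.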