123 results for Diffusion process

at Université de Lausanne, Switzerland


Relevance: 70.00%

Abstract:

In this thesis, I examine the diffusion process for a complex medical technology, the PET scanner, in two different health care systems, one of which is more market-oriented (Switzerland) and the other more centrally managed by a public agency (Quebec). The research draws on institutional and socio-political theories of the diffusion of innovations to examine how institutional contexts affect processes of diffusion. I find that diffusion proceeds more rapidly in Switzerland than in Quebec, but that processes in both jurisdictions are characterized by intense struggles among providers and between providers and public agencies. I show that the institutional environment influences these processes by determining the patterns of material resources and authority available to actors in their struggles to strategically control the technology, and by constituting the discursive resources or institutional logics on which actors may legitimately draw in their struggles to give meaning to the technology in line with their interests and values. This thesis illustrates how institutional structures and meanings manifest themselves in the context of specific decisions within an organizational field, and reveals the ways in which governance structures may be contested and realigned when they conflict with interests that are legitimized by dominant institutional logics. It is argued that this form of contestation and readjustment at the margins constitutes one mechanism by which institutional frameworks are tested, stretched and reproduced or redefined.

Relevance: 70.00%

Abstract:

The influenza of the winter of 1889-90 was one of the first epidemics to spread all over the world. At the time, several people hypothesized that the railway was one of the main vectors of diffusion of this influenza. This hypothesis was defended in Switzerland especially by Schmid, Chief of the Swiss Office of Health, who collected an impressive body of material about the spread of the epidemic in that country. These data on influenza combined with data about the structure of the railway are used in this paper in order to test the hypothesis of a mixed diffusion process, first between communes interconnected by the railway, and secondly, between those communes and neighbouring communes. An event history analysis model taking into account diffusion effects is proposed and estimated. Results show that the hypothesis is supported if the railway network in Switzerland is not taken as a whole but if a distinction between railway companies is made.
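The mixed diffusion process described above can be sketched as a toy discrete-time simulation: communes on a line, geographic contagion between adjacent communes, and faster contagion along a railway that directly links every fifth commune. This is an illustration of the hypothesis only, not an estimate from Schmid's 1889-90 data; all transmission probabilities and the network layout are made up.

```python
import random

def simulate_mixed_diffusion(n=50, weeks=30, p_rail=0.5, p_geo=0.25, seed=7):
    """Toy mixed diffusion: communes 0..n-1 on a line.

    Geographic neighbours are adjacent communes; a railway directly
    links every 5th commune. Each week, every infected neighbour of a
    susceptible commune transmits independently, with a higher
    per-contact probability along the railway (p_rail) than between
    neighbouring communes (p_geo). Returns the week of onset per commune.
    """
    random.seed(seed)
    geo = {i: [j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)}
    rail = {i: [] for i in range(n)}
    stations = [i for i in range(n) if i % 5 == 0]
    for a, b in zip(stations, stations[1:]):
        rail[a].append(b)
        rail[b].append(a)
    infected, onset = {0}, {0: 0}          # index case in commune 0
    for week in range(1, weeks + 1):
        new = set()
        for i in range(n):
            if i in infected:
                continue
            n_rail = sum(j in infected for j in rail[i])
            n_geo = sum(j in infected for j in geo[i])
            # discrete-time hazard: 1 minus the probability of escaping
            # every infectious contact this week
            hazard = 1 - (1 - p_rail) ** n_rail * (1 - p_geo) ** n_geo
            if random.random() < hazard:
                new.add(i)
                onset[i] = week
        infected |= new
    return onset

onset = simulate_mixed_diffusion()
```

In the spirit of the paper, one would then fit an event history model with the rail and geographic exposure counts as time-varying covariates; allowing p_rail to differ by railway company would mirror the paper's key distinction.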

Relevance: 60.00%

Abstract:

Summary (in English): Computer simulations provide a practical way to address scientific questions that would otherwise be intractable. In evolutionary biology, and in population genetics in particular, the investigation of evolutionary processes frequently involves implementing complex models, making simulations a particularly valuable tool in the area. In this thesis, I explored three questions involving the geographical range expansion of populations, taking advantage of spatially explicit simulations coupled with approximate Bayesian computation. First, the neutral evolutionary history of the human spread around the world was investigated, leading to a surprisingly simple model: a straightforward diffusion process of migrations from East Africa across a world map with homogeneous landmasses replicated to a very large extent the complex patterns observed in real human populations. This suggests a more continuous (as opposed to structured) view of the distribution of modern human genetic diversity, which may serve better as a base model for further studies. Second, the postglacial evolution of the European barn owl and the formation of its remarkable coat-color cline were investigated with two rounds of simulations: (i) determining the background demographic history and (ii) testing the probability that a phenotypic cline like the one observed in natural populations could appear without natural selection. We verified that the modern barn owl population originated from a single Iberian refugium and that its color cline formed not through neutral evolution but with the necessary participation of selection. The third and last part of this thesis is a simulation-only study inspired by the barn owl case above. In this chapter, we showed that selection is indeed effective during range expansions and that it leaves a distinctive signature, which can then be used to detect and measure natural selection in range-expanding populations.
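The serial-founder logic behind that "simple diffusion model" can be sketched in a few lines. This is a toy illustration only, not the spatially explicit ABC machinery used in the thesis; the deme sizes, founder counts and single biallelic locus are made-up simplifications.

```python
import random

def serial_founder_expansion(n_demes=30, deme_size=100, founders=10,
                             generations=20, p0=0.5, seed=2):
    """Serial-founder sketch of a range expansion.

    Each new deme is colonized by a small founder sample drawn from the
    previous deme, then drifts for a few generations. Repeated
    bottlenecks erode genetic diversity with distance from the origin.
    Returns expected heterozygosity 2p(1-p) per deme for one biallelic locus.
    """
    random.seed(seed)
    freqs = [p0]                       # allele frequency in the origin deme
    for _ in range(1, n_demes):
        p = freqs[-1]
        # founder event: binomial sampling of 2*founders gene copies
        copies = sum(1 for _ in range(2 * founders) if random.random() < p)
        p = copies / (2 * founders)
        # a few generations of drift at the full deme size
        for _ in range(generations):
            copies = sum(1 for _ in range(2 * deme_size) if random.random() < p)
            p = copies / (2 * deme_size)
        freqs.append(p)
    return [2 * p * (1 - p) for p in freqs]

het = serial_founder_expansion()
```

Repeated founder events plus drift produce a decline of heterozygosity along the expansion axis, the qualitative pattern that a homogeneous-world diffusion model reproduces in real human data.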

Relevance: 30.00%

Abstract:

A new hypothesis is formulated to explain the development of rapakivi texture in and around the mafic enclaves of porphyritic granitoids, i.e. in environments involving magma mixing and mingling. The formation of a plagioclase mantle around alkali feldspar megacrysts is attributed to the localized presence of a melt resulting from the reaction of these megacrysts with the host hybrid magma, with which they are in disequilibrium. This feldspathic melt adheres to the resorbed crystals and is virtually immiscible with the surrounding magma. Its composition is modified in terms of the relative proportions of K2O, Na2O, and CaO through selective diffusion of these elements, thus allowing the specific crystallization of andesine. With decreasing temperature, the K-feldspar, again stable, crystallizes along with the plagioclase, leading to mixed mantle structures.

Relevance: 30.00%

Abstract:

Methods like event history analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, such processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion inspired mainly by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents the main internal drivers of policy diffusion - the preference for the policy, the effectiveness of the policy, the institutional constraints, and the ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed through these interdependencies, is thus a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies developing an algorithm and programming it. Once the algorithm is implemented, we let the different agents interact. Consequently, a phenomenon of diffusion, derived from learning, emerges, meaning that the choice made by an agent is conditional on the choices made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - that triggers the emergence of political clusters, i.e. regions sharing the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning that not only is time needed for a policy to deploy its effects, but it also takes time for a country to find the best-suited policy.
To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes produced by my model are in line with both the theoretical expectations and the empirical evidence.
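The learning mechanism described above can be sketched as a minimal grid model. This is a deliberately stripped-down illustration, not the thesis model itself (which follows Braun and Gilardi, 2006); the grid layout, the adoption rule eps + 0.6 * share, and all constants are made up.

```python
import random

def policy_learning(width=20, height=20, steps=60, eps=0.02, seed=3):
    """Toy agent-based sketch of policy diffusion through learning.

    Countries sit on a grid; each step a non-adopter adopts the new
    policy with probability eps (independent discovery) plus a term
    proportional to the share of adopting neighbours (learning from
    neighbours). Returns the cumulative adoption count after each step;
    plotted, it roughly traces an S-curve, and adopters grow as
    spatially contiguous clusters.
    """
    random.seed(seed)
    adopted = [[False] * width for _ in range(height)]
    counts = []
    for _ in range(steps):
        snapshot = [row[:] for row in adopted]     # synchronous update
        for y in range(height):
            for x in range(width):
                if snapshot[y][x]:
                    continue
                nbrs = [(y + dy, x + dx)
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= y + dy < height and 0 <= x + dx < width]
                share = sum(snapshot[ny][nx] for ny, nx in nbrs) / len(nbrs)
                if random.random() < eps + 0.6 * share:
                    adopted[y][x] = True
        counts.append(sum(sum(row) for row in adopted))
    return counts

counts = policy_learning()
```

Because each agent's choice is conditional on its neighbours' choices, adoption is slow at first, accelerates once local clusters form, and then saturates, the emergent pattern described in the abstract.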

Relevance: 30.00%

Abstract:

This article studies the diffusion of the main institutional feature of regulatory capitalism, namely, independent regulatory agencies. While only a few such authorities existed in Europe in the early 1980s, by the end of the twentieth century they had spread impressively across countries and sectors. The analysis finds that three classes of factors (bottom-up, top-down, and horizontal) explain this trend. First, the establishment of independent regulatory agencies was an attempt to improve credible commitment capacity when liberalizing and privatizing utilities and to alleviate the political uncertainty problem, namely, the risk to a government that its policies will be changed when it loses power. Second, Europeanization favored the creation of independent regulators. Third, individual decisions were interdependent, as governments were influenced by the decisions of others in an emulation process where the symbolic properties of independent regulators mattered more than the functions they performed.

Relevance: 30.00%

Abstract:

This article builds on the recent policy diffusion literature and attempts to overcome one of its major problems, namely the lack of a coherent theoretical framework. The literature defines policy diffusion as a process where policy choices are interdependent, and identifies several diffusion mechanisms that specify the link between the policy choices of the various actors. As these mechanisms are grounded in different theories, theoretical accounts of diffusion currently have little internal coherence. In this article we put forward an expected-utility model of policy change that is able to subsume all the diffusion mechanisms. We argue that the expected utility of a policy depends on both its effectiveness and the payoffs it yields, and we show that the various diffusion mechanisms operate by altering these two parameters. Each mechanism affects one of the two parameters, and does so in distinct ways. To account for aggregate patterns of diffusion, we embed our model in a simple threshold model of diffusion. Given the high complexity of the process that results, strong analytical conclusions on aggregate patterns cannot be drawn without more extensive analysis which is beyond the scope of this article. However, preliminary considerations indicate that a wide range of diffusion processes may exist and that convergence is only one possible outcome.
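The aggregate side of the argument, the simple threshold model in which the expected-utility calculus is embedded, can be sketched as follows. Collapsing each actor's expected utility into a single adoption threshold is a simplification of the article's model, and the threshold values below are made up.

```python
def threshold_cascade(thresholds):
    """Simple threshold model of diffusion.

    Each actor adopts once the share of previous adopters reaches its
    personal threshold, which here stands in for the point at which the
    policy's expected utility (effectiveness and payoffs, as shifted by
    the diffusion mechanisms) favours adoption. Returns the adoption
    count after each round, iterated to a fixed point.
    """
    n = len(thresholds)
    history = [sum(t <= 0 for t in thresholds)]    # unconditional adopters
    while True:
        share = history[-1] / n
        new_total = sum(t <= share for t in thresholds)
        if new_total == history[-1]:
            break
        history.append(new_total)
    return history

# evenly spread thresholds produce a full cascade...
full = threshold_cascade([i / 10 for i in range(10)])      # full[-1] == 10
# ...while removing the low-threshold pioneers stalls diffusion entirely
stalled = threshold_cascade([0.2 + i / 10 for i in range(10)])  # stalled == [0]
```

The two runs illustrate the article's closing point: depending on how the mechanisms shift the parameters, a wide range of diffusion outcomes can occur, and convergence is only one of them.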

Relevance: 30.00%

Abstract:

Preface: The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, made them the models of choice for many theoretical constructions and practical applications. At the same time, estimating the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem comes from the variance process, which is not observable. Several estimation methodologies deal with the estimation of latent variables. One appeared particularly interesting: it proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process, the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function. However, the procedure had been derived only for stochastic volatility models without jumps, and it thus became the subject of my research. This thesis consists of three parts, each written as an independent, self-contained article. At the same time, the questions answered by the second and third parts arise naturally from the issues investigated and the results obtained in the first. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and the variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is a closed-form expression for the joint unconditional characteristic function of stochastic volatility jump-diffusion models.
The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equations are relevant for modelling returns of the S&P 500 index, which was chosen as a general representative of the stock asset class. Hence, the next question is: what jump process should be used to model S&P 500 returns? The decision about the jump process in the framework of affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P 500 index by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either an exponential or a double-exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already arisen when the first estimation results of the first chapter were obtained: in the absence of a benchmark or any ground for comparison, it is unreasonable to be sure that our parameter estimates coincide with the true parameters of the models. The conclusion of the second chapter provides one more reason for such a test. Thus, the third part of this thesis concentrates on estimating the parameters of stochastic volatility jump-diffusion models on the basis of asset price time series simulated from various "true" parameter sets.
The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of finding the true parameters, and the third chapter demonstrates that our estimator indeed has this ability. Once it is clear that the estimator works, the next question appears immediately: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used in its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure. In practice, however, this relationship is not so straightforward because of increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, two-dimensional, one. As a result, the preference for one or the other depends on the model to be estimated; the computational effort can thus be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of the estimators with two- and three-dimensional unconditional characteristic functions on simulated data. It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, owing to the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, two-dimensional, unconditional characteristic function has every reason to exist and to be used for estimating the parameters of stochastic volatility jump-diffusion models.
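To make the core objects concrete, the sketch below simulates one-period log-returns from a jump-diffusion with normally distributed jumps and compares the empirical characteristic function with the model's closed-form characteristic function. This is a toy illustration of the ECF idea for an i.i.d. returns model, not the thesis's affine stochastic volatility setting with its joint unconditional characteristic function; all parameter values are made up.

```python
import cmath
import math
import random

def merton_cf(u, mu, sigma, lam, mu_j, sigma_j):
    """Closed-form characteristic function of a one-period log-return:
    Gaussian drift-diffusion part plus compound Poisson jumps with
    normally distributed jump sizes."""
    diff = 1j * u * mu - 0.5 * sigma ** 2 * u ** 2
    jump = lam * (cmath.exp(1j * u * mu_j - 0.5 * sigma_j ** 2 * u ** 2) - 1)
    return cmath.exp(diff + jump)

def simulate_returns(n, mu, sigma, lam, mu_j, sigma_j, seed=11):
    """Draw n i.i.d. one-period log-returns from the same model."""
    random.seed(seed)
    out = []
    for _ in range(n):
        r = random.gauss(mu, sigma)
        # Poisson(lam) number of jumps via inversion sampling
        k, p, u = 0, math.exp(-lam), random.random()
        cum = p
        while u > cum:
            k += 1
            p *= lam / k
            cum += p
        for _ in range(k):
            r += random.gauss(mu_j, sigma_j)
        out.append(r)
    return out

def ecf(u, xs):
    """Empirical characteristic function: sample mean of exp(i*u*X)."""
    return sum(cmath.exp(1j * u * x) for x in xs) / len(xs)

params = dict(mu=0.05, sigma=0.2, lam=0.5, mu_j=-0.1, sigma_j=0.15)
xs = simulate_returns(50_000, **params)
gap = abs(ecf(1.0, xs) - merton_cf(1.0, **params))   # small for large n
```

An ECF estimator then picks the parameters that minimize a (weighted) distance between the empirical and model characteristic functions over a grid of u values; the thesis's latent-variance setting replaces this simple CF with the joint unconditional one.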

Relevance: 30.00%

Abstract:

Evidence-based practice (EBP) aims for a new distribution of power centered on scientific evidence rather than clinical expertise. The present article describes the operational process of EBP by describing the implementation stages of this type of practice. This presentation of the stages is essential given that there are many conceptions and models of EBP and that some nurses have limited knowledge of its rules and implications. Given that the number and formulation of the stages vary by author, the process presented here attempts to integrate the different stages reviewed.

Relevance: 30.00%

Abstract:

Diffusion MRI has evolved into an important clinical diagnostic and research tool. Although clinical routine mainly uses diffusion-weighted and tensor imaging approaches, Q-ball imaging and diffusion spectrum imaging techniques have become more widely available. They are frequently used in research-oriented investigations, in particular those aiming at measuring brain network connectivity. In this work, we assess how connectivity measurements depend on the diffusion encoding scheme in combination with the data modeling applied. We process and compare the structural connection matrices computed from several diffusion encoding schemes, including diffusion tensor imaging, q-ball imaging and high angular resolution schemes such as diffusion spectrum imaging, with a publicly available processing pipeline for the reconstruction, tracking and visualization of diffusion MR imaging data. The results indicate that the high angular resolution schemes maximize the number of obtained connections when identical processing strategies are applied to the different diffusion schemes. Compared with conventional diffusion tensor imaging, the added connectivity is mainly found for pathways in the 50-100 mm range, corresponding to neighboring association fibers and to long-range associative, striatal and commissural fiber pathways. The analysis of the major associative fiber tracts of the brain reveals striking differences between the applied diffusion schemes. More complex data modeling techniques (beyond the tensor model) are recommended (1) if the tracts of interest run through large fiber crossings such as the centrum semiovale, or (2) if non-dominant fiber populations, e.g. the neighboring association fibers, are the subject of investigation. An important finding of the study is that, since the ground-truth sensitivity and specificity are not known, results arising from different reconstruction and/or tracking strategies are difficult to compare.

Relevance: 30.00%

Abstract:

Abstract: A better understanding of stromatolites and microbial mats is an important topic in biogeosciences, as it helps in studying the early forms of life on Earth, provides clues regarding the ecology of microbial communities and the contribution of microorganisms to biomineralization, and even lays some groundwork for research in exobiology. Modelling, on the other hand, is a powerful tool used in the natural sciences for the theoretical study of various phenomena. Models are usually built on a system of differential equations, and results are obtained by solving that system; the software available for implementing models includes mathematical solvers and general simulation packages. The main objective of this thesis is to develop models and software that help in understanding, through simulation, the functioning of stromatolites and microbial mats. The software was developed in C++ from scratch for maximum performance and flexibility, an approach that allows models far more specific and better suited to the phenomena being modelled than general-purpose software. First, we studied stromatolite growth and morphology. We built a three-dimensional model based on diffusion-limited aggregation, implemented in two C++ applications: a simulation engine, which can run a batch of simulations and produce result files, and a visualization tool, which allows the results to be analysed in three dimensions. After verifying that our model can indeed reproduce the growth and morphology of several types of stromatolites, we introduced a sedimentation process as an external factor. This led to interesting results and supported the hypothesis that stromatolite morphology may be the result of external factors as much as of internal ones. This matters because stromatolite classification is usually based on morphology, which presumes that a stromatolite's shape depends on internal factors only (i.e. the microbial mat); our findings contradict that commonly accepted assumption. Second, we investigated the functioning of microbial mats in more depth. We built a two-dimensional reaction-diffusion model based on discrete simulation, implemented in a C++ application that allows simulations to be configured and run. We could then compare simulation results with real-world data and verify that our model can indeed mimic the behaviour of some microbial mats. This allowed us to propose and test hypotheses about how certain microbial mats function, helping us to better understand aspects such as the dynamics of elements, in particular sulfur and oxygen. In conclusion, this work produced software dedicated to the simulation of microbial mats from both a morphological and a functional point of view, following two different approaches, one holistic and one more analytical. The software is free and distributed under the GPL (General Public License).
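The growth rule at the heart of the stromatolite model, diffusion-limited aggregation, can be sketched in two dimensions. The thesis implementation is a three-dimensional C++ model with sedimentation as an external factor; this toy keeps only the core stick-on-contact rule, and all constants are illustrative.

```python
import math
import random

def dla_cluster(n_particles=300, seed=5):
    """Minimal 2-D diffusion-limited aggregation on a square lattice.

    A seed particle sits at the origin; walkers are released on a circle
    around the cluster and perform a random walk until they touch an
    occupied site (and stick) or stray too far (and are relaunched).
    Returns the set of occupied lattice sites, connected by construction.
    """
    random.seed(seed)
    cluster = {(0, 0)}
    max_r = 0.0                               # current cluster radius
    moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
    while len(cluster) < n_particles:
        launch_r = max_r + 5
        a = random.uniform(0, 2 * math.pi)
        x, y = round(launch_r * math.cos(a)), round(launch_r * math.sin(a))
        while True:
            if math.hypot(x, y) > launch_r + 10:          # walker lost: relaunch
                a = random.uniform(0, 2 * math.pi)
                x = round(launch_r * math.cos(a))
                y = round(launch_r * math.sin(a))
            if any((x + dx, y + dy) in cluster for dx, dy in moves):
                cluster.add((x, y))                       # stick on contact
                max_r = max(max_r, math.hypot(x, y))
                break
            dx, dy = random.choice(moves)
            x, y = x + dx, y + dy
    return cluster

cluster = dla_cluster()
```

Even this purely local rule grows the branchy, dendritic shapes characteristic of diffusion-limited growth; adding an external process such as sedimentation is what lets the thesis model reshape that morphology.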

Relevance: 30.00%

Abstract:

The paper is motivated by the valuation problem of guaranteed minimum death benefits in various equity-linked products. At the time of death, a benefit payment is due. It may depend not only on the price of a stock or stock fund at that time, but also on prior prices. The problem is to calculate the expected discounted value of the benefit payment. Because the distribution of the time of death can be approximated by a combination of exponential distributions, it suffices to solve the problem for an exponentially distributed time of death. The stock price process is assumed to be the exponential of a Brownian motion plus an independent compound Poisson process whose upward and downward jumps are modeled by combinations (or mixtures) of exponential distributions. Results for exponential stopping of a Lévy process are used to derive a series of closed-form formulas for call, put, lookback, and barrier options, dynamic fund protection, and dynamic withdrawal benefit with guarantee. We also discuss how barrier options can be used to model lapses and surrenders.
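The quantity being valued can be illustrated with a brute-force Monte Carlo check. The paper's contribution is precisely that such simulation is unnecessary (closed-form formulas follow from results on exponential stopping of a Lévy process); the sketch below only shows the expectation being computed, for a call-type death benefit, with made-up parameter values.

```python
import math
import random

def gmdb_call_mc(n=20_000, s0=100.0, strike=100.0, lam_death=0.05,
                 delta=0.03, mu=0.02, sigma=0.15, lam_jump=0.3,
                 p_up=0.5, eta_up=10.0, eta_down=8.0, seed=13):
    """Monte Carlo sketch of E[e^{-delta*T} * max(S_T - K, 0)].

    T is an exponential time of death; the log-price is a Brownian
    motion plus an independent compound Poisson process whose upward
    and downward jumps are exponentially distributed, as in the paper's
    stock price model. All parameter values are illustrative only.
    """
    random.seed(seed)
    total = 0.0
    for _ in range(n):
        t = random.expovariate(lam_death)          # exponential time of death
        x = mu * t + sigma * math.sqrt(t) * random.gauss(0, 1)
        # Poisson(lam_jump * t) jump count via inversion sampling
        k, p, u = 0, math.exp(-lam_jump * t), random.random()
        cum = p
        while u > cum:
            k += 1
            p *= lam_jump * t / k
            cum += p
        for _ in range(k):                         # exponential up/down jumps
            if random.random() < p_up:
                x += random.expovariate(eta_up)
            else:
                x -= random.expovariate(eta_down)
        s_t = s0 * math.exp(x)
        total += math.exp(-delta * t) * max(s_t - strike, 0.0)
    return total / n

value = gmdb_call_mc()
```

Because the combination-of-exponentials approximation reduces any time-of-death distribution to this exponential case, a closed-form version of this expectation is all the paper needs to price the other payoffs (put, lookback, barrier) as well.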

Relevance: 30.00%

Abstract:

Food allergies are believed to be on the rise, and management currently relies on avoidance of the food. Hen's egg allergy is, after cow's milk allergy, the most common food allergy; eggs are used in many food products and are thus difficult to avoid. A technological process combining enzymatic hydrolysis and heat treatment was designed to produce modified hen's egg with reduced allergenic potential. Biochemical (SDS-PAGE, size-exclusion chromatography and LC-MS/MS) and immunological (ELISA, immunoblot, RBL assays, animal model) analyses showed a clear decrease in intact proteins as well as a strong decrease in allergenicity. In a clinical study, 22 of the 24 patients with confirmed egg allergy who underwent a double-blind food challenge with the hydrolysed egg remained completely free of symptoms. Hydrolysed egg products may be beneficial as low-allergenic foods that allow egg-allergic patients to extend their diet.

Relevance: 20.00%

Abstract:

Current explanatory models for binge eating in binge eating disorder (BED) mostly rely on models for bulimia nervosa (BN), although research indicates different antecedents for binge eating in BED. This study investigates antecedents and maintaining factors in terms of positive mood, negative mood and tension in a sample of 22 women with BED using ecological momentary assessment over a 1-week period. Values for negative mood were higher, and those for positive mood lower, during binge days compared with non-binge days. During binge days, negative mood and tension both strongly and significantly increased, and positive mood strongly and significantly decreased, at the first binge episode, followed by a slight though significant and longer-lasting decrease (negative mood, tension) or increase (positive mood) during a 4-h observation period following binge eating. Binge eating in BED seems to be triggered by an immediate breakdown of emotion regulation. There are no indications of an accumulation of negative mood triggering binge eating followed by immediate reinforcing mechanisms in terms of a substantial and stable improvement of mood, as observed in BN. These differences call for further specification of etiological models and could serve as a basis for developing new treatment approaches for BED.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

LAY SUMMARY

The brain is composed of different cell types, including neurons and astrocytes. For lack of means to observe them, astrocytes long remained in the shadows, while neurons, for which ad hoc tools for stimulation and study were available, received all the attention. The development of cellular imaging and fluorescent tools has made it possible to observe these non-electrically excitable cells and to obtain information suggesting that they are far from passive and participate actively in brain function. This participation occurs partly through the release of neuroactive substances (called gliotransmitters) that astrocytes release near synapses, thereby modulating neuronal function. Gliotransmitter release is mainly triggered by neuronal activity, which astrocytes are able to sense. Nevertheless, little is still known about the precise properties of gliotransmitter release. Understanding the spatiotemporal properties of this release is essential for understanding the mode of communication of these cells and their involvement in the transmission of brain information. Using recently developed fluorescent tools and combining different cellular imaging techniques, we obtained very precise information on the release of gliotransmitters by astrocytes. We confirmed that this release is a very fast process and that it is controlled by local and rapid calcium elevations. We also described a complex organization of the machinery supporting gliotransmitter release. This complex organization appears to underlie the extremely fast release of gliotransmitters.
This speed of release and this structural complexity suggest that astrocytes are particularly well suited to fast communication and that, alongside the neurons whose legitimate partners they would be, they can participate in the transmission and integration of brain information.

SUMMARY

Small vesicles, the "SLMVs" or synaptic-like microvesicles, which express vesicular glutamate transporters (VGluTs) and release glutamate by regulated exocytosis, have recently been described in cultured astrocytes and in situ. Nevertheless, little is known about the precise properties of SLMV secretion. Unlike in neurons, stimulus-secretion coupling in astrocytes is not based on the opening of membrane calcium channels but requires second messengers and the release of calcium from the endoplasmic reticulum (ER). Understanding the spatiotemporal properties of astrocytic secretion is essential for understanding the mode of communication of these cells and their involvement in the transmission of brain information. We used fluorescent tools recently developed for studying the recycling of glutamatergic synaptic vesicles, such as styryl dyes and pHluorin, in order to follow SLMV secretion at the scale of the whole cell as well as at the scale of single events. The combined use of epifluorescence and evanescent-wave (total internal reflection) fluorescence imaging gave us unprecedented temporal and spatial resolution. We thus confirmed that regulated secretion in astrocytes is a very fast process (on the order of a few hundred milliseconds), and we discovered that this secretion is controlled by local and rapid calcium elevations.
We also described cytosolic compartments, delimited by the ER and located close to the plasma membrane, that contain the SLMVs. This organization appears to underlie the fast coupling between GPCR activation and secretion. The existence of independent subcellular compartments that confine intracellular messengers and limit their diffusion seems to compensate efficiently for the electrical non-excitability of astrocytes. Moreover, the existence of different vesicle pools that are recruited sequentially and fuse in distinct modes, together with mechanisms allowing the renewal of these pools during stimulation, suggests that astrocytes can cope with sustained stimulation of their secretion. These data suggest that gliotransmitter release by regulated exocytosis is not merely a property of cultured astrocytes but the result of a strong specialization of these cells for secretion. The speed of this secretion gives astrocytes every capability to intervene actively in the transmission and integration of information.

ABSTRACT

Recently, astrocytic synaptic-like microvesicles (SLMVs), which express vesicular glutamate transporters (VGluTs) and are able to release glutamate by Ca2+-dependent regulated exocytosis, have been described both in tissue and in cultured astrocytes. Nevertheless, little is known about the specific properties of regulated secretion in astrocytes. Important differences may exist between astrocytic and neuronal exocytosis, starting from the fact that stimulus-secretion coupling in astrocytes is voltage independent, mediated by G-protein-coupled receptors and the release of Ca2+ from internal stores.
Elucidating the spatiotemporal properties of astrocytic exo-endocytosis is therefore of primary importance for understanding the mode of communication of these cells and their role in brain signaling. We took advantage of fluorescent tools recently developed for studying the recycling of glutamatergic vesicles at synapses, such as styryl dyes and pHluorin, in order to follow exocytosis and endocytosis of SLMVs at the level of the entire cell as well as at the level of single events. We combined epifluorescence and total internal reflection fluorescence imaging to investigate, with unprecedented temporal and spatial resolution, the events underlying stimulus-secretion coupling in astrocytes. We confirmed that the exo-endocytosis process in astrocytes proceeds on the millisecond time scale. We discovered that SLMV exocytosis is controlled by local and fast Ca2+ elevations, and we described submicrometer cytosolic compartments, delimited by endoplasmic reticulum (ER) tubuli reaching beneath the plasma membrane, that contain SLMVs. Such complex organization seems to support the fast stimulus-secretion coupling reported here. Independent subcellular compartments formed by the ER, SLMVs and the plasma membrane, which confine intracellular messengers and limit their diffusion, seem to compensate efficiently for the non-electrical excitability of astrocytes. Moreover, the existence of two pools of SLMVs that are sequentially recruited suggests a compensatory mechanism allowing the refill of SLMVs and supporting the exocytosis process over a wide range of stimuli. These data suggest that regulated secretion is not only a feature of cultured astrocytes but results from a strong specialization of these cells. The rapidity of secretion demonstrates that astrocytes are able to participate actively in brain information transmission and processing.