983 results for Diffusion process
Abstract:
Methods like Event History Analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents the main internal drivers of policy diffusion - such as the preference for the policy, the effectiveness of the policy, the institutional constraints, and the ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed by these interdependencies, is therefore a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies developing an algorithm and programming it. Once this has been done, the different agents are allowed to interact. A phenomenon of diffusion, derived from learning, then emerges, meaning that the choice made by an agent is conditional on those made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - and triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning that not only is time needed for a policy to deploy its effects, but it also takes time for a country to find the best-suited policy. To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with both the theoretical expectations and the empirical evidence.
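The learning dynamic described in this abstract, agents adopting a policy with a probability conditional on their neighbors' choices, which produces an S-shaped adoption curve and regional policy clusters, can be illustrated with a toy simulation. The sketch below is not the thesis model or that of Braun and Gilardi (2006); the grid size, neighborhood, seeding, and adoption rule are illustrative assumptions.

```python
# Toy illustration of learning-driven policy diffusion on a grid.
# Hedged sketch: grid size, neighborhood, and adoption rule are illustrative
# assumptions, not the specification of the thesis model.
import random

random.seed(0)
N = 30                       # N x N grid of agents ("countries")
SEEDS = 5                    # early adopters of the new policy
STEPS = 80

grid = [[0] * N for _ in range(N)]
for _ in range(SEEDS):
    grid[random.randrange(N)][random.randrange(N)] = 1

def neighbor_share(g, i, j):
    """Share of von Neumann neighbors (with wrap-around) that already adopted."""
    nb = [g[(i - 1) % N][j], g[(i + 1) % N][j],
          g[i][(j - 1) % N], g[i][(j + 1) % N]]
    return sum(nb) / 4.0

adoption_curve = []
for _ in range(STEPS):
    new = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            if grid[i][j] == 0:
                # learning: the probability of adopting is conditional on
                # how many neighbors have already adopted
                if random.random() < neighbor_share(grid, i, j):
                    new[i][j] = 1
    grid = new
    adoption_curve.append(sum(map(sum, grid)) / (N * N))

# adoption typically traces an S-shaped curve; during the transient the
# adopters form contiguous regional clusters around the initial seeds
print([round(x, 2) for x in adoption_curve[::10]])
```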
Abstract:
This article studies the diffusion of the main institutional feature of regulatory capitalism, namely, independent regulatory agencies. While only a few such authorities existed in Europe in the early 1980s, by the end of the twentieth century they had spread impressively across countries and sectors. The analysis finds that three classes of factors (bottom-up, top-down, and horizontal) explain this trend. First, the establishment of independent regulatory agencies was an attempt to improve credible commitment capacity when liberalizing and privatizing utilities and to alleviate the political uncertainty problem, namely, the risk to a government that its policies will be changed when it loses power. Second, Europeanization favored the creation of independent regulators. Third, individual decisions were interdependent, as governments were influenced by the decisions of others in an emulation process where the symbolic properties of independent regulators mattered more than the functions they performed.
Abstract:
Priorities for museums are changing. The mission of the new museology is to turn museums into places for enjoyment and learning, which means they must manage their finances much like a social enterprise competing in the leisure sector. Over time, museums have had to establish and apply the criteria necessary for survival, paving the way for other public institutions to be more open in their efforts to communicate and disseminate their heritage. We can already begin to speak of some commonly accepted conclusions about visitor behaviour, which are needed to plan future exhibitions that see learning as a constructive process, collections as objects bearing meaning, and the exhibitions themselves as media that should transform the viewer's way of thinking and that serve the message itself. The internet appears to be an effective medium for achieving these goals, since it is able (a) to adapt to the interests and intellectual characteristics of a diverse public; (b) to rediscover the meanings of objects and gain sociocultural recognition of their value through its interactive potential; and (c) to use attractive and stimulating elements so that everyone can enjoy them. To this end, it is essential to ask the following questions: What criteria should a virtual museum follow to optimize the dissemination of its heritage? What elements encourage users to stay on a website and make virtual visits that they find satisfying? What role does the usability of the application play in all this?
Abstract:
The speed of front propagation in fractals is studied by using (i) the reduction of the reaction-transport equation into a Hamilton-Jacobi equation and (ii) the local-equilibrium approach. Different equations proposed for describing transport in fractal media, together with logistic reaction kinetics, are considered. Finally, we analyze the main features of wave fronts resulting from this dynamic process, i.e., why they are accelerated and what the exact form of this acceleration is.
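As a pointer to approach (i), the sketch below evaluates the marginal front speed v = min over p > 0 of H(p)/p for the ordinary Fisher case, where H(p) = D p^2 + a, and checks it against the classical result 2*sqrt(a*D). The fractal transport equations analyzed in the abstract are not reproduced; D and a are illustrative values.

```python
# Hedged sketch: front speed from the Hamilton-Jacobi (marginal) reduction for
# ordinary diffusion with logistic growth, v = min_p (D p^2 + a) / p.
# The anomalous/fractal transport operators of the article are not reproduced.
import math

D = 1.0     # diffusion coefficient (illustrative)
a = 0.5     # logistic growth rate (illustrative)

def speed(p):
    return (D * p * p + a) / p   # H(p) / p for the Fisher Hamiltonian

# crude numerical minimization over p > 0
ps = [0.001 * k for k in range(1, 5000)]
v_numeric = min(speed(p) for p in ps)
v_fisher = 2.0 * math.sqrt(a * D)

print(f"numerical minimum        : {v_numeric:.4f}")
print(f"Fisher speed 2*sqrt(a*D) : {v_fisher:.4f}")
```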
Abstract:
A time-delayed second-order approximation for the front speed in reaction-dispersion systems was obtained by Fort and Méndez [Phys. Rev. Lett. 82, 867 (1999)]. Here we show that taking proper care of the effect of the time delay on the reactive process yields a different evolution equation and, therefore, an alternate equation for the front speed. We apply the new equation to the Neolithic transition. For this application the new equation yields speeds about 10% slower than the previous one.
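For orientation, the sketch below compares the classical Fisher front speed 2*sqrt(a*D) with the time-delayed speed 2*sqrt(a*D)/(1 + a*T/2) associated with the Fort and Méndez (1999) approach; the alternate equation derived in this article is not reproduced here, and the parameter values are illustrative rather than taken from the paper.

```python
# Hedged sketch: classical Fisher front speed vs. a time-delayed front speed of
# the form v = 2*sqrt(a*D) / (1 + a*T/2). The corrected equation of this article
# is not reproduced; parameter values are illustrative only.
import math

a = 0.03     # population growth rate, 1/yr (illustrative)
D = 15.0     # mobility/diffusion coefficient, km^2/yr (illustrative)
T = 25.0     # generation time, yr (illustrative)

v_fisher = 2.0 * math.sqrt(a * D)
v_delayed = v_fisher / (1.0 + a * T / 2.0)

print(f"Fisher speed       : {v_fisher:.3f} km/yr")
print(f"time-delayed speed : {v_delayed:.3f} km/yr")
print(f"relative slowdown  : {100 * (1 - v_delayed / v_fisher):.1f} %")
```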
Abstract:
Objective: To analyze innovative content on Early Child Development Promotion (ECDP). Method: This action research involves nine faculties from four Higher Education Institutions in inner São Paulo state, Brazil. Data were collected through syllabus analyses (2009-2011), interviews, and a focus group. We adopted an ECDP framework grounded in international consensus and thereby evaluated knowledge translation (KT). Results: We found relevant integration of teaching and extension in the Nursing (87.5%) and Psychology (75%) undergraduate courses, whereas Pedagogy was restricted to teaching. Conclusion: This KT evaluation has evinced the innovative potential of extension, independently of teaching and research, for a better early childhood.
Abstract:
I study the relation between the delay in the transmission of spillovers of information and diffusion. When a firm enters or innovates it benefits from the information it gets by observing past entry. Delays in the process of receiving the information reduce the benefits of the spillover and affect the entry process. I derive the effects this delay has on diffusion, on the dynamics of price and cost of entry, and on efficiency. I explain why, when spillovers of information are delayed, a zero profit condition requires an initial set of entrants bigger than zero. I also illustrate how an S-shaped diffusion curve can be generated. I show that competitive equilibrium entails a slower generation of information relative to the social optimum and how a social planner can improve efficiency.
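A purely illustrative toy process can show how a lag in observing past entry shapes the diffusion of entry into an S-shaped curve. This is not the equilibrium model of the article (there is no zero-profit condition here); the number of potential entrants, the lag d, and the spillover strength beta are assumptions.

```python
# Purely illustrative sketch: diffusion of entry when the informational spillover
# from past entrants arrives with a lag. This is a toy contagion process, not the
# competitive equilibrium of the article; M, d, and beta are assumptions.
import random

random.seed(1)
M = 1000        # potential entrants
d = 3           # delay (periods) before past entry is observed
beta = 0.0005   # strength of the informational spillover
T = 60

entered = [5]   # a strictly positive initial set of entrants is needed to start
for t in range(1, T):
    observed = entered[max(0, t - d - 1)]   # only lagged entry is observable
    remaining = M - entered[-1]
    p = min(1.0, beta * observed)           # entry hazard rises with observed entry
    new = sum(1 for _ in range(remaining) if random.random() < p)
    entered.append(entered[-1] + new)

# cumulative entry typically traces an S-shaped diffusion curve; a larger delay d
# shifts and flattens the early part of the curve
print(entered[::5])
```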
Abstract:
In this paper we use Malliavin calculus techniques to obtain an expression for the short-time behavior of the at-the-money implied volatility skew for a generalization of the Bates model, where the volatility need be neither a diffusion nor a Markov process, as the examples in Section 7 show. This expression depends on the derivative of the volatility in the sense of Malliavin calculus.
Abstract:
This article builds on the recent policy diffusion literature and attempts to overcome one of its major problems, namely the lack of a coherent theoretical framework. The literature defines policy diffusion as a process where policy choices are interdependent, and identifies several diffusion mechanisms that specify the link between the policy choices of the various actors. As these mechanisms are grounded in different theories, theoretical accounts of diffusion currently have little internal coherence. In this article we put forward an expected-utility model of policy change that is able to subsume all the diffusion mechanisms. We argue that the expected utility of a policy depends on both its effectiveness and the payoffs it yields, and we show that the various diffusion mechanisms operate by altering these two parameters. Each mechanism affects one of the two parameters, and does so in distinct ways. To account for aggregate patterns of diffusion, we embed our model in a simple threshold model of diffusion. Given the high complexity of the process that results, strong analytical conclusions on aggregate patterns cannot be drawn without more extensive analysis which is beyond the scope of this article. However, preliminary considerations indicate that a wide range of diffusion processes may exist and that convergence is only one possible outcome.
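The core of the framework, an expected utility combining the effectiveness of a policy and the payoff it yields, with the diffusion mechanisms entering as shifts of those two parameters and a threshold rule governing adoption, can be sketched as follows. The functional forms and numbers are illustrative assumptions, not the article's specification.

```python
# Hedged sketch of the expected-utility framing: mechanisms act on two parameters,
# the believed effectiveness of a policy and the payoff it yields. Functional
# forms and numbers are illustrative, not the article's model.

def expected_utility(effectiveness, payoff):
    # simplest combination: utility of adopting = effectiveness * payoff
    return effectiveness * payoff

def learning(effectiveness, observed_success, weight=0.3):
    # learning: outcomes observed elsewhere update the belief about effectiveness
    return (1 - weight) * effectiveness + weight * observed_success

def competition(payoff, adopters_share, pressure=0.5):
    # competition/coercion-style mechanisms: others' adoptions change the payoff
    return payoff + pressure * adopters_share

def adopts(effectiveness, payoff, status_quo_utility, threshold=0.0):
    # threshold rule: switch only if the utility gain exceeds the threshold
    return expected_utility(effectiveness, payoff) - status_quo_utility > threshold

# one illustrative round for a single government
eff, pay, status_quo = 0.4, 1.0, 0.55
eff = learning(eff, observed_success=0.9)      # neighbors report good results
pay = competition(pay, adopters_share=0.6)     # many competitors already adopted
print(adopts(eff, pay, status_quo))            # -> True in this toy configuration
```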
Abstract:
Microstructural features of La2/3Ca1/3MnO3 layers of various thicknesses grown on top of (001) LaAlO3 substrates are studied using transmission electron microscopy and electron energy loss spectroscopy. The films are of high microstructural quality but exhibit some structural relaxation and mosaicity, both with increasing thickness and after annealing. A cationic segregation process of La atoms toward the free surface has been detected, as well as a variation of the Mn oxidation state through the layer thickness. La diffusion would lead to a change in Mn valence and, in turn, to reduced magnetization.
Abstract:
The origin of the microscopic inhomogeneities in InxGa1-xAs layers grown on GaAs by molecular beam epitaxy is analyzed through the optical absorption spectra near the band gap. For relaxed thick layers of about 2.8 μm, composition inhomogeneities are responsible for the band-edge smoothing over the whole compositional range (0.05 < x < 0.8). On the other hand, in sufficiently thin layers strain inhomogeneities are dominant. This evolution with layer thickness is due to atomic diffusion at the surface during growth, induced by the strain inhomogeneities that arise from stress relaxation. As a consequence, the strain variations present in the layer are converted into composition variations during growth. This process is energetically favorable as it diminishes the elastic energy. Additional support for this hypothesis is given by a clear proportionality between the magnitude of the composition variations and the mean strain.
Abstract:
Preface. The starting point for this work, and eventually the subject of the whole thesis, was the question: how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, made them the models of choice for many theoretical constructions and practical applications. At the same time, estimation of the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem comes from the variance process, which is not observable. There are several estimation methodologies that deal with the estimation of latent variables. One appeared particularly interesting: it proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process, namely the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function. However, the procedure had been derived only for stochastic volatility models without jumps. Thus, it became the subject of my research.

This thesis consists of three parts. Each one is written as an independent and self-contained article. At the same time, the questions answered by the second and third parts of this work arise naturally from the issues investigated and the results obtained in the first one.

The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function of the stochastic volatility jump-diffusion models. The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equation are relevant for modelling returns of the S&P 500 index, which has been chosen as a general representative of the stock asset class.

Hence, the next question is: what jump process to use to model returns of the S&P 500. The decision about the jump process in the framework of affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for the asset log-prices: normal, exponential, and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P 500 index with a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either the exponential or the double exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data.

The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained: in the absence of a benchmark or any ground for comparison, it is unreasonable to be sure that our parameter estimates and the true parameters of the models coincide. The conclusion of the second chapter provides one more reason to run that kind of test. Thus, the third part of this thesis concentrates on the estimation of the parameters of stochastic volatility jump-diffusion models on the basis of asset price time series simulated from various "true" parameter sets. The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of recovering the true parameters, and the third chapter proves that our estimator indeed has this ability.

Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question readily appears: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used for its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure. In practice, however, this relationship is not so straightforward, owing to increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, unconditional characteristic function. The result is that the preference for one or the other depends on the model to be estimated; thus, the computational effort can be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of the estimators with bi- and three-dimensional unconditional characteristic functions on the simulated data. It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, due to the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for the estimation of the parameters of stochastic volatility jump-diffusion models.
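The general idea behind an empirical characteristic function estimator can be sketched as follows: compute the empirical unconditional characteristic function from the return series and minimize a weighted squared distance to a model characteristic function given in closed form. The Gaussian model CF, the weighting function, the grids, and the simulated returns below are placeholders; the closed-form joint characteristic function of the stochastic volatility jump-diffusion models derived in the thesis is not reproduced.

```python
# Hedged sketch of a continuous empirical characteristic function (ECF) estimator:
# match the empirical unconditional CF of the returns to a model CF over a grid of
# arguments in a weighted least-squares sense. The Gaussian model CF is a
# placeholder for the closed-form CF of the thesis model.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(loc=0.0005, scale=0.012, size=5000)   # stand-in return series

u = np.linspace(-200.0, 200.0, 401)                        # CF arguments
weight = np.exp(-0.0005 * u**2)                            # downweight large |u|

ecf = np.exp(1j * np.outer(u, returns)).mean(axis=1)       # empirical CF

def model_cf(u, mu, sigma):
    # placeholder model: Gaussian characteristic function
    return np.exp(1j * mu * u - 0.5 * (sigma * u) ** 2)

def objective(mu, sigma):
    diff = ecf - model_cf(u, mu, sigma)
    return np.sum(weight * np.abs(diff) ** 2)

# crude grid search instead of a proper optimizer, to keep the sketch short
mus = np.linspace(-0.002, 0.002, 41)
sigmas = np.linspace(0.005, 0.02, 61)
best = min((objective(m, s), m, s) for m in mus for s in sigmas)
print(f"estimated mu = {best[1]:.5f}, sigma = {best[2]:.5f}")
```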
Abstract:
Evidence-based practice (EBP) aims for a new distribution of power centered on scientific evidence rather than clinical expertise. The present article describes the operational process of EBP by describing the implementation stages of this type of practice. This presentation of stages is essential given that there are many conceptions and models of EBP and that some nurses have limited knowledge of its rules and implications. Given that the number and formulation of the stages vary by author, the process presented here attempts to integrate the different stages reviewed.
Abstract:
The effect of hydrodynamic flow upon diffusion-limited deposition on a line is investigated using a Monte Carlo model. The growth process is governed by the convection and diffusion field. The convective diffusion field is simulated by a biased random walker, the superimposed drift representing the convective flow. Distinct morphologies develop with varying direction and strength of the drift. By introducing a horizontal drift parallel to the deposition plate, the diffusion-limited deposit changes into a single needle inclined to the plate. The width of the needle decreases with increasing strength of the drift. The angle between the needle and the plate is about 45° at high flow rates. In the presence of a drift inclined to the plate, the convection-diffusion-limited deposit leads to the formation of a characteristic columnar morphology. In the limiting case where convection dominates, the deposition process is equivalent to ballistic deposition onto an inclined surface.
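A minimal Monte Carlo sketch of the convection-diffusion-limited deposition idea: biased random walkers (the bias standing in for the convective drift) are released above a line and stick to the deposit on contact. The lattice size, drift strength, and walker count are illustrative assumptions, not the parameters of the study.

```python
# Minimal Monte Carlo sketch of convection-diffusion-limited deposition on a line:
# biased random walkers stick to the deposit when they touch it. Lattice size,
# drift strength, and walker count are illustrative assumptions.
import random

random.seed(2)
W, H = 60, 200                             # lattice width and height
BIAS = 0.10                                # horizontal drift: probability shifted from -x to +x
occupied = {(x, 0) for x in range(W)}      # the deposition line y = 0

def step(x, y):
    """One biased random-walk step, periodic in x and capped at the top."""
    r = random.random()
    if r < 0.25 + BIAS:
        x += 1
    elif r < 0.50:
        x -= 1
    elif r < 0.75:
        y -= 1
    else:
        y += 1
    return x % W, min(y, H - 1)

for _ in range(1500):                      # release walkers one at a time
    top = max(y for (_, y) in occupied)
    x, y = random.randrange(W), min(top + 5, H - 1)   # launch just above the deposit
    while True:
        x, y = step(x, y)
        # stick as soon as a lattice neighbor belongs to the deposit
        if any(((x + dx) % W, y + dy) in occupied
               for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))):
            occupied.add((x, y))
            break

profile = [max(y for (x, y) in occupied if x == col) for col in range(W)]
print(profile)   # deposit height per column; drift direction and strength shape the morphology
```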
Abstract:
Diffusion MRI has evolved into an important clinical diagnostic and research tool. Although clinical routine mainly uses diffusion-weighted and tensor imaging approaches, Q-ball imaging and diffusion spectrum imaging techniques have become more widely available. They are frequently used in research-oriented investigations, in particular those aiming at measuring brain network connectivity. In this work, we assess the dependency of connectivity measurements on various diffusion encoding schemes in combination with appropriate data modeling. We process and compare the structural connection matrices computed from several diffusion encoding schemes, including diffusion tensor imaging, q-ball imaging and high angular resolution schemes such as diffusion spectrum imaging, using a publicly available processing pipeline for the reconstruction, tracking and visualization of diffusion MR imaging data. The results indicate that the high angular resolution schemes maximize the number of obtained connections when identical processing strategies are applied to the different diffusion schemes. Compared to conventional diffusion tensor imaging, the added connectivity is mainly found for pathways in the 50-100 mm range, corresponding to neighboring association fibers and to long-range associative, striatal and commissural fiber pathways. The analysis of the major associative fiber tracts of the brain reveals striking differences between the applied diffusion schemes. More complex data modeling techniques (beyond the tensor model) are recommended (1) if the tracts of interest run through large fiber crossings such as the centrum semiovale, or (2) if non-dominant fiber populations, e.g. the neighboring association fibers, are the subject of investigation. An important finding of the study is that, since the ground-truth sensitivity and specificity are not known, results arising from different strategies in data reconstruction and/or tracking are difficult to compare.
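As a generic illustration of how a structural connection matrix is assembled from tractography output, the sketch below counts streamlines per region pair from endpoint labels and, optionally, restricts the count to a length bin such as 50-100 mm. It is not the processing pipeline used in the study, and the streamline data are made up.

```python
# Generic, hedged illustration of assembling a structural connection matrix from
# tractography streamlines: each streamline increments the (region_a, region_b)
# cell given by its endpoint labels, optionally restricted to a length bin.
# Not the pipeline of the study; the data below are made up.
import numpy as np

rng = np.random.default_rng(0)
n_regions = 6

# fake streamlines: (start region, end region, length in mm)
streamlines = [(rng.integers(n_regions), rng.integers(n_regions), rng.uniform(10, 160))
               for _ in range(500)]

def connection_matrix(streamlines, n_regions, length_range=None):
    mat = np.zeros((n_regions, n_regions), dtype=int)
    for a, b, length in streamlines:
        if length_range and not (length_range[0] <= length <= length_range[1]):
            continue
        mat[a, b] += 1
        mat[b, a] += 1              # undirected connectivity
    return mat

full = connection_matrix(streamlines, n_regions)
mid_range = connection_matrix(streamlines, n_regions, length_range=(50, 100))
print("total connections        :", full.sum() // 2)
print("connections in 50-100 mm :", mid_range.sum() // 2)
```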