236 results for STOCHASTIC DYNAMICS
Abstract:
In this paper, we analyze the asymptotic behavior of solutions of the continuous kinetic version of the flocking model of Cucker and Smale [16], which describes the collective behavior of an ensemble of organisms, animals or devices. This kinetic version, introduced in [24], is obtained here starting from a Boltzmann-type equation. The large-time behavior of the distribution in phase space is subsequently studied by means of particle approximations and a stability property in distances between measures. A continuous analogue of the theorems of [16] is shown to hold for the solutions of the kinetic model. More precisely, the velocities of the solutions concentrate exponentially fast around their mean, while in space the solutions converge towards a translational flocking solution.
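The microscopic particle system behind this kinetic model can be sketched in a few lines. The following minimal example (an illustration written for this summary under standard assumptions, not code from the paper) integrates the Cucker-Smale particle dynamics with explicit Euler steps and exhibits the exponential concentration of velocities around their conserved mean:

```python
import numpy as np

def cucker_smale_step(x, v, dt=0.01, K=1.0, beta=0.25):
    """One explicit Euler step of the 1-D Cucker-Smale particle system.

    Communication weight psi(r) = K / (1 + r^2)**beta; for beta <= 1/2
    unconditional flocking (velocity concentration) is known to hold.
    """
    n = len(x)
    diff_x = x[:, None] - x[None, :]          # pairwise position differences
    psi = K / (1.0 + diff_x ** 2) ** beta     # symmetric communication weights
    diff_v = v[None, :] - v[:, None]          # entry (i, j) holds v_j - v_i
    dv = (psi * diff_v).sum(axis=1) / n       # alignment force on particle i
    return x + dt * v, v + dt * dv

rng = np.random.default_rng(0)
x = rng.normal(size=20)                       # random initial positions
v = rng.normal(size=20)                       # random initial velocities
mean_v = v.mean()                             # mean velocity is conserved
for _ in range(2000):                         # integrate up to time T = 20
    x, v = cucker_smale_step(x, v)
spread = v.max() - v.min()                    # shrinks exponentially in time
```

Because the weights are symmetric, the mean velocity is an invariant of the dynamics, so the whole ensemble ends up translating at the initial average speed, matching the "translational flocking solution" described above.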
Abstract:
The paper seeks to shed light on the inflation dynamics of four new EU member states: the Czech Republic, Hungary, Poland and Slovakia. To this end, the New Keynesian Phillips curve augmented for open economies is estimated and additional statistical tests are applied. We find the following. (1) The New Keynesian claim that real marginal cost is the main inflation-forcing variable is fragile. (2) Inflation seems to be driven by external factors. (3) Although inflation has a forward-looking component, the backward-looking component is substantial. An intuitive explanation for the higher inflation persistence may be adaptive rather than rational price setting by local firms.
Abstract:
To describe the collective behavior of large ensembles of neurons in a neuronal network, a kinetic theory description was developed in [13, 12], where a macroscopic representation of the network dynamics was derived directly from the microscopic dynamics of individual neurons, modeled as conductance-based, linear, integrate-and-fire point neurons. A diffusion approximation then led to a nonlinear Fokker-Planck equation for the probability density function of neuronal membrane potentials and synaptic conductances. In this work, we propose a deterministic numerical scheme for a Fokker-Planck model of an excitatory-only network. Our numerical solver allows us to obtain the time evolution of the probability density function, and thus the evolution of all macroscopic quantities given by suitable moments of that density. We show that this deterministic scheme captures the bistability of stationary states observed in Monte Carlo simulations. Moreover, the transient behavior of the firing rates computed from the Fokker-Planck equation is analyzed in this bistable situation, where increasing the strength of the excitatory coupling uncovers a bifurcation scenario: asynchronous convergence towards stationary states, periodic synchronous solutions, or damped oscillatory convergence towards stationary states. Finally, the computation of moments of the probability density allows us to validate the applicability of a moment closure assumption used in [13] to further simplify the kinetic theory.
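To give a flavor of what such a deterministic Fokker-Planck solver looks like, here is a minimal finite-volume scheme for a linear Fokker-Planck equation of Ornstein-Uhlenbeck type — a simplified stand-in chosen for this summary, not the excitatory-network model of the paper. The flux form conserves mass exactly (zero-flux boundaries), and the computed density relaxes to the known Gaussian stationary state, so both properties can be checked:

```python
import numpy as np

# dp/dt = d/dv[(v - mu) p] + D d^2p/dv^2  on [-5, 5], zero-flux boundaries.
mu, D = 0.0, 0.5
v = np.linspace(-5.0, 5.0, 201)
h = v[1] - v[0]
p = np.exp(-(v - 2.0) ** 2)              # off-centre initial density
p /= p.sum() * h                         # normalize to unit mass

dt = 0.2 * h ** 2 / D                    # explicit-scheme stability bound
vm = 0.5 * (v[:-1] + v[1:])              # cell-interface midpoints
for _ in range(20000):                   # integrate up to time T = 20
    flux = np.zeros(v.size + 1)          # probability flux at interfaces
    # J = -(v - mu) p - D dp/dv, discretized with central differences
    flux[1:-1] = -(vm - mu) * 0.5 * (p[:-1] + p[1:]) - D * (p[1:] - p[:-1]) / h
    p = p - dt * (flux[1:] - flux[:-1]) / h

mass = p.sum() * h                       # conserved by the flux form
p_inf = np.exp(-(v - mu) ** 2 / (2 * D)) # Gaussian stationary state
p_inf /= p_inf.sum() * h
err = np.abs(p - p_inf).max()            # distance to the stationary density
```

The same conservative structure carries over to nonlinear models: there the drift and diffusion coefficients depend on moments of p, so they are recomputed at each time step before evaluating the fluxes.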
Abstract:
The paper focuses on the argumentative process through which new international norms prohibiting the use of weapons causing severe civilian harm emerge. It examines the debate surrounding the use and usefulness of landmines and cluster munitions and traces the process through which NGOs change conceptions of the military utility and effectiveness of certain weapons by highlighting their humanitarian problems and questioning their military value. By challenging military thinking on these issues, NGOs redefine the terms of the debate: from a commonplace practice, the use of such weapons becomes controversial, and military decisions need to be justified. The argument-counterargument dynamic shifts the burden of proof of the necessity and safety of the weapons to the users. The process demonstrates the ability of NGOs to influence debates on military issues despite their disadvantaged position in hard security issue areas. It also challenges the realist assumption that only weapons that are obsolete or low-cost force equalizers for weak actors can be banned. On the contrary, the paper shows that in the case of landmines and cluster munitions, defining the military (in)effectiveness of the weapons is part and parcel of the struggle for their prohibition.
Gaussian estimates for the density of the non-linear stochastic heat equation in any space dimension
Abstract:
In this paper, we establish lower and upper Gaussian bounds for the probability density of the mild solution to the stochastic heat equation with multiplicative noise and in any space dimension. The driving perturbation is a Gaussian noise which is white in time with some spatially homogeneous covariance. These estimates are obtained using tools of the Malliavin calculus. The most challenging part is the lower bound, which is obtained by adapting a general method developed by Kohatsu-Higa to the underlying spatially homogeneous Gaussian setting. Both lower and upper estimates have the same form: a Gaussian density with a variance which is equal to that of the mild solution of the corresponding linear equation with additive noise.
Abstract:
We analyze the transitional dynamics of a model with heterogeneous consumption goods. In this model, convergence is driven by two different forces: the typical diminishing returns to capital and the sectoral change inducing the variation in relative prices. We show that this second force affects the growth rate if the two consumption goods are not Edgeworth independent and if these two goods are produced with technologies exhibiting different capital intensities. Because the aforementioned dynamic sectoral change arises only under heterogeneous consumption goods, the transitional dynamics of this model exhibit striking differences from those of the growth model with a single consumption good. We also show that these differences in the transitional dynamics can give rise to large discrepancies in the welfare cost of shocks between the economy with a unique consumption good and the economy with multiple consumption goods.
Abstract:
In this paper, scales of classes of stochastic processes are introduced. New interpolation theorems and the boundedness of some transforms of stochastic processes are proved. An interpolation method for generously-monotonous processes is introduced. The conditions and statements of the interpolation theorems concern the fixed stochastic process, which differs from the classical results.
Abstract:
The project will consist of installing Microsoft CRM 3.0 on a server running Windows Server 2003, configuring the application (setting permissions, entering the company's information, etc.), parameterizing it (modifications made within the CRM itself to adapt it to the business, such as creating fields, tables, relationships, views, etc.) and developing new functionality (callout). In addition, Microsoft CRM will be integrated with Microsoft Office Outlook.
Abstract:
First: A continuous-time version of Kyle's model (Kyle 1985) of asset pricing with asymmetric information, known as Back's model (Back 1992), is studied. A larger class of price processes and of noise traders' processes is considered. The price process, as in Kyle's model, is allowed to depend on the path of the market order. The noise traders' process is an inhomogeneous Lévy process. Solutions are found via the Hamilton-Jacobi-Bellman equations. When the insider is risk-neutral, the price pressure is constant, and there is no equilibrium in the presence of jumps. If the insider is risk-averse, there is no equilibrium in the presence of either jumps or drifts. The case in which the release time is unknown is also analyzed. A general relation is established between the problem of finding an equilibrium and the enlargement of filtrations. A random announcement time is also considered; in this case the market is not fully efficient, and an equilibrium exists if the sensitivity of prices with respect to the global demand decreases in time according to the distribution of the random time. Second: Power variations. The asymptotic behavior of the power variation of processes of the form ∫_0^t u(s-) dS_s is considered, where S is an α-stable process with stability index 0 < α < 2 and the integral is an Itô integral. Stable convergence of the corresponding fluctuations is established. These results provide statistical tools to infer the process u from discrete observations. Third: A bond market is studied in which the short rates r(t) evolve as an integral of g(t-s)σ(s) with respect to a Wiener measure W(ds), where g and σ are deterministic. Processes of this type are particular cases of ambit processes and are in general not semimartingales.
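The power-variation limit theorems in the second part lend themselves to a short numerical check. The sketch below is hypothetical code written for this summary: it takes the simplest case u ≡ 1 and a standard symmetric Cauchy process (α = 1), and verifies the law-of-large-numbers limit of the normalized realized power variation against the closed-form moment E|S_1|^p = 1/cos(πp/2), valid for p < α:

```python
import numpy as np

# Realized power variation of a standard symmetric Cauchy Levy process
# (alpha = 1) on [0, 1] over a mesh of size 1/n. For p < alpha,
#   n**(p/alpha - 1) * sum_i |X_(i/n) - X_((i-1)/n)|**p  ->  E|S_1|**p
# in probability, where E|S_1|**p = 1 / cos(pi*p/2) for the standard Cauchy law.
rng = np.random.default_rng(1)
n, p, alpha = 200_000, 0.4, 1.0
increments = rng.standard_cauchy(n) / n     # alpha = 1: increments scale as 1/n
pv = n ** (p / alpha - 1) * np.sum(np.abs(increments) ** p)
m_p = 1.0 / np.cos(np.pi * p / 2)           # theoretical limit
```

In the statistical application described in the abstract, the same statistic computed blockwise from discrete observations is what allows the integrand u to be inferred.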
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly grown to account for up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in the so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as a well-defined scientific predictor if the signal they generate passes the test of being a Markov time; that is, we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, whether the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of the technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented from scratch in MATLAB as a part of this thesis. No other mathematical or statistical software was used.
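As a concrete illustration of the Ornstein-Uhlenbeck machinery behind pairs trading, the following sketch (hypothetical parameters and variable names, not the thesis's MATLAB code) simulates a mean-reverting spread dX = κ(μ − X)dt + σdW, recovers κ and μ from the exact AR(1) regression X_{t+dt} = a + bX_t + ε with b = exp(−κ dt), and builds a simple z-score entry signal:

```python
import numpy as np

# Euler-Maruyama simulation of an OU spread, then AR(1) calibration.
rng = np.random.default_rng(2)
kappa, mu, sigma, dt, n = 5.0, 0.0, 0.3, 1 / 252, 50_000
x = np.empty(n)
x[0] = 0.5
for t in range(n - 1):
    x[t + 1] = (x[t] + kappa * (mu - x[t]) * dt
                + sigma * np.sqrt(dt) * rng.standard_normal())

# Least-squares AR(1) fit: slope b = exp(-kappa * dt), intercept a.
b, a = np.polyfit(x[:-1], x[1:], 1)
kappa_hat = -np.log(b) / dt              # mean-reversion speed estimate
mu_hat = a / (1 - b)                     # long-run mean estimate

# z-score signal: short the spread when z > 2, long when z < -2, else flat.
z = (x - mu_hat) / x.std()
signal = np.where(z > 2, -1, np.where(z < -2, 1, 0))
```

In a live strategy the calibration window would be rolled forward, which is exactly the high-frequency recalibration issue discussed above.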
Abstract:
There are many factors that influence the day-ahead market bidding strategies of a generation company (GenCo) in the current energy market framework. Environmental policy issues have become increasingly important for fossil-fuelled power plants and have to be considered in their management, giving rise to emission limitations. This work investigates the influence on the optimal generation bidding strategy to the day-ahead electricity market of both the allowances and emission reduction plans and the incorporation of medium-term derivatives commitments. Two different technologies have been considered: coal thermal units, a high-emission technology, and combined-cycle gas turbine units, a low-emission technology. The Iberian Electricity Market and the Spanish National Emissions and Allocation Plans provide the framework within which the environmental issues in the day-ahead market bidding strategies are addressed. To handle emission limitations, some of the standard risk management methodologies developed for financial markets, such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR), have been extended. This study offers electricity generation utilities a mathematical model to determine, for each of their generation units, the individual optimal generation bid to the wholesale electricity market that maximizes the long-run profits of the utility while abiding by the Iberian Electricity Market rules, the environmental restrictions set by the EU Emission Trading Scheme, and the restrictions set by the Spanish National Emissions Reduction Plan. The economic implications for a GenCo of including the environmental restrictions of these National Plans are analyzed and the most remarkable results are presented.
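For reference, the two risk measures named above can be computed empirically in a few lines. The sketch below uses a generic simulated profit distribution; it is an illustrative assumption for this summary, not the extended emission-constrained formulation of the thesis:

```python
import numpy as np

def var_cvar(profits, beta=0.95):
    """Empirical VaR and CVaR of a profit sample at confidence level beta.

    VaR_beta is the loss exceeded with probability 1 - beta; CVaR_beta is
    the expected loss conditional on exceeding VaR_beta (so CVaR >= VaR).
    """
    losses = -np.asarray(profits, dtype=float)   # losses are negated profits
    var = np.quantile(losses, beta)              # beta-quantile of losses
    cvar = losses[losses >= var].mean()          # mean of the worst tail
    return var, cvar

rng = np.random.default_rng(3)
profits = rng.normal(loc=100.0, scale=20.0, size=100_000)
var, cvar = var_cvar(profits)
# For this profitable book both values come out negative (tail losses are
# still profits); CVaR is nonetheless the more conservative of the two.
```

CVaR is the measure usually preferred in bid-optimization models because, unlike VaR, it is coherent and can be represented with linear constraints in a stochastic program.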
Abstract:
This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium, and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
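The statistical learning algorithm in models of this kind is typically recursive least squares with decreasing gain. The toy example below — an illustration under that assumption, not the paper's two-sided model — shows a single agent revising a scalar belief θ about a perceived law of motion y_t = θ·x_t + ε_t as new observations arrive:

```python
import numpy as np

# Recursive least squares (RLS) with decreasing gain: each period the agent
# updates a second-moment estimate R and revises the belief theta toward
# the newest forecast error, with gain shrinking like 1/t.
rng = np.random.default_rng(4)
true_theta = 0.7                        # unknown coefficient to be learned
theta, R = 0.0, 1.0                     # initial belief and moment estimate
for t in range(1, 20_001):
    x = rng.standard_normal()           # observed regressor
    y = true_theta * x + 0.1 * rng.standard_normal()   # observed outcome
    g = 1.0 / (t + 10)                  # decreasing gain; +10 is a prior weight
    R += g * (x * x - R)                # update estimate of E[x^2]
    theta += g * (x / R) * (y - theta * x)   # belief revision step
```

With two-sided learning, each side runs its own recursion of this form on its own data set, and the feedback between the two belief updates is what can prevent convergence to the rational expectations equilibrium.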
Abstract:
The objective of the present research is to find control design strategies that are effective and close to real operating conditions. As a novel contribution to structural control strategies, the theories of Interval Modal Arithmetic, Backstepping Control and QFT (Quantitative Feedback Theory) will be studied. The steps to follow are first to develop new controllers based on the above theories and then to implement the proposed control strategies on different kinds of structures. The report is organized as follows. Chapter 2 presents the state of the art in structural control systems. Chapter 3 presents the most important open problems found in the field of structural control. The exploratory work made by the author, the research proposal and the working plan are given in Chapter 4.
Abstract:
We developed a procedure that combines three complementary computational methodologies to improve the theoretical description of the electronic structure of nickel oxide. The starting point is a Car-Parrinello molecular dynamics simulation to incorporate vibrorotational degrees of freedom into the material model. By means of complete active space self-consistent field second-order perturbation theory (CASPT2) calculations on embedded clusters extracted from the resulting trajectory, we describe localized spectroscopic phenomena on NiO with an efficient treatment of electron correlation. The inclusion of thermal motion into the theoretical description allows us to study electronic transitions that, otherwise, would be dipole forbidden in the ideal structure and results in a natural reproduction of the band broadening. Moreover, we improved the embedded cluster model by incorporating self-consistently at the complete active space self-consistent field (CASSCF) level a discrete (or direct) reaction field (DRF) in the cluster surroundings. The DRF approach offers an efficient treatment of electric response effects of the crystalline embedding to the electronic transitions localized in the cluster. We offer accurate theoretical estimates of the absorption spectrum and the density of states around the Fermi level of NiO, and a comprehensive explanation of the source of the broadening and the relaxation of the charge transfer states due to the adaptation of the environment.