982 results for Long memory
Abstract:
One of the central topics in the study of capital markets is the debate over market efficiency theory, which is at odds with the observed price behavior of most assets. This work analyzes the behavior of the main bitcoin price index (BPI) from July 2010 to September 2014. First, the random walk hypothesis is tested for the BPI. Next, long-range correlations in the financial time series are examined using the Hurst exponent (H), which was originally used to measure correlations in natural phenomena and was later extended to finance. The study estimates H by distinct methods, notably R/S analysis and DFA. To track the exponent over time, a 90-day moving window is used, shifted in steps of 10 days. For the analysis at different scales, the value of H is computed, for each day, over the previous 360, 180, and 90 days, respectively. The results show that the BPI exhibits persistent long memory over practically the entire period analyzed. Moreover, the analysis at different scales indicates the possibility of anticipating turbulent events in the index over the same period. Finally, the fractal market hypothesis could be confirmed for the historical series of BPI returns.
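As a hedged illustration of the procedure described in this abstract (an independent sketch, not the authors' code), the following Python fragment estimates the Hurst exponent by classical R/S analysis and applies it over a 90-day moving window shifted in steps of 10 days; the `returns` array is a hypothetical stand-in for the BPI log-returns.

```python
import numpy as np

def hurst_rs(x):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent H."""
    n = len(x)
    log_s, log_rs = [], []
    for s in (10, 15, 18, 30, 45, 90):
        if s > n:
            continue
        rs = []
        for start in range(0, n - s + 1, s):
            w = x[start:start + s]
            z = np.cumsum(w - w.mean())            # cumulative deviations from mean
            sd = w.std(ddof=1)
            if sd > 0:
                rs.append((z.max() - z.min()) / sd)   # rescaled range R/S
        log_s.append(np.log(s))
        log_rs.append(np.log(np.mean(rs)))
    # H is the slope of log(R/S) against log(s); H > 0.5 indicates persistence
    return np.polyfit(log_s, log_rs, 1)[0]

rng = np.random.default_rng(0)
returns = rng.standard_normal(1500)                # hypothetical BPI return series
H_rolling = [hurst_rs(returns[i:i + 90])
             for i in range(0, len(returns) - 90 + 1, 10)]
```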
Abstract:
This work provides a forward step in the study and comprehension of the relationships between stochastic processes and a certain class of integro-partial differential equations, which can be used to model anomalous diffusion and transport in statistical physics. In the first part, we brought the reader through the fundamental notions of probability and stochastic processes, as well as stochastic integration and stochastic differential equations. In particular, within the study of H-sssi processes, we focused on fractional Brownian motion (fBm) and its discrete-time increment process, the fractional Gaussian noise (fGn), which provide examples of non-Markovian Gaussian processes. The fGn, together with stationary FARIMA processes, is widely used in the modeling and estimation of long memory, or long-range dependence (LRD). Time series manifesting long-range dependence are often observed in nature, especially in physics, meteorology and climatology, but also in hydrology, geophysics, economics and many other fields. We studied LRD in depth, giving many real-data examples, providing statistical analysis and introducing parametric methods of estimation. Then, we introduced the theory of fractional integrals and derivatives, which indeed turns out to be very appropriate for studying and modeling systems with long-memory properties. After having introduced the basic concepts, we provided many examples and applications. For instance, we investigated the relaxation equation with distributed-order time-fractional derivatives, which describes models characterized by a strong memory component and can be used to model relaxation in complex systems deviating from the classical exponential Debye pattern. Then, we focused on the study of generalizations of the standard diffusion equation, passing through the preliminary study of the fractional forward drift equation. Such generalizations have been obtained by using fractional integrals and derivatives of distributed orders. In order to find a connection between the anomalous diffusion described by these equations and long-range dependence, we introduced and studied the generalized grey Brownian motion (ggBm), which is a parametric class of H-sssi processes whose marginal probability density function evolves in time according to a partial integro-differential equation of fractional type. The ggBm is, of course, non-Markovian. Throughout the work, we remarked several times that, starting from a master equation for a probability density function f(x,t), it is always possible to define an equivalence class of stochastic processes with the same marginal density function f(x,t). All these processes provide suitable stochastic models for the starting equation. In studying the ggBm, we focused on a subclass made up of processes with stationary increments. The ggBm has been defined canonically in the so-called grey noise space; however, we were able to provide a characterization that is independent of the underlying probability space. We also pointed out that the generalized grey Brownian motion is a direct generalization of a Gaussian process and, in particular, that it generalizes both Brownian motion and fractional Brownian motion. Finally, we introduced and analyzed a more general class of diffusion-type equations related to certain non-Markovian stochastic processes. We started from the forward drift equation, which is made non-local in time by the introduction of a suitably chosen memory kernel K(t).
The resulting non-Markovian equation can be interpreted in a natural way as the evolution equation of the marginal density function of a random time process l(t). We then considered the subordinated process Y(t) = X(l(t)), where X(t) is a Markovian diffusion. The corresponding time evolution of the marginal density function of Y(t) is governed by a non-Markovian Fokker-Planck equation involving the same memory kernel K(t). We developed several applications and derived exact solutions. Moreover, we considered different stochastic models for the given equations, providing path simulations.
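For reference, the long-memory structure of the fBm/fGn pair discussed above can be summarized by two textbook formulas (standard facts, not results of this thesis). Fractional Brownian motion \(B_H\) has covariance

\[
\mathbb{E}[B_H(t)\,B_H(s)] = \tfrac{1}{2}\left(t^{2H} + s^{2H} - |t-s|^{2H}\right), \qquad 0 < H < 1,
\]

and the fGn autocovariance

\[
\gamma(k) = \tfrac{1}{2}\left(|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H}\right) \sim H(2H-1)\,|k|^{2H-2}, \qquad k \to \infty,
\]

is non-summable exactly when \(\tfrac{1}{2} < H < 1\), which is the long-range dependence regime.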
Abstract:
The aim of this work was the detailed analysis of the migration dynamics of epithelial monolayers by means of two novel in vitro biosensors: electrical cell-substrate impedance sensing (ECIS) and the quartz crystal microbalance (QCM). Both methods proved sensitive to cell motility and to nanocytotoxicity. In the first project, cancer cells were fingerprinted via their motility dynamics and the resulting electrical or acoustic fluctuations on an ECIS or QCM basis; these real-time sensors were validated against classical in vitro Boyden-chamber migration and invasion assays. Fluctuation signatures, i.e. long-term correlations or fractal self-similarity arising from collective cell motion, were quantified by variance, Fourier and detrended fluctuation analysis. Stochastic long-memory phenomena proved to be a decisive contribution to the response of adherent cells on the QCM and ECIS sensors. Furthermore, the influence of low-molecular-weight toxins on cytoskeletal dynamics was followed: the effects of Cytochalasin D, Phalloidin and Blebbistatin, as well as Taxol, Nocodazole and Colchicine, were captured by QCM and ECIS fluctuation analysis. In a second project focus, adhesion processes as well as cell-cell and cell-substrate degradation processes under nanoparticle exposure were characterized, in order to obtain a measure of nanocytotoxicity as a function of particle shape, functionalization, stability or charge. In conclusion, the novel real-time biosensors QCM and ECIS possess high cell specificity, respond to cytoskeletal dynamics and can act as sensitive detectors of cell vitality.
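As a rough sketch of the detrended fluctuation analysis used above to quantify fluctuation signatures (an illustrative reimplementation in Python, not the study's own pipeline; scales and data are placeholders):

```python
import numpy as np

def dfa(x, scales=(8, 16, 32, 64, 128), order=1):
    """Detrended fluctuation analysis; returns the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))                      # integrated profile
    log_s, log_f = [], []
    for s in scales:
        n_seg = len(y) // s
        if n_seg < 2:
            continue
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)   # local polynomial trend
            f2.append(np.mean((seg - trend) ** 2))             # detrended variance
        log_s.append(np.log(s))
        log_f.append(0.5 * np.log(np.mean(f2)))
    # alpha > 0.5 signals persistent long-range correlations in the signal
    return np.polyfit(log_s, log_f, 1)[0]

rng = np.random.default_rng(1)
alpha = dfa(rng.standard_normal(4096))                 # stand-in for sensor noise
```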
Abstract:
The variables involved in the equations that describe realistic synaptic dynamics always vary in a limited range. Their boundedness makes the synapses forgetful, not through the mere passage of time, but because new experiences overwrite old memories. The forgetting rate depends on how many synapses are modified by each new experience: many changes mean fast learning and fast forgetting, whereas few changes mean slow learning and long memory retention. Reducing the average number of modified synapses can extend the memory span, at the price of a reduced amount of information stored when a new experience is memorized. Any trick that slows down the learning process in a smart way can improve memory performance. We review some of the tricks that make it possible to elude fast forgetting (oblivion). They are based on the stochastic selection of the synapses whose modifications are actually consolidated following each new experience. In practice, only a randomly selected, small fraction of the synapses eligible for an update are actually modified. This makes it possible to acquire the amount of information necessary to retrieve the memory without compromising the retention of old experiences. The fraction of modified synapses can be further reduced in a smart way by changing synapses only when it is really necessary, i.e. when the post-synaptic neuron does not respond as desired. Finally, we show that such a stochastic selection emerges naturally from spike-driven synaptic dynamics which read noisy pre- and post-synaptic neural activities. These activities can actually be generated by a chaotic system.
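A minimal sketch of the stochastic-selection rule described above, assuming bounded binary synapses and a perceptron-like response; the names, threshold and consolidation probability are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n_syn, p_consolidate = 1000, 0.05      # consolidate only ~5% of eligible changes

w = rng.integers(0, 2, n_syn).astype(float)   # bounded (binary) synaptic weights

def present(pattern, target):
    """One experience: modify synapses only if the response is wrong,
    and consolidate each eligible change with small probability."""
    response = 1.0 if w @ pattern > 0.5 * pattern.sum() else 0.0
    if response == target:             # neuron responds as desired: change nothing
        return
    eligible = pattern > 0             # synapses driven by this input pattern
    chosen = eligible & (rng.random(n_syn) < p_consolidate)
    w[chosen] = target                 # push the selected synapses toward target

# A new experience: a random input pattern with desired output 1
present(rng.integers(0, 2, n_syn).astype(float), 1.0)
```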
Abstract:
This paper examines the mean-reverting property of real exchange rates. Earlier studies have generally not been able to reject the null hypothesis of a unit root in real exchange rates, especially for the post-Bretton Woods floating period. The results imply that long-run purchasing power parity does not hold. More recent studies, especially those using panel unit-root tests, have found more favorable results, however. But Karlsson and Löthgren (2000) and others have recently pointed out several potential pitfalls of panel unit-root tests. Thus, the panel unit-root test results are suggestive, but far from conclusive. Moreover, consistent individual-country time series evidence supporting long-run purchasing power parity continues to be scarce. In this paper, we test for long memory using Lo's (1991) modified rescaled range test and the rescaled variance test of Giraitis, Kokoszka, Leipus, and Teyssière (2003). Our testing procedure provides a non-parametric alternative to the parametric tests commonly used in this literature. Our data set consists of monthly observations from April 1973 to April 2001 for the G-7 countries in the OECD. The two tests yield conflicting results when we use U.S. dollar real exchange rates. However, when non-U.S. dollar real exchange rates are used, we find only two cases out of fifteen where the null hypothesis of a unit root with short-term dependence can be rejected in favor of the alternative of long-term dependence using the modified rescaled range test, and only one case when using the rescaled variance test. Our results therefore provide a contrast to the recent favorable panel unit-root test results.
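A hedged sketch of the first of the two tests, Lo's (1991) modified rescaled range statistic, with Bartlett weights in the long-run variance (Lo's tabulated critical values are not reproduced; the lag q and the data are placeholders):

```python
import numpy as np

def lo_modified_rs(x, q):
    """Lo (1991) modified R/S statistic V_n = Q_n / sqrt(n)."""
    n = len(x)
    d = x - x.mean()
    z = np.cumsum(d)
    r = z.max() - z.min()                          # range of partial sums
    s2 = np.mean(d ** 2)                           # gamma_0
    for j in range(1, q + 1):                      # Bartlett-weighted autocovariances
        s2 += 2.0 * (1.0 - j / (q + 1)) * np.mean(d[j:] * d[:-j])
    return r / (np.sqrt(s2) * np.sqrt(n))

rng = np.random.default_rng(3)
x = rng.standard_normal(337)           # stand-in: 337 monthly observations, 1973-2001
v_n = lo_modified_rs(x, q=12)          # compare against Lo's critical values
```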
Abstract:
In this work we propose a Bayesian method to estimate the memory parameter of a stochastic process with long memory when its likelihood function is intractable or unavailable. The approach yields an approximation to the posterior distribution of the memory and other parameters and is based on a simple application of the method known as approximate Bayesian computation (ABC). Some popular estimators of the memory parameter are reviewed and compared with this approach. Our proposal makes it feasible to tackle complex problems from a Bayesian standpoint and, although approximate, performs very satisfactorily when compared with classical methods.
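A minimal rejection-ABC sketch in the spirit of this proposal, under stated assumptions: the process is taken to be a fractional Gaussian noise with memory (Hurst) parameter H, the summary statistic is a vector of sample autocorrelations, and the prior, tolerance and sample sizes are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(4)

def fgn(H, n):
    """Exact fractional Gaussian noise via circulant embedding."""
    k = np.arange(n + 1.0)
    g = 0.5 * ((k + 1) ** (2*H) - 2 * k ** (2*H) + np.abs(k - 1) ** (2*H))
    c = np.concatenate([g, g[-2:0:-1]])        # circulant first row, length 2n
    lam = np.fft.fft(c).real                   # eigenvalues, non-negative for fGn
    m = 2 * n
    xi = rng.standard_normal(m) + 1j * rng.standard_normal(m)
    return np.real(np.fft.fft(np.sqrt(np.maximum(lam, 0.0)) * xi))[:n] / np.sqrt(m)

def summary(x, lags=5):
    x = x - x.mean()
    v = np.mean(x ** 2)
    return np.array([np.mean(x[j:] * x[:-j]) / v for j in range(1, lags + 1)])

s_obs = summary(fgn(0.75, 1000))               # pretend data with "unknown" H

accepted = []
for _ in range(5000):                          # rejection-ABC loop
    H = rng.uniform(0.5, 1.0)                  # prior on the memory parameter
    if np.linalg.norm(summary(fgn(H, 1000)) - s_obs) < 0.05:   # tolerance eps
        accepted.append(H)                     # kept draws approximate the posterior
```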
Abstract:
Stochastic anti-resonance, that is, resonant enhancement of randomness caused by polarization mode beatings, is analyzed both numerically and analytically using the example of a fibre Raman amplifier with randomly varying birefringence. As a result of this anti-resonance, growth of the polarization mode dispersion causes the signal state of polarization to escape from a metastable state corresponding to the pulling of the signal to the pump state of polarization. The phenomenon reveals itself in an abrupt growth of gain fluctuations as well as in a drop of the Hurst parameter and the Kramers length, which characterize the long memory in the system and the noise-induced escape from the polarization-pulling state. The results, based on an analytical multiscale averaging technique, agree perfectly with the numerical data obtained by direct numerical simulation of the underlying stochastic differential equations. This outcome would allow the cumbersome numerical simulations required for real-world extra-long high-speed communication systems to be replaced by the analytical approach.
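The noise-induced escape invoked here is, generically, a Kramers problem. As a toy illustration only (a one-dimensional double-well potential integrated by Euler-Maruyama, not the paper's fibre-amplifier polarization equations), one can estimate a mean escape time as follows:

```python
import numpy as np

rng = np.random.default_rng(5)
dt, sigma = 1e-3, 0.5                  # time step and noise strength (illustrative)

def escape_time(x0=-1.0, barrier=0.0, t_max=1e4):
    """First-passage time out of the left well of V(x) = x**4/4 - x**2/2."""
    x, t = x0, 0.0
    while t < t_max:
        x += -(x ** 3 - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x > barrier:                # crossed the barrier at x = 0
            return t
    return t_max

tau = np.mean([escape_time() for _ in range(20)])   # mean (Kramers) escape time
```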
Abstract:
The paper considers various extended asymmetric multivariate conditional volatility models, and derives appropriate regularity conditions and associated asymptotic theory. This enables checking of internal consistency and allows valid statistical inferences to be drawn based on empirical estimation. For this purpose, we use an underlying vector random coefficient autoregressive process, for which we show the equivalent representation for the asymmetric multivariate conditional volatility model, to derive asymptotic theory for the quasi-maximum likelihood estimator. As an extension, we develop a new multivariate asymmetric long memory volatility model, and discuss the associated asymptotic properties.
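For readers unfamiliar with the asymmetry in question: in the univariate case it is typified by the GJR specification (given here only as standard background to the multivariate models the paper treats), in which negative shocks raise next-period conditional variance more than positive shocks of equal size:

\[
h_t = \omega + \left(\alpha + \gamma\, I(\varepsilon_{t-1} < 0)\right)\varepsilon_{t-1}^2 + \beta h_{t-1},
\]

with \(\gamma > 0\) capturing the leverage effect.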
Abstract:
This dissertation contains four essays that all share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of the volatility and correlations of financial assets. The first two chapters provide useful tools for univariate applications while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains related to the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is a joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite sample properties are studied under four data generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative to measure integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high quality forecasts of pairwise correlations between commodities which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short selling constraints and transaction costs.
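As background for the realized-measure models the first chapter builds on (the log-linear Realized GARCH of Hansen, Huang and Shek (2012), which the chapter uses as a benchmark; the FloGARCH specification itself is not reproduced here), the joint model for returns \(r_t\) and a realized measure \(x_t\) reads

\[
r_t = \sqrt{h_t}\, z_t, \qquad
\log h_t = \omega + \beta \log h_{t-1} + \gamma \log x_{t-1}, \qquad
\log x_t = \xi + \varphi \log h_t + \tau(z_t) + u_t,
\]

with leverage function \(\tau(z) = \tau_1 z + \tau_2 (z^2 - 1)\).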
Abstract:
Master's in Management Control and Business (Mestrado em Controlo de Gestão e dos Negócios)
Abstract:
This thesis studies the field of asset price bubbles. It comprises three independent chapters. Each of these chapters either directly or indirectly analyses the existence or implications of asset price bubbles. The type of bubble assumed in each chapter is consistent with rational expectations; thus, the kind of price bubbles investigated here are known in the literature as rational bubbles. The following describes the three chapters. Chapter 1: This chapter attempts to explain the recent US housing price bubble by developing a heterogeneous agent endowment economy asset pricing model with risky housing, endogenous collateral and defaults. Investment in housing is subject to an idiosyncratic risk and some mortgages are defaulted in equilibrium. We analytically derive the leverage, or the endogenous loan-to-value ratio. This variable comes from a limited participation constraint in a one-period mortgage contract with monitoring costs. Our results show that low values of housing investment risk produce a credit easing effect that encourages excess leverage and generates credit-driven rational price bubbles in the housing good. Conversely, high values of housing investment risk produce a credit crunch characterized by tight borrowing constraints, low leverage and low house prices. Furthermore, the leverage ratio was found to be procyclical and the rate of defaults countercyclical, consistent with empirical evidence. Chapter 2: It is widely believed that financial assets have considerable persistence and are susceptible to bubbles. However, identification of this persistence and of potential bubbles is not straightforward. This chapter tests for price bubbles in the United States housing market, accounting for long memory and structural breaks. The intuition is that the presence of long memory negates price bubbles, while the presence of breaks could artificially induce bubble behaviour. Hence, we use semi-parametric Whittle and parametric ARFIMA procedures, which are consistent under a variety of residual biases, to estimate the value of the long memory parameter, d, of the log rent-price ratio. We find that the semi-parametric estimation procedures, robust to non-normality and heteroskedastic errors, identified far more bubble regions than the parametric ones. A structural break was identified in the mean and trend of all the series which, when accounted for, removed bubble behaviour in a number of regions. Importantly, the United States housing market showed evidence of rational bubbles at both the aggregate and regional levels. In the third and final chapter, we attempt to answer the following question: To what extent should individuals participate in the stock market and hold risky assets over their lifecycle? We answer this question by employing a lifecycle consumption-portfolio choice model with housing, labour income and time-varying predictable returns, where the agents are constrained in the level of their borrowing. We first analytically characterize and then numerically solve for the optimal allocation to the risky asset, comparing the return predictability case with that of IID returns. We successfully resolve the puzzles and find equity holdings and participation rates close to the data. We also find that return predictability substantially alters both the level of risky portfolio allocation and the rate of stock market participation.
High realizations of the factor (the dividend-price ratio) and high persistence of the factor process, both indicative of stock market bubbles, raise the amount of wealth invested in risky assets and the level of stock market participation, respectively. Conversely, rare disasters were found to bring down these rates, the change being most severe for investors in the later years of the life-cycle. Furthermore, investors facing time-varying returns (return predictability) hedged background risks significantly better than those facing IID returns.
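For reference, the long memory parameter \(d\) estimated in Chapter 2 is defined through the standard ARFIMA framework (textbook definitions, not results of the thesis):

\[
\Phi(L)(1 - L)^d (x_t - \mu) = \Theta(L)\,\varepsilon_t, \qquad
(1 - L)^d = \sum_{k=0}^{\infty} \binom{d}{k} (-L)^k,
\]

where the series is stationary with long memory for \(0 < d < \tfrac{1}{2}\), nonstationary but mean-reverting for \(\tfrac{1}{2} \le d < 1\), and has a unit root at \(d = 1\).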
Abstract:
Following the methodology of Ferreira and Dionísio (2016), the objective of this paper is to analyze the behavior of stock markets in the G7 countries and to find which of those countries is the first to reach levels of long-range correlation that are no longer significant. We carry out this analysis using detrended cross-correlation analysis (DCCA) and its correlation coefficient to check for the existence of long-range dependence in the time series. The existence of long-range dependence can be understood as a possible violation of the efficient market hypothesis (EMH). The analysis remains interesting because existing studies are inconclusive about the presence of long memory in stock return rates.
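The correlation coefficient referred to above is, in its standard formulation (Zebende's DCCA coefficient, stated here as background), the ratio of the detrended covariance of the two series to the product of their detrended variances at scale \(n\):

\[
\rho_{DCCA}(n) = \frac{F^2_{DCCA}(n)}{F_{DFA,x}(n)\, F_{DFA,y}(n)}, \qquad -1 \le \rho_{DCCA}(n) \le 1 .
\]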
Abstract:
A modified version of the intruder-resident paradigm was used to investigate whether social recognition memory lasts at least 24 h. One hundred and forty-six adult male Wistar rats were used. Independent groups of rats were exposed to an intruder for 0.083, 0.5, 2, 24, or 168 h and tested 24 h after the first encounter with the familiar or a different conspecific. Factor analysis was employed to identify associations between behaviors and treatments. Resident rats exhibited a 24-h social recognition memory, as indicated by a 3- to 5-fold decrease in social behaviors in the second encounter with the same conspecific compared to those observed for a different conspecific, when the duration of the first encounter was 2 h or longer. It was possible to distinguish between two different categories of social behaviors, whose expression depended on the duration of the first encounter. Sniffing the anogenital area (49.9% of the social behaviors), sniffing the body (17.9%), sniffing the head (3%), and following the conspecific (3.1%), exhibited mostly by resident rats, characterized social investigation and revealed long-term social recognition memory. However, dominance (23.8%) and mild aggression (2.3%), exhibited by both residents and intruders, characterized social agonistic behaviors and were not affected by memory. In contrast, sniffing the environment (76.8% of the non-social behaviors) and rearing (14.3%), both exhibited mostly by adult intruder rats, characterized non-social behaviors. Together, these results show that social recognition memory in rats may last at least 24 h after a 2-h or longer exposure to the conspecific.
Abstract:
The mechanism of generation of memory cytotoxic T cells (CTL) following immunization remains controversial. Using tumor protection and IFN-gamma ELISPOT assays in mice to detect functional CTL, we show that the initial effector CTL burst size after immunization is not directly related to the amount of functional memory CTL formed, suggesting that memory CTL are unlikely to arise stochastically from effector CTL. Induction of MHC class II-restricted T helper cells at the time of immunization, by inclusion of a T helper peptide or protein in the immunogen, is necessary to generate memory CTL, although no T helper cell induction is required to generate effector CTL to a strong MHC class I-binding peptide. Host-protective T cell memory correlates with the number of CTL epitope-responsive IFN-gamma-secreting memory T cells as measured in an ELISPOT assay at the time of tumor challenge. We conclude that a different antigen-presenting environment is required to induce long-lasting functional memory CTL, and that non-cognate stimulation of the immune system is essential to allow generation of a long-lasting host-protective memory CTL response.
Abstract:
Prepared for presentation at the Portuguese Finance Network International Conference 2014, Vilamoura, Portugal, June 18-20