16 results for "eigenfunction stochastic volatility models"

in the Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"


Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

In this paper we study the possible microscopic origin of heavy-tailed probability density distributions for the price variation of financial instruments. We extend the standard log-normal process to include another random component in the so-called stochastic volatility models. We study these models under an assumption, akin to the Born-Oppenheimer approximation, in which the volatility has already relaxed to its equilibrium distribution and acts as a background to the evolution of the price process. In this approximation, we show that all models of stochastic volatility should exhibit a scaling relation in the time lag of zero-drift modified log-returns. We verify that the Dow Jones Industrial Average index indeed follows this scaling. We then focus on two popular stochastic volatility models, the Heston and Hull-White models. In particular, we show that in the Hull-White model the resulting probability distribution of log-returns in this approximation corresponds to the Tsallis (Student-t) distribution. The Tsallis parameters are given in terms of the microscopic stochastic volatility model. Finally, we show that the log-returns for 30 years of Dow Jones index data are well fitted by a Tsallis distribution, and we obtain the relevant parameters. (c) 2007 Elsevier B.V. All rights reserved.
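As a rough illustration of the final fitting step, a minimal Python sketch is given below: it fits a Student-t (equivalently Tsallis, via the assumed mapping q = (ν + 3)/(ν + 1)) distribution to zero-drift log-returns. The synthetic price series and parameter values are placeholders for the Dow Jones data, not the paper's estimates.

```python
# Minimal sketch (assumptions: synthetic prices stand in for the Dow Jones series;
# the q = (nu + 3)/(nu + 1) Student-t/Tsallis mapping is used for illustration).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# synthetic heavy-tailed price series; real index data would be loaded here
steps = stats.t.rvs(df=3, scale=0.01, size=5000, random_state=rng)
prices = 100.0 * np.exp(np.cumsum(steps))

log_returns = np.diff(np.log(prices))
log_returns -= log_returns.mean()            # zero-drift modified log-returns

nu, loc, scale = stats.t.fit(log_returns)    # maximum-likelihood Student-t fit
q = (nu + 3.0) / (nu + 1.0)                  # implied Tsallis index (assumed mapping)
print(f"fitted degrees of freedom nu = {nu:.2f}, implied Tsallis q = {q:.2f}")
```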

Relevance: 100.00%

Abstract:

We investigate the Heston model with stochastic volatility and exponential tails as a model for the typical price fluctuations of the Brazilian São Paulo Stock Exchange Index (IBOVESPA). Raw prices are first corrected for inflation, and a period spanning 15 years characterized by memoryless returns is chosen for the analysis. Model parameters are estimated by observing volatility scaling and correlation properties. We show that the Heston model with at least two time scales for the volatility mean-reverting dynamics satisfactorily describes price fluctuations ranging from time scales larger than 20 min up to 160 days. At time scales shorter than 20 min we observe autocorrelated returns and power-law tails incompatible with the Heston model. Despite major regulatory changes, hyperinflation and currency crises experienced by the Brazilian market in the period studied, the general success of the description may be regarded as evidence of a general underlying dynamics of price fluctuations at intermediate, mesoeconomic time scales that is well approximated by the Heston model. We also note that the connection between the Heston model and Ehrenfest urn models could be exploited to bring new insights into the microeconomic market mechanics. (c) 2005 Elsevier B.V. All rights reserved.
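The sketch below is a minimal Euler discretisation of a single-time-scale Heston model, only to make the mean-reverting volatility dynamics concrete; all parameter values are illustrative assumptions rather than the IBOVESPA estimates reported above (the paper uses at least two volatility time scales).

```python
# Hedged sketch: Euler discretisation of a single-scale Heston model.
# Parameters are illustrative assumptions, not fitted IBOVESPA values.
import numpy as np

def heston_paths(s0=1.0, v0=0.04, kappa=2.0, theta=0.04, xi=0.3,
                 mu=0.0, dt=1 / 252, n_steps=252, n_paths=1000, seed=0):
    rng = np.random.default_rng(seed)
    s = np.full(n_paths, s0)
    v = np.full(n_paths, v0)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rng.standard_normal(n_paths)
        v_pos = np.maximum(v, 0.0)                      # full-truncation scheme keeps variance usable
        s *= np.exp((mu - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
        v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
    return s, v

prices, variances = heston_paths()
print("mean terminal price:", prices.mean())
```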

Relevance: 100.00%

Abstract:

The Rio Claro Formation, of Tertiary-Quaternary age, is composed of unconsolidated sediments deposited by fluvial systems. In the Paulínia (SP) region, geological studies comprising sedimentological, structural and geomorphological aspects indicate that the Rio Claro Formation consists of deposits of a meandering fluvial system. Data from SPT drillings were used to obtain sedimentary textural information in order to generate stochastic stratigraphic models. Particle size analysis was carried out on the core samples, resulting in the distinction of five lithofacies, three of which can be grouped into a single mudstone unit. The other two are channel-belt facies: clayey sands and medium to coarse sands. Geostatistical modeling of the stratigraphic architecture followed, together with correlation with analogue outcrop data and conceptual models for this type of depositional system. One hundred models were generated from the SPT drillings and 50 from an analogue outcrop, which allowed both simulation sets to be constrained to the depositional model proposed for the region. The T-PROGS methodology has good applicability in simulating stratigraphic frameworks, and its inherent limitations can be addressed with parallel studies, such as stochastic modeling of analogue outcrops or geophysical methods.
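A minimal sketch of the transition-probability idea behind T-PROGS-style simulation is given below: a one-dimensional Markov chain of facies along a vertical column. The facies labels and transition matrix are illustrative assumptions, not values derived from the SPT drillings described above.

```python
# Hedged sketch: 1-D Markov-chain facies simulation (transition-probability style).
# Facies labels and transition probabilities are illustrative assumptions.
import numpy as np

facies = ["mudstone", "clayey sand", "medium-coarse sand"]
# rows: current facies; columns: probability of each facies in the next cell upward
transitions = np.array([[0.80, 0.15, 0.05],
                        [0.20, 0.60, 0.20],
                        [0.10, 0.30, 0.60]])

def simulate_column(n_cells=60, start=0, seed=0):
    rng = np.random.default_rng(seed)
    column = [start]
    for _ in range(n_cells - 1):
        column.append(rng.choice(3, p=transitions[column[-1]]))
    return [facies[i] for i in column]

print(simulate_column()[:10])
```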

Relevance: 40.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 30.00%

Abstract:

Ionospheric scintillations are caused by time-varying electron density irregularities in the ionosphere, occurring more often at equatorial and high latitudes. This paper focuses exclusively on experiments undertaken in Europe, at geographic latitudes between ~50°N and ~80°N, where a network of GPS receivers capable of monitoring Total Electron Content and ionospheric scintillation parameters was deployed. The widely used ionospheric scintillation indices S4 and σ_φ represent a practical measure of the intensity of amplitude and phase scintillation affecting GNSS receivers. However, they do not provide sufficient information regarding the actual tracking errors that degrade GNSS receiver performance. Suitable receiver tracking models, sensitive to ionospheric scintillation, allow the computation of the variance of the output error of the receiver PLL (Phase Locked Loop) and DLL (Delay Locked Loop), which expresses the quality of the range measurements used by the receiver to calculate user position. The ability of such models to incorporate phase and amplitude scintillation effects into the variance of these tracking errors underpins our proposed method of applying relative weights to measurements from different satellites. This gives the least-squares stochastic model used for position computation a more realistic representation than the otherwise 'equal weights' model. For pseudorange processing, relative weights were computed so that a 'scintillation-mitigated' solution could be performed and compared to the (non-mitigated) 'equal weights' solution. An improvement of between 17% and 38% in height accuracy was achieved when an epoch-by-epoch differential solution was computed over baselines ranging from 1 to 750 km. The method was then compared with alternative approaches that can be used to improve the least-squares stochastic model, such as weighting according to satellite elevation angle or by the inverse of the square of the standard deviation of the code/carrier divergence (σ_CCDiv). The influence of multipath effects on the proposed mitigation approach is also discussed. With the use of high-rate scintillation data in addition to the scintillation indices, a carrier-phase-based mitigated solution was also implemented and compared with the conventional solution. During a period of high phase scintillation, it was observed that problems related to ambiguity resolution can be reduced by using the proposed mitigated solution.
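The sketch below illustrates the core of the mitigation idea in stripped-down form: a weighted least-squares position solution in which each satellite's weight is the inverse of its tracking-error variance, compared against an 'equal weights' solution. The geometry, residuals, and variances are random placeholders, not real GNSS or scintillation data, and the paper's actual PLL/DLL variance models are not reproduced here.

```python
# Hedged sketch: 'scintillation-mitigated' weighted least squares vs. equal weights.
# All inputs are synthetic placeholders, not real GNSS observations.
import numpy as np

rng = np.random.default_rng(1)
n_sats = 8
A = np.hstack([rng.normal(size=(n_sats, 3)), np.ones((n_sats, 1))])  # geometry + receiver clock column
l = rng.normal(scale=1.0, size=n_sats)                                # observed-minus-computed pseudoranges (m)
var = rng.uniform(0.5, 5.0, size=n_sats)                              # assumed per-satellite tracking-error variances (m^2)

W = np.diag(1.0 / var)                                   # inverse-variance (mitigated) weights
dx_weighted = np.linalg.solve(A.T @ W @ A, A.T @ W @ l)  # weighted least-squares solution
dx_equal = np.linalg.lstsq(A, l, rcond=None)[0]          # conventional 'equal weights' solution
print("weighted:", dx_weighted, "\nequal   :", dx_equal)
```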

Relevance: 30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

Two stochastic models were fitted to daily rainfall data for an interior station of Brazil. Of the two, the truncated negative probability model describes the data better than the Markov chain probability model. The Kolmogorov-Smirnov test is applied to assess the significance of these models. © 1983 Springer-Verlag.
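As a hedged illustration of this kind of analysis, the sketch below fits a two-state (wet/dry) first-order Markov chain to a synthetic daily rainfall series and applies a Kolmogorov-Smirnov test to a candidate distribution of wet-day amounts. A gamma distribution stands in for the paper's truncated negative probability model, and the data are synthetic, so none of this reproduces the paper's results.

```python
# Hedged sketch: Markov-chain occurrence model plus a K-S goodness-of-fit check.
# The rainfall series and the gamma candidate distribution are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rain = np.where(rng.random(365) < 0.3, rng.gamma(shape=0.8, scale=8.0, size=365), 0.0)

wet = (rain > 0).astype(int)
counts = np.zeros((2, 2))
for a, b in zip(wet[:-1], wet[1:]):          # count dry/wet day-to-day transitions
    counts[a, b] += 1
transition = counts / counts.sum(axis=1, keepdims=True)

amounts = rain[rain > 0]
params = stats.gamma.fit(amounts, floc=0)    # candidate distribution for wet-day amounts
ks_stat, p_value = stats.kstest(amounts, "gamma", args=params)
print("transition matrix:\n", transition)
print("K-S statistic:", ks_stat, "p-value:", p_value)
```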

Relevance: 30.00%

Abstract:

Power-law distributions, i.e. Lévy flights, have been observed in various economic, biological, and physical systems in the high-frequency regime. These distributions can be successfully explained via the gradually truncated Lévy flight (GTLF). In general, these systems converge to a Gaussian distribution in the low-frequency regime. In the present work, we develop a model for the physical basis of the cut-off length in the GTLF and its variation with the time interval between successive observations. We observe that the GTLF automatically approaches a Gaussian distribution in the low-frequency regime. We applied the present method to analyze time series from some physical and financial systems. The agreement between the experimental results and the theoretical curves is excellent. The present method can be applied to analyze time series in a variety of fields, which in turn provides a basis for the development of further microscopic models for the system. © 2000 Elsevier Science B.V. All rights reserved.
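A minimal sketch of generating a gradually truncated Lévy flight is shown below: Lévy-stable steps are thinned beyond a cut-off length by a smooth acceptance factor, and aggregating many steps pushes the distribution toward a Gaussian. The exponential cut-off form and all parameter values are illustrative assumptions, not the specific GTLF parametrisation of the paper.

```python
# Hedged sketch: gradually truncated Levy flight by smooth tail thinning.
# Cut-off form exp(-(|x| - l)/k) beyond |x| > l is an illustrative assumption.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha, cutoff_l, cutoff_k = 1.5, 5.0, 3.0

x = stats.levy_stable.rvs(alpha, 0.0, size=20000, random_state=rng)
accept_prob = np.where(np.abs(x) <= cutoff_l, 1.0,
                       np.exp(-(np.abs(x) - cutoff_l) / cutoff_k))
gtlf = x[rng.random(x.size) < accept_prob]

# Aggregating many steps drives the distribution toward a Gaussian (low-frequency regime)
aggregated = gtlf[: (gtlf.size // 50) * 50].reshape(-1, 50).sum(axis=1)
print("excess kurtosis, single steps:", stats.kurtosis(gtlf))
print("excess kurtosis, aggregated  :", stats.kurtosis(aggregated))
```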

Relevance: 30.00%

Abstract:

The GPS observables are subject to several errors. Among them, the systematic ones have the greatest impact, because they degrade the accuracy of the resulting positioning. These errors are mainly related to GPS satellite orbits, multipath and atmospheric effects. Recently, a method has been suggested to mitigate these errors: the semiparametric model with the penalised least squares technique (PLS). In this method, the errors are modeled as functions varying smoothly in time. It amounts to changing the stochastic model, into which the error functions are incorporated; the results obtained are similar to those obtained by changing the functional model. As a result, the ambiguities and the station coordinates are estimated with better reliability and accuracy than with the conventional least squares method (CLS). In general, the solution requires a shorter data interval, minimizing costs. The method's performance was analyzed in two experiments using data from single-frequency receivers. The first one was carried out with a short baseline, where the main error was multipath. In the second experiment, a baseline of 102 km was used; in this case, the predominant errors were due to ionospheric and tropospheric refraction. In the first experiment, using 5 minutes of data collection, the largest coordinate discrepancies with respect to the ground truth reached 1.6 cm and 3.3 cm in the h coordinate for the PLS and the CLS, respectively. In the second one, also using 5 minutes of data, the discrepancies were 27 cm in h for the PLS and 175 cm in h for the CLS. In these tests, it was also possible to verify a considerable improvement in ambiguity resolution using the PLS relative to the CLS, with a reduced data collection time interval. © Springer-Verlag Berlin Heidelberg 2007.
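The sketch below illustrates the penalised least squares idea in stripped-down form: a parametric part is estimated jointly with a smooth, slowly varying error function constrained by a second-difference roughness penalty. The design matrix, smoothing weight, and data are synthetic assumptions and do not reproduce the GPS processing of the paper.

```python
# Hedged sketch: semiparametric-style penalised least squares.
# Parametric unknowns x are estimated jointly with a smooth error g(t),
# with smoothness imposed by a second-difference penalty. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 120
t = np.arange(n)
A_param = rng.normal(size=(n, 2))                   # geometry-like design for the parametric part
x_true = np.array([1.0, -0.5])
g_true = 0.3 * np.sin(2 * np.pi * t / n)            # slowly varying systematic error (e.g. multipath)
y = A_param @ x_true + g_true + rng.normal(scale=0.05, size=n)

A = np.hstack([A_param, np.eye(n)])                 # unknowns: x (2 values) and g (n values)
D = np.diff(np.eye(n), n=2, axis=0)                 # second-difference roughness operator on g
lam = 50.0                                          # smoothing weight (tuning assumption)
P = np.zeros((n + 2, n + 2))
P[2:, 2:] = lam * (D.T @ D)

sol = np.linalg.solve(A.T @ A + P, A.T @ y)         # penalised normal equations
print("estimated parameters:", sol[:2])
```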

Relevance: 30.00%

Abstract:

System reliability depends on the reliability of the individual components. Therefore, a methodology capable of inferring the functional state of these components is needed in order to establish reliable quality indices. Allocation models for maintenance and protective devices, among others, have been used to improve the quality and availability of service on electric power distribution systems. This paper proposes a methodology for assessing the reliability of distribution system components in an integrated way, using probabilistic models and fuzzy inference systems to infer the operation probability of each component. © 2012 IEEE.
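As a small hedged sketch of how per-component operation probabilities can be combined into a system-level index (the fuzzy inference of those probabilities is not reproduced here), consider the following; the feeder topology and probability values are illustrative assumptions.

```python
# Hedged sketch: combining assumed component operation probabilities into a
# system reliability figure for a radial feeder with one redundant pair.
from math import prod

def series(probs):            # all components must operate
    return prod(probs)

def parallel(probs):          # at least one component must operate
    return 1.0 - prod(1.0 - p for p in probs)

breaker, line, recloser = 0.995, 0.98, 0.99
transformers = [0.97, 0.97]   # redundant pair

system = series([breaker, line, recloser, parallel(transformers)])
print(f"system operation probability: {system:.4f}")
```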

Relevance: 30.00%

Abstract:

The deterministic Optimal Reactive Power Dispatch problem has been extensively studied under the assumption that the power demand and the availability of shunt reactive power compensators are known and fixed. Against this background, a two-stage stochastic optimization model is first formulated under the assumption that the load demand can be modeled by specified random parameters. A second, stochastic chance-constrained model is then presented, considering uncertainty in the demand and in the equivalent availability of shunt reactive power compensators. Simulations on six-bus and 30-bus test systems are used to illustrate the validity and essential features of the proposed models. These simulations show that the proposed models can warn the power system operator about a reactive power deficit in the system and suggest that shunt reactive sources must be dispatched to guard against the unavailability of any reactive source. © 2012 IEEE.
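The toy two-stage stochastic program below captures only the flavour of the first model: a first-stage shunt capacity is chosen before the uncertain reactive demand is known, and each scenario then dispatches up to that capacity or pays a deficit penalty. Costs, demands, and probabilities are illustrative assumptions, and the sketch is a small linear program rather than the full reactive power dispatch formulation.

```python
# Hedged sketch: toy two-stage stochastic program with scenario-wise recourse.
# Data are illustrative assumptions, not the six-bus/30-bus test cases.
import numpy as np
from scipy.optimize import linprog

demands = np.array([30.0, 50.0, 80.0])      # reactive demand per scenario (Mvar)
probs = np.array([0.3, 0.5, 0.2])
c_invest, c_deficit = 1.0, 10.0             # capacity cost vs. penalty for unmet demand

n_s = len(demands)
# variable vector: [x, q_1, d_1, q_2, d_2, q_3, d_3] (capacity, dispatch, deficit)
c = np.concatenate([[c_invest], np.ravel([[0.0, p * c_deficit] for p in probs])])

A_ub, b_ub = [], []
for s in range(n_s):
    meet = np.zeros(1 + 2 * n_s); meet[1 + 2 * s] = -1.0; meet[2 + 2 * s] = -1.0
    A_ub.append(meet); b_ub.append(-demands[s])          # q_s + d_s >= demand_s
    cap = np.zeros(1 + 2 * n_s); cap[0] = -1.0; cap[1 + 2 * s] = 1.0
    A_ub.append(cap); b_ub.append(0.0)                    # q_s <= x

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, None)] * (1 + 2 * n_s))
print("optimal shunt capacity:", res.x[0])
```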

Relevance: 30.00%

Abstract:

The transcription process is crucial to life, and the enzyme RNA polymerase (RNAP) is the major component of the transcription machinery. The development of single-molecule techniques, such as magnetic and optical tweezers, atomic-force microscopy and single-molecule fluorescence, has increased our understanding of the transcription process and complements traditional biochemical studies. Based on these studies, theoretical models have been proposed to explain and predict the kinetics of the RNAP during polymerization, highlighting the results achieved by models based on the thermodynamic stability of the transcription elongation complex. However, experiments showed that if more than one RNAP initiates from the same promoter, the transcription behavior changes slightly and new phenomena are observed. We proposed and implemented a theoretical model that considers collisions between RNAPs and predicts their cooperative behavior during multi-round transcription, generalizing the Bai et al. stochastic sequence-dependent model. In our approach, collisions between elongating enzymes modify their transcription rate values. We performed the simulations in Mathematica® and compared the results for single- and multiple-molecule transcription with experimental results and other theoretical models. Our multi-round approach recovers several expected behaviors, showing that the transcription process for the studied sequences can be accelerated by up to 48% when collisions are allowed: the dwell times on pause sites are reduced, as is the distance that the RNAPs backtrack from backtracking sites. © 2013 Costa et al.
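A minimal exclusion-process sketch of multiple RNAPs on a template is given below, only to make the collision constraint concrete: a polymerase cannot step into a site occupied by the one ahead. It uses a single uniform stepping rate rather than the sequence-dependent rates of the Bai et al. model, so it should be read as an illustrative assumption, not the paper's simulation.

```python
# Hedged sketch: Gillespie-style simulation of several RNAPs on a lattice with
# exclusion (a crude stand-in for the collision rule; uniform rates assumed).
import numpy as np

rng = np.random.default_rng(0)
length, step_rate, init_rate, t_end = 200, 10.0, 0.5, 60.0

positions = []                      # active RNAP positions, ordered leading -> trailing
t, completed = 0.0, 0
while t < t_end:
    # event 0: initiation at the promoter, allowed only if site 0 is free
    rates = [init_rate if (not positions or positions[-1] > 0) else 0.0]
    for i, p in enumerate(positions):                     # an RNAP steps only if the site ahead is free
        blocked = i > 0 and positions[i - 1] == p + 1
        rates.append(0.0 if blocked else step_rate)
    total = sum(rates)
    t += rng.exponential(1.0 / total)
    event = rng.choice(len(rates), p=np.array(rates) / total)
    if event == 0:
        positions.append(0)                                # new RNAP loads at the promoter
    else:
        positions[event - 1] += 1
        if positions[event - 1] >= length:                 # termination at the end of the template
            positions.pop(event - 1)
            completed += 1
print("transcripts completed:", completed)
```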

Relevance: 30.00%

Abstract:

Gravitational waves from a variety of sources are predicted to superpose to create a stochastic background. This background is expected to contain unique information from throughout the history of the Universe that is unavailable through standard electromagnetic observations, making its study of fundamental importance to understanding the evolution of the Universe. We carry out a search for the stochastic background with the latest data from the LIGO and Virgo detectors. Consistent with predictions from most stochastic gravitational-wave background models, the data display no evidence of a stochastic gravitational-wave signal. Assuming a gravitational-wave spectrum of Ω_GW(f) = Ω_α (f/f_ref)^α, we place 95% confidence level upper limits on the energy density of the background in each of four frequency bands spanning 41.5-1726 Hz. In the frequency band of 41.5-169.25 Hz, for a spectral index of α = 0, we constrain the energy density of the stochastic background to be Ω_GW(f) < 5.6 × 10^-6. For the 600-1000 Hz band, Ω_GW(f) < 0.14 (f/900 Hz)^3, a factor of 2.5 lower than the best previously reported upper limits. We find Ω_GW(f) < 1.8 × 10^-4 using a spectral index of zero for 170-600 Hz and Ω_GW(f) < 1.0 (f/1300 Hz)^3 for 1000-1726 Hz, bands in which no previous direct limits have been placed. The limits in these four bands are the lowest direct measurements to date on the stochastic background. We discuss the implications of these results in light of the recent claim by the BICEP2 experiment of the possible evidence for inflationary gravitational waves.
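For concreteness, the snippet below simply evaluates the assumed power-law spectrum Ω_GW(f) = Ω_α (f/f_ref)^α at a few frequencies using the α = 3 upper limit quoted for the 600-1000 Hz band; it is a worked example with the quoted numbers, not part of the search pipeline.

```python
# Hedged sketch: evaluating the assumed power-law spectrum at the quoted
# 600-1000 Hz upper limit (Omega_ref = 0.14 at f_ref = 900 Hz, alpha = 3).
def omega_gw(f_hz, omega_ref=0.14, f_ref_hz=900.0, alpha=3.0):
    return omega_ref * (f_hz / f_ref_hz) ** alpha

for f in (600.0, 900.0, 1000.0):
    print(f"{f:.0f} Hz: Omega_GW < {omega_gw(f):.3f}")
```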