940 results for Time-frequency analysis
Abstract:
Expectations of future market conditions are generally acknowledged to be crucial for the development decision and hence for shaping the built environment. This empirical study of the Central London office market from 1987 to 2009 tests for evidence of adaptive and naive expectations. Applying VAR models and a recursive OLS regression with one-step forecasts, we find evidence of adaptive and naive, rather than rational, expectations among developers. Although the magnitude of the errors and the length of the time lags vary over time and across development cycles, the results confirm that developers’ decisions are explained to a large extent by contemporaneous and past conditions in both London submarkets. The corollary of this finding is that developers may be able to generate excess profits by exploiting market inefficiencies, but this may be hindered in practice by the long periods necessary for planning and construction of the asset. More generally, the results of this study suggest that real estate cycles are largely generated endogenously rather than being the result of unexpected exogenous shocks.
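As an illustration of the recursive (expanding-window) OLS with one-step-ahead forecasts mentioned above, here is a minimal sketch on simulated data. The variables (rent growth, vacancy, development starts), the lag structure and the window length are hypothetical stand-ins for the study's series, not its specification.

```python
# Minimal sketch: expanding-window OLS with one-step-ahead forecasts.
# Data and variable names are hypothetical, not the Central London series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
T = 92  # e.g. quarterly observations, 1987-2009
rent_growth = rng.normal(0.0, 1.0, T)
vacancy = rng.normal(8.0, 2.0, T)
starts = 0.5 * np.roll(rent_growth, 1) - 0.3 * np.roll(vacancy, 1) + rng.normal(0, 0.5, T)

# Explanatory variables enter with a one-period lag (adaptive/naive expectations:
# decisions respond to past conditions rather than rational forecasts).
X = sm.add_constant(np.column_stack([np.roll(rent_growth, 1), np.roll(vacancy, 1)]))[1:]
y = starts[1:]

one_step_errors = []
for t in range(40, len(y)):           # expanding estimation window
    res = sm.OLS(y[:t], X[:t]).fit()  # re-estimate with data up to t-1
    forecast = res.predict(X[t:t + 1])[0]
    one_step_errors.append(y[t] - forecast)

print("mean one-step forecast error:", np.mean(one_step_errors))
```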
Abstract:
The Stochastic Diffusion Search algorithm, an integral part of Stochastic Search Networks, is investigated. Stochastic Diffusion Search is an alternative solution for invariant pattern recognition and focus of attention. It has been shown that the algorithm can be modelled as an ergodic, finite-state Markov chain under some non-restrictive assumptions. Sub-linear time complexity for some settings of parameters has been formulated and proved. Some properties of the algorithm are then characterised, and numerical examples illustrating some of its features are presented.
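For readers unfamiliar with the algorithm, a minimal sketch of Stochastic Diffusion Search on a toy string-matching task follows. It illustrates only the test and diffusion phases; the Markov-chain analysis and complexity results of the paper are not reproduced.

```python
# Minimal sketch of Stochastic Diffusion Search on a toy string-matching task.
import random

def sds(search_space, model, n_agents=100, n_iter=200, seed=1):
    random.seed(seed)
    n_pos = len(search_space) - len(model) + 1
    positions = [random.randrange(n_pos) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(n_iter):
        # Test phase: each agent checks one randomly chosen micro-feature.
        for i, pos in enumerate(positions):
            j = random.randrange(len(model))
            active[i] = search_space[pos + j] == model[j]
        # Diffusion phase: an inactive agent polls a random agent and copies its
        # hypothesis if that agent is active, otherwise re-samples at random.
        for i in range(n_agents):
            if not active[i]:
                k = random.randrange(n_agents)
                positions[i] = positions[k] if active[k] else random.randrange(n_pos)
    return max(set(positions), key=positions.count)  # largest cluster = best hypothesis

space = "xxxxxxxpatternxxxxxxxxxxxxxxx"
print(sds(space, "pattern"))  # expected to converge near index 7
```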
Abstract:
Gamow's explanation of the exponential decay law uses complex 'eigenvalues' and exponentially growing 'eigenfunctions'. This raises the question, how Gamow's description fits into the quantum mechanical description of nature, which is based on real eigenvalues and square integrable wavefunctions. Observing that the time evolution of any wavefunction is given by its expansion in generalized eigenfunctions, we shall answer this question in the most straightforward manner, which at the same time is accessible to graduate students and specialists. Moreover, the presentation can well be used in physics lectures to students.
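A hedged sketch of the argument, in generic scattering-theory notation (the symbols below are standard, not necessarily those of the paper): the time evolution of a state follows from its expansion in generalized eigenfunctions u_E with real energies E,

\[ \psi(x,t) = \int dE \, e^{-iEt/\hbar} \, \langle u_E | \psi(0) \rangle \, u_E(x), \]

so the survival amplitude is \( A(t) = \langle \psi(0) | \psi(t) \rangle = \int dE \, e^{-iEt/\hbar} \, |\langle u_E | \psi(0) \rangle|^2 \). If the spectral density \( |\langle u_E | \psi(0) \rangle|^2 \) is well approximated by a Breit-Wigner (Lorentzian) of width \( \Gamma \) centred at \( E_R \), then for intermediate times

\[ |A(t)|^2 \approx e^{-\Gamma t/\hbar}, \]

recovering Gamow's exponential law, with the complex value \( E_R - i\Gamma/2 \) appearing as a resonance pole rather than as a genuine eigenvalue of the self-adjoint Hamiltonian.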
Abstract:
Simulations of 15 coupled chemistry climate models, for the period 1960–2100, are presented. The models include a detailed stratosphere as well as a realistic representation of the tropospheric climate. The simulations assume a consistent set of changing greenhouse gas concentrations, together with temporally varying chlorofluorocarbon concentrations in accordance with observations for the past and expectations for the future. The ozone results are analyzed using a nonparametric additive statistical model. Comparisons are made with observations for the recent past, and the recovery of ozone, indicated by a return to 1960 and 1980 values, is investigated as a function of latitude. Although chlorine amounts are simulated to return to 1980 values by about 2050, with only weak latitudinal variations, column ozone amounts recover at different rates due to the influence of greenhouse gas changes. In the tropics, simulated peak ozone amounts occur by about 2050 and thereafter total column ozone declines. Consequently, simulated ozone does not recover to the values which existed prior to the early 1980s. The results also show a distinct hemispheric asymmetry, with recovery to 1980 values in the Northern Hemisphere extratropics ahead of the chlorine return by about 20 years. In the Southern Hemisphere midlatitudes, ozone is simulated to return to 1980 levels only 10 years ahead of chlorine. In the Antarctic, annually averaged ozone recovers at about the same rate as chlorine in high latitudes and hence does not return to 1960s values until the last decade of the simulations.
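The ozone attribution above relies on a nonparametric additive statistical model. As a rough illustration of that class of model, here is a minimal backfitting sketch with LOWESS smoothers on simulated data; the predictor names (a chlorine-loading term and a greenhouse-gas index) and the data are hypothetical stand-ins, not the model output used in the study.

```python
# Minimal sketch: nonparametric additive model fitted by backfitting with LOWESS.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
n = 400
eesc = rng.uniform(0, 1, n)      # chlorine-loading term (scaled, hypothetical)
ghg = rng.uniform(0, 1, n)       # greenhouse-gas index (scaled, hypothetical)
ozone = -30 * eesc + 10 * ghg**2 + rng.normal(0, 2, n)

def backfit(y, xs, n_sweeps=20, frac=0.4):
    """Fit y = alpha + sum_j f_j(x_j) + noise with smooth, zero-mean f_j."""
    alpha = y.mean()
    fits = [np.zeros_like(y) for _ in xs]
    for _ in range(n_sweeps):
        for j, x in enumerate(xs):
            partial = y - alpha - sum(f for k, f in enumerate(fits) if k != j)
            smoothed = lowess(partial, x, frac=frac, return_sorted=False)
            fits[j] = smoothed - smoothed.mean()   # centre each component
    return alpha, fits

alpha, (f_eesc, f_ghg) = backfit(ozone, [eesc, ghg])
print("intercept:", round(alpha, 2))
```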
Abstract:
This chapter applies rigorous statistical analysis to existing datasets of medieval exchange rates quoted in merchants’ letters sent from Barcelona, Bruges and Venice between 1380 and 1410, which survive in the archive of Francesco di Marco Datini of Prato. First, it tests the exchange rates for stationarity. Second, it uses regression analysis to examine the seasonality of exchange rates at the three financial centres and compares them against contemporary descriptions by the merchant Giovanni di Antonio da Uzzano. Third, it tests for structural breaks in the exchange rate series.
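A minimal sketch of the three steps (stationarity test, seasonality regression, structural-break test) on a simulated monthly exchange-rate series follows. The specific test choices here (augmented Dickey-Fuller, monthly dummies, CUSUM of OLS residuals) are plausible stand-ins and not necessarily those used in the chapter.

```python
# Minimal sketch: stationarity, seasonality and structural-break tests on a
# simulated exchange-rate series (not the Datini data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller
from statsmodels.stats.diagnostic import breaks_cusumolsresid

rng = np.random.default_rng(0)
n = 240                                     # e.g. 20 years of monthly quotations
month = np.tile(np.arange(1, 13), n // 12)  # calendar month of each observation
rate = 10 + 0.3 * np.sin(2 * np.pi * month / 12) + rng.normal(0, 0.2, n)

# 1. Stationarity (augmented Dickey-Fuller).
adf_stat, pvalue = adfuller(rate)[:2]
print("ADF p-value:", round(pvalue, 3))

# 2. Seasonality: regress the rate on monthly dummies.
dummies = pd.get_dummies(pd.Categorical(month), drop_first=True).astype(float)
seasonal = sm.OLS(rate, sm.add_constant(dummies)).fit()
print("joint seasonal F-test p-value:", round(seasonal.f_pvalue, 4))

# 3. Structural breaks: CUSUM test on the OLS residuals.
cusum_stat, cusum_p, _ = breaks_cusumolsresid(seasonal.resid)
print("CUSUM p-value:", round(cusum_p, 3))
```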
Abstract:
Purpose: This clinical study aimed to evaluate the initial, 4-month, and 1-year stability of immediately loaded dental implants inserted according to a protocol of mandibular rehabilitation with prefabricated bars. Materials and Methods: The sample comprised 11 edentulous patients. In each patient, 4 interforaminal implants were inserted. Resonance frequency analysis (RFA) was registered for each implant immediately after installation, and again after 4 months and 1 year with the prosthetic bar removed, as it is a screwed system. Results: The clinical implant survival rate was 100%. The RFA showed an increase in stability from 64.09 +/- 6.48 initially to 64.31 +/- 4.96 after 4 months and 67.11 +/- 4.37 after 1 year. The analysis of variance showed a statistically significant difference (P = 0.015) among implant stability quotient values for the different periods evaluated. Tukey test results showed statistically significant differences between the 1-year results and the earlier periods, but no statistically significant difference between the initial and 4-month results (P > 0.05). Conclusion: These preliminary 1-year results indicate that immediate loading of mandibular dental implants using the studied prefabricated-bar protocol is a reliable treatment, in accordance with the results described in the literature for other similar techniques. (Implant Dent 2009;18:530-538)
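For readers who want to reproduce this style of analysis, here is a minimal sketch of a one-way ANOVA with Tukey's HSD across the three time points. The ISQ values are simulated around the reported means; the real design is repeated measures on the same implants, which this simplified sketch ignores.

```python
# Minimal sketch: one-way ANOVA and Tukey HSD on simulated ISQ values.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
baseline = rng.normal(64.09, 6.48, 44)   # 11 patients x 4 implants (simulated)
month4 = rng.normal(64.31, 4.96, 44)
year1 = rng.normal(67.11, 4.37, 44)

f_stat, p_value = stats.f_oneway(baseline, month4, year1)
print("ANOVA p =", round(p_value, 3))

isq = np.concatenate([baseline, month4, year1])
period = ["baseline"] * 44 + ["4 months"] * 44 + ["1 year"] * 44
print(pairwise_tukeyhsd(isq, period))    # pairwise comparisons between periods
```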
Abstract:
Flickering is a phenomenon related to mass accretion observed among many classes of astrophysical objects. In this paper we present a study of flickering in the emission lines and the continuum of the cataclysmic variable V3885 Sgr. The flickering behavior was first analyzed through statistical analysis and the power spectra of lightcurves. Autocorrelation techniques were then employed to estimate the flickering timescale of flares. A cross-correlation study between the line and its underlying continuum variability is presented. The cross-correlation between the photometric and spectroscopic data is also discussed. Periodograms, calculated using emission-line data, show a behavior similar to that obtained from photometric datasets found in the literature, with a plateau at lower frequencies and a power law at higher frequencies. The power-law index is consistent with stochastic events. The cross-correlation study indicates the presence of a correlation between the variability in Hα and its underlying continuum. Flickering timescales derived from the photometric data were estimated to be 25 min for two lightcurves and 10 min for one of them. The average timescale of the line flickering is 40 min, while for its underlying continuum it drops to 20 min.
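As a rough illustration of the periodogram/autocorrelation approach, here is a minimal sketch on a synthetic flicker-like light curve. Nothing below uses the V3885 Sgr data; the cadence and the zero-crossing timescale estimator are assumptions for the example.

```python
# Minimal sketch: power spectrum and autocorrelation of a synthetic light curve.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
dt = 5.0                                          # seconds per exposure (assumed)
flux = np.cumsum(rng.normal(0, 1, 4000))          # random-walk (flicker-like) signal
flux -= flux.mean()

# Power spectrum: expect a plateau at low frequencies and a power law above it.
freq, power = periodogram(flux, fs=1.0 / dt)

# Autocorrelation, used here as a crude estimator of the flickering timescale.
acf = np.correlate(flux, flux, mode="full")[flux.size - 1:]
acf /= acf[0]
first_zero = int(np.argmax(acf < 0))              # first zero crossing (in samples)
print("flickering timescale ~", round(first_zero * dt / 60.0, 1), "min")
```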
Abstract:
This work combines the postulates of chaos theory with the classification and predictive capability of artificial neural networks in the field of financial time series prediction. Chaos theory provides valuable qualitative and quantitative tools for deciding on the predictability of a chaotic system. Quantitative measurements based on chaos theory are used to decide a priori whether a time series, or a portion of a time series, is predictable, while qualitative tools based on chaos theory provide further observations and analysis of predictability in cases where the measurements give negative answers. Phase space reconstruction is achieved by time-delay embedding, resulting in multiple embedded vectors. The cognitive approach suggested is inspired by the ability of some chartists to predict the direction of an index by looking at the price time series. Thus, in this work, the calculation of the embedding dimension and the separation in Takens' embedding theorem for phase space reconstruction is not limited to False Nearest Neighbours, differential entropy or any other specific method; rather, all embedding dimensions and separations are considered as different ways of looking at a time series by different chartists, based on their expectations. Prior to prediction, the embedded vectors of the phase space are classified with Fuzzy-ART; then, for each class, a back-propagation neural network is trained to predict the last element of each vector, with all previous elements of the vector used as features.
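A minimal sketch of the time-delay (Takens) embedding stage over a grid of embedding dimensions and separations follows, producing the embedded vectors that would then be classified and fed to the networks; the Fuzzy-ART classification and back-propagation prediction stages are not reproduced, and the price series is simulated.

```python
# Minimal sketch: time-delay embedding over a grid of (dimension, separation).
import numpy as np

def delay_embed(series, dim, tau):
    """Return the matrix of delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 1000))        # hypothetical price series

# Instead of fixing dim/tau by False Nearest Neighbours or differential entropy,
# keep a whole grid of (dim, tau) "views", as the chartist analogy suggests.
views = {(dim, tau): delay_embed(prices, dim, tau)
         for dim in range(2, 7) for tau in range(1, 6)}

X = views[(3, 2)]
features, target = X[:, :-1], X[:, -1]            # predict the last element of each vector
print(features.shape, target.shape)
```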
Abstract:
This paper empirically analyzes the effect of crude oil price changes on the economic growth of the Indian subcontinent (India, Pakistan and Bangladesh). We use a multivariate vector autoregressive (VAR) analysis followed by a Wald Granger causality test and impulse response functions (IRF). The Granger causality results show that only India’s economic growth is significantly affected when the crude oil price decreases. The impact of a crude oil price increase is insignificantly negative for all three countries during the first year. In the second year, the impact is negative but smaller than in the first year for India, negative but larger for Bangladesh, and positive for Pakistan.
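A minimal sketch of the VAR / Granger-causality / impulse-response pipeline follows, run on simulated data standing in for oil-price changes and GDP growth; the lag order and variable definitions are illustrative, not the paper's specification.

```python
# Minimal sketch: VAR estimation, Wald Granger-causality test and impulse responses.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
T = 60                                                    # annual observations (simulated)
oil = rng.normal(0, 10, T)                                # oil-price change (%)
gdp = 3 + 0.05 * np.roll(oil, 1) + rng.normal(0, 1, T)    # GDP growth (%)
data = pd.DataFrame({"oil": oil, "gdp": gdp}).iloc[1:]    # drop the wrapped first row

res = VAR(data).fit(2)

# Wald-type Granger causality test: does oil help predict gdp?
print(res.test_causality("gdp", ["oil"], kind="wald").summary())

# Impulse responses of GDP growth to an oil-price shock over 5 years.
irf = res.irf(5)
print(irf.irfs[:, data.columns.get_loc("gdp"), data.columns.get_loc("oil")])
```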
Abstract:
This study investigates the relation between foreign direct investment (FDI) and per capita gross domestic product (GDP) in Pakistan. The study is based on a basic Cobb-Douglas production function. The population aged 15 to 64 is used as a proxy for labor in the investigation. The other variables used are gross capital formation, the technological gap and a dummy variable measuring, among other things, political stability. We find a positive correlation between GDP per capita in Pakistan and two variables, FDI and the population aged 15 to 64. The GDP gap (the gap between the GDP of the USA and the GDP of Pakistan) is negatively correlated with GDP per capita, as expected. Political instability, economic crises, wars and polarization in society have no significant impact on GDP per capita in the long run.
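As an illustration of estimating a Cobb-Douglas relationship by OLS in logs, here is a minimal sketch; the column names, the data-generating process and the exact specification are hypothetical, mirroring the abstract only loosely.

```python
# Minimal sketch: log-linear Cobb-Douglas regression on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 40                                            # annual observations (simulated)
df = pd.DataFrame({
    "fdi": rng.lognormal(3, 0.5, n),              # foreign direct investment
    "labor": rng.lognormal(10, 0.1, n),           # population aged 15-64 (labor proxy)
    "capital": rng.lognormal(5, 0.3, n),          # gross capital formation
    "gdp_gap": rng.lognormal(2, 0.2, n),          # GDP gap vs the USA
    "instability": rng.integers(0, 2, n),         # dummy: political instability etc.
})
df["gdp_pc"] = (df.fdi**0.2 * df.labor**0.5 * df.capital**0.3
                * df.gdp_gap**-0.1 * np.exp(rng.normal(0, 0.05, n)))

model = smf.ols("np.log(gdp_pc) ~ np.log(fdi) + np.log(labor) + np.log(capital)"
                " + np.log(gdp_gap) + instability", data=df).fit()
print(model.params)
```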
Abstract:
We study the use of para-orthogonal polynomials in solving the frequency analysis problem. Through a transformation due to Delsarte and Genin, we present an approach to frequency analysis that uses the zeros and Christoffel numbers of polynomials orthogonal on the real line. This leads to a simple and fast algorithm for the estimation of frequencies. We also provide a new method, faster than the Levinson algorithm, for the determination of the reflection coefficients of the corresponding real Szegő polynomials from the given moments.
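For context, here is a minimal sketch of the classical Levinson-Durbin recursion, which maps the given moments (autocorrelations) to reflection coefficients; this is the baseline the paper's faster method is compared against, and the para-orthogonal polynomial and Christoffel-number machinery itself is not reproduced here.

```python
# Minimal sketch: Levinson-Durbin recursion for reflection coefficients.
import numpy as np

def levinson_reflection_coeffs(r):
    """r[0..n]: autocorrelation moments; returns reflection coefficients k[1..n]."""
    a = np.zeros(len(r))
    a[0] = 1.0
    err = r[0]
    ks = []
    for m in range(1, len(r)):
        acc = sum(a[j] * r[m - j] for j in range(m))
        k = -acc / err
        ks.append(k)
        # Update the prediction polynomial a(z) with the new reflection coefficient.
        a[:m + 1] = a[:m + 1] + k * a[:m + 1][::-1]
        err *= (1.0 - k * k)
    return np.array(ks)

# Moments of a signal with two sinusoidal components (frequency-analysis setting).
lags = np.arange(0, 8)
r = 2 * np.cos(2 * np.pi * 0.1 * lags) + np.cos(2 * np.pi * 0.27 * lags)
r[0] += 0.5   # white-noise floor keeps the moment sequence positive definite
print(levinson_reflection_coeffs(r))
```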
Abstract:
The recently introduced minority game (MG) model provides promising insights into the evolution of prices, indices and rates in financial markets. In this paper we perform a time series analysis of the model, employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are drawn about the generating mechanism of this kind of evolution. The motion is deterministic, driven by occasional random external perturbations. When the interval between two successive perturbations is sufficiently large, low-dimensional chaos can be found in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the S&P 500 index: stochastic, nonlinear and (unit-root) stationary.
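A minimal sketch of the standard minority game follows, generating the kind of attendance series whose statistics the paper analyzes; the parameter values are illustrative, and the statistical and nonlinear diagnostics themselves are not reproduced here.

```python
# Minimal sketch: the standard minority game.
import numpy as np

def minority_game(n_agents=301, memory=5, n_strategies=2, n_steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    n_hist = 2 ** memory
    # Each agent holds fixed random strategies: history index -> action (+1 / -1).
    strategies = rng.choice([-1, 1], size=(n_agents, n_strategies, n_hist))
    scores = np.zeros((n_agents, n_strategies))
    history = rng.integers(0, n_hist)             # encoded recent win/loss string
    attendance = []
    for _ in range(n_steps):
        best = scores.argmax(axis=1)              # each agent plays its best strategy
        actions = strategies[np.arange(n_agents), best, history]
        a_t = actions.sum()
        attendance.append(a_t)
        winner = -np.sign(a_t)                    # minority side wins
        # Virtual payoff: reward every strategy that would have chosen the minority.
        scores += (strategies[:, :, history] == winner)
        history = ((history << 1) | (1 if winner > 0 else 0)) % n_hist
    return np.array(attendance)

series = minority_game()
print(series[:10], series.std())
```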
Abstract:
It is well known that interstitial elements present in solid solution in metals interact with the matrix through a relaxation process known as stress-induced ordering. Traditionally, this relaxation process is observed in internal friction measurements. It is common practice for researchers to present the frequency results together with the internal friction without giving any analysis. In this work we apply an expression that relates the variation of the frequency with temperature and analyse experimental results reported in the literature for the relaxation process due to the stress-induced ordering of oxygen and nitrogen in niobium and tantalum.
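For context, the standard anelastic (Debye) relaxation expressions usually invoked for stress-induced ordering are sketched below; the specific expression applied by the authors may differ. For a single relaxation of strength \( \Delta \) with thermally activated relaxation time \( \tau(T) = \tau_0 \exp(H/kT) \), the internal friction and the squared resonance frequency of the sample vary as

\[ Q^{-1}(\omega, T) = \Delta \, \frac{\omega\tau}{1 + (\omega\tau)^2}, \qquad f^2(T) \approx f_U^2 \left[ 1 - \frac{\Delta}{1 + (\omega\tau)^2} \right], \]

so the frequency shows a step (the modulus defect) over the same temperature range in which the internal friction exhibits its relaxation peak at \( \omega\tau \approx 1 \).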
Abstract:
Masticatory muscle contraction causes both jaw movement and tissue deformation during function. Natural chewing data from 25 adult miniature pigs were studied by means of time series analysis. The data set included simultaneous recordings of electromyography (EMG) from bilateral masseter (MA), zygomaticomandibularis (ZM) and lateral pterygoid muscles, bone surface strains from the left squamosal bone (SQ), condylar neck (CD) and mandibular corpus (MD), and linear deformation of the capsule of the jaw joint measured bilaterally using differential variable reluctance transducers. Pairwise comparisons were examined by calculating the cross-correlation functions. Jaw-adductor muscle activity of MA and ZM was found to be highly cross-correlated with CD and SQ strains and weakly with MD strain. No muscle’s activity was strongly linked to capsular deformation of the jaw joint, nor were bone strains and capsular deformation tightly linked. Homologous muscle pairs showed the greatest synchronization of signals, but the signals themselves were not significantly more correlated than those of non-homologous muscle pairs. These results suggested that bone strains and capsular deformation are driven by different mechanical regimes. Muscle contraction and ensuing reaction forces are probably responsible for bone strains, whereas capsular deformation is more likely a product of movement.
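As an illustration of the pairwise cross-correlation analysis described above, here is a minimal sketch applied to two synthetic signals standing in for rectified EMG and bone strain; the sampling rate, the imposed lag and the signal shapes are assumptions, not the chewing recordings.

```python
# Minimal sketch: normalized cross-correlation between two synthetic signals.
import numpy as np

def cross_correlation(x, y, max_lag):
    """Normalized cross-correlation of x and y for lags -max_lag..max_lag."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    lags = np.arange(-max_lag, max_lag + 1)
    cc = [np.dot(x[max(0, -L):n - max(0, L)], y[max(0, L):n - max(0, -L)]) / n
          for L in lags]
    return lags, np.array(cc)

rng = np.random.default_rng(0)
fs = 250                                           # Hz, hypothetical sampling rate
t = np.arange(0, 20, 1 / fs)
emg = np.abs(np.sin(2 * np.pi * 1.5 * t)) + 0.2 * rng.normal(size=t.size)   # rectified EMG
strain = np.roll(emg, 25) + 0.2 * rng.normal(size=t.size)                   # strain lags EMG by 0.1 s

lags, cc = cross_correlation(emg, strain, max_lag=fs)
peak = lags[np.argmax(cc)]
print("peak correlation at lag", peak / fs, "s")
```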