944 results for Time Series Model
Abstract:
African societies are dependent on rainfall for agricultural and other water-dependent activities, yet rainfall is extremely variable in both space and time, and recurring water shocks, such as drought, can have considerable social and economic impacts. To help improve our knowledge of the rainfall climate, we have constructed a 30-year (1983–2012), temporally consistent rainfall dataset for Africa known as TARCAT (TAMSAT African Rainfall Climatology And Time-series) using archived Meteosat thermal infra-red (TIR) imagery, calibrated against rain gauge records collated from numerous African agencies. TARCAT has been produced at 10-day (dekad) scale at a spatial resolution of 0.0375°. An intercomparison of TARCAT from 1983 to 2010 with six long-term precipitation datasets indicates that TARCAT replicates the spatial and seasonal rainfall patterns and interannual variability well, with correlation coefficients of 0.85 and 0.70, respectively, with the Climate Research Unit (CRU) and Global Precipitation Climatology Centre (GPCC) gridded-gauge analyses for the interannual variability of the Africa-wide mean monthly rainfall. The design of the algorithm for drought monitoring leads TARCAT to underestimate the Africa-wide mean annual rainfall on average by −0.37 mm day⁻¹ (21%) compared to other datasets. As the TARCAT rainfall estimates are historically calibrated across large, climatically homogeneous regions, the data can provide users with robust estimates of climate-related risk, even in regions where gauge records are inconsistent in time.
Abstract:
Factor forecasting models are shown to deliver real-time gains over autoregressive models for US real activity variables during the recent period, but are less successful for nominal variables. The gains are largely due to the Financial Crisis period, and are primarily at the shortest (one quarter ahead) horizon. Excluding the pre-Great Moderation years from the factor forecasting model estimation period (but not from the data used to extract factors) results in a marked fillip in factor model forecast accuracy, but does the same for the AR model forecasts. The relative performance of the factor models compared to the AR models is largely unaffected by whether the exercise is in real time or is pseudo out-of-sample.
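As a rough illustration of the factor-extraction step behind such forecasting models, the sketch below pulls the first principal component out of a simulated panel of series and checks it against the known common factor. All data are simulated for illustration, not taken from the paper; a factor forecasting model would then regress the target variable on lags of the extracted factor.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 30

# One common factor drives a panel of N series (all numbers hypothetical)
factor = rng.standard_normal(T)
loadings = rng.uniform(0.5, 1.5, N)
X = np.outer(factor, loadings) + 0.5 * rng.standard_normal((T, N))

# Standardize each series and take the first principal component via SVD
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
f_hat = U[:, 0] * S[0]          # estimated factor, up to sign and scale

corr = abs(np.corrcoef(f_hat, factor)[0, 1])
```

With a strong common component, the first principal component tracks the true factor almost perfectly; the sign indeterminacy is why the correlation is taken in absolute value.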
Abstract:
Empirical Mode Decomposition is presented as an alternative to traditional analysis methods for decomposing geomagnetic time series into spectral components. Important comments on the algorithm and its variations will be given. Using this technique, planetary wave modes of 5-, 10-, and 16-day mean periods can be extracted from the magnetic field components of three different stations in Germany. In a second step, the amplitude modulation functions of these wave modes can be shown to contain a significant contribution from solar cycle variation through correlation with smoothed sunspot numbers. Additionally, the data indicate connections with geomagnetic jerk occurrences, supported by a second dataset providing a reconstructed near-Earth magnetic field for 150 years. Since jerks are usually attributed to internal dynamo processes within the Earth's outer core, the question of which system is impacting which will be briefly discussed here.
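The amplitude-modulation step described above can be sketched with a simpler tool than full EMD: the Hilbert-transform envelope of a narrow-band mode recovers its slow modulation function, which can then be correlated with an external index such as sunspot number. The 10-day carrier and the slow envelope below are hypothetical stand-ins, not the paper's data.

```python
import numpy as np
from scipy.signal import hilbert

# Hypothetical 10-day wave mode whose amplitude is slowly modulated,
# mimicking solar-cycle modulation of a planetary-wave component
t = np.arange(0.0, 3650.0)                                 # days, ~10 years
modulation = 1.0 + 0.5 * np.sin(2 * np.pi * t / 3650.0)    # slow envelope
carrier = np.sin(2 * np.pi * t / 10.0)                     # 10-day wave
signal = modulation * carrier

# The analytic-signal magnitude recovers the amplitude-modulation function
envelope = np.abs(hilbert(signal))

corr = np.corrcoef(envelope, modulation)[0, 1]
```

Because the carrier and envelope frequencies are well separated, the recovered envelope is essentially exact away from the series endpoints.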
Abstract:
The Arctic is an important region in the study of climate change, but monitoring surface temperatures in this region is challenging, particularly in areas covered by sea ice. Here in situ, satellite and reanalysis data were utilised to investigate whether global warming over recent decades could be better estimated by changing the way the Arctic is treated in calculating global mean temperature. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic surface air temperature (SAT) anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to result in smaller errors than non-interpolating techniques. Kriging techniques provided the smallest errors in anomaly estimates. Similar accuracies were found for anomalies estimated from in situ meteorological station SAT records using a kriging technique. Whether additional data sources, which are not currently utilised in temperature anomaly datasets, would improve estimates of Arctic SAT anomalies was investigated within the reanalysis testbed and using in situ data. For the reanalysis study, the additional input anomalies were reanalysis data sampled at certain supplementary data source locations over Arctic land and sea ice areas. For the in situ data study, the additional input anomalies over sea ice were surface temperature anomalies derived from the Advanced Very High Resolution Radiometer satellite instruments. The use of additional data sources, particularly those located in the Arctic Ocean over sea ice or on islands in sparsely observed regions, can lead to substantial improvements in the accuracy of estimated anomalies. Decreases in root mean square error can be up to 0.2 K for Arctic-average anomalies and more than 1 K for spatially resolved anomalies. Further improvements in accuracy may be accomplished through the use of other data sources.
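A minimal sketch of the interpolation idea behind kriging: predict an anomaly at a query point as a covariance-weighted combination of nearby station anomalies. The exponential covariance model, length scale, and station values below are illustrative assumptions, not the paper's configuration; note that an exact (zero-nugget) kriging predictor reproduces the observed value at a station location.

```python
import numpy as np

rng = np.random.default_rng(1)

def exp_cov(d, length=5.0):
    """Assumed exponential covariance model (illustrative, not the paper's)."""
    return np.exp(-d / length)

# Hypothetical station coordinates and temperature anomalies (toy values)
pts = rng.uniform(0.0, 20.0, size=(15, 2))
vals = np.sin(pts[:, 0] / 4.0) + 0.1 * rng.standard_normal(15)

def simple_krige(query, pts, vals):
    K = exp_cov(np.linalg.norm(pts[:, None] - pts[None], axis=2))
    k = exp_cov(np.linalg.norm(pts - query, axis=1))
    w = np.linalg.solve(K, k)       # kriging weights
    return w @ vals

pred_at_station = simple_krige(pts[0], pts, vals)    # exact at a data point
pred_offsite = simple_krige(np.array([10.0, 10.0]), pts, vals)
```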
Abstract:
Flickering is a phenomenon related to mass accretion observed in many classes of astrophysical objects. In this paper we present a study of the flickering of emission lines and the continuum of the cataclysmic variable V3885 Sgr. The flickering behavior was first analyzed through statistical analysis and the power spectra of lightcurves. Autocorrelation techniques were then employed to estimate the flickering timescale of flares. A cross-correlation study between the line and its underlying continuum variability is presented. The cross-correlation between the photometric and spectroscopic data is also discussed. Periodograms, calculated using emission-line data, show a behavior that is similar to those obtained from photometric datasets found in the literature, with a plateau at lower frequencies and a power law at higher frequencies. The power-law index is consistent with stochastic events. The cross-correlation study indicates the presence of a correlation between the variability in Hα and its underlying continuum. Flickering timescales derived from the photometric data were estimated to be 25 min for two lightcurves and 10 min for the third. The average timescale of the line flickering is 40 min, while for its underlying continuum it drops to 20 min.
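The autocorrelation-based timescale estimate mentioned above can be sketched as follows: compute the sample autocorrelation function of a lightcurve and read off the lag at which it first drops below 1/e. The "lightcurve" here is a simulated AR(1) process with a known decay rate, a hypothetical stand-in for stochastic flickering, not the V3885 Sgr data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "lightcurve": AR(1) noise whose autocorrelation decays as phi**lag
phi, n = 0.9, 20000
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

def acf(x, maxlag):
    """Sample autocorrelation at lags 0..maxlag."""
    x = x - x.mean()
    c0 = x @ x / len(x)
    return np.array([x[: len(x) - k] @ x[k:] / len(x) / c0
                     for k in range(maxlag + 1)])

r = acf(x, 100)
timescale = int(np.argmax(r < 1.0 / np.e))   # first lag where ACF < 1/e
```

For an AR(1) process the true crossing is at lag −1/ln(phi) ≈ 9.5, so the estimate should land near 10.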
Abstract:
Electrochemical systems are ideal workhorses for studying oscillatory dynamics. Experimentally obtained time series, however, are usually associated with a spontaneous drift in some uncontrollable parameter that triggers transitions among different oscillatory patterns, despite the fact that all controllable parameters are kept constant. Herein we present an empirical method to stabilize experimental potential time series. The method consists of applying a negative galvanodynamic sweep to compensate for the spontaneous drift, and it was tested on the oscillatory electro-oxidation of methanol on platinum. For a wide range of applied currents, the base system presents spontaneous transitions from quasi-harmonic to mixed-mode oscillations. Temporal patterns were stabilized by galvanodynamic sweeps at different rates. The procedure resulted in a considerable increase in the number of oscillatory cycles, by a factor of 5 to 20, depending on the specific temporal pattern. The spontaneous drift has been associated with uncompensated oscillations, in which the coverages of some adsorbed species are not reestablished after one cycle; i.e., there is a net accumulation and/or depletion of adsorbed species during oscillations. We interpret the rate of the galvanodynamic sweep in terms of the time scales of the poisoning processes that underlie the uncompensated oscillations and thus the spontaneous slow drift.
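The drift-compensation idea can be sketched numerically: fit a linear trend to an oscillatory potential series and subtract it, which is the data-analysis analogue of applying a galvanodynamic sweep of opposite sign during the experiment. The series, drift rate, and noise level below are toy values, not measurements from the methanol system.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy oscillatory potential series with a slow linear drift
t = np.linspace(0.0, 100.0, 2000)
drift_rate = 0.004                       # hypothetical V per unit time
u = (0.6 + drift_rate * t
     + 0.05 * np.sin(2 * np.pi * t)
     + 0.005 * rng.standard_normal(t.size))

# Estimate the drift with a linear fit; in the experiment a galvanodynamic
# sweep of opposite sign plays the compensating role
slope, intercept = np.polyfit(t, u, 1)
u_comp = u - slope * t                   # drift-compensated series

residual_slope = np.polyfit(t, u_comp, 1)[0]
```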
Abstract:
This work aims at combining the postulates of Chaos theory with the classification and predictive capability of Artificial Neural Networks in the field of financial time series prediction. Chaos theory provides valuable qualitative and quantitative tools for deciding on the predictability of a chaotic system. Quantitative measurements based on Chaos theory are used to decide a priori whether a time series, or a portion of a time series, is predictable, while qualitative tools based on Chaos theory are used to provide further observations and analysis on predictability in cases where the measurements give negative answers. Phase space reconstruction is achieved by time delay embedding, resulting in multiple embedded vectors. The cognitive approach suggested is inspired by the capability of some chartists to predict the direction of an index by looking at the price time series. Thus, in this work, the calculation of the embedding dimension and the separation in Takens' embedding theorem for phase space reconstruction is not limited to False Nearest Neighbor, Differential Entropy or any other specific method; rather, this work is interested in all embedding dimensions and separations, which are regarded as the different ways different chartists look at a time series, based on their expectations. Prior to the prediction, the embedded vectors of the phase space are classified with Fuzzy-ART; then, for each class, a back-propagation Neural Network is trained to predict the last element of each vector, with all previous elements of a vector used as features.
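The time-delay embedding step above has a compact form: each embedded vector stacks the series at lags 0, τ, ..., (m−1)τ. The sketch below (toy sinusoidal series; the choices m=3, τ=5 are arbitrary, matching the paper's idea of scanning many embeddings rather than fixing one) builds the full matrix of embedded vectors.

```python
import numpy as np

def delay_embed(x, m, tau):
    """Rows are delay vectors (x[t], x[t+tau], ..., x[t+(m-1)*tau])."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

# Toy price-like series; m and tau are free choices in this approach
x = np.sin(np.linspace(0.0, 20.0 * np.pi, 1000))
V = delay_embed(x, m=3, tau=5)
```

In the paper's scheme, rows of `V` would be classified (e.g. with Fuzzy-ART) and each class's network trained to predict the last column from the earlier ones.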
Abstract:
This paper empirically analyzes the effect of crude oil price changes on the economic growth of the Indian subcontinent (India, Pakistan and Bangladesh). We use a multivariate Vector Autoregressive analysis followed by the Wald Granger causality test and the Impulse Response Function (IRF). The Wald Granger causality test results show that only India's economic growth is significantly affected when the crude oil price decreases. The impact of a crude oil price increase is insignificantly negative for all three countries during the first year. In the second year, the impact is negative but smaller than in the first year for India, negative but larger for Bangladesh, and positive for Pakistan.
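The Granger-causality logic can be sketched with a one-lag bivariate example: compare the residual sum of squares of a model for growth with and without the lagged oil-price term, and form the F statistic for the restriction. The simulated series and coefficients are hypothetical, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate oil-price-like shocks x that feed into a growth-like series y
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.3 * y[t - 1] + 0.5 * x[t - 1] + rng.standard_normal()

# Restricted model: y_t on y_{t-1}; unrestricted model adds x_{t-1}
Y = y[1:]
R = np.column_stack([np.ones(n - 1), y[:-1]])
Uu = np.column_stack([R, x[:-1]])

rss_r = np.sum((Y - R @ np.linalg.lstsq(R, Y, rcond=None)[0]) ** 2)
rss_u = np.sum((Y - Uu @ np.linalg.lstsq(Uu, Y, rcond=None)[0]) ** 2)

# F statistic for excluding the single lag of x (1 restriction)
F = (rss_r - rss_u) / (rss_u / (len(Y) - Uu.shape[1]))
```

A large F rejects the null that oil prices do not Granger-cause growth; with the true lagged effect of 0.5 built in, the statistic comes out far above conventional critical values.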
Abstract:
This study aims to investigate the relation between foreign direct investment (FDI) and per capita gross domestic product (GDP) in Pakistan. The study is based on a basic Cobb-Douglas production function. The population aged 15 to 64 is used as a proxy for labor in the investigation. The other variables used are gross capital formation, the technological gap and a dummy variable measuring, among other things, political stability. We find a positive correlation between GDP per capita in Pakistan and two variables, FDI and the population aged 15 to 64. The GDP gap (the gap between the GDP of the USA and the GDP of Pakistan) is negatively correlated with GDP per capita, as expected. Political instability, economic crises, wars and polarization in society have no significant impact on GDP per capita in the long run.
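The Cobb-Douglas setup behind such a study estimates Y = A·K^α·L^β by OLS on logarithms. The sketch below recovers assumed elasticities from simulated data; the values α=0.35, β=0.65 and the noise level are hypothetical, not estimates from Pakistani data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated economy: Y = A * K**alpha * L**beta with multiplicative noise
n = 200
alpha, beta = 0.35, 0.65
K = np.exp(rng.normal(3.0, 0.5, n))     # capital (hypothetical units)
L = np.exp(rng.normal(4.0, 0.3, n))     # labor proxy (hypothetical units)
Y = 2.0 * K ** alpha * L ** beta * np.exp(0.05 * rng.standard_normal(n))

# OLS on logs: log Y = log A + alpha*log K + beta*log L + error
X = np.column_stack([np.ones(n), np.log(K), np.log(L)])
coef = np.linalg.lstsq(X, np.log(Y), rcond=None)[0]
alpha_hat, beta_hat = coef[1], coef[2]
```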
Abstract:
Using national accounts data for the revenue-GDP and expenditure-GDP ratios from 1947 to 1992, we examine two central issues in public finance. First, was the path of public debt sustainable during this period? Second, if debt is sustainable, how has the government historically balanced the budget after shocks to either revenues or expenditures? The results show that (i) the public deficit is stationary (bounded asymptotic variance), with the budget in Brazil being balanced almost entirely through changes in taxes, regardless of the cause of the initial imbalance; expenditures are weakly exogenous, but tax revenues are not; (ii) a rational Brazilian consumer can have behavior consistent with Ricardian Equivalence; (iii) seigniorage revenues are critical to restoring intertemporal budget equilibrium, since, when we exclude them from total revenues, debt is not sustainable in econometric tests.
Abstract:
The aim of this paper is to provide evidence on output convergence among the Mercosur countries and associates, using multivariate time-series tests. The methodology is based on a combination of tests and estimation procedures, both univariate and multivariate, applied to the differences in per capita real income. We use the definitions of time-series convergence proposed by Bernard & Durlauf and apply the unit root tests proposed by Abuaf & Jorion and Taylor & Sarno. In this same multivariate context, the Flôres, Preumont & Szafarz and Breuer, McNown & Wallace tests, which allow for correlations across the series without imposing a common speed of mean reversion, identify the countries that converge. Concerning the empirical results, there is evidence of long-run convergence or, at least, catching up, for the smaller countries, Bolivia, Paraguay, Peru and Uruguay, towards Brazil and, to some extent, Argentina. In contrast, the evidence on convergence for the larger countries is weaker, as they have followed different (or rather opposing) macroeconomic policy strategies. Thus the future of the whole area will critically depend on the ability of Brazil, Argentina and Chile to find some scope for more cooperative policy actions.
Abstract:
Initial endogenous growth models emphasized the importance of external effects and increasing returns in explaining growth. Empirically, this hypothesis can be confirmed if the coefficient of physical capital per hour is unity in the aggregate production function. Previous estimates using time series data rejected this hypothesis, although cross-country estimates did not. The problem lies with the techniques employed, which are unable to capture low-frequency movements of high-frequency data. Using cointegration, new time series evidence confirms the theory and conforms to cross-country evidence. The implied Solow residual, which takes into account external effects to aggregate capital, has its behavior analyzed. The hypothesis that it is explained by government expenditures on infrastructure is confirmed. This suggests a supply-side role for government affecting productivity.
Abstract:
While it is recognized that output fluctuations are highly persistent over a certain range, less persistent results are also found at very long horizons (Cochrane, 1988), indicating the existence of local or temporary persistency. In this paper, we study time series with local persistency. A test for stationarity against a locally persistent alternative is proposed. Asymptotic distributions of the test statistic are provided under both the null and the alternative hypothesis of local persistency. A Monte Carlo experiment is conducted to study the power and size of the test. An empirical application reveals that many US real economic variables may exhibit local persistency.
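The stationarity-versus-persistence distinction at the heart of such tests can be sketched with a Dickey-Fuller-style regression: the slope of Δy_t on y_{t−1} is near zero for a unit-root process and clearly negative for a mean-reverting one. This is a generic illustration of the idea, not the paper's proposed test for local persistency; all series are simulated.

```python
import numpy as np

rng = np.random.default_rng(6)

def df_coef(y):
    """Slope of Delta y_t on y_{t-1}: near 0 for a unit root, negative if mean-reverting."""
    dy = np.diff(y)
    ylag = y[:-1]
    X = np.column_stack([np.ones(len(ylag)), ylag])
    return np.linalg.lstsq(X, dy, rcond=None)[0][1]

n = 2000
e = rng.standard_normal(n)
stationary = np.zeros(n)
for t in range(1, n):
    stationary[t] = 0.5 * stationary[t - 1] + e[t]
random_walk = np.cumsum(rng.standard_normal(n))

rho_s = df_coef(stationary)     # expect about 0.5 - 1 = -0.5
rho_rw = df_coef(random_walk)   # expect close to 0
```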
Abstract:
Chambers (1998) explores the interaction between long memory and aggregation. For continuous-time processes, he takes the aliasing effect into account when studying temporal aggregation. For discrete-time processes, however, he appears not to do so. This note gives the spectral density function of temporally aggregated long memory discrete-time processes in light of the aliasing effect. The results are different from those in Chambers (1998) and are supported by a small simulation exercise. As a result, the order of integration may not be invariant to temporal aggregation, specifically if d is negative and the aggregation is of the stock type.
Abstract:
Using national accounts data for the revenue-GDP and expenditure-GDP ratios from 1947 to 1992, we examine three central issues in public finance. First, was the path of public debt sustainable during this period? Second, if debt is sustainable, how has the government historically balanced the budget after shocks to either revenues or expenditures? Third, are expenditures exogenous? The results show that (i) public deficit is stationary (bounded asymptotic variance), with the budget in Brazil being balanced almost entirely through changes in taxes, regardless of the cause of the initial imbalance. Expenditures are weakly exogenous, but tax revenues are not; (ii) the behavior of a rational Brazilian consumer may be consistent with Ricardian Equivalence; (iii) seigniorage revenues are critical to restore intertemporal budget equilibrium, since, when we exclude them from total revenues, debt is not sustainable in econometric tests.