954 results for Nonlinear time series analysis


Relevance: 100.00%

Abstract:

This paper examines the cyclical regularities of macroeconomic, financial and property market aggregates in relation to the property stock price cycle in the UK. The Hodrick–Prescott filter is employed to fit a long-term trend to the raw data and to derive the short-term cycles of each series. It is found that the cycles of consumer expenditure, total consumption per capita, the dividend yield and the long-term bond yield are moderately correlated, and mainly coincident, with the property price cycle. There is also evidence that the nominal and real Treasury Bill rates and the interest rate spread lead this cycle by one or two quarters, and therefore that these series can be considered leading indicators of property stock prices. The study concludes that macroeconomic and financial variables can provide useful information to explain, and potentially to forecast, movements of property-backed stock returns in the UK.
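As a hedged illustration of the cycle-extraction step (not the authors' code; the toy series, the lamb smoothing parameter and the lead/lag window are assumptions), the Hodrick–Prescott decomposition is available in Python's statsmodels:

# Sketch: derive short-term cycles with the Hodrick-Prescott filter and
# cross-correlate them at leads/lags to look for leading-indicator behaviour.
import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(0)
idx = pd.period_range("1980Q1", periods=120, freq="Q")
prop = pd.Series(100 + np.cumsum(rng.standard_normal(120)), index=idx)   # toy property price index
tbill = pd.Series(np.cumsum(rng.standard_normal(120)), index=idx)        # toy Treasury Bill rate

prop_cycle, _ = hpfilter(prop, lamb=1600)     # lamb=1600 is conventional for quarterly data
tbill_cycle, _ = hpfilter(tbill, lamb=1600)

for lag in range(-2, 3):                      # negative lag: the rate leads the property cycle
    print(lag, round(prop_cycle.corr(tbill_cycle.shift(lag)), 2))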

Relevance: 100.00%

Abstract:

Many key economic and financial series are bounded either by construction or through policy controls. Conventional unit root tests are potentially unreliable in the presence of bounds, since they tend to over-reject the null hypothesis of a unit root, even asymptotically. So far, very little work has been undertaken to develop unit root tests which can be applied to bounded time series. In this paper we address this gap in the literature by proposing unit root tests which are valid in the presence of bounds. We present new augmented Dickey–Fuller type tests as well as new versions of the modified 'M' tests developed by Ng and Perron [Ng, S., Perron, P., 2001. Lag length selection and the construction of unit root tests with good size and power. Econometrica 69, 1519–1554] and demonstrate how these tests, combined with a simulation-based method to retrieve the relevant critical values, make it possible to control size asymptotically. A Monte Carlo study suggests that the proposed tests perform well in finite samples. Moreover, the tests outperform the Phillips–Perron type tests originally proposed in Cavaliere [Cavaliere, G., 2005. Limited time series with a unit root. Econometric Theory 21, 907–945]. An illustrative application to U.S. interest rate data is provided.
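A minimal sketch of the simulation-based critical-value idea (an assumption-laden illustration, not the authors' exact procedure: the bounds, the reflection scheme and the fixed lag are all placeholders):

# Sketch: empirical critical values for a Dickey-Fuller test under a bounded
# unit-root null, with bounds enforced by reflection (illustrative only).
import numpy as np
from statsmodels.tsa.stattools import adfuller

def bounded_random_walk(n, lower, upper, rng):
    x = np.empty(n)
    x[0] = 0.0
    for t in range(1, n):
        step = x[t - 1] + rng.standard_normal()
        if step > upper:              # reflect at the bounds to mimic a
            step = 2 * upper - step   # regulated (bounded) random walk
        elif step < lower:
            step = 2 * lower - step
        x[t] = step
    return x

rng = np.random.default_rng(0)
stats = [adfuller(bounded_random_walk(200, -10.0, 10.0, rng),
                  maxlag=0, regression="c", autolag=None)[0]
         for _ in range(2000)]
print(np.percentile(stats, 5))        # empirical 5% critical value under the null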

Relevance: 100.00%

Abstract:

Climate data are used in a number of applications, including climate risk management and adaptation to climate change. However, the availability of climate data, particularly throughout rural Africa, is very limited. Available weather stations are unevenly distributed and mainly located along main roads in cities and towns. This imposes severe limitations on the availability of climate information and services for the rural community where, arguably, these services are needed most. Weather station data also suffer from gaps in the time series. Satellite proxies, particularly satellite rainfall estimates, have been used as alternatives because of their availability even over remote parts of the world. However, satellite rainfall estimates also suffer from a number of critical shortcomings, including heterogeneous time series, short periods of observation, and poor accuracy, particularly at higher temporal and spatial resolutions. An attempt is made here to alleviate these problems by combining station measurements with the complete spatial coverage of satellite rainfall estimates. Rain gauge observations are merged with a locally calibrated version of the TAMSAT satellite rainfall estimates to produce over 30 years (1983 to date) of rainfall estimates over Ethiopia at a spatial resolution of 10 km and a ten-daily time scale. This involves quality control of the rain gauge data, generating a locally calibrated version of the TAMSAT rainfall estimates, and combining these with rain gauge observations from the national station network. The infrared-only satellite rainfall estimates produced using the relatively simple TAMSAT algorithm performed as well as, or better than, other satellite rainfall products that use passive microwave inputs and more sophisticated algorithms. There is no substantial difference between the gridded-gauge and combined gauge-satellite products over the test area in Ethiopia, which has a dense station network; however, the combined product exhibits better quality over parts of the country where stations are sparsely distributed.
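One common merging scheme can be sketched as follows (a hedged illustration, not the operational TAMSAT procedure: the function, the inverse-distance weighting and the toy numbers are assumptions):

# Sketch: adjust a satellite rainfall grid with an interpolated field of
# gauge-minus-satellite differences (inverse-distance weighting).
import numpy as np

def merge_gauge_satellite(sat_grid, grid_xy, gauge_xy, gauge_vals, sat_at_gauges, power=2.0):
    diffs = gauge_vals - sat_at_gauges                  # gauge-minus-satellite at stations
    d = np.linalg.norm(grid_xy[:, None, :] - gauge_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** power              # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)
    correction = w @ diffs                              # interpolated difference field
    return np.maximum(sat_grid + correction, 0.0)       # rainfall cannot be negative

# Toy usage: four grid cells, two gauges.
grid = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
gauges = np.array([[0.2, 0.1], [0.9, 0.8]], float)
print(merge_gauge_satellite(np.array([5.0, 6.0, 4.0, 7.0]), grid, gauges,
                            np.array([6.5, 5.0]), np.array([5.2, 6.8])))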

Relevance: 100.00%

Abstract:

African societies are dependent on rainfall for agricultural and other water-dependent activities, yet rainfall is extremely variable in both space and time, and recurring water shocks, such as drought, can have considerable social and economic impacts. To help improve our knowledge of the rainfall climate, we have constructed a 30-year (1983–2012), temporally consistent rainfall dataset for Africa known as TARCAT (TAMSAT African Rainfall Climatology And Time-series) using archived Meteosat thermal infra-red (TIR) imagery, calibrated against rain gauge records collated from numerous African agencies. TARCAT has been produced at 10-day (dekad) scale at a spatial resolution of 0.0375°. An intercomparison of TARCAT from 1983 to 2010 with six long-term precipitation datasets indicates that TARCAT replicates the spatial and seasonal rainfall patterns and interannual variability well, with correlation coefficients of 0.85 and 0.70 with the Climatic Research Unit (CRU) and Global Precipitation Climatology Centre (GPCC) gridded-gauge analyses respectively in the interannual variability of the Africa-wide mean monthly rainfall. The design of the algorithm for drought monitoring leads to TARCAT underestimating the Africa-wide mean annual rainfall on average by −0.37 mm day⁻¹ (21%) compared to other datasets. As the TARCAT rainfall estimates are historically calibrated across large climatically homogeneous regions, the data can provide users with robust estimates of climate-related risk, even in regions where gauge records are inconsistent in time.
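The correlation metric used in the intercomparison can be illustrated as below (a sketch with placeholder arrays standing in for the TARCAT and CRU area means; not the study's code):

# Sketch: interannual correlation between area-mean monthly rainfall series
# after removing the mean seasonal cycle.
import numpy as np

def anomalies(x):                          # x: monthly series, whole years
    clim = x.reshape(-1, 12).mean(axis=0)  # mean seasonal cycle
    return (x.reshape(-1, 12) - clim).ravel()

rng = np.random.default_rng(1)
months = 12 * 28                           # 1983-2010, monthly
tarcat = rng.gamma(2.0, 1.5, size=months)
cru = 0.85 * tarcat + rng.normal(0.0, 0.5, size=months)
r = np.corrcoef(anomalies(tarcat), anomalies(cru))[0, 1]
print(f"interannual correlation: {r:.2f}")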

Relevance: 100.00%

Abstract:

The Arctic is an important region in the study of climate change, but monitoring surface temperatures in this region is challenging, particularly in areas covered by sea ice. Here, in situ, satellite and reanalysis data were utilised to investigate whether global warming over recent decades could be better estimated by changing the way the Arctic is treated in calculating global mean temperature. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic surface air temperature (SAT) anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to result in smaller errors than non-interpolating techniques. Kriging techniques provided the smallest errors in anomaly estimates. Similar accuracies were found for anomalies estimated from in situ meteorological station SAT records using a kriging technique. Whether additional data sources, which are not currently utilised in temperature anomaly datasets, would improve estimates of Arctic SAT anomalies was investigated within the reanalysis testbed and using in situ data. For the reanalysis study, the additional input anomalies were reanalysis data sampled at certain supplementary data source locations over Arctic land and sea ice areas. For the in situ data study, the additional input anomalies over sea ice were surface temperature anomalies derived from the Advanced Very High Resolution Radiometer satellite instruments. The use of additional data sources, particularly those located in the Arctic Ocean over sea ice or on islands in sparsely observed regions, can lead to substantial improvements in the accuracy of estimated anomalies. Decreases in root mean square error can be up to 0.2 K for Arctic-average anomalies and more than 1 K for spatially resolved anomalies. Further improvements in accuracy may be accomplished through the use of other data sources.
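Kriging of sparse anomalies can be sketched with Gaussian-process regression, which has the same form as simple kriging (the kernel, station coordinates and anomaly values below are assumptions, not the study's configuration):

# Sketch: kriging-style interpolation of station temperature anomalies onto
# a regular grid via Gaussian-process regression.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
stations = rng.uniform(0, 10, size=(25, 2))              # toy station locations
anoms = np.sin(stations[:, 0]) + 0.1 * rng.standard_normal(25)

# RBF models spatial correlation; WhiteKernel absorbs measurement noise.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0) + WhiteKernel(0.01),
                              normalize_y=True)
gp.fit(stations, anoms)

gx, gy = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
pred, std = gp.predict(np.column_stack([gx.ravel(), gy.ravel()]), return_std=True)
print(pred.shape, std.mean())          # anomaly field and its uncertainty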

Relevance: 100.00%

Abstract:

This paper demonstrates by means of joint time-frequency analysis that the acoustic noise produced by the breaking of biscuits is dependent on relative humidity and water activity. It also shows that the time-frequency coefficients calculated using the adaptive Gabor transformation algorithm are dependent on the period of time a biscuit is exposed to humidity. This is a new methodology that can be used to assess the crispness of crisp foods.
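A simple stand-in for the time-frequency step (hedged: the short-time Fourier transform below is a plainer tool than the adaptive Gabor transform of the paper, and the signal is synthetic):

# Sketch: joint time-frequency view of a breaking sound via a spectrogram,
# summarized by the spectral centroid of each time slice.
import numpy as np
from scipy.signal import spectrogram

fs = 44_100                                # assumed audio sampling rate, Hz
t = np.arange(0, 0.5, 1 / fs)
snap = np.exp(-40 * t) * np.random.randn(t.size)   # toy decaying "snap" burst

f, tt, Sxx = spectrogram(snap, fs=fs, nperseg=512, noverlap=384)
centroid = (f[:, None] * Sxx).sum(axis=0) / Sxx.sum(axis=0)
print(centroid.mean())   # crisper samples concentrate energy at higher frequencies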

Relevance: 100.00%

Abstract:

Electrochemical systems are ideal workhorses for studying oscillatory dynamics. Experimentally obtained time series, however, are usually associated with a spontaneous drift in some uncontrollable parameter that triggers transitions among different oscillatory patterns, despite the fact that all controllable parameters are kept constant. Herein we present an empirical method to stabilize experimental potential time series. The method consists of applying a negative galvanodynamic sweep to compensate for the spontaneous drift and was tested for the oscillatory electro-oxidation of methanol on platinum. For a wide range of applied currents, the base system presents spontaneous transitions from quasi-harmonic to mixed-mode oscillations. Temporal patterns were stabilized by galvanodynamic sweeps at different rates. The procedure increased the number of oscillatory cycles by a factor of 5 to 20, depending on the specific temporal pattern. The spontaneous drift has been associated with uncompensated oscillations, in which the coverages of some adsorbed species are not reestablished after one cycle; i.e., there is a net accumulation and/or depletion of adsorbed species during oscillations. We interpreted the rate of the galvanodynamic sweep in terms of the time scales of the poisoning processes that underlie the uncompensated oscillations and thus the spontaneous slow drift.

Relevance: 100.00%

Abstract:

Objective: To develop a method for objective quantification of PD motor symptoms related to Off episodes and peak-dose dyskinesias, using spiral data gathered with a touch screen telemetry device. The aim was to objectively characterize predominant motor phenotypes (bradykinesia and dyskinesia), to help in automating the process of visual interpretation of movement anomalies in spirals as rated by movement disorder specialists.

Background: A retrospective analysis was conducted on recordings from 65 patients with advanced idiopathic PD from nine different clinics in Sweden, recruited from January 2006 until August 2010. In addition to the patient group, 10 healthy elderly subjects were recruited. Upper limb movement data were collected using a touch screen telemetry device in the home environments of the subjects. Measurements with the device were performed four times per day during week-long test periods. On each test occasion, the subjects were asked to trace pre-drawn Archimedean spirals, using the dominant hand. The pre-drawn spiral was shown on the screen of the device. The spiral test was repeated three times per test occasion, and the subjects were instructed to complete it within 10 seconds. The device had a sampling rate of 10 Hz and measured both the position and time stamps (in milliseconds) of the pen tip.

Methods: Four independent raters (FB, DH, AJ and DN) used a web interface that animated the spiral drawings, allowed them to observe different kinematic features during the drawing process, and let them rate task performance. Initially, a number of kinematic features were assessed, including 'impairment', 'speed', 'irregularity' and 'hesitation', followed by marking the predominant motor phenotype on a 3-category scale: tremor, bradykinesia and/or choreatic dyskinesia. There were only 2 test occasions for which all four raters either classified the phenotype as tremor or could not identify it; the two main motor phenotype categories were therefore bradykinesia and dyskinesia. 'Impairment' was rated on a scale from 0 (no impairment) to 10 (extremely severe), whereas 'speed', 'irregularity' and 'hesitation' were rated on a scale from 0 (normal) to 4 (extremely severe). The proposed data-driven method consisted of the following steps. Initially, 28 spatiotemporal features were extracted from the time series signals before being presented to a Multilayer Perceptron (MLP) classifier. The features were based on different kinematic quantities of the spirals, including radius, angle, speed and velocity, with the aim of measuring the severity of involuntary symptoms and discriminating between PD-specific (bradykinesia) and/or treatment-induced symptoms (dyskinesia). A Principal Component Analysis was applied to the features to reduce their dimensionality; 4 relevant principal components (PCs) were retained and used as inputs to the MLP classifier. Finally, the MLP classifier mapped these components to the corresponding visually assessed motor phenotype scores, automating the process of scoring bradykinesia and dyskinesia in PD patients while they draw spirals using the touch screen device. For motor phenotype (bradykinesia vs. dyskinesia) classification, the stratified 10-fold cross-validation technique was employed.

Results: There was good agreement between the four raters when rating the individual kinematic features, with intra-class correlation coefficients (ICCs) of 0.88 for 'impairment', 0.74 for 'speed' and 0.70 for 'irregularity', and moderate agreement when rating 'hesitation', with an ICC of 0.49. When assessing the two main motor phenotype categories (bradykinesia or dyskinesia) in animated spirals, the agreements between the four raters ranged from fair to moderate. There were good correlations between the mean ratings of the four raters on individual kinematic features and the computed scores. The MLP classifier classified the motor phenotype, that is bradykinesia or dyskinesia, with an accuracy of 85% in relation to the visual classifications of the four movement disorder specialists. The test-retest reliability of the four PCs across the three spiral test trials was good, with Cronbach's Alpha coefficients of 0.80, 0.82, 0.54 and 0.49, respectively. These results indicate that the computed scores are stable and consistent over time. Significant differences were found between the two groups (patients and healthy elderly subjects) in all the PCs, except for PC3.

Conclusions: The proposed method automatically assessed the severity of unwanted symptoms and could discriminate reasonably well between PD-specific and/or treatment-induced motor symptoms, in relation to visual assessments by movement disorder specialists. The objective assessments could provide a time-effect summary score that could be useful for improving decision-making during symptom evaluation of individualized treatment, when the goal is to maximize functional On time for patients while minimizing their Off episodes and troublesome dyskinesias.
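The feature-reduction and classification pipeline can be reproduced in outline with scikit-learn (a hedged sketch: placeholder data, an assumed hidden-layer size, and the 4-PC/10-fold setup taken from the abstract):

# Sketch: 28 spiral features -> PCA (4 components) -> MLP classifier,
# evaluated with stratified 10-fold cross-validation.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 28))     # placeholder spatiotemporal features
y = rng.integers(0, 2, 200)            # 0 = bradykinesia, 1 = dyskinesia

clf = make_pipeline(StandardScaler(),
                    PCA(n_components=4),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=0))
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
print(f"10-fold accuracy: {cross_val_score(clf, X, y, cv=cv).mean():.2f}")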

Relevance: 100.00%

Abstract:

A challenge for the clinical management of advanced Parkinson's disease (PD) patients is the emergence of fluctuations in motor performance, which represents a significant source of disability during activities of daily living. There is a lack of objective measurements of treatment effects, for in-clinic and at-home use, that can provide an overview of the treatment response. The objective of this paper was to develop a method for objective quantification of advanced PD motor symptoms related to Off episodes and peak-dose dyskinesia, using spiral data gathered by a touch screen telemetry device. More specifically, the aim was to objectively characterize motor symptoms (bradykinesia and dyskinesia), to help in automating the process of visual interpretation of movement anomalies in spirals as rated by movement disorder specialists. Digitized upper limb movement data of 65 advanced PD patients and 10 healthy elderly (HE) subjects were recorded as they performed spiral drawing tasks on a touch screen device in their home environment settings. Several spatiotemporal features were extracted from the time series and used as inputs to machine learning methods. The methods were validated against ratings on animated spirals scored by four movement disorder specialists who visually assessed a set of kinematic features and the motor symptom. The ability of the method to discriminate between PD patients and HE subjects and the test-retest reliability of the computed scores were also evaluated. Computed scores correlated well with the mean visual ratings of individual kinematic features. The best performing classifier (Multilayer Perceptron) classified the motor symptom (bradykinesia or dyskinesia) with an accuracy of 84% and an area under the receiver operating characteristic curve of 0.86 in relation to the visual classifications of the raters. In addition, the method provided high discriminating power when distinguishing between PD patients and HE subjects, and it had good test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.
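The kind of spatiotemporal features involved can be illustrated as below (the particular feature definitions are assumptions for illustration, not the study's exact feature set):

# Sketch: simple spatiotemporal features from a digitized spiral given pen
# positions (x, y) and millisecond timestamps, sampled at 10 Hz.
import numpy as np

def spiral_features(x, y, t_ms):
    t = np.asarray(t_ms, float) / 1000.0
    theta = np.unwrap(np.arctan2(y, x))            # cumulative drawing angle
    r = np.hypot(x, y)                             # radius from spiral centre
    speed = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    return {
        "mean_speed": speed.mean(),
        "speed_cv": speed.std() / speed.mean(),    # irregularity proxy
        "radial_growth": np.polyfit(theta, r, 1)[0],
        "duration_s": t[-1] - t[0],
    }

tt = np.arange(0, 8, 0.1)                          # toy 10 Hz Archimedean spiral
print(spiral_features(tt * np.cos(3 * tt), tt * np.sin(3 * tt), tt * 1000))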

Relevance: 100.00%

Abstract:

Using national accounts data for the revenue-GDP and expenditure-GDP ratios from 1947 to 1992, we examine two central issues in public finance. First, was the path of public debt sustainable during this period? Second, if debt is sustainable, how has the government historically balanced the budget after shocks to either revenues or expenditures? The results show that (i) the public deficit is stationary (bounded asymptotic variance), with the budget in Brazil being balanced almost entirely through changes in taxes, regardless of the cause of the initial imbalance; expenditures are weakly exogenous, but tax revenues are not; (ii) a rational Brazilian consumer can have behavior consistent with Ricardian Equivalence; and (iii) seignorage revenues are critical to restoring intertemporal budget equilibrium, since, when we exclude them from total revenues, debt is not sustainable in econometric tests.
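The sustainability logic reduces to a stationarity test on the deficit, which can be sketched as follows (toy ratios, not the Brazilian national accounts data):

# Sketch: test debt sustainability as stationarity of the deficit
# (expenditure-GDP ratio minus revenue-GDP ratio).
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)
n = 46                                                  # annual, 1947-1992
revenue = 0.20 + 0.01 * rng.standard_normal(n)
expenditure = revenue + 0.01 * rng.standard_normal(n)   # cointegrated by construction
stat, pvalue, *_ = adfuller(expenditure - revenue, regression="c")
print(f"ADF statistic {stat:.2f}, p-value {pvalue:.3f}")  # small p => stationary deficit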

Relevance: 100.00%

Abstract:

The aim of this paper is to provide evidence on output convergence among the Mercosur countries and associates, using multivariate time-series tests. The methodology is based on a combination of tests and estimation procedures, both univariate and multivariate, applied to the differences in per capita real income. We use the definitions of time-series convergence proposed by Bernard & Durlauf and apply the unit root tests proposed by Abuaf & Jorion and Taylor & Sarno. In this same multivariate context, the Flôres, Preumont & Szafarz and Breuer, McNown & Wallace tests, which allow for the existence of correlations across the series without imposing a common speed of mean reversion, identify the countries that converge. Concerning the empirical results, there is evidence of long-run convergence or, at least, catching up, for the smaller countries, Bolivia, Paraguay, Peru and Uruguay, towards Brazil and, to some extent, Argentina. In contrast, the evidence on convergence for the larger countries is weaker, as they have followed different (or rather opposing) macroeconomic policy strategies. Thus the future of the whole area will critically depend on the ability of Brazil, Argentina and Chile to find some scope for more cooperative policy actions.
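The Bernard & Durlauf notion of pairwise convergence amounts to stationarity of the per capita income differential, which can be sketched as follows (synthetic series; the multivariate refinements cited above are not reproduced here):

# Sketch: test pairwise output convergence as stationarity of the log
# per-capita income gap between two countries.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
common = np.cumsum(rng.standard_normal(160))     # shared stochastic trend
ya = common + 0.3 * rng.standard_normal(160)     # country A, log income per capita
yb = common + 0.3 * rng.standard_normal(160)     # country B, log income per capita
stat, pvalue, *_ = adfuller(ya - yb, regression="c")
print(f"ADF on income gap: stat {stat:.2f}, p {pvalue:.3f}")  # small p => convergence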

Relevance: 100.00%

Abstract:

Initial endogenous growth models emphasized the importance of external effects and increasing returns in explaining growth. Empirically, this hypothesis can be confirmed if the coefficient of physical capital per hour is unity in the aggregate production function. Previous estimates using time series data rejected this hypothesis, although cross-country estimates did not. The problem lies with the techniques employed, which are unable to capture low-frequency movements of high-frequency data. Using cointegration, new time series evidence confirms the theory and conforms to the cross-country evidence. The implied Solow residual, which takes into account external effects to aggregate capital, has its behavior analyzed. The hypothesis that it is explained by government expenditures on infrastructure is confirmed. This suggests a supply-side role for government in affecting productivity.
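The cointegration step can be sketched as follows (toy data; testing whether the capital-per-hour coefficient in the cointegrating production function is near unity, as the hypothesis requires):

# Sketch: Engle-Granger cointegration between log output per hour and log
# capital per hour, plus the estimated capital coefficient.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(6)
k = np.cumsum(rng.standard_normal(200))          # log capital per hour, I(1)
y = 1.0 * k + 0.2 * rng.standard_normal(200)     # log output per hour

stat, pvalue, _ = coint(y, k)
slope = sm.OLS(y, sm.add_constant(k)).fit().params[1]
print(f"cointegration p-value {pvalue:.3f}, capital coefficient {slope:.2f}")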

Relevance: 100.00%

Abstract:

While it is recognized that output fluctuations are highly persistent over a certain range, less persistent results are also found around very long horizons (Cochrane, 1988), indicating the existence of local or temporary persistency. In this paper, we study time series with local persistency. A test for stationarity against a locally persistent alternative is proposed. Asymptotic distributions of the test statistic are provided under both the null and the alternative hypothesis of local persistency. A Monte Carlo experiment is conducted to study the power and size of the test. An empirical application reveals that many US real economic variables may exhibit local persistency.

Relevance: 100.00%

Abstract:

Chambers (1998) explores the interaction between long memory and aggregation. For continuous-time processes, he takes the aliasing effect into account when studying temporal aggregation. For discrete-time processes, however, he appears to fail to do so. This note gives the spectral density function of temporally aggregated long memory discrete-time processes in light of the aliasing effect. The results differ from those in Chambers (1998) and are supported by a small simulation exercise. As a result, the order of integration may not be invariant to temporal aggregation, particularly if d is negative and the aggregation is of the stock type.
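For reference, the folding identity behind the aliasing argument: if X_t has spectral density f_X and is stock-aggregated by keeping every m-th observation, Y_\tau = X_{m\tau}, then (a textbook identity, stated here as background rather than the note's full result)

f_Y(\lambda) = \frac{1}{m} \sum_{j=0}^{m-1} f_X\!\left(\frac{\lambda + 2\pi j}{m}\right), \qquad \lambda \in [-\pi, \pi].

Since a long memory process has f_X(\lambda) \sim c\,\lambda^{-2d} near the origin, the aliased terms with j \ge 1 add non-vanishing mass at low frequencies; when d is negative (so f_X vanishes at the origin), this added mass can dominate and alter the memory parameter recovered from the aggregate, consistent with the result stated above.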