954 results for Nonlinear Time Series Analysis
Abstract:
A discussion of nonlinear dynamics, demonstrated by the familiar automobile, is followed by the development of a systematic method of analysis of a possibly nonlinear time series using difference equations in the general state-space format. This format allows recursive state-dependent parameter estimation after each observation, thereby revealing the dynamics inherent in the system in combination with random external perturbations. The one-step-ahead prediction errors at each time period, transformed to have constant variance, and the estimated parametric sequences provide the information to (1) formally test whether the time series observations y_t are some linear function of random errors e_s, for some t and s, or whether the series would more appropriately be described by a nonlinear model such as bilinear, exponential, threshold, etc., (2) formally test whether a statistically significant change has occurred in structure/level, either historically or as it occurs, (3) forecast a nonlinear system with a new and innovative (but very old numerical) technique utilizing rational functions to extrapolate individual parameters as smooth functions of time, which are then combined to obtain the forecast of y, and (4) suggest a measure of resilience, i.e. how much perturbation a structure/level can tolerate, whether internal or external to the system, and remain statistically unchanged. Although similar to one-step control, this provides a less rigid way to think about changes affecting social systems. Applications consisting of the analysis of some familiar and some simulated series demonstrate the procedure. Empirical results suggest that this state-space or modified augmented Kalman filter may provide interesting ways to identify particular kinds of nonlinearities as they occur in structural change via the state trajectory. A computational flow chart detailing computations and software input and output is provided in the body of the text. IBM Advanced BASIC program listings to accomplish most of the analysis are provided in the appendix.
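The recursive, state-dependent estimation described above can be illustrated with a minimal random-walk-parameter Kalman filter that also produces the constant-variance one-step-ahead prediction errors. The sketch below is an illustrative Python assumption, not the paper's BASIC implementation; the scalar AR(1) data, the noise variances q and r, and the function name are invented for the example.

```python
import numpy as np

def recursive_parameter_filter(y, x, q=1e-3, r=1.0):
    """Kalman-style recursion for a time-varying regression y_t = x_t' theta_t + e_t,
    with theta_t following a random walk. Returns the parameter trajectory and
    standardized one-step-ahead prediction errors."""
    n, k = x.shape
    theta = np.zeros(k)              # current parameter estimate
    P = np.eye(k)                    # parameter covariance
    thetas, std_errors = [], []
    for t in range(n):
        P = P + q * np.eye(k)                  # prediction: random-walk parameters
        e = y[t] - x[t] @ theta                # one-step-ahead prediction error
        s = x[t] @ P @ x[t] + r                # prediction-error variance
        std_errors.append(e / np.sqrt(s))      # constant-variance transform
        gain = P @ x[t] / s                    # update step
        theta = theta + gain * e
        P = P - np.outer(gain, x[t] @ P)
        thetas.append(theta.copy())
    return np.array(thetas), np.array(std_errors)

# Illustrative use: an AR(1) series whose coefficient drifts slowly over time
rng = np.random.default_rng(0)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = (0.3 + 0.002 * t) * y[t - 1] + rng.normal()
X = np.column_stack([y[:-1]])                  # lagged output as regressor
params, errs = recursive_parameter_filter(y[1:], X)
```

The standardized errors `errs` play the role of the transformed one-step-ahead prediction errors used for the formal tests, and `params` is the state trajectory that would reveal structural change.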
Abstract:
Chaotic behaviour is one of the hardest problems that can arise in nonlinear dynamical systems with severe nonlinearities. It makes the system's responses unpredictable and causes them to behave like noise, and in some applications it should be avoided. One approach to detecting chaotic behaviour is finding the Lyapunov exponent by examining the dynamical equation of the system, which requires a model of the system. The goal of this study is the diagnosis of chaotic behaviour by exploring only the data (signal), without using any dynamical model of the system. In this work two methods are tested on time series data collected from AMB (Active Magnetic Bearing) system sensors. The first method finds the largest Lyapunov exponent by the Rosenstein method. The second method is the 0-1 test for identifying chaotic behaviour. These two methods are used to detect whether the data are chaotic. Using the Rosenstein method requires finding the minimum embedding dimension, for which the Cao method is used. The Cao method gives not only the minimum embedding dimension but also the order of the nonlinear dynamical equation of the system, and it shows how the system's signals are corrupted with noise. At the end of this research a test called the runs test is introduced to show that the data are not excessively noisy.
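As a concrete illustration of the second method, the following is a minimal sketch of the 0-1 test for chaos (correlation variant). It is not the AMB data pipeline from the study; the logistic-map and sinusoidal test signals, the range of random angles c, and the N/10 displacement cut-off are illustrative assumptions.

```python
import numpy as np

def zero_one_test(x, n_c=50, rng=None):
    """0-1 test for chaos: K near 1 suggests chaos, K near 0 regular dynamics.
    Median correlation coefficient over random angles c (Gottwald-Melbourne style)."""
    rng = np.random.default_rng() if rng is None else rng
    N = len(x)
    n_cut = N // 10                           # use only the first N/10 displacements
    Ks = []
    for c in rng.uniform(np.pi / 5, 4 * np.pi / 5, n_c):
        j = np.arange(1, N + 1)
        p = np.cumsum(x * np.cos(j * c))      # translation variables
        q = np.cumsum(x * np.sin(j * c))
        M = np.array([np.mean((p[n:] - p[:-n]) ** 2 + (q[n:] - q[:-n]) ** 2)
                      for n in range(1, n_cut)])
        n_vals = np.arange(1, n_cut)
        Ks.append(np.corrcoef(n_vals, M)[0, 1])   # growth rate of mean-square displacement
    return np.median(Ks)

# Illustrative use: chaotic logistic map vs. a periodic signal
x = np.empty(2000); x[0] = 0.4
for i in range(1, 2000):
    x[i] = 3.97 * x[i - 1] * (1 - x[i - 1])
print(zero_one_test(x))                               # expected close to 1 (chaotic)
print(zero_one_test(np.sin(0.3 * np.arange(2000))))   # expected close to 0 (regular)
```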
Abstract:
This work concerns forecasting with vector nonlinear time series models when errors are correlated. Point forecasts are numerically obtained using bootstrap methods and illustrated by two examples. Evaluation concentrates on studying forecast equality and encompassing. Nonlinear impulse responses are further considered and graphically summarized by highest density regions. Finally, two macroeconomic data sets are used to illustrate our work. The forecasts from the linear or nonlinear model could contribute useful information absent in the forecasts from the other model.
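A simplified univariate version of residual-bootstrap point forecasting can be sketched as follows; the vector case with correlated errors is analogous. The SETAR-style data-generating process, the fitted skeleton f, and the forecast horizon are assumptions made only for illustration.

```python
import numpy as np

def bootstrap_forecast(y, f, resid, horizon=4, n_boot=500, rng=None):
    """Point forecasts for a fitted nonlinear model y_t = f(y_{t-1}) + e_t,
    obtained by averaging bootstrap paths that resample the model residuals."""
    rng = np.random.default_rng() if rng is None else rng
    paths = np.empty((n_boot, horizon))
    for b in range(n_boot):
        last = y[-1]
        for h in range(horizon):
            last = f(last) + rng.choice(resid)    # propagate a resampled shock
            paths[b, h] = last
    return paths.mean(axis=0)                      # bootstrap point forecast

# Illustrative use with a simple threshold autoregression "fitted" elsewhere
rng = np.random.default_rng(1)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = (0.7 if y[t - 1] > 0 else -0.4) * y[t - 1] + rng.normal(scale=0.5)
f = lambda x: (0.7 if x > 0 else -0.4) * x         # assumed fitted SETAR skeleton
resid = y[1:] - np.array([f(v) for v in y[:-1]])   # in-sample residuals to resample
print(bootstrap_forecast(y, f, resid))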
Abstract:
The minority game (MG) model introduced recently provides promising insights into understanding the evolution of prices, indices and rates in financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbations. When the interval between two successive perturbations is sufficiently large, low-dimensional chaos can be found in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the SP500 index: stochastic, nonlinear and (unit root) stationary. (C) 2002 Elsevier B.V. All rights reserved.
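The "(unit root) stationary" characterization of the differenced series can be checked with a standard augmented Dickey-Fuller test, as in the hedged sketch below. A simulated random walk stands in for the index level; statsmodels is assumed to be available, and the data are not the SP500 series used in the paper.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

# A random walk stands in for an index level; its first differences should
# reject a unit root, mirroring the stationarity reported for the differenced index.
rng = np.random.default_rng(0)
level = np.cumsum(rng.normal(size=2000))      # synthetic "price index"
diff = np.diff(level)

for name, series in [("level", level), ("first difference", diff)]:
    stat, pvalue, *_ = adfuller(series)
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
```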
Abstract:
The paper aims to examine the empirical relationship between trade openness and economic growth of India for the period 1970-2010. Trade openness is a multi-dimensional concept, and hence measures of both trade barriers and trade volumes have been used as proxies for openness. The estimation results from the Vector Autoregressive method suggest that growth in trade volumes accelerates economic growth in the case of India. We do not find any evidence from our analysis that trade barriers lower growth.
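A minimal sketch of the Vector Autoregressive estimation with a Granger-type causality check is given below, using statsmodels. The simulated growth and trade-openness series are stand-ins; the paper's 1970-2010 Indian data and exact specification are not reproduced.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic stand-ins for annual growth and trade-volume series
rng = np.random.default_rng(2)
n = 41
trade = rng.normal(size=n).cumsum() * 0.1 + 5
growth = 0.3 * np.roll(trade, 1) + rng.normal(scale=0.5, size=n)
data = pd.DataFrame({"growth": growth[1:], "trade_openness": trade[1:]})

model = VAR(data)
results = model.fit(maxlags=2, ic="aic")          # lag order chosen by AIC
print(results.summary())
# Granger-type test: does trade openness help predict growth?
print(results.test_causality("growth", ["trade_openness"], kind="f").summary())
```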
Abstract:
Background and objective: Cefepime was one of the most used broad-spectrum antibiotics in Swiss public acute care hospitals. The drug was withdrawn from the market in January 2007 and then replaced by a generic in October 2007. The goal of the study was to evaluate changes in the use of broad-spectrum antibiotics after the withdrawal of the cefepime original product. Design: A generalized regression-based interrupted time series model incorporating autocorrelated errors assessed how much the withdrawal changed the monthly use of other broad-spectrum antibiotics (ceftazidime, imipenem/cilastatin, meropenem, piperacillin/tazobactam) in defined daily doses (DDD)/100 bed-days from January 2004 to December 2008 [1, 2]. Setting: 10 Swiss public acute care hospitals (7 with <200 beds, 3 with 200-500 beds). Nine hospitals (group A) had a shortage of cefepime and 1 hospital had no shortage thanks to importation of cefepime from abroad. Main outcome measures: Underlying trend of use before the withdrawal, and changes in the level and in the trend of use after the withdrawal. Results: Before the withdrawal, the average estimated underlying trend (coefficient b1) for cefepime was decreasing by -0.047 (95% CI -0.086, -0.009) DDD/100 bed-days per month and was significant in three hospitals (group A, P < 0.01). Cefepime withdrawal was associated with a significant increase in level of use (b2) of piperacillin/tazobactam and imipenem/cilastatin in, respectively, one and five hospitals from group A. After the withdrawal, the average estimated trend (b3) was greatest for piperacillin/tazobactam (+0.043 DDD/100 bed-days per month; 95% CI -0.001, 0.089) and was significant in four hospitals from group A (P < 0.05). The hospital without drug shortage showed no significant change in the trend and the level of use. The hypothesis of seasonality was rejected in all hospitals. Conclusions: The decreased use of cefepime already observed before its withdrawal from the market could be explained by pre-existing difficulty in drug supply. The withdrawal of cefepime resulted in a change in level for piperacillin/tazobactam and imipenem/cilastatin. Moreover, an increase in trend was found for piperacillin/tazobactam thereafter. As these changes generally occur at the price of lower bacterial susceptibility, a manufacturers' commitment to avoid shortages in the supply of their products would be important. As a next step, we will measure the impact of these changes on the costs and sensitivity rates of these antibiotics.
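The segmented (interrupted) time series design with the b1/b2/b3 terms mentioned in the abstract can be sketched as follows, using a GLSAR fit to allow AR(1)-autocorrelated errors. The monthly usage series is simulated, and the coefficient values and withdrawal month are illustrative assumptions only, not the study's hospital data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Segmented regression sketch: monthly DDD/100 bed-days with a level change (b2)
# and a trend change (b3) after the withdrawal month; b1 is the underlying trend.
rng = np.random.default_rng(3)
months = np.arange(60)                            # Jan 2004 - Dec 2008
withdrawal = 36                                   # Jan 2007
post = (months >= withdrawal).astype(float)
y = 6 - 0.05 * months + 1.2 * post + 0.04 * post * (months - withdrawal)
y = y + rng.normal(scale=0.3, size=60)            # noisy simulated usage series

X = pd.DataFrame({
    "trend": months,                              # b1: underlying trend
    "level_change": post,                         # b2: step at withdrawal
    "trend_change": post * (months - withdrawal), # b3: post-withdrawal slope change
})
X = sm.add_constant(X)
model = sm.GLSAR(y, X, rho=1)                     # AR(1) errors, Cochrane-Orcutt style
results = model.iterative_fit(maxiter=10)
print(results.summary())
```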
Abstract:
The original cefepime product was withdrawn from the Swiss market in January 2007, and replaced by a generic 10 months later. The goals of the study were to assess the impact of this cefepime shortage on the use and costs of alternative broad-spectrum antibiotics, on antibiotic policy, and on resistance of Pseudomonas aeruginosa towards carbapenems, ceftazidime and piperacillin-tazobactam. A generalized regression-based interrupted time series model assessed how much the shortage changed the monthly use and costs of cefepime and of selected alternative broad-spectrum antibiotics (ceftazidime, imipenem-cilastatin, meropenem, piperacillin-tazobactam) in 15 Swiss acute care hospitals from January 2005 to December 2008. Resistance of P. aeruginosa was compared before and after the cefepime shortage. There was a statistically significant increase in the consumption of piperacillin-tazobactam in hospitals with definitive interruption of cefepime supply, and of meropenem in hospitals with transient interruption of cefepime supply. Consumption of each alternative antibiotic tended to increase during the cefepime shortage and to decrease when the cefepime generic was released. These shifts were associated with significantly higher overall costs. There was no significant change in hospitals with uninterrupted cefepime supply. The alternative antibiotics for which an increase in consumption showed the strongest association with a progression of resistance were the carbapenems. The use of alternative antibiotics after cefepime withdrawal was associated with a significant increase in piperacillin-tazobactam and meropenem use and in overall costs, and with a decrease in susceptibility of P. aeruginosa in hospitals. This warrants caution with regard to shortages and withdrawals of antibiotics.
Abstract:
Findings on the effects of weather on health, especially the effects of ambient temperature on overall morbidity, remain inconsistent. We conducted a time series study to examine the acute effects of meteorological factors (mainly air temperature) on daily hospital outpatient admissions for cardiovascular disease (CVD) in Zunyi City, China, from January 1, 2007 to November 30, 2009. We used the generalized additive model with penalized splines to analyze hospital outpatient admissions, climatic parameters, and covariate data. Results show that, in Zunyi, air temperature was associated with hospital outpatient admissions for CVD. When air temperature was less than 10°C, hospital outpatient admissions for CVD increased 1.07-fold with each increase of 1°C, and when air temperature was more than 10°C, an increase in air temperature by 1°C was associated with a 0.99-fold decrease in hospital outpatient admissions for CVD over the previous year. Our analyses provided statistically significant evidence that in China meteorological factors have adverse effects on the health of the general population. Further research with consistent methodology is needed to clarify the magnitude of these effects and to show which populations and individuals are vulnerable.
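A hedged sketch of a penalized-spline generalized additive model for daily admission counts is shown below, using the pygam package rather than the authors' exact software. The simulated temperature and count series, the two smooth terms, and the excess-risk threshold at 10°C are illustrative assumptions.

```python
import numpy as np
from pygam import PoissonGAM, s

# Illustrative data: daily CVD outpatient counts against daily mean temperature
# and a day index (to absorb trend/season), standing in for the Zunyi series.
rng = np.random.default_rng(4)
n = 1000
day = np.arange(n)
temp = 15 + 10 * np.sin(2 * np.pi * day / 365) + rng.normal(scale=2, size=n)
rate = np.exp(3.0 + 0.02 * np.clip(10 - temp, 0, None))   # extra risk below 10 deg C
counts = rng.poisson(rate)

# Penalized-spline GAM for counts: smooth term in temperature, smooth term in time
X = np.column_stack([temp, day])
gam = PoissonGAM(s(0) + s(1)).fit(X, counts)
gam.summary()                      # spline terms, effective dof, significance
print(gam.predict(X[:5]))          # fitted daily admission counts
```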
Abstract:
A compositional time series is obtained when a compositional data vector is observed at different points in time. Inherently, then, a compositional time series is a multivariate time series with important constraints on the variables observed at any instance in time. Although this type of data frequently occurs in situations of real practical interest, a trawl through the statistical literature reveals that research in the field is very much in its infancy and that many theoretical and empirical issues still remain to be addressed. Any appropriate statistical methodology for the analysis of compositional time series must take into account the constraints which are not allowed for by the usual statistical techniques available for analysing multivariate time series. One general approach to analyzing compositional time series consists in the application of an initial transform to break the positive and unit sum constraints, followed by the analysis of the transformed time series using multivariate ARIMA models. In this paper we discuss the use of the additive log-ratio, centred log-ratio and isometric log-ratio transforms. We also present results from an empirical study designed to explore how the selection of the initial transform affects subsequent multivariate ARIMA modelling as well as the quality of the forecasts.
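The additive log-ratio (ALR) transform that breaks the unit-sum constraint, and its inverse, can be sketched in a few lines; fitting multivariate ARIMA models to the transformed coordinates would follow but is omitted here. The three-part simulated composition is an illustrative assumption.

```python
import numpy as np

def alr(comp):
    """Additive log-ratio transform: D-part compositions -> (D-1) real coordinates,
    using the last component as the reference."""
    comp = np.asarray(comp, dtype=float)
    return np.log(comp[:, :-1] / comp[:, [-1]])

def alr_inverse(z):
    """Map ALR coordinates back to the simplex (components sum to one)."""
    expz = np.hstack([np.exp(z), np.ones((len(z), 1))])
    return expz / expz.sum(axis=1, keepdims=True)

# Illustrative compositional series: three shares observed at each time point
rng = np.random.default_rng(5)
raw = np.abs(rng.normal(size=(100, 3))) + 0.1
comp = raw / raw.sum(axis=1, keepdims=True)

z = alr(comp)                      # unconstrained series, suitable for ARIMA/VAR modelling
back = alr_inverse(z)
assert np.allclose(back, comp)     # transform and inverse are mutually consistent
```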
Abstract:
We have applied time series analytical techniques to the flux of lava from an extrusive eruption. Tilt data acting as a proxy for flux are used in a case study of the May–August 1997 period of the eruption at Soufrière Hills Volcano, Montserrat. We justify the use of such a proxy by simple calibratory arguments. Three techniques of time series analysis are employed: spectral, spectrogram and wavelet methods. In addition to the well-known ~9-hour periodicity shown by these data, a previously unknown periodic flux variability is revealed by the wavelet analysis as a 3-day cycle of frequency modulation during June–July 1997, though the physical mechanism responsible is not clear. Such time series analysis has potential for other lava flux proxies at other types of volcanoes.
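The three techniques can be sketched on a synthetic tilt proxy as below; the ~9-hour carrier with a slow frequency modulation, the 10-minute sampling interval, and the scipy/pywt routines are assumptions for illustration, not the Montserrat data or the authors' processing chain.

```python
import numpy as np
from scipy.signal import periodogram, spectrogram
import pywt

# Synthetic tilt proxy: a ~9 h cycle whose frequency is slowly modulated over a
# multi-day envelope, sampled every 10 minutes for ~60 days (values illustrative).
dt_hours = 1 / 6
t = np.arange(0, 24 * 60, dt_hours)
phase = 2 * np.pi * t / 9 + 0.5 * np.sin(2 * np.pi * t / 72)
tilt = np.sin(phase) + 0.2 * np.random.default_rng(6).normal(size=t.size)

# Spectral estimate: the dominant peak should sit near 1/9 cycles per hour
freqs, power = periodogram(tilt, fs=1 / dt_hours)
print(f"dominant period ~ {1 / freqs[np.argmax(power[1:]) + 1]:.1f} h")

# Spectrogram: time-localized spectra in sliding windows
f_sg, t_sg, Sxx = spectrogram(tilt, fs=1 / dt_hours, nperseg=512)

# Continuous wavelet transform: reveals slow modulation of the ~9 h cycle
scales = np.arange(1, 128)
coefs, cwt_freqs = pywt.cwt(tilt, scales, "morl", sampling_period=dt_hours)
```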
Abstract:
A new structure of Radial Basis Function (RBF) neural network called the Dual-orthogonal RBF Network (DRBF) is introduced for nonlinear time series prediction. The hidden nodes of a conventional RBF network compute the Euclidean distance between the network input vector and the centres, and the node responses are radially symmetrical. But in time series prediction, where the system input vectors are lagged system outputs, which are usually highly correlated, the Euclidean distance measure may not be appropriate. The DRBF network modifies the distance metric by introducing a classification function which is based on the estimation data set. Training the DRBF network consists of two stages: learning the classification-related basis functions and the important input nodes, followed by selecting the regressors and learning the weights of the hidden nodes. In both cases, a forward Orthogonal Least Squares (OLS) selection procedure is applied, initially to select the important input nodes and then to select the important centres. Simulation results of single-step and multi-step-ahead predictions over a test data set are included to demonstrate the effectiveness of the new approach.
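A simplified stand-in for the forward selection stage is sketched below: Gaussian basis functions on lagged outputs with a greedy least-squares term selection in the spirit of OLS. It is not the dual-orthogonal two-stage procedure itself; the simulated nonlinear AR series, the candidate-centre sampling, and the Gaussian width are assumptions.

```python
import numpy as np

def gaussian_design(x_lagged, centres, width):
    """Design matrix of Gaussian RBF responses for candidate centres."""
    d2 = ((x_lagged[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

def forward_ols_select(P, y, n_terms):
    """Greedy forward selection of regressor columns by residual error reduction,
    a simplified stand-in for the orthogonal least squares procedure."""
    selected = []
    for _ in range(n_terms):
        best, best_err = None, np.inf
        for j in range(P.shape[1]):
            if j in selected:
                continue
            cols = P[:, selected + [j]]
            coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
            err = np.sum((y - cols @ coef) ** 2)
            if err < best_err:
                best, best_err = j, err
        selected.append(best)
    coef, *_ = np.linalg.lstsq(P[:, selected], y, rcond=None)
    return selected, coef

# Illustrative single-step prediction of a nonlinear AR series from lagged outputs
rng = np.random.default_rng(7)
y = np.zeros(400)
for t in range(2, 400):
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + 0.2 * y[t - 1] ** 2 + 0.1 * rng.normal()
X = np.column_stack([y[1:-1], y[:-2]])               # lagged outputs as network inputs
target = y[2:]
centres = X[rng.choice(len(X), 30, replace=False)]   # candidate centres drawn from data
P = gaussian_design(X, centres, width=0.5)
idx, w = forward_ols_select(P, target, n_terms=8)
pred = P[:, idx] @ w                                  # single-step-ahead prediction
print(f"training MSE: {np.mean((target - pred) ** 2):.4f}")
```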
Abstract:
Expectations of future market conditions are generally acknowledged to be crucial for the development decision and hence for shaping the built environment. This empirical study of the Central London office market from 1987 to 2009 tests for evidence of adaptive and naive expectations. Applying VAR models and a recursive OLS regression with one-step forecasts, we find evidence of adaptive and naive, rather than rational, expectations of developers. Although the magnitude of the errors and the length of time lags vary over time and development cycles, the results confirm that developers' decisions are explained to a large extent by contemporaneous and past conditions in both London submarkets. The corollary of this finding is that developers may be able to generate excess profits by exploiting market inefficiencies, but this may be hindered in practice by the long periods necessary for planning and construction of the asset. More generally, the results of this study suggest that real estate cycles are largely generated endogenously rather than being the result of unexpected exogenous shocks.
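A recursive (expanding-window) OLS with one-step-ahead forecasts, of the kind mentioned above, can be sketched as follows. The simulated rent and development-start series, the four-quarter lag, and the 40-observation initial window are illustrative assumptions rather than the study's London data or specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Expanding-window (recursive) OLS with one-step forecasts: at each date the model
# is re-estimated on all observations up to t-1 and used to predict period t.
rng = np.random.default_rng(8)
n = 92                                              # roughly quarterly, 1987-2009
rents = np.cumsum(rng.normal(size=n))
starts = 0.6 * np.roll(rents, 4) + rng.normal(scale=0.5, size=n)

df = pd.DataFrame({"starts": starts, "rents_lag4": np.roll(rents, 4)}).iloc[4:]
forecasts, actuals = [], []
for t in range(40, len(df)):                        # 40-observation initial window
    train = df.iloc[:t]
    fit = sm.OLS(train["starts"], sm.add_constant(train[["rents_lag4"]])).fit()
    x_next = sm.add_constant(df[["rents_lag4"]], has_constant="add").iloc[t:t + 1]
    forecasts.append(np.asarray(fit.predict(x_next))[0])
    actuals.append(df["starts"].iloc[t])

errors = np.array(actuals) - np.array(forecasts)    # one-step forecast errors
print(f"mean error: {errors.mean():.3f}, RMSE: {np.sqrt((errors ** 2).mean()):.3f}")
```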
Abstract:
Simulations of 15 coupled chemistry climate models, for the period 1960–2100, are presented. The models include a detailed stratosphere as well as a realistic representation of the tropospheric climate. The simulations assume a consistent set of changing greenhouse gas concentrations, as well as temporally varying chlorofluorocarbon concentrations in accordance with observations for the past and expectations for the future. The ozone results are analyzed using a nonparametric additive statistical model. Comparisons are made with observations for the recent past, and the recovery of ozone, indicated by a return to 1960 and 1980 values, is investigated as a function of latitude. Although chlorine amounts are simulated to return to 1980 values by about 2050, with only weak latitudinal variations, column ozone amounts recover at different rates due to the influence of greenhouse gas changes. In the tropics, simulated peak ozone amounts occur by about 2050 and thereafter total ozone column declines. Consequently, simulated ozone does not recover to values which existed prior to the early 1980s. The results also show a distinct hemispheric asymmetry, with recovery to 1980 values in the Northern Hemisphere extratropics ahead of the chlorine return by about 20 years. In the Southern Hemisphere midlatitudes, ozone is simulated to return to 1980 levels only 10 years ahead of chlorine. In the Antarctic, annually averaged ozone recovers at about the same rate as chlorine in high latitudes and hence does not return to 1960s values until the last decade of the simulations.