921 results for Non-linear time series


Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

The Box-Cox transformation is a technique widely used to make the probability distribution of time-series data approximately normal, which helps statistical and neural models produce more accurate forecasts. However, it introduces a bias when the transformation is reversed on the predicted data. Statistical methods for a bias-free reversion necessarily require the assumption that the transformed data are Gaussian, which is rarely the case in real-world time series. The aim of this study was therefore to provide an effective method for removing the bias that arises when the Box-Cox transformation is reversed. The proposed method is based on a focused time-lagged feedforward neural network, which does not require any assumption about the distribution of the transformed data. To evaluate its performance, numerical simulations were conducted and the Mean Absolute Percentage Error, the Theil Inequality Index and the signal-to-noise ratio of 20-step-ahead forecasts of 40 time series were compared. The results indicate that the proposed reversion method is valid and justifies further studies. (C) 2014 Elsevier B.V. All rights reserved.
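
A minimal Python sketch (not the paper's neural-network method) of the bias being discussed: transforming a right-skewed series with scipy's Box-Cox, forecasting in the transformed domain, and naively back-transforming systematically underestimates the mean. Data and parameters are synthetic.

```python
# Minimal sketch: illustrates the bias introduced when Box-Cox-transformed
# forecasts are naively back-transformed. Synthetic data, for illustration only.
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

rng = np.random.default_rng(0)
y = rng.lognormal(mean=1.0, sigma=0.6, size=5000)   # skewed "time series" values

z, lam = boxcox(y)                                  # transform towards normality

# Suppose a model forecasts the conditional mean in the transformed domain.
z_hat = z.mean()

# Naive reversion: apply the inverse transform to the transformed-domain mean.
naive = inv_boxcox(z_hat, lam)

print(f"lambda           : {lam: .3f}")
print(f"mean of original : {y.mean(): .3f}")
print(f"naive reversion  : {naive: .3f}   # biased low for right-skewed data")
```

For right-skewed series the naively back-transformed mean falls below the true mean; removing this gap without assuming Gaussianity of the transformed data is what the network-based reversion described above targets.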

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

In this work, a non-linear Boundary Element Method (BEM) formulation with a damage model is extended to the numerical simulation of structural masonry walls in 2D stress analysis. The formulation is reoriented towards structural masonry, whose component materials, clay bricks and mortar, are treated as damaged materials. Internal variables and cell discretization of the domain are also considered. A damage model is used to represent the material behaviour, and the domain discretization is proposed and discussed. The paper presents the numerical parameters of the damage model for the material properties of the masonry components, clay bricks and mortar. Some examples are presented to validate the formulation.
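
The abstract does not reproduce the constitutive relations; as a generic point of reference only (not the paper's specific model), isotropic scalar-damage formulations of this kind are typically built on

$$\boldsymbol{\sigma} = (1 - d)\,\mathbb{C} : \boldsymbol{\varepsilon}, \qquad 0 \le d \le 1,$$

where $\mathbb{C}$ is the elastic stiffness tensor and the scalar damage variable $d$ grows from 0 (intact material) to 1 (complete local degradation) as a function of an equivalent-strain history variable, with separate parameter sets calibrated for the bricks and the mortar.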

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

The scope of this paper was to analyze the association between homicides and public security indicators in Sao Paulo between 1996 and 2008, after controlling for the unemployment rate and the proportion of youths in the population. A time-series ecological study covering 1996 to 2008 was conducted with Sao Paulo as the unit of analysis. Dependent variable: number of deaths by homicide per year. Main independent variables: arrest-incarceration rate, access to firearms, and police activity. Data analysis was conducted using Stata/IC 10.0 software, and simple and multivariate negative binomial regression models were fitted. Deaths by homicide were significantly associated with arrest-incarceration and with police activity in the simple regression analysis. Access to firearms was not significantly associated with the reduction in the number of deaths by homicide (p > 0.05). After adjustment, the associations with both public security indicators were no longer significant. In Sao Paulo, public security indicators play a smaller role as explanatory factors for the reduction in homicide rates once the unemployment rate and the falling proportion of youths are accounted for. The results reinforce the importance of socioeconomic and demographic factors for the change in the public security scenario in Sao Paulo.
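
As an illustration of the model class used above (not the study's data or exact specification), a negative binomial count regression can be fitted in Python with statsmodels; the file and column names below are hypothetical stand-ins for the study's yearly indicators.

```python
# Minimal sketch of a negative binomial count model, analogous in spirit to the
# models described above. File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("homicides_sp_1996_2008.csv")  # hypothetical yearly data

model = smf.glm(
    "homicides ~ incarceration_rate + firearm_access + police_activity"
    " + unemployment_rate + prop_youth",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit()
print(model.summary())
```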

Relevance: 100.00%

Abstract:

The plasma density evolution in the sawtooth regime of the Tore Supra tokamak is analyzed. The density is measured using fast-sweeping X-mode reflectometry, which allows tomographic reconstructions. There is evidence that the density is governed by perpendicular electric flows, while the temperature evolution is dominated by parallel diffusion. Postcursor oscillations sometimes lead to the formation of a density plateau, which is explained in terms of convection cells associated with the kink mode. A crescent-shaped density structure located inside q = 1 is often visible just after the crash and indicates that part of the density withstands the crash. 3D full MHD nonlinear simulations with the code XTOR-2F recover this structure and show that it arises from the perpendicular flows emerging from the reconnection layer. The proportion of density reinjected inside the q = 1 surface is determined, and the implications in terms of helium ash transport are discussed. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4766893]

Relevance: 100.00%

Abstract:

Complexity in time series is an intriguing feature of living dynamical systems, with potential use for the identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on a generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the parameter q (qSampEn). qSDiff curves were calculated, which consist of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series from stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiff_max) for q ≠ 1. The values of q at which the maximum occurs and at which qSDiff is zero were also evaluated. Only the qSDiff_max values were capable of distinguishing the HRV groups (p-values 5.10 x 10^-3, 1.11 x 10^-7, and 5.50 x 10^-7 for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity and suggesting a potential use for chaotic system analysis. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4758815]
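
The abstract does not give the closed form of qSampEn. Under the usual definition of sample entropy, $\mathrm{SampEn} = -\ln(A/B)$, with $A$ and $B$ the counts of template matches of length $m+1$ and $m$ within tolerance $r$, one plausible reading of the construction described above (an assumption, not necessarily the authors' exact definition) replaces the natural logarithm with the Tsallis q-logarithm:

$$\ln_q(x) = \frac{x^{1-q} - 1}{1 - q} \quad (q \neq 1), \qquad q\mathrm{SampEn}(q) = -\ln_q\!\left(\frac{A}{B}\right),$$

which recovers the ordinary SampEn in the limit $q \to 1$; qSDiff is then the difference between this quantity evaluated on the original series and on its surrogates.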

Relevance: 100.00%

Abstract:

A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC_max. The output of GC_max coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC_max is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst case scenario, the GC_max algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC_max runs in linear time with respect to the image size |C|. We show that the output of GC_max constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F_P‖_∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms into the realm of graph cut energy minimizers, with energy functions ‖F_P‖_q for q ∈ [1, ∞]. Of these, the best known minimization problem is for the energy ‖F_P‖_1, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that the minimization problem for ‖F_P‖_q, q ∈ [1, ∞), is identical to that for ‖F_P‖_1 when the original weight function w is replaced by w^q. Thus, any algorithm GC_sum solving the ‖F_P‖_1 minimization problem also solves the one for ‖F_P‖_q with q ∈ [1, ∞), so just two algorithms, GC_sum and GC_max, are enough to solve all ‖F_P‖_q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F_P‖_q-minimization problems converge to a solution of the ‖F_P‖_∞-minimization problem (the fact that ‖F_P‖_∞ = lim_{q→∞} ‖F_P‖_q is not enough to deduce that). An experimental comparison of the performance of the GC_max and GC_sum algorithms is included. It concentrates on comparing the actual (as opposed to provable worst-case) running times of the algorithms, as well as the influence of the choice of seeds on the output.
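
In this notation, and writing bd(P) for the boundary of the object P, the q-norm energies and the reduction to the ℓ1 case are (a restatement of the claims above, not additional results from the paper):

$$\|F_P\|_q = \Big(\sum_{e \in \mathrm{bd}(P)} w(e)^q\Big)^{1/q}, \qquad \|F_P\|_\infty = \max_{e \in \mathrm{bd}(P)} w(e) = \lim_{q \to \infty} \|F_P\|_q .$$

Since $x \mapsto x^{1/q}$ is increasing, minimizing $\|F_P\|_q$ over objects $P$ is equivalent to minimizing $\sum_{e \in \mathrm{bd}(P)} w(e)^q$, i.e. the $\|\cdot\|_1$ energy computed with the modified weights $w^q$, which is exactly the substitution used above to solve all the $\|F_P\|_q$ problems with GC_sum.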

Relevance: 100.00%

Abstract:

The leaf area index (LAI) is a key characteristic of forest ecosystems. Estimations of LAI from satellite images generally rely on spectral vegetation indices (SVIs) or radiative transfer model (RTM) inversions. We have developed a new and precise method suitable for practical application, consisting of building a species-specific SVI that is best-suited to both sensor and vegetation characteristics. Such an SVI requires calibration on a large number of representative vegetation conditions. We developed a two-step approach: (1) estimation of LAI on a subset of satellite data through RTM inversion; and (2) the calibration of a vegetation index on these estimated LAI. We applied this methodology to Eucalyptus plantations which have highly variable LAI in time and space. Previous results showed that an RTM inversion of Moderate Resolution Imaging Spectroradiometer (MODIS) near-infrared and red reflectance allowed good retrieval performance (R² = 0.80, RMSE = 0.41), but was computationally difficult. Here, the RTM results were used to calibrate a dedicated vegetation index (called "EucVI") which gave similar LAI retrieval results but in a simpler way. The R² of the regression between measured and EucVI-simulated LAI values on a validation dataset was 0.68, and the RMSE was 0.49. The additional use of stand age and day of year in the SVI equation slightly increased the performance of the index (R² = 0.77 and RMSE = 0.41). This simple index opens the way to an easily applicable retrieval of Eucalyptus LAI from MODIS data, which could be used in an operational way.
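
As a heavily simplified sketch of step (2) above, the calibration can be viewed as a regression of RTM-estimated LAI on MODIS reflectances plus the ancillary variables. The linear form, file name, and column names below are hypothetical stand-ins; the abstract does not give the actual EucVI formulation.

```python
# Minimal sketch of step 2 of the two-step approach described above: calibrate
# a simple index against LAI values obtained from an RTM inversion.
# The linear functional form standing in for "EucVI" is hypothetical.
import numpy as np
import pandas as pd

df = pd.read_csv("modis_subset_with_rtm_lai.csv")   # hypothetical calibration subset

X = np.column_stack([
    df["nir"], df["red"],            # MODIS near-infrared and red reflectance
    df["stand_age"], df["doy"],      # ancillary predictors used in the extended index
    np.ones(len(df)),
])
coeffs, *_ = np.linalg.lstsq(X, df["lai_rtm"], rcond=None)

lai_pred = X @ coeffs
rmse = float(np.sqrt(np.mean((lai_pred - df["lai_rtm"]) ** 2)))
print(coeffs, rmse)
```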

Relevance: 100.00%

Abstract:

Brazil is the largest sugarcane producer in the world and is well positioned to supply both national and international markets. To maintain high sugarcane production, it is essential to improve crop-season forecasting models through the use of alternative technologies such as remote sensing. The main purpose of this article is therefore to assess the results of two different statistical forecasting methods applied to an agroclimatic index (the water requirement satisfaction index; WRSI) and to the sugarcane spectral response (normalized difference vegetation index; NDVI) registered on National Oceanic and Atmospheric Administration Advanced Very High Resolution Radiometer (NOAA-AVHRR) satellite images. We also evaluated the cross-correlation between these two indexes. According to the results obtained, there are meaningful correlations between NDVI and WRSI at certain time lags. Additionally, the model adjusted for NDVI produced more accurate results than the forecasting models for WRSI. Finally, the analyses indicate that NDVI is more predictable due to its seasonality, whereas the WRSI values are more variable and therefore more difficult to forecast.
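
A minimal sketch of the lagged cross-correlation analysis between the two indexes, using synthetic monthly series in place of the NOAA-AVHRR NDVI and WRSI records:

```python
# Minimal sketch of a lagged cross-correlation between two monthly index series,
# analogous to the NDVI x WRSI analysis described above. Series are synthetic.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(240)                                            # 20 years of monthly steps
wrsi = np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(t.size)
ndvi = np.roll(wrsi, 2) + 0.2 * rng.standard_normal(t.size)   # NDVI lags WRSI by ~2 months

def lagged_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag]."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return float(np.corrcoef(x, y)[0, 1])

for lag in range(-6, 7):
    print(f"lag {lag:+d} months: r = {lagged_corr(wrsi, ndvi, lag):+.2f}")
```

The lag at which the correlation peaks indicates how far ahead one index anticipates the other, which is the kind of relationship the forecasting models above exploit.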

Relevance: 100.00%

Abstract:

Stochastic methods based on time-series modeling combined with geostatistics can be useful tools to describe the variability of water-table levels in time and space and to account for uncertainty. Water-level monitoring networks can give information about the dynamics of the aquifer domain in both dimensions. Time-series modeling is an elegant way to treat monitoring data without the complexity of physically based mechanistic models. Time-series model predictions can be interpolated spatially, with the spatial differences in water-table dynamics determined by the spatial variation in the system properties and the temporal variation driven by the dynamics of the inputs into the system. An integration of stochastic methods is presented, based on time-series modeling and geostatistics, as a framework to predict water levels for decision making in groundwater management and land-use planning. The methodology is applied to a case study in a Guarani Aquifer System (GAS) outcrop area located in the southeastern part of Brazil. Communication of the results in a clear and understandable form, via simulated scenarios, is discussed as an alternative when translating scientific knowledge into applications of stochastic hydrogeology in large aquifers with limited monitoring network coverage, such as the GAS.
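
A minimal sketch of the two ingredients of this framework, using hypothetical well records: a simple autoregressive model fitted per monitoring well, followed by spatial interpolation of its one-step-ahead predictions (ordinary kriging would be the geostatistical choice in practice; scipy's griddata stands in here).

```python
# Minimal sketch of the workflow described above: fit a time-series model to each
# monitoring well, then interpolate the predicted heads in space. A geostatistical
# model (kriging) would be used in practice; scipy's griddata stands in here.
# Well records and coordinates are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg
from scipy.interpolate import griddata

wells = pd.read_csv("gas_monitoring_wells.csv")      # columns: well_id, x, y, date, head

predictions = []
for well_id, grp in wells.groupby("well_id"):
    series = grp.sort_values("date")["head"].to_numpy()
    fit = AutoReg(series, lags=2).fit()
    next_head = fit.predict(start=len(series), end=len(series))[0]   # one step ahead
    predictions.append((grp["x"].iloc[0], grp["y"].iloc[0], next_head))

xs, ys, heads = map(np.array, zip(*predictions))
grid_x, grid_y = np.meshgrid(np.linspace(xs.min(), xs.max(), 50),
                             np.linspace(ys.min(), ys.max(), 50))
water_table = griddata((xs, ys), heads, (grid_x, grid_y), method="linear")
```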

Relevance: 100.00%

Abstract:

This work investigates the behavior of the sunspot number and the Southern Oscillation Index (SOI) signals recorded in tree-ring time series from three different locations in Brazil: Humaitá in Amazonas State, Porto Ferreira in São Paulo State, and Passo Fundo in Rio Grande do Sul State, using wavelet and cross-wavelet analysis techniques. The wavelet spectra of the tree-ring time series showed signals with periods of 11 and 22 years, possibly related to solar activity, and periods of 2-8 years, possibly related to El Niño events. The cross-wavelet spectra for all tree-ring time series from Brazil present a significant response to the 11-year solar cycle in the interval from 1921 until after 1981. These tree-ring time series also show a response to the second harmonic of the solar cycle (5.5 years), but in different time intervals. The cross-wavelet maps also showed that the relationship between the SOI and the tree-ring time series is most intense for oscillations in the 4-8 year range.
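
A minimal sketch of the (single-series) wavelet step, using a synthetic annual series with an 11-year component in place of measured tree-ring widths; the cross-wavelet computation is omitted.

```python
# Minimal sketch of a continuous wavelet analysis of an annual series, analogous
# to the tree-ring spectra described above. The series is synthetic: an 11-year
# ("solar-cycle-like") component plus noise. Cross-wavelet analysis is omitted.
import numpy as np
import pywt

years = np.arange(1900, 2000)
rng = np.random.default_rng(2)
ring_width = np.sin(2 * np.pi * years / 11.0) + 0.5 * rng.standard_normal(years.size)

scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(ring_width, scales, "morl", sampling_period=1.0)  # 1-year steps

power = np.abs(coeffs) ** 2
periods = 1.0 / freqs
dominant = periods[power.mean(axis=1).argmax()]
print(f"dominant period ~ {dominant:.1f} years")     # expected near 11 years
```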

Relevance: 100.00%

Abstract:

Background: A popular model for gene regulatory networks is the Boolean network model. In this paper, we propose an algorithm to analyze gene regulatory interactions using the Boolean network model and time-series data. The Boolean network considered here is restricted in the sense that only a subset of all possible Boolean functions is allowed. We explore some mathematical properties of these restricted Boolean networks in order to avoid a full search approach. The problem is modeled as a Constraint Satisfaction Problem (CSP), and CSP techniques are used to solve it. Results: We applied the proposed algorithm to two data sets. First, we used an artificial dataset obtained from a model of the budding yeast cell cycle. The second data set is derived from experiments performed using HeLa cells. The results show that some interactions can be fully or, at least, partially determined under the Boolean model considered. Conclusions: The proposed algorithm can be used as a first step for the detection of gene/protein interactions. It is able to infer gene relationships from time-series gene expression data, and this inference process can be aided by a priori knowledge when available.
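
A minimal sketch of the underlying consistency check: enumerate a restricted family of Boolean functions and keep those that reproduce the observed state transitions. The restriction used here (AND/OR of up to two possibly negated inputs) and the exhaustive search are illustrative simplifications, not the paper's restricted class or its CSP formulation.

```python
# Minimal sketch of inferring restricted Boolean predictor functions that are
# consistent with a binarized time series. Exhaustive checking replaces the CSP
# formulation for brevity; the toy data below are illustrative.
from itertools import combinations, product

# Toy binarized expression time series: rows are consecutive time points,
# columns are genes g0..g2.
states = [
    (0, 0, 1),
    (1, 0, 0),
    (0, 1, 1),
    (1, 0, 0),
    (0, 1, 1),
]

def candidate_functions(n_genes):
    """Yield (inputs, signs, op, fn) for AND/OR of up to two (negated) literals."""
    for k in (1, 2):
        for inputs in combinations(range(n_genes), k):
            for signs in product((False, True), repeat=k):     # True = negated input
                for op in (all, any):                           # AND / OR
                    def fn(state, inputs=inputs, signs=signs, op=op):
                        return int(op(state[i] ^ neg for i, neg in zip(inputs, signs)))
                    yield inputs, signs, op, fn

def consistent_predictors(target, states):
    """Return all candidate functions that reproduce target's next-step values."""
    found = []
    for inputs, signs, op, fn in candidate_functions(len(states[0])):
        if all(fn(states[t]) == states[t + 1][target] for t in range(len(states) - 1)):
            found.append((inputs, signs, "AND" if op is all else "OR"))
    return found

for gene in range(3):
    print(f"g{gene}: {consistent_predictors(gene, states)}")
```

When several candidate functions survive, the interaction is only partially determined by the data, which mirrors the "fully or, at least, partially determined" outcome reported above.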