146 results for stationarity


Relevance: 10.00%

Abstract:

The soil microflora is very heterogeneous in its spatial distribution. The origins of this heterogeneity and its significance for soil function are not well understood. A problem for understanding spatial variation better is the assumption of statistical stationarity that is made in most of the statistical methods used to assess it. These assumptions are made explicit in the geostatistical methods that soil biologists have increasingly used in recent years. Geostatistical methods are powerful, particularly for local prediction, but they require the assumption that the variability of a property of interest is spatially uniform, which is not always plausible given what is known about the complexity of the soil microflora and the soil environment. We have used the wavelet transform, a relatively recent development in mathematical analysis, to investigate the spatial variation of the abundance of Azotobacter in the soil of a typical agricultural landscape. The wavelet transform entails no assumptions of stationarity and is well suited to the analysis of variables that show intermittent or transient features at different spatial scales. In this study we computed cross-variograms of Azotobacter abundance with the pH, water content and loss on ignition of the soil. These revealed scale-dependent covariation in all cases. The wavelet transform also showed that the correlation of Azotobacter abundance with all three soil properties depended on spatial scale: the correlation generally increased with scale and differed significantly from zero only at some scales. However, the wavelet analysis also allowed us to show how the correlation changed across the landscape. For example, at one scale Azotobacter abundance was strongly correlated with pH, but not with soil water content, in part of the transect, and this pattern was reversed elsewhere on the transect. The results show how scale-dependent variation in potentially limiting environmental factors can induce a complex spatial pattern of abundance in a soil organism. The geostatistical methods that we used here make assumptions that are inconsistent with the spatial changes in the covariation of these properties that our wavelet analysis revealed. This suggests that the wavelet transform is a powerful tool for future investigation of the spatial structure and function of soil biota. (c) 2006 Elsevier Ltd. All rights reserved.
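
As a rough illustration of scale-dependent correlation of the kind described above — decomposing two co-located transects with an undecimated wavelet transform and correlating the detail coefficients level by level — a sketch using PyWavelets might look as follows; the transect data, wavelet choice (db4) and decomposition depth are hypothetical, not those of the study.

```python
# Hedged sketch: scale-by-scale correlation of two spatial transects
# using an undecimated wavelet transform (PyWavelets). Illustrative only;
# the study's actual wavelet method and data differ.
import numpy as np
import pywt
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 256                                  # transect length (power of 2 for swt)
x = np.linspace(0, 10, n)
ph = np.sin(0.5 * x) + 0.3 * rng.standard_normal(n)               # hypothetical soil pH
abundance = np.sin(0.5 * x + 0.2) + 0.5 * rng.standard_normal(n)  # hypothetical counts

levels = 5
coeffs_a = pywt.swt(abundance, "db4", level=levels)
coeffs_p = pywt.swt(ph, "db4", level=levels)

# Correlate detail coefficients at each dyadic scale.
for i, ((_, da), (_, dp)) in enumerate(zip(coeffs_a, coeffs_p)):
    level = levels - i                   # pywt.swt lists the coarsest level first
    r, p = pearsonr(da, dp)
    print(f"level {level}: r = {r:+.2f} (p = {p:.3f})")
```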

Relevance: 10.00%

Abstract:

Atmospheric Factors Governing Banded Orographic Convection. The three-dimensional structure of shallow orographic convection is investigated through simulations performed with a cloud-resolving numerical model. In moist flows that overcome a given topographic barrier to form statically unstable cap clouds, the organization of the convection depends on both the atmospheric structure and the mechanism by which the convection is initiated. Convection initiated by background thermal fluctuations embedded in the flow over a smooth mountain (without any small-scale topographic features) tends to be cellular and disorganized except that shear-parallel bands may form in flows with strong unidirectional vertical shear. The development of well-organized bands is favored when there is weak static instability inside the cloud and when the dry air surrounding the cloud is strongly stable. These bands move with the flow and distribute their cumulative precipitation evenly over the mountain upslope. Similar shear-parallel bands also develop in flows where convection is initiated by small-scale topographic noise superimposed onto the main mountain profile, but in this case stronger circulations are also triggered that create stationary rainbands parallel to the low-level flow. This second dominant mode, which is less sensitive to the atmospheric structure and the strength of forcing, is triggered by lee waves that form over small-scale topographic bumps near the upstream edge of the main orographic cloud. Due to their stationarity, these flow-parallel bands can produce locally heavy precipitation amounts.

Relevance: 10.00%

Abstract:

Time/frequency and temporal analyses have been widely used in biomedical signal processing. These methods represent important characteristics of a signal in both the time and frequency domains. In this way, essential features of the signal can be viewed and analysed in order to understand or model the physiological system. Historically, Fourier spectral analysis has provided a general method for examining global energy/frequency distributions. However, an assumption inherent to these methods is the stationarity of the signal. As a result, Fourier methods are not generally appropriate for the investigation of signals with transient components. This work presents the application of a new signal processing technique, empirical mode decomposition with the Hilbert spectrum, to the analysis of electromyographic signals. The results show that this method may provide not only an increase in spectral resolution but also insight into the underlying process of muscle contraction.
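
A minimal sketch of the EMD-plus-Hilbert-spectrum idea, assuming the PyEMD package (installed as EMD-signal) and a synthetic nonstationary signal in place of the EMG recordings analysed in the paper:

```python
# Hedged sketch of EMD + Hilbert spectral analysis on a synthetic signal.
# The EMG data and parameters of the study are not reproduced here.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

fs = 1000.0                          # sampling rate (Hz), assumed
t = np.arange(0, 2.0, 1.0 / fs)
# Synthetic nonstationary signal: a chirp plus a short transient burst.
signal = np.sin(2 * np.pi * (10 + 20 * t) * t)
signal[800:900] += 0.8 * np.sin(2 * np.pi * 90 * t[800:900])

imfs = EMD().emd(signal, t)          # intrinsic mode functions

# Instantaneous amplitude and frequency of each IMF via the Hilbert transform.
for k, imf in enumerate(imfs):
    analytic = hilbert(imf)
    amp = np.abs(analytic)
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    print(f"IMF {k}: mean inst. freq = {inst_freq.mean():.1f} Hz, "
          f"mean amp = {amp.mean():.3f}")
```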

Relevance: 10.00%

Abstract:

Current methods for estimating event-related potentials (ERPs) assume stationarity of the signal. Empirical mode decomposition (EMD) is a data-driven decomposition technique that does not assume stationarity. We evaluated an EMD-based method for estimating the ERP. On simulated data, EMD substantially reduced background EEG while retaining the ERP. EMD-denoised single trials also estimated the shape, amplitude, and latency of the ERP better than raw single trials. On experimental data, EMD-denoised trials revealed event-related differences between two conditions (A and B) more effectively than trials low-pass filtered at 40 Hz. EMD also revealed event-related differences in both conditions that were clearer and of longer duration than those revealed by low-pass filtering at 40 Hz. Thus, EMD-based denoising is a promising data-driven, nonstationary method for estimating ERPs and should be investigated further.
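
The abstract does not specify the paper's IMF-selection rule; the sketch below illustrates the generic approach — decompose a single trial with EMD and reconstruct it from a subset of IMFs — with a placeholder "drop the fastest modes" rule and PyEMD as an assumed implementation:

```python
# Hedged sketch: EMD-based denoising of a single trial by partial
# reconstruction. The IMF-selection rule here (drop the two fastest
# modes) is a placeholder, not the paper's criterion.
import numpy as np
from PyEMD import EMD

def emd_denoise(trial, n_drop=2):
    """Decompose one trial and rebuild it without the fastest IMFs."""
    imfs = EMD().emd(trial)
    if len(imfs) <= n_drop:
        return trial                     # too few modes to drop any
    return imfs[n_drop:].sum(axis=0)     # IMFs are ordered fast -> slow

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 500)
erp = np.exp(-((t - 0.3) ** 2) / 0.005)          # synthetic ERP-like bump
trial = erp + 0.5 * rng.standard_normal(t.size)  # add EEG-like noise
denoised = emd_denoise(trial)
print("error std:", np.std(trial - erp), "->", np.std(denoised - erp))
```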

Relevance: 10.00%

Abstract:

The occurrence of strong and persistent mid-latitude anticyclonic ridges over the Eastern North Atlantic is a major contributor to severe winter droughts over Western Iberia. We analyze the development of strong and persistent ridge episodes within 40–50°N, 40°W–5°E, defined as 300 hPa geopotential height anomalies above 50 gpm that persist for at least 10 consecutive days. Results suggest that the generation and maintenance of these episodes are associated with an intensified polar jet, with positive stratospheric geopotential anomalies over the North American continent and the adjacent North Pacific. Such positive anomalies tend to detach from the main stratospheric anomaly and propagate eastwards and downwards as tropospheric Rossby waves. Furthermore, the Eastern North Atlantic ridge is generated and repeatedly reinforced until the stratospheric anomaly dissipates. Results also show evidence of waves breaking anticyclonically during the episodes, which is dynamically consistent with their persistence and quasi-stationarity.
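
The episode definition above translates directly into a run-length check; a minimal sketch, with synthetic anomalies standing in for the 300 hPa geopotential height data:

```python
# Hedged sketch: flag "strong, persistent ridge" episodes in a daily
# geopotential-height anomaly series, using the abstract's criterion
# (anomaly > 50 gpm for at least 10 consecutive days). Data are synthetic.
import numpy as np

def find_episodes(anom, threshold=50.0, min_days=10):
    """Return (start, end) index pairs of runs above threshold."""
    above = anom > threshold
    episodes, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_days:
                episodes.append((start, i - 1))
            start = None
    if start is not None and len(above) - start >= min_days:
        episodes.append((start, len(above) - 1))
    return episodes

rng = np.random.default_rng(2)
anom = 30 * rng.standard_normal(365)
anom[100:115] = 80.0                     # implant one 15-day ridge episode
print(find_episodes(anom))
```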

Relevance: 10.00%

Abstract:

This chapter applies rigorous statistical analysis to existing datasets of medieval exchange rates quoted in merchants' letters sent from Barcelona, Bruges and Venice between 1380 and 1410, which survive in the archive of Francesco di Marco Datini of Prato. First, it tests the exchange rates for stationarity. Second, it uses regression analysis to examine the seasonality of exchange rates at the three financial centres and compares them against contemporary descriptions by the merchant Giovanni di Antonio da Uzzano. Third, it tests for structural breaks in the exchange rate series.
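
As a hedged illustration of the first two steps — a unit-root test for stationarity and a seasonal regression — on a made-up monthly exchange-rate series (the Datini data are not reproduced here):

```python
# Hedged sketch: stationarity test and seasonality regression on a
# synthetic exchange-rate series. Values and units are illustrative.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(3)
n = 600                                  # hypothetical monthly quotations
month = np.tile(np.arange(1, 13), n // 12)
seasonal = 0.2 * np.sin(2 * np.pi * month / 12)      # mild seasonality
rate = 22 + seasonal + 0.3 * rng.standard_normal(n)  # pence per florin, say

# Step 1: Augmented Dickey-Fuller unit-root test (null: non-stationary).
stat, pvalue, *_ = adfuller(rate)
print(f"ADF statistic {stat:.2f}, p-value {pvalue:.3f}")

# Step 2: regress the rate on month-of-year dummies to expose seasonality.
dummies = (month[:, None] == np.arange(2, 13)).astype(float)  # Jan as base
X = np.column_stack([np.ones(n), dummies])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
print("monthly effects vs January:", np.round(beta[1:], 3))
```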

Relevance: 10.00%

Abstract:

We present a selection of methodologies for using the palaeo-climate model component of the Coupled Model Intercomparison Project (Phase 5; CMIP5) to attempt to constrain future climate projections using the same models. The constraints arise from measures of skill in hindcasting palaeo-climate changes from the present over three periods: the Last Glacial Maximum (LGM) (21 000 yr before present, ka), the mid-Holocene (MH) (6 ka) and the Last Millennium (LM) (850–1850 CE). The skill measures may be used to validate robust patterns of climate change across scenarios or to distinguish between models that have differing outcomes in future scenarios. We find that the multi-model ensemble of palaeo-simulations is adequate for addressing at least some of these issues. For example, selected benchmarks for the LGM and MH are correlated with the rank of future projections of precipitation/temperature or sea ice extent, indicating that models that produce the best agreement with palaeo-climate information give demonstrably different future results from the rest of the models. We also explore cases where comparisons are strongly dependent on uncertain forcing time series or show important non-stationarity, making direct inferences for the future problematic. Overall, we demonstrate that there is strong potential for the palaeo-climate simulations to help inform the future projections, and we urge all the modelling groups to complete this subset of the CMIP5 runs.
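
A toy version of the benchmarking step — rank-correlating a palaeo hindcast skill score with projected future change across an ensemble — with invented numbers in place of CMIP5 output:

```python
# Hedged sketch: relate a palaeo-hindcast skill score to projected future
# change across a model ensemble with a rank correlation. The scores and
# projections below are made-up placeholders, not CMIP5 results.
import numpy as np
from scipy.stats import spearmanr

# One entry per model in a hypothetical ensemble.
lgm_skill = np.array([0.61, 0.45, 0.72, 0.38, 0.55, 0.66, 0.49])
future_warming = np.array([3.1, 3.9, 2.8, 4.2, 3.5, 3.0, 3.8])  # degC

rho, p = spearmanr(lgm_skill, future_warming)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
# A strong |rho| would suggest that palaeo skill discriminates between
# models with different future outcomes.
```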

Relevance: 10.00%

Abstract:

Lagged correlation analysis is often used to infer intraseasonal dynamical effects but is known to be affected by non-stationarity. We highlight a pronounced quasi-two-year peak in the anomalous zonal wind and eddy momentum flux convergence power spectra in the Southern Hemisphere, which is prima facie evidence for non-stationarity. We then investigate the consequences of this non-stationarity for the Southern Annular Mode and for eddy momentum flux convergence. We argue that positive lagged correlations previously attributed to the existence of an eddy feedback are more plausibly attributed to non-stationary interannual variability external to any potential feedback process in the mid-latitude troposphere. The findings have implications for the diagnosis of feedbacks in both models and re-analysis data as well as for understanding the mechanisms underlying variations in the zonal wind.
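
A minimal sketch of the lagged correlation diagnostic discussed above, computed on synthetic persistent series rather than zonal-wind and eddy-flux data:

```python
# Hedged sketch: lagged correlation between two anomaly series, the
# diagnostic the abstract says can be contaminated by non-stationarity.
import numpy as np

def lagged_corr(x, y, max_lag=30):
    """Correlation of x(t) with y(t+lag) for lag in [-max_lag, max_lag]."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = range(-max_lag, max_lag + 1)
    out = []
    for lag in lags:
        if lag >= 0:
            out.append(np.mean(x[: len(x) - lag] * y[lag:]))
        else:
            out.append(np.mean(x[-lag:] * y[: len(y) + lag]))
    return np.array(list(lags)), np.array(out)

rng = np.random.default_rng(4)
z = rng.standard_normal(2000)
u = np.convolve(z, 0.9 ** np.arange(50), mode="same")  # persistent "wind"
eddy = np.roll(u, -5) + rng.standard_normal(2000)      # leads u by 5 steps
lags, r = lagged_corr(u, eddy)
print("peak correlation at lag", lags[np.argmax(r)])
```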

Relevance: 10.00%

Abstract:

The Madden-Julian oscillation (MJO) is the most prominent form of tropical intraseasonal variability. This study investigated the following questions. Do inter-annual-to-decadal variations in tropical sea surface temperature (SST) lead to substantial changes in MJO activity? Was there a change in the MJO in the 1970s? Can this change be associated with SST anomalies? What was the level of MJO activity in the pre-reanalysis era? These questions were investigated with a stochastic model of the MJO. Reanalysis data (1948-2008) were used to develop a nine-state first-order Markov model capable of simulating the non-stationarity of the MJO. The model is driven by observed SST anomalies, and a large ensemble of simulations was performed to infer the activity of the MJO in the instrumental period (1880-2008). The model reproduces the activity of the MJO during the reanalysis period. The simulations indicate that the MJO exhibited a regime of near-normal activity in 1948-1972 (3.4 events per year) and two regimes of high activity in 1973-1989 (3.9 events) and 1990-2008 (4.6 events). Stochastic simulations indicate decadal shifts with near-normal levels in 1880-1895 (3.4 events), low activity in 1896-1917 (2.6 events) and a return to near-normal levels during 1918-1947 (3.3 events). The results also point to significant decadal changes in the probability of very active years (5 or more MJO events): 0.214 (1880-1895), 0.076 (1896-1917), 0.197 (1918-1947) and 0.193 (1948-1972). After the change in behavior in the 1970s, this probability increased to 0.329 (1973-1989) and 0.510 (1990-2008). The observational and stochastic simulations presented here call attention to the need to further understand the variability of the MJO on a wide range of time scales.
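
The core device is a first-order Markov chain; a reduced sketch with an invented three-state transition matrix (the paper's model has nine states and SST-dependent transitions):

```python
# Hedged sketch: simulating a first-order Markov chain, the basic device
# behind the paper's nine-state MJO model. The 3x3 transition matrix here
# is invented for illustration.
import numpy as np

P = np.array([[0.8, 0.15, 0.05],     # quiet -> ...
              [0.3, 0.5, 0.2],       # developing -> ...
              [0.2, 0.3, 0.5]])      # active MJO -> ...

rng = np.random.default_rng(5)
state, chain = 0, []
for _ in range(365):
    state = rng.choice(3, p=P[state])
    chain.append(state)

# Count entries into the "active" state as a crude stand-in for MJO events.
chain = np.array(chain)
events = np.sum((chain[1:] == 2) & (chain[:-1] != 2))
print("active episodes in one simulated year:", events)
```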

Relevance: 10.00%

Abstract:

This work is an assessment of the frequency of extreme values (EVs) of daily rainfall in the city of Sao Paulo, Brazil, over the period 1933-2005, based on the peaks-over-threshold (POT) and Generalized Pareto Distribution (GPD) approach. Usually, a GPD model is fitted to a sample of POT values selected with a constant threshold. However, in this work we use time-dependent thresholds, composed of relatively large p quantiles (for example, p of 0.97) of daily rainfall amounts computed from all available data. Samples of POT values were extracted with several values of p. Four different GPD models (GPD-1, GPD-2, GPD-3, and GPD-4) were fitted to each of these samples by the maximum likelihood (ML) method. The shape parameter was assumed constant for the four models, but time-varying covariates were incorporated into the scale parameter of GPD-2, GPD-3, and GPD-4, describing an annual cycle in GPD-2, a linear trend in GPD-3, and both an annual cycle and a linear trend in GPD-4. GPD-1, with constant scale and shape parameters, is the simplest model. To identify the best of the four models we used the rescaled Akaike Information Criterion (AIC) with second-order bias correction. This criterion isolates GPD-3 as the best model, i.e. the one with a positive linear trend in the scale parameter. The slope of this trend is significant against the null hypothesis of no trend at about the 98% confidence level. The non-parametric Mann-Kendall test also showed a positive trend in the annual frequency of excesses over high thresholds, with a p-value of virtually zero. Therefore, there is strong evidence that high quantiles of daily rainfall in the city of Sao Paulo have been increasing in magnitude and frequency over time. For example, the 0.99 quantile of daily rainfall amounts increased by about 40 mm between 1933 and 2005. Copyright (C) 2008 Royal Meteorological Society
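
A sketch of the simplest, GPD-1-like fit — constant threshold at the 0.97 quantile, maximum-likelihood GPD on the excesses — using synthetic rainfall in place of the Sao Paulo record:

```python
# Hedged sketch: peaks-over-threshold fit with a constant threshold and a
# stationary GPD (GPD-1-like). Rainfall values are synthetic.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(6)
rain = rng.gamma(shape=0.4, scale=12.0, size=26000)  # fake daily rainfall (mm)

threshold = np.quantile(rain, 0.97)
excesses = rain[rain > threshold] - threshold

# ML fit of the GPD to threshold excesses (location fixed at 0).
shape, loc, scale = genpareto.fit(excesses, floc=0)
print(f"u = {threshold:.1f} mm, xi = {shape:.3f}, sigma = {scale:.2f}")

# Implied high quantile of daily rainfall (e.g. the 0.999 level).
p_exceed = excesses.size / rain.size
q = threshold + genpareto.ppf(1 - 0.001 / p_exceed, shape, scale=scale)
print(f"0.999 quantile approx {q:.1f} mm")
```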

Relevance: 10.00%

Abstract:

The objective of this article is to determine the influence of the parameters of ARIMA-GARCH models on the predictions of feed-forward artificial neural networks (ANN) trained with the Levenberg-Marquardt algorithm, through Monte Carlo simulations. The paper presents a study of the relationship between ANN performance and the ARIMA-GARCH model parameters, i.e. the fact that, depending on the stationarity and other parameters of the time series, the ANN structure should be selected differently. Neural networks have been widely used to predict time series, and their capacity for dealing with non-linearities is normally an outstanding advantage. However, the values of the parameters of generalized autoregressive conditional heteroscedasticity models influence ANN prediction performance. The combination of the GARCH parameter values with the ARIMA autoregressive terms also leads to variation in ANN performance. Combining the parameters of the ARIMA-GARCH models and varying the ANN topologies, we used the Theil inequality coefficient to measure the predictions of the feed-forward ANN.
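
A hedged sketch of the data-generating side: simulate an AR(1)-GARCH(1,1) series and score a one-step forecast with the Theil inequality coefficient (the ANN itself and the paper's Monte Carlo design are omitted; parameter values are arbitrary):

```python
# Hedged sketch: AR(1)-GARCH(1,1) simulation plus Theil's U for a simple
# one-step benchmark forecast. Not the paper's experimental setup.
import numpy as np

rng = np.random.default_rng(7)
n, phi = 2000, 0.6                     # AR(1) coefficient
omega, alpha, beta = 0.1, 0.1, 0.85    # GARCH(1,1) parameters

y, eps = np.zeros(n), np.zeros(n)
sigma2 = np.full(n, omega / (1 - alpha - beta))
for t in range(1, n):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    y[t] = phi * y[t - 1] + eps[t]

# Benchmark forecast: y_hat(t) = phi * y(t-1), using the true AR part.
pred, actual = phi * y[:-1], y[1:]
theil_u = (np.sqrt(np.mean((actual - pred) ** 2))
           / (np.sqrt(np.mean(actual ** 2)) + np.sqrt(np.mean(pred ** 2))))
print(f"Theil U = {theil_u:.3f}")      # 0 = perfect, 1 = worst
```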

Relevance: 10.00%

Abstract:

The aim of the study was to see if any relationship between government spending and unemployment could be found empirically. To test whether government spending affects unemployment, a statistical model was applied to data from Sweden. The data were quarterly, from 1994 until 2012; unit-root tests were conducted and the variables were transformed to first differences to ensure stationarity. This transformation changed the variables to growth rates, which meant that the interpretation deviated a little from the original goal. Other studies reviewed indicate that output increases when government spending increases and/or taxes decrease. Studies show that unemployment decreases when the government spending/GDP ratio increases. Some studies also indicate that, with an already large government sector, increasing spending could have a negative effect on output. The model was a VAR model with unemployment, output, the interest rate, taxes and government spending. Also included in the model were a linear trend and three quarterly dummies. The model used 7 lags. The result was not statistically significant for most lags but indicated that, as the government spending growth rate increases, holding everything else constant, the unemployment growth rate increases. The result for taxes was even less statistically significant and indicates no relationship with unemployment. Post-estimation tests indicate that there were problems with non-normality in the model, so the results should be interpreted with some scepticism.
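
A minimal sketch of the setup — a VAR on first-differenced series — using statsmodels and simulated data; the Swedish series, dummy variables and 7-lag specification are not reproduced:

```python
# Hedged sketch: a small VAR on first-differenced quarterly series, the
# general setup the thesis describes. Variables are simulated.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(8)
n = 76                                      # quarterly obs, 1994-2012
levels = pd.DataFrame(
    np.cumsum(rng.standard_normal((n, 3)), axis=0),
    columns=["unemployment", "output", "gov_spending"],
)

growth = levels.diff().dropna()             # first differences ~ growth rates
model = VAR(growth)
results = model.fit(maxlags=4, ic="aic")    # let AIC choose up to 4 lags
print(results.summary())
```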

Relevance: 10.00%

Abstract:

The presence of a deterministic or stochastic trend in U.S. GDP has been a continuing debate in the macroeconomics literature. Ben-David and Papell (1995) found evidence in favor of trend stationarity using the secular sample of Maddison (1995). More recently, Murray and Nelson (2000) correctly criticized this finding, arguing that the Maddison data are plagued with additive outliers (AOs), which bias inference towards stationarity. Hence, they propose to set the secular sample aside and conduct inference using a more homogeneous but shorter time-span post-WWII sample. In this paper we revisit the Maddison data by employing a test that is robust against AOs. Our results suggest that U.S. GDP can be modeled as a trend-stationary process.
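
To illustrate why additive outliers matter, a generic MAD-based flagging rule on a synthetic trend-plus-noise GDP series; this is an illustration only, not the robust test used in the paper:

```python
# Hedged sketch: flag observations that deviate sharply from a rolling
# median, the kind of additive outlier said to bias unit-root inference.
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
n = 130
log_gdp = 0.02 * np.arange(n) + np.cumsum(0.01 * rng.standard_normal(n))
log_gdp[[40, 41]] -= 0.15                 # implant wartime-style outliers

s = pd.Series(log_gdp)
resid = s - s.rolling(9, center=True, min_periods=1).median()
mad = np.median(np.abs(resid - resid.median()))
outliers = np.where(np.abs(resid) > 5 * 1.4826 * mad)[0]
print("flagged additive outliers at indices:", outliers)
```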

Relevance: 10.00%

Abstract:

This paper develops a family of autoregressive conditional duration (ACD) models that encompasses most specifications in the literature. The nesting relies on a Box-Cox transformation with shape parameter λ applied to the conditional duration process and a possibly asymmetric shock impact curve. We establish conditions for the existence of higher-order moments, strict stationarity, geometric ergodicity and the β-mixing property with exponential decay. We next derive moment recursion relations and the autocovariance function of the λth power of the duration process. Finally, we assess the practical usefulness of our family of ACD models using NYSE transactions data, with special attention to IBM price durations. The results warrant the extra flexibility provided either by the Box-Cox transformation or by the asymmetric response to shocks.
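
For reference, the baseline ACD(1,1) recursion that the Box-Cox family generalizes can be simulated in a few lines; parameters are arbitrary and chosen with α + β < 1 so the process is stationary:

```python
# Hedged sketch: simulating a standard ACD(1,1) process,
#   x_i = psi_i * eps_i,  psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1},
# the baseline the paper's Box-Cox family generalizes (the Box-Cox version
# transforms psi with a shape parameter lambda, not shown here).
import numpy as np

rng = np.random.default_rng(10)
n, omega, alpha, beta = 5000, 0.1, 0.1, 0.8

x = np.zeros(n)
psi = np.full(n, omega / (1 - alpha - beta))   # unconditional mean duration
for i in range(1, n):
    psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
    x[i] = psi[i] * rng.exponential()          # i.i.d. unit-mean innovations

print("mean duration:", x.mean(), "theoretical:", omega / (1 - alpha - beta))
```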

Relevance: 10.00%

Abstract:

Empirical evidence suggests that the real exchange rate is characterized by the presence of near-unit roots and additive outliers. Recent studies have found evidence in favor of PPP reversion by using the quasi-differencing (Elliott et al., 1996) unit root tests (ERS), which are more efficient against local alternatives but are still based on least squares estimation. Unit root tests based on the least squares method usually tend to bias inference towards stationarity when additive outliers are present. In this paper, we incorporate quasi-differencing into M-estimation to construct a unit root test that is robust not only against near-unit roots but also against the non-Gaussian behavior provoked by additive outliers. We revisit the PPP hypothesis and find less evidence in favor of PPP reversion when the non-Gaussian behavior in real exchange rates is taken into account.
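
A sketch of the quasi-differencing step alone (the paper's M-estimation stage is not reproduced), using the ERS local-to-unity parameter for the constant-only case:

```python
# Hedged sketch of the ERS quasi-differencing step: transform the series
# with rho_bar = 1 + c/T (c = -7 for the constant-only case) before
# estimating the deterministic term by GLS.
import numpy as np

rng = np.random.default_rng(11)
T = 200
y = np.cumsum(rng.standard_normal(T))        # synthetic real exchange rate

c = -7.0                                     # ERS choice, constant only
rho_bar = 1 + c / T

# Quasi-differenced data: y_1, then y_t - rho_bar * y_{t-1} for t >= 2.
yq = np.concatenate([[y[0]], y[1:] - rho_bar * y[:-1]])
zq = np.concatenate([[1.0], np.full(T - 1, 1 - rho_bar)])  # q-d constant

# GLS estimate of the constant, then the detrended series for the DF step.
mu_hat = (zq @ yq) / (zq @ zq)
y_detrended = y - mu_hat
print("GLS-estimated constant:", round(mu_hat, 3))
```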