26 results for nonstationarity


Relevance:

20.00%

Publisher:

Abstract:

Regional impacts of climate change remain subject to large uncertainties accumulating from various sources, including the choice of general circulation models (GCMs), scenarios, and downscaling methods. Objective constraints to reduce the uncertainty in regional predictions have proven elusive. In most studies to date, the nature of the downscaling relationship (DSR) used for such regional predictions has been assumed to remain unchanged in a future climate. However, studies have shown that climate change may manifest as changes in the frequencies of occurrence of the leading modes of variability, so stationarity of DSRs is not a valid assumption in regional climate impact assessment. This work presents an uncertainty modeling framework in which, in addition to GCM and scenario uncertainty, uncertainty in the nature of the DSR is explored by linking downscaling with changes in the frequencies of such modes of natural variability. Future projections of the regional hydrologic variable, obtained by training a conditional random field (CRF) model on each natural cluster, are combined using the weighted Dempster-Shafer (D-S) theory of evidence combination. Each projection is weighted with the future projected frequency of occurrence of that cluster ("cluster linking") and scaled by the GCM performance with respect to the associated cluster for the present period ("frequency scaling"). The D-S theory was chosen for its ability to express beliefs in some hypotheses, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The methodology is tested by predicting monsoon streamflow of the Mahanadi River at Hirakud Reservoir in Orissa, India. The results show an increasing probability of extreme, severe, and moderate droughts due to climate change. Cluster linking and frequency scaling yield significantly improved agreement between GCM predictions, suggesting that by linking regional impacts to natural regime frequencies, uncertainty in regional predictions can be realistically quantified. Additionally, by using a measure of GCM performance in simulating natural regimes, this uncertainty can be effectively constrained.
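
The abstract outlines the evidence-combination step without implementation detail. Below is a minimal Python sketch of plain (unweighted) Dempster's rule of combination over a small, hypothetical frame of drought severities; the mass assignments and the paper's cluster/GCM weighting scheme are illustrative, not reproduced from the study.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions given as dicts
    mapping frozenset hypotheses to masses summing to 1."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict")
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

def belief(m, hypothesis):
    """Bel(A) = sum of masses of all subsets of A."""
    return sum(w for h, w in m.items() if h <= hypothesis)

# Hypothetical: two cluster-weighted projections expressing support for
# drought-severity classes (the weights are made up for illustration).
frame = frozenset({"moderate", "severe", "extreme"})
m1 = {frozenset({"severe", "extreme"}): 0.6, frame: 0.4}
m2 = {frozenset({"moderate", "severe"}): 0.5, frame: 0.5}
m12 = dempster_combine(m1, m2)
print(m12, belief(m12, frozenset({"severe"})))
```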

Relevance:

20.00%

Publisher:

Abstract:

Doctoral thesis in Quantitative Methods Applied to Economics and Management, Faculdade de Economia, Universidade do Algarve, 2008

Relevance:

20.00%

Publisher:

Abstract:

Many unit root and cointegration tests require an estimate of the spectral density function at frequency zero of some process. Kernel estimators based on weighted sums of autocovariances constructed using estimated residuals from an AR(1) regression are commonly used. However, it is known that with substantially correlated errors, the OLS estimate of the AR(1) parameter is severely biased. In this paper, we first show that this least-squares bias induces a significant increase in the bias and mean-squared error of kernel-based estimators.
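
As an illustration of the class of estimator discussed, here is a minimal sketch of a Bartlett-kernel estimate of the spectral density at frequency zero with AR(1) prewhitening and recoloring. The truncation lag and kernel are conventional choices assumed for the sketch, not the paper's specification.

```python
import numpy as np

def f0_ar1_prewhitened(y, M):
    """Estimate the spectral density at frequency zero via a Bartlett-kernel
    weighted sum of autocovariances of AR(1)-prewhitened residuals,
    recolored by the estimated AR(1) parameter."""
    y = np.asarray(y, dtype=float)
    y = y - y.mean()
    y0, y1 = y[:-1], y[1:]
    rho = (y0 @ y1) / (y0 @ y0)                 # OLS AR(1) estimate (downward
                                                # biased with correlated errors)
    u = y1 - rho * y0                           # prewhitened residuals
    n = len(u)
    gam0 = u @ u / n
    s_u = gam0 + sum(2.0 * (1.0 - j / (M + 1.0)) * (u[j:] @ u[:-j]) / n
                     for j in range(1, M + 1))  # Bartlett-weighted LRV of u
    return s_u / ((1.0 - rho) ** 2 * 2.0 * np.pi)   # recolor; f(0) = LRV / (2*pi)
```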

Relevance:

20.00%

Publisher:

Abstract:

Reanalysis data provide an excellent test bed for impacts prediction systems because they represent an upper limit on the skill of climate models. Indian groundnut (Arachis hypogaea L.) yields have been simulated using the General Large-Area Model (GLAM) for annual crops and the European Centre for Medium-Range Weather Forecasts (ECMWF) 40-yr reanalysis (ERA-40). The ability of ERA-40 to represent the Indian summer monsoon has been examined. The ability of GLAM, when driven with daily ERA-40 data, to model both observed yields and observed relationships between subseasonal weather and yield has been assessed. Mean yields were simulated well across much of India. Correlations between observed and modeled yields, where significant, are comparable to correlations between observed yields and ERA-40 rainfall. Uncertainties due to the input planting window, crop duration, and weather data have been examined. A reduction in the root-mean-square error of simulated yields was achieved by applying bias correction techniques to the precipitation. The stability of the relationship between weather and yield over time has been examined. Weather-yield correlations vary on decadal time scales, and this has direct implications for the accuracy of yield simulations. Analysis of the skewness of both detrended yields and precipitation suggests that nonclimatic factors are partly responsible for this nonstationarity. Evidence from other studies, including data on cereal and pulse yields, indicates that this result is not particular to groundnut yield. The detection and modeling of nonstationary weather-yield relationships emerges from this study as an important part of the process of understanding and predicting the impacts of climate variability and change on crop yields.
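
The abstract mentions bias correction of ERA-40 precipitation without specifying the technique. A common choice is empirical quantile mapping, sketched below under that assumption (the sample data are made up for illustration).

```python
import numpy as np

def quantile_map(model_clim, obs_clim, target):
    """Map each target value to the observed value at the same empirical
    quantile it occupies in the model climatology."""
    model_sorted = np.sort(np.asarray(model_clim, dtype=float))
    obs_sorted = np.sort(np.asarray(obs_clim, dtype=float))
    q = np.searchsorted(model_sorted, np.asarray(target, dtype=float),
                        side="right") / len(model_sorted)
    return np.quantile(obs_sorted, np.clip(q, 0.0, 1.0))

# Hypothetical usage: correct reanalysis-like daily rain against gauge data.
rng = np.random.default_rng(1)
era_rain, gauge_rain = rng.gamma(2, 2, 5000), rng.gamma(2, 3, 5000)
print(quantile_map(era_rain, gauge_rain, era_rain[:5]))
```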

Relevance:

20.00%

Publisher:

Abstract:

We examine a method recently proposed by Hinich and Patterson (mimeo, University of Texas at Austin, 1995) for testing the validity of specifying a GARCH error structure for financial time series data in the context of a set of ten daily Sterling exchange rates. The results demonstrate that there are statistical structures present in the data that cannot be captured by a GARCH model, or any of its variants. This result has important implications for the interpretation of the recent voluminous literature which attempts to model financial asset returns using this family of models.
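
The Hinich-Patterson test itself is not described in the abstract. As a generic stand-in for checking whether a GARCH filter captures all the structure in a series, here is a sketch that fits a GARCH(1,1) (assuming the `arch` package) and applies a Ljung-Box test to the squared standardized residuals; this is a standard adequacy check, not the paper's test.

```python
import numpy as np
from arch import arch_model                          # assumes the `arch` package
from statsmodels.stats.diagnostic import acorr_ljungbox

def garch_adequacy(returns, lags=20):
    """Fit a GARCH(1,1) and Ljung-Box-test the squared standardized
    residuals: leftover serial structure suggests the GARCH model is
    not capturing everything."""
    r = 100.0 * np.asarray(returns, dtype=float)     # scaling helps the optimizer
    res = arch_model(r, vol="GARCH", p=1, q=1, mean="Constant").fit(disp="off")
    z = np.asarray(res.std_resid, dtype=float)
    z = z[~np.isnan(z)]
    return res, acorr_ljungbox(z ** 2, lags=[lags])
```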

Relevance:

20.00%

Publisher:

Abstract:

To derive tests for randomness, nonlinear-independence, and stationarity, we combine surrogates with a nonlinear prediction error, a nonlinear interdependence measure, and linear variability measures, respectively. We apply these tests to intracranial electroencephalographic (EEG) recordings from patients suffering from pharmacoresistant focal-onset epilepsy. These recordings had been performed prior to, and independently of, our study as part of the epilepsy diagnostics. The clinical purpose of these recordings was to delineate the brain areas to be surgically removed in each individual patient in order to achieve seizure control. This allowed us to define two distinct sets of signals: one set recorded from brain areas where the first ictal EEG signal changes were detected, as judged by expert visual inspection ("focal signals"), and one set recorded from brain areas that were not involved at seizure onset ("nonfocal signals"). We find more rejections for both the randomness and the nonlinear-independence tests for focal versus nonfocal signals. In contrast, more rejections of the stationarity test are found for nonfocal signals. Furthermore, while for nonfocal signals the rejection of the stationarity test substantially increases the rejection probability of the randomness and nonlinear-independence tests, we find a much weaker influence for the focal signals. In consequence, the contrast between the focal and nonfocal signals obtained from the randomness and nonlinear-independence tests is further enhanced when we exclude signals for which the stationarity test is rejected. To study the dependence between the randomness and nonlinear-independence tests, we include only focal signals for which the stationarity test is not rejected. We show that the rejection of these two tests correlates across signals. The rejection of either test is, however, neither necessary nor sufficient for the rejection of the other test. Thus, our results suggest that EEG signals from epileptogenic brain areas are less random, more nonlinear-dependent, and more stationary compared to signals recorded from nonepileptogenic brain areas. We provide the data, source code, and detailed results in the public domain.
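
To make the test logic concrete, here is a sketch of a surrogate-based randomness test using the simplest Fourier (phase-randomized) surrogates and a zeroth-order nonlinear prediction error. The study uses more refined surrogates, parameter choices, and additional measures, so treat this only as the general recipe.

```python
import numpy as np

rng = np.random.default_rng(0)

def ft_surrogate(x):
    """Simplest Fourier surrogate: keep the power spectrum, randomize phases."""
    xf = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(xf))
    phases[0] = 0.0                       # keep the mean
    return np.fft.irfft(np.abs(xf) * np.exp(1j * phases), n=len(x))

def prediction_error(x, dim=3, lag=1, horizon=1, k=5):
    """Zeroth-order (nearest-neighbor) prediction error after delay
    embedding. No Theiler window is used here, a simplification."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * lag - horizon
    emb = np.column_stack([x[i * lag:i * lag + n] for i in range(dim)])
    target = x[(dim - 1) * lag + horizon:(dim - 1) * lag + horizon + n]
    errs = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf
        nn = np.argpartition(d, k)[:k]    # k nearest neighbors
        errs[i] = (target[nn].mean() - target[i]) ** 2
    return float(np.sqrt(errs.mean()))

def rejects_randomness(x, n_surrogates=19):
    """One-sided rank test: reject if the original signal is better
    predictable than every surrogate (p = 1/(n_surrogates+1) = 0.05)."""
    e0 = prediction_error(x)
    return all(e0 < prediction_error(ft_surrogate(x)) for _ in range(n_surrogates))
```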

Relevance:

10.00%

Publisher:

Abstract:

Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for the problems of memory detection and of modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), and long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation of this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ), and the Toronto Stock Exchange (TSX).
The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market. The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method, and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on second-order moments, seem to underestimate the long-memory dynamics of the process. This highlights the need for and usefulness of fractal methods in modelling non-Gaussian financial processes with long memory.
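
Of the methods named, MF-DFA is the most algorithmic. A minimal sketch follows (single forward segmentation, small set of q values); scale ranges, detrending order, and averaging conventions in the thesis may differ.

```python
import numpy as np

def mfdfa(x, scales, qs=(-2.0, 0.0, 2.0), order=1):
    """Return the generalized Hurst exponent h(q) for each q via MF-DFA:
    integrate to the profile, detrend within segments of size s with a
    polynomial of the given order, and regress log F_q(s) on log s."""
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))    # profile
    h = {}
    for q in qs:
        logF = []
        for s in scales:
            segs = len(y) // s
            f2 = np.empty(segs)
            t = np.arange(s)
            for v in range(segs):
                seg = y[v * s:(v + 1) * s]
                f2[v] = np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
            if q == 0.0:                                      # limiting formula
                logF.append(0.5 * np.mean(np.log(f2)))
            else:
                logF.append(np.log(np.mean(f2 ** (q / 2.0))) / q)
        h[q] = np.polyfit(np.log(scales), logF, 1)[0]         # slope = h(q)
    return h

# h(2) ~ 0.5 for white noise; h(q) varying strongly with q signals multifractality.
```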

Relevance:

10.00%

Publisher:

Abstract:

Australasian marsupials include three major radiations, the insectivorous/carnivorous Dasyuromorphia, the omnivorous bandicoots (Peramelemorphia), and the largely herbivorous diprotodontians. Morphologists have generally considered the bandicoots and diprotodontians to be closely related, most prominently because they are both syndactylous (with the 2nd and 3rd pedal digits being fused). Molecular studies have been unable to confirm or reject this Syndactyla hypothesis. Here we present new mitochondrial (mt) genomes from a spiny bandicoot (Echymipera rufescens) and two dasyurids, a fat-tailed dunnart (Sminthopsis crassicaudata) and a northern quoll (Dasyurus hallucatus). By comparing trees derived from pairwise base-frequency differences between taxa with standard (absolute, uncorrected) distance trees, we infer that composition bias among mt protein-coding and RNA sequences is sufficient to mislead tree reconstruction. This can explain incongruence between trees obtained from mt and nuclear data sets. However, after excluding major sources of compositional heterogeneity, both the “reduced-bias” mt and nuclear data sets clearly favor a bandicoot plus dasyuromorphian association, as well as a grouping of kangaroos and possums (Phalangeriformes) among diprotodontians. Notably, alternatives to these groupings could only be confidently rejected by combining the mt and nuclear data. Elsewhere on the tree, Dromiciops appears to be sister to the monophyletic Australasian marsupials, whereas the placement of the marsupial mole (Notoryctes) remains problematic. More generally, we contend that it is desirable to combine mt genome and nuclear sequences for inferring vertebrate phylogeny, but as separately modeled process partitions. This strategy depends on detecting and excluding (or accounting for) major sources of nonhistorical signal, such as from compositional nonstationarity.
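
The comparison between composition-driven trees and standard distance trees can be made concrete via the two distance matrices such trees are built from. The sketch below (tree building itself, e.g. neighbor joining, is omitted) is an illustration, not the authors' pipeline.

```python
import numpy as np

BASES = "ACGT"

def composition_distance(seqs):
    """Pairwise Euclidean distance between base-frequency vectors; a tree
    built from this matrix reflects compositional attraction only."""
    freqs = []
    for s in seqs:
        counts = np.array([s.upper().count(b) for b in BASES], dtype=float)
        freqs.append(counts / counts.sum())
    f = np.array(freqs)
    return np.linalg.norm(f[:, None, :] - f[None, :, :], axis=-1)

def p_distance(seqs):
    """Uncorrected proportion of differing sites between aligned sequences."""
    n = len(seqs)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            diff = sum(a != b for a, b in zip(seqs[i], seqs[j]))
            d[i, j] = d[j, i] = diff / len(seqs[i])
    return d
```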

Relevance:

10.00%

Publisher:

Abstract:

In order to generate skilled and efficient actions, the motor system must find solutions to several problems inherent in sensorimotor control, including nonlinearity, nonstationarity, delays, redundancy, uncertainty, and noise. We review these problems and five computational mechanisms that the brain may use to limit their deleterious effects: optimal feedback control, impedance control, predictive control, Bayesian decision theory, and sensorimotor learning. Together, these computational mechanisms allow skilled and fluent sensorimotor behavior.
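
Among the mechanisms listed, Bayesian decision theory has a classic minimal instance: minimum-variance fusion of two independent noisy estimates of the same state. The numbers below are hypothetical.

```python
def fuse(mu1, var1, mu2, var2):
    """Minimum-variance (Bayesian) fusion of two independent Gaussian
    estimates of the same quantity; weights are inverse variances."""
    w1 = (1.0 / var1) / (1.0 / var1 + 1.0 / var2)
    return w1 * mu1 + (1.0 - w1) * mu2, 1.0 / (1.0 / var1 + 1.0 / var2)

# Hypothetical: vision says 10 cm (variance 4), proprioception 12 cm (variance 1);
# the fused estimate (11.6, variance 0.8) leans toward the more reliable cue.
print(fuse(10.0, 4.0, 12.0, 1.0))
```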

Relevance:

10.00%

Publisher:

Abstract:

The paper demonstrates the nonstationarity of algal population behaviors by analyzing the historical populations of Nostocales spp. in the River Darling, Australia. Freshwater ecosystems are more likely to be nonstationary than stationary. Nonstationarity implies that only the near past can be used to forecast the near future of the system. However, nonstationarity was not seriously considered in previous efforts to model and predict algal population behaviors. Therefore, the moving-window technique was combined with a radial basis function neural network (RBFNN) approach to deal with nonstationarity when modeling and forecasting the population behaviors of Nostocales spp. in the River Darling. The results showed that the RBFNN model could predict the timing and magnitude of algal blooms of Nostocales spp. with high accuracy. Moreover, a combined model based on individual RBFNN models was implemented and showed superiority over the individual RBFNN models. Hence, the combined model is recommended for modeling and forecasting phytoplankton populations, especially for forecasting.
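
A minimal sketch of the moving-window idea with a small RBF network follows; center selection, window length, and lag order are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian radial-basis features."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, n_centers, width, ridge=1e-6, seed=0):
    """RBF network: centers are a random subset of the inputs; output
    weights come from ridge-regularized least squares."""
    idx = np.random.default_rng(seed).choice(len(X), n_centers, replace=False)
    centers = X[idx]
    Phi = rbf_design(X, centers, width)
    w = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_centers), Phi.T @ y)
    return centers, w

def moving_window_forecasts(series, window=200, lags=4, n_centers=10, width=1.0):
    """One-step-ahead forecasts, refitting on only the latest `window`
    samples at each step so the model can track a drifting system."""
    series = np.asarray(series, dtype=float)
    preds = []
    for t in range(window + lags, len(series)):
        seg = series[t - window - lags:t]
        X = np.column_stack([seg[i:i + window] for i in range(lags)])
        y = seg[lags:lags + window]
        centers, w = fit_rbf(X, y, n_centers, width)
        x_new = series[t - lags:t][None, :]
        preds.append((rbf_design(x_new, centers, width) @ w).item())
    return np.asarray(preds)
```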

Relevance:

10.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Finance from the NOVA – School of Business and Economics

Relevance:

10.00%

Publisher:

Abstract:

Tremor is a clinical feature characterized by oscillations of a part of the body. The detection and study of tremor is an important step in investigations seeking to explain the underlying control strategies of the central nervous system under natural (or physiological) and pathological conditions. It is well established that tremorous activity is composed of deterministic and stochastic components. For this reason, the use of digital signal processing (DSP) techniques that take into account the nonlinearity and nonstationarity of such signals may bring into the analysis new information that is often obscured by traditional linear techniques (e.g., Fourier analysis). In this context, this paper introduces the application of empirical mode decomposition (EMD) and the Hilbert spectrum (HS), relatively new DSP techniques for the analysis of nonlinear and nonstationary time series, to the study of tremor. Our results, obtained from the analysis of experimental signals collected from 31 patients with different neurological conditions, showed that the EMD could automatically decompose acquired signals into basic components, called intrinsic mode functions (IMFs), representing tremorous and voluntary activity. The identification of a physical meaning for IMFs in the context of tremor analysis suggests an alternative and new way of detecting tremorous activity. These results may be relevant for applications requiring automatic detection of tremor. Furthermore, the energy of IMFs was visualized as a function of time and frequency by means of the HS. This analysis showed that the variation of energy of tremorous and voluntary activity could be distinguished and characterized on the HS. Such results may be relevant for applications aiming to identify neurological disorders. In general, both the HS and EMD proved very useful for performing objective analysis of any kind of tremor and can therefore potentially be used for functional assessment.
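
A minimal sketch of the EMD-plus-Hilbert pipeline, assuming the PyEMD package (`pip install EMD-signal`) and SciPy; the paper's preprocessing and the construction of the full time-frequency Hilbert spectrum image are omitted.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD      # assumes the PyEMD package (EMD-signal on PyPI)

def emd_hilbert(signal, fs):
    """Decompose a signal into intrinsic mode functions (IMFs) with EMD,
    then obtain each IMF's instantaneous amplitude and frequency via the
    Hilbert transform -- the ingredients of the Hilbert spectrum."""
    imfs = EMD().emd(np.asarray(signal, dtype=float))
    analytic = hilbert(imfs, axis=-1)
    amplitude = np.abs(analytic)
    phase = np.unwrap(np.angle(analytic), axis=-1)
    inst_freq = np.diff(phase, axis=-1) * fs / (2.0 * np.pi)
    return imfs, amplitude, inst_freq
```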

Relevance:

10.00%

Publisher:

Abstract:

In this paper we investigate fiscal sustainability by using a quantile autoregression (QAR) model. We propose a novel methodology to separate periods of nonstationarity from stationary ones, which allows us to identify various trajectories of public debt that are compatible with fiscal sustainability. We use such trajectories to construct a debt ceiling, that is, the largest value of public debt that does not jeopardize long-run fiscal sustainability. We make out-of-sample forecasts of such a ceiling and show how it could be used by policy makers interested in keeping the public debt on a sustainable path. We illustrate the applicability of our results using Brazilian data.
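
The first step of such an analysis, fitting a quantile autoregression of debt on its lag, can be sketched with statsmodels; the paper's model order, quantile grid, and debt-ceiling construction are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

def qar1_slopes(debt, quantiles=(0.1, 0.25, 0.5, 0.75, 0.9)):
    """Fit a first-order quantile autoregression
        Q_tau(d_t | d_{t-1}) = a(tau) + b(tau) * d_{t-1}
    at several quantiles. Persistence b(tau) >= 1 only in the upper tail
    is the kind of local nonstationarity the paper exploits."""
    y = np.asarray(debt[1:], dtype=float)
    X = sm.add_constant(np.asarray(debt[:-1], dtype=float))
    return {tau: sm.QuantReg(y, X).fit(q=tau).params[1] for tau in quantiles}
```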

Relevance:

10.00%

Publisher:

Abstract:

This thesis is composed of three essays on macroeconometrics and finance. In each essay, which corresponds to one chapter, the objective is to investigate and analyze advanced econometric techniques applied to relevant macroeconomic questions, such as the capital mobility hypothesis and the sustainability of public debt. A finance topic regarding portfolio risk management is also investigated, through an econometric technique used to evaluate Value-at-Risk models. The first chapter investigates an intertemporal optimization model to analyze the current account. Based on Campbell & Shiller's (1987) approach, a Wald test is conducted to analyze a set of restrictions imposed on a VAR used to forecast the current account. The estimation is based on three different procedures: OLS, SUR, and the two-way error decomposition of Fuller & Battese (1974), due to the presence of global shocks. A note on Granger causality is also provided, which is shown to be a necessary condition to perform the Wald test, with serious implications for the validation of the model. An empirical exercise for the G-7 countries is presented, and the results change substantially with the different estimation techniques. A small Monte Carlo simulation is also presented to investigate the size and power of the Wald test based on the considered estimators. The second chapter presents a study of fiscal sustainability based on a quantile autoregression (QAR) model. A novel methodology to separate periods of nonstationarity from stationary ones is proposed, which allows one to identify trajectories of public debt that are not compatible with fiscal sustainability. Moreover, such trajectories are used to construct a debt ceiling, that is, the largest value of public debt that does not jeopardize long-run fiscal sustainability. An out-of-sample forecast of such a ceiling is also constructed and can be used by policy makers interested in keeping the public debt on a sustainable path. An empirical exercise using Brazilian data is conducted to show the applicability of the methodology. In the third chapter, an alternative backtest to evaluate the performance of Value-at-Risk (VaR) models is proposed. The econometric methodology allows one to directly test the overall performance of a VaR model, as well as identify periods of increased risk exposure, which seems to be a novelty in the literature. Quantile regressions provide an appropriate environment to investigate VaR models, since a VaR forecast can naturally be viewed as a conditional quantile of a given return series. An empirical exercise is conducted for the daily S&P 500 series, and a Monte Carlo simulation is also presented, revealing that the proposed test might exhibit more power in comparison to other backtests.
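
For the third chapter's theme, the general idea of a quantile-regression VaR backtest can be sketched as follows: regress returns on the VaR forecasts at the target quantile and test whether the fit is the identity. This conveys the generic idea only, not the thesis's exact statistic.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

def var_qr_backtest(returns, var_forecasts, tau=0.05):
    """Quantile regression Q_tau(r_t | VaR_t) = a + b * VaR_t.
    A well-specified VaR model suggests (a, b) = (0, 1); a Wald
    statistic measures the joint deviation from that point."""
    y = np.asarray(returns, dtype=float)
    X = sm.add_constant(np.asarray(var_forecasts, dtype=float))
    fit = sm.QuantReg(y, X).fit(q=tau)
    theta = fit.params - np.array([0.0, 1.0])          # H0: intercept 0, slope 1
    wald = float(theta @ np.linalg.solve(fit.cov_params(), theta))
    return wald, chi2.sf(wald, df=2)                   # statistic, asymptotic p-value
```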

Relevance:

10.00%

Publisher:

Abstract:

We analyze three sets of doubly censored cohort data on incubation times, estimating incubation distributions using semi-parametric methods and assessing the comparability of the estimates. Weibull models appear to be inappropriate for at least one of the cohorts, and the estimates for the different cohorts are substantially different. We use these estimates as inputs for backcalculation, using a nonparametric method based on maximum penalized likelihood. The different incubation distributions all produce fits to the reported AIDS counts that are as good as the fit from a nonstationary incubation distribution that models treatment effects, but the estimated infection curves are very different. We also develop a method for estimating nonstationarity as part of the backcalculation procedure and find that such estimates also depend very heavily on the assumed incubation distribution. We conclude that incubation distributions are so uncertain that meaningful error bounds are difficult to place on backcalculated estimates and that backcalculation may be too unreliable to be used without being supplemented by other sources of information on HIV prevalence and incidence.
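
The backcalculation step can be made concrete: expected AIDS diagnoses are the convolution of past infections with the incubation distribution, and infections are recovered by penalized likelihood. The sketch below assumes a Poisson likelihood with a second-difference roughness penalty; the paper's exact penalization and parametrization differ in detail.

```python
import numpy as np
from scipy.optimize import minimize

def backcalculate(aids_counts, incubation_pmf, smooth=10.0):
    """Estimate the infection curve h from reported AIDS counts, where
    mu_t = sum_s h_s * f_{t-s} (f = incubation pmf), by penalized Poisson
    likelihood with a second-difference roughness penalty on log h."""
    counts = np.asarray(aids_counts, dtype=float)
    f = np.asarray(incubation_pmf, dtype=float)
    T = len(counts)

    def objective(log_h):
        h = np.exp(log_h)                              # enforce positivity
        mu = np.maximum(np.convolve(h, f)[:T], 1e-12)  # expected diagnoses
        nll = np.sum(mu - counts * np.log(mu))         # Poisson NLL (up to a constant)
        return nll + smooth * np.sum(np.diff(log_h, n=2) ** 2)

    x0 = np.full(T, np.log(max(counts.mean(), 1.0)))   # flat starting curve
    return np.exp(minimize(objective, x0, method="L-BFGS-B").x)
```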