930 results for Threshold estimation


Relevance:

100.00%

Publisher:

Abstract:

The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice empirically determines this quantity and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution, and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are indirectly obtained through expert quantile elicitation. Posterior inference is available through Markov chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. Those scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, a financial market index that presents many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
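The fixed-threshold practice that the abstract argues against can be illustrated with a minimal peaks-over-threshold sketch (not the authors' mixture model): choose a threshold empirically, keep only the exceedances, and estimate the tail index from them alone. The simulated data, threshold choice, and Hill-type estimator are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated heavy-tailed "returns" (Student-t, a stand-in for an index like the Nasdaq 100)
data = rng.standard_t(df=3, size=5000)

# Current practice: fix the threshold empirically, e.g. at the 95% quantile
u = np.quantile(data, 0.95)
exceedances = data[data > u]          # everything below u is discarded

# Hill-type estimate of the GPD shape (tail index), valid for heavy tails (xi > 0)
xi_hat = np.mean(np.log(exceedances / u))
print(f"threshold u = {u:.3f}, tail index estimate = {xi_hat:.3f}")
```

The mixture approach proposed in the paper instead treats u as an unknown parameter and lets the observations below it contribute to inference.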

Relevance:

100.00%

Publisher:

Abstract:

Anaerobic threshold (AT) is usually estimated as a change-point problem by visual analysis of the cardiorespiratory response to incremental dynamic exercise. In this study, two-phase linear (TPL) models of the linear-linear and linear-quadratic type were used for the estimation of AT. The correlation coefficient between the classical and statistical approaches was 0.88, and 0.89 after outlier exclusion. The TPL models provide a simple method for estimating AT that can easily be implemented on a digital computer for automatic pattern recognition of AT.
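A linear-linear TPL fit of the kind described can be sketched as a grid search over the breakpoint, fitting a continuous hinge model by least squares at each candidate. This is an illustrative reconstruction on hypothetical incremental-exercise data, not the study's code.

```python
import numpy as np

def fit_tpl(x, y):
    """Continuous linear-linear two-phase fit: grid search over the breakpoint."""
    best_sse, best_bp = np.inf, None
    for bp in x[2:-2]:  # candidate breakpoints, kept away from the edges
        # Design matrix: intercept, slope, and change of slope after bp (hinge term)
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - bp, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ beta) ** 2)
        if sse < best_sse:
            best_sse, best_bp = sse, bp
    return best_bp

# Hypothetical cardiorespiratory response with a slope change at workload 6.0
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 60)
y = 2.0 + 0.5 * x + 1.5 * np.maximum(x - 6.0, 0.0) + rng.normal(0.0, 0.1, x.size)
print("estimated breakpoint:", fit_tpl(x, y))
```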

Relevance:

100.00%

Publisher:

Abstract:

Fixed-step-size (FSS) and Bayesian staircases are widely used methods to estimate sensory thresholds in 2AFC tasks, although a direct comparison of both types of procedure under identical conditions has not previously been reported. A simulation study and an empirical test were conducted to compare the performance of optimized Bayesian staircases with that of four optimized variants of the FSS staircase differing in their up-down rules. The ultimate goal was to determine whether FSS or Bayesian staircases are the better choice in experimental psychophysics. The comparison considered the properties of the estimates (i.e. bias and standard errors) in relation to their cost (i.e. the number of trials to completion). The simulation study showed that mean estimates of Bayesian and FSS staircases are dependable when sufficient trials are given and that, in both cases, the standard deviation (SD) of the estimates decreases with the number of trials, although the SD of Bayesian estimates is always lower than that of FSS estimates (and thus Bayesian staircases are more efficient). The empirical test did not support these conclusions, as (1) neither procedure rendered estimates converging on some value, (2) standard deviations did not follow the expected pattern of decrease with number of trials, and (3) both procedures appeared to be equally efficient. Potential factors explaining the discrepancies between simulation and empirical results are discussed and, all things considered, a sensible recommendation is for psychophysicists to run no fewer than 18 and no more than 30 reversals of an FSS staircase implementing the 1-up/3-down rule.
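A minimal simulation of the recommended procedure, a 1-up/3-down FSS staircase run on a simulated 2AFC observer, might look as follows. The psychometric function, step size, and reversal counts are illustrative assumptions, not the article's exact settings.

```python
import numpy as np

def p_correct(x, thresh=0.0, slope=0.5):
    # 2AFC logistic psychometric function with a 50% guessing floor
    return 0.5 + 0.5 / (1.0 + np.exp(-(x - thresh) / slope))

def run_1up3down(rng, thresh=0.0, step=0.2, n_reversals=20, x0=2.0):
    x, streak, direction = x0, 0, 0
    reversals = []
    while len(reversals) < n_reversals:
        if rng.random() < p_correct(x, thresh):
            streak += 1
            if streak == 3:                  # 3-down: three correct in a row -> harder
                streak = 0
                if direction == +1:          # direction changed: record a reversal
                    reversals.append(x)
                direction, x = -1, x - step
        else:
            streak = 0                       # 1-up: any error -> easier
            if direction == -1:
                reversals.append(x)
            direction, x = +1, x + step
    return np.mean(reversals[4:])            # discard early reversals, average the rest

rng = np.random.default_rng(2)
estimates = [run_1up3down(rng) for _ in range(200)]
# The 1-up/3-down rule converges near the 79.4% correct point of the psychometric function
print("mean estimate across runs:", np.mean(estimates))
```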

Relevance:

100.00%

Publisher:

Abstract:

Threshold estimation with sequential procedures is justifiable on the surmise that the index used in the so-called dynamic stopping rule has diagnostic value for identifying when an accurate estimate has been obtained. The performance of five types of Bayesian sequential procedure was compared here to that of an analogous fixed-length procedure. The indices used in the sequential procedures were: (1) the width of the Bayesian probability interval, (2) the posterior standard deviation, (3) the absolute change, (4) the average change, and (5) the number of sign fluctuations. A simulation study was carried out to evaluate which index renders estimates with less bias and smaller standard error at lower cost (i.e. lower average number of trials to completion), in both yes–no and two-alternative forced-choice (2AFC) tasks. We also considered the effect of the form and parameters of the psychometric function and its similarity with the model function assumed in the procedure. Our results show that sequential procedures do not outperform fixed-length procedures in yes–no tasks. However, in 2AFC tasks, sequential procedures not based on sign fluctuations all yield minimally better estimates than fixed-length procedures, although most of the improvement occurs with short runs that render undependable estimates, and the differences vanish when the procedures run for a number of trials (around 70) that ensures dependability. Thus, none of the indices considered here (some of which are widespread) has the diagnostic value that would justify its use. In addition, difficulties of implementation make sequential procedures unfit as alternatives to fixed-length procedures.
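One of the indices studied, the posterior standard deviation used as a dynamic stopping rule, can be sketched with a grid-based Bayesian procedure for a yes–no task. The grid, stimulus-placement rule, stopping criterion, and simulated observer below are all illustrative assumptions, not the article's specification.

```python
import numpy as np

def p_yes(x, t, slope=0.5):
    # Yes-no logistic psychometric function with threshold t
    return 1.0 / (1.0 + np.exp(-(x - t) / slope))

grid = np.linspace(-3.0, 3.0, 301)           # candidate threshold values
post = np.full(grid.size, 1.0 / grid.size)   # uniform prior over the grid
rng = np.random.default_rng(3)
true_t, trials = 0.7, 0

while True:
    x = grid[np.argmax(post)]                # place next stimulus at the posterior mode
    resp = rng.random() < p_yes(x, true_t)   # simulated observer's response
    post *= p_yes(x, grid) if resp else 1.0 - p_yes(x, grid)
    post /= post.sum()                       # Bayes update, renormalized
    trials += 1
    mean = np.sum(grid * post)
    sd = np.sqrt(np.sum((grid - mean) ** 2 * post))
    if sd < 0.15 or trials >= 200:           # dynamic stopping: posterior-SD index
        break

print(f"stopped after {trials} trials; estimate = {mean:.2f} (posterior SD {sd:.2f})")
```

The article's point is precisely that such indices tend not to flag accurate estimates reliably, so the stopping trial count need not track estimation error.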

Relevance:

60.00%

Publisher:

Abstract:

We examine the empirical validity of Schelling’s models of racial residential segregation applied to the case of Chicago. Most of the empirical literature has focused exclusively on the single-neighborhood model, also known as the tipping point model, and neglected a multi-neighborhood or unified approach. The multi-neighborhood approach introduces spatial interaction across neighborhoods; in particular, we consider interaction across neighborhoods sharing a border. An initial exploration of the data indicates that spatial contiguity might be relevant to properly analyse the so-called tipping phenomenon, in which predominantly non-Hispanic white neighborhoods become predominantly minority neighborhoods within a decade. We introduce an econometric model that combines an approach to estimating the tipping point using threshold effects with a spatial autoregressive model. The estimation results from the model dispute the existence of a tipping point, that is, a discontinuous change in the rate of growth of the non-Hispanic white population due to a small increase in the minority share of the neighborhood. In addition, we find that the racial distance between a neighborhood of interest and its surrounding neighborhoods has an important effect on the dynamics of racial segregation in Chicago.
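The threshold-effect part of such an estimation (without the spatial component) can be sketched as a grid search in the style of threshold regression: split observations at a candidate minority share, fit separate regressions on each side, and keep the split that minimizes the residual sum of squares. The data below are simulated with a hypothetical tipping point, not Chicago census tracts.

```python
import numpy as np

def threshold_regression(x, y):
    """Grid search over candidate thresholds; separate OLS fit on each side."""
    cands = np.quantile(x, np.linspace(0.15, 0.85, 71))
    best_sse, best_tau = np.inf, None
    for tau in cands:
        sse = 0.0
        for m in (x < tau, x >= tau):
            X = np.column_stack([np.ones(m.sum()), x[m]])
            beta, *_ = np.linalg.lstsq(X, y[m], rcond=None)
            sse += np.sum((y[m] - X @ beta) ** 2)
        if sse < best_sse:
            best_sse, best_tau = sse, tau
    return best_tau

# Simulated neighborhoods: white population growth drops discontinuously
# once the minority share passes a hypothetical tipping point of 0.4
rng = np.random.default_rng(4)
share = rng.uniform(0.0, 1.0, 400)
growth = np.where(share < 0.4, 1.0 + 0.2 * share, -0.5 - 0.3 * share)
growth = growth + rng.normal(0.0, 0.1, 400)
print("estimated tipping point:", round(threshold_regression(share, growth), 3))
```

The paper's finding is that, once spatial interaction is accounted for, the data do not support such a discontinuity.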

Relevance:

60.00%

Publisher:

Abstract:

This study examined the use of non-standard parameters to investigate the visual field, with particular reference to the detection of glaucomatous visual field loss. Evaluation of the new perimetric strategy for threshold estimation, FASTPAC, demonstrated a reduction in examination time for normal subjects compared to the standard strategy. Despite an increased within-test variability, the FASTPAC strategy produced a mean sensitivity similar to the standard strategy while reducing the effects of patient fatigue. The new technique of Blue-Yellow perimetry was compared to White-White perimetry for the detection of glaucomatous field loss in ocular hypertension (OHT) and primary open-angle glaucoma (POAG). Using a database of normal subjects, confidence limits for normality were constructed to account for the increased between-subject variability with increasing age and eccentricity, and for the greater variability of the Blue-Yellow field compared to the White-White field. Individual differences in ocular media absorption had little effect on Blue-Yellow field variability. Total and pattern probability analysis revealed five of 27 OHT patients to exhibit Blue-Yellow focal abnormalities; two of these patients subsequently developed White-White loss. Twelve of the 24 POAG patients revealed wider and/or deeper Blue-Yellow loss compared with the White-White field. Blue-Yellow perimetry showed good sensitivity and specificity characteristics; however, lack of perimetric experience and the presence of cataract influenced the Blue-Yellow visual field and may confound the interpretation of Blue-Yellow visual field loss. Visual field indices demonstrated a moderate relationship to the structural parameters of the optic nerve head measured by scanning laser tomography. No abnormalities in Blue-Yellow or Red-Green colour contrast sensitivity were apparent for the OHT patients. A greater vulnerability of the short-wavelength-sensitive (SWS) pathway in glaucoma was demonstrated using Blue-Yellow perimetry; however, predicting which patients may benefit from Blue-Yellow perimetric examination is difficult. Furthermore, cataract and the extent of the field loss may limit the extent to which the integrity of the SWS channels can be selectively examined.

Relevance:

30.00%

Publisher:

Abstract:

The magnitude of the basic reproduction ratio R(0) of an epidemic can be estimated in several ways, namely, from the final size of the epidemic, from the average age at first infection, or from the initial growth phase of the outbreak. In this paper, we discuss this last method for estimating R(0) for vector-borne infections. Implicit in these models is the assumption that there is an exponential phase of the outbreak, which implies that in all cases R(0) > 1. We demonstrate that an outbreak is possible even in cases where R(0) is less than one, provided that the vector-to-human component of R(0) is greater than one and that a certain number of infected vectors are introduced into the affected population. This theory is applied to two real epidemiological dengue situations in the southeastern part of Brazil, one where R(0) is less than one and another where R(0) is greater than one. In both cases, the model mirrors the real situations with reasonable accuracy.
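The initial-growth-phase method the paper examines can be sketched in its simplest form: fit an exponential growth rate r to early case counts and map it to R(0). The case counts and generation time below are hypothetical, and the relation R0 ≈ 1 + rT used here is the basic SIR-type one; the paper's vector-borne derivation is more involved.

```python
import numpy as np

# Hypothetical weekly case counts during the assumed exponential phase of an outbreak
cases = np.array([3, 5, 8, 13, 21, 34, 55], dtype=float)
weeks = np.arange(cases.size)

# Growth rate r: slope of a log-linear fit to the early counts
r = np.polyfit(weeks, np.log(cases), 1)[0]

T = 2.0               # assumed mean generation time, in weeks (illustrative)
R0 = 1.0 + r * T      # simplest SIR-type relation between growth rate and R0
print(f"r = {r:.3f}/week, R0 estimate = {R0:.2f}")
```

The paper's caveat applies directly: observing such growth does not by itself guarantee R(0) > 1 for a vector-borne infection.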

Relevance:

30.00%

Publisher:

Abstract:

In this thesis we implement estimation procedures for the threshold parameters of continuous-time threshold models driven by stochastic differential equations. The first procedure is based on the EM (expectation-maximization) algorithm applied to the threshold model built from the Brownian motion with drift process. The second procedure mimics one of the fundamental ideas in the estimation of thresholds in the time series context, that is, conditional least squares estimation. We implement this procedure not only for the threshold model built from the Brownian motion with drift process but also for more generic models such as the ones built from the geometric Brownian motion or the Ornstein-Uhlenbeck process. Both procedures are implemented for simulated data, and the least squares estimation procedure is also implemented for real data of daily prices from a set of international funds. The first fund is the PF-European Sustainable Equities-R fund from the Pictet Funds company and the second is the Parvest Europe Dynamic Growth fund from the BNP Paribas company. The data for both funds are daily prices from the year 2004. The last fund to be considered is the Converging Europe Bond fund from the Schroder company, and the data are daily prices from the year 2005.
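The conditional least squares idea for the Brownian-motion-with-drift threshold model can be sketched on Euler-discretized data: the drift, and hence the mean increment, switches at the threshold, so a grid search over candidate thresholds picks the split of lagged states that minimizes within-regime squared deviations of the increments. All parameters are illustrative; this is a sketch of the idea, not the thesis code.

```python
import numpy as np

# Simulate an Euler-discretized threshold "Brownian motion with drift":
# drift mu1 below the threshold r_true, drift mu2 above it (illustrative values)
rng = np.random.default_rng(5)
dt, n = 0.01, 20000
r_true, mu1, mu2, sigma = 0.5, 2.0, -2.0, 0.5
x = np.empty(n)
x[0] = 0.5
for i in range(1, n):
    mu = mu1 if x[i - 1] < r_true else mu2
    x[i] = x[i - 1] + mu * dt + sigma * np.sqrt(dt) * rng.normal()

def cls_threshold(x):
    """Conditional least squares: choose the split of lagged states that
    minimizes within-regime squared deviations of the increments."""
    dx, lagged = np.diff(x), x[:-1]
    best_sse, best_r = np.inf, None
    for r in np.quantile(lagged, np.linspace(0.05, 0.95, 181)):
        lo = lagged < r
        sse = sum(np.sum((dx[m] - dx[m].mean()) ** 2) for m in (lo, ~lo))
        if sse < best_sse:
            best_sse, best_r = sse, r
    return best_r

print("estimated threshold:", round(cls_threshold(x), 3))
```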

Relevance:

30.00%

Publisher:

Abstract:

Research project submitted in partial fulfilment of the requirements for the Master's Degree in Statistics and Information Management

Relevance:

30.00%

Publisher:

Abstract:

We use a threshold seemingly unrelated regressions specification to assess whether the Central and East European countries (CEECs) are synchronized in their business cycles with the Euro-area. This specification is useful in two ways: first, it takes into account the common institutional factors and the similarities across CEECs in their process of economic transition; second, it captures business cycle asymmetries by allowing for the presence of two distinct regimes for the CEECs. As the CEECs are strongly affected by the Euro-area, these regimes may be associated with Euro-area expansions and contractions. We discuss representation, estimation by maximum likelihood, and inference. The methodology is illustrated using monthly industrial production in 8 CEECs. The results show that, apart from Lithuania, the CEECs experience “normal” growth when the Euro-area contracts and “high” growth when the Euro-area expands. Given that the CEECs are “catching up” with the Euro-area, this result shows that most CEECs seem synchronized to the Euro-area cycle.

Keywords: Threshold SURE; asymmetry; business cycles; CEECs. JEL classification: C33; C50; E32.

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a contemporaneous-threshold multivariate smooth transition autoregressive (C-MSTAR) model in which the regime weights depend on the ex ante probabilities that latent regime-specific variables exceed certain threshold values. A key feature of the model is that the transition function depends on all the parameters of the model as well as on the data. Since the mixing weights are also a function of the regime-specific innovation covariance matrix, the model can account for contemporaneous regime-specific co-movements of the variables. The stability and distributional properties of the proposed model are discussed, as well as issues of estimation, testing and forecasting. The practical usefulness of the C-MSTAR model is illustrated by examining the relationship between US stock prices and interest rates.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix, because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano type test. We conclude that threshold correlation modelling gives rise to a significant reduction in the portfolio's variance.
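The grid-search estimation described can be sketched for a bivariate case with a single threshold: within each candidate regime the correlation estimate is simply the sample correlation, and candidate splits are scored by a concentrated Gaussian log-likelihood. The simulated data, transition variable, and scoring are illustrative assumptions, not the paper's full TCC specification.

```python
import numpy as np

# Simulate two standardized return series whose correlation switches with an
# observable transition variable z (the threshold c_true is illustrative)
rng = np.random.default_rng(6)
n, c_true = 2000, 0.0
z = rng.normal(size=n)
rho = np.where(z < c_true, 0.2, 0.8)
e1 = rng.normal(size=n)
e2 = rho * e1 + np.sqrt(1.0 - rho**2) * rng.normal(size=n)

def tcc_grid_search(e1, e2, z):
    """Grid search over the threshold: sample correlation per candidate regime,
    scored by the concentrated Gaussian log-likelihood (up to constants,
    assuming both series are standardized)."""
    best_obj, best_c = np.inf, None
    for c in np.quantile(z, np.linspace(0.15, 0.85, 71)):
        obj = 0.0
        for m in (z < c, z >= c):
            r = np.corrcoef(e1[m], e2[m])[0, 1]
            obj += m.sum() * np.log(1.0 - r**2)   # more negative = better fit
        if obj < best_obj:
            best_obj, best_c = obj, c
    return best_c

c_hat = tcc_grid_search(e1, e2, z)
print("estimated correlation threshold:", round(c_hat, 3))
```

Because each regime's correlation estimate is a sample correlation matrix, positive definiteness comes for free, which is the feature the abstract highlights.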