757 results for estimator


Relevance:

10.00%

Publisher:

Abstract:

In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006) but has the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid-search procedure. In addition, it is easy to guarantee a positive definite correlation matrix because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano-type test. We conclude that threshold correlation modelling gives rise to a significant reduction in the portfolio's variance.
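
As an illustration of the grid-search idea (not the authors' code), the sketch below fits a two-regime threshold correlation model: for each candidate threshold on an observable transition variable, the standardized returns are split into two regimes, the sample correlation matrix is computed within each regime, and the threshold that maximizes a Gaussian quasi-log-likelihood is retained. The function name, the quantile grid and the quasi-likelihood criterion are illustrative assumptions, not the exact TCC specification of the paper.

import numpy as np

def fit_tcc(returns, transition, n_grid=50):
    """Two-regime threshold conditional correlation via grid search.
    returns: (T, N) array of (standardized) asset returns.
    transition: (T,) observable transition variable.
    Returns the estimated threshold and the two regime correlation matrices."""
    # Candidate thresholds: interior quantiles of the transition variable,
    # so that each regime keeps a reasonable number of observations.
    grid = np.quantile(transition, np.linspace(0.15, 0.85, n_grid))
    best = (-np.inf, None, None, None)
    for c in grid:
        low, high = transition <= c, transition > c
        ll, corrs = 0.0, []
        for mask in (low, high):
            R = np.corrcoef(returns[mask].T)          # regime sample correlation
            _, logdet = np.linalg.slogdet(R)
            z = returns[mask]
            # Gaussian quasi-log-likelihood of the standardized returns under R
            ll += -0.5 * (mask.sum() * logdet
                          + np.einsum('ti,ij,tj->', z, np.linalg.inv(R), z))
            corrs.append(R)
        if ll > best[0]:
            best = (ll, c, corrs[0], corrs[1])
    return best[1], best[2], best[3]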

Relevance:

10.00%

Publisher:

Abstract:

In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory and combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
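
To make the second method concrete, here is a minimal sketch (not the authors' code), assuming one standard sequential-binary-partition ilr basis and using scipy's gaussian_kde, whose bandwidth matrix is a full matrix proportional to the sample covariance, as a simple stand-in for the full-bandwidth selectors studied in the paper.

import numpy as np
from scipy.stats import gaussian_kde

def ilr(x):
    """Isometric log-ratio transform of compositions x (n, D) -> (n, D-1),
    using one standard sequential-binary-partition basis."""
    x = np.asarray(x, dtype=float)
    n, D = x.shape
    logx = np.log(x)
    z = np.empty((n, D - 1))
    for j in range(1, D):
        gmean_log = logx[:, :j].mean(axis=1)      # log geometric mean of the first j parts
        z[:, j - 1] = np.sqrt(j / (j + 1.0)) * (gmean_log - logx[:, j])
    return z

# Kernel density estimate on ilr coordinates; gaussian_kde uses a full bandwidth
# matrix proportional to the sample covariance (Scott's rule).
comp = np.random.dirichlet([2.0, 3.0, 5.0], size=200)   # toy 3-part compositions
kde = gaussian_kde(ilr(comp).T)                          # expects (dims, n)
density_at_sample = kde(ilr(comp).T)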

Relevance:

10.00%

Publisher:

Abstract:

A variety of behavioural traits have substantial effects on the gene dynamics and genetic structure of local populations. The mating system is a plastic trait that varies with environmental conditions in the domestic cat (Felis catus), allowing an intraspecific comparison of the impact of this feature on genetic characteristics of the population. To assess the potential effect of the heterogeneity of males' contribution to the next generation on variance effective size, we applied the ecological approach of Nunney & Elam (1994), based upon a demographic and behavioural study, and the genetic 'temporal methods' of Waples (1989) and Berthier et al. (2002) using microsatellite markers. The two cat populations studied were nearly closed, similar in size and survival parameters, but differed in their mating system. Immigration appeared extremely restricted in both cases due to environmental and social constraints. As expected, the ratio of effective size to census number (Ne/N) was higher in the promiscuous cat population (harmonic mean = 42%) than in the polygynous one (33%) when Ne was calculated from the ecological method. Only the genetic results based on Waples' estimator were consistent with the ecological results, but they failed to detect an effect of the mating system. Results based on the estimator of Berthier et al. (2002) were extremely variable, with Ne sometimes exceeding census size. Such low reliability in the genetic results warrants attention for conservation purposes.
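
For readers unfamiliar with the temporal method, the rough sketch below shows one textbook variant of it (the standardized allele-frequency variance Fc with a sample-size correction); it is not the exact plan I/plan II machinery of Waples (1989), nor the estimator of Berthier et al. (2002), and all input numbers are hypothetical.

import numpy as np

def temporal_ne(p0, pt, S0, St, t):
    """Crude variance-effective-size estimate from two temporal samples.
    p0, pt: allele-frequency arrays (one entry per allele, loci pooled)
            at generation 0 and generation t.
    S0, St: numbers of individuals sampled at the two time points.
    t:      number of generations between samples.
    Uses Fc = mean((p0 - pt)^2 / ((p0 + pt)/2 - p0*pt)) and
    Ne ~ t / (2 * (Fc - 1/(2*S0) - 1/(2*St))), one common textbook form;
    the result can be negative, conventionally read as 'very large Ne'."""
    p0, pt = np.asarray(p0, float), np.asarray(pt, float)
    fc = np.mean((p0 - pt) ** 2 / ((p0 + pt) / 2.0 - p0 * pt))
    return t / (2.0 * (fc - 1.0 / (2 * S0) - 1.0 / (2 * St)))

p0 = np.array([0.30, 0.55, 0.15, 0.70])    # allele frequencies at generation 0
pt = np.array([0.18, 0.66, 0.10, 0.55])    # the same alleles t generations later
ne_hat = temporal_ne(p0, pt, S0=50, St=50, t=5)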

Relevance:

10.00%

Publisher:

Abstract:

Reports of triatomine infestation in urban areas have increased. We analysed the spatial distribution of infestation by triatomines in the urban area of Diamantina, in the state of Minas Gerais, Brazil. Triatomines were obtained by community-based entomological surveillance. Spatial patterns of infestation were analysed by Ripley's K function and the kernel density estimator. Normalised difference vegetation index (NDVI) and land cover derived from satellite imagery were compared between infested and uninfested areas. A total of 140 adults of four species were captured (100 Triatoma vitticeps, 25 Panstrongylus geniculatus, 8 Panstrongylus megistus, and 7 Triatoma arthurneivai specimens). In total, 87.9% were captured within domiciles. Infection by trypanosomes was observed in 19.6% of 107 examined insects. The spatial distributions of T. vitticeps, P. geniculatus, T. arthurneivai, and trypanosome-positive triatomines were clustered, occurring mainly in peripheral areas. NDVI values were statistically higher in areas infested by T. vitticeps and P. geniculatus. Buildings infested by these species were located closer to open fields, whereas infestations of P. megistus and T. arthurneivai were closer to bare soil. Human occupation and modification of natural areas may be involved in triatomine invasion, exposing the population to these vectors.
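
By way of illustration only (coordinates and study-area size are hypothetical), the two spatial tools mentioned above can be sketched as a naive Ripley's K without edge correction and a 2-D Gaussian kernel density surface over the capture locations.

import numpy as np
from scipy.stats import gaussian_kde
from scipy.spatial.distance import pdist

def ripley_k_naive(points, radii, area):
    """Ripley's K without edge correction:
    K(r) = area / (n*(n-1)) * number of ordered pairs with distance <= r."""
    n = len(points)
    d = pdist(points)                       # condensed pairwise distances
    return np.array([2.0 * area * np.sum(d <= r) / (n * (n - 1)) for r in radii])

# Toy capture coordinates (x, y) in metres; real data would be the georeferenced captures.
pts = np.random.uniform(0, 1000, size=(140, 2))
k_values = ripley_k_naive(pts, radii=np.linspace(50, 500, 10), area=1000.0 * 1000.0)

# Kernel density surface of capture intensity on a regular grid.
kde = gaussian_kde(pts.T)
xx, yy = np.meshgrid(np.linspace(0, 1000, 100), np.linspace(0, 1000, 100))
density = kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)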

Relevance:

10.00%

Publisher:

Abstract:

A graphical processing unit (GPU) is a hardware device normally used to manipulate computer memory for the display of images. GPU computing is the practice of using a GPU device for scientific or general purpose computations that are not necessarily related to the display of images. Many problems in econometrics have a structure that allows for successful use of GPU computing. We explore two examples. The first is simple: repeated evaluation of a likelihood function at different parameter values. The second is a more complicated estimator that involves simulation and nonparametric fitting. We find speedups from 1.5 up to 55.4 times, compared to computations done on a single CPU core. These speedups can be obtained with very little expense, energy consumption, and time dedicated to system maintenance, compared to equivalent performance solutions using CPUs. Code for the examples is provided.
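
The first example, repeated evaluation of a likelihood at many parameter values, can be sketched with ordinary array code; the point is that the same elementwise operations map directly onto GPU array libraries (for instance CuPy, which mirrors the NumPy interface), although the snippet below is CPU-only and is not the code distributed with the paper.

import numpy as np

def gaussian_loglik_grid(x, mus, sigmas):
    """Log-likelihood of an i.i.d. normal sample evaluated at paired (mu, sigma) grid points.
    x: (n,) data; mus, sigmas: (k,) candidate parameter values."""
    x = x[:, None]                                   # (n, 1) broadcasts against (k,)
    return np.sum(-0.5 * np.log(2 * np.pi * sigmas**2)
                  - (x - mus) ** 2 / (2 * sigmas**2), axis=0)

x = np.random.normal(1.0, 2.0, size=10_000)
mus = np.linspace(0.0, 2.0, 500)
sigmas = np.full(500, 2.0)
ll = gaussian_loglik_grid(x, mus, sigmas)            # one log-likelihood per grid point
best_mu = mus[np.argmax(ll)]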

Relevance:

10.00%

Publisher:

Abstract:

Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as the influence function of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performance of the WMLs based on each of them is studied. The result is that in a great variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean square error and in terms of bias under contamination. Moreover, the performance of the WML is rather stable under a change of the MDE, even if the MDEs have very different behaviours. Two examples of application of the WML to real data are considered. In both of them, the necessity of a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. This procedure is particularly natural in the discrete distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
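
A toy version of the two-phase scheme for a Poisson model may help fix ideas; the crude robust initial estimate (a median) and the hard probability cut-off used below are simple stand-ins for the minimum disparity estimators and the adaptive weighting described above.

import numpy as np
from scipy.stats import poisson

def weighted_ml_poisson(x, cutoff=1e-3):
    """Two-phase estimate of a Poisson mean.
    Phase 1: robust (but inefficient) initial estimate -> flag outliers.
    Phase 2: weighted maximum likelihood with the flagged points downweighted."""
    x = np.asarray(x, float)
    lam0 = np.median(x)                      # crude robust initial estimate
    p = poisson.pmf(np.round(x).astype(int), lam0)
    w = np.where(p < cutoff, 0.0, 1.0)       # hard 0/1 weights for illustration
    # Weighted ML for the Poisson mean has the closed form sum(w*x)/sum(w).
    return np.sum(w * x) / np.sum(w)

sample = np.concatenate([np.random.poisson(3.0, 95), np.full(5, 40.0)])  # 5 gross outliers
print(np.mean(sample), weighted_ml_poisson(sample))   # plain MLE vs. weighted MLE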

Relevance:

10.00%

Publisher:

Abstract:

This paper investigates a simple procedure for robustly estimating the mean of an asymmetric distribution. The procedure removes the observations that are larger or smaller than certain limits and takes the arithmetic mean of the remaining observations, the limits being determined with the help of a parametric model, e.g., the Gamma, Weibull or Lognormal distribution. The breakdown point, the influence function, the (asymptotic) variance, and the contamination bias of this estimator are explored and compared numerically with those of competing estimators.
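
A minimal sketch of the procedure with a Gamma working model follows; the quantile-based choice of the limits and the trimming level are assumptions made here for illustration, not necessarily the rule analysed in the paper.

import numpy as np
from scipy import stats

def parametric_trimmed_mean(x, alpha=0.01):
    """Estimate the mean of a skewed positive sample by removing observations
    outside limits derived from a fitted Gamma model, then averaging the rest."""
    x = np.asarray(x, float)
    shape, loc, scale = stats.gamma.fit(x, floc=0.0)        # ML fit with location fixed at 0
    lower = stats.gamma.ppf(alpha / 2, shape, loc=loc, scale=scale)
    upper = stats.gamma.ppf(1 - alpha / 2, shape, loc=loc, scale=scale)
    kept = x[(x >= lower) & (x <= upper)]
    return kept.mean()

x = np.random.gamma(2.0, 3.0, size=500)
x[:5] = 500.0                                               # a few gross outliers
print(x.mean(), parametric_trimmed_mean(x))                 # raw mean vs. trimmed mean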

Relevance:

10.00%

Publisher:

Abstract:

Introduction: The interhemispheric asymmetries that originate from connectivity-related structuring of the cerebral cortex are compromised in schizophrenia (SZ). Recently, we revealed the whole-head topography of EEG synchronization in SZ (Jalili et al. 2007; Knyazeva et al. 2008). Here we extended the analysis to assess abnormality in the asymmetry of synchronization, further motivated by the evidence that the interhemispheric asymmetries suspected to be abnormal in SZ originate from the connectivity-related structuring of the cortex. Methods: Thirteen right-handed SZ patients and thirteen matched controls participated in this study, and multichannel (128) EEGs were recorded for 3-5 minutes at rest. Laplacian EEGs (LEEG) were then calculated using a 2-D spline. The LEEGs were analyzed by calculating the power spectral density using Welch's average periodogram method. Furthermore, using a state-space-based multivariate synchronization measure, the S-estimator, we analyzed the correlates of functional cortico-cortical connectivity in SZ patients compared to the controls. The values of the S-estimator were obtained at three different spatial scales: first-order neighbours of each sensor location, second-order neighbours, and the whole hemisphere. The synchronization measures based on the LEEG of the alpha and beta bands were applied and tuned to various spatial scales, including local, intraregional, and long-distance levels. To assess the between-group differences, we used a permutation version of Hotelling's T2 test. For correlation analysis, the Spearman rank correlation was calculated. Results: Compared to the controls, who had rightward asymmetry at a local level (LEEG power), rightward anterior and leftward posterior asymmetries at an intraregional level (first- and second-order S-estimator), and rightward global asymmetry (hemispheric S-estimator), SZ patients showed generally attenuated asymmetry, the effect being strongest for intraregional synchronization. This deviation in asymmetry across the anterior-to-posterior axis is consistent with the so-called Yakovlevian or anticlockwise cerebral torque. Moreover, the negative occipital and positive frontal asymmetry values suggest higher regional synchronization among the left occipital and the right frontal locations relative to their symmetrical counterparts. Correlation analysis linked the posterior intraregional and hemispheric abnormalities to negative SZ symptoms, whereas the asymmetry of LEEG power appeared to be only weakly coupled to clinical ratings. The posterior intraregional abnormalities of asymmetry were shown to increase with the duration of the disease. The tentative links between these findings and gross anatomical asymmetries, including the cerebral torque and the gyrification pattern in normal subjects and SZ patients, are discussed. Conclusions: Overall, our findings reveal abnormalities in the synchronization asymmetry of SZ patients and heavy involvement of the right hemisphere in these abnormalities. These results indicate that anomalous asymmetry of cortico-cortical connections in schizophrenia is amenable to electrophysiological analysis.
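
The S-estimator used above is, in its usual formulation, an entropy-like summary of the eigenvalue spectrum of the channel correlation matrix; the short sketch below shows that computation on synthetic signals and is not the study's analysis pipeline.

import numpy as np

def s_estimator(signals):
    """Multivariate synchronization (S-estimator) of a set of signals.
    signals: (channels, samples) array.
    S = 1 + sum(l_i * log(l_i)) / log(M), where l_i are the eigenvalues of the
    channel correlation matrix divided by the number of channels M.
    S is near 0 for uncorrelated channels and approaches 1 for fully synchronized ones."""
    M = signals.shape[0]
    eig = np.linalg.eigvalsh(np.corrcoef(signals))
    lam = np.clip(eig, 1e-12, None) / M          # normalized eigenvalues, guard log(0)
    return 1.0 + np.sum(lam * np.log(lam)) / np.log(M)

# Example: S over a subset of sensors (e.g., the first-order neighbours of one electrode).
x = np.random.randn(8, 2000)
x[1:] += 0.8 * x[0]                              # inject common activity -> higher S
print(s_estimator(x))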

Relevance:

10.00%

Publisher:

Abstract:

Wireless "MIMO" systems, employing multiple transmit and receive antennas, promise a significant increase in channel capacity, while orthogonal frequency-division multiplexing (OFDM) is attracting a good deal of attention due to its robustness to multipath fading. Thus, the combination of both techniques is an attractive proposition for radio transmission. The goal of this paper is the description and analysis of a novel pilot-aided estimator of multipath block-fading channels. Typical models leading to estimation algorithms assume the number of multipath components and delays to be constant (and often known), while their amplitudes are allowed to vary with time. Our estimator is focused instead on the more realistic assumption that the number of channel taps is also unknown and varies with time following a known probabilistic model. The estimation problem arising from these assumptions is solved using Random-Set Theory (RST), whereby one regards the multipath-channel response as a single set-valued random entity. Within this framework, Bayesian recursive equations determine the evolution with time of the channel estimator. Due to the lack of a closed form for the solution of the Bayesian equations, a (Rao-Blackwellized) particle filter (RBPF) implementation of the channel estimator is advocated. Since the resulting estimator exhibits a complexity which grows exponentially with the number of multipath components, a simplified version is also introduced. Simulation results describing the performance of our channel estimator demonstrate its effectiveness.
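
To make the Rao-Blackwellized structure concrete, here is a deliberately simplified toy sketch (scalar pilot observations, a handful of candidate tap positions, independent birth/death of taps, Gauss-Markov amplitudes; every model parameter is invented for illustration and the random-set machinery of the paper is reduced to a plain activity vector). Each particle carries a hypothesis about which taps are active and, conditionally on that hypothesis, the amplitudes are linear-Gaussian and are tracked exactly by a Kalman filter.

import numpy as np

rng = np.random.default_rng(0)
L, T, P = 4, 200, 200            # candidate taps, time steps, particles
rho, q, r, p_switch = 0.98, 0.02, 0.1, 0.01

# --- simulate a "true" channel and pilot observations ---
act = np.array([1, 1, 0, 0])                     # true initial tap activity
h = rng.normal(0, 1, L)                          # true tap amplitudes
pilots = rng.choice([-1.0, 1.0], size=(T, L))    # known pilot symbols per tap
y = np.empty(T)
for t in range(T):
    flip = rng.random(L) < p_switch
    act = np.where(flip, 1 - act, act)           # taps appear / disappear
    h = rho * h + rng.normal(0, np.sqrt(q), L)
    y[t] = pilots[t] @ (act * h) + rng.normal(0, np.sqrt(r))

# --- Rao-Blackwellized particle filter ---
A = rng.integers(0, 2, size=(P, L))              # particle tap-activity hypotheses
m = np.zeros((P, L))                             # Kalman means per particle
Pc = np.tile(np.eye(L), (P, 1, 1))               # Kalman covariances per particle
w = np.full(P, 1.0 / P)
est_act = np.empty((T, L))

for t in range(T):
    # propagate the discrete part: each hypothesized tap flips with prob. p_switch
    flip = rng.random((P, L)) < p_switch
    A = np.where(flip, 1 - A, A)
    for i in range(P):
        # Kalman prediction and update for the amplitude vector, given the activity hypothesis
        m_pred = rho * m[i]
        P_pred = rho**2 * Pc[i] + q * np.eye(L)
        H = A[i] * pilots[t]                     # effective 1 x L observation row
        S = H @ P_pred @ H + r                   # predictive variance (scalar)
        K = P_pred @ H / S
        innov = y[t] - H @ m_pred
        m[i] = m_pred + K * innov
        Pc[i] = P_pred - np.outer(K, H @ P_pred)
        w[i] *= np.exp(-0.5 * innov**2 / S) / np.sqrt(2 * np.pi * S)
    w /= w.sum()
    est_act[t] = w @ A                           # posterior probability each tap is active
    # multinomial resampling when the effective sample size degenerates
    if 1.0 / np.sum(w**2) < P / 2:
        idx = rng.choice(P, size=P, p=w)
        A, m, Pc = A[idx], m[idx], Pc[idx]
        w[:] = 1.0 / P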

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we introduce a pilot-aided multipath channel estimator for Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems. Typical estimation algorithms assume the number of multipath components and delays to be known and constant, while their amplitudes may vary in time. In this work, we focus on the more realistic assumption that the number of channel taps is also unknown and time-varying. The estimation problem arising from this assumption is solved using Random Set Theory (RST), which is a probability theory of finite sets. Due to the lack of a closed form for the optimal filter, a Rao-Blackwellized Particle Filter (RBPF) implementation of the channel estimator is derived. Simulation results demonstrate the estimator's effectiveness.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents several algorithms for joint estimation of the target number and target states in a time-varying scenario. Building on the results presented in [1], which considers estimation of the target number only, we assume that not only the target number but also the state evolution of the targets must be estimated. In this context, we extend the Rao-Blackwellization procedure of [1] to this new scenario to compute the Bayes recursions, thus defining reduced-complexity solutions for the multi-target set estimator. A performance assessment is finally given both in terms of the Circular Position Error Probability, aimed at evaluating the accuracy of the estimated track, and in terms of the Cardinality Error Probability, aimed at evaluating the reliability of the target number estimates.

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE: To better understand the structure of the Patient Assessment of Chronic Illness Care (PACIC) instrument and, more specifically, to test all published validation models using a single data set and appropriate statistical tools. DESIGN: Validation study using data from a cross-sectional survey. PARTICIPANTS: A population-based sample of non-institutionalized adults with diabetes residing in Switzerland (canton of Vaud). MAIN OUTCOME MEASURE: French version of the 20-item PACIC instrument (5-point response scale). We conducted validation analyses using confirmatory factor analysis (CFA). The original five-dimension model and other published models were tested with three types of CFA, based on (i) a Pearson estimator of the variance-covariance matrix, (ii) a polychoric correlation matrix and (iii) likelihood estimation with a multinomial distribution for the manifest variables. All models were assessed using loadings and goodness-of-fit measures. RESULTS: The analytical sample included 406 patients. Mean age was 64.4 years and 59% were men. Medians of item responses varied between 1 and 4 (range 1-5), and the proportion of missing values ranged between 5.7% and 12.3%. Strong floor and ceiling effects were present. Even though the loadings of the tested models were relatively high, the only model showing acceptable fit was the 11-item single-dimension model. PACIC was associated with the expected variables of the field. CONCLUSIONS: Our results showed that the model considering 11 items in a single dimension exhibited the best fit for our data. A single score, complementing the consideration of single-item results, might be used instead of the five dimensions usually described.
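
For readers who want to see what fitting the single-dimension model amounts to, here is a bare-bones one-factor CFA by normal-theory maximum likelihood on a Pearson covariance matrix, i.e. only the first of the three estimation strategies listed above; the polychoric and multinomial variants and the treatment of the ordinal 5-point scale are not reproduced, and the item data are simulated placeholders rather than PACIC responses.

import numpy as np
from scipy.optimize import minimize

def fit_one_factor(S, n_items):
    """One-factor CFA: implied covariance Sigma = lam*lam' + diag(psi),
    fitted by minimizing the normal-theory ML discrepancy
    F = log|Sigma| + tr(S Sigma^-1) - log|S| - p."""
    p = n_items
    _, logdetS = np.linalg.slogdet(S)

    def discrepancy(theta):
        lam, psi = theta[:p], np.exp(theta[p:])          # exp() keeps uniquenesses positive
        Sigma = np.outer(lam, lam) + np.diag(psi)
        _, logdet = np.linalg.slogdet(Sigma)
        return logdet + np.trace(S @ np.linalg.inv(Sigma)) - logdetS - p

    theta0 = np.concatenate([np.full(p, 0.5), np.zeros(p)])
    res = minimize(discrepancy, theta0, method="L-BFGS-B")
    return res.x[:p], np.exp(res.x[p:]), res.fun         # loadings, uniquenesses, fit value

# Simulated 11-item example standing in for the 11-item single-dimension model.
rng = np.random.default_rng(1)
f = rng.normal(size=(406, 1))
items = f @ rng.uniform(0.4, 0.9, size=(1, 11)) + rng.normal(scale=0.6, size=(406, 11))
loadings, uniquenesses, fmin = fit_one_factor(np.cov(items, rowvar=False), 11)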

Relevance:

10.00%

Publisher:

Abstract:

This paper studies the apparent contradiction between two strands of the literature on the effects of financial intermediation on economic activity. On the one hand, the empirical growth literature finds a positive effect of financial depth as measured by, for instance, private domestic credit and liquid liabilities (e.g., Levine, Loayza, and Beck 2000). On the other hand, the banking and currency crisis literature finds that monetary aggregates, such as domestic credit, are among the best predictors of crises and their related economic downturns (e.g., Kaminsky and Reinhart 1999). The paper accounts for these contrasting effects based on the distinction between the short- and long-run impacts of financial intermediation. Working with a panel of cross-country and time-series observations, the paper estimates an encompassing model of short- and long-run effects using the Pooled Mean Group estimator developed by Pesaran, Shin, and Smith (1999). The conclusion from this analysis is that a positive long-run relationship between financial intermediation and output growth coexists with a mostly negative short-run relationship. The paper further develops an explanation for these contrasting effects by relating them to recent theoretical models, by linking the estimated short-run effects to measures of financial fragility (namely, banking crises and financial volatility), and by jointly analyzing the effects of financial depth and fragility in classic panel growth regressions.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we propose a subsampling estimator for the distribution of statistics diverging at either known or unknown rates when the underlying time series is strictly stationary and strong mixing. Based on our results, we provide a detailed discussion of how to estimate extreme order statistics with dependent data and present two applications to assessing financial market risk. Our method performs well in estimating Value at Risk and provides a superior alternative to Hill's estimator in operationalizing safety-first portfolio selection.
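
Since Hill's estimator serves as the benchmark above, a compact reference implementation may be useful (the subsampling distribution estimator itself is not reproduced): the Hill tail-index estimate from the k largest losses together with the usual Weissman-type extrapolation to an extreme quantile, i.e. a Value-at-Risk estimate. The choice of k and the toy Pareto sample are illustrative.

import numpy as np

def hill_var(losses, k, p):
    """Hill tail-index estimate from the k largest losses and the corresponding
    Weissman extrapolation to the quantile with exceedance probability p (a VaR).
    losses: positive loss observations; requires p < k / len(losses)."""
    x = np.sort(np.asarray(losses, float))
    n = len(x)
    tail, threshold = x[n - k:], x[n - k - 1]
    gamma = np.mean(np.log(tail) - np.log(threshold))        # Hill estimator of the tail index
    var_p = threshold * (k / (n * p)) ** gamma               # Weissman quantile estimate
    return gamma, var_p

# Toy heavy-tailed loss sample (Pareto with tail index alpha = 3, i.e. gamma = 1/3).
losses = np.random.pareto(3.0, 5000) + 1.0
gamma_hat, var_99_9 = hill_var(losses, k=250, p=0.001)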

Relevance:

10.00%

Publisher:

Abstract:

We introduce simple nonparametric density estimators that generalize the classical histogram and frequency polygon. The new estimators are expressed as linear combinations of density functions that are piecewise polynomials, where the coefficients are optimally chosen in order to minimize the integrated square error of the estimator. We establish the asymptotic behaviour of the proposed estimators and study their performance in a simulation study.
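
As a point of reference for the classical special cases mentioned above, the sketch below computes a histogram-based frequency polygon (linear interpolation of bin-midpoint heights); the optimally weighted piecewise-polynomial estimators proposed in the paper are not reproduced.

import numpy as np

def frequency_polygon(x, bins=20):
    """Density estimate obtained by joining the midpoints of a histogram
    with straight lines (the classical frequency polygon)."""
    heights, edges = np.histogram(x, bins=bins, density=True)
    mids = 0.5 * (edges[:-1] + edges[1:])
    def f(t):
        # linear interpolation between bin midpoints, zero outside the data range
        return np.interp(t, mids, heights, left=0.0, right=0.0)
    return f

x = np.random.normal(size=1000)
fp = frequency_polygon(x)
grid = np.linspace(-4, 4, 200)
density = fp(grid)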