747 results for intractable empirical likelihood


Relevance:

100.00%

Publisher:

Abstract:

We evaluate the use of Generalized Empirical Likelihood (GEL) estimators in portfolio efficiency tests for asset pricing models in the presence of conditional information. Estimators in the GEL family have desirable statistical properties, such as robustness to misspecification and better behaviour in finite samples. Unlike GMM, the bias of GEL estimators does not increase as more moment conditions are included, which is relevant in conditional efficiency analysis. We find evidence that estimators in the GEL class do perform differently in small samples, where efficiency tests using GEL generate lower estimates than tests using the standard GMM approach. Monte Carlo experiments show that GEL performs better when distortions are present in the data, especially under heavy tails and Gaussian shocks.
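The machinery behind GEL-family estimators can be illustrated in the simplest setting: the empirical likelihood for a population mean. The sketch below (an illustration under our own naming, not the paper's estimator) solves the dual problem for the Lagrange multiplier with Newton's method:

```python
import numpy as np

def el_weights(x, mu, tol=1e-10, max_iter=200):
    """Empirical likelihood weights for a candidate mean mu.

    Solves the dual problem p_i = 1 / (n * (1 + lam * (x_i - mu))),
    where lam is chosen so the weighted mean equals mu.
    """
    z = x - mu
    n = len(x)
    lam = 0.0
    for _ in range(max_iter):  # Newton's method on the dual equation
        denom = 1.0 + lam * z
        g = np.sum(z / denom)            # first-order condition in lam
        h = -np.sum(z**2 / denom**2)     # its derivative
        step = g / h
        lam -= step
        if abs(step) < tol:
            break
    return 1.0 / (n * (1.0 + lam * z))

def el_ratio(x, mu):
    """-2 log empirical likelihood ratio; approx chi-square(1) under H0."""
    p = el_weights(x, mu)
    return -2.0 * np.sum(np.log(len(x) * p))
```

At mu equal to the sample mean the weights collapse to the uniform 1/n and the ratio statistic is zero; away from it, the weights tilt the empirical distribution so the constraint holds.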

Relevance:

100.00%

Publisher:

Abstract:

In longitudinal data analysis, our primary interest is in the regression parameters for the marginal expectations of the longitudinal responses; the longitudinal correlation parameters are of secondary interest. The joint likelihood function for longitudinal data is challenging, particularly for correlated discrete outcome data. Marginal modeling approaches such as generalized estimating equations (GEEs) have received much attention in the context of longitudinal regression. These methods are based on the estimates of the first two moments of the data and the working correlation structure. Confidence regions and hypothesis tests are based on asymptotic normality. The methods are sensitive to misspecification of the variance function and the working correlation structure. Because of such misspecifications, the estimates can be inefficient and inconsistent, and inference may give incorrect results. To overcome this problem, we propose an empirical likelihood (EL) procedure based on a set of estimating equations for the parameter of interest and discuss its characteristics and asymptotic properties. We also provide an algorithm based on EL principles for the estimation of the regression parameters and the construction of a confidence region for the parameter of interest. We extend our approach to variable selection for high-dimensional longitudinal data with many covariates. In this situation it is necessary to identify a submodel that adequately represents the data. Including redundant variables may impact the model’s accuracy and efficiency for inference. We propose a penalized empirical likelihood (PEL) variable selection procedure based on GEEs; the variable selection and the estimation of the coefficients are carried out simultaneously. We discuss its characteristics and asymptotic properties, and present an algorithm for optimizing PEL.
Simulation studies show that when the model assumptions are correct, our method performs as well as existing methods, and when the model is misspecified, it has clear advantages. We have applied the method to two case examples.
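As a minimal illustration of GEE-style marginal modelling (a sketch under simplifying assumptions with our own naming, not the EL procedure proposed above), the function below solves the estimating equations for a linear marginal mean with an exchangeable working correlation:

```python
import numpy as np

def gee_identity(X_list, y_list, n_iter=20):
    """GEE for a marginal linear model (identity link) with an
    exchangeable working correlation.  X_list[i] is (t_i, p) and
    y_list[i] is (t_i,) for cluster i.  Iterates between solving
    sum_i X_i' R^{-1} (y_i - X_i beta) = 0 and re-estimating the
    correlation parameter alpha from standardized residuals.
    """
    p = X_list[0].shape[1]
    beta = np.zeros(p)
    alpha = 0.0  # exchangeable correlation parameter
    for _ in range(n_iter):
        A, b = np.zeros((p, p)), np.zeros(p)
        for X, y in zip(X_list, y_list):
            t = len(y)
            R = (1 - alpha) * np.eye(t) + alpha * np.ones((t, t))
            Rinv = np.linalg.inv(R)
            A += X.T @ Rinv @ X
            b += X.T @ Rinv @ y
        beta = np.linalg.solve(A, b)
        # moment update of alpha (pooled variance for simplicity)
        resid = [y - X @ beta for X, y in zip(X_list, y_list)]
        sigma2 = np.mean([np.mean(r**2) for r in resid])
        num, den = 0.0, 0
        for r in resid:
            t = len(r)
            num += (np.sum(np.outer(r, r)) - np.sum(r**2)) / sigma2
            den += t * (t - 1)
        alpha = num / den
    return beta, alpha
```

The first pass (alpha = 0) is ordinary least squares on the stacked data; subsequent passes reweight each cluster by the inverse working correlation.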

Relevance:

100.00%

Publisher:

Abstract:

Since its discovery, chaos has been a fascinating and challenging research topic, and many great minds have spent their careers trying to impose rules on it. Thanks to the research of the last century and the advent of computers, it is now possible to predict chaotic natural phenomena for a limited amount of time. The aim of this study is to present a recently proposed method for parameter estimation of chaotic dynamical system models via the correlation integral likelihood, give some hints for its more effective use, and outline a possible industrial application. The main part of our study concerns two chaotic attractors with different general behaviour, chosen to capture possible differences in the results. In our simulations the initial conditions were varied quite exhaustively. The results show that, under certain conditions, the method works very well in all cases. In particular, the most important aspect is to be very careful when creating the training set and the empirical likelihood, since a lack of information at this stage of the procedure leads to low-quality results.
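The building block of the correlation integral likelihood is the correlation sum C(r), the fraction of pairs of orbit points closer than r. A minimal sketch (ours, using the chaotic logistic map rather than the attractors studied here):

```python
import numpy as np

def correlation_sum(points, r):
    """Fraction of distinct point pairs closer than r: the correlation
    integral C(r) estimated on a finite orbit."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    n = len(points)
    return (np.sum(d < r) - n) / (n * (n - 1))  # exclude self-pairs

def delay_embed(x, dim, lag=1):
    """Time-delay embedding of a scalar series into R^dim."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

# Orbit of the chaotic logistic map x -> 4 x (1 - x)
x = np.empty(500)
x[0] = 0.3
for t in range(499):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

pts = delay_embed(x, dim=2)
curve = [correlation_sum(pts, r) for r in (0.01, 0.05, 0.2)]
```

Evaluating C(r) over a grid of radii gives the feature vector on which the likelihood is then built; the slope of log C(r) against log r also estimates the attractor's correlation dimension.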

Relevance:

90.00%

Publisher:

Abstract:

Two main approaches are commonly used to empirically evaluate linear factor pricing models: regression and SDF methods, with centred and uncentred versions of the latter. We show that unlike standard two-step or iterated GMM procedures, single-step estimators such as continuously updated GMM yield numerically identical values for prices of risk, pricing errors, Jensen's alphas and overidentifying restrictions tests irrespective of the model validity. Therefore, there is arguably a single approach regardless of the factors being traded or not, or the use of excess or gross returns. We illustrate our results by revisiting Lustig and Verdelhan's (2007) empirical analysis of currency returns.
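The single-step estimator in question can be sketched on a toy overidentified problem: estimating a common mean from two noisy series. In continuously updated GMM the weight matrix is re-evaluated at the same parameter value as the sample moments (here the uncentred second-moment matrix); this is an illustrative sketch with our own names, not the paper's factor-pricing application:

```python
import numpy as np

def cu_gmm_objective(theta, x, y):
    """Continuously updated GMM objective for the overidentified
    common-mean model E[x - theta] = 0, E[y - theta] = 0.  Unlike
    two-step GMM, the weight matrix is recomputed at the same theta
    as the moments, not fixed at a first-step estimate."""
    g = np.column_stack([x - theta, y - theta])  # n x 2 moment functions
    gbar = g.mean(axis=0)
    S = g.T @ g / len(x)                         # weight matrix at theta
    return float(gbar @ np.linalg.solve(S, gbar))

def cu_gmm_estimate(x, y, grid):
    """Grid-search minimizer of the CU-GMM objective (a sketch; a real
    implementation would use a numerical optimizer)."""
    vals = [cu_gmm_objective(t, x, y) for t in grid]
    return float(grid[int(np.argmin(vals))])
```

Multiplying the minimized objective by the sample size gives the overidentifying restrictions (J) statistic for this toy model.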

Relevance:

80.00%

Publisher:

Abstract:

We propose new spanning tests that assess whether the initial and additional assets share the economically meaningful cost and mean representing portfolios. We prove their asymptotic equivalence to existing tests under local alternatives. We also show that unlike two-step or iterated procedures, single-step methods such as continuously updated GMM yield numerically identical overidentifying restrictions tests, so there is arguably a single spanning test. To prove these results, we extend optimal GMM inference to deal with singularities in the long-run second moment matrix of the influence functions. Finally, we test for spanning using size and book-to-market sorted US stock portfolios.

Relevance:

80.00%

Publisher:

Abstract:

This paper analyses the properties of a new measure of misspecification for pricing models, related to the size of the multiplicative adjustment needed for the model to be correctly specified. We then characterize the parameter that minimizes this measure through a dual program that is simpler to solve. The natural estimators of this parameter belong to the Generalized Empirical Likelihood class. We derive the asymptotic properties of the estimator under the hypothesis of misspecification. The methodology is used to study the finite-sample behaviour of estimates of relative risk aversion in a disaster economy when the estimators are associated with our misspecification measure. In the simulations we see that risk aversion is overestimated on average, even when a significant number of disasters occurs.

Relevance:

80.00%

Publisher:

Abstract:

Economists and other social scientists often face situations where they have access to two datasets that they can use but one set of data suffers from censoring or truncation. If the censored sample is much bigger than the uncensored sample, it is common for researchers to use the censored sample alone and attempt to deal with the problem of partial observation in some manner. Alternatively, they simply use only the uncensored sample and ignore the censored one so as to avoid biases. It is rarely the case that researchers use both datasets together, mainly because they lack guidance about how to combine them. In this paper, we develop a tractable semiparametric framework for combining the censored and uncensored datasets so that the resulting estimators are consistent, asymptotically normal, and use all information optimally. When the censored sample, which we refer to as the master sample, is much bigger than the uncensored sample (which we call the refreshment sample), the latter can be thought of as providing identification where it is otherwise absent. In contrast, when the refreshment sample is large and could typically be used alone, our methodology can be interpreted as using information from the censored sample to increase efficiency. To illustrate our results in an empirical setting, we show how to estimate the effect of changes in compulsory schooling laws on age at first marriage, a variable that is censored for younger individuals. We also demonstrate how refreshment samples for this application can be created by matching cohort information across census datasets.

Relevance:

80.00%

Publisher:

Abstract:

Many datasets used by economists and other social scientists are collected by stratified sampling. The sampling scheme used to collect the data induces a probability distribution on the observed sample that differs from the target or underlying distribution for which inference is to be made. If this effect is not taken into account, subsequent statistical inference can be seriously biased. This paper shows how to do efficient semiparametric inference in moment restriction models when data from the target population is collected by three widely used sampling schemes: variable probability sampling, multinomial sampling, and standard stratified sampling.
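Of the three schemes, variable probability sampling is the easiest to illustrate: units enter the sample with known, unit-specific probabilities, and weighting each retained observation by the inverse of its inclusion probability removes the induced bias. A hedged sketch (ours, for a simple population mean rather than a general moment restriction model):

```python
import numpy as np

def ipw_mean(y, p_incl):
    """Horvitz-Thompson style reweighting: each retained unit is
    weighted by the inverse of its inclusion probability, undoing the
    bias that variable probability sampling puts into the naive mean."""
    w = 1.0 / p_incl
    return np.sum(w * y) / np.sum(w)

# Population in which large values of y are oversampled
rng = np.random.default_rng(0)
y_pop = rng.normal(0.0, 1.0, 100_000)
p = 1.0 / (1.0 + np.exp(-y_pop))   # inclusion prob. increases with y
keep = rng.random(100_000) < p
y_s, p_s = y_pop[keep], p[keep]

naive = y_s.mean()                 # biased upward by the sampling scheme
corrected = ipw_mean(y_s, p_s)     # close to the population mean of 0
```

The same reweighting idea underlies efficient estimation under the other two schemes, where the inclusion probabilities come from the stratum definitions.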

Relevance:

30.00%

Publisher:

Abstract:

The paper studies what drives firms to voluntarily delist from capital markets and what differs in firms’ behavior and fundamentals between public-to-private transactions and M&A deals with listed corporations. Moreover, I study the relationship between the ownership percentage in controlling shareholders’ hands and cumulative returns around the delisting public announcement. I perform my tests for both the Italian and US markets and compare the findings to better understand how the phenomenon works in these different institutional environments. Consistent with my expectations, I find that the likelihood of delisting is mainly related to size, underperformance and undervaluation, while shareholders are more rewarded when their companies are involved in PTP transactions than in M&As with public firms.

Relevance:

30.00%

Publisher:

Abstract:

This comment corrects the errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function. In the global maximum more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may be outside the interval [0,1]. We have solved the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
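The point about kernel sign can be illustrated directly: with a nonnegative kernel, the smoothed participation probability is a convex combination of the binary responses and therefore stays in [0, 1]. A minimal sketch (not Klein and Spady's full semiparametric estimator; the function name is ours):

```python
import numpy as np

def nw_prob(x_eval, x, y, h):
    """Nadaraya-Watson estimate of P(y = 1 | x) with a Gaussian kernel
    and bandwidth h.  Because the kernel is nonnegative, the estimate
    is a weighted average of binary responses and stays inside [0, 1];
    higher-order kernels that take negative values lose this guarantee."""
    u = (x_eval[:, None] - x[None, :]) / h
    k = np.exp(-0.5 * u**2)
    return (k @ y) / k.sum(axis=1)
```

Each row of `k` holds the kernel weights that one evaluation point assigns to the sample, so the result is one estimated probability per evaluation point.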

Relevance:

30.00%

Publisher:

Abstract:

This paper empirically analyses the hypothesis of the existence of a dual market for contracts in local services. Large firms that operate on a national basis control the contracts for delivery in the most populated and/or urban municipalities, whereas small firms that operate at a local level hold the contracts in the least populated and/or rural municipalities. The dual market implies high concentration and dominance of major firms in large municipalities, and local monopolies in the smaller ones. This market structure is harmful to competition for the market, as the effective number of competitors is low across all municipalities. Thus, it reduces the likelihood of obtaining cost savings from privatization.

Relevance:

30.00%

Publisher:

Abstract:

Land cover classification is a key research field in remote sensing and land change science as thematic maps derived from remotely sensed data have become the basis for analyzing many socio-ecological issues. However, land cover classification remains a difficult task and it is especially challenging in heterogeneous tropical landscapes where nonetheless such maps are of great importance. The present study aims to establish an efficient classification approach to accurately map all broad land cover classes in a large, heterogeneous tropical area of Bolivia, as a basis for further studies (e.g., land cover-land use change). Specifically, we compare the performance of parametric (maximum likelihood), non-parametric (k-nearest neighbour and four different support vector machines - SVM), and hybrid classifiers, using both hard and soft (fuzzy) accuracy assessments. In addition, we test whether the inclusion of a textural index (homogeneity) in the classifications improves their performance. We classified Landsat imagery for two dates corresponding to dry and wet seasons and found that non-parametric, and particularly SVM classifiers, outperformed both parametric and hybrid classifiers. We also found that the use of the homogeneity index along with reflectance bands significantly increased the overall accuracy of all the classifications, but particularly of SVM algorithms. We observed that improvements in producer’s and user’s accuracies through the inclusion of the homogeneity index were different depending on land cover classes. Early-growth/degraded forests, pastures, grasslands and savanna were the classes most improved, especially with the SVM radial basis function and SVM sigmoid classifiers, though with both classifiers all land cover classes were mapped with producer’s and user’s accuracies of around 90%.
Our approach seems very well suited to accurately map land cover in tropical regions, thus having the potential to contribute to conservation initiatives, climate change mitigation schemes such as REDD+, and rural development policies.
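The homogeneity index mentioned above is typically computed from a gray-level co-occurrence matrix (GLCM). A minimal sketch for a single horizontal pixel offset (ours, not the study's processing chain):

```python
import numpy as np

def glcm(img, levels):
    """Normalized gray-level co-occurrence matrix P(i, j) for the
    horizontal (dx = 1) pixel offset.  img must hold integer gray
    levels in [0, levels)."""
    a, b = img[:, :-1].ravel(), img[:, 1:].ravel()
    P = np.zeros((levels, levels))
    np.add.at(P, (a, b), 1.0)
    return P / P.sum()

def homogeneity(P):
    """GLCM homogeneity, sum_ij P(i, j) / (1 + (i - j)^2): equal to 1
    for perfectly uniform texture, smaller for heterogeneous texture."""
    i, j = np.indices(P.shape)
    return float(np.sum(P / (1.0 + (i - j) ** 2)))
```

In a classification pipeline this index would be computed over a moving window and appended to the reflectance bands as an extra feature layer.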

Relevance:

30.00%

Publisher:

Abstract:

This paper empirically explores the link between quality and concentration in a cross-section of manufactured goods. Using concentration data and product quality indicators, an ordered probit estimation explores the impact of concentration on quality, defined as an index of quality characteristics. The results demonstrate that market concentration and quality are positively correlated across different industries. When industry concentration increases, the likelihood of the product being of higher quality increases and the likelihood of observing lower quality decreases.
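An ordered probit models an observed category as a latent index crossing estimated cutpoints. A hedged sketch of its log-likelihood for a single regressor (illustrative; the variable names are ours, not the paper's specification):

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def ordered_probit_loglik(beta, cuts, x, y):
    """Log-likelihood of an ordered probit with one regressor: latent
    index q* = x * beta + e with e ~ N(0, 1); category y = k is
    observed when q* lies between cutpoints cuts[k-1] and cuts[k]."""
    c = np.concatenate([[-np.inf], cuts, [np.inf]])
    ll = 0.0
    for xi, yi in zip(x, y):
        p = norm_cdf(c[yi + 1] - beta * xi) - norm_cdf(c[yi] - beta * xi)
        ll += np.log(max(p, 1e-300))  # guard against log(0)
    return ll
```

Maximizing this function over `beta` and the cutpoints yields the estimates; the positive concentration-quality link reported above corresponds to a positive estimated slope on the concentration regressor.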

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Barbiturate-induced coma can be used in patients to treat intractable intracranial hypertension when other therapies, such as osmotic therapy and sedation, have failed. Despite control of intracranial pressure, cerebral infarction may still occur in some patients, and the effect of barbiturates on outcome remains uncertain. In this study, we examined the relationship between barbiturate infusion and brain tissue oxygen (PbtO2). METHODS: Ten volume-resuscitated brain-injured patients who were treated with pentobarbital infusion for intracranial hypertension and underwent PbtO2 monitoring were studied in a neurosurgical intensive care unit at a university-based Level I trauma center. PbtO2, intracranial pressure (ICP), mean arterial pressure, cerebral perfusion pressure (CPP), and brain temperature were continuously monitored and compared in settings in which barbiturates were or were not administered. RESULTS: Data were available from 1595 hours of PbtO2 monitoring. When pentobarbital administration began, the mean ICP, CPP, and PbtO2 were 18 +/- 10, 72 +/- 18, and 28 +/- 12 mm Hg, respectively. During the 3 hours before barbiturate infusion, the maximum ICP was 24 +/- 13 mm Hg and the minimum CPP was 65 +/- 20 mm Hg. In the majority of patients (70%), we observed an increase in PbtO2 associated with pentobarbital infusion. Within this group, logistic regression analysis demonstrated that a higher likelihood of compromised brain oxygen (PbtO2 < 20 mm Hg) was associated with a decrease in pentobarbital dose after controlling for ICP and other physiological parameters (P < 0.001). In the remaining 3 patients, pentobarbital was associated with lower PbtO2 levels. These patients had higher ICP, lower CPP, and later initiation of barbiturates compared with patients whose PbtO2 increased. 
CONCLUSION: Our preliminary findings suggest that pentobarbital administered for intractable intracranial hypertension is associated with a significant and independent increase in PbtO2 in the majority of patients. However, in some patients with more compromised brain physiology, pentobarbital may have a negative effect on PbtO2, particularly if administered late. Larger studies are needed to examine the relationship between barbiturates and cerebral oxygenation in brain-injured patients with refractory intracranial hypertension and to determine whether PbtO2 responses can help guide therapy.