67 results for misspecification


Relevance:

10.00%

Publisher:

Abstract:

This paper extends prior work on the linkage between politically connected (PCON) firms and capital structure in developing countries. Specifically, it focuses on the association between Malaysian PCON firms and leverage, and is motivated by the results of Fraser et al. (2006), who report a positive association between leverage and political patronage. Controlling for a potential misspecification in that paper, this study documents that a significant proportion (almost 12%) of Malaysian PCON firms have negative equity, and builds on the earlier paper by providing fresh evidence that the market-to-book ratio is positively associated with leverage and that borrowing PCON firms have significantly lower ROA than non-PCON firms. © 2012.

Relevance:

10.00%

Publisher:

Abstract:

The concept of the stochastic discount factor (SDF) pervades the modern theory of asset pricing. At first, the SDF simply allows otherwise unrelated pricing models to be discussed in the same terms. Hansen and Jagannathan showed, however, that this powerful concept underlying asset pricing models carries valuable information of its own: from security market data, one can explore the behavior of this random variable and derive a useful variance bound. Through the same instrument, they also address a central pitfall of modern asset pricing: model misspecification. These major contributions, along with some of their extensions, are thoroughly investigated in this exposition.
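
The variance bound mentioned in this abstract has a compact closed form: any SDF m with E[m] = v that prices a set of gross returns R (so E[mR] = 1) must satisfy sd(m) >= sqrt(e' Sigma^{-1} e) with e = 1 - v*E[R], where E[R] and Sigma are the mean and covariance of returns. A minimal NumPy sketch on synthetic data (the return process and candidate means below are illustrative, not from the paper):

```python
import numpy as np

def hj_bound(returns, m_mean):
    """Hansen-Jagannathan lower bound on the SDF's standard deviation.

    returns: (T, N) array of gross asset returns; m_mean: candidate E[m].
    Any SDF m with E[m] = m_mean that prices the assets (E[m R] = 1)
    satisfies  sd(m) >= sqrt(e' Sigma^{-1} e),  e = 1 - m_mean * E[R].
    """
    mu = returns.mean(axis=0)              # mean gross returns
    cov = np.cov(returns, rowvar=False)    # return covariance matrix
    e = np.ones_like(mu) - m_mean * mu     # pricing errors of a constant SDF
    return float(np.sqrt(e @ np.linalg.solve(cov, e)))

# Illustration on synthetic gross returns for two assets
rng = np.random.default_rng(0)
R = 1.05 + 0.10 * rng.standard_normal((500, 2))
bound = hj_bound(R, 0.95)
```

Sweeping `m_mean` over a grid traces out the familiar cup-shaped Hansen-Jagannathan frontier in (E[m], sd(m)) space; candidate pricing models whose implied SDF falls below the frontier are rejected by the data.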

Relevance:

10.00%

Publisher:

Abstract:

The question of the crowding-out of private investment by public expenditure, public investment in particular, in the Brazilian economy has been discussed more in ideological terms than on empirical grounds. The present paper tries to avoid the limitations of previous studies by estimating an equation for private investment which makes it possible to evaluate the effect of economic policies on private investment. The private investment equation was deduced by modifying the optimal flexible accelerator model (OFAM), incorporating some channels through which public expenditure influences private investment. The OFAM consists of adding adjustment costs to the neoclassical theory of investment. The investment function deduced is quite general and has the following explanatory variables: relative prices (user cost of capital/input price ratios), real interest rates, real product, public expenditures and the lagged private stock of capital. The model was estimated on data for private manufacturing industry. The procedure adopted in estimating the model was to begin with a model as general as possible, then apply restrictions to the model's parameters and test their statistical significance. Complete diagnostic testing was also carried out in order to test the stability of the estimated equations. This procedure avoids the shortcomings of estimating a model with a priori restrictions on its parameters, which may lead to model misspecification. The main findings of the present study were: the increase in public expenditure, at least in the long run, has in general a positive expectation effect on private investment greater than its crowding-out effect owing to the simultaneous rise in interest rates; a change in economic policy, such as that of the Geisel administration, may have an important effect on private investment; and relative prices are relevant in determining the level of the desired stock of capital and of private investment.

Relevance:

10.00%

Publisher:

Abstract:

In this work we investigate the small-sample properties and the robustness of parameter estimates in DSGE models. Taking the Smets and Wouters (2007) model as a baseline, we evaluate the performance of two estimation procedures: the Simulated Method of Moments (SMM) and Maximum Likelihood (ML). We examine the empirical distribution of the parameter estimates and its implications for impulse-response and variance-decomposition analyses under both correct specification and misspecification. Our results point to poor performance of SMM and to some patterns of bias in the impulse-response and variance-decomposition analyses based on ML estimates in the misspecification cases considered.
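
The Simulated Method of Moments evaluated in this abstract can be illustrated on a far simpler model than Smets-Wouters. The hypothetical one-parameter sketch below estimates an AR(1) coefficient by matching the lag-1 autocorrelation of a long simulated path to that of the data, reusing the same shocks across candidate parameters (common random numbers) so the objective is smooth:

```python
import numpy as np

def simulate_ar1(rho, T, rng):
    """Simulate y_t = rho * y_{t-1} + e_t with standard normal shocks."""
    e = rng.standard_normal(T)
    y = np.empty(T)
    y[0] = e[0]
    for t in range(1, T):
        y[t] = rho * y[t - 1] + e[t]
    return y

def autocorr1(y):
    """Sample lag-1 autocorrelation."""
    d = y - y.mean()
    return (d[1:] @ d[:-1]) / (d @ d)

def smm_ar1(data, grid, T_sim=20_000, seed=42):
    """One-moment SMM: choose the rho on `grid` minimizing the squared gap
    between the data's lag-1 autocorrelation and that of a simulated path.
    A fixed seed (common random numbers) keeps the objective smooth in rho."""
    target = autocorr1(data)
    losses = []
    for rho in grid:
        sim = simulate_ar1(rho, T_sim, np.random.default_rng(seed))
        losses.append((autocorr1(sim) - target) ** 2)
    return grid[int(np.argmin(losses))]
```

In a full DSGE setting the same logic applies, only with a vector of moments (e.g. variances and cross-correlations of observables) and a weighting matrix in place of the scalar squared gap.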

Relevance:

10.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

10.00%

Publisher:

Abstract:

The issue of assessing variance components is essential in deciding on the inclusion of random effects in the context of mixed models. In this work we discuss this problem for nonlinear elliptical models for correlated data, using the score-type test proposed in Silvapulle and Silvapulle (1995). Being asymptotically equivalent to the likelihood ratio test and requiring estimation only under the null hypothesis, this test provides a fairly easily computable alternative for assessing one-sided hypotheses in the context of the marginal model. To accommodate possibly non-normal distributions, we assume that the joint distribution of the response variable and the random effects lies in the elliptical class, which includes light-tailed and heavy-tailed distributions such as the Student-t, power exponential, logistic, generalized Student-t, generalized logistic and contaminated normal, as well as the normal itself. We compare the sensitivity of the score-type test under normal, Student-t and power exponential models for the kinetics data set discussed in Vonesh and Carter (1992) and fitted using the model presented in Russo et al. (2009). A simulation study is also performed to analyze the consequences of kurtosis misspecification.

Relevance:

10.00%

Publisher:

Abstract:

Contracts paying a guaranteed minimum rate of return and a fraction of a positive excess rate, which is specified relative to a benchmark portfolio, are closely related to unit-linked life-insurance products and can be considered as alternatives to direct investment in the underlying benchmark. They contain an embedded power option, and the key issue is the tractable and realistic hedging of this option, in order to rigorously justify valuation by arbitrage arguments and prevent the guarantees from becoming uncontrollable liabilities to the issuer. We show how to determine the contract parameters conservatively and implement robust risk-management strategies.
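
The contract described in this abstract embeds a power option. As a simplified stand-in (not the paper's exact contract), the sketch below values a stylized guarantee-plus-participation payoff: the guaranteed amount plus a fraction alpha of the positive excess of the underlying over that amount, so the participation term is alpha Black-Scholes calls struck at the guarantee. All parameter values are illustrative:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

def guarantee_contract_value(S0, g, alpha, r, sigma, T):
    """Value of a stylized guarantee contract with payoff
        S0*exp(g*T) + alpha * max(S_T - S0*exp(g*T), 0):
    the discounted guaranteed amount plus alpha calls struck at it."""
    K = S0 * exp(g * T)  # guaranteed terminal amount
    return K * exp(-r * T) + alpha * bs_call(S0, K, r, sigma, T)
```

Setting the issue price equal to `guarantee_contract_value(...)` is one way to choose the contract parameters (g, alpha) conservatively: raising the guaranteed rate g raises both the bond leg and the option strike, and the fair participation alpha falls accordingly.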

Relevance:

10.00%

Publisher:

Abstract:

This paper revisits the issue of conditional volatility in real GDP growth rates for Canada, Japan, the United Kingdom, and the United States. Previous studies find high persistence in the volatility. This paper shows that this finding largely reflects a nonstationary variance: output growth in the four countries became noticeably less volatile over the past few decades. We employ the modified ICSS algorithm to detect structural change in the unconditional variance of output growth and find one structural break in each of the four countries. We then use generalized autoregressive conditional heteroskedasticity (GARCH) specifications to model output growth and its volatility with and without the break in volatility. Once we incorporate the break in the variance equation of output for the four countries, the time-varying variance falls sharply in Canada, Japan, and the U.K. and disappears in the U.S., while excess kurtosis vanishes in Canada, Japan, and the U.S. and drops substantially in the U.K. That is, the integrated GARCH (IGARCH) effect proves spurious and the GARCH model is misspecified if researchers neglect a nonstationary unconditional variance.
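
The ICSS algorithm referred to above builds on the centered cumulative sum of squares D_k = C_k/C_T - k/T, where C_k is the running sum of squared observations; under a single variance shift, |D_k| peaks near the break. A single-break sketch of that statistic (not the full iterated algorithm with its critical values), on simulated data with hypothetical regime variances:

```python
import numpy as np

def icss_break(x):
    """Estimate a single break point in the unconditional variance of x.

    Computes the centered cumulative sum of squares
        D_k = C_k / C_T - k / T,   C_k = sum_{t <= k} x_t^2,
    the statistic behind the ICSS algorithm of Inclan and Tiao (1994);
    |D_k| peaks near the point where the variance shifts.  Returns the
    0-based index of the last observation of the first variance regime.
    """
    c = np.cumsum(np.asarray(x, dtype=float) ** 2)
    T = len(c)
    k = np.arange(1, T + 1)
    D = c / c[-1] - k / T
    return int(np.argmax(np.abs(D)))
```

Fitting a GARCH model separately on each side of the detected break (or adding a break dummy to the variance equation) is the kind of exercise that makes the spurious IGARCH effect visible.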

Relevance:

10.00%

Publisher:

Abstract:

We evaluate the use of Generalized Empirical Likelihood (GEL) estimators in portfolio efficiency tests for asset pricing models in the presence of conditional information. Estimators from the GEL family have some optimal statistical properties, such as robustness to misspecification and better finite-sample behavior. Unlike GMM, the bias of GEL estimators does not increase as more moment conditions are included, which is relevant in conditional efficiency analysis, where many moment conditions arise. We find some evidence that estimators from the GEL class indeed perform differently in small samples, where efficiency tests using GEL generate lower estimates than tests using the standard GMM approach. Monte Carlo experiments show that GEL performs better when distortions are present in the data, especially under heavy-tailed and Gaussian shocks.

Relevance:

10.00%

Publisher:

Abstract:

This paper estimates the immediate impact of the European Central Bank’s asset purchase programmes on sovereign bond spreads in the euro area between 2008 and 2015 using a country-by-country GARCH model. The baseline estimates are rigorously diagnosed for misspecification and subjected to a wide range of sensitivity tests. Among others, changes in the dependent variable, the independent variables and the number of (G)ARCH terms are tested. Moreover, the model is applied to subsamples and dynamic conditional correlations are analyzed to estimate the effects of the asset purchases on the contagion of spread movements. Generally, it is found that the asset purchase programmes triggered a reduction of sovereign bond spreads. More specifically, the Securities Markets Programme (SMP) had the most significant immediate effects on sovereign bond spreads across the euro area. The announcements related to the Outright Monetary Transactions (OMT) programme also yielded substantial spread compression in the periphery. In contrast, the most recent Public Sector Purchase Programme (PSPP), announced in January 2015 and implemented since March 2015, had no significant immediate effects on sovereign bond spreads, except for Irish spreads. Hence, immediate effects seem to depend on the size of the programme, the extent to which it targets distressed sovereigns and the way in which it is communicated.

Relevance:

10.00%

Publisher:

Abstract:

Pharmacodynamics (PD) is the study of the biochemical and physiological effects of drugs. The construction of optimal designs for dose-ranging trials with multiple periods is considered in this paper, where the outcome of the trial (the effect of the drug) is a binary response: the success or failure of the drug to bring about a particular change in the subject after a given amount of time. The carryover effect of each dose from one period to the next is assumed to be proportional to the direct effect. It is shown for a logistic regression model that the efficiency of the optimal parallel (single-period) or crossover (two-period) design is substantially greater than that of a balanced design. The optimal designs are also shown to be robust to misspecification of the parameter values. Finally, the parallel and crossover designs are combined to provide the experimenter with greater flexibility.
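
Efficiency comparisons like the ones in this abstract rest on the determinant of the Fisher information matrix. For a two-parameter logistic dose-response model, a classical result places the D-optimal two-point design at the doses where the logit a + b*x equals ±1.5434, with equal weights. The sketch below (single-period case only; the dose values are illustrative) computes D-efficiency relative to such a reference design:

```python
import numpy as np

def logistic_info(design, a, b):
    """Fisher information matrix of a dose design under the logistic model
    P(success | dose x) = 1 / (1 + exp(-(a + b*x))).
    design: iterable of (dose, weight) pairs with weights summing to 1."""
    M = np.zeros((2, 2))
    for x, w in design:
        p = 1.0 / (1.0 + np.exp(-(a + b * x)))
        M += w * p * (1.0 - p) * np.array([[1.0, x], [x, x * x]])
    return M

def d_efficiency(design, ref_design, a, b):
    """D-efficiency of `design` relative to `ref_design` (p = 2 parameters):
    (det M(design) / det M(ref_design)) ** (1/p)."""
    return (np.linalg.det(logistic_info(design, a, b))
            / np.linalg.det(logistic_info(ref_design, a, b))) ** 0.5
```

A D-efficiency of, say, 0.8 means the comparison design needs roughly 1/0.8 times as many subjects to estimate the parameters as precisely as the reference design, which is the sense in which the paper's optimal designs beat balanced ones.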

Relevance:

10.00%

Publisher:

Abstract:

This study presents quantitative evidence from a number of simulation experiments on the accuracy of productivity-growth estimates derived from growth-accounting (GA) and frontier-based methods (namely Malmquist indices based on data envelopment analysis, corrected ordinary least squares, and stochastic frontier analysis) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but overall performance varies by experiment. Parametric approaches generally perform best when there is no functional-form misspecification, but their accuracy diminishes greatly otherwise. The results also show that the deterministic approaches perform adequately even under (modest) measurement error; when measurement error becomes larger, the accuracy of all approaches (including the stochastic ones) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
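
The growth-accounting benchmark in experiments like these reduces to the Solow residual. A minimal sketch under constant returns to scale and competitive factor shares (the growth rates and share in the usage note are illustrative numbers, not from the study):

```python
def tfp_growth(dlog_y, dlog_k, dlog_l, capital_share):
    """Solow-residual TFP growth from the growth-accounting identity
        dlog A = dlog Y - s_K * dlog K - (1 - s_K) * dlog L,
    assuming constant returns to scale and competitive factor shares."""
    return dlog_y - capital_share * dlog_k - (1.0 - capital_share) * dlog_l
```

For example, with 3% output growth, 4% capital growth, 1% labour growth and a capital share of one third, the residual is 1%. The frontier-based alternatives differ precisely in not attributing the whole residual to technology: they split it into efficiency change and frontier shift.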

Relevance:

10.00%

Publisher:

Abstract:

Productivity at the macro level is a complex concept but also arguably the most appropriate measure of economic welfare. Currently, there is limited research available on the various approaches that can be used to measure it, and especially on the relative accuracy of those approaches. This thesis has two main objectives: firstly, to detail some of the most common productivity measurement approaches and assess their accuracy under a number of conditions; and secondly, to present an up-to-date application of productivity measurement and provide some guidance on selecting between sometimes conflicting productivity estimates. With regard to the first objective, the thesis provides a discussion of the issues specific to macro-level productivity measurement and of the strengths and weaknesses of the three main types of approaches available, namely index-number approaches (represented by Growth Accounting), non-parametric distance functions (DEA-based Malmquist indices) and parametric production functions (COLS- and SFA-based Malmquist indices). The accuracy of these approaches is assessed through simulation analysis, which provided some interesting findings. Probably the most important were that deterministic approaches are quite accurate even when the data are moderately noisy, that no approach was accurate when noise was more extensive, that functional-form misspecification has a severe negative effect on the accuracy of the parametric approaches, and that increased volatility in inputs and prices from one period to the next adversely affects all approaches examined. The application was based on the EU KLEMS (2008) dataset and revealed that the different approaches do in fact produce different productivity-change estimates, at least for some of the countries assessed.
To assist researchers in selecting between conflicting estimates, a new three-step selection framework is proposed, based on the findings of the simulation analyses and on established diagnostics and indicators. An application of this framework is also provided, based on the EU KLEMS dataset.

Relevance:

10.00%

Publisher:

Abstract:

In longitudinal data analysis, our primary interest is in the regression parameters for the marginal expectations of the longitudinal responses; the longitudinal correlation parameters are of secondary interest. The joint likelihood function for longitudinal data is challenging, particularly for correlated discrete outcome data. Marginal modeling approaches such as generalized estimating equations (GEEs) have received much attention in the context of longitudinal regression. These methods are based on estimates of the first two moments of the data and on a working correlation structure, with confidence regions and hypothesis tests based on asymptotic normality. The methods are sensitive to misspecification of the variance function and of the working correlation structure: under such misspecification, the estimates can be inefficient and inconsistent, and inference may give incorrect results. To overcome this problem, we propose an empirical likelihood (EL) procedure based on a set of estimating equations for the parameter of interest and discuss its characteristics and asymptotic properties. We also provide an algorithm based on EL principles for the estimation of the regression parameters and the construction of a confidence region for the parameter of interest. We extend our approach to variable selection for high-dimensional longitudinal data with many covariates. In this situation it is necessary to identify a submodel that adequately represents the data, since including redundant variables may impair the model's accuracy and efficiency for inference. We propose a penalized empirical likelihood (PEL) variable selection procedure based on GEEs, in which variable selection and estimation of the coefficients are carried out simultaneously. We discuss its characteristics and asymptotic properties, and present an algorithm for optimizing the PEL.
Simulation studies show that when the model assumptions are correct, our method performs as well as existing methods, and when the model is misspecified, it has clear advantages. We have applied the method to two case examples.