4 results for Idiosyncratic skewness

at Universidade Complutense de Madrid


Relevance: 10.00%

Abstract:

Cronbach's alpha is the most widely used method for estimating internal consistency reliability. The procedure has proved very resistant to the passage of time, even though its limitations are well documented and better options exist, such as the omega coefficient or the different versions of the greatest lower bound (glb), which offer clear advantages especially for applied research in which items differ in quality or have skewed distributions. In this paper, using Monte Carlo simulation, the performance of these reliability coefficients under a one-dimensional model is evaluated in terms of skewness and non-tau-equivalence. The results show that the omega coefficient is always a better choice than alpha, and that in the presence of skewed items it is preferable to use the omega and glb coefficients, even in small samples.
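The two coefficients the abstract compares can be sketched in a few lines. This is not the paper's simulation design, only a minimal illustration of the formulas: Cronbach's alpha computed from an items matrix, and omega total computed from one-factor loadings and error variances (here assumed, not estimated).

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an items matrix (rows = respondents, cols = items)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def mcdonald_omega(loadings, error_vars):
    """Omega total from one-factor loadings and error (uniqueness) variances."""
    lam = np.sum(loadings)
    return lam ** 2 / (lam ** 2 + np.sum(error_vars))

# Simulated tau-equivalent items (equal loadings): alpha and omega should agree.
rng = np.random.default_rng(0)
f = rng.normal(size=(1000, 1))   # common factor
e = rng.normal(size=(1000, 4))   # item-specific noise, variance 1
x = f + e                        # four items, all with loading 1.0
print(cronbach_alpha(x))                         # sample estimate, near 0.8
print(mcdonald_omega([1.0] * 4, [1.0] * 4))      # population value: 16/20 = 0.8
```

Under tau-equivalence the two coincide; the paper's point is that when loadings differ or items are skewed, alpha drifts away from the true reliability while omega does not.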

Relevance: 10.00%

Abstract:

This dissertation belongs to the young field of applied linguistics called forensic linguistics, which studies language as evidence in criminal cases. Forensic linguistics has many subfields; this study falls within authorship attribution analysis, in which the authorship of a text is attributed to an author through an exhaustive linguistic analysis. Within this field, the study analyzes the morphosyntactic and discursive-pragmatic variables that remain constant in the intra-variation, or personal style, of a speaker in oral and written discourse, while at the same time showing a high rate of difference in inter-speaker variation, that is, from one speaker to another. The theoretical basis of this study is the term coined by Professor Maria Teresa Turell, "idiolectal style", which holds that the idiosyncratic choices a speaker makes from the language build a style that is constant in the intra-variation of that speaker's discourse. The study is motivated by a problem that arises in authorship attribution analysis, where the absence of known texts makes it impossible to attribute the authorship of an unknown text. Thus, through a methodology based on qualitative analysis, in which the variables are studied exhaustively, and on quantitative analysis, in which the qualitative findings are studied statistically, conclusions are drawn on the presence of such variables in both oral and written discourse. The results of this analysis will have implications for deeper analyses in which larger amounts of data can be used.
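The quantitative side of authorship attribution can be illustrated with a toy comparison. This is emphatically not Turell's protocol or the dissertation's variable set: the function-word list, the profile, and the distance measure below are all illustrative assumptions, showing only the general idea that texts by the same author tend to be closer in feature space than texts by different authors.

```python
from collections import Counter
import math

# Hypothetical marker set: high-frequency function words, a common
# starting point in stylometry because they resist conscious control.
FUNCTION_WORDS = ["the", "of", "and", "to", "in", "that", "is", "for"]

def profile(text):
    """Relative frequency of each function word in a text."""
    tokens = text.lower().split()
    counts = Counter(tokens)
    n = max(len(tokens), 1)
    return [counts[w] / n for w in FUNCTION_WORDS]

def distance(a, b):
    """Euclidean distance between two profiles: smaller = more similar style."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

known = profile("the analysis of the data is presented in the report for review")
questioned = profile("the summary of the findings is given in the appendix for reference")
other = profile("wow amazing stuff totally loved it would buy again")
print(distance(known, questioned) < distance(known, other))  # → True
```

A real analysis would use far richer morphosyntactic and discursive-pragmatic variables, qualitative scrutiny of each one, and proper statistical testing, as the abstract describes.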

Relevance: 10.00%

Abstract:

This paper empirically investigates volatility transmission among stock and foreign exchange markets in seven major world economies during the period July 1988 to January 2015. To this end, we first perform a static and dynamic analysis to measure total volatility connectedness over the entire period (the system-wide approach) using a framework recently proposed by Diebold and Yilmaz (2014). Second, we use a dynamic analysis to evaluate the net directional connectedness of each market. To gain further insights, we examine the time-varying behaviour of net pair-wise directional connectedness during the financial turmoil episodes within the sample period. Our results suggest that slightly more than half of the total variance of the forecast errors is explained by shocks across markets rather than by idiosyncratic shocks. Furthermore, we find that volatility connectedness varies over time, with a surge during periods of increasing economic and financial instability.

Relevance: 10.00%

Abstract:

What have we learnt from the 2006-2012 crisis, which included events such as the subprime crisis, the bankruptcy of Lehman Brothers and the European sovereign debt crisis, among others? It is usually assumed that for firms with a quoted CDS, the CDS is the key factor in establishing the credit risk premium for a new financial asset. The CDS is thus a key element for any investor seeking relative value opportunities across a firm's capital structure. In the first chapter we study the most relevant aspects of the microstructure of the CDS market in terms of pricing, in order to gain a clear idea of how this market works. We consider such an analysis a necessary foundation for the empirical studies carried out in the remaining chapters. In its document "Basel III: A global regulatory framework for more resilient banks and banking systems", the Basel Committee sets the requirement of a capital charge for credit valuation adjustment (CVA) risk in the trading book, together with the methodology for computing the capital requirement. This regulatory requirement has added extra pressure for in-depth knowledge of the CDS market, and it motivates the analysis performed in this thesis. The problem arises in estimating the credit risk premium for counterparties without a directly quoted CDS in the market: how can we estimate the credit spread for an issuer without a CDS? In addition, given the high volatility in the credit market in recent years and, in particular, after the default of Lehman Brothers on 15 September 2008, we observe the presence of large outliers in the distribution of credit spreads across the different combinations of rating, industry and region. After an exhaustive analysis of the results from the different models studied, we reach the following conclusions. Hierarchical regression models clearly fit the data much better than non-hierarchical regressions.
Furthermore, we generally prefer the median model (50%-quantile regression) to the mean model (standard OLS regression) because of its robustness when assigning a price to a new credit asset without a spread, minimizing the "inversion problem". Finally, an additional fundamental reason to prefer the median model is the typical right-skewed distribution of CDS spreads...
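Why right skewness favours the median model can be seen with simulated data alone. This is not the thesis's hierarchical model; the lognormal "spreads" below are an invented stand-in for a rating/industry/region bucket with a heavy right tail, showing that the mean (what OLS targets) is pulled up by outliers while the median (what 50%-quantile regression targets) is not.

```python
import numpy as np

rng = np.random.default_rng(1)
# Right-skewed "spreads": lognormal, mimicking CDS spread distributions
# with occasional large outliers in the right tail.
spreads = rng.lognormal(mean=5.0, sigma=1.0, size=2000)

mean_price = spreads.mean()        # OLS-style estimate, dragged up by the tail
median_price = np.median(spreads)  # 50%-quantile estimate, robust to outliers

print(mean_price > median_price)   # → True for right-skewed data
```

In a regression setting, the same contrast appears between standard OLS and median regression (e.g. `statsmodels`' `QuantReg` at q=0.5): with right-skewed residuals the median fit gives the more representative price for a new, unquoted credit asset.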