480 results for SKEWNESS


Relevance: 10.00%

Publisher:

Abstract:

This study presents aggradation rates supplemented for the first time by carbonate accumulation rates from Mediterranean cold-water coral sites, considering three different regional and geomorphological settings: (i) a cold-water coral ridge (eastern Melilla coral province, Alboran Sea), (ii) a cold-water coral rubble talus deposit at the base of a submarine cliff (Urania Bank, Strait of Sicily) and (iii) a cold-water coral deposit rooted on a pre-existing topographic high overgrown by cold-water corals (Santa Maria di Leuca coral province, Ionian Sea). The mean aggradation rates of the respective cold-water coral deposits vary between 10 and 530 cm kyr⁻¹, and the mean carbonate accumulation rates range between 8 and 396 g cm⁻² kyr⁻¹, with a maximum of 503 g cm⁻² kyr⁻¹ reached in the eastern Melilla coral province. Compared to other deep-water depositional environments, the Mediterranean cold-water coral sites reveal significantly higher carbonate accumulation rates, which are even in the range of the most productive shallow-water Mediterranean carbonate factories (e.g. Cladocora caespitosa coral reefs). Focusing exclusively on cold-water coral occurrences, the carbonate accumulation rates of the Mediterranean cold-water coral sites are in the lower range of those obtained for the prolific Norwegian coral occurrences, but exhibit much higher rates than the cold-water coral mounds off Ireland. This study clearly indicates that cold-water corals have the potential to act as important carbonate factories and regional carbonate sinks within the Mediterranean Sea. Moreover, the data highlight the potential of cold-water corals to store carbonate at rates in the range of tropical shallow-water reefs.
In order to evaluate the contribution of the cold-water coral carbonate factory to the regional or global carbonate/carbon cycle, an improved understanding of the temporal and spatial variability in aggradation and carbonate accumulation rates and areal estimates of the respective regions is needed.
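The relationship between the two rates reported above can be illustrated with a standard mass-accumulation conversion (an assumption for illustration; the study's own computation may involve additional corrections, and the density and carbonate-fraction values below are hypothetical):

```python
def carbonate_accumulation_rate(aggradation_cm_kyr, dry_bulk_density_g_cm3,
                                carbonate_fraction=1.0):
    """Carbonate accumulation rate in g cm^-2 kyr^-1.

    Generic mass-accumulation conversion (an illustrative assumption,
    not a formula quoted from the study):
        rate = aggradation * dry bulk density * CaCO3 fraction.
    """
    return aggradation_cm_kyr * dry_bulk_density_g_cm3 * carbonate_fraction

# e.g. an aggradation rate of 530 cm kyr^-1 with a hypothetical dry bulk
# density of 1.0 g cm^-3 and a 95% carbonate fraction
rate = carbonate_accumulation_rate(530, 1.0, 0.95)
```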

Relevance: 10.00%

Publisher:

Abstract:

Introspection is the process by which individuals question their attitudes: either questioning why they hold their attitudes (Why introspection) or how they feel about a particular attitude object (How introspection). Previous research has suggested that Why introspection induces attitude change, and that Why and How introspection influence attitude-behaviour consistency, persuasion, and other effects. Generally, psychologists have assumed that affective and cognitive attitude bases are the mechanism by which introspection leads to these effects. Leading perspectives originating from these findings suggest that Why introspection either changes the content of cognitive attitude bases (the skewness hypothesis) or increases the salience of cognitive attitude bases (the dominance hypothesis), whereas How introspection may increase the salience of affective attitude bases (another part of the dominance hypothesis). However, direct evidence for these mechanisms is lacking, and the distinction between structural and meta bases has not been considered. Two studies investigated this gap in the existing literature. Both studies measured undergraduate students’ attitudes and attitude bases (both structural and meta, affective and cognitive) before and after an introspection manipulation (Why introspection / How introspection / control), and after reading an (affective / cognitive) persuasive passage about the attitude object. No evidence was found supporting either the skewness or the dominance hypothesis. Furthermore, previous introspection effects were not replicated in the present data. Possible reasons for these null findings are proposed, and several unexpected effects are examined.

Relevance: 10.00%

Publisher:

Abstract:

The purpose of this study was to examine the technical adequacy of the Developmental Reading Assessment (Beaver & Carter, 2004). Internal consistency analysis, factor analysis, and linear regression analyses were used to test whether the DRA is a statistically reliable measure of reading comprehension for Grade 7 and 8 students. Correlational analyses, decision consistency analyses, and a focus group of experienced Intermediate (Grades 7 and 8) teachers examined whether there is evidence that the results from the DRA provide valid interpretations of students’ reading skills and comprehension. Results indicated that, as currently scored, internal consistency is low and the skewness of the score distribution is high. Factor analyses did not replicate those cited by the DRA developers in support of construct validity. Two-way contingency analyses determined that decision consistency did not vary greatly among the DRA scores, EQAO scores, and report card marks. Views expressed during the focus group echoed many of the challenges to validity found in the statistical analysis. The teachers found the DRA somewhat useful, as limited alternative reading assessments were available for the classroom, but did not endorse it strongly. The study found little evidence that the DRA provides valid interpretations of Intermediate students’ reading skills. The indicated changes to the structure and administration procedures of the DRA may ameliorate some of these issues.

Relevance: 10.00%

Publisher:

Abstract:

Despite the volume of work that has been conducted on the topic, the role of surface topography in mediating bacterial cell adhesion is not well understood. The primary reason for this lack of understanding is the relatively limited extent of topographical characterisation employed in many studies. In the present study, the topographies of three sub-nanometrically smooth titanium (Ti) surfaces were comprehensively characterised, using nine individual parameters that together describe the height, shape and distribution of their surface features. This topographical analysis was then correlated with the adhesion behaviour of the pathogenic bacteria Staphylococcus aureus and Pseudomonas aeruginosa, in an effort to understand the role played by each aspect of surface architecture in influencing bacterial attachment. While P. aeruginosa was largely unable to adhere to any of the three sub-nanometrically smooth Ti surfaces, the extent of S. aureus cell attachment was found to be greater on surfaces with higher average, RMS and maximum roughness and higher surface areas. The cells also attached in greater numbers to surfaces that had shorter autocorrelation lengths and skewness values that approached zero, indicating a preference for less ordered surfaces with peak heights and valley depths evenly distributed around the mean plane. Across the sub-nanometrically smooth range of surfaces tested, it was shown that S. aureus more easily attached to surfaces with larger features that were evenly distributed between peaks and valleys, with higher levels of randomness. This study demonstrated that the traditionally employed amplitudinal roughness parameters are not the only determinants of bacterial adhesion, and that spatial parameters can also be used to predict the extent of attachment.
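The amplitude parameters named above (average roughness, RMS roughness and skewness) can be computed from a height profile as follows; this is a generic sketch of the standard definitions, not the characterisation software used in the study:

```python
import math

def roughness_params(heights):
    """Amplitude roughness parameters from a surface height profile.

    Ra  = mean absolute deviation from the mean plane,
    Rq  = RMS deviation,
    Rsk = skewness; Rsk near zero means peak heights and valley depths
          are evenly distributed about the mean plane, the condition the
          study links to greater S. aureus attachment.
    """
    n = len(heights)
    mean = sum(heights) / n
    dev = [h - mean for h in heights]
    ra = sum(abs(d) for d in dev) / n
    rq = math.sqrt(sum(d * d for d in dev) / n)
    rsk = (sum(d ** 3 for d in dev) / n) / rq ** 3
    return ra, rq, rsk
```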

Relevance: 10.00%

Publisher:

Abstract:

Since asset returns are recognized as not being normally distributed, a line of research on portfolio higher moments has emerged. To account for the uncertainty and vagueness of portfolio returns, as well as of higher-moment risks, this paper proposes a new portfolio selection model employing fuzzy sets. A fuzzy multi-objective linear programme (MOLP) for portfolio optimization is formulated using the marginal impacts of assets on portfolio higher moments, which are modelled by trapezoidal fuzzy numbers. Through a consistent centroid-based ranking of fuzzy numbers, the fuzzy MOLP is transformed into a crisp MOLP that is then solved by the maximin method. By taking portfolio higher moments into account, the approach enables investors to optimize not only the normal risk (variance) but also the asymmetric risk (skewness) and the risk of fat tails (kurtosis). An illustrative example demonstrates the efficiency of the proposed methodology compared with previous portfolio optimization models.
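The centroid-based ranking step can be illustrated for a trapezoidal fuzzy number (a, b, c, d). The formula below is a common centroid index for trapezoids and is an assumption here, since the paper's exact ranking operator is not reproduced in the abstract:

```python
def trapezoid_centroid(a, b, c, d):
    """x-coordinate of the centroid of a trapezoidal fuzzy number
    (a, b, c, d), with support [a, d] and core [b, c].

    Common centroid-based ranking index: a fuzzy number with a larger
    centroid is ranked higher. Illustrative only; the paper's exact
    ranking scheme may differ.
    """
    return (d**2 + c**2 + c*d - a**2 - b**2 - a*b) / (3 * (d + c - a - b))

# ranking two hypothetical fuzzy asset returns
better = max([(0.0, 1.0, 2.0, 3.0), (0.5, 1.5, 2.5, 4.0)],
             key=lambda t: trapezoid_centroid(*t))
```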

Relevance: 10.00%

Publisher:

Abstract:

This dissertation is a collection of three economics essays on different aspects of carbon emission trading markets. The first essay analyzes the dynamic optimal emission control strategies of two nations. With the potential to become the largest buyer under the Kyoto Protocol, the US is assumed to be a monopsony, whereas, with a large number of tradable permits on hand, Russia is assumed to be a monopoly. Optimal costs of emission control programs are estimated for both countries under four different market scenarios: non-cooperative no trade, US monopsony, Russia monopoly, and cooperative trading. The US monopsony scenario is found to be the most Pareto cost-efficient. The Pareto-efficient outcome, however, would require the US to make side payments to Russia, which would even out the differences in the cost savings from cooperative behavior. The second essay analyzes the price dynamics of the Chicago Climate Exchange (CCX), a voluntary emissions trading market. By examining the volatility in market returns using AR-GARCH and Markov switching models, the study associates the market price fluctuations with two different political regimes of the US government. Further, the study identifies high volatility in the returns a few months before the market collapse. Three possible regulatory and market-based forces are identified as probable causes of the market's volatility and its ultimate collapse. Organizers of other voluntary markets in the US and worldwide may watch closely for these regime-switching forces in order to avert emission market crashes. The third essay compares excess skewness and kurtosis in carbon prices between the CCX and the EU ETS (European Union Emission Trading Scheme) Phase I and II markets by examining the tail behavior when market expectations exceed the threshold level. Dynamic extreme value theory is used to find the mean price exceedance of the threshold levels and to estimate the risk of loss.
The calculated risk measures suggest that CCX and EU ETS Phase I are extremely immature markets for a risk investor, whereas EU ETS Phase II is a more stable market that could develop as a mature carbon market in future years.
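The peaks-over-threshold idea underlying the third essay's extreme-value analysis can be sketched as follows; this is an illustrative first step (the mean excess over a threshold), not the dissertation's full dynamic EVT model, and the sample data are hypothetical:

```python
def mean_excess(values, threshold):
    """Mean excess of observations over a threshold u.

    In peaks-over-threshold analysis the exceedances y_i = x_i - u for
    x_i > u are modelled (typically with a generalized Pareto
    distribution); their sample mean is the empirical mean-excess
    estimate at u. Returns None if nothing exceeds the threshold.
    """
    exceedances = [x - threshold for x in values if x > threshold]
    if not exceedances:
        return None
    return sum(exceedances) / len(exceedances)

# hypothetical absolute daily price moves, threshold u = 3
me = mean_excess([1, 2, 5, 7], 3)
```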

Relevance: 10.00%

Publisher:

Abstract:

What have we learnt from the 2006-2012 crisis, including events such as the subprime crisis, the bankruptcy of Lehman Brothers and the European sovereign debt crisis, among others? It is usually assumed that, for firms with a quoted CDS, the CDS is the key factor in establishing the credit risk premium for a new financial asset. The CDS is thus a key element for any investor in taking relative value opportunities across a firm’s capital structure. In the first chapter we study the most relevant aspects of the microstructure of the CDS market in terms of pricing, in order to have a clear idea of how this market works. We consider such an analysis a necessary step in establishing a solid base for the rest of the chapters and the different empirical studies we perform. In its document “Basel III: A global regulatory framework for more resilient banks and banking systems”, the Basel Committee sets the requirement of a capital charge for credit valuation adjustment (CVA) risk in the trading book, together with the methodology for computing the capital requirement. This regulatory requirement has added extra pressure for in-depth knowledge of the CDS market, and this motivates the analysis performed in this thesis. The problem arises in estimating the credit risk premium for those counterparties without a directly quoted CDS in the market. How can we estimate the credit spread for an issuer without a CDS? In addition, given the highly volatile period in the credit market in recent years and, in particular, after the default of Lehman Brothers on 15 September 2008, we observe the presence of big outliers in the distribution of credit spreads across the different combinations of rating, industry and region. After an exhaustive analysis of the results from the different models studied, we have reached the following conclusions. It is clear that hierarchical regression models fit the data much better than non-hierarchical regressions.
Furthermore, we generally prefer the median model (50%-quantile regression) to the mean model (standard OLS regression) due to its robustness when assigning a price to a new credit asset without a spread, minimizing the “inversion problem”. Finally, an additional fundamental reason to prefer the median model is the typical “right-skewness” of the distribution of CDS spreads...
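The preference for the median model under right-skewed spreads can be illustrated with an intercept-only comparison: the best constant under the quantile (pinball) loss at q = 0.5 is the sample median, while the OLS analogue is the mean. The spread values below are hypothetical:

```python
import math

def fit_constant(values, q=0.5):
    """Best constant under the quantile (pinball) loss.

    For q = 0.5 this is the sample median, i.e. the 50%-quantile
    regression of the text reduced to an intercept-only model; for
    the mean model the analogue is the sample mean. Sketch only.
    """
    xs = sorted(values)
    # any point between the middle order statistics minimises the
    # q = 0.5 loss; take the lower one as the usual convention
    idx = max(0, math.ceil(q * len(xs)) - 1)
    return xs[idx]

spreads = [80, 90, 100, 110, 5000]      # one extreme right-tail outlier
median_fit = fit_constant(spreads)       # stays with the bulk of the data
mean_fit = sum(spreads) / len(spreads)   # dragged far right by the outlier
```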

Relevance: 10.00%

Publisher:

Abstract:

Estimating the process capability index (PCI) for non-normal processes has been discussed by many researchers. There are two basic approaches to estimating the PCI for non-normal processes. The first, commonly used, approach is to transform the non-normal data into normal data using transformation techniques and then use a conventional normal method to estimate the PCI for the transformed data. This approach is straightforward and easy to deploy. The alternative approach is to use non-normal percentiles to calculate the PCI. The latter approach is not easy to implement, and a deviation in estimating the distribution of the process may affect the efficacy of the estimated PCI. The aim of this paper is to estimate the PCI for non-normal processes using a transformation technique called the root transformation. The efficacy of the proposed technique is assessed by conducting a simulation study using gamma, Weibull, and beta distributions. The root transformation technique is used to estimate the PCI for each set of simulated data. These results are then compared with the PCIs obtained using exact percentiles and the Box-Cox method. Finally, a case study based on real-world data is presented.
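A minimal sketch of the root transformation approach, assuming the root r has already been chosen (in practice it would be selected by a search, e.g. to minimise the skewness of the transformed data; this is not the paper's implementation):

```python
import math
import statistics

def cp_after_root(data, usl, lsl, r):
    """Process capability index Cp after an r-th root transformation.

    Sketch of the transform-then-apply-normal-method idea: map the
    data and the spec limits by x -> x**(1/r), then use the
    conventional normal-based index Cp = (USL - LSL) / (6 * sigma)
    on the transformed values. Assumes positive data and spec limits.
    """
    t = [x ** (1.0 / r) for x in data]
    t_usl = usl ** (1.0 / r)
    t_lsl = lsl ** (1.0 / r)
    sigma = statistics.stdev(t)          # sample standard deviation
    return (t_usl - t_lsl) / (6 * sigma)
```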