988 results for value distribution


Relevance: 30.00%

Abstract:

This paper investigates the frequency of extreme events for three LIFFE futures contracts for the calculation of minimum capital risk requirements (MCRRs). We propose a semiparametric approach in which the tails are modelled by the generalized Pareto distribution and smaller risks are captured by the empirical distribution function. We compare the capital requirements from this approach with those calculated from the unconditional density and from a conditional density - a GARCH(1,1) model. Our primary finding is that, both in-sample and for a hold-out sample, our extreme value approach yields results superior to those of the other two models, which do not explicitly model the tails of the return distribution. Since the use of these internal models will be permitted under the EC-CAD II, they could be widely adopted in the near future for determining capital adequacies. Hence, close scrutiny of competing models is required to avoid a potentially costly misallocation of capital resources while at the same time ensuring the safety of the financial system.
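As a hedged sketch of the semiparametric idea described above (not the paper's code), the following fits a generalized Pareto distribution to losses beyond a high threshold and uses the empirical distribution below it. The simulated returns and the 90th-percentile threshold are illustrative assumptions.

```python
# Semiparametric tail estimate: GPD above a threshold, empirical distribution below.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=5000) * 0.01   # hypothetical daily returns
losses = -returns                                   # work with losses (positive = loss)

u = np.quantile(losses, 0.90)                       # tail threshold (assumption)
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0)    # shape xi, scale beta

def semiparametric_quantile(p):
    """Loss quantile at probability p: empirical in the body, GPD in the tail."""
    n, n_u = len(losses), len(exceedances)
    if p < 1 - n_u / n:                             # body of the distribution
        return np.quantile(losses, p)
    # standard GPD tail quantile: u + (beta/xi) * (((n/n_u)*(1-p))**(-xi) - 1)
    return u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)

print("99% one-day loss quantile:", semiparametric_quantile(0.99))
```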

Relevance: 30.00%

Abstract:

Ensemble forecasting of nonlinear systems involves the use of a model to run forward a discrete ensemble (or set) of initial states. Data assimilation techniques tend to focus on estimating the true state of the system, even though model error limits the value of such efforts. This paper argues for choosing the initial ensemble in order to optimise forecasting performance rather than estimate the true state of the system. Density forecasting and choosing the initial ensemble are treated as one problem. Forecasting performance can be quantified by some scoring rule. In the case of the logarithmic scoring rule, theoretical arguments and empirical results are presented. It turns out that, if the underlying noise dominates model error, we can diagnose the noise spread.
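A minimal sketch of evaluating an ensemble density forecast with the logarithmic (ignorance) score mentioned above. Turning the ensemble into a density via Gaussian kernel dressing, and the chosen kernel width, forecast values and verification, are assumptions for illustration only.

```python
# Logarithmic score of a dressed-ensemble density forecast.
import numpy as np
from scipy.stats import norm

def log_score(ensemble, verification, kernel_width):
    """Negative log of the kernel-dressed ensemble density at the verifying value."""
    density = norm.pdf(verification, loc=ensemble, scale=kernel_width).mean()
    return -np.log(density)

rng = np.random.default_rng(1)
ensemble = rng.normal(loc=20.0, scale=1.5, size=50)   # 50 hypothetical ensemble members
print(log_score(ensemble, verification=21.3, kernel_width=0.8))
```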

Relevance: 30.00%

Abstract:

Tepe Pardis, a significant Neolithic–Chalcolithic site on the Tehran Plain in Iran, is, like many sites in the area, under threat from development. The site contains detailed evidence of (1) the Neolithic–Chalcolithic transition, (2) an Iron Age cemetery and (3) how the inhabitants adapted to an unstable fan environment through resource exploitation (of clay deposits for relatively large-scale ceramic production by c. 5000 BC, and importantly, possible cutting of artificial water channels). Given this significance, models have been produced to better understand settlement distribution and change in the region. However, these models must be tied into a greater understanding of the impact of the geosphere on human development over this period. Forming part of a larger project focusing on the transformation of simple, egalitarian Neolithic communities into more hierarchical Chalcolithic ones, the site has become the focus of a multidisciplinary project to address this issue. Through the combined use of sedimentary and limited pollen analysis, radiocarbon and optically stimulated luminescence dating (the application of the last still rare in Iran), a greater understanding of the impact of alluvial fan development on human settlement through alluviation and the development of river channel sequences is possible. Notably, the findings presented here suggest that artificial irrigation was occurring at the site as early as 6.7±0.4 ka (4300–5100 BC).

Relevance: 30.00%

Abstract:

This paper proposes a method for describing the distribution of observed temperatures on any day of the year such that the distribution, and summary statistics of interest derived from it, vary smoothly through the year. The method removes the noise inherent in calculating summary statistics directly from the data, thus easing comparisons of distributions and summary statistics between different periods. The method is demonstrated using daily effective temperatures (DET) derived from observations of temperature and wind speed at De Bilt, the Netherlands. Distributions and summary statistics are obtained for 1985 to 2009 and compared with those for the period 1904–1984. A two-stage process first obtains parameters of a theoretical probability distribution, in this case the generalized extreme value (GEV) distribution, which describes the distribution of DET on any day of the year. Second, linear models describe seasonal variation in the parameters. Model predictions provide parameters of the GEV distribution, and therefore summary statistics, that vary smoothly through the year. There is evidence of an increasing mean temperature, a decrease in the variability of temperatures mainly in the winter, and more positive skew (more warm days) in the summer. In the winter the 2% point, the value below which 2% of observations are expected to fall, has risen by 1.2 °C; in the summer the 98% point has risen by 0.8 °C. Medians have risen by 1.1 and 0.9 °C in winter and summer, respectively. The method can be used to describe distributions of future climate projections and other climate variables. Further extensions to the methodology are suggested.
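The two-stage procedure described above can be sketched as follows, under illustrative assumptions: synthetic daily data stand in for the De Bilt observations, and a single-harmonic linear model is used to smooth the day-by-day GEV parameters over the year.

```python
# Stage 1: GEV fit per calendar day; Stage 2: harmonic (linear) smoothing of the parameters.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)
years, days = 30, 365
day = np.arange(days)
# synthetic daily effective temperatures with a seasonal cycle (assumption)
data = 10 + 8 * np.sin(2 * np.pi * (day - 100) / 365) + rng.normal(0, 3, size=(years, days))

# Stage 1: GEV parameters (shape c, location, scale) for each day of the year
params = np.array([genextreme.fit(data[:, d]) for d in range(days)])

# Stage 2: harmonic regression smooths each parameter through the year
X = np.column_stack([np.ones(days),
                     np.sin(2 * np.pi * day / 365),
                     np.cos(2 * np.pi * day / 365)])
smooth = X @ np.linalg.lstsq(X, params, rcond=None)[0]

# smoothly varying summary statistics, e.g. the 98% point for each day
c, loc, scale = smooth[:, 0], smooth[:, 1], smooth[:, 2]
p98 = genextreme.ppf(0.98, c, loc=loc, scale=scale)
print(p98[:5])
```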

Relevance: 30.00%

Abstract:

The influence of the size distribution of particles on the viscous properties of an electrorheological fluid has been investigated using the molecular dynamics simulation method. The shear stress of the fluid is found to decrease with increasing variance σ² of the Gaussian distribution of particle sizes, reaching a steady value when σ is larger than 0.5. This phenomenon is attributed to the influence of the particle size distribution on the dynamic structural evolution in the fluid as well as on the strength of the different chain-like structures formed by the particles.
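A small illustrative setup, not the simulation itself: drawing a polydisperse particle ensemble whose diameters follow a Gaussian distribution of standard deviation σ, as varied in the study. The unit mean diameter and the truncation at a small positive value are assumptions to keep sizes physical.

```python
# Gaussian particle-size sample for a polydisperse simulation setup.
import numpy as np

def sample_diameters(n_particles, sigma, d_mean=1.0, rng=None):
    rng = rng or np.random.default_rng()
    d = rng.normal(d_mean, sigma, size=n_particles)
    return np.clip(d, 0.1 * d_mean, None)   # keep diameters positive (assumption)

for sigma in (0.1, 0.3, 0.5, 0.7):
    d = sample_diameters(1000, sigma, rng=np.random.default_rng(3))
    print(f"sigma={sigma}: mean diameter {d.mean():.2f}, variance {d.var():.2f}")
```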

Relevance: 30.00%

Abstract:

The replacement of fat and sugar in cakes is a challenge, as both have an important effect on structural and sensory properties. Moreover, novel replacers offer the possibility of adding value to the product. In this work, inulin and oligofructose were used as fat and sugar replacers, respectively. Different combinations of replacement levels were investigated: fat replacement (0 and 50 %) and sugar replacement (0, 20, 30, 40 and 50 %). Simulated microbaking was carried out to study bubble size distribution during baking. Batter viscosity and weight loss during baking were also analysed. Cake characteristics were studied in terms of crumb cell structure, height, texture and sensory properties. Fat and sugar replacement produced batters with low apparent viscosity values. During heating, bubbles underwent a marked expansion in replaced cakes compared with the control cake. The low batter stability in fat-replaced samples increased bubble movement, producing cakes with bigger cells and less height than the control. Sugar-replaced samples had smaller and fewer cells and lower height than the control. Moreover, sugar replacement decreased hardness and cohesiveness and increased springiness, which could be related to a denser crumb and an easily crumbled product. Regarding the sensory analysis, replacement of up to 50 % of the fat and 30 % of the sugar, separately and simultaneously, did not markedly change the overall acceptability of the cakes. However, according to the Just About Right scales, the sponginess and the sweetness could be improved in all the replaced cakes.

Relevance: 30.00%

Abstract:

The number of bidders, N, involved in a construction procurement auction is known to have an important effect on the value of the lowest bid and the mark-up applied by bidders. In practice, for example, it is important for a bidder to have a good estimate of N when bidding for a current contract. One approach, instigated by Friedman in 1956, is to make such an estimate by statistical analysis and modelling. Since then, however, finding a suitable model for N has been an enduring problem for researchers and, despite intensive research activity in the subsequent 30 years, little progress has been made, due principally to the absence of new ideas and perspectives. The debate is resumed here by checking old assumptions, providing new evidence relating to concomitant variables and proposing a new model. In doing this, and in order to ensure universality, a novel approach is developed and tested using a unique set of 12 construction tender databases from four continents. This shows that the new model provides a significant advance over previous versions. Several new research questions are also posed and other approaches identified for future study.
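The paper's own model for N is not reproduced here; purely as a hedged illustration of the general statistical approach (fitting and comparing candidate count distributions for the number of bidders), this sketch compares Poisson and geometric fits to a hypothetical sample of bidder counts by log-likelihood.

```python
# Compare two simple candidate distributions for the number of bidders per auction.
import numpy as np
from scipy.stats import poisson, geom

counts = np.array([3, 5, 4, 6, 2, 7, 5, 4, 3, 8, 6, 5, 4, 9, 3])  # hypothetical data

lam = counts.mean()                                  # Poisson MLE
ll_poisson = poisson.logpmf(counts, lam).sum()

p = 1.0 / counts.mean()                              # geometric MLE (support 1, 2, ...)
ll_geom = geom.logpmf(counts, p).sum()

print(f"Poisson log-likelihood:   {ll_poisson:.2f}")
print(f"Geometric log-likelihood: {ll_geom:.2f}")
```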

Relevance: 30.00%

Abstract:

In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises from a latent complementary risks scenario in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments (including the mean and variance), coefficient of variation, and modal value. Parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one for different sample sizes and censoring percentages. The methodology is illustrated on four real datasets, and a comparison between the two modeling approaches is made.
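A hedged sketch of the construction described above: the observed lifetime is the maximum of a geometric number (success probability theta) of exponential(lambda) latent lifetimes, which gives CDF F(y) = theta(1 - e^(-lambda y)) / (1 - (1 - theta)(1 - e^(-lambda y))). This parameterization is assumed for illustration and may differ from the paper's; only uncensored data and plain maximum likelihood are handled here.

```python
# Maximum likelihood for a complementary exponential geometric model (uncensored sketch).
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, y):
    lam, theta = params
    if lam <= 0 or not (0 < theta <= 1):
        return np.inf
    u = 1.0 - np.exp(-lam * y)
    # density obtained by differentiating the CDF stated in the lead-in
    pdf = theta * lam * np.exp(-lam * y) / (1.0 - (1.0 - theta) * u) ** 2
    return -np.sum(np.log(pdf))

rng = np.random.default_rng(4)
lam_true, theta_true = 0.5, 0.3
m = rng.geometric(theta_true, size=500)                       # number of latent risks
y = np.array([rng.exponential(1 / lam_true, size=k).max() for k in m])

fit = minimize(neg_log_lik, x0=[1.0, 0.5], args=(y,), method="Nelder-Mead")
print("estimated (lambda, theta):", fit.x)
```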

Relevance: 30.00%

Abstract:

Scale mixtures of the skew-normal (SMSN) distribution form a class of asymmetric thick-tailed distributions that includes the skew-normal (SN) distribution as a special case. The main advantage of this class of distributions is that its members are easy to simulate and have a convenient hierarchical representation that facilitates implementation of the expectation-maximization algorithm for maximum-likelihood estimation. In this paper, we assume an SMSN distribution for the unobserved value of the covariates and a symmetric scale mixture of the normal distribution for the error term of the model. This provides a robust alternative for parameter estimation in multivariate measurement error models. Specific distributions examined include univariate and multivariate versions of the SN, skew-t, skew-slash and skew-contaminated normal distributions. The results and methods are applied to a real data set.
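A minimal sketch of the hierarchical representation mentioned above: a skew-t draw can be generated as a skew-normal draw divided by the square root of an independent Gamma(nu/2, nu/2) mixing variable. The parameter values are illustrative assumptions.

```python
# Simulating a skew-t variate as a scale mixture of the skew-normal.
import numpy as np
from scipy.stats import skewnorm, gamma

rng = np.random.default_rng(5)
n, alpha, nu = 10000, 4.0, 5.0          # skewness parameter alpha, degrees of freedom nu

z = skewnorm.rvs(alpha, size=n, random_state=rng)               # skew-normal component
w = gamma.rvs(nu / 2, scale=2 / nu, size=n, random_state=rng)   # mixing variable, mean 1
skew_t = z / np.sqrt(w)                                         # scale mixture of skew-normal

print("sample skewness proxy:", np.mean(((skew_t - skew_t.mean()) / skew_t.std()) ** 3))
```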

Relevance: 30.00%

Abstract:

In this note, in an independent private values auction framework, I discuss the relationship between the set of types and the distribution of types. I show that any set of types, finite dimensional or not, can be extended to a larger set of types while preserving incentive compatibility constraints, expected revenue and bidders' expected utilities. Thus, for example, we may convexify a set of types, making the model amenable to the large body of theory in economics and mathematics that relies on convexity assumptions. An interesting application of this extension procedure is to show that, although revenue equivalence is not valid in general when the set of types is not convex, such mechanisms correspond to distinct underlying allocation mechanisms in the extension; revenue equivalence is thus recovered in these situations.

Relevance: 30.00%

Abstract:

For more than a decade, Value-at-Risk (VaR) has been used by financial institutions and non-financial corporations to control the market risk of investment portfolios. Because parametric methods assume that the returns of market risk factors are normally distributed, some risk managers use historical simulation methods to calculate portfolio VaR. The main criticism of traditional historical simulation, however, is that it assigns the same weight in the distribution to all returns observed in the period. This work tests the historical simulation model with volatility updating proposed by Hull and White (1998) on Brazilian stock market data and compares its performance with the traditional model. The results showed that the Hull and White model performed better at forecasting portfolio losses and adapted more quickly to periods of abrupt change in market volatility.
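A hedged sketch of volatility-updated historical simulation in the spirit of Hull and White (1998): each historical return is rescaled by the ratio of the current volatility estimate to the volatility prevailing when it was observed, and VaR is the empirical quantile of the rescaled returns. The EWMA decay (0.94), window and synthetic returns are illustrative assumptions, not the study's data.

```python
# Volatility-updated historical simulation VaR (EWMA volatility, rescaled returns).
import numpy as np

def ewma_volatility(returns, lam=0.94):
    var = np.empty_like(returns)
    var[0] = returns[:20].var()                 # simple seed for the recursion (assumption)
    for t in range(1, len(returns)):
        var[t] = lam * var[t - 1] + (1 - lam) * returns[t - 1] ** 2
    return np.sqrt(var)

def hull_white_var(returns, alpha=0.01, lam=0.94):
    sigma = ewma_volatility(returns, lam)
    adjusted = returns * sigma[-1] / sigma      # rescale to current volatility
    return -np.quantile(adjusted, alpha)        # loss quantile, reported as a positive number

rng = np.random.default_rng(6)
rets = rng.normal(0, 0.02, size=750)            # hypothetical daily portfolio returns
print("1-day 99% VaR (volatility-updated HS):", hull_white_var(rets))
```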

Relevance: 30.00%

Abstract:

The Competitive Strategy literature predicts three different mechanisms of performance generation, distinguishing between firms that have a competitive advantage, firms that have a competitive disadvantage, and firms that have neither. Nonetheless, previous works in the field have fitted a single normal distribution to model firm performance. Here, we develop a new approach that distinguishes among performance-generating mechanisms and allows the identification of firms with competitive advantage or disadvantage. Theorizing on the positive feedback loops through which firms with competitive advantage have facilitated access to new resources, we propose a distribution that we believe data on firm performance should follow. We illustrate our model by assessing its fit to data on firm performance, addressing its theoretical implications and comparing it to previous works.
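The paper's proposed distribution is not specified in this abstract; as a generic illustration of modelling performance with several generating mechanisms instead of a single normal, this sketch fits a three-component Gaussian mixture to hypothetical firm-performance data and compares it with a single-normal fit by BIC.

```python
# Single normal vs. three-component mixture for firm performance (illustrative data).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
perf = np.concatenate([rng.normal(-0.05, 0.02, 200),   # competitive disadvantage
                       rng.normal(0.05, 0.03, 600),    # neither
                       rng.normal(0.20, 0.05, 200)])   # competitive advantage
perf = perf.reshape(-1, 1)

for k in (1, 3):
    gm = GaussianMixture(n_components=k, random_state=0).fit(perf)
    print(f"{k} component(s): BIC = {gm.bic(perf):.1f}")
```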

Relevance: 30.00%

Abstract:

This paper bridges the gap between the buyer-supplier literature and the definition of competitive advantage as value creation found in the strategic management literature. This study proposes and tests an integrative definition of the relational value that is created and appropriated in a dyad.

Relevance: 30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

The aim of this study was to perform a photoelastic analysis of stress distribution on straight and angulated implants with different crowns (screwed and cemented). Three models were made of photoelastic resin PL-2: model 1, external hexagon implant 3.75 x 10.00 mm at 0 degrees; model 2, external hexagon implant 3.75 x 10.00 mm at 17 degrees; model 3, external hexagon implant 3.75 x 10.00 mm at 30 degrees. Axial and oblique (45 degrees) loads (100 N) were applied with a universal testing machine. The photoelastic fringes on the models were recorded with a digital camera and visualized in graphics software for qualitative analysis. Axial loading generated the same pattern of stress distribution, with the highest stresses concentrated between the medium and apical thirds. Oblique loading generated a similar pattern of stress distribution in models with similar implant angulation; the highest stress was located on the cervical region opposite the implant angulation and on the apical third. It was concluded that the higher the implant angulation, the higher the stress value, independent of crown type. The screwed prostheses exhibited the highest stress concentration, and oblique loading generated higher stress values and concentrations than axial loading.