944 results for Variable pricing model


Relevance:

30.00%

Publisher:

Abstract:

In chemical analyses performed by laboratories, one faces the problem of determining the concentration of a chemical element in a sample. In practice, this problem is handled with the so-called linear calibration model, which assumes that the errors associated with the independent variables are negligible compared with the error in the response variable. In this work, a new linear calibration model is proposed in which the independent variables are subject to heteroscedastic measurement errors. A simulation study is carried out to verify some properties of the estimators derived for the new model, and the usual calibration model is also considered for comparison with the new approach. Three applications are presented to assess the performance of the new approach. Copyright (C) 2010 John Wiley & Sons, Ltd.
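
To make the setup concrete, the display below is a minimal sketch of a linear calibration model in which the explanatory variable is observed with heteroscedastic measurement error; the symbols (alpha, beta, omega_i) are illustrative and are not taken from the paper.

```latex
\begin{aligned}
y_i &= \alpha + \beta x_i + e_i, & e_i &\sim N(0, \sigma^2) && \text{(calibration experiment)}\\
X_i &= x_i + u_i,                & u_i &\sim N(0, \omega_i) && \text{(heteroscedastic error in the covariate)}\\
\hat{x}_0 &= \frac{y_0 - \hat{\alpha}}{\hat{\beta}} & & && \text{(prediction of an unknown concentration)}
\end{aligned}
```

The usual calibration model corresponds to the special case omega_i = 0, i.e. the covariate is taken to be error-free.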

Relevance:

30.00%

Publisher:

Abstract:

Influence diagnostics methods are extended in this article to the Grubbs model when the unknown quantity x (latent variable) follows a skew-normal distribution. Diagnostic measures are derived from the case-deletion approach and the local influence approach under several perturbation schemes. The observed information matrix for the postulated model and the Delta matrices for the corresponding perturbed models are derived. Results obtained for a real data set are reported, illustrating the usefulness of the proposed methodology.
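
For readers unfamiliar with the Grubbs setup, the display below sketches the measurement-comparison structure usually associated with it, with the latent quantity given the skew-normal law mentioned in the abstract; the notation is ours, not the article's.

```latex
\begin{aligned}
Y_{ij} &= \alpha_j + x_i + \varepsilon_{ij}, \qquad i = 1,\dots,n, \; j = 1,\dots,p,\\
x_i &\sim \mathrm{SN}(\mu_x, \sigma_x^2, \lambda), \qquad \varepsilon_{ij} \sim N(0, \phi_j),
\end{aligned}
```

where Y_{ij} is the reading of instrument j on unit i, alpha_j is the additive bias of instrument j (with one bias fixed for identifiability), and lambda is the skewness parameter of the latent variable.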

Relevance:

30.00%

Publisher:

Abstract:

Measurement error models often arise in epidemiological and clinical research. Usually, in this setup it is assumed that the latent variable has a normal distribution. However, the normality assumption may not always hold. The skew-normal/independent family is a class of asymmetric thick-tailed distributions that includes the skew-normal distribution as a special case. In this paper, we explore the use of skew-normal/independent distributions as a robust alternative in the null-intercept measurement error model under a Bayesian paradigm. We assume that the random errors and the unobserved value of the covariate (latent variable) jointly follow a skew-normal/independent distribution, providing an appealing robust alternative to the routine use of the symmetric normal distribution in this type of model. Specific distributions examined include univariate and multivariate versions of the skew-normal, skew-t, skew-slash, and skew-contaminated normal distributions. The methods developed are illustrated using a real data set from a dental clinical trial. (C) 2008 Elsevier B.V. All rights reserved.
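
As a rough reminder of how the skew-normal/independent (SNI) class is usually constructed (our notation, not the paper's):

```latex
Y = \mu + U^{-1/2} Z, \qquad Z \sim \mathrm{SN}(0, \Sigma, \lambda), \quad U > 0 \ \text{independent of } Z.
```

Taking U degenerate at 1 recovers the skew-normal case; commonly, a Gamma(nu/2, nu/2) mixing variable gives the skew-t, a Beta(nu, 1) mixing variable the skew-slash, and a two-point mixing variable the skew-contaminated normal.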

Relevance:

30.00%

Publisher:

Abstract:

This is a note about proxy variables and instruments for the identification of structural parameters in regression models. In our experience, econometric textbooks treat these two issues separately, although in practice the two concepts are very often combined. Usually, proxy variables are inserted into instrumental variable regressions with the motivation that they are exogenous, implicitly meaning exogenous in a reduced-form model rather than in a structural model. In fact, if these variables were exogenous they would be redundant in the structural model; IQ as a proxy for ability is a case in point. Valid proxies reduce unexplained variation and increase the efficiency of the estimator of the structural parameter of interest. This is especially important when the instrument is weak. With a simple example we demonstrate what is required of a proxy and an instrument when they are combined. It turns out that when a researcher has a valid instrument, the requirements on the proxy variable are weaker than when no such instrument exists.
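
A small simulation along the lines of the schooling example makes the point tangible; the data-generating process and variable names below are illustrative assumptions, not the authors' example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Structural model: y = 1.0 * x + 1.0 * ability + u, with x endogenous through ability
ability = rng.normal(size=n)
z = rng.normal(size=n)                         # instrument: relevant for x, exogenous
x = 0.5 * z + 0.8 * ability + rng.normal(size=n)
y = 1.0 * x + 1.0 * ability + rng.normal(size=n)
iq = ability + rng.normal(scale=0.5, size=n)   # noisy proxy for ability

def iv_beta(y, X, Z):
    """Two-stage least squares: regress X on Z, then y on the fitted values of X."""
    Xhat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)   # first-stage fitted values
    return np.linalg.solve(Xhat.T @ X, Xhat.T @ y)

const = np.ones(n)
# IV without the proxy: regressors (const, x), instruments (const, z)
b_no_proxy = iv_beta(y, np.column_stack([const, x]), np.column_stack([const, z]))
# IV with the proxy: iq enters both as a regressor and as its own instrument
b_proxy = iv_beta(y, np.column_stack([const, x, iq]), np.column_stack([const, z, iq]))

print("beta_x, IV without proxy:", round(b_no_proxy[1], 3))
print("beta_x, IV with proxy:   ", round(b_proxy[1], 3))
```

Both estimators are consistent for the structural coefficient on x (1.0 here); adding the proxy shrinks the residual variance, so the estimate with the proxy is typically noticeably less noisy, which is the efficiency gain the note emphasizes and which matters most when the instrument is weak.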

Relevance:

30.00%

Publisher:

Abstract:

Sweden, together with Norway, Finland and Denmark, has created a multinational electricity market called NordPool. In this market, producers and retailers of electricity can buy and sell electricity, and the retailers then offer this electricity to end consumers such as households and industries. Previous studies have shown that pricing in the NordPool market functions quite well, but to my knowledge no study has examined whether pricing in the Swedish retail market for consumers functions equally well. If the market is well functioning, with competition and low transaction costs for switching electricity retailer, we would expect a homogeneous good such as electricity to be sold at approximately the same price, and price changes to be highly correlated across firms. Thus, the aim of this study is to test whether the price set by Vattenfall, the largest energy firm in the Swedish market, is highly correlated with the prices of other firms in the Swedish retail market for electricity. Descriptive statistics indicate that the price offered by Vattenfall is quite similar to the prices of other firms in the market. In addition, regression analysis shows that the correlation between the price of Vattenfall and those of other firms is as high as 0.98.
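
As an illustration of the kind of comparison described, and not the study's data or code, the correlation and a simple regression between two retail price series can be computed as follows.

```python
import numpy as np

rng = np.random.default_rng(1)
months = 36

# Simulated (not actual) retail price series: a common wholesale component
# plus small firm-specific deviations, mimicking a well-functioning market
common = 100 + np.cumsum(rng.normal(scale=2.0, size=months))
vattenfall = common + rng.normal(scale=0.5, size=months)
competitor = common + rng.normal(scale=0.5, size=months)

corr = np.corrcoef(vattenfall, competitor)[0, 1]
slope, intercept = np.polyfit(vattenfall, competitor, deg=1)  # OLS of competitor on Vattenfall
print(f"correlation: {corr:.2f}  slope: {slope:.2f}  intercept: {intercept:.1f}")
```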

Relevance:

30.00%

Publisher:

Abstract:

A procedure for characterizing the global uncertainty of a rainfall-runoff simulation model using grey numbers is presented. With the grey numbers technique, uncertainty is characterized by an interval: once the parameters of the rainfall-runoff model have been defined as grey numbers, grey mathematics and grey functions make it possible to obtain simulated discharges in the form of grey numbers whose envelope defines a band representing the vagueness/uncertainty associated with the simulated variable. The grey numbers representing the model parameters are estimated so that the band obtained from the envelope of simulated grey discharges includes an assigned percentage of observed discharge values while being as narrow as possible. The approach is applied to a real case study, highlighting that a rigorous application of the procedure for direct simulation through the rainfall-runoff model with grey parameters involves long computational times. These times can, however, be significantly reduced by a simplified computing procedure that introduces only minimal approximations in the quantification of the grey numbers representing the simulated discharges. Relying on this simplified procedure, the conceptual grey rainfall-runoff model is calibrated, and the uncertainty bands obtained after both calibration and validation are compared with those obtained with a well-established approach for characterizing uncertainty, such as GLUE. The results of the comparison show that the proposed approach can be a valid tool for characterizing the global uncertainty associated with the output of a rainfall-runoff simulation model.
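
A minimal sketch, under our own assumptions, of how grey (interval) parameters can be propagated through a simple conceptual rainfall-runoff relation; this is not the model or the computing procedure used in the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Grey:
    """A grey number, represented by its lower and upper bounds."""
    lo: float
    hi: float

    def __add__(self, other):
        return Grey(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Grey(min(p), max(p))

def linear_reservoir(rainfall, k: Grey, s0: Grey):
    """Grey discharges from S_{t+1} = (1 - k) S_t + P_t and Q_t = k S_t."""
    one_minus_k = Grey(1.0 - k.hi, 1.0 - k.lo)
    storage, discharges = s0, []
    for p in rainfall:
        discharges.append(k * storage)
        storage = one_minus_k * storage + Grey(p, p)
    return discharges

k = Grey(0.25, 0.35)   # grey recession coefficient
s0 = Grey(8.0, 12.0)   # grey initial storage (mm)
rain = [0.0, 5.0, 12.0, 3.0, 0.0, 0.0]

for t, q in enumerate(linear_reservoir(rain, k, s0)):
    print(f"t={t}: Q in [{q.lo:.2f}, {q.hi:.2f}] mm")
```

The envelope of the printed intervals is the uncertainty band. Note that naive interval arithmetic of this kind treats each occurrence of k independently and over-widens the band; avoiding this requires searching over the parameter intervals, which is consistent with the abstract's remark that a rigorous grey simulation is computationally demanding.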

Relevance:

30.00%

Publisher:

Abstract:

This work presents the theoretical foundations of, and carries out a practical application of, one of the most important findings in finance: the standard Capital Asset Pricing Model (CAPM). In the practical application, the returns required by the model were compared with the returns actually obtained. Five stocks were analyzed, chosen for having the largest relative weight in the Ibovespa theoretical portfolio and for having returns published from June 1998 to May 2001. The data were obtained from the Economática system at UFRGS and tested using a paired two-sample t-test for means in MS Excel. The results were tabulated and analyzed, leading to the conclusion that, statistically, at a 95% confidence level, there was no difference in performance between the expected returns and the returns actually obtained for the assets studied in this dissertation over the period considered.
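
The comparison described (CAPM required returns versus realized returns, checked with a paired t-test at 95% confidence) can be sketched as follows; the beta, risk-free rate and simulated returns are placeholders, not the dissertation's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
months = 36                                    # Jun/1998 to May/2001

rf = 0.012                                     # assumed monthly risk-free rate
rm = rf + rng.normal(0.01, 0.08, size=months)  # simulated market (Ibovespa) returns
beta = 1.1                                     # assumed stock beta

required = rf + beta * (rm - rf)               # CAPM required returns
realized = required + rng.normal(0.0, 0.05, size=months)  # simulated realized returns

t_stat, p_value = stats.ttest_rel(realized, required)     # paired two-sample t-test
verdict = "no significant difference" if p_value > 0.05 else "significant difference"
print(f"t = {t_stat:.2f}, p = {p_value:.3f}: {verdict} at the 5% level")
```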

Relevance:

30.00%

Publisher:

Abstract:

Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, risk management, and the study of monetary policy implications. Dynamic term structure models, in turn, equipped with stronger economic structure, have mainly been adopted to price derivatives and explain empirical stylized facts. In this paper, we combine flavors of these two classes of models to test whether no-arbitrage restrictions affect forecasting. We construct cross-sectional (arbitrage-allowing) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve smaller overall biases and root mean square errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding-return premia indicates that the superior performance of the no-arbitrage versions is due to a better identification of the bond risk premium.
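
For reference, the forecast-evaluation statistics mentioned, and the usual split of a model-implied yield forecast into a forward rate and a holding-return premium, can be written as follows (our notation):

```latex
\mathrm{Bias}(\tau,h) = \frac{1}{T}\sum_{t}\bigl(\hat{y}_{t+h}(\tau) - y_{t+h}(\tau)\bigr), \qquad
\mathrm{RMSE}(\tau,h) = \sqrt{\frac{1}{T}\sum_{t}\bigl(\hat{y}_{t+h}(\tau) - y_{t+h}(\tau)\bigr)^{2}}, \qquad
\hat{y}_{t+h}(\tau) = f_{t}(h,\tau) - \widehat{\mathrm{prem}}_{t}(h,\tau),
```

where y-hat_{t+h}(tau) is the h-period-ahead forecast of the tau-maturity yield, f_t(h,tau) is the forward rate observed at time t for that future period, and the premium term is the holding-return premium referred to in the abstract.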

Relevance:

30.00%

Publisher:

Abstract:

Estimating the parameters of the instantaneous spot interest rate process is of crucial importance for pricing fixed income derivative securities. This paper presents an estimation of the parameters of the Gaussian interest rate model for pricing fixed income derivatives based on the term structure of volatility. We estimate the term structure of volatility for US Treasury rates for the period 1983–1995, based on a history of yield curves. We estimate both conditional and first-difference term structures of volatility and then estimate the implied parameters of the Gaussian model by non-linear least squares. Results for bond options illustrate the effect of differing parameter values on pricing.
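
A sketch of the kind of non-linear least-squares step described, assuming the one-factor Gaussian (Vasicek/Hull-White-type) form sigma(tau) = sigma * (1 - exp(-kappa*tau)) / (kappa*tau) for the yield-volatility term structure; the sample volatilities below are placeholders, not the 1983–1995 estimates.

```python
import numpy as np
from scipy.optimize import curve_fit

def yield_vol(tau, sigma, kappa):
    """Yield-volatility term structure implied by a one-factor Gaussian short-rate model."""
    return sigma * (1.0 - np.exp(-kappa * tau)) / (kappa * tau)

# Placeholder volatilities of yield changes by maturity (in years)
maturities = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 5.0, 7.0, 10.0])
sample_vols = np.array([0.0118, 0.0114, 0.0107, 0.0096, 0.0088, 0.0075, 0.0066, 0.0056])

(sigma_hat, kappa_hat), _ = curve_fit(yield_vol, maturities, sample_vols, p0=[0.012, 0.1])
print(f"sigma = {sigma_hat:.4f}  kappa = {kappa_hat:.3f}")
```

With fitted values of sigma and kappa in hand, Gaussian-model bond option prices can then be evaluated and compared across parameter values, which is the exercise the abstract reports for bond options.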

Relevance:

30.00%

Publisher:

Abstract:

Verdelhan (2009) shows that, in order to explain the behavior of the risk premium in foreign bond markets using the external habit formation model proposed by Campbell and Cochrane (1999), the equilibrium risk-free rate must be specified as pro-cyclical. We show that this specification is only possible under implausible calibration parameters. Moreover, in the calibration process, the price-consumption ratio diverges for most reasonable parameter values. However, by adopting the suggestion proposed by Verdelhan (2009), namely fixing the sensitivity function λ(s_t) at its steady-state value during calibration and releasing it only when simulating the data so as to guarantee pro-cyclical risk-free rates, we are able to obtain a finite and well-behaved value for the equilibrium price-consumption ratio and to replicate the forward premium anomaly. Setting aside possible inconsistencies of this procedure, under pro-cyclical risk-free rates the model, as suggested by Wachter (2006), generates real yield curves that are decreasing in maturity regardless of the state of the economy, a result at odds with the related literature and with observed yield data.

Relevance:

30.00%

Publisher:

Abstract:

The motivation for this work comes from the main results of Carvalho and Schwartzman (2008), where heterogeneity arises from different price-adjustment rules across sectors. The sectoral moments of the duration of nominal rigidity are sufficient to explain certain monetary effects. Once we agree that heterogeneity is relevant for the study of price rigidity, how could we write a model with the smallest possible number of sectors, yet with enough heterogeneity to produce any desired monetary impact, or indeed any three moments of the duration distribution? To answer this question, this paper restricts attention to constant-hazard models and takes the cumulative effect and the short-run dynamics of monetary policy as good summaries of large heterogeneous economies. We show that two sectors are sufficient to summarize the cumulative effects of monetary shocks, and that three-sector economies are good approximations for the dynamics of these effects. Numerical exercises for the short-run dynamics of an economy with information rigidity show that approximating 500 sectors with only 3 produces errors below 3%. That is, if a monetary shock reduces output by 5%, the approximating economy will produce an impact between 4.85% and 5.15%. The same holds for the dynamics produced by money-level shocks in an economy with price rigidity. For shocks to the money growth rate, the maximum approximation error is 2.4%.

Relevance:

30.00%

Publisher:

Abstract:

Drawing on principles of corporate finance and asset pricing, this work seeks to measure the impact of firms' liquidity on the expected return of stocks in the Brazilian equity market. The basic premise of this relationship is that the cash position represents a type of risk not captured by other variables. To measure this risk, a factor model for asset pricing is used. The baseline model is the Fama and French three-factor model, adapted to include a cash variable. Using the available database, we attempt to estimate the sensitivity of the expected return of Brazilian stocks to the cash factor.
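
In regression form, the augmented factor model described can be sketched as below; the label CASH for the cash-holdings factor is our shorthand, not necessarily the dissertation's.

```latex
R_{i,t} - R_{f,t} = \alpha_i + b_i\,\mathrm{MKT}_t + s_i\,\mathrm{SMB}_t + h_i\,\mathrm{HML}_t + c_i\,\mathrm{CASH}_t + \varepsilon_{i,t},
```

where MKT, SMB and HML are the usual Fama-French factors, CASH is a factor built from firms' cash positions, and c_i is the sensitivity to the cash factor that the study sets out to estimate.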

Relevance:

30.00%

Publisher:

Abstract:

Verdelhan (2009) shows that if one is to explain the foreign exchange forward premium behavior using Campbell and Cochrane (1999)'s habit formation model, one must specify it in such a way as to generate pro-cyclical short term risk-free rates. In the calibration procedure, we show that this is only possible in Campbell and Cochrane's framework under implausible parameter specifications, given that the price-consumption ratio diverges in almost all parameter sets. We then adopt Verdelhan's shortcut of fixing the sensitivity function λ(st) at its steady-state level to attain a finite value for the price-consumption ratio, and release it in the simulation stage to ensure pro-cyclical risk-free rates. Beyond the potential inconsistencies that such a procedure may generate, as suggested by Wachter (2006), with pro-cyclical risk-free rates the model generates a downward-sloping real yield curve, which is at odds with the data.
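
For context, the sensitivity function being fixed at its steady state is, in Campbell and Cochrane's baseline specification (reproduced here only as a reminder, with s_t the log surplus-consumption ratio),

```latex
\lambda(s_t) =
\begin{cases}
\dfrac{1}{\bar{S}}\sqrt{1 - 2(s_t - \bar{s})} - 1, & s_t \le s_{\max},\\[4pt]
0, & \text{otherwise,}
\end{cases}
\qquad\text{so that}\qquad
\lambda(\bar{s}) = \frac{1}{\bar{S}} - 1,
```

which is the constant value used during calibration under the shortcut described above.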

Relevance:

30.00%

Publisher:

Abstract:

The real effects of an imperfectly credible disinflation depend critically on the extent of price rigidity. Therefore, the study of how policymakers' credibility affects the outcome of an announced disinflation should not be dissociated from the analysis of the determinants of the frequency of price adjustments. In this paper we examine how credibility affects the outcome of a disinflation in a model with endogenous time-dependent pricing rules. Both the initial degree of price rigidity, calculated optimally, and, more notably, the changes in contract length during the disinflation play an important role in explaining the effects of imperfect credibility. We initially evaluate the costs of disinflation in a setup where credibility is exogenous, and then allow agents to use Bayes' rule to update their beliefs about the “type” of monetary authority they face. In both cases, the interaction between the endogeneity of the time-dependent rules and imperfect credibility increases the output costs of disinflation, but the pattern of the output path is more realistic in the case with learning.
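
The learning step can be sketched, in our own notation, as standard Bayesian updating of the probability p_t that the monetary authority is of the committed type, given the period-t policy observation x_t:

```latex
p_{t+1} = \frac{p_t\, f_{C}(x_t)}{p_t\, f_{C}(x_t) + (1 - p_t)\, f_{W}(x_t)},
```

where f_C and f_W are the likelihoods of the observation under the committed and weak types; credibility corresponds to p_t, and the optimally chosen contract lengths respond to it as the disinflation unfolds.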

Relevance:

30.00%

Publisher:

Abstract:

In a model of a financial market with an atomless continuum of assets, we give a precise and rigorous meaning to the intuitive idea of a "well-diversified" portfolio and to a notion of "exact arbitrage". We show this notion to be necessary and sufficient for an APT pricing formula to hold, to be strictly weaker than the more conventional notion of "asymptotic arbitrage", and to have novel implications for the continuity of the cost functional as well as for various versions of APT asset pricing. We further justify the idealized measure-theoretic setting in terms of a pricing formula based on "essential" risk, one of the three components of a tri-variate decomposition of an asset's rate of return, and based on a specific index portfolio constructed from endogenously extracted factors and factor loadings. Our choice of factors is also shown to satisfy an optimality property that the first m factors always provide the best approximation. We illustrate how the concepts and results translate to markets with a large but finite number of assets, and relate to previous work.
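
As a rough reminder of the finite-market analogue of the APT pricing formula that the paper extends to a continuum of assets (our notation, not the paper's):

```latex
r_i = \mu_i + \sum_{k=1}^{m} \beta_{ik} F_k + \epsilon_i,
\qquad
\mu_i \approx r_0 + \sum_{k=1}^{m} \beta_{ik}\, \lambda_k,
```

where the F_k are extracted factors, the beta_{ik} are factor loadings, the lambda_k are factor risk premia, and the idiosyncratic terms epsilon_i are what a well-diversified portfolio averages away; the paper gives precise conditions, in terms of exact arbitrage, under which such a pricing formula holds in the measure-theoretic setting with an atomless continuum of assets.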