874 results for "Return-based pricing kernel"


Relevance: 30.00%

Publisher:

Abstract:

A novel technique for selecting the poles of orthonormal basis functions (OBF) in Volterra models of any order is presented. It is well-known that the usual large number of parameters required to describe the Volterra kernels can be significantly reduced by representing each kernel using an appropriate basis of orthonormal functions. Such a representation results in the so-called OBF Volterra model, which has a Wiener structure consisting of a linear dynamic generated by the orthonormal basis followed by a nonlinear static mapping given by the Volterra polynomial series. Aiming at optimizing the poles that fully parameterize the orthonormal bases, the exact gradients of the outputs of the orthonormal filters with respect to their poles are computed analytically by using a back-propagation-through-time technique. The expressions relative to the Kautz basis and to generalized orthonormal bases of functions (GOBF) are addressed; the ones related to the Laguerre basis follow straightforwardly as a particular case. The main innovation here is that the dynamic nature of the OBF filters is fully considered in the gradient computations. These gradients provide exact search directions for optimizing the poles of a given orthonormal basis. Such search directions can, in turn, be used as part of an optimization procedure to locate the minimum of a cost-function that takes into account the error of estimation of the system output. The Levenberg-Marquardt algorithm is adopted here as the optimization procedure. Unlike previous related work, the proposed approach relies solely on input-output data measured from the system to be modeled, i.e., no information about the Volterra kernels is required. Examples are presented to illustrate the application of this approach to the modeling of dynamic systems, including a real magnetic levitation system with nonlinear oscillatory behavior.
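
To make the procedure concrete, here is a minimal Python sketch, assuming a single-pole Laguerre basis, a linear (first-order) OBF model, and finite-difference gradients in place of the exact back-propagation-through-time gradients derived in the paper; the data-generating system and all parameter values are invented for illustration.

```python
# Sketch: optimizing the pole of a Laguerre orthonormal basis for a
# first-order OBF (Wiener-type) model using only input-output data.
# Gradients here are finite differences; the paper derives exact analytic
# gradients via back-propagation through time.
import numpy as np
from scipy.optimize import least_squares

def laguerre_states(u, pole, n_filters):
    """Outputs of a cascade of discrete Laguerre filters with a real pole."""
    T = len(u)
    x = np.zeros((T, n_filters))
    c = np.sqrt(1.0 - pole**2)
    for t in range(1, T):
        x[t, 0] = pole * x[t - 1, 0] + c * u[t - 1]
        for k in range(1, n_filters):
            # all-pass section: x_k(t) = a*x_k(t-1) + x_{k-1}(t-1) - a*x_{k-1}(t)
            x[t, k] = pole * x[t - 1, k] + x[t - 1, k - 1] - pole * x[t, k - 1]
    return x

def residuals(pole, u, y, n_filters):
    a = float(np.clip(pole[0], -0.99, 0.99))   # keep the pole inside the unit circle
    X = laguerre_states(u, a, n_filters)
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)   # linear output map
    return y - X @ coeffs

# Hypothetical data: a first-order system whose dominant pole is 0.8
rng = np.random.default_rng(0)
u = rng.standard_normal(2000)
y = np.zeros_like(u)
for t in range(1, len(u)):
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1]
y += 0.01 * rng.standard_normal(len(u))

sol = least_squares(residuals, x0=[0.3], args=(u, y, 4),
                    method="lm")                     # Levenberg-Marquardt
print("estimated pole:", sol.x[0])                   # expected near 0.8 here
```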

Relevance: 30.00%

Publisher:

Abstract:

Over the last two decades, researchers have been working on developing systems that can assist drivers in the best way possible and make driving safe. Computer vision has played a crucial part in the design of these systems. With the introduction of vision techniques, various autonomous and robust real-time traffic automation systems have been designed, such as traffic monitoring, traffic-related parameter estimation, and intelligent vehicles. Among these, the automatic detection and recognition of road signs has become an interesting research topic. Such a system can alert drivers to signs they do not recognize before passing them. The aim of this research project is to present an intelligent road sign recognition system based on a state-of-the-art technique, the Support Vector Machine. The project is an extension of the work done at the ITS research platform at Dalarna University [25]. The focus of this research work is on the recognition of the road signs under analysis. When classifying an image, its location, size, and orientation in the image plane are irrelevant features, and one way to remove this ambiguity is to extract features that are invariant under the above-mentioned transformations. These invariant features are then used in a Support Vector Machine for classification. The Support Vector Machine is a supervised learning machine that solves problems in higher dimensions with the help of kernel functions and is best known for classification problems.
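
As a hedged illustration of the classification stage (not the project's actual pipeline or data), the Python sketch below computes the first two Hu moment invariants, which are invariant to translation, scale, and in-plane rotation, from synthetic binary sign masks and feeds them to a kernel SVM.

```python
# Sketch: rotation/scale/translation-invariant features + SVM classifier.
# The feature set (first two Hu moments) and the synthetic "images" are
# illustrative placeholders, not the features or data used in the project.
import numpy as np
from sklearn.svm import SVC

def hu_features(img):
    """First two Hu moment invariants of a 2-D grayscale/binary image."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    cx, cy = (x * img).sum() / m00, (y * img).sum() / m00
    def mu(p, q):                      # central moments
        return ((x - cx) ** p * (y - cy) ** q * img).sum()
    def eta(p, q):                     # scale-normalized central moments
        return mu(p, q) / m00 ** (1 + (p + q) / 2.0)
    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return np.array([phi1, phi2])

# Hypothetical training data: binary masks of two sign shapes.
rng = np.random.default_rng(1)
X, y = [], []
for label in (0, 1):
    for _ in range(50):
        img = np.zeros((32, 32))
        if label == 0:                 # filled square (placeholder class)
            img[8:24, 8:24] = 1.0
        else:                          # filled disc (placeholder class)
            yy, xx = np.mgrid[:32, :32]
            img[(yy - 16) ** 2 + (xx - 16) ** 2 < 64] = 1.0
        img += 0.05 * rng.random((32, 32))   # noise so samples differ
        X.append(hu_features(img))
        y.append(label)

clf = SVC(kernel="rbf", gamma="scale").fit(np.array(X), np.array(y))
print("training accuracy:", clf.score(np.array(X), np.array(y)))
```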

Relevance: 30.00%

Publisher:

Abstract:

Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, risk management, and the study of monetary policy implications. Dynamic term structure models, in turn, equipped with stronger economic structure, have mainly been adopted to price derivatives and to explain empirical stylized facts. In this paper, we combine flavors of these two classes of models to test whether no-arbitrage restrictions affect forecasting. We construct cross-section (arbitrage-allowing) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve overall smaller biases and root mean square errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding return premia indicates that the superior performance of the no-arbitrage versions is due to a better identification of the bond risk premium.
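
The cross-section (arbitrage-allowing) side of such an exercise can be sketched as follows in Python, with simulated yields standing in for the U.S. Treasury data: a quadratic polynomial in maturity is fitted to each month's curve, the coefficients are forecast with an AR(1), and the out-of-sample yield forecasts are scored by bias and RMSE. The arbitrage-free restrictions studied in the paper are not imposed here.

```python
# Sketch: out-of-sample forecasting with a cross-section polynomial
# yield-curve model (simulated data; no-arbitrage restrictions omitted).
import numpy as np

rng = np.random.default_rng(2)
maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10])   # years
T = 240                                                  # months

# Simulated yields from a slowly moving level + slope (placeholder DGP)
level = 0.05 + np.cumsum(0.001 * rng.standard_normal(T))
slope = 0.01 + np.cumsum(0.0005 * rng.standard_normal(T))
yields = level[:, None] + slope[:, None] * np.log1p(maturities)[None, :] \
         + 0.0005 * rng.standard_normal((T, len(maturities)))

# Cross-section step: fit y(tau) = b0 + b1*tau + b2*tau^2 each month
X = np.vander(maturities, 3, increasing=True)            # [1, tau, tau^2]
betas = np.linalg.lstsq(X, yields.T, rcond=None)[0].T    # shape (T, 3)

# Time-series step: AR(1) per coefficient, 1-step-ahead forecasts after month 120
errors = []
for t in range(120, T - 1):
    b_hist = betas[:t + 1]
    fcast = np.empty(3)
    for j in range(3):
        x_lag, x_cur = b_hist[:-1, j], b_hist[1:, j]
        phi = np.polyfit(x_lag, x_cur, 1)                # AR(1) via OLS
        fcast[j] = np.polyval(phi, b_hist[-1, j])
    errors.append(X @ fcast - yields[t + 1])

errors = np.array(errors)
print("bias (bp):", 1e4 * errors.mean(axis=0))
print("RMSE (bp):", 1e4 * np.sqrt((errors ** 2).mean(axis=0)))
```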

Relevance: 30.00%

Publisher:

Abstract:

Estimating the parameters of the instantaneous spot interest rate process is of crucial importance for pricing fixed income derivative securities. This paper presents an estimation procedure for the parameters of the Gaussian interest rate model used to price fixed income derivatives, based on the term structure of volatility. We estimate the term structure of volatility for U.S. Treasury rates for the period 1983-1995, based on a history of yield curves. We estimate both conditional and first-difference term structures of volatility and subsequently estimate the implied parameters of the Gaussian model by nonlinear least squares. Results for bond options illustrate the effects of differing parameters on pricing.
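
A minimal sketch, assuming the one-factor Gaussian (Vasicek-type) case in which the τ-maturity yield volatility equals σ(1 − e^{−κτ})/(κτ): the parameters (κ, σ) are recovered from an observed volatility term structure by nonlinear least squares. The volatility numbers below are placeholders, not the 1983-1995 Treasury estimates.

```python
# Sketch: fitting (kappa, sigma) of a one-factor Gaussian short-rate model
# to a term structure of yield volatilities by nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def yield_vol(tau, kappa, sigma):
    """Vasicek-type yield volatility: sigma * (1 - exp(-kappa*tau)) / (kappa*tau)."""
    return sigma * (1.0 - np.exp(-kappa * tau)) / (kappa * tau)

tau = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10])        # maturities (years)
vol_obs = np.array([0.0118, 0.0115, 0.0110, 0.0101,   # hypothetical annualized
                    0.0094, 0.0082, 0.0073, 0.0063])  # yield volatilities

(kappa_hat, sigma_hat), _ = curve_fit(yield_vol, tau, vol_obs, p0=[0.1, 0.01])
print(f"kappa = {kappa_hat:.3f}, sigma = {sigma_hat:.4f}")
```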

Relevance: 30.00%

Publisher:

Abstract:

Using the Pricing Equation, in a panel-data framework, we construct a novel consistent estimator of the stochastic discount factor (SDF) mimicking portfolio which relies on the fact that its logarithm is the "common feature" in every asset return of the economy. Our estimator is a simple function of asset returns and does not depend on any parametric function representing preferences, making it suitable for testing different preference specifications or investigating intertemporal substitution puzzles.
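
The common-feature idea can be illustrated with the following deliberately simplified Python sketch, which is in the spirit of, but not identical to, the estimator constructed in the paper: the log SDF proxy is the negative cross-sectional average of log gross returns, rescaled so that the sample pricing errors average zero.

```python
# Illustrative sketch (not the paper's exact estimator): a model-free SDF
# proxy built only from asset returns, using the idea that log M_t is the
# component common to every log return.  All data are simulated.
import numpy as np

rng = np.random.default_rng(3)
T, N = 400, 25

# Simulated economy: gross returns R_it = exp(-log M_t + idiosyncratic part)
log_m = -0.01 + 0.05 * rng.standard_normal(T)            # true log SDF
R = np.exp(-log_m[:, None] + 0.02 + 0.10 * rng.standard_normal((T, N)))

# Step 1: common feature -> minus the equally weighted average of log returns
log_m_hat = -np.log(R).mean(axis=1)

# Step 2: rescale so that the sample pricing errors E[M R_i] - 1 average zero
m_hat = np.exp(log_m_hat)
m_hat /= (m_hat[:, None] * R).mean()

pricing_errors = (m_hat[:, None] * R).mean(axis=0) - 1.0
print("corr(true, estimated) log SDF:", np.corrcoef(log_m, np.log(m_hat))[0, 1])
print("max |pricing error|:", np.abs(pricing_errors).max())
```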

Relevance: 30.00%

Publisher:

Abstract:

Brazil has one of the fastest-growing markets for communication and media services in the world. Much of this growth can be attributed to technological evolution and to recent changes in the competitive environment, which have favored the convergence of services previously sold separately: voice, data, and TV. This has increased the attractiveness of these services and has also contributed to reducing their price, mainly through economies of scale, driving their growth. This growth has attracted more competitors to the market, which, intensified by regulatory changes, raises the importance of correctly pricing service bundles. Therefore, a proper understanding of the attributes most valued by consumers in the purchase decision process for these services is of fundamental importance to improve resource allocation and maximize returns without losing market share. This work applies the Hedonic Pricing Methodology as a tool to help identify the relevant attributes of the multimedia service bundles sold in the market of the city of São Paulo and their respective implicit prices. A total of 371 distinct bundles containing one to three different types of services are analyzed, based on the prices charged in March 2009. As a result, substantial price premia were identified for the Telecine channel group, for sports, children's, and movie and series channels, as well as for the variables related to download and upload speeds.
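
A minimal sketch of such a hedonic regression (the bundle data are invented placeholders, not the 371 São Paulo bundles): the log of the bundle price is regressed on attribute dummies and the access speed, and each coefficient is read as an implicit price (premium).

```python
# Sketch of a hedonic pricing regression: log(price) on bundle attributes.
# The dataset below is a made-up placeholder, not the 2009 Sao Paulo sample.
import numpy as np

rng = np.random.default_rng(4)
n = 300
telecine  = rng.integers(0, 2, n)          # premium movie channels (dummy)
sports    = rng.integers(0, 2, n)          # sports channels (dummy)
kids      = rng.integers(0, 2, n)          # children's channels (dummy)
down_mbps = rng.choice([1, 2, 5, 10], n)   # download speed
log_price = (3.0 + 0.25 * telecine + 0.15 * sports + 0.10 * kids
             + 0.04 * down_mbps + 0.10 * rng.standard_normal(n))

X = np.column_stack([np.ones(n), telecine, sports, kids, down_mbps])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)

names = ["intercept", "telecine", "sports", "kids", "download_mbps"]
for name, b in zip(names, beta):
    # For dummies, exp(b)-1 approximates the % price premium of the attribute.
    print(f"{name:>14}: coef={b: .3f}  premium≈{np.exp(b) - 1: .1%}")
```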

Relevance: 30.00%

Publisher:

Abstract:

This article is motivated by the prominence of one-sided S,s rules in the literature and by the unrealistically strict conditions necessary for their optimality. It aims to assess whether one-sided pricing rules could be an adequate individual rule for macroeconomic models, despite their suboptimality. It aims to answer two questions. First, since agents are not fully rational, is it plausible that they use such a non-optimal rule? Second, even if the agents adopt optimal rules, is the economist committing a serious mistake by assuming that agents use one-sided S,s rules? Using parameters based on real-economy data, we found that, since the additional cost involved in adopting the simpler rule is relatively small, it is plausible that one-sided rules are used in practice. We also found that suboptimal one-sided rules and optimal two-sided rules are similar in practice, since one of the bounds is not reached very often. We concluded that the macroeconomic effects when one-sided rules are suboptimal are similar to the results obtained under two-sided optimal rules, when the two are close to each other. However, this is true only when one-sided rules are used in a context where they are not optimal.
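
A stylized simulation of the comparison, with invented parameters rather than the calibration used in the article: the log price gap drifts down with inflation and receives idiosyncratic shocks; the one-sided rule resets the price only when the gap falls below the lower band, while the two-sided rule also resets at an upper band. With moderate inflation the upper band is rarely reached, so the two rules behave almost identically.

```python
# Stylized simulation of one-sided vs. two-sided S,s pricing rules.
# Parameters are illustrative, not the calibration used in the article.
import numpy as np

rng = np.random.default_rng(5)
T, inflation, shock_sd = 100_000, 0.005, 0.01   # periods, drift, shock size
lower, upper, reset = -0.10, 0.10, 0.0          # bands and return point

def simulate(two_sided):
    gap, n_adjust = 0.0, 0
    for _ in range(T):
        gap += -inflation + shock_sd * rng.standard_normal()
        if gap < lower or (two_sided and gap > upper):
            gap, n_adjust = reset, n_adjust + 1
    return n_adjust / T

print("adjustment frequency, one-sided :", simulate(False))
print("adjustment frequency, two-sided :", simulate(True))
```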

Relevance: 30.00%

Publisher:

Abstract:

Consumption is an important macroeconomic aggregate, amounting to about 70% of GNP. Finding sub-optimal behavior in consumption decisions casts serious doubt on whether optimizing behavior is applicable on an economy-wide scale, which, in turn, challenges whether it is applicable at all. This paper makes several contributions to the literature on consumption optimality. First, we provide a new result on the basic rule-of-thumb regression, showing that it is observationally equivalent to the one obtained in a well-known optimizing real-business-cycle model. Second, for rule-of-thumb tests based on the Asset-Pricing Equation, we show that the omission of the higher-order term in the log-linear approximation yields inconsistent estimates when lagged observables are used as instruments; however, these are exactly the instruments that have traditionally been used in this literature. Third, we show that nonlinear estimation of a system of N Asset-Pricing Equations can be done efficiently even if the number of asset returns (N) is high vis-à-vis the number of time-series observations (T). We argue that efficiency can be restored by aggregating returns into a single measure that fully captures intertemporal substitution. Indeed, we show that there is no reason why return aggregation cannot be performed in the nonlinear setting of the Pricing Equation, since the latter is a linear function of individual returns. This forms the basis of a new test of rule-of-thumb behavior, which can be viewed as testing for the importance of rule-of-thumb consumers when the optimizing agent holds an equally weighted portfolio or a weighted portfolio of traded assets. Using our setup, we find no signs of either rule-of-thumb behavior by U.S. consumers or of habit formation in consumption decisions in econometric tests. Indeed, we show that a simple representative-agent model with CRRA utility is able to explain the time-series data on consumption and aggregate returns. There, the intertemporal discount factor is significant and ranges from 0.956 to 0.969, while the relative risk-aversion coefficient is precisely estimated, ranging from 0.829 to 1.126. There is no evidence of rejection in over-identifying-restriction tests.
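
A hedged sketch of the kind of estimation described in the last step, with simulated data in place of the U.S. series: the CRRA Euler equation E[β(C_{t+1}/C_t)^{-γ}R_{t+1} − 1] = 0 is imposed on a single aggregated (equally weighted) portfolio return and estimated by minimizing the squared sample moments formed with lagged instruments.

```python
# Sketch: estimating (beta, gamma) from the CRRA Euler equation using an
# aggregated portfolio return and simple lagged instruments.
# Data are simulated placeholders, not the U.S. series used in the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
T, beta_true, gamma_true = 300, 0.96, 1.0

cons_growth = np.exp(0.02 + 0.02 * rng.standard_normal(T))      # C_{t+1}/C_t
# Return series roughly consistent with the Euler equation, up to noise
R = 1.0 / (beta_true * cons_growth ** (-gamma_true)) \
    * np.exp(0.03 * rng.standard_normal(T))

def objective(params):
    b, g = params
    u = b * cons_growth[1:] ** (-g) * R[1:] - 1.0                   # Euler residual
    Z = np.column_stack([np.ones(T - 1), cons_growth[:-1], R[:-1]]) # instruments
    moments = Z.T @ u / (T - 1)
    return moments @ moments                                        # identity weighting

res = minimize(objective, x0=[0.9, 2.0], method="Nelder-Mead")
print("beta, gamma estimates:", res.x)
```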

Relevance: 30.00%

Publisher:

Abstract:

Using the Pricing Equation in a panel-data framework, we construct a novel consistent estimator of the stochastic discount factor (SDF) which relies on the fact that its logarithm is the "common feature" in every asset return of the economy. Our estimator is a simple function of asset returns and does not depend on any parametric function representing preferences. The techniques discussed in this paper were applied to two relevant issues in macroeconomics and finance: the first asks what type of parametric preference-representation could be validated by asset-return data, and the second asks whether or not our SDF estimator can price returns in an out-of-sample forecasting exercise. In formal testing, we cannot reject standard preference specifications used in the macro/finance literature. Estimates of the relative risk-aversion coefficient are between 1 and 2, and statistically equal to unity. We also show that our SDF proxy can price reasonably well the returns of stocks with a higher capitalization level, whereas it shows some difficulty in pricing stocks with a lower level of capitalization.
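
As an illustration of the first application, the sketch below regresses a (simulated) log SDF proxy on consumption growth under CRRA preferences, log M_t = log β − γ Δlog C_t, to read off the implied discount factor and relative risk aversion; both series are placeholders, not the paper's estimates.

```python
# Sketch: checking a CRRA preference specification against an SDF proxy via
# the regression log M_t = log(beta) - gamma * dlog(C_t).  Simulated data.
import numpy as np

rng = np.random.default_rng(7)
T, beta_true, gamma_true = 400, 0.97, 1.5

dlog_c = 0.02 + 0.015 * rng.standard_normal(T)                # consumption growth
log_m = np.log(beta_true) - gamma_true * dlog_c \
        + 0.01 * rng.standard_normal(T)                       # stand-in SDF proxy

X = np.column_stack([np.ones(T), dlog_c])
(intercept, slope), *_ = np.linalg.lstsq(X, log_m, rcond=None)
print("implied discount factor :", np.exp(intercept))
print("implied risk aversion   :", -slope)
```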

Relevance: 30.00%

Publisher:

Abstract:

The purpose of this study is to empirically analyze the main factors that determine the first-day return and the flipping activity in Brazilian IPOs, taking into account the results expected according to national and international research. The database encompasses IPOs that took place between May 2004 and February 2011, totaling 129 IPOs and approximately R$ 128 billion in offerings. The first-day return, the "money left on the table", averaged 4.6% relative to the issue price, while flipping activity totaled R$ 7.2 billion, or 5.6% of the offerings. The first-day return was analyzed before and after the first trade, and evidence was found supporting (a) the exogenous determination of the issue price, (b) the dependence of the opening price on prospectus disclosure and on other variables observable prior to the bookbuilding process, and (c) cascade behavior by investors in the pricing after the first trade, particularly driven by the underwriter's behavior. Flipping, in turn, was strongly dependent on how successful the IPO was, and was concentrated in, and homogeneous throughout, the first day, despite the intense trading in the first minute. As a general contribution to the literature, it is concluded that Information Asymmetry Theory arguments are not sufficient to explain first-day underpricing and flipping, requiring arguments based on Behavioral Finance adapted to an intraday perspective.
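
The two headline quantities have simple definitions; a minimal sketch with made-up numbers, not the sample statistics above:

```python
# Sketch: first-day return ("money left on the table") and flipping ratio
# for a single hypothetical IPO; numbers are illustrative only.
issue_price = 20.00          # bookbuilding issue price (R$)
close_day1 = 20.92           # first-day closing price (R$)
offering_value = 1.0e9       # total offering (R$)
flipped_value = 5.6e7        # shares allocated in the IPO and sold on day 1 (R$)

first_day_return = close_day1 / issue_price - 1.0
flipping_ratio = flipped_value / offering_value
print(f"first-day return: {first_day_return:.1%}")
print(f"flipping ratio  : {flipping_ratio:.1%}")
```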

Relevance: 30.00%

Publisher:

Abstract:

The objective of this study is to analyze three operating models for a plant producing palm oil, palm kernel oil, and biodiesel, respectively, together with oil palm production by family farmers, aiming at the exclusive supply of raw material to the industry through a partnership with the National Family Farming Program. The project consists of building a large-scale plant, with investments from public and private sources directed at the construction of the industrial complex and at financing the establishment of oil palm cultivation by the farmers. Four scenarios were constructed, in which the economic and financial performance indicators used to evaluate the productive arrangement of the industrial and agricultural parts were analyzed. These scenarios support investment decisions through the use of classic economic and financial indicators to assess the viability of oil palm as a raw material for the plant's industrial complex. The main indicators used in this work are NPV (Net Present Value) and IRR (Internal Rate of Return), with secondary indicators used to support the analyses, such as discounted payback and MIRR (Modified Internal Rate of Return), among others, in addition to social indicators, such as income generation for farming families, with the main objective of creating monetary value, generating employment and income, and repaying the funds of public investors. The establishment of the oil palm plantation is based on the Agroecological Zoning of oil palm and will be carried out in areas degraded by pasture, in order to ensure sustainability, contribute to environmental recovery through carbon sequestration, and reduce pressure on native forests. Consequently, the project is expected to help prevent the advance of deforestation in the Legal Amazon region. To this end, visits were made to oil palm plantations and to the Biopalma plant, as well as to Embrapa Ocidental in the city of Manaus and to Embrapa in the city of Campinas. An economic and financial analysis model was proposed, based on the implementation of a large-scale plant for biodiesel production in the State of Pará. Among its limitations, the study is complex and broad in scope: there are no fully implemented, operating large-scale plant projects employing technology- and capital-intensive methods, and it would be impossible to carry out such wide-ranging research without a considerably larger number of participating organizations. As a result, the economic indicators analyzed show that the project is viable in three of the four scenarios constructed, demonstrating that the capital of public investors, private investors, and family farmers will be remunerated at the interest rates, terms, and conditions of the capital market. The objective is to attract investment for biodiesel production and for the establishment of agricultural and industrial ventures, generating employment and income for family farming in the oil palm production chain.
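
A minimal sketch of the main indicators mentioned, computed for an invented cash-flow vector rather than the project's figures: NPV at a given discount rate, IRR as the rate that zeroes the NPV, and the discounted payback period.

```python
# Sketch: NPV, IRR and discounted payback for a hypothetical cash-flow vector
# (year 0 = investment outlay); figures are placeholders, not the project's.
import numpy as np
from scipy.optimize import brentq

cash_flows = np.array([-100.0, 20.0, 25.0, 30.0, 35.0, 40.0])  # per year
rate = 0.12                                                     # discount rate

def npv(r, cf):
    periods = np.arange(len(cf))
    return float(np.sum(cf / (1.0 + r) ** periods))

irr = brentq(lambda r: npv(r, cash_flows), -0.9, 10.0)          # IRR: NPV(r) = 0

disc = cash_flows / (1.0 + rate) ** np.arange(len(cash_flows))
cumulative = np.cumsum(disc)
payback = next((t for t, c in enumerate(cumulative) if c >= 0), None)

print(f"NPV @ {rate:.0%}: {npv(rate, cash_flows):.2f}")
print(f"IRR        : {irr:.2%}")
print(f"discounted payback (years): {payback}")
```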

Relevance: 30.00%

Publisher:

Abstract:

This paper constructs a unit root test based on partially adaptive estimation, which is shown to be robust against non-Gaussian innovations. We show that the limiting distribution of the t-statistic is a convex combination of the standard normal and the Dickey-Fuller (DF) distribution. Convergence to the DF distribution is obtained when the innovations are Gaussian, implying that the traditional ADF test is a special case of the proposed test. Monte Carlo experiments indicate that, if the innovations have a heavy-tailed distribution or are contaminated by outliers, the proposed test is more powerful than the traditional ADF test. Nominal interest rates (at different maturities) are shown to be stationary according to the robust test but non-stationary according to the non-robust ADF test. This result suggests that the failure to reject the null of a unit root in nominal interest rates may be due to the use of estimation and hypothesis-testing procedures that do not account for the absence of Gaussianity in the data. Our results validate practical restrictions on the behavior of the nominal interest rate imposed by the CCAPM, optimal monetary policy, and option pricing models.
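
For reference, the non-robust benchmark against which the proposed test is compared can be run in a few lines; the sketch below applies the standard ADF test (statsmodels) to a simulated persistent series with occasional outliers standing in for a nominal interest rate. The partially adaptive robust test itself is not implemented here.

```python
# Sketch: the standard (non-robust) ADF benchmark on a simulated persistent
# series; the paper's partially adaptive robust test is not implemented here.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(8)
T = 500
# Highly persistent AR(1) with occasional outliers, mimicking an interest rate
eps = rng.standard_normal(T) + 5.0 * (rng.random(T) < 0.02) * rng.standard_normal(T)
r = np.empty(T)
r[0] = 0.05
for t in range(1, T):
    r[t] = 0.001 + 0.97 * r[t - 1] + 0.002 * eps[t]

stat, pvalue, usedlag, nobs, crit, _ = adfuller(r, regression="c", autolag="AIC")
print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
print("5% critical value:", crit["5%"])
```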

Relevance: 30.00%

Publisher:

Abstract:

We develop and empirically test a continuous-time equilibrium model for the pricing of oil futures. The model provides a link between no-arbitrage models and expectation-oriented models. It highlights the role of inventories for the identification of different pricing regimes. In an empirical study, the hedging performance of our model is compared with five other one- and two-factor pricing models. The hedging problem considered is related to Metallgesellschaft's strategy of hedging long-term forward commitments with short-term futures. The results show that the downside-risk distribution of our inventory-based model stochastically dominates those of the other models.
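
The final comparison criterion can be made concrete with a small sketch using simulated hedging errors rather than the paper's estimates: given error samples from two models, (approximate) first-order stochastic dominance of the downside is checked by comparing exceedance probabilities over a grid of loss levels.

```python
# Sketch: comparing downside-risk distributions of two hedging strategies via
# (approximate) first-order stochastic dominance of losses.  The two error
# samples are simulated placeholders, not the models estimated in the paper.
import numpy as np

rng = np.random.default_rng(9)
errors_inventory_model = 0.2 * rng.standard_normal(5000)          # tighter hedge
errors_benchmark_model = 0.3 * rng.standard_normal(5000) - 0.05   # wider, biased

def downside_cdf(errors, grid):
    """Empirical probability of a loss at least as bad as each grid point."""
    losses = -np.minimum(errors, 0.0)
    return np.array([(losses >= g).mean() for g in grid])

grid = np.linspace(0.0, 1.0, 101)
cdf_a = downside_cdf(errors_inventory_model, grid)
cdf_b = downside_cdf(errors_benchmark_model, grid)

# Model A's downside (weakly) dominates B's if its exceedance probabilities are
# never larger on the loss grid.
dominates = bool(np.all(cdf_a <= cdf_b + 1e-12))
print("inventory model dominates benchmark on the downside:", dominates)
```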

Relevance: 30.00%

Publisher:

Abstract:

Wilson [16] introduced a general methodology to deal with monopolistic pricing in situations where customers have private information about their tastes ('types'). It is based on the demand profile of customers: for each nonlinear tariff set by the monopolist, the demand at a given level of product (or quality) is the measure of customers' types whose marginal utility is at least the marginal tariff ('price'). When the customers' marginal utility has a natural ordering (i.e., the Spence-Mirrlees condition holds), such a demand profile is very easy to compute. In this paper we present a particular model with a one-dimensional type in which the Spence-Mirrlees condition (SMC) fails and the demand-profile approach results in a suboptimal solution for the monopolist. Moreover, we suggest a generalization of the demand-profile procedure that improves the monopolist's profit when the SMC does not hold.
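
A small numerical sketch of the baseline demand-profile procedure in a case where the Spence-Mirrlees condition does hold (the utility, type distribution, and zero cost below are illustrative choices, not the model of this paper): for each quality level the monopolist picks the marginal price that maximizes marginal profit against the demand profile, and the tariff is the integral of those marginal prices.

```python
# Sketch: Wilson's demand-profile construction of a nonlinear tariff in a
# textbook case that *does* satisfy the Spence-Mirrlees condition.
# Types theta ~ U[0,1], utility u(q, theta) = theta*q - q^2/2, zero marginal cost.
import numpy as np

qualities = np.linspace(0.0, 1.0, 201)
marginal_prices = np.empty_like(qualities)

for i, q in enumerate(qualities):
    prices = np.linspace(0.0, 1.0, 501)
    # Demand profile N(p, q): mass of types whose marginal utility theta - q >= p
    demand_profile = np.clip(1.0 - (prices + q), 0.0, 1.0)
    profit = prices * demand_profile            # pointwise marginal profit
    marginal_prices[i] = prices[np.argmax(profit)]

# Nonlinear tariff T(q) = integral of the marginal price up to q (trapezoid rule)
tariff = np.concatenate([[0.0], np.cumsum(
    0.5 * (marginal_prices[1:] + marginal_prices[:-1]) * np.diff(qualities))])

print("marginal price at q=0   :", marginal_prices[0])    # analytic value 0.5
print("marginal price at q=0.5 :", marginal_prices[100])  # analytic value 0.25
print("tariff at q=1           :", round(tariff[-1], 3))
```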

Relevance: 30.00%

Publisher:

Abstract:

The Brazilian telecommunications and information technology (ICT) market is of significant importance for Brazil's development, given the evolution of the mobile telephony market, which grew 600% over the last ten years. The telecommunications industry, which accounts for 4.7% of Brazilian GDP (TELEBRASIL, 2013), acquired a new dynamic after the General Telecommunications Law of 1997 and, subsequently, the privatization of the sector. This rapid transformation of the sector's value chain was also driven by the evolution of technologies and of new network architectures. Moreover, the use of digital technologies, such as applications (apps) and the internet itself, has made the telecommunications chain more complex, enabling the emergence of new players and the development of new services, business models, and pricing (SCHAPIRO and VARIAN, 2003). This study aims to analyze the drivers of and barriers to the adoption of new service pricing models in the Brazilian telecommunications market, considering the transformation and evolution of the sector. The study was carried out using a qualitative, exploratory, and constructivist research strategy based on the multilevel approach (POZZEBON and DINIZ, 2012), which works with the context, the process, and the interactions among the relevant social groups. From this analysis, it was possible to understand the criteria, drivers, and barriers in the process of adopting new pricing models: user demands, intense competition, and the need to increase return on investment stand out as the most relevant drivers, while network quality, the lack of supporting systems, the operators' financial situation, the complexity of regulation, and the emergence of distinct social groups within the company are identified as the most critical barriers in this process. In this context, the emerging pricing models include service bundling, limited-time offers, and sponsorship/free-of-charge models, together with the exploration of new business areas. This study provides a practical and academic contribution insofar as it allows a better understanding of market dynamics and supports the strategic and tactical marketing areas of the operators, as well as the formulation of sector policies and regulation.