869 results for 150507 Pricing (incl. Consumer Value Estimation)


Relevance: 30.00%

Abstract:

During the period of 1990-2002 US households experienced a dramatic wealth cycle, induced by a 369% appreciation in the value of real per capita liquid stock market assets followed by a 55% decline. However, consumer spending in real terms continued to rise throughout this period. Using data from 1990-2005, traditional life-cycle approaches to estimating macroeconomic wealth effects confront two puzzles: (i) econometric evidence of a stable cointegrating relationship among consumption, income, and wealth is weak at best; and (ii) life-cycle models that rely on aggregate measures of wealth cannot explain why consumption did not collapse when the value of stock market assets declined so dramatically. We address both puzzles by decomposing wealth according to the liquidity of household assets. We find that the significant appreciation in the value of real estate assets that occurred after the peak of the wealth cycle helped sustain consumer spending from 2001 to 2005.

Relevance: 30.00%

Abstract:

The study aims to assess the empirical adherence of the permanent income theory and the consumption smoothing view in Latin America. Two present value models are considered, one describing household behavior and the other open economy macroeconomics. Following the methodology developed in Campbell and Shiller (1987), Bivariate Vector Autoregressions are estimated for the saving ratio and the real growth rate of income concerning the household behavior model, and for the current account and the change in national cash flow regarding the open economy model. The countries in the sample are considered separately in the estimation process (individual system estimation) as well as jointly (joint system estimation). Ordinary Least Squares (OLS) and Seemingly Unrelated Regressions (SURE) estimates of the coefficients are generated. Wald Tests are then conducted to verify if the VAR coefficient estimates are in conformity with those predicted by the theory. While the empirical results are sensitive to the estimation method and discount factors used, there is only weak evidence in favor of the permanent income theory and the consumption smoothing view in the group of countries analyzed.
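The Campbell-Shiller-style testing strategy described above can be sketched as follows. This is a minimal illustration on simulated data, not the paper's actual dataset: the series, lag length, and the particular coefficient restriction tested are all hypothetical stand-ins.

```python
import numpy as np

# Simulated stand-ins for the two series entering the bivariate VAR
# (e.g. saving ratio and real income growth in the household model).
rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.3]])
y = np.zeros((500, 2))
for t in range(1, 500):
    y[t] = A_true @ y[t - 1] + rng.normal(size=2)

# OLS estimate of the bivariate VAR(1): y_t = c + A y_{t-1} + e_t.
Y = y[1:]
X = np.column_stack([np.ones(499), y[:-1]])
B, *_ = np.linalg.lstsq(X, Y, rcond=None)        # (3, 2): const + 2 lags
A_hat = B[1:].T                                  # A_hat[i, j]: eq i, lag of var j
resid = Y - X @ B

# Wald test of one illustrative restriction on the VAR coefficients,
# here H0: A[0, 1] = 0, using Var(b) = Sigma_jj * (X'X)^{-1} per equation.
Sigma = resid.T @ resid / (499 - 3)              # residual covariance
var_b = Sigma[0, 0] * np.linalg.inv(X.T @ X)[2, 2]
wald = A_hat[0, 1] ** 2 / var_b                  # ~ chi2(1) under H0
```

In the paper the tested restriction is the full set of cross-equation constraints implied by the present value model, not a single zero restriction.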

Relevance: 30.00%

Abstract:

Estimating the parameters of the instantaneous spot interest rate process is of crucial importance for pricing fixed income derivative securities. This paper presents an estimation of the parameters of the Gaussian interest rate model for pricing fixed income derivatives based on the term structure of volatility. We estimate the term structure of volatility for US Treasury rates for the period 1983-1995, based on a history of yield curves. We estimate both conditional and first-difference term structures of volatility and subsequently estimate the implied parameters of the Gaussian model with non-linear least squares estimation. Results for bond options illustrate the effects of differing parameters in pricing.
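The non-linear least squares step can be sketched with a one-factor Gaussian (Vasicek-type) model, where the yield-volatility term structure has the closed form sigma*(1-exp(-kappa*tau))/(kappa*tau). The volatility numbers below are simulated, not the paper's 1983-1995 estimates:

```python
import numpy as np
from scipy.optimize import curve_fit

def yield_vol(tau, kappa, sigma):
    # Volatility of the tau-maturity yield implied by a one-factor
    # Gaussian (Vasicek-type) short-rate model.
    return sigma * (1.0 - np.exp(-kappa * tau)) / (kappa * tau)

# Hypothetical observed term structure of yield volatilities.
maturities = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 5.0, 7.0, 10.0])
rng = np.random.default_rng(1)
obs = yield_vol(maturities, 0.8, 0.015) + rng.normal(0, 1e-4, 8)

# Non-linear least squares recovers the implied model parameters.
(kappa_hat, sigma_hat), _ = curve_fit(yield_vol, maturities, obs,
                                      p0=[0.5, 0.01])
```

With the parameters in hand, bond option prices follow from the model's closed-form formulas, which is where the paper's pricing comparisons come from.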

Relevance: 30.00%

Abstract:

This paper confronts the Capital Asset Pricing Model (CAPM) and the 3-factor Fama-French (FF) model using both Brazilian and US stock market data for the same sample period (1999-2007). The US data serve only as a benchmark for comparative purposes. We use two competing econometric methods, the Generalized Method of Moments (GMM) of Hansen (1982) and the Iterative Nonlinear Seemingly Unrelated Regression Estimation (ITNLSUR) of Burmeister and McElroy (1988). Both methods nest other options based on the procedure of Fama-MacBeth (1973). The estimations show that the FF model fits the Brazilian data better than the CAPM, although it is imprecise compared with its US analog. We argue that this is a consequence of the absence of clear-cut anomalies in Brazilian data, especially those related to firm size. The tests of the efficiency of the models (nullity of intercepts and fit of the cross-sectional regressions) yielded mixed conclusions. The intercept tests failed to reject the CAPM when Brazilian value-premium-wise portfolios were used, contrasting with the US data, a very well documented conclusion. The ITNLSUR estimates an economically reasonable and statistically significant market risk premium for Brazil of around 6.5% per year without resorting to any particular data set aggregation. However, we could not find the same for the US data over the identical period, or even using a larger data set. This study seeks to contribute to the Brazilian empirical literature on asset pricing models. Two of the main pricing models are confronted, the Capital Asset Pricing Model (CAPM) and the Fama-French 3-factor model, applying econometric tools little explored in the national literature to the estimation of pricing equations: the GMM and ITNLSUR methods. The estimates are compared with those obtained from US data for the same period, and we conclude that in Brazil the success of the Fama-French model is limited.
As a by-product of the analysis, (i) the presence of the so-called anomalies in returns is tested, and (ii) the risk premium implicit in stock returns is computed. The data reveal the presence of a value premium, but not of a size premium. Using the ITNLSUR method, the market risk premium is positive and significant, at around 6.5% per year.
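The Fama-MacBeth (1973) two-pass procedure that both GMM and ITNLSUR nest can be sketched as below. Everything here is simulated: the portfolio count, factor magnitudes, and premia are hypothetical, not the Brazilian or US estimates from the paper.

```python
import numpy as np

# Simulated panel of test-portfolio returns and three factors.
rng = np.random.default_rng(2)
T, N = 240, 25                                   # months, portfolios
lam_true = np.array([0.006, 0.002, 0.003])       # monthly factor premia
f = lam_true + rng.normal(0, 0.04, (T, 3))       # MKT, SMB, HML analogues
betas_true = rng.uniform(0.5, 1.5, (N, 3))
r = f @ betas_true.T + rng.normal(0, 0.02, (T, N))

# Pass 1 (time series): estimate each portfolio's factor loadings.
X = np.column_stack([np.ones(T), f])
B = np.linalg.lstsq(X, r, rcond=None)[0]         # (4, N)
betas = B[1:].T                                  # (N, 3) estimated loadings

# Pass 2 (cross section): regress returns on loadings period by
# period; risk premia are the time averages of the slopes.
Z = np.column_stack([np.ones(N), betas])
lam_t = np.array([np.linalg.lstsq(Z, r[t], rcond=None)[0][1:]
                  for t in range(T)])
lam_hat = lam_t.mean(axis=0)                     # estimated premia
```

GMM and ITNLSUR improve on this by estimating loadings and premia jointly, with standard errors that account for the estimated-regressor problem in pass 2.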

Relevance: 30.00%

Abstract:

Most studies that try to verify the existence of regulatory risk look mainly at developed countries. Looking at regulatory risk in emerging-market regulated sectors is no less important to improving and increasing investment in those markets. This thesis comprises three papers on regulatory risk issues. In the first paper I check whether CAPM betas capture information on regulatory risk by using a two-step procedure. In the first step I run Kalman filter estimates and then use these estimated betas as inputs in a random-effects panel data model. I find evidence of regulatory risk in electricity, telecommunications and all regulated sectors in Brazil. I find further evidence that regulatory changes in the country either do not reduce or even increase the betas of the regulated sectors, going in the opposite direction to the buffering hypothesis proposed by Peltzman (1976). In the second paper I check whether CAPM alphas say something about regulatory risk. I investigate a methodology similar to those used by some regulatory agencies around the world, like the Brazilian Electricity Regulatory Agency (ANEEL), that incorporate a specific component of regulatory risk in setting tariffs for regulated sectors. Using SUR estimates, I find negative and significant alphas for all regulated sectors, especially the electricity and telecommunications sectors. This flies in the face of theory, which predicts alphas that are not statistically different from zero. I suspect that the significant alphas are related to misspecifications in the traditional CAPM that fail to capture true regulatory risk factors. One of the reasons is that the CAPM does not consider factors that are proven to have significant effects on asset pricing, such as the Fama and French size (ME) and price-to-book (ME/BE) factors. In the third paper, I use these two additional factors as controls in the estimation of alphas, and the results are similar.
Nevertheless, I find evidence that the negative alphas may be the result of the regulated sectors' premiums associated with the three Fama and French factors, particularly the market risk premium. When taken together, the ME and ME/BE factors diminish the statistical significance of the market factor premiums in the regulated sectors, especially the electricity sector. This shows the importance of including these factors, an inclusion which is unfortunately rare in emerging markets like Brazil.
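The Kalman filter step of the first paper, a time-varying CAPM beta, can be sketched in scalar form. The data, noise variances, and the sinusoidal beta path below are all hypothetical, chosen only to show the filter tracking a drifting loading:

```python
import numpy as np

def kalman_beta(r, m, q, h):
    """Scalar Kalman filter for a time-varying CAPM beta:
    observation  r_t    = beta_t * m_t + v_t   (Var v = h),
    state        beta_t = beta_{t-1} + w_t     (Var w = q)."""
    beta, P = 1.0, 1.0                  # loose prior on the initial beta
    path = np.empty(len(r))
    for t in range(len(r)):
        P += q                          # predict step
        S = m[t] * P * m[t] + h         # innovation variance
        K = P * m[t] / S                # Kalman gain
        beta += K * (r[t] - m[t] * beta)
        P *= (1.0 - K * m[t])
        path[t] = beta
    return path

# Hypothetical market excess returns and a slowly drifting true beta.
rng = np.random.default_rng(7)
T = 600
m = rng.normal(0.0, 0.05, T)
beta_path = 1.0 + 0.5 * np.sin(np.linspace(0, 4 * np.pi, T))
r = beta_path * m + rng.normal(0.0, 0.02, T)

beta_hat = kalman_beta(r, m, q=1e-3, h=4e-4)
```

In the thesis the filtered betas then feed a random-effects panel regression; in practice the variances q and h would be estimated by maximum likelihood rather than fixed.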

Relevance: 30.00%

Abstract:

It is well known that cointegration between the levels of two variables (labeled Yt and yt in this paper) is a necessary condition to assess the empirical validity of a present-value model (PV and PVM, respectively, hereafter) linking them. The work on cointegration has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. The basis of this result is the use of rational expectations in forecasting future values of variables in the PVM. If this condition fails, the present-value equation will not be valid, since it will contain an additional term capturing the (non-zero) conditional expected value of future error terms. Our article has a few novel contributions, but two stand out. First, in testing for PVMs, we advise splitting the restrictions implied by PV relationships into orthogonality conditions (or reduced rank restrictions) before additional tests on the values of parameters. We show that PV relationships entail a weak-form common feature relationship as in Hecq, Palm, and Urbain (2006) and in Athanasopoulos, Guillén, Issler and Vahid (2011), and also a polynomial serial-correlation common feature relationship as in Cubadda and Hecq (2001), which represent restrictions on dynamic models allowing several tests for the existence of PV relationships to be used. Because these relationships occur mostly with financial data, we propose tests based on generalized method of moments (GMM) estimates, where it is straightforward to propose robust tests in the presence of heteroskedasticity. We also propose a robust Wald test developed to investigate the presence of reduced rank models. Their performance is evaluated in a Monte-Carlo exercise.
Second, in the context of asset pricing, we propose applying a permanent-transitory (PT) decomposition based on Beveridge and Nelson (1981), which focuses on extracting the long-run component of asset prices, a key concept in modern financial theory as discussed in Alvarez and Jermann (2005), Hansen and Scheinkman (2009), and Nieuwerburgh, Lustig, and Verdelhan (2010). Here again we can exploit the results developed in the common cycle literature to easily extract permanent and transitory components under both long- and short-run restrictions. The techniques discussed herein are applied to long-span annual data on long- and short-term interest rates and on prices and dividends for the U.S. economy. In both applications we do not reject the existence of a common cyclical feature vector linking the two series. Extracting the long-run component shows the usefulness of our approach and highlights the presence of asset-pricing bubbles.
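For the simplest case, an I(1) series whose first difference follows an AR(1), the Beveridge-Nelson decomposition has a closed form: the permanent component is the current level plus phi/(1-phi) times the current change's deviation from the unconditional drift. A minimal simulated sketch (the series and parameters are hypothetical, not the paper's interest-rate or price-dividend data):

```python
import numpy as np

# Simulated I(1) series: Delta y_t = mu + phi * Delta y_{t-1} + e_t.
rng = np.random.default_rng(3)
mu, phi, T = 0.2, 0.5, 400
dy = np.zeros(T)
for t in range(1, T):
    dy[t] = mu + phi * dy[t - 1] + rng.normal(scale=0.5)
y = np.cumsum(dy)

# Fit the AR(1) in differences by OLS.
X = np.column_stack([np.ones(T - 1), dy[:-1]])
mu_hat, phi_hat = np.linalg.lstsq(X, dy[1:], rcond=None)[0]

# Beveridge-Nelson permanent component: current level plus all
# expected future changes in excess of the unconditional drift.
drift = mu_hat / (1.0 - phi_hat)                 # unconditional mean of dy
trend = y + (phi_hat / (1.0 - phi_hat)) * (dy - drift)
cycle = y - trend                                # stationary transitory part
```

The paper's multivariate version exploits common-cycle restrictions to pin down the same split in a VECM, but the univariate case conveys the idea.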

Relevance: 30.00%

Abstract:

A considerable dispersion can be observed in the prices that different commercial banks in Brazil charge for the same homogeneous bundle of services, a dispersion that persists over time. In an attempt to replicate this empirical observation, a simple model was developed that draws on the search-cost literature and also relies on consumer loyalty. Price data from the Brazilian banking sector are then applied to the model and some empirical exercises are carried out. These exercises allow: (i) the search costs incurred by consumers to be estimated, holding the values of the other parameters fixed, and (ii) the corresponding deadweight losses that arise as a consequence of those search costs to be estimated as well. When only 80% of the population is free to search for banks that charge lower fees, at a monthly interest rate of 0.5%, the estimated average search cost incurred by consumers reaches BRL 1,805.80, with the corresponding average deadweight loss around BRL 233.71 per consumer.

Relevance: 30.00%

Abstract:

This paper has two original contributions. First, we show that the present value model (PVM hereafter), which has a wide application in macroeconomics and finance, entails common cyclical feature restrictions in the dynamics of the vector error-correction representation (Vahid and Engle, 1993); something that has already been investigated in the VECM context by Johansen and Swensen (1999, 2011) but has not been discussed before with this new emphasis. We also provide the present value reduced rank constraints to be tested within the log-linear model. Our second contribution relates to forecasting time series that are subject to those long- and short-run reduced rank restrictions. The reason why appropriate common cyclical feature restrictions might improve forecasting is that they provide natural exclusion restrictions, preventing the estimation of useless parameters, which would otherwise contribute to the increase of forecast variance with no expected reduction in bias. We applied the techniques discussed in this paper to data known to be subject to present value restrictions, i.e. the online series maintained and updated by Shiller. We focus on three different data sets. The first includes the levels of interest rates with long and short maturities, the second includes the level of real price and dividend for the S&P composite index, and the third includes the logarithmic transformation of prices and dividends. Our exhaustive investigation of several different multivariate models reveals that better forecasts can be achieved when restrictions are applied to them. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for target variables of PVMs and 63.33% of the time when all variables in the system are considered.

Relevance: 30.00%

Abstract:

This paper constructs a unit root test based on partially adaptive estimation, which is shown to be robust against non-Gaussian innovations. We show that the limiting distribution of the t-statistic is a convex combination of the standard normal and the DF distribution. Convergence to the DF distribution is obtained when the innovations are Gaussian, implying that the traditional ADF test is a special case of the proposed test. Monte Carlo experiments indicate that, if the innovations have a heavy-tailed distribution or are contaminated by outliers, then the proposed test is more powerful than the traditional ADF test. Nominal interest rates (of different maturities) are shown to be stationary according to the robust test but not stationary according to the nonrobust ADF test. This result seems to suggest that the failure to reject the null of a unit root in nominal interest rates may be due to the use of estimation and hypothesis testing procedures that do not consider the absence of Gaussianity in the data. Our results validate practical restrictions on the behavior of the nominal interest rate imposed by CCAPM, optimal monetary policy and option pricing models.
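The Gaussian special case the paper nests, the Dickey-Fuller regression estimated by OLS, can be sketched as follows on simulated data; the paper's contribution is to replace the OLS step with a partially adaptive estimator, which is not shown here:

```python
import numpy as np

def df_tstat(y):
    """t-statistic on rho in the Dickey-Fuller regression
    Delta y_t = c + rho * y_{t-1} + e_t  (rho = 0 under a unit root)."""
    dy, ylag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones(len(ylag)), ylag])
    b = np.linalg.lstsq(X, dy, rcond=None)[0]
    e = dy - X @ b
    s2 = e @ e / (len(dy) - 2)                   # residual variance
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return b[1] / se

rng = np.random.default_rng(4)
walk = np.cumsum(rng.normal(size=500))           # unit-root process
ar = np.zeros(500)
for t in range(1, 500):
    ar[t] = 0.8 * ar[t - 1] + rng.normal()       # stationary AR(1)

t_walk, t_ar = df_tstat(walk), df_tstat(ar)      # compare to DF critical values
```

Under Gaussian innovations the statistic follows the DF distribution; the paper shows that with the adaptive estimator and heavy-tailed innovations the limit becomes a normal/DF mixture, which is what buys the extra power.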

Relevance: 30.00%

Abstract:

I examine the effects of uncertainty about the timing of deals (i.e. temporary price cuts or sales) on consumer behavior in a dynamic inventory model of consumer choice. I derive implications for purchase behavior and test them empirically, using two years of scanner data for soft drinks. I find that loyal consumers' decisions, both about the allocation of their purchases over time and the quantity to be purchased in a particular deal, are affected by the uncertainty about the timing of the deal for the product. Loyal consumers buy a higher fraction of their overall purchases during deals as the uncertainty decreases. This effect increases with an increase in the product's share of a given consumer's purchases in the same category, or if the consumer stockpiles (i.e., is a shopper). During a particular deal, loyal shoppers increase the quantity they purchase the more time that has passed since the previous deal, and the higher the uncertainty about the deals' timing. For non-loyal consumers these effects are not significant. These results hold for products that are frequently purchased, like soft drinks and yogurt, but do not hold for less frequently purchased products, such as laundry detergents. The findings suggest that manufacturers and retailers should incorporate the effects of deals' timing on consumers' purchase decisions when deriving optimal pricing strategies.

Relevance: 30.00%

Abstract:

This paper evaluates how information asymmetry affects the strength of competition in credit markets. A theory is presented in which adverse selection softens competition by decreasing the incentives creditors have for competing in the interest rate dimension. In equilibrium, although creditors compete, the outcome is similar to collusion. Three empirical implications arise. First, interest rates should respond asymmetrically to changes in the cost of funds: increases in the cost of funds should, on average, have a larger effect on interest rates than decreases. Second, aggressiveness in pricing should be associated with a worsening in bank-level default rates. Third, bank-level default rates should be endogenous. We then verify the validity of these three empirical implications using Brazilian data on consumer overdraft loans. The results in this paper rationalize seemingly abnormally high interest rates on unsecured loans.
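The first implication, asymmetric pass-through, is commonly tested by splitting cost changes into their positive and negative parts and regressing rate changes on both. A simulated sketch (the magnitudes and pass-through coefficients are hypothetical, not the paper's overdraft estimates):

```python
import numpy as np

# Simulated monthly changes in the cost of funds and the loan rate,
# built with the asymmetry the theory predicts: cost increases
# (0.9) pass through more strongly than decreases (0.4).
rng = np.random.default_rng(5)
T = 300
d_cost = rng.normal(0.0, 0.1, T)
up = np.maximum(d_cost, 0.0)                     # positive cost changes
down = np.minimum(d_cost, 0.0)                   # negative cost changes
d_rate = 0.9 * up + 0.4 * down + rng.normal(0.0, 0.02, T)

# Regress rate changes on the two parts separately; asymmetric
# pass-through shows up as beta_up > beta_down.
X = np.column_stack([np.ones(T), up, down])
_, beta_up, beta_down = np.linalg.lstsq(X, d_rate, rcond=None)[0]
```

A Wald or t-test on beta_up = beta_down then gives the formal check of symmetry.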

Relevance: 30.00%

Abstract:

The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice empirically determines this quantity and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution, and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are indirectly obtained through expert quantile elicitation. Posterior inference is available through Markov Chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. Those scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, a financial market index that presents many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
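The "current practice" the paper improves on, fixing a threshold and fitting a GPD to the exceedances only, can be sketched as follows. The data are simulated heavy-tailed draws, not the Nasdaq 100 series, and the 95th-percentile threshold is an arbitrary illustrative choice:

```python
import numpy as np
from scipy.stats import genpareto, t as student_t

# Heavy-tailed simulated "returns" (Student-t with 4 df, so the true
# GPD tail index is 1/4).
rng = np.random.default_rng(6)
x = student_t.rvs(df=4, size=5000, random_state=rng)

# Fix the threshold empirically (here the 95th percentile) and fit a
# GPD to the exceedances, discarding everything below the threshold.
u = np.quantile(x, 0.95)
exceed = x[x > u] - u
xi, _, beta = genpareto.fit(exceed, floc=0)      # shape (tail index), scale
```

The paper's mixture model instead treats u as an unknown parameter, keeps the sub-threshold observations in the likelihood through a parametric center, and samples everything by MCMC.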

Relevance: 30.00%

Abstract:

A new paradigm is modeling the World: evolutionary innovations on all fronts, new information technologies, huge mobility of capital, use of risky financial tools, globalization of production, new emerging powers and the impact of consumer concerns on governmental policies. These phenomena are shaping the World and forcing the advent of a new World Order in the Multilateral Monetary, Financial, and Trading System. The effects of this new paradigm are also transforming global governance. The political and economic orders established after the World War and centered on the multilateral model of the UN, IMF, World Bank, and GATT, led by the developed countries, are facing significant challenges. The rise of China and the emerging countries shifted the old model to a polycentric World, where the governance of these organizations is threatened by emerging countries demanding greater participation in the rule-making and decision boards of these international bodies. As a consequence, multilateralism is being confronted by polycentrism. Negotiations for a more representative voting process and the pressure for new rules to cope with the new demands are paralyzing important decisions. This scenario is seriously affecting not only the Monetary and Financial Systems but also the Multilateral Trading System. International trade is facing some significant challenges: a serious deadlock in concluding the last round of multilateral negotiation at the WTO, the fragmentation of trade rules through the multiplication of preferential and mega agreements, the arrival of a new model of global production and trade led by global value chains that is threatening the old trade order, and the imposition of new sets of regulations by private bodies commanded by transnationals to support global value chains and by non-governmental organizations to reflect the concerns of consumers in the North, based on their precautionary attitude about the sustainability of products made around the World.
The lack of any multilateral order in this new regulation is creating a big cacophony of rules and developing a new regulatory war of the Global North against the Global South. The objective of this paper is to explore how these challenges are affecting the Trading System and how it can evolve to manage these new trends.

Relevance: 30.00%

Abstract:

This doctoral thesis is about global brands under several perspectives, starting with an overview of the matter, followed by a "step ahead" in the conceptualization of brand equity and brand value. As the global marketplace dynamically expands, there are theoretical and empirical challenges concerning global brands that call for more branding research, trying to tune and contextualize meanings and attributes. Thereafter, the thesis provides a discussion of the industry and country-of-origin effects (and their interactions) on brand value and firm market value. Finally, the thesis offers an interesting comparison of practitioners' perspectives on the dimensions of global brands, brand equity and brand value, branding and marketing, including highlights on the brand internationalization process. The thesis offers a general approach to the extant literature in the first chapter, and a specific literature review for each of the other chapters.

Relevance: 30.00%

Abstract:

The objective of this work was to evaluate fecal output by means of an internal marker, indigestible neutral detergent fiber (iNDF), and external markers, chromium complexed with ethylenediaminetetraacetic acid (Cr-EDTA) and ytterbium chloride (YbCl3), and also to estimate the duodenal flow of dry matter and the apparent total, ruminal, and post-ruminal digestibility coefficients of different nutrients. Eight crossbred Holstein/Zebu heifers were used, distributed in a double 4 x 4 Latin square. The markers Cr-EDTA, YbCl3, and iNDF did not estimate fecal output efficiently (p < 0.05), yielding 1.64, 1.71, and 2.71 kg day-1, respectively, compared with total fecal collection, which yielded 1.39 kg day-1. The estimated dry-matter flow values, for both the single-marker and the double-marker methodology, can be considered biologically acceptable. However, the value obtained by the Cr-EDTA/YbCl3 association, used as a double marker, was the most reliable, owing to the better recovery of the external markers (Cr-EDTA and YbCl3), which averaged 89% and 85%, respectively, compared with the internal marker (iNDF), which averaged 67%. The ruminal and post-ruminal digestibility coefficients estimated by the Cr-EDTA/YbCl3 association were considered better, as a consequence of the dry-matter flow value estimated by this association.