999 results for Canvi exterior -- Models economètrics
Abstract:
This paper analyzes the linkages between the credibility of a target zone regime, the volatility of the exchange rate, and the width of the band within which the exchange rate is allowed to fluctuate. These three concepts should be related, since the band width induces a trade-off between credibility and volatility: narrower bands should give less scope for the exchange rate to fluctuate, but may make agents perceive a larger probability of realignment, which by itself should increase the volatility of the exchange rate. We build a model in which this trade-off is made explicit. The model is used to understand the reduction in volatility experienced by most EMS countries after their target zones were widened in August 1993. As a natural extension, the model also rationalizes the existence of non-official, implicit target zones (or fear of floating) suggested by some authors.
Abstract:
From the classical gold standard up to the current ERM2 arrangement of the European Union, target zones have been a widely used exchange-rate regime in contemporary history. This paper presents a benchmark model that rationalizes the choice of target zones over the other regimes: the fixed rate, the free float and the managed float. It is shown that the monetary authority may gain efficiency by reducing the volatility of both the exchange rate and the interest rate at the same time. Furthermore, the model is consistent with some known stylized facts in the empirical literature that previous models were not able to produce, namely the positive relation between the exchange rate and the interest rate differential, the degree of non-linearity of the function linking the exchange rate to fundamentals, and the shape of the exchange rate's stochastic distribution.
Abstract:
This paper analyzes the persistence of shocks that affect the real exchange rates of a panel of seventeen OECD developed countries during the post-Bretton Woods era. The adoption of a panel data framework allows us to distinguish two different sources of shocks, i.e. idiosyncratic and common shocks, each of which may have different persistence patterns on the real exchange rates. We first investigate the stochastic properties of the panel data set using panel stationarity tests that simultaneously consider both the presence of cross-section dependence and multiple structural breaks, features that have not received much attention in previous persistence analyses. Empirical results indicate that real exchange rates are non-stationary when the analysis does not account for structural breaks, although this conclusion is reversed when they are modeled. Consequently, misspecification errors due to the non-consideration of structural breaks lead to upward-biased measures of shock persistence. The persistence measures for the idiosyncratic and common shocks estimated in this paper always turn out to be less than one year.
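The half-life of a shock is the standard way such persistence measures are reported. As a minimal, self-contained sketch (not the paper's procedure: the AR(1) coefficient, sample size and seed below are hypothetical, and the paper's actual tests additionally handle cross-section dependence and structural breaks), persistence can be estimated by OLS on an AR(1) and converted into a half-life:

```python
import math
import random
import statistics

random.seed(3)

# Simulate an AR(1) series standing in for real-exchange-rate deviations;
# rho_true, the sample size and the seed are all hypothetical choices.
rho_true = 0.9
q = [0.0]
for _ in range(500):
    q.append(rho_true * q[-1] + random.gauss(0.0, 1.0))

# OLS estimate of rho: cov(q_t, q_{t-1}) / var(q_{t-1})
lagged, current = q[:-1], q[1:]
m_l, m_c = statistics.fmean(lagged), statistics.fmean(current)
cov = sum((l - m_l) * (c - m_c) for l, c in zip(lagged, current)) / len(lagged)
rho_hat = cov / statistics.pvariance(lagged)

# Half-life of a shock (in periods): horizon at which half of the shock has died out
half_life = math.log(0.5) / math.log(rho_hat)
```

With monthly data, a half-life below twelve periods corresponds to the paper's finding of persistence under one year.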
Abstract:
This paper provides empirical evidence that continuous-time models with one volatility factor are able, under some conditions, to fit the main characteristics of financial data. It also reports the importance of the feedback factor in capturing the strong volatility clustering of the data, caused by a possible change in the pattern of volatility in the last part of the sample. We use the Efficient Method of Moments (EMM) of Gallant and Tauchen (1996) to estimate logarithmic models with one and two stochastic volatility factors (with and without feedback) and to select among them.
Abstract:
Based on a behavioral equilibrium exchange rate model, this paper examines the determinants of the real effective exchange rate and evaluates the degree of misalignment of a group of currencies since 1980. Within a panel cointegration setting, we estimate the relationship between the exchange rate and a set of economic fundamentals, such as traded-nontraded productivity differentials and the stock of foreign assets. Having ascertained that the variables are integrated and cointegrated, the long-run equilibrium values of the fundamentals are estimated and used to derive equilibrium exchange rates and misalignments. Although there is statistical homogeneity, some structural differences were found to exist between advanced and emerging economies.
Abstract:
This paper presents an analysis of motor vehicle insurance claims relating to vehicle damage and to associated medical expenses. We use univariate severity distributions estimated with parametric and non-parametric methods. The methods are implemented using the statistical package R. The parametric analysis is limited to the estimation of normal and lognormal distributions for each of the two claim types. The non-parametric analysis presented involves kernel density estimation. We illustrate the benefits of applying transformations to the data prior to employing kernel-based methods, using a log-transformation and an optimal transformation, chosen from a class of transformations, that produces symmetry in the data. The central aim of this paper is to provide educators with material that can be used in the classroom to teach statistical estimation methods, goodness-of-fit analysis and, importantly, statistical computing in the context of insurance and risk management. To this end, we have included in the Appendix all the R code used in the analysis so that readers, both students and educators, can fully explore the techniques described.
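The two estimation steps described above can be sketched in a few lines. The paper's own code is in R and is reproduced in its Appendix; the Python version below is only an illustration, and the simulated claim sample and its parameters are hypothetical:

```python
import math
import random
import statistics

random.seed(42)

# Hypothetical claim-severity sample (lognormal-like), standing in for real claim data
claims = [random.lognormvariate(8.0, 1.2) for _ in range(500)]

# Parametric step: the lognormal MLE is just the mean and sd of the log-claims
logs = [math.log(c) for c in claims]
mu_hat = statistics.fmean(logs)
sigma_hat = statistics.pstdev(logs)

# Nonparametric step: Gaussian kernel density on the log scale, back-transformed
# to the original scale via the change of variables f_X(x) = f_Y(log x) / x
def kde_log(x, data_logs, bw):
    y = math.log(x)
    total = sum(math.exp(-0.5 * ((y - d) / bw) ** 2) for d in data_logs)
    return total / (len(data_logs) * bw * math.sqrt(2.0 * math.pi) * x)

# Silverman rule-of-thumb bandwidth on the log scale
bw = 1.06 * sigma_hat * len(logs) ** (-1 / 5)
density_at_median = kde_log(statistics.median(claims), logs, bw)
```

Estimating the kernel density on the log scale and transforming back is one way to realize the benefit the authors describe for heavily skewed severity data.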
Abstract:
In this paper we analyze the role of fiscal policy in fostering a higher participation of the different production factors in the human capital production sector in the long run. By introducing a tax on physical capital and differentiating between a tax on raw labor wages and a tax on skills or human capital, we also attempt to present a way to influence inequality as measured by the skill premium, thus relating the increase in human capital to the decrease in income inequality. We do so in the context of a non-scale growth model. The model is capable of altering the shares of private factors devoted to each of the two production sectors, final output and human capital, and of affecting inequality differently according to the different tax changes. The simulation results derived in the paper show how a human capital (skills) tax cut, which could be interpreted as a reduction in progressivity, ends up increasing both the shares of labor and physical capital devoted to the production of knowledge and decreasing inequality. Moreover, a raw labor wage tax decrease, which could also be interpreted as an increase in the progressivity of the system, increases the share of labor devoted to the production of final output and increases inequality. Finally, a physical capital tax decrease reduces the share of physical capital devoted to the production of knowledge and allows for a lower inequality value. Nevertheless, none of the various types of taxes ends up changing the share of human capital in knowledge production, which will deserve our future attention.
Abstract:
Several unit root tests for panel data have recently been proposed. The test developed by Harris and Tzavalis (1999, Journal of Econometrics) performs particularly well when the time dimension is moderate in relation to the cross-section dimension. However, in common with the traditional tests designed for the unidimensional case, it was found to perform poorly when there is a structural break in the time series under the alternative. Here we derive the asymptotic distribution of the test allowing for a shift in the mean, and assess its small-sample performance. We apply this new test to show how the hypothesis of (perfect) hysteresis in Spanish unemployment is rejected in favour of the alternative of a natural unemployment rate, once the possibility of a change in the latter is considered.
Abstract:
This article estimates behavioral models of German and British tourist demand, with an emphasis on working with the series of the Balearic hotel price deflator, thereby avoiding estimation with the declared prices of tourist "packages", which have the drawback of not capturing real last-minute discounts, which are especially important in the case of the British market.
Application of DEA to the analysis of profits in a forward vertically integrated system
Abstract:
This paper designs three DEA models based on a production system whose components are arranged in series and integrated vertically forward. The first model seeks to optimize the profits of the aggregate system, as well as to improve them in each of the subsystems. The second model, in addition to the previous objective, includes transfer constraints for the specific resources associated with each subsystem, and the third model estimates the interval of variation for the transfer prices of the intermediate inputs between the two subsystems. The models have been programmed and simulated in the GAMS software using data generated by a Cobb-Douglas production function for the intermediate inputs and the final outputs.
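As a sketch of the data-generating step only (the paper's models themselves are programmed in GAMS; the technology parameters, noise range and stage structure below are assumptions for illustration), two Cobb-Douglas stages arranged in series can produce intermediate and final outputs like so:

```python
import random

random.seed(7)

# Assumed Cobb-Douglas technology parameters (purely illustrative)
A, alpha, beta = 1.0, 0.4, 0.6

def cobb_douglas(x1, x2):
    return A * x1 ** alpha * x2 ** beta

# Two subsystems in series: the first turns primary inputs into an intermediate
# input, which the forward-integrated second subsystem turns into final output.
units = []
for _ in range(20):
    labor = random.uniform(5.0, 15.0)
    capital = random.uniform(5.0, 15.0)
    intermediate = cobb_douglas(labor, capital) * random.uniform(0.9, 1.1)
    final = cobb_douglas(intermediate, capital) * random.uniform(0.9, 1.1)
    units.append((labor, capital, intermediate, final))
```

Each simulated unit then serves as one decision-making unit for the DEA models.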
Abstract:
Constructing a performance index for the evaluation of investment portfolios rests on the correct definition of the risk measure to be used. The aim of this paper is to propose a performance measure suited to the evaluation of guaranteed investment fund portfolios. The particular features of this type of fund make it necessary to define a measure that explains the specific risk characteristics of such portfolios. Starting from the portfolio insurance strategy, a new risk measure based on downside risk is defined. We propose as a measure of downside risk that part of the total risk of a securities portfolio which is eliminated by the portfolio insurance strategy. Conversely, we propose as a measure of upside risk that other part of the total risk of the portfolio which does not disappear with the portfolio insurance strategy. Thus, the sum of the upside risk and the downside risk is the total risk. Starting from the upside risk measure and the CAPM asset pricing model, a specific performance measure is proposed for evaluating guaranteed investment funds.
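The decomposition described above (downside risk as the part of total risk removed by portfolio insurance, upside risk as the part that remains) can be illustrated numerically. One simple way to operationalize it, with simulated returns and an assumed guarantee floor, neither of which comes from the paper:

```python
import random
import statistics

random.seed(1)

# Hypothetical portfolio returns; the guarantee floor is an assumption for illustration
returns = [random.gauss(0.05, 0.10) for _ in range(10_000)]
floor = 0.0  # guaranteed minimum return

# The insured payoff truncates every return below the guarantee at the floor
insured = [max(r, floor) for r in returns]

total_risk = statistics.pvariance(returns)
upside_risk = statistics.pvariance(insured)   # risk that survives the insurance
downside_risk = total_risk - upside_risk      # risk removed by the insurance
```

By construction, upside risk plus downside risk equals total risk, matching the identity stated in the abstract.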
Abstract:
In this paper we evaluate the usefulness of a measure of efficiency in generating sales for predicting future operating income, under the hypothesis that if the efficiency measure is able to capture the permanent component of income, it should be useful for predicting future income in addition to current income. To that end, in a first stage we use Data Envelopment Analysis (DEA) to determine the relative inefficiency of firms in using the resources at their disposal to generate the maximum possible level of sales. The inputs incorporated in the DEA model (personnel expenses, consumption of raw materials and other supplies, depreciation, and other operating expenses) are obtained from information contained in the profit and loss account. In the second stage, the inefficiency measure is introduced as an explanatory variable in a regression model in which the dependent variable is operating income in the immediately following year. The results of the empirical study indicate that the relative inefficiency measure provided by the DEA model has informative content for predicting future operating income, in addition to current and past operating income.
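In the special case of a single input and a single output, the relative-efficiency idea behind the DEA first stage reduces to comparing each firm's sales-to-expense ratio with the best observed ratio. A toy sketch with hypothetical figures (the paper's model uses four separate inputs and a full DEA formulation):

```python
# Hypothetical figures for four firms: one aggregate expense input, sales as output
expenses = [10.0, 20.0, 30.0, 15.0]
sales = [8.0, 22.0, 24.0, 12.0]

# With a single input and output, constant-returns DEA efficiency reduces to
# each firm's sales/expense ratio relative to the best observed ratio.
ratios = [y / x for x, y in zip(expenses, sales)]
best = max(ratios)
efficiency = [r / best for r in ratios]
inefficiency = [1.0 - e for e in efficiency]   # the second-stage regressor
```

The resulting inefficiency scores play the role of the explanatory variable in the second-stage regression on next-year operating income.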
Abstract:
The aim of this paper is to formalize, by applying option theory, the value of the shares of a leveraged firm, the limited liability of the shareholders of a public limited company, and the required return over a perpetual horizon. The model developed here starts from the one established by Fischer Black and Myron Scholes in 1973 on the value of the shares of a leveraged firm over a single-period horizon. That model has one problem: the limitation of the horizon to a single period. The model developed in this paper is based on the idea that the firm's horizon is, in principle, unlimited, and that it is common for a certain degree of leverage to be maintained throughout its life; that is, the firm's debt is considered over a perpetual horizon. As a consequence, the shareholders can declare bankruptcy at any moment, regardless of the maturity of the debt, leaving the firm in the hands of the creditors. Building on this perpetual-horizon model, the option to abandon is introduced: the shareholders compare the market resale value of the firm's assets with the value of the debt, and decide whether it is more profitable to continue the firm or to sell the assets on the market. For the valuation, the formula derived by Merton (1990) for the perpetual American put option is used, as well as barrier options. Once the limited liability of the shareholders over a perpetual horizon has been valued, the effective interest rate under risk conditions can be computed.
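The Merton valuation invoked above has a well-known closed form for the perpetual American put on a non-dividend-paying asset, which can be coded directly; this is the textbook formula, not the paper's full model, and the parameter values in the usage note are hypothetical:

```python
import math

def perpetual_put(S, K, r, sigma):
    """Closed form for a perpetual American put on a non-dividend asset.

    With gamma = 2r / sigma**2, the optimal exercise boundary is
    S_star = gamma * K / (1 + gamma); at or below it the put is exercised
    at once, above it the value is (K - S_star) * (S / S_star) ** (-gamma).
    """
    gamma = 2.0 * r / sigma ** 2
    s_star = gamma * K / (1.0 + gamma)
    if S <= s_star:
        return K - S  # immediate exercise is optimal
    return (K - s_star) * (S / s_star) ** (-gamma)
```

For example, with the hypothetical inputs S = K = 100, r = 0.05 and sigma = 0.20, gamma = 2.5, the exercise boundary is about 71.43, and the at-the-money put is worth about 12.3.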