980 results for Stochastic Frontier Models
Abstract:
This comment corrects errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously reported results do not maximize the log-likelihood function. At the global maximum, more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take both positive and negative values, which implies that the participation probability estimates may fall outside the interval [0,1]. We solve the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
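The boundedness problem described above can be illustrated with a small sketch (the data and kernels below are hypothetical, not those used in Martins (2001)): a kernel-weighted average of a 0/1 participation indicator is guaranteed to stay inside [0,1] only if all kernel weights are non-negative.

```python
import numpy as np

# Minimal sketch (hypothetical data): kernel-weighted estimate of the
# participation probability P(y = 1 | x).  With a kernel that takes negative
# values (here a fourth-order Gaussian kernel), the weighted average of 0/1
# outcomes need not stay inside [0, 1]; with a non-negative kernel it always does.

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = (x + rng.normal(size=200) > 0).astype(float)   # binary participation indicator

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def fourth_order_kernel(u):
    # Bias-reducing higher-order kernel; it is negative for |u| > sqrt(3).
    return 0.5 * (3.0 - u**2) * gaussian_kernel(u)

def participation_probability(x0, kernel, h=0.3):
    w = kernel((x - x0) / h)
    return np.sum(w * y) / np.sum(w)

for x0 in (-2.5, -1.0, 0.0, 1.0, 2.5):
    p_nonneg = participation_probability(x0, gaussian_kernel)
    p_higher = participation_probability(x0, fourth_order_kernel)
    print(f"x0 = {x0:+.1f}   Gaussian kernel: {p_nonneg:.3f}   4th-order kernel: {p_higher:.3f}")
```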
Abstract:
This paper evaluates the forecasting performance of a continuous-time stochastic volatility model with two volatility factors (SV2F) and compares it to those of GARCH and ARFIMA models. The empirical results show that the volatility forecasting ability of the SV2F model is better than that of the GARCH and ARFIMA models, especially when volatility appears to change pattern. Ex-post volatility is proxied by realized volatility computed from intraday data, and the SV2F forecasts are obtained using the reprojection technique proposed by Gallant and Tauchen (1998).
Abstract:
From the classical gold standard up to the current ERM2 arrangement of the European Union, target zones have been a widely used exchange-rate regime in contemporary history. This paper presents a benchmark model that rationalizes the choice of target zones over the alternative regimes: the fixed rate, the free float and the managed float. It is shown that the monetary authority may gain efficiency by reducing the volatility of both the exchange rate and the interest rate at the same time. Furthermore, the model is consistent with some known stylized facts in the empirical literature that previous models were not able to produce, namely, the positive relation between the exchange rate and the interest rate differential, the degree of non-linearity of the function linking the exchange rate to fundamentals, and the shape of the exchange rate's stochastic distribution.
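For context, the canonical target-zone relation this literature builds on links the log exchange rate to fundamentals and the expected rate of depreciation; the paper's own specification need not take exactly this form.

```latex
% Canonical target-zone relation (Krugman, 1991), shown for context only;
% the paper's own model may differ.
\[
  s_t \;=\; f_t \;+\; \alpha\, \frac{\mathbb{E}_t\!\left[\mathrm{d}s_t\right]}{\mathrm{d}t},
  \qquad \alpha > 0,
\]
% where $s_t$ is the log exchange rate and $f_t$ the fundamentals.  Inside a
% credible band the solution $s = S(f)$ is a non-linear, S-shaped function of
% fundamentals, the kind of non-linearity referred to in the abstract.
```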
Abstract:
This paper investigates the role of variable capacity utilization as a source of asymmetries in the relationship between monetary policy and economic activity within a dynamic stochastic general equilibrium framework. The source of the asymmetry is directly linked to the bottlenecks and stock-outs that emerge from the existence of capacity constraints on the real side of the economy. Money has real effects due to the presence of rigidities in households' portfolio decisions in the form of a Lucas-Fuerst 'limited participation' constraint. The model features variable capacity utilization rates across firms due to demand uncertainty. A monopolistically competitive structure provides additional effects through optimal mark-up changes. The overall message of this paper for monetary policy is that the same actions may have different effects depending on the capacity utilization rate of the economy.
Abstract:
In this paper, a new class of generalized backward doubly stochastic differential equations is investigated. This class involves an integral with respect to an adapted continuous increasing process. A probabilistic representation for viscosity solutions of semi-linear stochastic partial differential equations with a Neumann boundary condition is given.
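For context, generalized backward doubly stochastic differential equations in this literature typically take roughly the following form; the notation is ours and the paper's exact class may differ.

```latex
% A generic generalized BDSDE (illustrative notation; the paper's exact class may differ):
\[
  Y_t \;=\; \xi \;+\; \int_t^T f\!\left(s, Y_s, Z_s\right)\mathrm{d}s
  \;+\; \int_t^T \phi\!\left(s, Y_s\right)\mathrm{d}A_s
  \;+\; \int_t^T g\!\left(s, Y_s, Z_s\right)\mathrm{d}\overleftarrow{B}_s
  \;-\; \int_t^T Z_s\,\mathrm{d}W_s,
\]
% where $A$ is the adapted continuous increasing process mentioned in the
% abstract, the integral with respect to $B$ is a backward It\^o integral, and
% $W$ is an independent Brownian motion.  The $\mathrm{d}A_s$ term is what
% produces the Neumann boundary condition in the associated SPDE.
```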
Abstract:
Expectations are central to behaviour. Despite the existence of subjective expectations data, the standard approach is to ignore them, to posit a model of behaviour, and to infer expectations from realisations. In the context of income models, we reveal the informational gain obtained from using both a canonical model and subjective expectations data. We propose a test for this informational gain, and illustrate our approach with an application to the problem of measuring income risk.
Abstract:
In this paper we study a one-dimensional reflected backward stochastic differential equation in which the noise is driven by a Brownian motion and an independent Poisson point process, and the solution is forced to stay above a right-continuous obstacle with left limits. We prove existence and uniqueness of the solution by using a penalization method combined with a monotonic limit theorem.
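For context, a standard penalization scheme in this literature approximates the reflected solution by solutions of unconstrained BSDEs with an increasingly strong penalty below the obstacle; the notation and exact setup below are illustrative and may differ from the paper's.

```latex
% A standard penalization scheme for a reflected BSDE with jumps (shown for context):
\[
  Y^n_t \;=\; \xi \;+\; \int_t^T f\!\left(s, Y^n_s, Z^n_s, U^n_s\right)\mathrm{d}s
  \;+\; n \int_t^T \left(Y^n_s - L_s\right)^{-}\mathrm{d}s
  \;-\; \int_t^T Z^n_s\,\mathrm{d}B_s
  \;-\; \int_t^T\!\!\int_E U^n_s(e)\,\tilde{\mu}(\mathrm{d}s,\mathrm{d}e),
\]
% where $L$ is the obstacle, $B$ the Brownian motion, $\tilde{\mu}$ the
% compensated Poisson random measure, and
% $K^n_t = n \int_0^t (Y^n_s - L_s)^- \,\mathrm{d}s$ the increasing process.
% A monotone limit argument in $n$ then yields the reflected solution $(Y,Z,U,K)$.
```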
Abstract:
Study carried out during a research stay in Rome between 7 January and 28 February 2006. It examines the influence of Byzantine and Eastern productions on the Iberian Peninsula during the Visigothic period and beyond, even justifying a chronology of the 8th to 10th centuries AD for many of the capitals traditionally labelled Mozarabic in the north-west of the peninsula. It also outlines an avenue for research into possible Lombard influences on the Iberian Peninsula, and discusses the relationships between the capitals of the north-east of the peninsula and those of Gaul.
Abstract:
Study carried out during a research stay at the Institut National de Recherche Scientifique in Montreal, between 1 September and 30 December 2005. It analyses the organizational model of the Montreal (Canada) metropolitan area following the reform carried out between 2000 and 2002, as well as the causes that led to its adoption.
Abstract:
Transcript of the talk given by Mr Gabriel Colomé in the University Course on Olympism organized by the Centre d'Estudis Olímpics (CEO-UAB) in February 1992. With this text the author pursues two main objectives: on the one hand, to analyse the influence of the socio-political environment on the organizational structure of the Organizing Committee of the Games; on the other, to see how the type of financing affects the structure and infrastructure of the Games themselves, and what differences exist between the 1972 Games and the subsequent editions up to Barcelona.
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
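A minimal sketch of the idea in Python (the model, moment condition, and tuning choices below are illustrative assumptions, not the paper's implementation): simulate a long path at a trial parameter value, estimate the conditional moment by kernel regression of the simulated outcome on the conditioning variable, evaluate it at the observed conditioning points, and choose the parameter that drives the implied moment condition toward zero.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical dynamic latent-variable model used only for illustration:
# y_t = theta * x_t + exp(h_t / 2) * eps_t, with h_t an unobserved AR(1)
# log-volatility.  E_theta[y | x] is recovered by kernel regression on a
# long simulation instead of being computed analytically.

def simulate(theta, n, seed):
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)
    h = np.zeros(n)
    for t in range(1, n):
        h[t] = 0.9 * h[t - 1] + 0.3 * rng.normal()
    y = theta * x + np.exp(h / 2.0) * rng.normal(size=n)
    return x, y

def kernel_conditional_mean(x_sim, y_sim, x_eval, bandwidth=0.2):
    # Nadaraya-Watson estimate of E[y | x] built from the long simulation,
    # evaluated at the observed conditioning points x_eval.
    u = (x_eval[:, None] - x_sim[None, :]) / bandwidth
    w = np.exp(-0.5 * u**2)
    return (w @ y_sim) / w.sum(axis=1)

# "Observed" sample, generated at a true value theta = 1.5 for the example.
x_obs, y_obs = simulate(1.5, n=500, seed=1)

def objective(params):
    theta = params[0]
    # Long simulation at the trial parameter; a fixed seed keeps the objective smooth.
    x_sim, y_sim = simulate(theta, n=10_000, seed=2)
    m = kernel_conditional_mean(x_sim, y_sim, x_obs)   # E_theta[y | x = x_obs]
    g = (y_obs - m) * x_obs                            # instrumented conditional moment
    return np.mean(g) ** 2

result = minimize(objective, x0=np.array([0.5]), method="Nelder-Mead")
print("estimated theta:", result.x[0])
```

Because the conditional moment is read off a kernel regression of the simulated data, the simulation never needs to be run conditional on the observed conditioning information, which is what makes the approach applicable to general dynamic latent variable models.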
Abstract:
In the literature on risk, one generally assumes that uncertainty is uniformly distributed over the entire working horizon, when the absolute risk-aversion index is negative and constant. From this perspective, risk is totally exogenous, and thus independent of endogenous risks. The classic procedure is "myopic" with regard to potential changes in the future behavior of the agent due to inherent random fluctuations of the system; the agent's attitude to risk is rigid. Although often criticized, the most widely used hypothesis for the analysis of economic behavior is risk-neutrality. This borderline case must be treated with prudence in a dynamic stochastic context. The traditional measures of risk-aversion are generally too weak for making comparisons between risky situations, given the dynamic complexity of the environment. This can be highlighted in concrete problems in finance and insurance, a context in which the Arrow-Pratt measures (in the small) give ambiguous results.
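For reference, the Arrow-Pratt coefficients of absolute and relative risk aversion mentioned above are defined, for a utility function u and wealth w, as follows.

```latex
% Arrow-Pratt coefficients of absolute and relative risk aversion (standard definitions).
\[
  A(w) \;=\; -\frac{u''(w)}{u'(w)}, \qquad
  R(w) \;=\; -\,w\,\frac{u''(w)}{u'(w)} .
\]
% Constant absolute risk aversion means $A(w)$ does not depend on $w$
% (e.g. $u(w) = -e^{-aw}$), and risk-neutrality corresponds to $A(w) = 0$.
```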
Abstract:
There is recent interest in generalizing classical factor models, in which the idiosyncratic factors are assumed to be orthogonal and identification restrictions are imposed on the cross-sectional and time dimensions. In this study, we describe and implement a Bayesian approach to generalized factor models. A flexible framework is developed to determine the variation attributed to common and idiosyncratic factors. We also propose a methodology to select the (generalized) factor model that best fits a given set of data. Applying the proposed methodology to simulated data and to foreign exchange rate data, we provide a comparative analysis of the classical and generalized factor models. We find that moving from the classical to the generalized specification produces significant changes in the estimated covariance and correlation structures, while the changes in the estimated factor loadings and in the variation attributed to common factors are less dramatic.
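Schematically, the distinction between the classical and generalized specifications can be written as follows (the notation is ours, not necessarily the paper's).

```latex
% Classical factor model: orthogonal (mutually uncorrelated) idiosyncratic terms.
\[
  y_t \;=\; \Lambda f_t + \varepsilon_t, \qquad
  \operatorname{Cov}(\varepsilon_t) = \Psi \ \text{diagonal}.
\]
% Generalized factor model: the restriction on the idiosyncratic terms is
% relaxed, so $\Psi$ need not be diagonal and $\varepsilon_t$ may be dependent
% across series and over time; the Bayesian approach then apportions total
% variation between the common factors $f_t$ and the idiosyncratic component.
```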
Abstract:
We present a standard model in which the optimal tax reform is to cut labor taxes and leave capital taxes very high in the short and medium run. Only in the very long run would capital taxes be zero. Our model is a version of Chamley's, with heterogeneous agents, no lump-sum transfers, an upper bound on capital taxes, and a focus on Pareto-improving plans. For our calibration, labor taxes should be low for the first ten to twenty years, while capital taxes should be at their maximum. This policy ensures that all agents benefit from the tax reform and that capital grows quickly after the reform begins. The long-run optimal tax mix is therefore the opposite of the short- and medium-run tax mix. The initial labor tax cut is financed by deficits that lead to a positive long-run level of government debt, reversing the standard prediction that the government accumulates savings in models with optimal capital taxes. If labor supply is somewhat elastic, the benefits from the tax reform are high and can be shifted entirely to capitalists or workers by varying the length of the transition. With inelastic labor supply, the equilibrium frontier has an increasing part, which means that the scope for benefiting the workers is limited and the total benefits from reforming taxes are much lower.