57 results for Crossed Classification Models
Abstract:
This comment corrects errors in the estimation process that appear in Martins (2001). The first error concerns the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function; at the global maximum, more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may fall outside the interval [0,1]. We solve the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
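As a rough illustration of the point about the kernel's sign, the sketch below estimates a participation probability by kernel smoothing with a non-negative (Gaussian) kernel, so the fitted probabilities cannot leave [0,1]. The function names and the simulated data are illustrative assumptions, not the authors' code.

```python
# Minimal sketch: Nadaraya-Watson estimate of P(y = 1 | x'b) using a
# non-negative (Gaussian) kernel, so every fitted probability lies in [0, 1].
import numpy as np

def participation_prob(index, y, grid, bandwidth=0.5):
    """Kernel-smoothed estimate of P(y = 1 | index) evaluated on `grid`.
    `index` is the single index x'b and `y` is the binary outcome."""
    probs = np.empty(len(grid))
    for j, v in enumerate(grid):
        w = np.exp(-0.5 * ((index - v) / bandwidth) ** 2)  # weights are >= 0
        probs[j] = np.sum(w * y) / np.sum(w)               # weighted share of y = 1
    return probs                                           # always inside [0, 1]

# Hypothetical usage with simulated data
rng = np.random.default_rng(0)
x = rng.normal(size=(500, 2))
b = np.array([1.0, -0.5])
y = (x @ b + rng.normal(size=500) > 0).astype(float)
p_hat = participation_prob(x @ b, y, np.linspace(-3, 3, 25))
```

A kernel that takes negative values (for instance a higher-order, bias-reducing kernel) would make some of the weights negative, which is exactly how the smoothed "probabilities" can escape [0,1].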
Abstract:
This paper provides empirical evidence that, under some conditions, continuous-time models with one volatility factor are able to fit the main characteristics of financial data. It also documents the importance of the feedback factor in capturing the strong volatility clustering of the data, caused by a possible change in the pattern of volatility in the last part of the sample. We use the Efficient Method of Moments (EMM) of Gallant and Tauchen (1996) to estimate logarithmic models with one and two stochastic volatility factors (with and without feedback) and to select among them.
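The sketch below simulates a discretised one-factor logarithmic stochastic volatility model of the general kind these EMM exercises target; the discrete-time form, the parameter values, and the absence of feedback are simplifying assumptions for illustration, not the paper's specification.

```python
# Minimal sketch: discretised one-factor logarithmic stochastic volatility model.
import numpy as np

def simulate_log_sv(n, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=0):
    rng = np.random.default_rng(seed)
    h = np.full(n, mu)   # log-volatility factor
    r = np.zeros(n)      # returns
    for t in range(1, n):
        # the single volatility factor follows a persistent AR(1) in logs
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()
        # returns are conditionally Gaussian with variance exp(h_t)
        r[t] = np.exp(h[t] / 2.0) * rng.normal()
    return r, h

returns, log_vol = simulate_log_sv(2000)
```

The two-factor and feedback variants mentioned in the abstract add a second log-volatility factor or let past returns influence h_t, respectively; the paper's exact specifications may differ from this simplified sketch.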
Abstract:
Expectations are central to behaviour. Despite the existence of subjective expectations data, the standard approach is to ignore them, to posit a model of behaviour, and to infer expectations from realisations. In the context of income models, we show the informational gain obtained from using both a canonical model and subjective expectations data. We propose a test for this informational gain, and illustrate our approach with an application to the problem of measuring income risk.
Abstract:
The objective of this paper is to measure the impact of different kinds of knowledge and external economies on urban growth in an intraregional context. The main hypothesis is that knowledge leads to growth, and that this knowledge is related to the existence of agglomeration and network externalities in cities. We develop a three-stage methodology: first, we measure the amount and growth of knowledge in cities using the OECD (2003) classification and employment data; second, we identify the spatial structure of the area of analysis (networks of cities); third, we combine the Glaeser-Henderson-De Lucio models with spatial econometric specifications in order to test for the existence of spatially static (agglomeration) and spatially dynamic (network) external economies in an urban growth model. Results suggest that higher growth rates are associated with higher levels of technology and knowledge. The growth of the different kinds of knowledge is related to local and spatial factors (agglomeration and network externalities), and each knowledge intensity shows a particular response to these factors. These results have implications for policy design, since we can forecast and intervene in local knowledge development paths.
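As a toy illustration of the third stage, the regression sketch below mixes an own-city knowledge term (spatially static, agglomeration) with a spatially lagged knowledge term built from a row-standardised weight matrix W (spatially dynamic, network). All variable names, the weight matrix, and the plain OLS estimator are illustrative assumptions; the paper's spatial econometric specifications may differ.

```python
# Minimal sketch: urban growth regressed on own knowledge (agglomeration term)
# and on neighbours' knowledge through a spatial lag (network term).
import numpy as np

def spatial_growth_regression(growth, knowledge, W):
    """OLS of city growth on own knowledge and the spatial lag W @ knowledge."""
    W = W / W.sum(axis=1, keepdims=True)             # row-standardise the weights
    X = np.column_stack([np.ones_like(knowledge),    # intercept
                         knowledge,                  # spatially static effect
                         W @ knowledge])             # spatially dynamic effect
    beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
    return beta

# Illustrative data: four cities on a ring, each linked to its two neighbours
W = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
knowledge = np.array([1.0, 2.0, 3.0, 4.0])
growth = np.array([0.02, 0.03, 0.05, 0.04])
print(spatial_growth_regression(growth, knowledge, W))
```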
Abstract:
Study based on a research stay in Rome between 7 January and 28 February 2006. It examines the influence of Byzantine and Eastern productions on the Iberian Peninsula, in the Visigothic period and beyond, even justifying a chronology of the 8th to 10th centuries AD for many of the capitals traditionally described as Mozarabic in the north-west of the peninsula. In addition, it outlines a line of research into possible Lombard influences on the Iberian Peninsula. The relationships between the capitals of the north-east of the peninsula and those of Gaul are also discussed.
Abstract:
Study based on a research stay at the Institut National de Recherche Scientifique in Montreal, between 1 September and 30 December 2005. It analyses the organisational model of the Montreal (Canada) metropolitan area after the reform carried out between 2000 and 2002, as well as the reasons that led to its adoption.
Abstract:
The relationship between competition and performance-related pay has been analysed in single-principal, single-agent models. While this approach yields good predictions for managerial pay schemes, the predictions fail to apply to employees at lower tiers of a firm's hierarchy. In this paper, a principal-multi-agent model of incentive pay is developed which makes it possible to analyse the effect of changes in the competitiveness of markets on lower-tier incentive payment schemes. The results explain why the payment schemes of agents located at low and mid tiers are less sensitive to changes in competition when aggregated firm data are used. JEL classification numbers: D82, J21, L13, L22. Keywords: Cournot competition, Contract delegation, Moral hazard, Entry, Market size, Wage cost.
Abstract:
Transcript of the talk given by Mr. Gabriel Colomé in the University Course on Olympism organised by the Centre d'Estudis Olímpics (CEO-UAB) in February 1992. With this text the author pursues two main objectives: on the one hand, to analyse the influence of the socio-political environment on the organisational structure of the Organising Committee of the Games; on the other, to see how the type of financing affects the structure and infrastructure of the Games themselves, and what differences there are between the 1972 Games and the subsequent ones up to Barcelona.
Abstract:
We give sufficient conditions for existence, uniqueness and ergodicity of invariant measures for Musiela's stochastic partial differential equation with deterministic volatility and a Hilbert space valued driving Lévy noise. Conditions for the absence of arbitrage and for the existence of mild solutions are also discussed.
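For orientation, Musiela's equation for the forward curve can be written in the generic form

\[
  dr_t(x) \;=\; \Big( \tfrac{\partial}{\partial x}\, r_t(x) + \alpha_t(x) \Big)\, dt \;+\; \sigma(x)\, dL_t ,
\]

where r_t(x) is the forward rate at time t with time to maturity x, \sigma is the deterministic volatility, L is the Hilbert space valued driving Lévy process, and the drift \alpha_t(x) is pinned down by the no-arbitrage condition. This display is only a generic statement of the equation under the abstract's assumptions, not the paper's exact formulation.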
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable (DLV) models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
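A stripped-down sketch of the idea follows: the model is simulated once at a trial parameter value, the conditional moment is recovered by Nadaraya-Watson kernel regression on that simulation, and the result is turned into a method-of-moments objective. The single scalar moment, the identity weighting, and all names are illustrative assumptions rather than the paper's implementation.

```python
# Minimal sketch: kernel-smoothed conditional moments from a long simulation,
# plugged into a method-of-moments objective.
import numpy as np

def kernel_conditional_mean(x_sim, y_sim, x_eval, bandwidth=0.3):
    """Nadaraya-Watson estimate of E[y | x] built from simulated (x_sim, y_sim)."""
    out = np.empty(len(x_eval))
    for j, v in enumerate(x_eval):
        w = np.exp(-0.5 * ((x_sim - v) / bandwidth) ** 2)
        out[j] = np.sum(w * y_sim) / np.sum(w)
    return out

def moment_objective(theta, data_x, data_y, simulate, n_sim=50_000):
    """Distance between observed outcomes and the model-implied conditional mean.
    `simulate(theta, n)` must return a long simulated path (x_sim, y_sim)."""
    x_sim, y_sim = simulate(theta, n_sim)                     # long simulation at theta
    m = data_y - kernel_conditional_mean(x_sim, y_sim, data_x)
    return m.mean() ** 2                                      # identity-weighted objective
```

Minimising this objective over theta (for example with scipy.optimize.minimize) gives the kernel-based method-of-moments estimate; in practice one would use several moment conditions and an efficient weighting matrix.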
Abstract:
There is recent interest in generalizations of classical factor models, in which the idiosyncratic factors are assumed to be orthogonal and there are identification restrictions on the cross-sectional and time dimensions. In this study, we describe and implement a Bayesian approach to generalized factor models. A flexible framework is developed to determine the variation attributed to common and idiosyncratic factors. We also propose a novel methodology to select the (generalized) factor model that best fits a given set of data. Applying the proposed methodology to simulated data and to foreign exchange rate data, we provide a comparative analysis of the classical and generalized factor models. We find that, when moving from the classical to the generalized specification, there are significant changes in the estimates of the structures of the covariance and correlation matrices, while the changes in the estimates of the factor loadings and of the variation attributed to common factors are less dramatic.
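For concreteness, the quantity compared across specifications — the share of each series' variance attributed to the common factors — can be computed from the loadings and the idiosyncratic covariance as sketched below. The Bayesian estimation itself is not shown; the function and variable names, and the unit factor covariance, are assumptions for illustration.

```python
# Minimal sketch: share of each series' variance explained by common factors
# in a linear factor model x_t = Lambda f_t + eps_t with Cov(f_t) = I assumed.
import numpy as np

def common_variance_share(Lambda, Sigma_eps):
    """Lambda: (N, k) factor loadings; Sigma_eps: (N, N) idiosyncratic covariance."""
    common = np.diag(Lambda @ Lambda.T)      # variance coming from the common factors
    total = common + np.diag(Sigma_eps)      # total variance of each series
    return common / total

# Illustrative two-factor example with three series
Lambda = np.array([[0.9, 0.1], [0.5, 0.5], [0.1, 0.8]])
Sigma_eps = np.diag([0.3, 0.4, 0.2])
print(common_variance_share(Lambda, Sigma_eps))
```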
Abstract:
Empirical studies on industrial location do not typically distinguish between new and relocated establishments. This paper addresses this shortcoming using data on the frequency of these events in municipalities of the same economic-administrative region. This enables us to test not only for differences in their determinants but also for interrelations between start-ups and relocations. Estimates from count regression models for cross-section and panel data show that, although partial effects differ, common patterns arise in “institutional” and “neoclassical” explanatory factors. Also, start-ups and relocations are positively but asymmetrically related. JEL classification: C25, R30, R10. Keywords: cities, count data models, industrial location
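The sketch below shows the kind of count regression this refers to: start-up counts per municipality modelled as Poisson, with one “institutional” covariate, one “neoclassical” covariate, and the local number of relocations included to probe the interrelation between the two event types. The covariates, parameter values, and simulated data are illustrative assumptions, not the paper's dataset or specification.

```python
# Minimal sketch: Poisson count regression for location events per municipality.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
population = rng.lognormal(10, 1, n)        # "neoclassical" factor (illustrative)
tax_rate = rng.uniform(0.1, 0.4, n)         # "institutional" factor (illustrative)
relocations = rng.poisson(2, n)             # relocation events in the municipality
lam = np.exp(-3 + 0.3 * np.log(population) - 1.0 * tax_rate + 0.1 * relocations)
startups = rng.poisson(lam)                 # simulated start-up counts

X = sm.add_constant(np.column_stack([np.log(population), tax_rate, relocations]))
fit = sm.GLM(startups, X, family=sm.families.Poisson()).fit()
print(fit.params)   # coefficients act on the log of the expected count
```

A negative binomial family, or panel count estimators, would be the natural next step when the counts are overdispersed or when the panel dimension is exploited.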
Abstract:
One of the main implications of the efficient market hypothesis (EMH) is that expected future returns on financial assets are not predictable if investors are risk neutral. In this paper we argue that financial time series offer more information than this hypothesis seems to suggest. In particular, we postulate that runs of very large returns can be predictable over short time periods. To show this, we propose a TAR(3,1)-GARCH(1,1) model that is able to describe two different types of extreme events: a first type generated by large-uncertainty regimes, where runs of extremes are not predictable, and a second type where extremes come from isolated dread/joy events. This model is new in the literature on nonlinear processes. Its novelty resides in two features that make it different from previous TAR methodologies: the regimes are motivated by the occurrence of extreme values, and the threshold variable is defined by the shock affecting the process in the preceding period. In this way the model is able to uncover dependence and clustering of extremes in high- as well as low-volatility periods. The model is tested with data on General Motors stock prices covering two crises that had a substantial impact on financial markets worldwide: the Black Monday of October 1987 and September 11th, 2001. By analyzing the periods around these crises we find evidence of statistical significance of our model, and thereby of predictability of extremes, for September 11th but not for Black Monday. These findings support the hypotheses of a big negative event producing runs of negative returns in the first case, and of the burst of a worldwide stock market bubble in the second case. JEL classification: C12; C15; C22; C51 Keywords and Phrases: asymmetries, crises, extreme values, hypothesis testing, leverage effect, nonlinearities, threshold models
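The following simulation sketch shows how a three-regime threshold AR(1) with GARCH(1,1) errors can be set up, with the previous period's shock acting as the threshold variable, as described above. The thresholds, autoregressive coefficients, and GARCH parameters are arbitrary illustrative values, not the estimates obtained for the General Motors data.

```python
# Minimal sketch: TAR(3,1)-GARCH(1,1), with the regime chosen by the lagged shock.
import numpy as np

def simulate_tar_garch(n, thresholds=(-2.0, 2.0),
                       regimes=((0.0, -0.3), (0.0, 0.1), (0.0, -0.3)),
                       omega=0.05, alpha=0.05, beta=0.90, seed=0):
    rng = np.random.default_rng(seed)
    r = np.zeros(n)                               # returns
    eps = np.zeros(n)                             # shocks (lagged threshold variable)
    h = np.full(n, omega / (1.0 - alpha - beta))  # conditional variances
    for t in range(1, n):
        h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]   # GARCH(1,1)
        if eps[t - 1] < thresholds[0]:
            c, phi = regimes[0]                   # large negative shock regime
        elif eps[t - 1] > thresholds[1]:
            c, phi = regimes[2]                   # large positive shock regime
        else:
            c, phi = regimes[1]                   # central regime
        eps[t] = np.sqrt(h[t]) * rng.normal()
        r[t] = c + phi * r[t - 1] + eps[t]        # AR(1) within the selected regime
    return r

returns = simulate_tar_garch(2000)
```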
Abstract:
In this article we develop a theoretical microstructure model of coordinated central bank intervention based on asymmetric information. We study the economic implications of coordination on some measures of market quality and show that the model predicts higher volatility and more significant exchange rate changes when central banks coordinate compared to when they intervene unilaterally. Both these predictions are in line with empirical evidence. Keywords: coordinated foreign exchange intervention, market microstructure. JEL Classification: D82, E58, F31, G14
Abstract:
We analyze (non-deterministic) contests with anonymous contest success functions. There is no restriction on the number of contestants or on their valuations for the prize. We provide intuitive and easily verifiable conditions for the existence of an equilibrium with properties similar to those of the (deterministic) all-pay auction. Since these conditions are fulfilled for a wide array of situations, the predictions of this equilibrium are very robust to the specific details of the contest. An application of this result contributes to filling a gap in the analysis of the popular Tullock rent-seeking game, because it characterizes properties of an equilibrium for increasing returns to scale larger than two, for any number of contestants, and in contests with or without a common value. Keywords: (non-)deterministic contest, all-pay auction, contest success functions. JEL Classification Numbers: C72 (Noncooperative Games), D72 (Economic Models of Political Processes: Rent-Seeking, Elections), D44 (Auctions).
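For reference, the standard Tullock contest success function (given here in its textbook form; the abstract does not display it and the paper's formulation may be more general) is

\[
  p_i(x_1,\dots,x_n) \;=\;
  \begin{cases}
    \dfrac{x_i^{\,r}}{\sum_{j=1}^{n} x_j^{\,r}}, & \text{if } \sum_{j} x_j > 0, \\[1ex]
    \dfrac{1}{n}, & \text{otherwise,}
  \end{cases}
\]

where x_i is contestant i's expenditure and r is the returns-to-scale parameter; the case r > 2 is the range covered by the equilibrium characterization above.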