980 results for Stochastic Frontier Models
Abstract:
This paper investigates the role of institutions in determining per capita income levels and growth. It contributes to the empirical literature by using different variables as proxies for institutions and by developing a deeper analysis of the issues arising from the use of weak instruments, and of too many instruments, in per capita income and growth regressions. The cross-section estimation suggests that institutions seem to matter, regardless of whether they are the only explanatory variable or are combined with geographical and integration variables, although most models suffer from the problem of weak instruments. The growth models yield some interesting results: there is mixed evidence on the role of institutions, and such evidence is more likely to be associated with law and order and with investment profile; government spending is an important policy variable; and collapsing the number of instruments results in fewer significant coefficients for institutions.
Abstract:
Isolated cytostatic lung perfusion (ILP) is an attractive technique that allows delivery of a high dose of cytostatic agents to the lungs while limiting systemic toxicity. In developing a rat model of ILP, we analysed the effect of the route of tumour cell injection on the source of tumour vessels. Pulmonary sarcomas were established by injecting a sarcoma cell suspension either by the intravenous (i.v.) route or directly into the lung parenchyma. Ink perfusion through either the pulmonary artery (PA) or the bronchial arteries (BA) was performed and the characteristics of the tumour deposits were defined. Both i.v. and direct injection induced pulmonary sarcoma nodules with similar histological features. The intraparenchymal injection of tumour cells resulted in more reliable and reproducible tumour growth and was associated with longer survival of the animals. i.v.-injected tumours developed a PA-derived vascular tree, whereas directly injected tumours developed a BA-derived vasculature.
Abstract:
The project presented here aims to define and implement a simulation model based on the coordination and assignment of emergency services in traffic accidents. The model was defined using Coloured Petri Nets and implemented with the Rockwell Arena 7.0 software. The first simulation shows a theoretical queue-based model, while the second shows a more complete and realistic model thanks to a connection, via the Corba platform, to a database containing geographic information on the fleets and routes. As a result of the study, and with the help of GoogleEarth, we can run graphical simulations to view the generated accidents, the service fleets and the movement of the vehicles from their bases to the accidents.
Abstract:
This paper compares the forecasting performance of different models which have been proposed for forecasting in the presence of structural breaks. These models differ in their treatment of the break process, the parameters defining the model which applies in each regime and the out-of-sample probability of a break occurring. In an extensive empirical evaluation involving many important macroeconomic time series, we demonstrate the presence of structural breaks and their importance for forecasting in the vast majority of cases. However, we find no single forecasting model consistently works best in the presence of structural breaks. In many cases, the formal modeling of the break process is important in achieving good forecast performance. However, there are also many cases where simple, rolling OLS forecasts perform well.
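One of the simple benchmarks the abstract mentions is the rolling OLS forecast: refit the regression on a fixed-length window of recent data and forecast one step ahead, so that observations from before a structural break eventually drop out of the estimation sample. A minimal sketch (the function name and window choice are illustrative, not the paper's own implementation):

```python
import numpy as np

def rolling_ols_forecast(y, X, window):
    """One-step-ahead forecasts from OLS refit on a rolling window.

    At each t >= window, estimate beta on the most recent `window`
    observations only, then forecast y[t] from X[t]. Old data falls
    out of the window, which is why the method copes with breaks.
    Returns an array of len(y) - window forecasts.
    """
    forecasts = []
    for t in range(window, len(y)):
        Xw, yw = X[t - window:t], y[t - window:t]
        beta, *_ = np.linalg.lstsq(Xw, yw, rcond=None)  # OLS on the window
        forecasts.append(X[t] @ beta)
    return np.array(forecasts)
```

The window length trades off efficiency (long windows) against robustness to breaks (short windows).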
Abstract:
Block factor methods offer an attractive approach to forecasting with many predictors. They extract the information in the predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. Thus, it is desirable to use a methodology which allows different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence comes in about which has forecast well in the recent past. In an empirical study forecasting output and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
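The weight-updating mechanism described above is usually implemented with a forgetting factor: each model's weight is first flattened slightly toward equal weights (discounting old evidence), then multiplied by how well that model just forecast. A minimal sketch of this recursion, assuming the predictive densities are already computed (function name and the value of `alpha` are illustrative):

```python
import numpy as np

def dma_weights(pred_densities, alpha=0.99):
    """Recursive model weights with a forgetting factor.

    pred_densities: (T, K) array; entry [t, k] is model k's one-step
    predictive density evaluated at the realised outcome at time t.
    alpha < 1 discounts past performance, so the weights drift toward
    whichever models have forecast well in the recent past.
    Returns the (T, K) path of posterior model weights.
    """
    T, K = pred_densities.shape
    w = np.full(K, 1.0 / K)           # start from equal weights
    path = np.empty((T, K))
    for t in range(T):
        w = w ** alpha
        w /= w.sum()                  # prediction step: forget old evidence
        w = w * pred_densities[t]
        w /= w.sum()                  # update step: reward recent accuracy
        path[t] = w
    return path
```

Dynamic model selection simply picks the model with the largest current weight instead of averaging.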
Abstract:
Spatial heterogeneity, spatial dependence and spatial scale constitute key features of spatial analysis of housing markets. However, the common practice of modelling spatial dependence as being generated by spatial interactions through a known spatial weights matrix is often not satisfactory. While existing estimators of spatial weights matrices are based on repeat sales or panel data, this paper takes this approach to a cross-section setting. Specifically, based on an a priori definition of housing submarkets and the assumption of a multifactor model, we develop maximum likelihood methodology to estimate hedonic models that facilitate understanding of both spatial heterogeneity and spatial interactions. The methodology, based on statistical orthogonal factor analysis, is applied to the urban housing market of Aveiro, Portugal at two different spatial scales.
Abstract:
This paper introduces a new model of trend (or underlying) inflation. In contrast to many earlier approaches, which allow for trend inflation to evolve according to a random walk, ours is a bounded model which ensures that trend inflation is constrained to lie in an interval. The bounds of this interval can either be fixed or estimated from the data. Our model also allows for a time-varying degree of persistence in the transitory component of inflation. The bounds placed on trend inflation mean that standard econometric methods for estimating linear Gaussian state space models cannot be used and we develop a posterior simulation algorithm for estimating the bounded trend inflation model. In an empirical exercise with CPI inflation we find the model to work well, yielding more sensible measures of trend inflation and forecasting better than popular alternatives such as the unobserved components stochastic volatility model.
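The key object in this abstract is a trend that moves like a random walk but is constrained to an interval. One simple way to see what such a process looks like is to redraw any innovation that would carry the trend outside the bounds; this is only an illustrative truncated-innovation simulator, not the paper's posterior simulation algorithm, and all names and parameter values are assumptions:

```python
import numpy as np

def simulate_bounded_trend(T, lo, hi, sigma, tau0, rng=None):
    """Simulate a random-walk trend constrained to [lo, hi].

    Innovations that would push the trend outside the interval are
    redrawn, so the path behaves like a random walk in the interior
    but never crosses the bounds.
    """
    rng = rng or np.random.default_rng()
    tau = np.empty(T)
    prev = tau0
    for t in range(T):
        step = rng.normal(0.0, sigma)
        while not (lo <= prev + step <= hi):
            step = rng.normal(0.0, sigma)  # redraw until inside the bounds
        prev += step
        tau[t] = prev
    return tau
```

The bounding is what rules out standard linear Gaussian state-space estimation: the truncation makes the transition density non-Gaussian near the edges of the interval.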
Abstract:
We investigated the dimensionality of the French version of the Rosenberg Self-Esteem Scale (RSES; Rosenberg, 1965) using confirmatory factor analysis, testing models with 1 or 2 factors. Results suggest the RSES is a 1-dimensional scale with 3 highly correlated items. Comparison with the Revised NEO Personality Inventory (NEO-PI-R; Costa, McCrae, & Rolland, 1998) demonstrated that Neuroticism correlated strongly, and Extraversion and Conscientiousness moderately, with the RSES. Depression accounted for 47% of the variance of the RSES. Other NEO-PI-R facets were also moderately related to self-esteem.
Abstract:
This paper examines both the in-sample and out-of-sample performance of three monetary fundamental models of exchange rates and compares their out-of-sample performance to that of a simple Random Walk model. Using a data-set consisting of five currencies at monthly frequency over the period January 1980 to December 2009 and a battery of newly developed performance measures, the paper shows that monetary models do better (in-sample and out-of-sample forecasting) than a simple Random Walk model.
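The standard yardstick for the comparison this abstract describes is the ratio of a model's out-of-sample RMSE to that of the no-change (random walk) forecast, often called Theil's U; a value below 1 means the model beats the random walk. A minimal sketch (the abstract's "battery of newly developed performance measures" is richer than this single statistic):

```python
import numpy as np

def theil_u(y_true, y_model):
    """RMSE of the model's forecasts divided by RMSE of the no-change
    (random walk) forecast y_hat_t = y_{t-1}.

    y_true must include one pre-sample value so the random walk
    forecast of the first evaluated observation is defined; y_model
    holds the model's forecasts for y_true[1:]. Values below 1 mean
    the model out-forecasts the random walk.
    """
    actual = np.asarray(y_true)[1:]
    rw_err = actual - np.asarray(y_true)[:-1]   # random walk errors
    model_err = actual - np.asarray(y_model)    # candidate model errors
    return np.sqrt(np.mean(model_err**2)) / np.sqrt(np.mean(rw_err**2))
```

For exchange rates the random walk is a famously hard benchmark, which is why ratios below 1 are noteworthy.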
Abstract:
This paper considers the lag structures of dynamic models in economics, arguing that the standard approach is too simple to capture the complexity of actual lag structures arising, for example, from production and investment decisions. It is argued that recent (1990s) developments in the theory of functional differential equations provide a means to analyse models with generalised lag structures. The stability and asymptotic stability of two growth models with generalised lag structures are analysed. The paper concludes with some speculative discussion of time-varying parameters.
Abstract:
We study the incentive to invest to improve marriage prospects, in a frictionless marriage market with non-transferable utility. Stochastic returns to investment eliminate the multiplicity of equilibria in models with deterministic returns, and a unique equilibrium exists under reasonable conditions. Equilibrium investment is efficient when the sexes are symmetric. However, when there is any asymmetry, including an unbalanced sex ratio, investments are generically excessive. For example, if there is an excess of boys, then there is parental over-investment in boys and under-investment in girls, and total investment will be excessive.
Abstract:
We present a stylized intertemporal forward-looking model that accommodates key regional economic features, an area where the literature is not well developed. The main difference from standard applications is the role of saving and its implication for the balance of payments. Though maintaining dynamic forward-looking behaviour for agents, the rate of private saving is exogenously determined and so no neoclassical financial adjustment is needed. We also focus on the similarities and differences between myopic and forward-looking models, highlighting the divergences among the main adjustment equations and the resulting simulation outcomes.
Abstract:
Faced with the problem of pricing complex contingent claims, an investor seeks to make his valuations robust to model uncertainty. We construct a notion of a model-uncertainty-induced utility function and show that model uncertainty increases the investor's effective risk aversion. Using the model-uncertainty-induced utility function, we extend the "No Good Deals" methodology of Cochrane and Saá-Requejo [2000] to compute lower and upper good deal bounds in the presence of model uncertainty. We illustrate the methodology using some numerical examples.
Abstract:
We model a boundedly rational agent who suffers from limited attention. The agent considers each feasible alternative with a given (unobservable) probability, the attention parameter, and then chooses the alternative that maximises a preference relation within the set of considered alternatives. We show that this random choice rule is the only one for which the impact of removing an alternative on the choice probability of any other alternative is asymmetric and menu independent. Both the preference relation and the attention parameters are identified uniquely by stochastic choice data.
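The choice rule described above has a simple closed form: an alternative is chosen from a menu exactly when it is considered and every strictly preferred alternative in the menu is overlooked, so its choice probability is its own attention parameter times the product of one minus the attention parameters of all better options. A minimal sketch under those assumptions (names are illustrative; the residual probability, when no considered option exists, goes to a default of choosing nothing):

```python
def choice_probability(menu, preference, gamma, alternative):
    """Probability that `alternative` is chosen from `menu`.

    Each option x in the menu is considered independently with
    probability gamma[x]; the agent picks the best considered option
    under `preference` (a list ordered best-first). `alternative` wins
    iff it is considered and every strictly better option is not.
    """
    rank = {x: i for i, x in enumerate(preference)}
    p = gamma[alternative]                 # alternative must be considered
    for x in menu:
        if rank[x] < rank[alternative]:    # x strictly preferred to alternative
            p *= 1.0 - gamma[x]            # ...and must be overlooked
    return p
```

For a menu {a, b} with a preferred to b, gamma["a"] = 0.5 and gamma["b"] = 0.8, this gives P(a) = 0.5 and P(b) = 0.8 × 0.5 = 0.4, leaving 0.1 for the default; varying the menu shows the asymmetric, menu-independent removal effect the abstract refers to.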