978 results for conditional autoregressive models
Abstract:
We analyze how unemployment, job finding and job separation rates react to neutral and investment-specific technology shocks. Neutral shocks increase unemployment and explain a substantial portion of its volatility; investment-specific shocks expand employment and hours worked and contribute to hours worked volatility. Movements in the job separation rate drive the impact response of unemployment, while movements in the job finding rate drive its adjustment path. The evidence warns against using models with exogenous separation rates and challenges the conventional way of modelling technology shocks in search and sticky price models.
Abstract:
This Article breaks new ground toward contractual and institutional innovation in models of homeownership, equity building, and mortgage enforcement. Inspired by recent developments in the affordable housing sector and other types of public financing schemes, we suggest extending institutional and financial strategies such as time- and place-based division of property rights, conditional subsidies, and credit mediation to alleviate the systemic risks of mortgage foreclosure. Two new solutions offer a broad theoretical basis for such developments in the economic and legal institution of homeownership: a for-profit shared equity scheme led by local governments alongside a private market shared equity model, one of "bootstrapping home buying with purchase options".
Abstract:
This paper describes a methodology to estimate the coefficients, to test specification hypotheses, and to conduct policy exercises in multi-country VAR models with cross-unit interdependencies, unit-specific dynamics and time variations in the coefficients. The framework of analysis is Bayesian: a prior flexibly reduces the dimensionality of the model and puts structure on the time variations; MCMC methods are used to obtain posterior distributions; and marginal likelihoods are used to check the fit of various specifications. Impulse responses and conditional forecasts are obtained from the output of the MCMC routine. The transmission of certain shocks across countries is analyzed.
Abstract:
Statistical models allow the representation of data sets and the estimation and/or prediction of the behavior of a given variable through its interaction with the other variables involved in a phenomenon. Among such models are autoregressive state-space models (ARSS) and linear regression models (LR), which allow the quantification of the relationships among soil-plant-atmosphere system variables. To compare the quality of ARSS and LR models for modeling the relationships between soybean yield and soil physical properties, this study used Akaike's Information Criterion, which provides a coefficient for selecting the best model. The data sets were sampled in a Rhodic Acrudox soil, along a spatial transect with 84 points spaced 3 m apart. At each sampling point, soybean samples were collected for yield quantification. At the same site, soil penetration resistance was also measured and soil samples were collected to measure soil bulk density in the 0-0.10 m and 0.10-0.20 m layers. Results showed autocorrelation and a cross-correlation structure between soybean yield and soil penetration resistance data. Soil bulk density data, however, were only autocorrelated in the 0-0.10 m layer and not cross-correlated with soybean yield. By Akaike's Information Criterion, the autoregressive state-space models were more efficient than the equivalent simple and multiple linear regression models: the resulting criterion values were consistently lower than those obtained by the regression models, for all combinations of explanatory variables.
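The model selection described above can be sketched in a few lines: for a Gaussian regression, Akaike's Information Criterion reduces (up to an additive constant) to n·ln(RSS/n) + 2k, and the model with the lower value is preferred. The yield and distance numbers below are invented for illustration; they are not the study's data.

```python
import math, random

def fit_ols(x, y):
    """Simple OLS slope/intercept via the closed-form formulas."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def aic(rss, n, k):
    """Gaussian-likelihood AIC up to an additive constant: n*ln(RSS/n) + 2k."""
    return n * math.log(rss / n) + 2 * k

random.seed(0)
x = [i * 3.0 for i in range(84)]                          # 84 points, 3 m apart
y = [2.5 + 0.01 * xi + random.gauss(0, 0.3) for xi in x]  # hypothetical yield

a, b = fit_ols(x, y)
rss_lr = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
my = sum(y) / len(y)
rss_null = sum((yi - my) ** 2 for yi in y)                # intercept-only model

aic_lr = aic(rss_lr, len(y), 2)       # slope + intercept
aic_null = aic(rss_null, len(y), 1)   # intercept only
print(aic_lr < aic_null)              # lower AIC -> preferred model
```

The same comparison carries over to richer model classes (such as the ARSS models above): compute each model's maximized likelihood, penalize by parameter count, and keep the lowest criterion value.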
Abstract:
Geophysical techniques can help to bridge the inherent gap, with regard to spatial resolution and range of coverage, that plagues classical hydrological methods. This has led to the emergence of the new and rapidly growing field of hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, and their inherent trade-off between resolution and range, the fundamental usefulness of multi-method hydrogeophysical surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database into a unified model of the probed subsurface region that is internally consistent with all available data. To address this problem, we have developed a strategy for hydrogeophysical data integration based on Monte-Carlo-type conditional stochastic simulation that we consider particularly suitable for local-scale studies characterized by high-resolution and high-quality datasets. Monte-Carlo-based optimization techniques are flexible and versatile, can accommodate a wide variety of data and constraints of differing resolution and hardness, and thus have the potential to provide, in a geostatistical sense, highly detailed and realistic models of the pertinent target parameter distributions. Compared to more conventional approaches of this kind, our approach significantly advances the way in which the larger-scale deterministic information resolved by the hydrogeophysical data can be accounted for, an inherently problematic, and as yet unresolved, aspect of Monte-Carlo-type conditional simulation techniques. We present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure.
Our procedure is first tested on pertinent synthetic data and then applied to corresponding field data collected at the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
Abstract:
Asthma prevalence in children and adolescents in Spain is 10-17%. It is the most common chronic illness during childhood. Prevalence has been increasing over the last 40 years and there is considerable evidence that, among other factors, continued exposure to cigarette smoke results in asthma in children. No statistical or simulation model exists to forecast the evolution of childhood asthma in Europe. Such a model needs to incorporate the main risk factors that can be managed by medical authorities, such as tobacco (OR = 1.44), to establish how they affect the present generation of children. A simulation model for childhood asthma using conditional probability and discrete event simulation was developed and validated by simulating realistic scenarios. The parameters used for the model (input data) were those found in the bibliography, especially those related to the incidence of smoking in Spain. We also used data from a panel of experts from the Hospital del Mar (Barcelona) related to actual evolution and asthma phenotypes. The results obtained from the simulation established a threshold of a 15-20% smoking population for a reduction in the prevalence of asthma. This is still far from the current level in Spain, where 24% of people smoke. We conclude that more effort must be made to combat smoking and other childhood asthma risk factors in order to significantly reduce the number of cases. Once completed, this simulation methodology can realistically be used to forecast the evolution of childhood asthma as a function of variation in different risk factors.
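The core mechanism of such a model, converting a baseline prevalence and an odds ratio (OR = 1.44) into conditional probabilities and then sampling a population, can be sketched as follows. The 10% baseline prevalence and the simulation structure are illustrative assumptions, not the paper's calibrated model.

```python
import random

def asthma_prob(base_p, odds_ratio, exposed):
    """Conditional probability of asthma given exposure status,
    derived from a baseline probability and an odds ratio."""
    if not exposed:
        return base_p
    odds = base_p / (1 - base_p) * odds_ratio
    return odds / (1 + odds)

def simulate_prevalence(smoking_rate, base_p=0.10, odds_ratio=1.44,
                        n=100_000, seed=1):
    """Monte Carlo estimate of asthma prevalence at a given smoking rate."""
    rng = random.Random(seed)
    cases = 0
    for _ in range(n):
        exposed = rng.random() < smoking_rate            # exposure draw
        cases += rng.random() < asthma_prob(base_p, odds_ratio, exposed)
    return cases / n

# Prevalence should fall as the smoking population shrinks from 24% to 15%.
p_now = simulate_prevalence(0.24)
p_low = simulate_prevalence(0.15)
print(p_now > p_low)
```

Sharing the seed across the two runs couples the simulations, so the difference reflects the smoking-rate change rather than sampling noise.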
Abstract:
We propose new methods for evaluating predictive densities that focus on the models' actual predictive ability in finite samples. The tests offer a simple way of evaluating the correct specification of predictive densities, either parametric or non-parametric. The results indicate that our tests are well sized and have good power in detecting mis-specification in predictive densities. An empirical application to the Survey of Professional Forecasters and a baseline Dynamic Stochastic General Equilibrium model shows the usefulness of our methodology.
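One standard way to check the correct specification of a predictive density, related in spirit to the tests described above but not the authors' actual procedure, is the probability integral transform (PIT): under the true density, PIT values are Uniform(0,1), with variance 1/12. A minimal sketch on synthetic data:

```python
import math, random, statistics

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

random.seed(2)
# True data-generating process: N(0, 2); candidate predictive density: N(0, 1).
data = [random.gauss(0, 2) for _ in range(5000)]
pit_true = [normal_cdf(y, 0, 2) for y in data]   # correctly specified density
pit_bad = [normal_cdf(y, 0, 1) for y in data]    # misspecified density

# Under correct specification the PIT values are Uniform(0,1), so their
# variance should be near 1/12; misspecification distorts it.
print(statistics.variance(pit_true), statistics.variance(pit_bad))
```

The misspecified density is too narrow, so its PIT values pile up near 0 and 1 and their variance rises well above 1/12.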
Abstract:
The increasing interest aroused by more advanced forecasting techniques, together with the requirement for more accurate forecasts of tourism demand at the destination level due to the constant growth of world tourism, has led us to evaluate the forecasting performance of neural modelling relative to that of time series methods at a regional level. Seasonality and volatility are important features of tourism data, which makes it a particularly favourable context in which to compare the forecasting performance of linear models to that of nonlinear alternative approaches. Pre-processed official statistical data on overnight stays and tourist arrivals from all the different countries of origin to Catalonia from 2001 to 2009 is used in the study. When comparing the forecasting accuracy of the different techniques for different time horizons, autoregressive integrated moving average models outperform self-exciting threshold autoregressions and artificial neural network models, especially for shorter horizons. These results suggest that there is a trade-off between the degree of pre-processing and the accuracy of the forecasts obtained with neural networks, which are more suitable in the presence of nonlinearity in the data. In spite of the significant differences between countries, which can be explained by different patterns of consumer behaviour, we also find that forecasts of tourist arrivals are more accurate than forecasts of overnight stays.
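As a hedged illustration of the simplest member of the linear family compared above, the sketch below fits an AR(1) model by least squares and produces a one-step-ahead forecast against a random-walk benchmark; the simulated series stands in for the tourism data, which is not reproduced here.

```python
import random

def ar1_fit(series):
    """Least-squares estimate of y_t = c + phi * y_{t-1} + e_t."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
          sum((a - mx) ** 2 for a in x)
    return my - phi * mx, phi

random.seed(3)
# Hypothetical persistent monthly series (not the Catalonia data).
y = [100.0]
for _ in range(199):
    y.append(20 + 0.8 * y[-1] + random.gauss(0, 5))

c, phi = ar1_fit(y[:-1])           # hold out the last observation
forecast = c + phi * y[-2]         # one-step-ahead AR(1) forecast
naive = y[-2]                      # random-walk benchmark
print(abs(forecast - y[-1]), abs(naive - y[-1]))
```

The full ARIMA, SETAR, and neural specifications in the study add differencing, moving-average terms, regime switching, and nonlinearity on top of this basic recursion.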
Abstract:
Improving educational quality is an important public policy goal. However, its success requires identifying factors associated with student achievement. At the core of these proposals lies the principle that increased public school quality can make the school system more efficient, resulting in correspondingly stronger performance by students. Nevertheless, the public educational system is not devoid of competition, which arises, among other factors, through the efficiency of management and the geographical location of schools. Moreover, families in Spain appear to choose a school on the grounds of location. In this environment, the objective of this paper is to analyze whether geographical space has an impact on the relationship between the level of technical quality of public schools (measured by the efficiency score) and the school demand index. To do this, an empirical application is performed on a sample of 1,695 public schools in the region of Catalonia (Spain). This application shows the effects of spatial autocorrelation on the estimation of the parameters and how these problems are addressed through spatial econometric models. The results confirm that space has a moderating effect on the relationship between efficiency and school demand, although only in urban municipalities.
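Spatial autocorrelation of the kind this application must handle is typically diagnosed with Moran's I, which is positive when similar values cluster in space and negative when neighbours are dissimilar. A self-contained sketch (the four-unit weight matrix is invented for illustration):

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation statistic.
    weights[i][j] is the spatial weight between units i and j."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(sum(row) for row in weights)
    return (n / w_sum) * (num / den)

# Four units on a line, rook contiguity (neighbours share an edge).
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
clustered = [1.0, 1.0, 5.0, 5.0]    # similar values adjacent -> positive I
alternating = [1.0, 5.0, 1.0, 5.0]  # dissimilar values adjacent -> negative I
print(morans_i(clustered, w), morans_i(alternating, w))
```

When a statistic like this rejects spatial randomness, ordinary least squares standard errors are unreliable, which is what motivates the spatial econometric models used in the paper.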
Abstract:
This study first investigates the relationship between time-varying risk premiums and conditional market risk in the stock markets of the ten member countries of the Economic and Monetary Union. Second, it examines whether the conditional second moments change over time and whether there are asymmetric effects in the conditional covariance matrix. Third, it analyzes the possible effects of the chosen testing framework. Empirical analysis is conducted using asymmetric univariate and multivariate GARCH-in-mean models and assuming three different degrees of market integration. For a daily sample period from 1999 to 2007, the study shows that time-varying market risk alone is not enough to explain the dynamics of risk premiums, and indications are found that the market risk is detected only when its price is allowed to change over time. Asymmetric effects in the conditional covariance matrix, which is found to be time-varying, are also clearly present and should be recognized in empirical asset pricing analyses.
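A GARCH(1,1)-in-mean specification like those used above ties the risk premium to the conditional variance. A minimal simulation sketch, with illustrative parameter values rather than estimates from the study:

```python
import random

def simulate_garch_m(n, omega, alpha, beta, lam, seed=4):
    """Simulate a GARCH(1,1)-in-mean process:
       h_t = omega + alpha * e_{t-1}^2 + beta * h_{t-1}
       r_t = lam * h_t + e_t,   e_t ~ N(0, h_t)
    so the risk premium lam * h_t moves with the conditional variance."""
    rng = random.Random(seed)
    h = omega / (1 - alpha - beta)   # start at the unconditional variance
    e_prev = 0.0
    returns, variances = [], []
    for _ in range(n):
        h = omega + alpha * e_prev ** 2 + beta * h
        e_prev = rng.gauss(0, h ** 0.5)
        returns.append(lam * h + e_prev)
        variances.append(h)
    return returns, variances

r, h = simulate_garch_m(2000, omega=0.05, alpha=0.08, beta=0.90, lam=0.5)
print(len(r), min(h))
```

The asymmetric and multivariate variants in the study extend this recursion with leverage terms and a full conditional covariance matrix, but the premium-to-variance link is the same.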
Abstract:
This thesis examines whether global, local and exchange risks are priced in the Scandinavian countries’ equity markets by using conditional international asset pricing models. The employed international asset pricing models are the world capital asset pricing model, the international asset pricing model augmented with the currency risk, and the partially segmented model augmented with the currency risk. Moreover, this research traces estimated equity risk premiums for the Scandinavian countries. The empirical part of the study is performed using the generalized method of moments approach. Monthly observations from February 1994 to June 2007 are used. Investors’ conditional expectations are modeled using several instrumental variables. In order to keep the system parsimonious, the prices of risk are assumed to be constant whereas expected returns and conditional covariances vary over time. The empirical findings of this thesis suggest that global and local market risks are priced in the Scandinavian countries. This indicates that the Scandinavian countries are mildly segmented from the global markets. Furthermore, the results show that the exchange risk is priced in the Danish and Swedish stock markets when the partially segmented model is augmented with the currency risk factor.
Abstract:
The theme of this thesis is context-specific independence in graphical models. Considering a system of stochastic variables, it is often the case that the variables are dependent on each other. This can, for instance, be seen by measuring the covariance between a pair of variables. Using graphical models, it is possible to visualize the dependence structure found in a set of stochastic variables. Using ordinary graphical models, such as Markov networks, Bayesian networks, and Gaussian graphical models, the type of dependencies that can be modeled is limited to marginal and conditional (in)dependencies. The models introduced in this thesis enable the graphical representation of context-specific independencies, i.e. conditional independencies that hold only in a subset of the outcome space of the conditioning variables. In the articles included in this thesis, we introduce several types of graphical models that can represent context-specific independencies. Models for both discrete variables and continuous variables are considered. A wide range of properties is examined for the introduced models, including identifiability, robustness, scoring, and optimization. In one article, a predictive classifier which utilizes context-specific independence models is introduced. This classifier clearly demonstrates the potential benefits of the introduced models. The purpose of the material included in the thesis prior to the articles is to provide the basic theory needed to understand the articles.
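A context-specific independence can be illustrated with a small hand-built joint distribution over three binary variables, where Y is independent of X in the context Z = 0 but not in the context Z = 1 (the numbers are invented for illustration):

```python
# A hand-built joint distribution P(X, Y, Z) over binary variables.
def p(x, y, z):
    if z == 0:                        # given Z=0: X and Y are independent coins
        return 0.5 * 0.5 * 0.5
    # given Z=1: Y copies X with probability 0.9
    return 0.5 * 0.5 * (0.9 if y == x else 0.1)

def cond_y_given_xz(y, x, z):
    """P(Y=y | X=x, Z=z) computed from the joint distribution."""
    num = p(x, y, z)
    den = sum(p(x, yy, z) for yy in (0, 1))
    return num / den

# Context-specific independence: P(Y | X, Z=0) does not depend on X ...
print(cond_y_given_xz(1, 0, 0), cond_y_given_xz(1, 1, 0))  # equal
# ... but P(Y | X, Z=1) does.
print(cond_y_given_xz(1, 0, 1), cond_y_given_xz(1, 1, 1))  # differ
```

An ordinary conditional independence statement Y ⟂ X | Z cannot express this situation, which is exactly the gap the thesis's models fill.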
Abstract:
This thesis studies the impact of the latest Russian crisis on global markets, and especially on Central and Eastern Europe. The results are compared to other shocks and crises over the last twenty years to see how significant they have been. The cointegration process of Central and Eastern European financial markets is also reviewed and updated. Using three separate conditional correlation GARCH models, the latest crisis is not found to have initiated surges in conditional correlations similar to those seen in previous crises over the last two decades. Market cointegration for Central and Eastern Europe is found to have stalled somewhat after the initial correlation increases that followed EU accession.
Abstract:
This Master’s Thesis analyses the effectiveness of different hedging models for the BRICS (Brazil, Russia, India, China, and South Africa) countries. Hedging performance is examined by comparing two dynamic hedging models to a conventional OLS regression-based model. The dynamic hedging models employed are Constant Conditional Correlation (CCC) GARCH(1,1) and Dynamic Conditional Correlation (DCC) GARCH(1,1) with Student’s t-distribution. In order to capture both the Great Moderation and the latest financial crisis, the sample period extends from 2003 to 2014. To determine whether the dynamic models outperform the conventional one, the reduction of portfolio variance for in-sample data with contemporaneous hedge ratios is first determined, and then the holding period of the portfolios is extended to one and two days. In addition, the accuracy of hedge ratio forecasts is examined on the basis of out-of-sample variance reduction. The results are mixed and suggest that dynamic hedging models may not provide enough benefits to justify the more demanding estimation and daily portfolio adjustment. In this sense, the results are consistent with the existing literature.
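The conventional OLS benchmark in such comparisons is the static minimum-variance hedge ratio, h* = Cov(Δs, Δf) / Var(Δf), the slope of a regression of spot on futures returns. A sketch on simulated returns (not BRICS data):

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def covariance(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

random.seed(5)
# Hypothetical correlated spot and futures returns.
f = [random.gauss(0, 1.0) for _ in range(1000)]
s = [0.9 * fi + random.gauss(0, 0.4) for fi in f]

h = covariance(s, f) / variance(f)            # static OLS hedge ratio
hedged = [si - h * fi for si, fi in zip(s, f)]
reduction = 1 - variance(hedged) / variance(s)
print(h, reduction)                            # variance cut by the hedge
```

The CCC and DCC models generalize this by letting the covariance and variances, and hence the hedge ratio, change each day, which is exactly the extra estimation burden the thesis weighs against the variance gains.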
Abstract:
A wide range of tests for heteroskedasticity have been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. There have been a number of recent studies that seek to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods. Yet the latter remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values, for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves the intractable null distribution problem, in particular the problems raised by the sup-type and combined test statistics as well as (when relevant) unidentified nuisance parameter problems under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation.
The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable, and (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
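The Monte Carlo test idea is simple to state: simulate the test statistic N times under the null, and rank the observed statistic among the simulations; with a pivotal statistic, p = (1 + #{simulated ≥ observed}) / (N + 1) is exact whenever (N + 1)·α is an integer. A toy sketch with an invented statistic, not one of the paper's criteria:

```python
import random

def mc_pvalue(stat_obs, simulate_stat, n_rep=99, seed=6):
    """Monte Carlo test p-value: rank the observed statistic among
    n_rep statistics simulated under the null hypothesis."""
    rng = random.Random(seed)
    sims = [simulate_stat(rng) for _ in range(n_rep)]
    ge = sum(1 for s in sims if s >= stat_obs)
    return (ge + 1) / (n_rep + 1)

# Toy pivotal statistic under homoskedastic N(0,1) errors:
# the largest absolute error in a sample of 50.
def null_stat(rng, n=50):
    return max(abs(rng.gauss(0, 1)) for _ in range(n))

rng = random.Random(0)
p_null = mc_pvalue(null_stat(rng), null_stat)    # data generated under the null
p_alt = mc_pvalue(max(abs(rng.gauss(0, 10))      # grossly inflated error variance
                      for _ in range(50)), null_stat)
print(p_null, p_alt)
```

Because the ranking argument requires no knowledge of the statistic's null distribution, the same recipe handles the sup-type and combined statistics whose distributions are analytically intractable.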