931 results for Spatial conditional autoregressive model


Relevance:

100.00%

Publisher:

Abstract:

Traditionally, siting and sizing decisions for parks and reserves reflected ecological characteristics but typically failed to consider ecological costs created from displaced resource collection, welfare costs on nearby rural people, and enforcement costs. Using a spatial game-theoretic model that incorporates the interaction of socioeconomic and ecological settings, we show how incorporating more recent mandates that include rural welfare and surrounding landscapes can result in very different optimal sizing decisions. The model informs our discussion of recent forest management in Tanzania, reserve sizing and siting decisions, estimating reserve effectiveness, and determining patterns of avoided forest degradation in Reduced Emissions from Deforestation and Forest Degradation programs.

Abstract:

Linear models of market performance may be misspecified if the market is subdivided into distinct regimes exhibiting different behaviour. Price movements in the US Real Estate Investment Trusts and UK Property Companies Markets are explored using a Threshold Autoregressive (TAR) model with regimes defined by the real rate of interest. In both US and UK markets, distinctive behaviour emerges, with the TAR model offering better predictive power than a more conventional linear autoregressive model. The research points to the possibility of developing trading rules to exploit the systematically different behaviour across regimes.
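The regime split described above can be sketched compactly: an AR(1) is fitted separately to observations falling below and above a threshold on the real rate of interest. This is a minimal illustration with synthetic numbers, not the US REIT or UK property data; the threshold value and both series are assumptions made for the example.

```python
def ols_ar1(pairs):
    """OLS fit of y_t = a + b * y_{t-1} from (y_{t-1}, y_t) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    b = (sum((x - mx) * (y - my) for x, y in pairs)
         / sum((x - mx) ** 2 for x, _ in pairs))
    return my - b * mx, b  # intercept, slope

def fit_tar(returns, real_rate, threshold):
    """Assign each observation to a regime by the threshold variable,
    then fit an AR(1) within each regime."""
    low, high = [], []
    for t in range(1, len(returns)):
        pair = (returns[t - 1], returns[t])
        (low if real_rate[t] < threshold else high).append(pair)
    return ols_ar1(low), ols_ar1(high)

# Synthetic illustration: persistence 0.8 in the low-rate regime,
# mean reversion toward 1.0 in the high-rate regime.
rate = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
y = [1.0]
for t in range(1, 10):
    y.append(0.8 * y[-1] if rate[t] < 0.5 else 1.0 - 0.5 * y[-1])
(low_a, low_b), (high_a, high_b) = fit_tar(y, rate, 0.5)
```

Because the toy data are noise-free, the per-regime fits recover the two generating equations exactly, which is the sense in which a TAR can capture systematically different behaviour across regimes.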

Abstract:

Accurate decadal climate predictions could be used to inform adaptation actions to a changing climate. The skill of such predictions from initialised dynamical global climate models (GCMs) may be assessed by comparing them with predictions from statistical models based solely on historical observations. This paper presents two benchmark statistical models for predicting both the radiatively forced trend and the internal variability of annual mean sea surface temperatures (SSTs) on a decadal timescale, based on the gridded observation data set HadISST. For both statistical models, the trend related to radiative forcing is modelled using a linear regression of the SST time series at each grid box on the time series of equivalent global mean atmospheric CO2 concentration. The residual internal variability is then modelled by (1) a first-order autoregressive model (AR1) and (2) a constructed analogue model (CA). From the verification of 46 retrospective forecasts with start years from 1960 to 2005, the correlation coefficient for anomaly forecasts using trend with AR1 is greater than 0.7 over parts of the extra-tropical North Atlantic, the Indian Ocean and the western Pacific. This is primarily related to the prediction of the forced trend. More importantly, both CA and AR1 give skillful predictions of the internal variability of SSTs in the subpolar gyre region of the far North Atlantic for lead times of 2 to 5 years, with correlation coefficients greater than 0.5. For the subpolar gyre and parts of the South Atlantic, CA is superior to AR1 for lead times of 6 to 9 years. These statistical forecasts are also compared with ensemble mean retrospective forecasts by DePreSys, an initialised GCM. DePreSys is found to outperform the statistical models over large parts of the North Atlantic for lead times of 2 to 5 years and 6 to 9 years; however, trend with AR1 is generally superior to DePreSys in the North Atlantic Current region, while trend with CA is superior to DePreSys in parts of the South Atlantic for lead times of 6 to 9 years. These findings encourage the further development of benchmark statistical decadal prediction models, and of methods to combine different predictions.
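The "trend with AR1" benchmark amounts to two regressions per grid box: a linear fit of SST on equivalent CO2 for the forced trend, then an AR(1) fit to the residual, whose last value is damped forward by powers of the AR coefficient. The sketch below uses a single synthetic series; the CO2 path, coefficients and noise level are assumptions, not HadISST values.

```python
import random

def linfit(x, y):
    """OLS intercept and slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def trend_ar1_forecast(co2, sst, co2_future, n_leads):
    """Forced trend by regression on CO2; residual carried forward
    by a first-order autoregressive model."""
    a, b = linfit(co2, sst)
    resid = [s - (a + b * c) for c, s in zip(co2, sst)]
    phi = linfit(resid[:-1], resid[1:])[1]   # AR(1) coefficient
    last = resid[-1]
    return [a + b * co2_future[h] + (phi ** (h + 1)) * last
            for h in range(n_leads)]

# Synthetic series: linear CO2 forcing plus weak AR(1) internal variability.
rng = random.Random(0)
co2 = [340 + t for t in range(40)]
resid, r = [], 0.0
for _ in range(40):
    r = 0.6 * r + rng.gauss(0, 0.1)
    resid.append(r)
sst = [10.0 + 0.05 * c + u for c, u in zip(co2, resid)]
forecast = trend_ar1_forecast(co2, sst, [380 + t for t in range(10)], 10)
```

The forecast converges to the forced trend at long leads, since the AR(1) contribution decays geometrically; this is why the benchmark's long-lead skill comes mainly from the trend term.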

Abstract:

Following a workshop exercise, two models, an individual-based landscape model (IBLM) and a non-spatial life-history model, were used to assess the impact of a fictitious insecticide on populations of skylarks in the UK. The chosen population endpoints were abundance, population growth rate, and the chances of population persistence. Both models used the same life-history descriptors and toxicity profiles as the basis for their parameter inputs. The models differed in that exposure was a pre-determined parameter in the life-history model but an emergent property of the IBLM, and the IBLM required a landscape structure as an input. The model outputs were qualitatively similar between the two models. Under conditions dominated by winter wheat, both models predicted a population decline that was worsened by the use of the insecticide. Under broader habitat conditions, population declines were only predicted for the scenarios where the insecticide was added. Inputs to the models are very different, with the IBLM requiring a large volume of data in order to achieve the flexibility of being able to integrate a range of environmental and behavioural factors. The life-history model has very few explicit data inputs, but some of these relied on extensive prior modelling needing additional data as described in Roelofs et al. (2005, this volume). Both models have strengths and weaknesses; hence the ideal approach is that of combining the use of both simple and comprehensive modelling tools.

Abstract:

Geophysical time series sometimes exhibit serial correlations that are stronger than can be captured by the commonly used first‐order autoregressive model. In this study we demonstrate that a power law statistical model serves as a useful upper bound for the persistence of total ozone anomalies on monthly to interannual timescales. Such a model is usually characterized by the Hurst exponent. We show that the estimation of the Hurst exponent in time series of total ozone is sensitive to various choices made in the statistical analysis, especially whether and how the deterministic (including periodic) signals are filtered from the time series, and the frequency range over which the estimation is made. In particular, care must be taken to ensure that the estimate of the Hurst exponent accurately represents the low‐frequency limit of the spectrum, which is the part that is relevant to long‐term correlations and the uncertainty of estimated trends. Otherwise, spurious results can be obtained. Based on this analysis, and using an updated equivalent effective stratospheric chlorine (EESC) function, we predict that an increase in total ozone attributable to EESC should be detectable at the 95% confidence level by 2015 at the latest in southern midlatitudes, and by 2020–2025 at the latest over 30°–45°N, with the time to detection increasing rapidly with latitude north of this range.
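The paper estimates the Hurst exponent from the low-frequency part of the spectrum; as a simpler stand-in, the classical rescaled-range (R/S) estimator below illustrates what the exponent measures, namely how the range of cumulative deviations scales with window length. The series here are synthetic white noise and a random walk, not ozone data, and the chunk sizes are arbitrary choices.

```python
import math
import random

def rs_hurst(series, chunk_sizes):
    """Hurst exponent estimated as the slope of log(R/S)
    against log(chunk size)."""
    logn, logrs = [], []
    for n in chunk_sizes:
        vals = []
        for start in range(0, len(series) - n + 1, n):
            chunk = series[start:start + n]
            mean = sum(chunk) / n
            dev = [x - mean for x in chunk]
            cum, c = [], 0.0
            for d in dev:
                c += d
                cum.append(c)
            r = max(cum) - min(cum)          # range of cumulative deviations
            s = math.sqrt(sum(d * d for d in dev) / n)
            if s > 0:
                vals.append(r / s)
        if vals:
            logn.append(math.log(n))
            logrs.append(math.log(sum(vals) / len(vals)))
    mn = sum(logn) / len(logn)
    ms = sum(logrs) / len(logrs)
    return (sum((a - mn) * (b - ms) for a, b in zip(logn, logrs))
            / sum((a - mn) ** 2 for a in logn))

rng = random.Random(1)
noise = [rng.gauss(0, 1) for _ in range(512)]
walk, w = [], 0.0
for e in noise:
    w += e
    walk.append(w)
h_noise = rs_hurst(noise, [8, 16, 32, 64])
h_walk = rs_hurst(walk, [8, 16, 32, 64])
```

White noise gives an exponent near 0.5 while the strongly persistent random walk gives one near 1, mirroring the paper's point that persistence stronger than AR(1) shows up as a power-law scaling; the abstract's caveat about sensitivity to detrending and frequency range applies to this estimator too.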

Abstract:

We use Hasbrouck's (1991) vector autoregressive model for prices and trades to empirically test and assess the role played by the waiting time between consecutive transactions in the process of price formation. We find that as the time duration between transactions decreases, the price impact of trades, the speed of price adjustment to trade‐related information, and the positive autocorrelation of signed trades all increase. This suggests that times when markets are most active are times when there is an increased presence of informed traders; we interpret such markets as having reduced liquidity.
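Hasbrouck's system is a VAR of quote revisions and signed trades with several lags; a stripped-down, single-lag, demeaned version conveys the mechanics of estimating price impact and trade autocorrelation jointly. All coefficients and data below are synthetic assumptions, not estimates from the paper.

```python
import random

def var1_fit(r, x):
    """Per-equation OLS for a demeaned bivariate VAR(1):
    y_t = b1 * r_{t-1} + b2 * x_{t-1} + noise, for y in {r, x}."""
    r1, x1 = r[:-1], x[:-1]
    srr = sum(v * v for v in r1)
    srx = sum(u * v for u, v in zip(r1, x1))
    sxx = sum(v * v for v in x1)
    det = srr * sxx - srx * srx
    def coeffs(y):
        a = sum(u * v for u, v in zip(r1, y))
        b = sum(u * v for u, v in zip(x1, y))
        # analytic solution of the 2x2 normal equations
        return ((sxx * a - srx * b) / det, (srr * b - srx * a) / det)
    return coeffs(r[1:]), coeffs(x[1:])

# Synthetic system: lagged signed trades move returns (price impact 0.5);
# signed trades are themselves positively autocorrelated.
rng = random.Random(42)
r, x = [0.0], [0.0]
for _ in range(500):
    r_new = 0.2 * r[-1] + 0.5 * x[-1] + rng.gauss(0, 1)
    x_new = 0.3 * x[-1] + rng.gauss(0, 1)
    r.append(r_new)
    x.append(x_new)
ret_eq, trade_eq = var1_fit(r, x)
```

In the full specification, the duration between trades would enter as an interaction with the trade variable, so that the estimated price impact varies with market activity as the abstract describes.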

Abstract:

In this paper we investigate the price discovery process in single-name credit spreads obtained from bond, credit default swap (CDS), equity and equity option prices. We analyse short-term price discovery by modelling daily changes in credit spreads in the four markets with a vector autoregressive model (VAR). We also look at price discovery in the long run with a vector error correction model (VECM). We find that in the short term the option market clearly leads the other markets during the sub-prime crisis (2007-2009). During the less severe sovereign debt crisis (2009-2012) and the pre-crisis period, options are still important but CDSs become more prominent. In the long run, deviations from the equilibrium relationship with the option market still lead to adjustments in the credit spreads observed or implied from other markets. However, options no longer dominate price discovery in any of the periods considered. Our findings have implications for traders, credit risk managers and financial regulators.
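The long-run adjustment idea behind a VECM can be illustrated with a two-variable Engle-Granger sketch: estimate the long-run relation between two spread series by OLS, then regress changes in one series on the lagged deviation from that relation. The data are a synthetic cointegrated pair, not the four-market credit-spread system of the paper.

```python
import random

def linfit(x, y):
    """OLS intercept and slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def error_correction(y, z):
    """Engle-Granger two-step: long-run relation y ~ a + b*z, then the
    speed of adjustment of dy to the lagged equilibrium deviation."""
    a, b = linfit(z, y)
    ect = [yi - (a + b * zi) for yi, zi in zip(y, z)]
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    alpha = linfit(ect[:-1], dy)[1]   # adjustment coefficient
    return b, alpha

# Synthetic cointegrated pair: z is a random walk, y tracks 2 + 1.5*z
# with a mean-reverting deviation (about half corrected each period).
rng = random.Random(7)
z = [0.0]
for _ in range(399):
    z.append(z[-1] + rng.gauss(0, 0.3))
ys, u = [], 0.0
for zi in z:
    u = 0.5 * u + rng.gauss(0, 0.3)
    ys.append(2.0 + 1.5 * zi + u)
b, alpha = error_correction(ys, z)
```

A significantly negative adjustment coefficient is what "deviations from the equilibrium relationship lead to adjustments" means operationally: the series is pulled back toward the long-run relation.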

Abstract:

The EU FP7 Project MEGAPOLI: "Megacities: Emissions, urban, regional and Global Atmospheric POLlution and climate effects, and Integrated tools for assessment and mitigation" (http://megapoli.info) brings together leading European research groups, state-of-the-art scientific tools and key players from non-European countries to investigate the interactions among megacities, air quality and climate. MEGAPOLI bridges the spatial and temporal scales that connect local emissions, air quality and weather with global atmospheric chemistry and climate. The suggested concept of multi-scale integrated modelling of megacity impact on air quality and climate and vice versa is discussed in the paper. It requires considering different spatial and temporal dimensions: time scales from seconds and hours (to understand the interaction mechanisms) up to years and decades (to consider the climate effects); spatial resolutions: with model down- and up-scaling from street- to global-scale; and two-way interactions between meteorological and chemical processes.

Abstract:

Quantile forecasts are central to risk management decisions because of the widespread use of Value-at-Risk. A quantile forecast is the product of two factors: the model used to forecast volatility, and the method of computing quantiles from the volatility forecasts. In this paper we calculate and evaluate quantile forecasts of the daily exchange rate returns of five currencies. The forecasting models that have been used in recent analyses of the predictability of daily realized volatility permit a comparison of the predictive power of different measures of intraday variation and intraday returns in forecasting exchange rate variability. The methods of computing quantile forecasts include making distributional assumptions for future daily returns as well as using the empirical distribution of predicted standardized returns with both rolling and recursive samples. Our main findings are that the Heterogeneous Autoregressive model provides more accurate volatility and quantile forecasts for currencies which experience shifts in volatility, such as the Canadian dollar, and that the use of the empirical distribution to calculate quantiles can improve forecasts when there are shifts in volatility.
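The Heterogeneous Autoregressive (HAR) model regresses today's realized volatility on its daily lag and on weekly (5-day) and monthly (22-day) averages of past values. The sketch below simulates such a process and recovers the regression by plain OLS; the coefficients and noise level are assumptions for illustration, not estimates from the exchange-rate data.

```python
import random

def har_design(rv):
    """Rows: [1, daily lag, weekly mean, monthly mean]; target: rv[t]."""
    rows, ys = [], []
    for t in range(22, len(rv)):
        rows.append([1.0, rv[t - 1],
                     sum(rv[t - 5:t]) / 5,
                     sum(rv[t - 22:t]) / 22])
        ys.append(rv[t])
    return rows, ys

def ols(rows, ys):
    """Solve the normal equations by Gaussian elimination."""
    k = len(rows[0])
    a = [[sum(r[i] * r[j] for r in rows) for j in range(k)]
         + [sum(r[i] * y for r, y in zip(rows, ys))] for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda i: abs(a[i][col]))
        a[col], a[piv] = a[piv], a[col]
        for i in range(col + 1, k):
            f = a[i][col] / a[col][col]
            for j in range(col, k + 1):
                a[i][j] -= f * a[col][j]
    beta = [0.0] * k
    for i in range(k - 1, -1, -1):
        beta[i] = (a[i][k] - sum(a[i][j] * beta[j]
                                 for j in range(i + 1, k))) / a[i][i]
    return beta

# Simulate a HAR process with total persistence 0.4 + 0.3 + 0.2 = 0.9.
rng = random.Random(3)
rv = [1.0] * 22
for _ in range(600):
    d = rv[-1]
    w = sum(rv[-5:]) / 5
    m = sum(rv[-22:]) / 22
    rv.append(max(0.01, 0.1 + 0.4 * d + 0.3 * w + 0.2 * m + rng.gauss(0, 0.05)))
beta = ols(*har_design(rv))
```

A quantile forecast would then combine the fitted volatility with either a distributional assumption or the empirical quantiles of standardized returns, the second factor the abstract identifies.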

Abstract:

This paper examines the predictability of real estate asset returns using a number of time series techniques. A vector autoregressive model, which incorporates financial spreads, is able to improve upon the out-of-sample forecasting performance of univariate time series models at a short forecasting horizon. However, as the forecasting horizon increases, the explanatory power of such models is reduced, so that returns on real estate assets are best forecast using the long-term mean of the series. In the case of indirect property returns, such short-term forecasts can be turned into a trading rule that can generate excess returns over a buy-and-hold strategy gross of transaction costs, although none of the trading rules developed could cover the associated transaction costs. It is therefore concluded that such forecastability is entirely consistent with stock market efficiency.
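A forecast-based trading rule of the kind described can be sketched as: invest in the asset only in periods where the one-step forecast exceeds a hurdle (e.g. the long-term mean), and compare the compounded return with buy-and-hold. Returns, forecasts and the hurdle below are illustrative values, not the paper's property series, and no transaction costs are deducted.

```python
def trading_rule_return(returns, forecasts, hurdle):
    """Compound return of investing only when the forecast beats the hurdle."""
    total = 1.0
    for r, f in zip(returns, forecasts):
        if f > hurdle:          # invest this period only on a positive signal
            total *= 1.0 + r
    return total - 1.0

def buy_and_hold_return(returns):
    """Compound return of holding the asset in every period."""
    total = 1.0
    for r in returns:
        total *= 1.0 + r
    return total - 1.0

rets = [0.10, -0.05, 0.20]
fcst = [0.05, -0.02, 0.08]
rule = trading_rule_return(rets, fcst, 0.0)
hold = buy_and_hold_return(rets)
```

In this toy case the rule sidesteps the negative period, which is exactly the gross excess return the abstract reports before transaction costs eliminate it.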

Abstract:

This paper employs a vector autoregressive model to investigate the impact of macroeconomic and financial variables on a UK real estate return series. The results indicate that unexpected inflation and the interest rate term spread have explanatory power for the property market. However, the most significant influences on the real estate series are the lagged values of the series themselves. We conclude that identifying the factors that have determined UK property returns over the past twelve years remains a difficult task.

Abstract:

This paper investigates whether bank integration, measured by cross-border bank flows, can capture the co-movements across housing markets in developed countries by using a spatial dynamic panel model. The transmission can occur through a global banking channel in which global banks intermediate wholesale funding to local banks. Changes in financial conditions are passed across borders through the banks' balance-sheet exposure to credit, currency, maturity, and funding risks, resulting in house price spillovers. While controlling for country-level and global factors, we find significant co-movement across the housing markets of countries with proportionally high bank integration. Bank integration can better capture house price co-movements than other measures of economic integration. Once we account for bank exposure, other spatial linkages traditionally used to account for return co-movements across regions, such as trade, foreign direct investment, portfolio investment, and geographic proximity, become insignificant. Moreover, we find that the co-movement across housing markets decreases for countries with less developed mortgage markets characterized by fixed mortgage rate contracts, low limits on loan-to-value ratios and no mortgage equity withdrawal.
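The core spatial ingredient here is a weight matrix built from an economic linkage rather than geography: each country's "neighbour" house-price return is a bank-exposure-weighted average of the other countries' returns, with rows normalized to sum to one. The weights and returns below are toy numbers chosen for illustration.

```python
def row_normalize(w):
    """Scale each row of a weight matrix to sum to one."""
    out = []
    for row in w:
        s = sum(row)
        out.append([x / s if s > 0 else 0.0 for x in row])
    return out

def spatial_lag(weights, returns):
    """Weighted average of the other units' returns for each unit:
    the regressor of a spatial lag model."""
    w = row_normalize(weights)
    return [sum(w[i][j] * returns[j] for j in range(len(returns)))
            for i in range(len(w))]

# Toy bank-exposure matrix (zero diagonal) and house-price returns.
exposure = [[0.0, 1.0, 3.0],
            [2.0, 0.0, 2.0],
            [1.0, 1.0, 0.0]]
returns = [1.0, 2.0, 4.0]
lagged = spatial_lag(exposure, returns)
```

In the spatial dynamic panel, this lag enters the regression alongside the country's own lagged return; swapping the exposure matrix for trade or distance weights is how the paper compares bank integration with other linkages.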

Abstract:

This study covers a period when society changed from a pre-industrial agricultural society to a post-industrial service-producing society. Parallel with this social transformation, major population changes took place. In this study, we analyse how local population changes are affected by neighbouring populations. To do so, we use the last 200 years of local population change that redistributed the population in Sweden. We use the literature to identify several different processes and spatial dependencies in the redistribution between a parish and its surrounding parishes. The analysis is based on a unique, unchanged historical parish division, and we use an index of local spatial correlation to describe the different kinds of spatial dependencies that have influenced the redistribution of the population. To control for inherent time dependencies, we introduce a non-separable spatial-temporal correlation model into the analysis of population redistribution. In this way, several different spatial dependencies can be observed simultaneously over time. The main conclusions are that while local population changes were highly dependent on neighbouring populations in the 19th century, this spatial dependence had already become insignificant when two parishes were separated by 5 kilometres in the late 20th century. Another conclusion is that the time dependency in population change is higher when population redistribution is weak, as it currently is and as it was during the 19th century until the start of the industrial revolution.
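An index of local spatial correlation can be illustrated with the local Moran's I statistic, which is one common choice (the study's exact index is not specified here): for each unit it multiplies the unit's own deviation from the mean by a weighted sum of its neighbours' deviations. The values and adjacency structure below are a toy four-parish chain.

```python
def local_morans_i(values, weights):
    """Local Moran's I for each unit: positive where a unit and its
    neighbours deviate from the mean in the same direction."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    m2 = sum(d * d for d in dev) / n          # second moment of deviations
    return [dev[i] / m2 * sum(weights[i][j] * dev[j] for j in range(n))
            for i in range(n)]

# Four parishes in a chain, two low-population then two high-population.
values = [1.0, 1.0, 5.0, 5.0]
adjacency = [[0, 1, 0, 0],
             [1, 0, 1, 0],
             [0, 1, 0, 1],
             [0, 0, 1, 0]]
local_i = local_morans_i(values, adjacency)
```

The interior parishes sit on the boundary between the low and high clusters and score zero, while the end parishes, surrounded by similar neighbours, score positively; tracking such an index over time is how changing spatial dependence can be described.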

Abstract:

This paper contributes to the debate on whether the Brazilian public debt is sustainable in the long run by considering threshold effects on the Brazilian budget deficit. Using data from 1947 to 1999 and a threshold autoregressive model, we find evidence of delays in fiscal stabilization. As suggested in Alesina (1991), delayed stabilizations reflect the existence of political constraints blocking deficit cuts, which are relaxed only when the budget deficit reaches a sufficiently high level, deemed to be unsustainable. In particular, our results suggest that, in the absence of seignorage, only when the increase in the budget deficit reaches 1.74% of GDP will fiscal authorities intervene to reduce the deficit. If seignorage is allowed, the threshold increases to 2.2%, suggesting that seignorage makes the government more tolerant of fiscal imbalances.
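In threshold models of this kind, the threshold itself is typically chosen by grid search: split the sample at each candidate value and keep the one that minimizes the total sum of squared residuals. The sketch below uses regime means in place of full AR fits to stay short, and the toy deficit series is invented; the 1.74% figure from the abstract is used only to seed the candidate grid.

```python
def sse_split(y, thresh_var, c):
    """Total within-regime sum of squared deviations for split point c."""
    low = [yi for yi, d in zip(y, thresh_var) if d <= c]
    high = [yi for yi, d in zip(y, thresh_var) if d > c]
    def sse(group):
        if not group:
            return 0.0
        m = sum(group) / len(group)
        return sum((v - m) ** 2 for v in group)
    return sse(low) + sse(high)

def grid_search_threshold(y, thresh_var, candidates):
    """Pick the candidate threshold minimizing the combined SSE."""
    return min(candidates, key=lambda c: sse_split(y, thresh_var, c))

# Toy data: behaviour switches once the deficit variable passes ~1.7.
deficit = [0.5, 1.2, 1.6, 2.0, 2.6, 3.0]
response = [0.0, 0.0, 0.0, 5.0, 5.0, 5.0]
best = grid_search_threshold(response, deficit, [1.0, 1.5, 1.74, 2.2])
```

The same search, with an AR regression per regime instead of a mean, is how the estimated stabilization thresholds with and without seignorage would be obtained.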

Abstract:

The general objective of this thesis is to estimate the price elasticity of demand jointly, decomposed into the price elasticity of brand choice and of quantity purchased, and to discuss the implications of this decomposition for a specific product category. Scanner data from a sample of households in the Brazilian retail context were used. Eight hypotheses were tested by means of two models. The first concerns the brand-choice decision, for which a conditional logit model based on the maximization of household utility was employed. The second involved demand equations obtained with the classical linear regression model. Both were specified so that the dependence between the two purchase decisions could be tested. With regard to validation, the brand-choice model showed satisfactory predictive ability compared with the models analysed in the literature. Managerial implications include brand-specific pricing decisions and actions, since the nature of the decomposition of price elasticities varies across brands.
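In a conditional logit brand-choice model, the probability of choosing a brand is a softmax of its utility, and the own-price elasticity of choice follows in closed form as beta * price * (1 - P). The price coefficient, prices and brand constants below are illustrative assumptions, not estimates from the thesis data.

```python
import math

def choice_probs(prices, beta_price, brand_constants):
    """Conditional logit choice probabilities from linear utilities."""
    utils = [a + beta_price * p for a, p in zip(brand_constants, prices)]
    mx = max(utils)
    exps = [math.exp(u - mx) for u in utils]   # numerically stable softmax
    z = sum(exps)
    return [e / z for e in exps]

def own_price_elasticity(prices, beta_price, brand_constants, i):
    """Standard conditional-logit own-price elasticity of brand i."""
    p = choice_probs(prices, beta_price, brand_constants)[i]
    return beta_price * prices[i] * (1 - p)

# Three brands, identical constants, differing only in price.
prices = [2.0, 3.0, 4.0]
probs = choice_probs(prices, -1.0, [0.0, 0.0, 0.0])
elasticity_0 = own_price_elasticity(prices, -1.0, [0.0, 0.0, 0.0], 0)
```

The quantity-purchased elasticity would come from the separate linear demand equations, and the total price elasticity is the sum of the two components, which is the decomposition the thesis studies.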