12 results for Threshold models
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
We test the expectations theory of the term structure of U.S. interest rates in nonlinear systems. These models allow the response of the change in short rates to past values of the spread to depend upon the level of the spread. The nonlinear system is tested against a linear system, and the results of testing the expectations theory in both models are contrasted. We find that the outcome of tests of the expectations theory depends on the size and sign of the spread. The long-maturity spread predicts future changes in the short rate only when it is high.
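The kind of nonlinearity described above can be sketched with a simulated example in which the change in the short rate responds to the lagged spread only when the spread exceeds a threshold. All parameter values, including the threshold, are invented for illustration and are not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a spread and short-rate changes with a one-sided threshold effect:
# the short rate adjusts toward the long rate only in the high-spread regime.
T, tau = 500, 1.0           # sample size and (hypothetical) threshold
spread = np.zeros(T)
d_short = np.zeros(T)
for t in range(1, T):
    # adjustment operates only when the lagged spread exceeds the threshold
    d_short[t] = (0.5 * spread[t - 1] * (spread[t - 1] > tau)
                  + 0.1 * rng.standard_normal())
    spread[t] = 0.8 * spread[t - 1] - 0.3 * d_short[t] + 0.5 * rng.standard_normal()

# Estimate the response of the short-rate change to the lagged spread
# separately in the two regimes (true values: 0.5 above, 0 below).
x, y = spread[:-1], d_short[1:]
hi = x > tau
beta_hi = np.polyfit(x[hi], y[hi], 1)[0]
beta_lo = np.polyfit(x[~hi], y[~hi], 1)[0]
print("response above threshold:", round(beta_hi, 2))
print("response below threshold:", round(beta_lo, 2))
```

Splitting the sample at the threshold recovers predictive power for the spread only in the high regime, mirroring the paper's finding.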
Abstract:
Recent studies into price transmission have recognized the important role played by transport and transaction costs. Threshold models are one approach to accommodate such costs. We develop a generalized Threshold Error Correction Model to test for the presence and form of threshold behavior in price transmission that is symmetric around equilibrium. We use monthly wheat, maize, and soya prices from the United States, Argentina, and Brazil to demonstrate this model. Classical estimation of these generalized models can present challenges, but Bayesian techniques avoid many of these problems. Evidence for thresholds is found in three of the five commodity price pairs investigated.
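A band-type threshold error-correction mechanism of the sort such models nest can be sketched as follows: two prices linked by transport costs, with adjustment toward equilibrium switched off inside a band of width c that is symmetric around equilibrium. All names and numbers are illustrative, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two prices: the error-correction term pulls them back together only when
# the deviation exceeds the (hypothetical) transaction-cost band of width c.
T, c, rho = 600, 1.5, 0.25
p1 = np.zeros(T)
p2 = np.zeros(T)
for t in range(1, T):
    ect = p1[t - 1] - p2[t - 1]                 # deviation from p1 = p2
    adjust = rho * ect if abs(ect) > c else 0.0  # no correction inside the band
    p1[t] = p1[t - 1] - adjust + 0.4 * rng.standard_normal()
    p2[t] = p2[t - 1] + 0.3 * rng.standard_normal()

dev = p1 - p2
print("largest deviation:", round(float(np.max(np.abs(dev))), 2))
print("share of periods outside the band:", round(float(np.mean(np.abs(dev) > c)), 2))
```

Inside the band the deviation behaves like a random walk; outside it mean-reverts, which is the signature the threshold tests look for.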
Abstract:
We assessed the vulnerability of blanket peat to climate change in Great Britain using an ensemble of 8 bioclimatic envelope models. We used 4 published models that ranged from simple threshold models, based on total annual precipitation, to Generalised Linear Models (GLMs, based on mean annual temperature). In addition, 4 new models were developed which included measures of water deficit as threshold, classification tree, GLM and generalised additive models (GAM). Models that included measures of both hydrological conditions and maximum temperature provided a better fit to the mapped peat area than models based on hydrological variables alone. Under UKCIP02 projections for high (A1FI) and low (B1) greenhouse gas emission scenarios, 7 out of the 8 models showed a decline in the bioclimatic space associated with blanket peat. Eastern regions (Northumbria, North York Moors, Orkney) were shown to be more vulnerable than higher-altitude, western areas (Highlands, Western Isles and Argyll, Bute and The Trossachs). These results suggest a long-term decline in the distribution of actively growing blanket peat, especially under the high emissions scenario, although it is emphasised that existing peatlands may well persist for decades under a changing climate. Observational data from long-term monitoring and manipulation experiments in combination with process-based models are required to explore the nature and magnitude of climate change impacts on these vulnerable areas more fully.
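The simplest members of such an ensemble are threshold rules of the kind mentioned above. A minimal sketch: classify a grid cell as inside the blanket-peat bioclimatic envelope when annual precipitation is high enough and warmth limited. The function name and cut-off values are invented for illustration, not taken from the published models.

```python
# Hypothetical threshold envelope rule (cut-offs are illustrative only)
def in_peat_envelope(annual_precip_mm, mean_temp_c,
                     precip_min=1200.0, temp_max=9.0):
    """True if the cell's climate lies inside the (assumed) peat envelope."""
    return annual_precip_mm >= precip_min and mean_temp_c <= temp_max

print(in_peat_envelope(1600, 7.5))   # wet and cool: inside the envelope
print(in_peat_envelope(900, 10.2))   # dry and warm: outside the envelope
```

Projecting such a rule onto future precipitation and temperature fields is what shrinks or shifts the modelled bioclimatic space under the emission scenarios.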
Abstract:
Threshold Error Correction Models are used to analyse the term structure of interest rates. The paper develops and uses a generalisation of existing models that encompasses both the Band and Equilibrium threshold models of Balke and Fomby (1997, Threshold cointegration, Int Econ Rev 38(3):627–645) and estimates this model using a Bayesian approach. Evidence is found for threshold effects in pairs of longer rates but not in pairs of short rates. The Band threshold model is supported in preference to the Equilibrium model.
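The Band/Equilibrium distinction the generalised model encompasses can be stated compactly: in a Band model the error-correction term is switched off inside the band, while in an Equilibrium model adjustment occurs in both regimes but at different speeds. Parameter names and values below are illustrative.

```python
def band_adjust(ect, band=1.0, rho=0.3):
    """Band model: no adjustment while the deviation lies inside the band."""
    return rho * ect if abs(ect) > band else 0.0

def equilibrium_adjust(ect, band=1.0, rho_outer=0.3, rho_inner=0.1):
    """Equilibrium model: adjustment in both regimes, faster outside the band."""
    return (rho_outer if abs(ect) > band else rho_inner) * ect

for deviation in (-2.0, 0.5, 2.0):
    print(deviation, band_adjust(deviation), equilibrium_adjust(deviation))
```

A nesting model lets the data choose between the two, which is how the paper arrives at its preference for the Band specification.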
Abstract:
Although financial theory rests heavily upon the assumption that asset returns are normally distributed, value indices of commercial real estate display significant departures from normality. In this paper, we apply and compare the properties of two recently proposed regime switching models for value indices of commercial real estate in the US and the UK, both of which relax the assumption that observations are drawn from a single distribution with constant mean and variance. Statistical tests of the models' specification indicate that the Markov switching model is better able to capture the non-stationary features of the data than the threshold autoregressive model, although both describe the data better than models that allow for only one state. Our results have several implications for theoretical models and empirical research in finance.
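The key difference between the two model classes compared above is what drives the regime: in a Markov switching model it is an unobserved Markov chain, whereas in a threshold autoregressive model it is an observed lagged value crossing a threshold. A minimal Markov-switching simulation, with all parameters invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two-state Markov-switching mean/variance sketch (illustrative parameters)
P = np.array([[0.95, 0.05],      # transition probabilities between states
              [0.10, 0.90]])
mu = np.array([0.02, -0.03])     # state-dependent mean return
sigma = np.array([0.01, 0.05])   # state-dependent volatility
T, s = 1000, 0
returns = np.empty(T)
states = np.empty(T, dtype=int)
for t in range(T):
    states[t] = s
    returns[t] = mu[s] + sigma[s] * rng.standard_normal()
    s = rng.choice(2, p=P[s])    # next state drawn from the Markov chain

print("time share in calm state:", round(float(np.mean(states == 0)), 2))
print("sample volatility, calm vs turbulent:",
      round(float(returns[states == 0].std()), 3),
      round(float(returns[states == 1].std()), 3))
```

Because neither distribution is constant across states, the simulated series departs from normality in the same way the abstract describes for real-estate value indices.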
Abstract:
This paper presents an approach for automatic classification of pulsed Terahertz (THz), or T-ray, signals, highlighting their potential in biomedical, pharmaceutical and security applications. T-ray classification systems supply a wealth of information about test samples and make possible the discrimination of heterogeneous layers within an object. In this paper, a novel technique involving the use of Auto Regressive (AR) and Auto Regressive Moving Average (ARMA) models on the wavelet transforms of measured T-ray pulse data is presented. Two example applications are examined: the classification of normal human bone (NHB) osteoblasts against human osteosarcoma (HOS) cells and the identification of six different powder samples. A variety of model types and orders are used to generate descriptive features for subsequent classification. Wavelet-based de-noising with soft threshold shrinkage is applied to the measured T-ray signals prior to modeling. For classification, a simple Mahalanobis distance classifier is used. After feature extraction, classification accuracy for cancerous and normal cell types is 93%, whereas for powders it is 98%.
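The pipeline's core idea (fit low-order AR models to each signal, use the coefficients as features, classify by Mahalanobis distance to each class mean) can be sketched as below. The wavelet de-noising step is omitted, and all signals here are synthetic stand-ins for T-ray pulses, so this is an illustration of the technique rather than a reproduction of the paper's system.

```python
import numpy as np

rng = np.random.default_rng(2)

def ar_features(x, p=2):
    # Regress x[t] on x[t-1], ..., x[t-p]; the coefficients are the features.
    X = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def simulate_ar(coefs, n=400):
    # Synthetic stand-in for a measured signal with known AR dynamics
    x = np.zeros(n)
    for t in range(len(coefs), n):
        x[t] = sum(c * x[t - i - 1] for i, c in enumerate(coefs))
        x[t] += rng.standard_normal()
    return x

# Two synthetic classes with different dynamics, 20 training signals each
train_a = np.array([ar_features(simulate_ar([0.9, -0.2])) for _ in range(20)])
train_b = np.array([ar_features(simulate_ar([0.2, 0.5])) for _ in range(20)])

# Pooled covariance of the features, for the Mahalanobis metric
pooled_cov = np.cov(np.vstack([train_a - train_a.mean(0),
                               train_b - train_b.mean(0)]).T)
cov_inv = np.linalg.inv(pooled_cov)

def mahalanobis2(u, mean):
    d = u - mean
    return float(d @ cov_inv @ d)

# Classify an unseen signal drawn from class A's dynamics
test_features = ar_features(simulate_ar([0.9, -0.2]))
label = ("A" if mahalanobis2(test_features, train_a.mean(0))
         < mahalanobis2(test_features, train_b.mean(0)) else "B")
print("classified as:", label)
```

The Mahalanobis metric weights each feature by its (co)variance, so well-estimated AR coefficients dominate the decision.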
Abstract:
Summary
1. Agent-based models (ABMs) are widely used to predict how populations respond to changing environments. As the availability of food varies in space and time, individuals should have their own energy budgets, but there is no consensus as to how these should be modelled. Here, we use knowledge of physiological ecology to identify major issues confronting the modeller and to make recommendations about how energy budgets for use in ABMs should be constructed.
2. Our proposal is that modelled animals forage as necessary to supply their energy needs for maintenance, growth and reproduction. If there is sufficient energy intake, an animal allocates the energy obtained in the order: maintenance, growth, reproduction, energy storage, until its energy stores reach an optimal level. If there is a shortfall, the priorities for maintenance and growth/reproduction remain the same until reserves fall to a critical threshold, below which all energy is allocated to maintenance. Rates of ingestion and allocation depend on body mass and temperature. We make suggestions for how each of these processes should be modelled mathematically.
3. Mortality rates vary with body mass and temperature according to known relationships, and these can be used to obtain estimates of background mortality rate.
4. If parameter values cannot be obtained directly, then values may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or from allometric equations.
5. The development of ABMs incorporating individual energy budgets is essential for realistic modelling of populations affected by food availability. Such ABMs are already being used to guide conservation planning of nature reserves and shell fisheries, to assess environmental impacts of building proposals including wind farms and highways, and to assess the effects on nontarget organisms of chemicals for the control of agricultural pests.
Keywords: bioenergetics; energy budget; individual-based models; population dynamics.
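The priority allocation scheme proposed in point 2 can be sketched as a single function: intake is spent in order (maintenance, growth, reproduction, storage), and below a critical reserve threshold everything goes to maintenance. The function name and the simple numeric interface are ours, not the authors'.

```python
def allocate_energy(intake, maintenance, growth, reproduction,
                    reserves, critical_reserve):
    """Allocate one timestep's energy intake in priority order (sketch)."""
    alloc = {"maintenance": 0.0, "growth": 0.0,
             "reproduction": 0.0, "storage": 0.0}
    if reserves < critical_reserve:
        # below the critical threshold, all energy goes to maintenance
        alloc["maintenance"] = intake
        return alloc
    for process, need in (("maintenance", maintenance),
                          ("growth", growth),
                          ("reproduction", reproduction)):
        spent = min(intake, need)
        alloc[process] = spent
        intake -= spent
    alloc["storage"] = intake   # any surplus tops up the reserves
    return alloc

print(allocate_energy(10, maintenance=4, growth=3, reproduction=2,
                      reserves=5, critical_reserve=1))
```

In a full ABM the needs and ingestion rate would themselves scale with body mass and temperature, as the summary recommends.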
Abstract:
This paper investigates the application and use of development viability models in the formation of planning policies in the UK. Particular attention is paid to three key areas: the assumed development scheme in development viability models, the use of forecasts and the debate concerning Threshold Land Value. The empirical section reports on the results of an interview survey involving the main producers of development viability models and appraisals. It is concluded that, although development viability models have intrinsic limitations associated with model composition and input uncertainties, the most significant limitations are related to the ways that they have been adapted for use in the planning system. In addition, it is suggested that the contested nature of Threshold Land Value is an example of calculative practices providing a façade of technocratic rationality in the planning system.
Abstract:
This mixed-method study tracked social interaction and adaptation among 20 international postgraduates on a 1-year programme in the UK, examining assumptions that language proficiency and interactional engagement directly underpin sociocultural adaptation. Participants remained frustrated by a perceived ‘threshold’ barring successful interaction with English speakers, while reporting reluctance to take up available opportunities, independent of language proficiency and sociocultural adaptation. We challenge linear models of adaptation and call for assistance to international students in crossing the threshold to successful interaction.
Abstract:
This paper combines and generalizes a number of recent time series models of daily exchange rate series by using a SETAR model which also allows the variance equation of a GARCH specification for the error terms to be drawn from more than one regime. An application of the model to the French Franc/Deutschmark exchange rate demonstrates that out-of-sample forecasts for the exchange rate volatility are also improved when the restriction that the data are drawn from a single regime is removed. This result highlights the importance of considering both types of regime shift (i.e. thresholds in variance as well as in mean) when analysing financial time series.
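A SETAR model whose GARCH variance equation also switches regime can be sketched as below: both the autoregressive mean parameter and the GARCH coefficients depend on the sign of the lagged observation. All coefficients are invented for illustration, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

# SETAR with regime-dependent GARCH(1,1) errors (illustrative parameters)
T = 2000
y = np.zeros(T)
h = np.full(T, 0.1)     # conditional variance
eps = np.zeros(T)
for t in range(1, T):
    if y[t - 1] <= 0.0:                        # lower regime
        phi, omega, alpha, beta = 0.3, 0.05, 0.10, 0.80
    else:                                      # upper regime
        phi, omega, alpha, beta = -0.2, 0.02, 0.15, 0.83
    h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
    eps[t] = np.sqrt(h[t]) * rng.standard_normal()
    y[t] = phi * y[t - 1] + eps[t]

print("sample variance of y:", round(float(np.var(y)), 3))
print("range of conditional variance:",
      round(float(h.min()), 3), "to", round(float(h.max()), 3))
```

Forecasting with such a model requires tracking which regime each future observation falls in, which is why allowing thresholds in the variance as well as the mean changes volatility forecasts.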
Abstract:
Simulation of the lifting of dust from the planetary surface is of substantially greater importance on Mars than on Earth, due to the fundamental role that atmospheric dust plays in the former’s climate, yet the dust emission parameterisations used to date in martian global climate models (MGCMs) lag, understandably, behind their terrestrial counterparts in terms of sophistication. Recent developments in estimating surface roughness length over all martian terrains and in modelling atmospheric circulations at regional to local scales (less than O(100 km)) present an opportunity to formulate an improved wind stress lifting parameterisation. We have upgraded the conventional scheme by including the spatially varying roughness length in the lifting parameterisation in a fully consistent manner (thereby correcting a possible underestimation of the true threshold level for wind stress lifting), and used a modification to account for deviations from neutral stability in the surface layer. Following these improvements, it is found that wind speeds at typical MGCM resolution never reach the lifting threshold at most gridpoints: winds fall particularly short in the southern midlatitudes, where mean roughness is large. Sub-grid scale variability, manifested in both the near-surface wind field and the surface roughness, is then considered, and is found to be a crucial means of bridging the gap between model winds and thresholds. Both forms of small-scale variability contribute to the formation of dust emission ‘hotspots’: areas within the model gridbox with particularly favourable conditions for lifting, namely a smooth surface combined with strong near-surface gusts. Such small-scale emission could in fact be particularly influential on Mars, due both to the intense positive radiative feedbacks that can drive storm growth and a strong hysteresis effect on saltation.
By modelling this variability, dust lifting is predicted at the locations at which dust storms are frequently observed, including the flushing storm sources of Chryse and Utopia, and southern midlatitude areas from which larger storms tend to initiate, such as Hellas and Solis Planum. The seasonal cycle of emission, which includes a double-peaked structure in northern autumn and winter, also appears realistic. Significant increases to lifting rates are produced for any sensible choices of parameters controlling the sub-grid distributions used, but results are sensitive to the smallest scale of variability considered, which high-resolution modelling suggests should be O(1 km) or less. Use of such models in future will permit the use of a diagnosed (rather than prescribed) variable gustiness intensity, which should further enhance dust lifting in the southern hemisphere in particular.
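The sub-grid argument above can be illustrated with a toy calculation: even when the resolved (grid-mean) wind never reaches the lifting threshold, a distribution of sub-grid gusts around it can still exceed the threshold some of the time. The threshold, mean wind and Weibull gust shape below are invented numbers, not values from the parameterisation.

```python
import numpy as np

rng = np.random.default_rng(4)

u_threshold = 30.0   # wind speed needed for lifting at this gridpoint (m/s)
u_mean = 22.0        # resolved grid-scale wind (m/s), below the threshold
# Hypothetical sub-grid wind distribution: Weibull-shaped gusts scaled by
# the grid-mean wind
gusts = u_mean * rng.weibull(4.0, size=100_000)

frac_lifting = float(np.mean(gusts > u_threshold))
print("grid-mean wind reaches threshold:", u_mean >= u_threshold)
print("fraction of sub-grid winds above threshold:", round(frac_lifting, 3))
```

Coupling such a distribution with sub-grid roughness variations is what produces the emission 'hotspots' described above.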
Abstract:
We propose a geoadditive negative binomial model (Geo-NB-GAM) for regional count data that allows us to address simultaneously some important methodological issues, such as spatial clustering, nonlinearities, and overdispersion. This model is applied to the study of location determinants of inward greenfield investments that occurred during 2003–2007 in 249 European regions. After presenting the data set and showing the presence of overdispersion and spatial clustering, we review the theoretical framework that motivates the choice of the location determinants included in the empirical model, and we highlight some reasons why the relationship between some of the covariates and the dependent variable might be nonlinear. The subsequent section first describes the solutions proposed by previous literature to tackle spatial clustering, nonlinearities, and overdispersion, and then presents the Geo-NB-GAM. The empirical analysis shows the good performance of Geo-NB-GAM. Notably, the inclusion of a geoadditive component (a smooth spatial trend surface) permits us to control for spatial unobserved heterogeneity that induces spatial clustering. Allowing for nonlinearities reveals, in keeping with theoretical predictions, that the positive effect of agglomeration economies fades as the density of economic activities reaches some threshold value. However, no matter how dense the economic activity becomes, our results suggest that congestion costs never overcome positive agglomeration externalities.
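Schematically, the model structure described can be written as follows (notation ours: y_i is the investment count in region i, x_ij its covariates, s_i its spatial coordinates):

```latex
% Geoadditive negative binomial model (schematic)
y_i \sim \mathrm{NB}(\mu_i, \theta), \qquad
\log \mu_i = \beta_0 + \sum_j f_j(x_{ij}) + f_{\mathrm{spat}}(s_i)
% theta: overdispersion parameter of the negative binomial
% f_j: smooth effects capturing nonlinearities (e.g. fading agglomeration)
% f_spat: smooth spatial trend surface absorbing unobserved heterogeneity
```

The smooth terms f_j allow the agglomeration effect to level off past a density threshold, while f_spat accounts for the spatial clustering that a purely covariate-based model would miss.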