118 results for Exponential Smoothing
Abstract:
The extinction of dinosaurs at the Cretaceous/Paleogene (K/Pg) boundary was the seminal event that opened the door for the subsequent diversification of terrestrial mammals. Our compilation of maximum body size at the ordinal level by sub-epoch shows a near-exponential increase after the K/Pg. On each continent, the maximum size of mammals leveled off after 40 million years ago and thereafter remained approximately constant. There was remarkable congruence in the rate, trajectory, and upper limit across continents, orders, and trophic guilds, despite differences in geological and climatic history, turnover of lineages, and ecological variation. Our analysis suggests that although the primary driver for the evolution of giant mammals was diversification to fill ecological niches, environmental temperature and land area may have ultimately constrained the maximum size achieved.
Abstract:
Increasing attention has been focused on the use of CDMA for future cellular mobile communications. A near-far resistant detector for asynchronous code-division multiple-access (CDMA) systems operating in additive white Gaussian noise (AWGN) channels is presented. The multiuser interference caused by K users transmitting simultaneously, each with a specific signature sequence, is completely removed at the receiver. The complexity of this detector grows only linearly with the number of users, whereas the optimum multiuser detector requires complexity exponential in the number of users. A modified algorithm based on time diversity is also described; it performs detection on a bit-by-bit basis and avoids the complexity of a full sequence detector. The performance of this detector is shown to be superior to that of the conventional receiver.
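The linear-complexity detection the abstract describes can be illustrated with a decorrelating receiver, the standard near-far resistant linear detector: apply the inverse of the signature cross-correlation matrix to the matched-filter outputs and slice. Below is a minimal synchronous sketch (the paper treats the asynchronous case); the signatures, noise level, and bits are hypothetical.

```python
import numpy as np

def decorrelating_detector(y, R):
    """Near-far resistant linear detection sketch: multiplying the
    matched-filter outputs y by R^-1 removes multiuser interference
    exactly, at a cost that grows only polynomially with K users."""
    return np.sign(np.linalg.solve(R, y))

# Hypothetical example: K = 3 users, random +/-1 signatures of length 31
rng = np.random.default_rng(0)
S = rng.choice([-1.0, 1.0], size=(31, 3))      # signature sequences
R = S.T @ S / 31                               # cross-correlation matrix
bits = np.array([1.0, -1.0, 1.0])              # transmitted bits
y = R @ bits + 0.1 * rng.standard_normal(3)    # matched-filter outputs
print(decorrelating_detector(y, R))            # recovers [ 1. -1.  1.]
```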
Abstract:
This paper presents the theoretical development of a nonlinear adaptive filter based on the concept of filtering by approximated densities (FAD). The most common procedures for nonlinear estimation apply the extended Kalman filter; unlike such conventional techniques, the proposed recursive algorithm does not require any linearisation. The prediction uses a maximum entropy principle subject to constraints, so the densities created are of exponential type and depend on a finite number of parameters. The filtering yields recursive equations involving these parameters, and the update applies Bayes' theorem. Through simulation on a generic exponential model, the proposed nonlinear filter is implemented and the results prove superior to those of the extended Kalman filter and of a class of nonlinear filters based on partitioning algorithms.
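As a rough illustration of the FAD idea (not the paper's algorithm), the sketch below represents the filtering density on a grid, predicts by matching moments to a maximum-entropy density of exponential type (a Gaussian, the maximum-entropy density under mean and variance constraints), and updates with Bayes' theorem, with no linearisation anywhere. The scalar model, noise variances, and measurement are hypothetical.

```python
import numpy as np

def fad_step(grid, pdf, f, q_var, z, h, r_var):
    """One predict/update cycle of a filtering-by-approximated-densities
    sketch on a uniform grid.  f and h may be nonlinear; no Jacobians
    or linearisation are used."""
    dx = grid[1] - grid[0]
    # Predict: propagate the first two moments through f, then adopt the
    # maximum-entropy density with those moments -- a Gaussian, i.e. an
    # exponential-type density with finitely many parameters.
    m = np.sum(f(grid) * pdf) * dx
    v = np.sum((f(grid) - m) ** 2 * pdf) * dx + q_var
    pred = np.exp(-0.5 * (grid - m) ** 2 / v)
    pred /= pred.sum() * dx
    # Update: Bayes' theorem with a Gaussian measurement likelihood.
    post = pred * np.exp(-0.5 * (z - h(grid)) ** 2 / r_var)
    return post / (post.sum() * dx)

# Hypothetical scalar model with a nonlinear measurement h(x) = x^2/2
grid = np.linspace(-5.0, 5.0, 1001)
pdf = np.exp(-0.5 * grid ** 2)
pdf /= pdf.sum() * (grid[1] - grid[0])
pdf = fad_step(grid, pdf, f=lambda x: 0.9 * x, q_var=0.1,
               z=1.2, h=lambda x: 0.5 * x ** 2, r_var=0.5)
print("posterior mean:", np.sum(grid * pdf) * (grid[1] - grid[0]))
```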
Abstract:
Volatility, or the variability of the underlying asset, is one of the key fundamental components of property derivative pricing and of the application of real option models in development analysis. There has been relatively little work on volatility in terms of its application to property derivatives and real options analysis. Most research on volatility stems from investment performance (Nathakumaran & Newell 1995; Brown & Matysiak 2000; Booth & Matysiak 2001). Historic standard deviation is often used as a proxy for volatility, and there has been a reliance on indices, which are subject to valuation smoothing effects. Transaction prices are considered to be more volatile than the traditional standard deviations of appraisal-based indices suggest. This could arguably lead to inefficiencies and mis-pricing, particularly if it is also accepted that changes evolve randomly over time and that future volatility, not an ex-post measure, is the key input (Sing 1998). If history does not repeat, or provides an unreliable measure, then estimating model-based (implied) volatility is an alternative approach (Patel & Sing 2000). This paper is the first of two that employ alternative approaches to calculating and capturing volatility in UK real estate for the purposes of applying the measure to derivative pricing and real option models. It draws on a uniquely constructed IPD/Gerald Eve transactions database containing over 21,000 properties over the period 1983-2005. This first paper examines the magnitude of historic volatility associated with asset returns by sector and geographic spread; the subsequent paper will focus on model-based (implied) volatility.
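For reference, the historic proxy the abstract critiques is just the annualized standard deviation of periodic returns; computed from appraisal-based index data it inherits the smoothing the authors describe. A minimal sketch with made-up monthly returns:

```python
import numpy as np

def historic_volatility(returns, periods_per_year=12):
    """Annualized standard deviation of periodic returns -- the common
    volatility proxy, which is damped when the inputs come from
    appraisal-based (smoothed) index data."""
    return np.std(returns, ddof=1) * np.sqrt(periods_per_year)

# Hypothetical monthly total returns (decimal form)
monthly = np.array([0.012, -0.004, 0.009, 0.015, -0.011, 0.007,
                    0.003, 0.010, -0.006, 0.008, 0.004, -0.002])
print(f"annualized volatility: {historic_volatility(monthly):.2%}")
```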
Abstract:
A One-Dimensional Time to Explosion (ODTX) apparatus has been used to study the times to explosion of a number of compositions based on RDX and HMX over a range of contact temperatures. The times to explosion at any given temperature tend to increase from RDX to HMX and with the proportion of HMX in the composition. Thermal ignition theory has been applied to the time-to-explosion data to calculate kinetic parameters. The apparent activation energy for all of the compositions lay between 127 kJ mol⁻¹ and 146 kJ mol⁻¹. There were, however, large differences in the pre-exponential factor, and it was this factor, rather than the activation energy, that controlled the time to explosion.
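Thermal ignition analysis of ODTX data typically assumes Arrhenius behaviour, t ≈ A⁻¹ exp(Ea/RT), so a plot of ln t against 1/T is linear with slope Ea/R. A sketch of that fit on hypothetical (temperature, time) pairs; the numbers are illustrative, not the paper's data:

```python
import numpy as np

R_GAS = 8.314  # J mol^-1 K^-1

def arrhenius_fit(T_kelvin, t_explosion):
    """Fit ln(t) = Ea/(R*T) - ln(A): the slope gives the apparent
    activation energy Ea, the intercept the pre-exponential factor A."""
    slope, intercept = np.polyfit(1.0 / T_kelvin, np.log(t_explosion), 1)
    Ea = slope * R_GAS            # J/mol
    A = np.exp(-intercept)        # 1/s
    return Ea, A

# Hypothetical ODTX-style data: contact temperatures (K), times (s)
T = np.array([500.0, 510.0, 520.0, 530.0])
t = np.array([120.0, 60.0, 32.0, 18.0])
Ea, A = arrhenius_fit(T, t)
print(f"Ea = {Ea / 1e3:.0f} kJ/mol, A = {A:.2e} s^-1")
```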
Abstract:
This study was designed to determine the response of in vitro fermentation parameters to incremental levels of polyethylene glycol (PEG) when tanniniferous tree fruits (Dichrostachys cinerea, Acacia erioloba, A. erubiscens, A. nilotica and Piliostigma thonningii) were fermented using the Reading Pressure Technique. The trivalent ytterbium precipitable phenolics content of fruit substrates ranged from 175 g/kg DM in A. erubiscens to 607 g/kg DM in A. nilotica, while the soluble condensed tannin content ranged from 0.09 AU550nm/40 mg in A. erioloba to 0.52 AU550nm/40 mg in D. cinerea. The ADF was highest in P. thonningii fruits (402 g/kg DM) and lowest in A. nilotica fruits (165 g/kg DM). Increasing the level of PEG caused an exponential rise to a maximum (asymptote) in cumulative gas production, rate of gas production and nitrogen degradability for all substrates except P. thonningii fruits. Dry matter degradability for fruits containing higher levels of soluble condensed tannins (D. cinerea and P. thonningii) showed little response to incremental levels of PEG after incubation for 24 h. The minimum level of PEG required to maximize in vitro fermentation of tree fruits was found to be 200 mg PEG/g DM of sample for all tree species except A. erubiscens fruits, which required 100 mg PEG/g DM of sample. The study provides evidence that PEG levels lower than 1 g/g DM of sample can be used in in vitro tannin bioassays, reducing the cost of evaluating non-conventional tanniniferous feedstuffs used in developing countries in the tropics and subtropics. The use of in vitro nitrogen degradability in place of the favoured dry matter degradability improved the accuracy of PEG as a diagnostic tool for tannins in in vitro fermentation systems.
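The "exponential rise to a maximum" response can be fitted as y = y0 + a(1 − e^(−bx)); a sketch with scipy's curve_fit on hypothetical gas-production data (the values below are not from the study):

```python
import numpy as np
from scipy.optimize import curve_fit

def rise_to_max(x, y0, a, b):
    """Exponential rise from baseline y0 to asymptote y0 + a, rate b."""
    return y0 + a * (1.0 - np.exp(-b * x))

# Hypothetical cumulative gas production (ml) vs PEG level (mg/g DM)
peg = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 800.0])
gas = np.array([20.0, 33.0, 42.0, 51.0, 55.0, 56.0])
(y0, a, b), _ = curve_fit(rise_to_max, peg, gas, p0=(20.0, 40.0, 0.01))
print(f"asymptote ~{y0 + a:.1f} ml; 95% of the response is reached "
      f"by ~{np.log(20.0) / b:.0f} mg PEG/g DM")
```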
Abstract:
Exponential spectra are found to characterize variability of the Northern Annular Mode (NAM) for periods of less than 36 days, corresponding to the observed rounding of the autocorrelation function at lags of a few days. The characteristic persistence timescale at these high frequencies is found to be ∼5 days during both winter and summer. Beyond periods of 36 days, the characteristic decorrelation timescale is ∼20 days during winter and ∼6 days in summer. We conclude that the NAM cannot be described by autoregressive models at high frequencies; the spectra are more consistent with low-order chaos. We also propose that the NAM exhibits regime behaviour; however, the nature of this behaviour has yet to be identified.
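The decorrelation timescales quoted are e-folding times of the autocorrelation function. A sketch of that estimate on a synthetic daily series (an AR(1) process with a 20-day timescale, used purely for illustration; the abstract argues the real NAM is not well described by such models):

```python
import numpy as np

def efolding_timescale(x):
    """Lag (in sample steps) at which the autocorrelation of x first
    drops below 1/e -- the decorrelation timescale used to characterize
    the persistence of indices like the NAM."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    acf /= acf[0]
    below = np.nonzero(acf < 1.0 / np.e)[0]
    return below[0] if below.size else None

# Synthetic daily AR(1) series with a ~20-day e-folding timescale
rng = np.random.default_rng(1)
phi = np.exp(-1.0 / 20.0)
x = np.zeros(5000)
for t in range(1, x.size):
    x[t] = phi * x[t - 1] + rng.standard_normal()
print(f"estimated timescale: ~{efolding_timescale(x)} days")
```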
Abstract:
The ASTER Global Digital Elevation Model (GDEM) has made elevation data at 30 m spatial resolution freely available, enabling reinvestigation of morphometric relationships derived from limited field data using much larger sample sizes. These data are used to analyse a range of morphometric relationships derived for dunes (between dune height, spacing, and equivalent sand thickness) in the Namib Sand Sea, which was chosen because there are a number of extant studies that could be used for comparison with the results. The relative accuracy of GDEM for capturing dune height and shape was tested against multiple individual ASTER DEM scenes and against field surveys, highlighting the smoothing of the dune crest and resultant underestimation of dune height, and the omission of the smallest dunes, because of the 30 m sampling of ASTER DEM products. It is demonstrated that morphometric relationships derived from GDEM data are broadly comparable with relationships derived by previous methods, across a range of different dune types. The data confirm patterns of dune height, spacing and equivalent sand thickness mapped previously in the Namib Sand Sea, but add new detail to these patterns.
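Dune height and crest spacing of the kind analysed here can be extracted from a DEM transect by peak/trough detection; a sketch at 30 m sampling on a synthetic profile (the prominence threshold and the profile itself are illustrative):

```python
import numpy as np
from scipy.signal import find_peaks

def dune_metrics(elev, spacing_m=30.0):
    """Mean crest-to-crest spacing and crest-to-trough height from a
    1-D elevation transect sampled every spacing_m (e.g. 30 m GDEM)."""
    crests, _ = find_peaks(elev, prominence=2.0)
    troughs, _ = find_peaks(-elev, prominence=2.0)
    height = elev[crests].mean() - elev[troughs].mean()
    spacing = np.diff(crests).mean() * spacing_m
    return height, spacing

# Synthetic transect: 1200 m wavelength, 40 m crest-to-trough dunes
x = np.arange(0.0, 12000.0, 30.0)
elev = 600.0 + 20.0 * np.sin(2 * np.pi * x / 1200.0)
h, s = dune_metrics(elev)
print(f"mean height ~{h:.0f} m, mean spacing ~{s:.0f} m")
```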
Abstract:
A UK field experiment compared a complete factorial combination of three backgrounds (cvs Mercia, Maris Huntsman and Maris Widgeon), three alleles at the Rht-B1 locus as near-isogenic lines (NILs: rht-B1a (tall), Rht-B1b (semi-dwarf), Rht-B1c (severe dwarf)) and four nitrogen (N) fertilizer application rates (0, 100, 200 and 350 kg N/ha). Linear+exponential functions were fitted to the responses of grain yield (GY) and nitrogen-use efficiency (NUE; GY/available N) to N rate. Averaged over N rate and background, Rht-B1b conferred significantly (P<0.05) greater GY, NUE, N-uptake efficiency (NUpE; N in above-ground crop/available N) and N-utilization efficiency (NUtE; GY/N in above-ground crop) compared with rht-B1a and Rht-B1c. However, the economically optimal N rate (Nopt) for N:grain price ratios of 3.5:1 to 10:1 was also greater for Rht-B1b, and because NUE, NUpE and NUtE all declined with N rate, Rht-B1b failed to increase NUE or its components at Nopt. The adoption of semi-dwarf lines in temperate and humid regions, and the greater N rates that such adoption justifies economically, greatly increases land-use efficiency, but not necessarily NUE.
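With a linear+exponential response y = a + b·r^N + c·N, the economic optimum Nopt solves dy/dN = b·ln(r)·r^N + c equal to the N:grain price ratio. A sketch with illustrative parameters (not the paper's fitted values):

```python
import numpy as np

def lin_exp(N, a, b, r, c):
    """Linear + exponential N response: y = a + b*r**N + c*N (kg/ha)."""
    return a + b * r ** N + c * N

def n_opt(b, r, c, price_ratio):
    """N rate where marginal yield dy/dN = b*ln(r)*r**N + c equals the
    N:grain price ratio (kg grain per kg N); defined only while the
    fitted linear slope c stays below the price ratio."""
    return np.log((price_ratio - c) / (b * np.log(r))) / np.log(r)

# Illustrative parameter values, assumed for this sketch
a, b, r, c = 9200.0, -4700.0, 0.99, 1.5
for p in (3.5, 10.0):
    N = n_opt(b, r, c, p)
    print(f"price ratio {p}:1 -> Nopt ~{N:.0f} kg N/ha, "
          f"yield {lin_exp(N, a, b, r, c):.0f} kg/ha")
```

Note that the higher price ratio yields the lower Nopt, matching the intuition that dearer fertilizer shifts the economic optimum down.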
Abstract:
This study proposes a utility-based framework for the determination of optimal hedge ratios (OHRs) that can allow for the impact of higher moments on hedging decisions. We examine the entire hyperbolic absolute risk aversion family of utilities, which includes quadratic, logarithmic, power, and exponential utility functions. We find that for both moderate and large spot (commodity) exposures, out-of-sample hedges constructed allowing for nonzero higher moments perform better than the simpler OLS hedge ratio. The picture is, however, not uniform across our seven spot commodities, as there is one instance (cotton) for which modeling higher moments decreases welfare out-of-sample relative to the simpler OLS. We support our empirical findings with a theoretical analysis of optimal hedging decisions, and we uncover a novel link between OHRs and the minimax hedge ratio, that is, the ratio that minimizes the largest loss of the hedged position.
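The OLS benchmark referred to is the minimum-variance hedge ratio, the slope from regressing spot returns on futures returns. A sketch with simulated return series (the data and the 0.9 "true" ratio are made up):

```python
import numpy as np

def ols_hedge_ratio(spot_ret, fut_ret):
    """Minimum-variance (OLS) hedge ratio: slope of spot returns on
    futures returns, the benchmark against which the utility-based
    OHRs are compared."""
    cov = np.cov(spot_ret, fut_ret, ddof=1)
    return cov[0, 1] / cov[1, 1]

# Hypothetical weekly spot and futures returns for one commodity
rng = np.random.default_rng(7)
fut = 0.02 * rng.standard_normal(104)
spot = 0.9 * fut + 0.005 * rng.standard_normal(104)
print(f"OLS hedge ratio: {ols_hedge_ratio(spot, fut):.3f}")  # ~0.9
```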
Abstract:
The case for property has typically rested on the application of modern portfolio theory (MPT), in that property has been shown to offer increased diversification benefits within a multi-asset portfolio without hurting portfolio returns, especially for lower-risk portfolios. However, this view is based on the use of historic, usually appraisal-based, data for property. Recent research suggests strongly that such data significantly underestimate the risk characteristics of property, because appraisals explicitly or implicitly smooth out much of the real volatility in property returns. This paper examines the portfolio diversification effects of including property in a multi-asset portfolio, using UK appraisal-based (smoothed) data and several derived de-smoothed series. Having considered the effects of de-smoothing, we then consider the inclusion of a further low-risk asset (cash) in order to investigate whether property's place in a low-risk portfolio is maintained. The conclusion of this study is that the previously supposed benefits of including property have been overstated. Although property may still have a place in a 'balanced' institutional portfolio, the case for property needs to be reassessed and not based simplistically on the application of MPT.
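A common first-order de-smoothing of appraisal-based returns (in the spirit of the derived series used here) is r*ₜ = (rₜ − α·rₜ₋₁)/(1 − α), which reinflates the volatility the appraisal process removes. A sketch; the smoothing parameter α = 0.8 and the return series are illustrative assumptions:

```python
import numpy as np

def desmooth(returns, alpha=0.8):
    """First-order de-smoothing of an appraisal-based return series:
    r*_t = (r_t - alpha * r_{t-1}) / (1 - alpha), where alpha is the
    assumed appraisal smoothing parameter (illustrative here)."""
    r = np.asarray(returns, dtype=float)
    return (r[1:] - alpha * r[:-1]) / (1.0 - alpha)

# Hypothetical quarterly appraisal-based index returns
smoothed = np.array([0.020, 0.018, 0.015, 0.010, -0.002, 0.008])
unsmoothed = desmooth(smoothed, alpha=0.8)
print(f"smoothed vol: {np.std(smoothed, ddof=1):.3f}, "
      f"de-smoothed vol: {np.std(unsmoothed, ddof=1):.3f}")
```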
Abstract:
Methods for producing nonuniform transformations, or regradings, of discrete data are discussed. The transformations are useful in image processing, principally for enhancement and normalization of scenes. Regradings which “equidistribute” the histogram of the data, that is, which transform it into a constant function, are determined. Techniques for smoothing the regrading, dependent upon a continuously variable parameter, are presented. Generalized methods for constructing regradings such that the histogram of the data is transformed into any prescribed function are also discussed. Numerical algorithms for implementing the procedures and applications to specific examples are described.
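The "equidistributing" regrading corresponds to histogram equalization: map each grey level through the normalized cumulative histogram so that the output histogram is approximately constant. A minimal sketch for 8-bit data (the low-contrast test image is made up):

```python
import numpy as np

def equidistribute(img, levels=256):
    """Regrading that 'equidistributes' the histogram: map each grey
    level through the normalized cumulative histogram so the output
    histogram is approximately constant."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])        # normalize to [0, 1]
    lut = np.round(cdf * (levels - 1)).astype(img.dtype)
    return lut[img]

# Hypothetical low-contrast 8-bit scene
rng = np.random.default_rng(3)
img = rng.integers(90, 140, size=(64, 64), dtype=np.uint8)
out = equidistribute(img)
print(img.min(), img.max(), "->", out.min(), out.max())
```

The parameter-dependent smoothing of the regrading that the abstract mentions could, for instance, be imitated by blending this mapping with the identity map under a continuously variable weight.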
Abstract:
By eliminating the short range negative divergence of the Debye–Hückel pair distribution function, but retaining the exponential charge screening known to operate at large interparticle separation, the thermodynamic properties of one-component plasmas of point ions or charged hard spheres can be well represented even in the strong coupling regime. Predicted electrostatic free energies agree within 5% of simulation data for typical Coulomb interactions up to a factor of 10 times the average kinetic energy. Here, this idea is extended to the general case of a uniform ionic mixture, comprising an arbitrary number of components, embedded in a rigid neutralizing background. The new theory is implemented in two ways: (i) by an unambiguous iterative algorithm that requires numerical methods and breaks the symmetry of cross correlation functions; and (ii) by invoking generalized matrix inverses that maintain symmetry and yield completely analytic solutions, but which are not uniquely determined. The extreme computational simplicity of the theory is attractive when considering applications to complex inhomogeneous fluids of charged particles.
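The short-range fix can be illustrated for a one-component plasma: the linearized Debye–Hückel pair distribution g(r) = 1 − Γe^(−κr)/r turns negative at small separations, whereas the exponentiated form g(r) = exp(−Γe^(−κr)/r) remains positive while keeping the same screened tail at large r. A dimensionless sketch; Γ and κ are illustrative values:

```python
import numpy as np

# Dimensionless one-component-plasma illustration: distances in units
# of the ion-sphere radius; coupling Gamma and screening kappa assumed.
gamma, kappa = 3.0, 1.0
r = np.linspace(0.05, 5.0, 6)

g_linear = 1.0 - gamma * np.exp(-kappa * r) / r      # Debye-Hueckel
g_nonlin = np.exp(-gamma * np.exp(-kappa * r) / r)   # exponentiated form

for ri, gl, gn in zip(r, g_linear, g_nonlin):
    print(f"r={ri:4.2f}  linear g={gl:8.3f}  nonlinear g={gn:6.3f}")
# The linear form is negative (unphysical) at small r; the nonlinear
# form stays positive, and both decay to 1 with the same screening tail.
```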
Abstract:
Grassland restoration is a key management tool contributing to the long-term maintenance of insect populations, providing functional connectivity and mitigating against extinction debt across landscapes. As knowledge of grassland insect communities is limited, the lag between the initiation of restoration and the ability of these new habitats to contribute to such processes is unclear. Using ten data sets, ranging from 3 to 14 years, we investigate the lag between restoration and the establishment of phytophagous beetle assemblages typical of species-rich grasslands. We used traits and ecological characteristics to determine factors limiting beetle colonisation, and also considered how food-web structure changed during restoration. For sites where seed addition of host-plants occurred, success in replicating beetle assemblages increased over time following a negative exponential function. Extrapolation beyond the existing data set tentatively suggested that success would plateau after 20 years, representing a c. 60% increase in assemblage similarity to target grasslands. In the absence of seed addition, similarity to the target grasslands showed no increase over time. Where seed addition was used, the connectance of plant-herbivore food webs decreased over time, approaching values typical of species-rich grasslands after c. 7 years. This trend was, however, dependent on the inclusion of a single site containing data in excess of 6 years of restoration management. Beetles not capable of flight, those showing high degrees of host-plant specialisation, and species feeding on nationally rare host plants take between 1 and 3 years longer to colonise. Successful grassland restoration is underpinned by the establishment of host-plants, although individual species traits compound the effects of poor host-plant establishment to slow colonisation. The use of pro-active grassland restoration to mitigate against future environmental change should account for lag periods in excess of 10 years if the value of these habitats is to be fully realised.
Abstract:
Small propagules like pollen or fungal spores may be dispersed by the wind over distances of hundreds or thousands of kilometres, even though the median dispersal distance may be only a few metres. Such long-distance dispersal is a stochastic event which may be exceptionally important in shaping a population. It has been found repeatedly in field studies that subpopulations of wind-dispersed fungal pathogens virulent on cultivars with newly introduced, effective resistance genes are dominated by one or very few genotypes. The role of propagule dispersal distributions with distinct behaviour at long distances in generating this characteristic population structure was studied by computer simulation of the dispersal of clonal organisms in a heterogeneous environment with fields of unselective and selective hosts. Power-law distributions generated founder events in which new, virulent genotypes rapidly colonized fields of resistant crop varieties and subsequently dominated the pathogen population on both selective and unselective varieties, in agreement with data on rust and powdery mildew fungi. An exponential dispersal function, with extremely rare dispersal over long distances, resulted in slower colonization of resistant varieties by virulent pathogens, or even no colonization if the distance between susceptible source and resistant target fields was sufficiently large. The founder events resulting from long-distance dispersal were highly stochastic, and exact quantitative prediction of genotype frequencies will therefore always be difficult.
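The contrast between kernels can be made concrete by sampling dispersal distances from an exponential and a power-law (Pareto) kernel with matched medians and comparing tail frequencies; the median, exponent, and cutoff distances below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
median = 5.0  # metres -- a typical median dispersal distance

# Exponential kernel with the given median
lam = np.log(2.0) / median
d_exp = rng.exponential(1.0 / lam, n)

# Pareto (power-law) kernel x_min*(1-u)^(-1/alpha) with the same median
alpha = 1.5
x_min = median / 2.0 ** (1.0 / alpha)
d_pow = x_min * (1.0 - rng.random(n)) ** (-1.0 / alpha)

# Fraction of propagules travelling beyond 1 km and 10 km
for km in (1.0, 10.0):
    far = km * 1000.0
    print(f">{km:g} km: exponential {np.mean(d_exp > far):.2e}, "
          f"power law {np.mean(d_pow > far):.2e}")
# The exponential tail is effectively zero at these distances, while
# the power law still delivers rare long-distance founders.
```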