545 results for smoothing


Relevance: 10.00%

Abstract:

Capturing the pattern of structural change is a relevant task in applied demand analysis, as consumer preferences may vary significantly over time. Filtering and smoothing techniques have recently played an increasingly important role. A dynamic Almost Ideal Demand System with random walk parameters is estimated in order to detect modifications in consumer habits and preferences, as well as changes in the behavioural response to prices and income. Systemwise estimation, consistent with the underlying constraints from economic theory, is achieved through the EM algorithm. The proposed model is applied to UK aggregate consumption of alcohol and tobacco, using quarterly data from 1963 to 2003. Increased alcohol consumption is explained by a preference shift, addictive behaviour and a lower price elasticity. The dynamic and time-varying specification is consistent with the theoretical requirements imposed at each sample point.
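As a rough illustration of the smoothing machinery behind such random-walk-parameter models (not the authors' estimator; the noise variances `q` and `r` are fixed by hand here, whereas EM would estimate them), a minimal Kalman filter with a Rauch-Tung-Striebel backward pass might look like this:

```python
import numpy as np

def kalman_rts_smoother(y, q=0.01, r=1.0):
    """Local-level model: x_t = x_{t-1} + w_t (random walk), y_t = x_t + v_t.
    In the full demand-system setting, EM would re-estimate q and r."""
    n = len(y)
    xf, pf = np.zeros(n), np.zeros(n)      # filtered mean / variance
    xp, pp = np.zeros(n), np.zeros(n)      # one-step predictions
    x, p = y[0], 1.0
    for t in range(n):
        xp[t], pp[t] = x, p + q            # predict under random-walk dynamics
        k = pp[t] / (pp[t] + r)            # Kalman gain
        x = xp[t] + k * (y[t] - xp[t])     # measurement update
        p = (1 - k) * pp[t]
        xf[t], pf[t] = x, p
    xs = xf.copy()                          # Rauch-Tung-Striebel smoothing pass
    for t in range(n - 2, -1, -1):
        g = pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
    return xs

rng = np.random.default_rng(0)
drifting_coef = np.cumsum(rng.normal(0, 0.1, 160))   # slowly drifting "preference"
smoothed = kalman_rts_smoother(drifting_coef + rng.normal(0, 1.0, 160))
```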

Relevance: 10.00%

Abstract:

A fully automated procedure to extract and to image local fibre orientation in biological tissues from scanning X-ray diffraction is presented. The preferred chitin fibre orientation in the flow sensing system of crickets is determined with high spatial resolution by applying synchrotron radiation based X-ray microbeam diffraction in conjunction with advanced sample sectioning using a UV micro-laser. The data analysis is based on an automated detection of azimuthal diffraction maxima after 2D convolution filtering (smoothing) of the 2D diffraction patterns. Under the assumption of crystallographic fibre symmetry around the morphological fibre axis, the evaluation method allows mapping the three-dimensional orientation of the fibre axes in space. The resulting two-dimensional maps of the local fibre orientations - together with the complex shape of the flow sensing system - may be useful for a better understanding of the mechanical optimization of such tissues.
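A hedged sketch of the core analysis step (assuming numpy/scipy; function names, band limits and kernel width are illustrative, not the authors' pipeline): smooth the 2D pattern by convolution with a Gaussian kernel, bin intensity by azimuth within a radial band, and detect the azimuthal maxima.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import find_peaks

def azimuthal_maxima(pattern, center, r_min, r_max, n_bins=360, sigma=2.0):
    """Return azimuthal peak positions (degrees) of a 2D diffraction pattern."""
    smoothed = gaussian_filter(pattern, sigma)        # 2D convolution smoothing
    yy, xx = np.indices(pattern.shape)
    dy, dx = yy - center[0], xx - center[1]
    r = np.hypot(dx, dy)
    phi = np.degrees(np.arctan2(dy, dx)) % 360
    band = (r >= r_min) & (r <= r_max)                # radial band of interest
    bins = ((phi[band] * n_bins / 360).astype(int)) % n_bins
    intensity = np.bincount(bins, weights=smoothed[band], minlength=n_bins)
    counts = np.maximum(np.bincount(bins, minlength=n_bins), 1)
    profile = intensity / counts                      # mean intensity vs azimuth
    wrapped = np.concatenate([profile, profile[:20]]) # handle 0/360 wrap-around
    peaks, _ = find_peaks(wrapped, prominence=profile.std())
    return np.unique(peaks % n_bins) * 360.0 / n_bins
```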

Relevance: 10.00%

Abstract:

Two experiments examine the effect on an immediate recall test of simulating a reverberant auditory environment in which auditory distracters in the form of speech are played to the participants (the 'irrelevant sound effect'). An echo-intensive environment simulated by the addition of reverberation to the speech reduced the extent of 'changes in state' in the irrelevant speech stream by smoothing the profile of the waveform. In both experiments, the reverberant auditory environment produced significantly smaller irrelevant sound distraction effects than an echo-free environment. Results are interpreted in terms of the changing-state hypothesis, which states that the acoustic content of irrelevant sound, rather than its phonology or semantics, determines the extent of the irrelevant sound effect (ISE).
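The acoustic mechanism (reverberation smearing energy over time and so smoothing the distracter's envelope) can be sketched numerically. Everything below is a toy construction, not the experiments' stimuli: a noise-burst "speech" signal, an exponentially decaying impulse response, and a crude envelope-variability proxy for "changes in state".

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 16000
# Noise carrier with a step-changing amplitude envelope, mimicking speech bursts.
speech_like = rng.normal(size=fs) * np.repeat(rng.uniform(0.1, 1.0, 50), fs // 50)

# Toy reverb: exponentially decaying noise tail (~0.4 s), energy-normalised.
t = np.arange(int(0.4 * fs)) / fs
ir = rng.normal(size=t.size) * np.exp(-t / 0.1)
ir /= np.sqrt(np.sum(ir ** 2))
reverberant = np.convolve(speech_like, ir)[:speech_like.size]

def envelope_variability(x, win=400):
    """RMS envelope, then the std of its increments: a crude 'changes in state' proxy."""
    env = np.sqrt(np.convolve(x ** 2, np.ones(win) / win, mode="same"))
    return np.std(np.diff(env))

print(envelope_variability(speech_like), envelope_variability(reverberant))
```

The reverberant version should show the smaller value, consistent with the smoothing account in the abstract.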

Relevance: 10.00%

Abstract:

The usefulness of any simulation of atmospheric tracers using low-resolution winds relies on both the dominance of large spatial scales in the strain field and the time dependence that results in a cascade in tracer scales. Here, a quantitative study on the accuracy of such tracer studies is made using the contour advection technique. It is shown that, although contour stretching rates are very insensitive to the spatial truncation of the wind field, the displacement errors in filament position are sensitive. A knowledge of displacement characteristics is essential if Lagrangian simulations are to be used for the inference of airmass origin. A quantitative lower estimate is obtained for the tracer scale factor (TSF): the ratio of the smallest resolved scale in the advecting wind field to the smallest “trustworthy” scale in the tracer field. For a baroclinic wave life cycle the TSF = 6.1 ± 0.3, while for the Northern Hemisphere wintertime lower stratosphere the TSF = 5.5 ± 0.5, when using the most stringent definition of the trustworthy scale. The similarity in the TSF for the two flows is striking, and an explanation is discussed in terms of the activity of potential vorticity (PV) filaments. Uncertainty in contour initialization is investigated for the stratospheric case. The effect of smoothing initial contours is to introduce a spinup time (2–3 days), after which wind field truncation errors take over from initialization errors. It is also shown that false detail from the proliferation of finescale filaments limits the useful lifetime of such contour advection simulations to 3σ⁻¹ days, where σ is the filament thinning rate, unless filaments narrower than the trustworthy scale are removed by contour surgery. In addition, PV analysis error and diabatic effects are so strong that only PV filaments wider than 50 km are at all believable, even for very high-resolution winds. The minimum wind field resolution required to accurately simulate filaments down to the erosion scale in the stratosphere (given an initial contour) is estimated, and the implications for the modeling of atmospheric chemistry are briefly discussed.
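To make the 3σ⁻¹-days rule of thumb concrete, here is a small hedged sketch (synthetic numbers, not values from the paper): estimate the exponential stretching/thinning rate σ from the growth of contour length, then derive the useful simulation lifetime.

```python
import numpy as np

def stretching_rate(lengths, dt):
    """Estimate sigma from a contour-length series, assuming L(t) ~ L0 * exp(sigma * t)."""
    t = np.arange(len(lengths)) * dt
    sigma, _ = np.polyfit(t, np.log(lengths), 1)  # slope of log-length vs time
    return sigma

# Synthetic contour lengths sampled every 0.25 days with sigma = 0.6 per day.
lengths = 1000.0 * np.exp(0.6 * np.arange(0, 10, 0.25))
sigma = stretching_rate(lengths, dt=0.25)
print(f"sigma ~ {sigma:.2f}/day; useful lifetime ~ {3.0 / sigma:.1f} days")
```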

Relevance: 10.00%

Abstract:

Volatility, or the variability of the underlying asset, is one of the key components of property derivative pricing and of the application of real option models in development analysis. There has been relatively little work on volatility in terms of its application to property derivatives and real options analysis. Most research on volatility stems from investment performance (Nathakumaran & Newell 1995; Brown & Matysiak 2000; Booth & Matysiak 2001). Historic standard deviation is often used as a proxy for volatility, and there has been a reliance on indices, which are subject to valuation smoothing effects. Transaction prices are considered to be more volatile than the traditional standard deviations of appraisal-based indices. This could arguably lead to inefficiencies and mis-pricing, particularly if it is also accepted that changes evolve randomly over time and that future volatility, not an ex-post measure, is the key (Sing 1998). If history does not repeat, or provides an unreliable measure, then estimating model-based (implied) volatility is an alternative approach (Patel & Sing 2000). This paper is the first of two that employ alternative approaches to calculating and capturing volatility in UK real estate for the purposes of applying the measure to derivative pricing and real option models. It draws on a uniquely constructed IPD/Gerald Eve transactions database containing over 21,000 properties over the period 1983-2005. This first paper examines the magnitude of historic volatility associated with asset returns by sector and geographic spread; the subsequent paper will focus on model-based (implied) volatility.
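A minimal sketch of the critique, under assumed numbers: annualised historic standard deviation as the volatility proxy, computed both for a noisy "transaction" return series and for a moving-average-smoothed "appraisal" version of the same series. The smoothing window and return parameters are hypothetical.

```python
import numpy as np

def historic_volatility(returns, periods_per_year=12):
    """Annualised historic standard deviation of periodic returns - the ex-post
    proxy that the abstract argues is depressed by valuation smoothing."""
    return np.std(returns, ddof=1) * np.sqrt(periods_per_year)

rng = np.random.default_rng(2)
transaction = rng.normal(0.005, 0.03, 240)                         # monthly returns
appraisal = np.convolve(transaction, np.ones(6) / 6, mode="same")  # smoothed index

# The smoothed series reports markedly lower "risk" for the same underlying asset.
print(historic_volatility(transaction), historic_volatility(appraisal))
```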

Relevance: 10.00%

Abstract:

The ASTER Global Digital Elevation Model (GDEM) has made elevation data at 30 m spatial resolution freely available, enabling reinvestigation of morphometric relationships derived from limited field data using much larger sample sizes. These data are used to analyse a range of morphometric relationships derived for dunes (between dune height, spacing, and equivalent sand thickness) in the Namib Sand Sea, which was chosen because there are a number of extant studies that could be used for comparison with the results. The relative accuracy of GDEM for capturing dune height and shape was tested against multiple individual ASTER DEM scenes and against field surveys, highlighting the smoothing of the dune crest and resultant underestimation of dune height, and the omission of the smallest dunes, because of the 30 m sampling of ASTER DEM products. It is demonstrated that morphometric relationships derived from GDEM data are broadly comparable with relationships derived by previous methods, across a range of different dune types. The data confirm patterns of dune height, spacing and equivalent sand thickness mapped previously in the Namib Sand Sea, but add new detail to these patterns.
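The crest-smoothing effect of 30 m posts can be sketched with synthetic numbers (dune height, crest width and spacing below are hypothetical, not Namib values): block-averaging a 5 m profile to 30 m visibly clips sharp crests, underestimating dune height as described above.

```python
import numpy as np

# Synthetic 5 m field-survey profile: 80 m Gaussian crests every 300 m.
x = np.arange(0, 3000, 5.0)
profile = 80 * np.exp(-(((x % 300) - 150) ** 2) / (2 * 15 ** 2))

# Block-average to 30 m posts, mimicking GDEM-style sampling.
coarse = profile.reshape(-1, 6).mean(axis=1)
print(f"5 m crest: {profile.max():.1f} m, 30 m crest: {coarse.max():.1f} m")
```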

Relevance: 10.00%

Abstract:

The case for property has typically rested on the application of modern portfolio theory (MPT), in that property has been shown to offer increased diversification benefits within a multi-asset portfolio without hurting portfolio returns, especially for lower-risk portfolios. However, this view is based upon the use of historic, usually appraisal-based, data for property. Recent research suggests strongly that such data significantly underestimate the risk characteristics of property, because appraisals explicitly or implicitly smooth out much of the real volatility in property returns. This paper examines the portfolio diversification effects of including property in a multi-asset portfolio, using UK appraisal-based (smoothed) data and several derived de-smoothed series. Having considered the effects of de-smoothing, we then consider the inclusion of a further low-risk asset (cash) in order to investigate whether property's place in a low-risk portfolio is maintained. The conclusion of this study is that the previously supposed benefits of including property have been overstated. Although property may still have a place in a 'balanced' institutional portfolio, the case for property needs to be reassessed and not be based simplistically on the application of MPT.
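One common way to derive such de-smoothed series (one of several in the literature; the paper does not necessarily use exactly this filter) is Geltner-style reverse filtering of the appraisal returns, sketched here with an assumed smoothing parameter:

```python
import numpy as np

def desmooth(smoothed_returns, alpha=0.8):
    """Geltner-style de-smoothing: if appraisers set r*_t = alpha * r*_{t-1}
    + (1 - alpha) * r_t, then r_t = (r*_t - alpha * r*_{t-1}) / (1 - alpha).
    alpha is an assumed smoothing parameter, not a value from the paper."""
    r = np.asarray(smoothed_returns, dtype=float)
    out = np.empty_like(r)
    out[0] = r[0]                                  # no lag available at t = 0
    out[1:] = (r[1:] - alpha * r[:-1]) / (1 - alpha)
    return out
```

The recovered series has a much higher standard deviation than the appraisal input, which is precisely why the diversification case weakens once de-smoothed data are used.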

Relevance: 10.00%

Abstract:

Methods for producing nonuniform transformations, or regradings, of discrete data are discussed. The transformations are useful in image processing, principally for enhancement and normalization of scenes. Regradings which “equidistribute” the histogram of the data, that is, which transform it into a constant function, are determined. Techniques for smoothing the regrading, dependent upon a continuously variable parameter, are presented. Generalized methods for constructing regradings such that the histogram of the data is transformed into any prescribed function are also discussed. Numerical algorithms for implementing the procedures and applications to specific examples are described.
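An equidistributing regrading is essentially histogram equalisation via the empirical CDF. A minimal sketch follows (the paper's own algorithms are more general, covering smoothed regradings and arbitrary prescribed target histograms):

```python
import numpy as np

def equidistribute(data, levels=256):
    """Regrade discrete data so its histogram becomes (approximately) flat:
    rank-transform the values (empirical CDF) and rescale to `levels` grades."""
    flat = data.ravel()
    ranks = flat.argsort().argsort()               # rank of each value
    graded = np.floor(ranks * levels / flat.size).astype(int)
    return graded.reshape(data.shape)
```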

Relevance: 10.00%

Abstract:

The plume of Ice Shelf Water (ISW) flowing into the Weddell Sea over the Filchner sill contributes to the formation of Antarctic Bottom Water. The Filchner overflow is simulated using a hydrostatic, primitive equation three-dimensional ocean model with a 0.5–2 Sv ISW influx above the Filchner sill. The best fit to mooring temperature observations is found with influxes of 0.5 and 1 Sv, below a previous estimate of 1.6 ± 0.5 Sv based on sparse mooring velocities. The plume first moves north over the continental shelf, and then turns west along the slope of the continental shelf break, where it breaks up into subplumes and domes, some of which then move downslope. Other subplumes run into the eastern submarine ridge and propagate downslope along the ridge in a chaotic manner. The plume crosses the next, western ridge through several paths. Despite a number of discrepancies with observational data, the model reproduces many attributes of the flow. In particular, we argue that the temporal variability shown by the observations can largely be attributed to the unstable structure of the flow, where the temperature fluctuations are determined by the motion of the domes past the moorings. Our sensitivity studies show that while thermobaricity plays a role, its effect is small for the flows considered. Smoothing out the ridges demonstrates that their presence strongly affects the plume shape around the ridges. An increase in the bottom drag or viscosity leads to slowing down, and hence thickening and widening, of the plume.

Relevance: 10.00%

Abstract:

We present a new sparse shape modeling framework based on the Laplace-Beltrami (LB) eigenfunctions. Traditionally, the LB eigenfunctions are used as a basis for intrinsically representing surface shapes by forming a Fourier series expansion. To reduce high-frequency noise, only the first few terms are used in the expansion and higher-frequency terms are simply thrown away. However, some lower-frequency terms may not necessarily contribute significantly to reconstructing the surfaces. Motivated by this idea, we propose to retain only the significant eigenfunctions by imposing an l1-penalty. The new sparse framework can further avoid the additional surface-based smoothing often used in the field. The proposed approach is applied to investigating the influence of age (38-79 years) and gender on amygdala and hippocampus shapes in the normal population. In addition, we show how the emotional response is related to the anatomy of the subcortical structures.
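The selection step can be sketched with an l1-penalised (lasso) fit of basis coefficients. As a stand-in assumption, a cosine basis replaces the LB eigenfunctions of an actual surface mesh, and the penalty weight is arbitrary:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Stand-in basis: cosines on [0, 1] in place of LB eigenfunctions (assumption).
n, k = 400, 60
t = np.linspace(0, 1, n)
basis = np.stack([np.cos(np.pi * j * t) for j in range(k)], axis=1)

# Synthetic "shape signal" built from two basis terms plus noise.
rng = np.random.default_rng(3)
signal = 2 * basis[:, 3] - 1.5 * basis[:, 17] + rng.normal(0, 0.1, n)

fit = Lasso(alpha=0.01).fit(basis, signal)
kept = np.flatnonzero(fit.coef_)   # only the significant eigenfunctions survive
print(kept)
```

The l1 penalty zeroes out insignificant coefficients regardless of their frequency, which is the point of departure from simple low-pass truncation.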

Relevance: 10.00%

Abstract:

Evolutionary meta-algorithms for pulse shaping of broadband femtosecond-duration laser pulses are proposed. The genetic algorithm searching the evolutionary landscape for desired pulse shapes consists of a population of waveforms (genes), each made from two concatenated vectors specifying phases and magnitudes, respectively, over a range of frequencies. Frequency-domain operators such as mutation, two-point crossover, average crossover, polynomial phase mutation, creep, and three-point smoothing, as well as a time-domain crossover, are combined to produce fitter offspring at each iteration step. The algorithm applies roulette wheel selection, elitism, and linear fitness scaling to the gene population. A differential evolution (DE) operator that provides a source of directed mutation, and new wavelet operators, are proposed. Using properly tuned parameters for DE, the meta-algorithm is used to solve a waveform matching problem. Tuning allows either a greedy directed search near the best known solution or a robust search across the entire parameter space.
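A hedged sketch of the gene representation and two of the named operators (sizes and parameters are illustrative; this is not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(4)
N_FREQ = 64   # gene = [phases | magnitudes] over N_FREQ frequency bins

def random_gene():
    """Concatenated phase and magnitude vectors, as described in the abstract."""
    return np.concatenate([rng.uniform(-np.pi, np.pi, N_FREQ),
                           rng.uniform(0, 1, N_FREQ)])

def two_point_crossover(a, b):
    """Copy a random contiguous segment of parent b into a copy of parent a."""
    i, j = sorted(rng.choice(len(a), 2, replace=False))
    child = a.copy()
    child[i:j] = b[i:j]
    return child

def three_point_smooth(gene):
    """Three-point running mean over the phase half of the gene - one of the
    frequency-domain smoothing operators named in the abstract."""
    out = gene.copy()
    ph = out[:N_FREQ]
    out[1:N_FREQ - 1] = (ph[:-2] + ph[1:-1] + ph[2:]) / 3
    return out
```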

Relevance: 10.00%

Abstract:

Sea surface temperature (SST) can be estimated from day and night observations of the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) by optimal estimation (OE). We show that exploiting the 8.7 μm channel, in addition to the “traditional” wavelengths of 10.8 and 12.0 μm, improves OE SST retrieval statistics in validation. However, the main benefit is an improvement in the sensitivity of the SST estimate to variability in true SST. In a fair, single-pixel comparison, the 3-channel OE gives better results than the SST estimation technique presently operational within the Ocean and Sea Ice Satellite Application Facility. This operational technique uses SST retrieval coefficients, followed by a bias-correction step informed by radiative transfer simulation. However, the operational technique also applies an additional “atmospheric correction smoothing”, which improves its noise performance but hitherto had no analogue within the OE framework. Here, we propose an analogue to atmospheric correction smoothing, based on the expectation that atmospheric total column water vapour has a longer spatial correlation length scale than SST features. The approach extends the observations input to the OE to include the averaged brightness temperatures (BTs) of nearby clear-sky pixels, in addition to the BTs of the pixel for which SST is being retrieved. The retrieved quantities are then the single-pixel SST and the clear-sky total column water vapour averaged over the vicinity of the pixel. This reduces the noise in the retrieved SST significantly. The robust standard deviation of the new OE SST compared to matched drifting buoys becomes 0.39 K for all data. The smoothed OE gives an SST sensitivity of 98% on average. This means that diurnal temperature variability and ocean frontal gradients are more faithfully estimated, and that the influence of the prior SST used is minimal (2%). This benefit is not available using traditional atmospheric correction smoothing.
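The retrieval core is a standard linearised optimal-estimation update; here is a generic sketch (K, S_a and S_e denote the Jacobian, prior covariance and observation-error covariance; in the scheme above the state vector would be [SST, locally averaged TCWV], but nothing here is specific to the authors' code):

```python
import numpy as np

def oe_retrieval(y, f_xa, K, x_a, S_a, S_e):
    """One linearised OE step: x = x_a + G (y - F(x_a)), with gain
    G = (K^T S_e^-1 K + S_a^-1)^-1 K^T S_e^-1.
    y    : observed BTs (single pixel plus nearby clear-sky averages)
    f_xa : simulated BTs at the prior state x_a"""
    Si = np.linalg.inv(S_e)
    G = np.linalg.solve(K.T @ Si @ K + np.linalg.inv(S_a), K.T @ Si)
    x = x_a + G @ (y - f_xa)
    A = G @ K            # averaging kernel; A[0, 0] is the SST sensitivity
    return x, A
```

The sensitivity figure quoted in the abstract (98%) corresponds to the SST diagonal element of the averaging kernel A being close to one.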

Relevance: 10.00%

Abstract:

The use of Bayesian inference for time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing-spline-based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is a likelihood inspired by a local Whittle approximation. This choice, along with a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We demonstrate the algorithm by tracking chirps and analysing musical data.
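The online machinery can be caricatured by a bootstrap particle filter with random-walk dynamics on the latent state; the local-Whittle likelihood is abstracted into a user-supplied `loglik`, so this is a skeleton under stated assumptions, not the authors' algorithm:

```python
import numpy as np

def bootstrap_particle_filter(ys, loglik, n_particles=500, step=0.1):
    """Generic bootstrap particle filter: random-walk propagation of the
    latent state, importance weighting by loglik(y, particles), then
    multinomial resampling. Returns the filtered posterior means."""
    rng = np.random.default_rng(5)
    particles = rng.normal(0, 1, n_particles)
    estimates = []
    for y in ys:
        particles = particles + rng.normal(0, step, n_particles)  # propagate
        logw = loglik(y, particles)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)            # resample
        particles = particles[idx]
        estimates.append(particles.mean())
    return np.array(estimates)
```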

Relevance: 10.00%

Abstract:

Flood simulation models and hazard maps are only as good as the underlying data against which they are calibrated and tested. However, extreme flood events are by definition rare, so observational data on flood inundation extent are limited in both quality and quantity. The relative importance of these observational uncertainties has increased now that computing power and accurate lidar scans make it possible to run high-resolution 2D models to simulate floods in urban areas. However, the value of these simulations is limited by the uncertainty in the true extent of the flood. This paper addresses that challenge by analyzing a point dataset of maximum water extent from a flood event on the River Eden at Carlisle, United Kingdom, in January 2005. The observation dataset is based on a collection of wrack and water marks from two postevent surveys. A smoothing algorithm for identifying, quantifying, and reducing localized inconsistencies in the dataset is proposed and evaluated, showing positive results. The proposed smoothing algorithm can be applied to improve the assessment of flood inundation models and the determination of risk zones on the floodplain.
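As a loose, generic sketch of what identifying and reducing localized inconsistencies could look like (the paper's actual algorithm may differ substantially; the search radius and tolerance below are invented):

```python
import numpy as np
from scipy.spatial import cKDTree

def smooth_marks(xy, z, radius=100.0, tol=0.5):
    """Compare each surveyed water-mark level z[i] with the median of its
    neighbours within `radius` metres; pull it to that median if it deviates
    by more than `tol` metres. xy is an (n, 2) array of mark positions."""
    tree = cKDTree(xy)
    z_out = z.copy()
    for i, pt in enumerate(xy):
        nbrs = [j for j in tree.query_ball_point(pt, radius) if j != i]
        if nbrs:
            med = np.median(z[np.array(nbrs)])
            if abs(z_out[i] - med) > tol:      # flagged as locally inconsistent
                z_out[i] = med                 # damp toward the neighbourhood level
    return z_out
```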

Relevance: 10.00%

Abstract:

Considerable effort is presently being devoted to producing high-resolution sea surface temperature (SST) analyses with a goal of spatial grid resolutions as low as 1 km. Because grid resolution is not the same as feature resolution, a method is needed to objectively determine the resolution capability and accuracy of SST analysis products. Ocean model SST fields are used in this study as simulated “true” SST data and subsampled based on actual infrared and microwave satellite data coverage. The subsampled data are used to simulate sampling errors due to missing data. Two different SST analyses are considered and run using both the full and the subsampled model SST fields, with and without additional noise. The results are compared as a function of spatial scales of variability using wavenumber auto- and cross-spectral analysis. The spectral variance at high wavenumbers (smallest wavelengths) is shown to be attenuated relative to the true SST because of smoothing that is inherent to both analysis procedures. Comparisons of the two analyses (both having grid sizes of roughly …) show important differences. One analysis tends to reproduce small-scale features more accurately when the high-resolution data coverage is good but produces more spurious small-scale noise when the high-resolution data coverage is poor. Analysis procedures can thus generate small-scale features with and without data, but the small-scale features in an SST analysis may be just noise when high-resolution data are sparse. Users must therefore be skeptical of high-resolution SST products, especially in regions where high-resolution (~5 km) infrared satellite data are limited because of cloud cover.
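The spectral comparison underlying such a study can be sketched along a 1D transect (rfft-based spectra; the 50% attenuation threshold used to define "feature resolution" below is an illustrative assumption, not the paper's criterion):

```python
import numpy as np

def power_spectrum(field_1d, dx):
    """One-dimensional wavenumber power spectrum of an SST transect."""
    f = np.fft.rfft(field_1d - field_1d.mean())
    k = np.fft.rfftfreq(field_1d.size, d=dx)
    return k, (np.abs(f) ** 2) / field_1d.size

def feature_resolution(truth, analysis, dx, threshold=0.5):
    """Smallest wavelength at which the analysis still retains `threshold`
    of the true spectral variance - a proxy for resolution capability."""
    k, p_true = power_spectrum(truth, dx)
    _, p_anal = power_spectrum(analysis, dx)
    ratio = p_anal[1:] / np.maximum(p_true[1:], 1e-12)
    attenuated = np.flatnonzero(ratio < threshold)
    return 1.0 / k[1:][attenuated[0]] if attenuated.size else 1.0 / k[-1]
```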