545 results for smoothing
Abstract:
Trends in sample extremes are of interest in many contexts, an example being environmental statistics. Parametric models are often used to model trends in such data, but they may not be suitable for exploratory data analysis. This paper outlines a semiparametric approach to smoothing sample extremes, based on local polynomial fitting of the generalized extreme value distribution and related models. The uncertainty of fits is assessed by using resampling methods. The methods are applied to data on extreme temperatures and on record times for the women's 3000 m race.
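As an illustration of the local fitting idea in the abstract above, the following is a minimal Python sketch (not the authors' code) of kernel-weighted maximum-likelihood estimation of the GEV location parameter over time; the Gaussian kernel, the bandwidth, the optimiser, and all names are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

def local_gev_location(t_obs, x_obs, t_grid, bandwidth=5.0):
    """Kernel-weighted ML estimate of the GEV location parameter at each grid point."""
    mu_hat = []
    for t0 in t_grid:
        w = np.exp(-0.5 * ((t_obs - t0) / bandwidth) ** 2)  # Gaussian kernel weights

        def neg_loglik(theta):
            mu, log_sigma, xi = theta
            # scipy's genextreme uses c = -xi relative to the usual GEV shape convention
            logpdf = genextreme.logpdf(x_obs, c=-xi, loc=mu, scale=np.exp(log_sigma))
            return -np.sum(w * logpdf)

        res = minimize(neg_loglik, x0=[x_obs.mean(), np.log(x_obs.std()), 0.1],
                       method="Nelder-Mead")
        mu_hat.append(res.x[0])
    return np.array(mu_hat)

A locally constant (rather than locally polynomial) fit is used here purely to keep the sketch short; a local-linear version would add slope parameters to theta in the same way.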
Abstract:
The onset of filamentation, following the interaction of a relatively long (τ_L ≈ 1 ns) and intense (I_L ≈ 5 × 10^14 W/cm^2) laser pulse with a neopentane-filled gas-bag target, has been experimentally studied via the proton radiography technique, in conditions of direct relevance to the indirect-drive inertial confinement fusion scheme. The density gradients associated with filamentation onset have been spatially resolved, yielding direct and unambiguous evidence of filament formation and quantitative information about the filamentation mechanism in agreement with previous theoretical modelling. The experimental data confirm that, once spatially smoothed laser beams are used, filamentation is not a relevant phenomenon during the propagation of the heating laser beams through typical hohlraum gas fills.
Abstract:
Efficient production of coherent harmonic radiation from solid targets relies critically on the formation of smooth, short density-scalelength plasmas. Recent experimental results (Dromey et al 2009 Nat. Phys. 5 146) suggest, however, that target roughness on the scale of the emitted harmonic wavelength does not result in diffuse reflection, in apparent contradiction to the Rayleigh criterion for coherent reflection. In this paper we show, for the first time, using analytic theory and 2D PIC simulations, that the interaction of relativistically strong laser pulses with corrugated target surfaces results in highly effective smoothing of the interaction surface and, consequently, the generation of highly collimated and temporally confined XUV pulses from rough targets, in excellent agreement with experimental observations.
Abstract:
In this work we solve Mathematical Programs with Complementarity Constraints (MPCCs) using the hyperbolic smoothing strategy. Under this approach, the complementarity condition is relaxed through the use of the hyperbolic smoothing function, which involves a positive parameter that can be decreased to zero. An iterative algorithm is implemented in MATLAB, and a set of AMPL problems from the MacMPEC database was tested.
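The hyperbolic smoothing idea can be sketched on a toy problem: the complementarity condition x >= 0, y >= 0, x*y = 0 is replaced by the smooth equation x + y - sqrt((x - y)^2 + tau^2) = 0, which tends to 2*min(x, y) = 0 as tau -> 0, and tau is driven towards zero across a sequence of smooth subproblems. The Python sketch below is illustrative only; the paper's exact smoothing function, solver, and test problems are not given in the abstract.

import numpy as np
from scipy.optimize import minimize

def objective(z):
    x, y = z
    return (x - 1.0) ** 2 + (y - 1.0) ** 2  # toy objective; its minimiser must satisfy x*y = 0

def smoothed_complementarity(z, tau):
    x, y = z
    # equals 2*min(x, y) in the limit tau -> 0, so the constraint enforces complementarity
    return x + y - np.sqrt((x - y) ** 2 + tau ** 2)

z = np.array([0.5, 0.5])
for tau in [1.0, 0.1, 0.01, 1e-4]:               # decreasing smoothing parameter
    constraints = [{"type": "eq", "fun": smoothed_complementarity, "args": (tau,)}]
    bounds = [(0.0, None), (0.0, None)]          # x >= 0, y >= 0
    res = minimize(objective, z, method="SLSQP", bounds=bounds, constraints=constraints)
    z = res.x                                    # warm-start the next, tighter subproblem

print(z)  # approaches a point with x*y close to 0, e.g. (1, 0) or (0, 1)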
Abstract:
In this paper we investigate the commonly used autoregressive filter method of adjusting appraisal-based real estate returns to correct for the perceived biases induced in the appraisal process. Since the early work by Geltner (1989), many papers have been written on this topic, but remarkably few have considered the relationship between smoothing at the individual property level and the amount of persistence in the aggregate appraisal-based index. To investigate this issue in more detail, we analyse a sample of individual property-level appraisal data from the Investment Property Databank (IPD). We find that commonly used unsmoothing estimates overstate the extent of smoothing that takes place at the individual property level. There is also strong support for an ARFIMA representation of appraisal returns.
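For reference, the first-order "unsmoothing" filter that the abstract refers to (in the spirit of Geltner, 1989) recovers de-smoothed returns via r*_t = (r_t - a*r_{t-1}) / (1 - a). The Python sketch below is illustrative only; the choice of a as the lag-one autocorrelation of the index returns is a common convention, not a detail taken from the paper.

import numpy as np

def unsmooth_returns(r, a):
    """Apply the reverse filter r*_t = (r_t - a * r_{t-1}) / (1 - a)."""
    r = np.asarray(r, dtype=float)
    return (r[1:] - a * r[:-1]) / (1.0 - a)

def ar1_coefficient(r):
    """Sample lag-one autocorrelation, often used as the smoothing parameter a."""
    r = np.asarray(r, dtype=float)
    return np.corrcoef(r[:-1], r[1:])[0, 1]

# e.g. true_returns = unsmooth_returns(index_returns, ar1_coefficient(index_returns))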
Abstract:
There is a substantial literature which suggests that appraisals are smoothed and lag the true level of prices. This study combines a qualitative interview survey of the leading fund managers/owners in the UK and their appraisers with an empirical study of the number of appraisals which change each month within the IPD Monthly Index. The paper concentrates on how the appraisal process operates for commercial property performance measurement purposes. The survey interviews suggest that periodic appraisal services are consolidating in fewer firms and that, within these major firms, appraisers adopt different approaches to changing appraisals on a period-by-period basis, with some wanting hard transaction evidence while others act on 'softer' signals. The survey also indicates a seasonal effect, with greater effort and information being applied to annual and quarterly appraisals than to monthly ones. The analysis of the appraisals within the Investment Property Databank Monthly Index confirms this effect, with around 5% more appraisals being moved at each quarter day than in other months. January and August have significantly fewer appraisal changes than other months.
Abstract:
There is a substantial literature which suggests that appraisals are smoothed and lag the true level of prices. This study combines a qualitative interview survey of the leading fund managers/owners in the UK and their appraisers with an empirical study of the number of appraisals which change each month within the IPD Monthly Index. The paper concentrates on how the appraisal process operates for commercial real estate performance measurement purposes. The survey interviews suggest that periodic appraisal services are consolidating in fewer firms and that, within these major firms, appraisers adopt different approaches to changing appraisals on a period-by-period basis, with some wanting hard transaction evidence while others act on 'softer' signals. The survey also indicates a seasonal effect, with greater effort and information being applied to annual and quarterly appraisals than to monthly ones. The analysis of the appraisals within the IPD Monthly Index confirms this effect, with around 5% more appraisals being moved at each quarter day than in other months. More November appraisals change than expected, which suggests that the increased information flows for the December year-end appraisals feed through into earlier appraisals, especially as client/appraiser draft-appraisal meetings for the December appraisals, a regular occurrence in the UK, can take place in November. January shows significantly less activity than other months, a seasonal effect after the exertions of the December appraisals.
Abstract:
In this article, we investigate the commonly used autoregressive filter method of adjusting appraisal-based real estate returns to correct for the perceived biases induced in the appraisal process. Many articles have been written on appraisal smoothing but remarkably few have considered the relationship between smoothing at the individual property level and the amount of persistence in the aggregate appraisal-based index. To investigate this issue we analyze a large sample of appraisal data at the individual property level from the Investment Property Databank. We find that commonly used unsmoothing estimates at the index level overstate the extent of smoothing that takes place at the individual property level. There is also strong support for an ARFIMA representation of appraisal returns at the index level and an ARMA model at the individual property level.
Abstract:
We examine the strategies interwar working-class British households used to “smooth” consumption over time and guard against negative contingencies such as illness, unemployment, and death. Newly discovered returns from the U.K. Ministry of Labour's 1937/38 Household Expenditure Survey are used to fully categorize expenditure smoothing via nineteen credit/savings vehicles. We find that households made extensive use of expenditure-smoothing devices. Families' reliance on expenditure-smoothing is shown to be inversely related to household income, while households also used these mechanisms more intensively during expenditure crisis phases of the family life cycle, especially the years immediately after new household formation.
Abstract:
We present a new subcortical structure shape-modeling framework using heat kernel smoothing constructed with the Laplace-Beltrami eigenfunctions. The cotan discretization is used to numerically obtain the eigenfunctions of the Laplace-Beltrami operator on the surfaces of subcortical structures of the brain. The eigenfunctions are then used to construct the heat kernel, which smooths out measurement noise along the surface. The proposed framework is applied to investigating the influence of age (38-79 years) and gender on amygdala and hippocampus shape. We detected a significant age effect on the hippocampus, in accordance with previous studies. In addition, we also detected a significant gender effect on the amygdala. Since we did not find any such differences with traditional volumetric methods, our results demonstrate the benefit of the current framework over traditional volumetric methods.
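In this framework, smoothing amounts to expanding the per-vertex signal in the Laplace-Beltrami eigenfunctions and attenuating each coefficient by exp(-lambda_k * t). The Python sketch below assumes the eigenpairs have already been computed from a cotan discretization and are orthonormal with respect to the associated mass matrix M; names and conventions are illustrative, not taken from the paper.

import numpy as np

def heat_kernel_smooth(signal, eigenvalues, eigenfunctions, mass_matrix, t):
    """Heat kernel smoothing of a per-vertex signal.

    signal: (n_vertices,); eigenvalues: (k,); eigenfunctions: (n_vertices, k),
    assumed M-orthonormal; mass_matrix: (n_vertices, n_vertices); t: diffusion time.
    """
    # Fourier coefficients <signal, psi_k> under the mass-matrix inner product
    coeffs = eigenfunctions.T @ (mass_matrix @ signal)
    # The heat kernel attenuates high-frequency (large-eigenvalue) components
    coeffs *= np.exp(-eigenvalues * t)
    return eigenfunctions @ coeffs

Larger diffusion times t give heavier smoothing, playing the role of the bandwidth in ordinary kernel smoothing.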
Abstract:
Purpose: To quantify to what extent the new registration method, DARTEL (Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra), may reduce the smoothing kernel width required, and to investigate the minimum group size necessary for voxel-based morphometry (VBM) studies. Materials and Methods: A simulated-atrophy approach was employed to explore the roles of smoothing kernel, group size, and their interactions on VBM detection accuracy. Group sizes of 10, 15, 25, and 50 were compared for kernels between 0 and 12 mm. Results: A smoothing kernel of 6 mm achieved the highest atrophy detection accuracy for groups of 50 participants, and 8–10 mm for groups of 25, at P < 0.05 with family-wise error correction. The results further demonstrated that a group size of 25 was the lower limit when two different groups of participants were compared, whereas a group size of 15 was the minimum for longitudinal comparisons, but only at P < 0.05 with false discovery rate correction. Conclusion: Our data confirm that DARTEL-based VBM generally benefits from smaller kernels and that different kernels perform best for different group sizes, with a tendency towards smaller kernels for larger groups. Importantly, kernel selection was also affected by the statistical threshold applied. This highlights that the choice of kernel in relation to group size should be considered with care.
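The quantity varied in this study, the smoothing kernel width, is the FWHM (in mm) of an isotropic Gaussian filter applied to the segmented images. The Python sketch below shows this step only, with the voxel size and image names as illustrative assumptions; it is not the study's pipeline.

import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_volume(volume, fwhm_mm, voxel_size_mm):
    """Apply isotropic Gaussian smoothing with the given FWHM (mm) to a 3D image."""
    sigma_mm = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0)))        # FWHM -> standard deviation
    sigma_vox = sigma_mm / np.asarray(voxel_size_mm, dtype=float)  # per-axis sigma in voxels
    return gaussian_filter(volume, sigma=sigma_vox)

# e.g. an 8 mm kernel on 2 mm isotropic voxels:
# smoothed = smooth_volume(gm_segment, fwhm_mm=8.0, voxel_size_mm=(2.0, 2.0, 2.0))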