956 results for Exponential smoothing
Abstract:
In this paper, we propose to study a class of neural networks with recent-history distributed delays. A sufficient condition is derived for the global exponential periodicity of the proposed neural networks. The condition has the advantage that it assumes neither the differentiability nor the monotonicity of each neuron's activation function, nor the symmetry of the feedback matrix or the delayed feedback matrix. Our criterion is shown to be valid by applying it to an illustrative system. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
In this paper we investigate the commonly used autoregressive filter method of adjusting appraisal-based real estate returns to correct for the perceived biases induced in the appraisal process. Since the early work by Geltner (1989), many papers have been written on this topic but remarkably few have considered the relationship between smoothing at the individual property level and the amount of persistence in the aggregate appraisal-based index. To investigate this issue in more detail we analyse a sample of individual property level appraisal data from the Investment Property Database (IPD). We find that commonly used unsmoothing estimates overstate the extent of smoothing that takes place at the individual property level. There is also strong support for an ARFIMA representation of appraisal returns.
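The first-order autoregressive (Geltner-style) unsmoothing filter discussed above can be sketched as follows. This is a minimal illustration, not the paper's estimation procedure; the smoothing parameter `alpha` and the sample returns are assumed values chosen for demonstration.

```python
# Geltner-style unsmoothing of appraisal-based returns (illustrative sketch).
# Assumed model: appraisal return r_t = alpha * true_t + (1 - alpha) * r_{t-1},
# so the recovered "true" return is true_t = (r_t - (1 - alpha) * r_{t-1}) / alpha.

def unsmooth(returns, alpha):
    """Invert a first-order autoregressive smoothing filter."""
    if not 0 < alpha <= 1:
        raise ValueError("alpha must lie in (0, 1]")
    out = []
    for t in range(1, len(returns)):
        out.append((returns[t] - (1 - alpha) * returns[t - 1]) / alpha)
    return out

# Hypothetical appraisal-based index returns and smoothing parameter.
appraisal = [0.010, 0.012, 0.011, 0.015, 0.009]
true_returns = unsmooth(appraisal, alpha=0.4)
```

Note that a smaller `alpha` (heavier smoothing) amplifies the recovered return volatility, which is why overstated index-level smoothing estimates matter for the paper's comparison with property-level data.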
Abstract:
There is a substantial literature which suggests that appraisals are smoothed and lag the true level of prices. This study combines a qualitative interview survey of the leading fund manager/owners in the UK and their appraisers with an empirical study of the number of appraisals which change each month within the IPD Monthly Index. The paper concentrates on how the appraisal process operates for commercial property performance measurement purposes. The survey interviews suggest that periodic appraisal services are consolidating in fewer firms and, within these major firms, appraisers adopt different approaches to changing appraisals on a period by period basis, with some wanting hard transaction evidence while others act on 'softer' signals. The survey also indicates a seasonal effect with greater effort and information being applied to annual and quarterly appraisals than monthly. The analysis of the appraisals within the Investment Property Databank Monthly Index confirms this effect with around 5% more appraisals being moved at each quarter day than in the other months. January and August have significantly fewer appraisal changes than other months.
Abstract:
There is a substantial literature which suggests that appraisals are smoothed and lag the true level of prices. This study combines a qualitative interview survey of the leading fund manager/owners in the UK and their appraisers with an empirical study of the number of appraisals which change each month within the IPD Monthly Index. The paper concentrates on how the appraisal process operates for commercial real estate performance measurement purposes. The survey interviews suggest that periodic appraisal services are consolidating in fewer firms and, within these major firms, appraisers adopt different approaches to changing appraisals on a period by period basis, with some wanting hard transaction evidence while others act on 'softer' signals. The survey also indicates a seasonal effect with greater effort and information being applied to annual and quarterly appraisals than monthly. The analysis of the appraisals within the IPD Monthly Index confirms this effect with around 5% more appraisals being moved at each quarter day than in the other months. More November appraisals change than expected, which suggests that the increased information flows for the December year-end appraisals are flowing through into earlier appraisals, especially as client/appraiser draft appraisal meetings for the December appraisals, a regular occurrence in the UK, can occur in November. January shows significantly less activity than other months, a seasonal effect after the exertions of the December appraisals.
Abstract:
In this article, we investigate the commonly used autoregressive filter method of adjusting appraisal-based real estate returns to correct for the perceived biases induced in the appraisal process. Many articles have been written on appraisal smoothing but remarkably few have considered the relationship between smoothing at the individual property level and the amount of persistence in the aggregate appraisal-based index. To investigate this issue we analyze a large sample of appraisal data at the individual property level from the Investment Property Databank. We find that commonly used unsmoothing estimates at the index level overstate the extent of smoothing that takes place at the individual property level. There is also strong support for an ARFIMA representation of appraisal returns at the index level and an ARMA model at the individual property level.
Abstract:
We examine the strategies interwar working-class British households used to “smooth” consumption over time and guard against negative contingencies such as illness, unemployment, and death. Newly discovered returns from the U.K. Ministry of Labour's 1937/38 Household Expenditure Survey are used to fully categorize expenditure smoothing via nineteen credit/savings vehicles. We find that households made extensive use of expenditure-smoothing devices. Families' reliance on expenditure-smoothing is shown to be inversely related to household income, while households also used these mechanisms more intensively during expenditure crisis phases of the family life cycle, especially the years immediately after new household formation.
Abstract:
Pardo, Patie, and Savov derived, under mild conditions, a Wiener-Hopf type factorization for the exponential functional of proper Lévy processes. In this paper, we extend this factorization by relaxing a finite moment assumption as well as by considering the exponential functional for killed Lévy processes. As a by-product, we derive some interesting fine distributional properties enjoyed by a large class of such random variables, such as the absolute continuity of the distribution and the smoothness, boundedness or complete monotonicity of the density. These results are then used to derive similar properties for the law of the maxima and first passage times of some stable Lévy processes. Thus, for example, we show that for any stable process with $\rho\in(0,\frac{1}{\alpha}-1]$, where $\rho\in[0,1]$ is the positivity parameter and $\alpha$ is the stable index, the first passage time has a bounded and non-increasing density on $\mathbb{R}_+$. We also generate many instances of integral or power series representations for the law of the exponential functional of Lévy processes with one- or two-sided jumps. The proof of our main results requires different devices from those developed by Pardo, Patie, and Savov. It relies in particular on a generalization of a transform recently introduced by Chazal et al. together with some extensions to killed Lévy processes of Wiener-Hopf techniques. The factorizations developed here also allow for further applications, which we only indicate here.
Abstract:
We present a new subcortical structure shape modeling framework using heat kernel smoothing constructed with the Laplace-Beltrami eigenfunctions. The cotan discretization is used to numerically obtain the eigenfunctions of the Laplace-Beltrami operator along the surface of subcortical structures of the brain. The eigenfunctions are then used to construct the heat kernel, which smooths out measurement noise along the surface. The proposed framework is applied in investigating the influence of age (38-79 years) and gender on amygdala and hippocampus shape. We detected a significant age effect on the hippocampus, in accordance with previous studies. In addition, we also detected a significant gender effect on the amygdala. Since we did not find any such differences with traditional volumetric methods, our results demonstrate the benefit of the current framework over traditional volumetric methods.
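The heat kernel smoothing idea above can be sketched on a toy graph: expand the noisy signal in the Laplacian eigenbasis and attenuate each component by exp(−tλ). This is a minimal sketch on a 1-D chain graph; the paper uses the cotan-discretised Laplace-Beltrami operator on a triangulated surface, for which this toy Laplacian is only a stand-in.

```python
import numpy as np

def heat_kernel_smooth(L, signal, t):
    """Smooth `signal` with the heat kernel exp(-t * L) built from the
    eigenfunctions of the symmetric Laplacian L."""
    lam, psi = np.linalg.eigh(L)              # eigenvalues / eigenfunctions
    coeffs = psi.T @ signal                   # expand signal in the eigenbasis
    return psi @ (np.exp(-t * lam) * coeffs)  # attenuate high frequencies

# Toy Laplacian of a path graph with n nodes (assumed stand-in for the
# cotan Laplace-Beltrami operator on a surface mesh).
n = 50
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1.0

rng = np.random.default_rng(0)
noisy = np.sin(np.linspace(0, np.pi, n)) + 0.3 * rng.standard_normal(n)
smoothed = heat_kernel_smooth(L, noisy, t=2.0)
```

The diffusion time `t` plays the role of the smoothing bandwidth: larger `t` damps more of the high-frequency (large-eigenvalue) components.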
Abstract:
Purpose: To quantify to what extent the new registration method, DARTEL (Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra), may reduce the smoothing kernel width required and to investigate the minimum group size necessary for voxel-based morphometry (VBM) studies. Materials and Methods: A simulated atrophy approach was employed to explore the role of smoothing kernel, group size, and their interactions on VBM detection accuracy. Group sizes of 10, 15, 25, and 50 were compared for kernels between 0 and 12 mm. Results: A smoothing kernel of 6 mm achieved the highest atrophy detection accuracy for groups with 50 participants, and 8–10 mm for the groups of 25, at P < 0.05 with familywise correction. The results further demonstrated that a group size of 25 was the lower limit when two different groups of participants were compared, whereas a group size of 15 was the minimum for longitudinal comparisons, but at P < 0.05 with false discovery rate correction. Conclusion: Our data confirmed that DARTEL-based VBM generally benefits from smaller kernels and that different kernels perform best for different group sizes, with a tendency toward smaller kernels for larger groups. Importantly, the kernel selection was also affected by the threshold applied. This highlights that the choice of kernel in relation to group size should be considered with care.
Abstract:
For a Lévy process ξ=(ξt)t≥0 drifting to −∞, we define the so-called exponential functional as $I_\xi = \int_0^\infty e^{\xi_t}\,dt$. Under mild conditions on ξ, we show that the following factorization of exponential functionals holds: $I_\xi \overset{d}{=} I_{H^-} \times I_Y$, where × stands for the product of independent random variables, $H^-$ is the descending ladder height process of ξ and Y is a spectrally positive Lévy process with a negative mean constructed from its ascending ladder height process. As a by-product, we generate an integral or power series representation for the law of Iξ for a large class of Lévy processes with two-sided jumps and also derive some new distributional properties. The proof of our main result relies on a fine Markovian study of a class of generalized Ornstein–Uhlenbeck processes, which is itself of independent interest. We use and refine an alternative approach of studying the stationary measure of a Markov process which avoids some technicalities and difficulties that appear in the classical method of employing the generator of the dual Markov process.
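For intuition, the exponential functional above can be approximated by simple Monte Carlo. This is an illustrative sketch only, using the simplest Lévy process drifting to −∞ (Brownian motion with negative drift); all parameter values are assumptions chosen for the demonstration, not anything from the paper.

```python
import numpy as np

def exp_functional_mc(mu, sigma, T=30.0, dt=0.01, n_paths=500, seed=0):
    """Monte Carlo estimate of E[I] with I = integral_0^inf exp(xi_t) dt for
    xi_t = sigma * B_t - mu * t.  The integral is truncated at time T, which
    is harmless once exp(xi_t) has become negligible, and discretised by a
    Riemann sum with step dt."""
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    steps = sigma * np.sqrt(dt) * rng.standard_normal((n_paths, n_steps)) - mu * dt
    xi = np.cumsum(steps, axis=1)            # simulated paths of xi_t
    return float((np.exp(xi).sum(axis=1) * dt).mean())

# For this process E[I] = 1 / (mu - sigma**2 / 2) whenever mu > sigma**2 / 2,
# so with mu = 1, sigma = 0.5 the estimate should land near 1/0.875.
est = exp_functional_mc(mu=1.0, sigma=0.5)
```

The closed-form mean follows from Fubini: E∫e^{ξ_t}dt = ∫e^{t(σ²/2−μ)}dt, which converges exactly when the drift dominates the Gaussian exponential growth.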
Abstract:
We study the approximation of harmonic functions by means of harmonic polynomials in two-dimensional, bounded, star-shaped domains. Assuming that the functions possess analytic extensions to a δ-neighbourhood of the domain, we prove exponential convergence of the approximation error with respect to the degree of the approximating harmonic polynomial. All the constants appearing in the bounds are explicit and depend only on the shape-regularity of the domain and on δ. We apply the obtained estimates to show exponential convergence with rate O(exp(−b√N)), N being the number of degrees of freedom and b>0, of an hp-dGFEM discretisation of the Laplace equation based on piecewise harmonic polynomials. This result is an improvement over the classical rate O(exp(−bN^(1/3))), and is due to the use of harmonic polynomial spaces, as opposed to complete polynomial spaces.
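The rapid convergence of harmonic polynomial approximation can be seen in a small numerical experiment: fit an analytic harmonic function on the unit disk with the harmonic basis Re(z^k), Im(z^k) by least squares on the boundary (where, by the maximum principle, the error of a harmonic approximant is attained). The target function and setup below are illustrative assumptions, not the paper's examples.

```python
import numpy as np

def harmonic_basis(z, degree):
    """Columns: 1, Re z, Im z, ..., Re z^degree, Im z^degree."""
    cols = [np.ones_like(z.real)]
    for k in range(1, degree + 1):
        cols.append((z ** k).real)
        cols.append((z ** k).imag)
    return np.column_stack(cols)

# Target: u = Re(exp(z)) = exp(x) * cos(y), harmonic with an entire extension.
theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
z = np.exp(1j * theta)                       # sample points on the unit circle
u = np.exp(z.real) * np.cos(z.imag)

def sup_error(degree):
    A = harmonic_basis(z, degree)
    coef, *_ = np.linalg.lstsq(A, u, rcond=None)
    return float(np.max(np.abs(A @ coef - u)))

errors = [sup_error(p) for p in (2, 4, 8)]   # error drops fast with degree
```

Because the target extends analytically well beyond the disk, the error decays roughly factorially in the degree here, consistent with the exponential-in-degree bounds stated in the abstract.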
Abstract:
Radar refractivity retrievals have the potential to accurately capture near-surface humidity fields from the phase change of ground clutter returns. In practice, phase changes are very noisy and the required smoothing will diminish large radial phase change gradients, leading to severe underestimates of large refractivity changes (ΔN). To mitigate this, the mean refractivity change over the field (ΔNfield) must be subtracted prior to smoothing. However, both observations and simulations indicate that highly correlated returns (e.g., when single targets straddle neighboring gates) result in underestimates of ΔNfield when pulse-pair processing is used. This may contribute to reported differences of up to 30 N units between surface observations and retrievals. This effect can be avoided if ΔNfield is estimated using a linear least squares fit to azimuthally averaged phase changes. Nevertheless, subsequent smoothing of the phase changes will still tend to diminish the all-important spatial perturbations in retrieved refractivity relative to ΔNfield; an iterative estimation approach may be required. The uncertainty in the target location within the range gate leads to additional phase noise proportional to ΔN, pulse length, and radar frequency. The use of short pulse lengths is recommended, not only to reduce this noise but to increase both the maximum detectable refractivity change and the number of suitable targets. Retrievals of refractivity fields must allow for large ΔN relative to an earlier reference field. This should be achievable for short pulses at S band, but phase noise due to target motion may prevent this at C band, while at X band even the retrieval of ΔN over shorter periods may at times be impossible.
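The recommended estimate of ΔNfield via a linear least-squares fit to azimuthally averaged phase changes can be sketched as follows. The radar frequency, gate spacing, noise level and the synthetic ΔN value are all assumed for illustration, and the phase changes are taken to be already unwrapped; the underlying two-way relation dφ/dr = (4πf/c)·ΔN·10⁻⁶ is standard for refractivity retrievals.

```python
import numpy as np

C = 2.998e8          # speed of light, m/s
FREQ = 3.0e9         # S-band radar frequency, Hz (assumed)

def dn_field_from_phase(ranges_m, mean_dphase_rad):
    """Least-squares slope of azimuthally averaged phase change vs range,
    converted to a field-mean refractivity change in N-units.
    Two-way propagation: dphi/dr = (4 * pi * f / c) * Delta-N * 1e-6."""
    slope, _ = np.polyfit(ranges_m, mean_dphase_rad, 1)
    return float(slope * C / (4 * np.pi * FREQ) * 1e6)

# Synthetic check: uniform Delta-N of 12 N-units plus residual noise left
# after azimuthal averaging (values assumed, phases assumed unwrapped).
ranges = np.arange(1000.0, 20000.0, 250.0)
true_dn = 12.0
rng = np.random.default_rng(1)
phase = (4 * np.pi * FREQ / C) * true_dn * 1e-6 * ranges \
        + 0.05 * rng.standard_normal(ranges.size)
est = dn_field_from_phase(ranges, phase)
```

Fitting the slope over the whole range profile is what makes this estimate robust to the correlated-gate bias that affects pulse-pair processing.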